Patent Issued for System and method for a semantically-driven smart data cache (USPTO 11120065): Aetna Inc.
2021 OCT 04 (NewsRx) -- The patent's assignee for patent number 11120065 is Aetna Inc.
News editors obtained the following quote from the background information supplied by the inventors: “Enterprise systems include servers, storage and associated software deployed in a large scale that may serve as an information technology infrastructure for businesses, governments, or other large organizations. Enterprise systems manage large volumes of data and are designed to provide high levels of transaction performance and data security. These systems are also designed to support business processes, information flows, data analytics, and other functions. Enterprise systems include various individual system assets and resources. In the age of complexity of information, enterprise systems manage myriad data sources, ranging from simple flat files and relational databases to unstructured and geo-spatial data. This, in turn, increases the complexity of providing consuming applications with access to diverse data sources.”
As a supplement to the background information on this patent, NewsRx correspondents also obtained the inventors’ summary information for this patent: “An embodiment of the disclosure provides a method of integrating data across multiple data stores in a smart cache in order to provide data to one or more recipient systems. The method includes automatically ingesting diverse data from a plurality of data sources, automatically reconciling the ingested diverse data by updating semantic models based on the ingested diverse data, storing the ingested diverse data based on one or more classification of the data sources according to the semantic models, automatically generating scalable service endpoints which are semantically consistent according to the classification of the data sources, and responding to a call from the one or more recipient systems by providing data in the classification of the data sources.
“Another embodiment of the disclosure provides a non-transitory computer readable medium for integrating data across multiple data stores in a smart cache in order to provide data to one or more recipient systems. The non-transitory computer readable medium contains program instructions for causing a server to perform the method including: automatically ingesting diverse data from a plurality of data sources, automatically reconciling the ingested diverse data by updating semantic models based on the ingested diverse data, storing the ingested diverse data based on one or more classification of the data sources according to the semantic models, automatically generating scalable service endpoints which are semantically consistent according to the classification of the data sources, and responding to a call from the one or more recipient systems by providing data in the classification of the data sources.
“Yet another embodiment of the disclosure provides a system for integrating data across multiple data stores in a smart cache in order to provide data to one or more recipient systems. The system includes one or more databases storing semantic models and machine learning algorithms and one or more servers. The servers are configured to: automatically ingest diverse data from a plurality of data sources and automatically reconcile the ingested diverse data by performing one or more of: (a) updating semantic models based on the ingested diverse data, (b) structuring the ingested diverse data, wherein the structuring comprises realigning and reformatting data elements in the ingested diverse data into a standardized representation based on the semantic models, and (c) organizing the ingested diverse data, wherein the organizing comprises aligning the ingested diverse data to multiple structures within the semantic models. The servers are further configured to: store the ingested diverse data based on one or more classification of the data sources according to the semantic models, automatically generate scalable service endpoints which are semantically consistent according to the classification of the data sources, and respond to a call from the one or more recipient systems by providing data in the classification of the data sources.”
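The summary above describes a pipeline of ingesting diverse data, reconciling it against semantic models, storing it by data-source classification, and exposing it through semantically consistent endpoints. The sketch below illustrates that flow in Python under stated assumptions; every class, method, and field name is hypothetical and is not taken from the patent or its implementation.

```python
# Minimal sketch of the smart-cache pipeline described in the inventors' summary.
# All names are illustrative assumptions, not the patent's actual implementation.
from dataclasses import dataclass, field
from typing import Any, Callable


@dataclass
class SemanticModel:
    """Maps raw field names from a data source onto a standardized representation."""
    field_map: dict[str, str] = field(default_factory=dict)

    def update(self, record: dict[str, Any]) -> None:
        # Reconcile: register any previously unseen fields in the semantic model.
        for key in record:
            self.field_map.setdefault(key, key)

    def structure(self, record: dict[str, Any]) -> dict[str, Any]:
        # Realign and reformat data elements into the standardized representation.
        return {self.field_map[k]: v for k, v in record.items()}


@dataclass
class SmartCache:
    models: dict[str, SemanticModel] = field(default_factory=dict)
    store: dict[str, list[dict[str, Any]]] = field(default_factory=dict)

    def ingest(self, classification: str, records: list[dict[str, Any]]) -> None:
        """Ingest diverse data, reconcile it against the semantic model for its
        source classification, and store it under that classification."""
        model = self.models.setdefault(classification, SemanticModel())
        bucket = self.store.setdefault(classification, [])
        for record in records:
            model.update(record)                    # reconcile: update semantic model
            bucket.append(model.structure(record))  # structure and store

    def endpoint(self, classification: str) -> Callable[[], list[dict[str, Any]]]:
        """Generate a service endpoint that is semantically consistent for one
        classification; a recipient system calls it to receive cached data."""
        return lambda: list(self.store.get(classification, []))


# Hypothetical usage: ingest relational and geo-spatial records, then serve them.
cache = SmartCache()
cache.ingest("relational", [{"member_id": 1, "plan": "HMO"}])
cache.ingest("geo-spatial", [{"lat": 41.76, "lon": -72.67}])
relational_api = cache.endpoint("relational")
print(relational_api())  # recipient system receives data in its classification
```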
The claims supplied by the inventors are:
“1. A method of integrating data across multiple data stores in a smart cache in order to provide data to one or more recipient systems, the method comprising: automatically ingesting diverse data from a plurality of data sources; automatically reconciling the ingested diverse data by updating semantic models based on the ingested diverse data; storing the ingested diverse data based on one or more classification of the data sources according to the semantic models; automatically generating scalable service endpoints which are semantically consistent according to the classification of the data sources, wherein the generated scalable service endpoints are a plurality of application programming interfaces (APIs); in response to receiving a call from the one or more recipient systems, determining a protocol supported by the smart cache based on the scalable service endpoints; and responding to the call from the one or more recipient systems by providing data in the classification of the data sources.
“2. The method according to claim 1, wherein the generated scalable service endpoints are stable as long as the ingested diverse data sources remain semantically stable.
“3. The method according to claim 1, further comprising: automatically reconciling the ingested diverse data by structuring the ingested diverse data, wherein the structuring comprises realigning and reformatting data elements in the ingested diverse data into a standardized representation based on the semantic models.
“4. The method according to claim 3, further comprising: automatically reconciling the ingested diverse data by organizing the ingested diverse data, wherein the organizing comprises aligning the ingested diverse data to multiple structures within the semantic models.
“5. The method according to claim 4, further comprising: automatically optimizing the ingested diverse data by monitoring usage of the ingested diverse data and caching new representations of the ingested diverse data as needed to improve performance.
“6. The method according to claim 5, wherein the automatically structuring, the automatically organizing, and the automatically optimizing are continuously optimized using machine learning techniques.
“7. The method according to claim 1, wherein the classification of the data sources includes relational data, unstructured data, columnar data, geo-spatial data, and key-value store.
“8. The method according to claim 1, further comprising: publishing the generated scalable service endpoints by updating a registry containing service definitions, interfaces, operations, and parameters.
“9. The method according to claim 1, further comprising: determining a change in at least one data source.
“10. The method according to claim 9, wherein the change is a removal of data in the at least one data source.
“11. A non-transitory computer readable medium for integrating data across multiple data stores in a smart cache in order to provide data to one or more recipient systems, the non-transitory computer readable medium containing program instructions for causing a server to perform the method comprising: automatically ingesting diverse data from a plurality of data sources; automatically reconciling the ingested diverse data by updating semantic models based on the ingested diverse data; storing the ingested diverse data based on one or more classification of the data sources according to the semantic models; automatically generating scalable service endpoints which are semantically consistent according to the classification of the data sources, wherein the generated scalable service endpoints are a plurality of application programming interfaces (APIs); in response to receiving a call from the one or more recipient systems, determining a protocol supported by the smart cache based on the scalable service endpoints; and responding to the call from the one or more recipient systems by providing data in the classification of the data sources.
“12. The non-transitory computer readable medium according to claim 11, wherein the generated scalable service endpoints are stable as long as the ingested diverse data sources remain semantically stable.
“13. The non-transitory computer readable medium according to claim 11, wherein the server further performs the method comprising: automatically reconciling the ingested diverse data by structuring the ingested diverse data, wherein the structuring comprises realigning and reformatting data elements in the ingested diverse data into a standardized representation based on the semantic models.
“14. The non-transitory computer readable medium according to claim 13, wherein the server further performs the method comprising: automatically reconciling the ingested diverse data by organizing the ingested diverse data, wherein the organizing comprises aligning the ingested diverse data to multiple structures within the semantic models.
“15. The non-transitory computer readable medium according to claim 14, wherein the server further performs the method comprising: automatically optimizing the ingested diverse data by monitoring usage of the ingested diverse data and caching new representations of the ingested diverse data as needed to improve performance.
“16. The non-transitory computer readable medium according to claim 15, wherein the automatically structuring, the automatically organizing, and the automatically optimizing are continuously optimized using machine learning techniques.
“17. The non-transitory computer readable medium according to claim 11, wherein the classification of the data sources includes relational data, unstructured data, columnar data, geo-spatial data, and key-value store.
“18. The non-transitory computer readable medium according to claim 11, wherein the server further performs the method comprising: publishing the generated scalable service endpoints by updating a registry containing service definitions, interfaces, operations, and parameters.
“19. The non-transitory computer readable medium according to claim 11, wherein the server further performs the method comprising: determining a change in at least one data source.
“20. A system for integrating data across multiple data stores in a smart cache in order to provide data to one or more recipient systems, the system comprising: one or more databases storing semantic models and machine learning algorithms; and one or more servers configured to: automatically ingest diverse data from a plurality of data sources; automatically reconcile the ingested diverse data by performing one or more of: updating semantic models based on the ingested diverse data, structuring the ingested diverse data, wherein the structuring comprises realigning and reformatting data elements in the ingested diverse data into a standardized representation based on the semantic models, and organizing the ingested diverse data, wherein the organizing comprises aligning the ingested diverse data to multiple structures within the semantic models; store the ingested diverse data based on one or more classification of the data sources according to the semantic models; automatically generate scalable service endpoints which are semantically consistent according to the classification of the data sources, wherein the generated scalable service endpoints are a plurality of application programming interfaces (APIs); in response to receiving a call from the one or more recipient systems, determining a protocol supported by the smart cache based on the scalable service endpoints; and respond to the call from the one or more recipient systems by providing data in the classification of the data sources.”
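Claims 1, 8, 11, and 20 also describe publishing the generated endpoints to a registry of service definitions and determining a protocol supported by the smart cache when a recipient system calls. The following sketch shows one way those two steps could look in Python; it is an assumption-laden illustration, and the registry structure, protocol names, and method signatures are invented for this example rather than drawn from the patent.

```python
# Illustrative sketch (not the patent's implementation) of two claimed steps:
# publishing generated endpoints to a registry of service definitions, and
# determining a protocol supported by the smart cache for an incoming call.
from dataclasses import dataclass, field


@dataclass
class ServiceRegistry:
    """Registry of service definitions, interfaces, operations, and parameters."""
    entries: dict[str, dict] = field(default_factory=dict)

    def publish(self, name: str, operations: list[str], parameters: list[str],
                protocols: list[str]) -> None:
        # Publishing an endpoint updates the registry with its definition.
        self.entries[name] = {
            "operations": operations,
            "parameters": parameters,
            "protocols": protocols,
        }

    def determine_protocol(self, name: str, requested: str) -> str:
        # Determine a protocol supported by the smart cache for this endpoint;
        # fall back to the endpoint's first registered protocol otherwise.
        supported = self.entries[name]["protocols"]
        return requested if requested in supported else supported[0]


# Hypothetical usage: register an endpoint per data-source classification,
# then negotiate the protocol for a recipient system's call.
registry = ServiceRegistry()
registry.publish("relational-data", operations=["query"], parameters=["filter"],
                 protocols=["https+json", "grpc"])
print(registry.determine_protocol("relational-data", requested="grpc"))  # -> "grpc"
```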
For additional information on this patent, see: System and method for a semantically-driven smart data cache, U.S. Patent Number 11120065.
(Our reports deliver fact-based news of research and discoveries from around the world.)