Patent Issued for Resource management system (USPTO 11645623): Progressive Casualty Insurance Company
2023 MAY 25 (NewsRx) -- By a News Reporter-Staff News Editor -- A patent for a resource management system has been issued.
The assignee for this patent, patent number 11645623, is Progressive Casualty Insurance Company.
Reporters obtained the following quote from the background information supplied by the inventors:
“1. Technical Field
“This disclosure relates to structural management, and specifically to systems and methods that manage big data.
“2. Related Art
“Resource coordination is challenging. Some processes attempt to provision materials and services but fail to assimilate them into a holistic resource. Some processes attempt to provide access to materials and some attempt to provide access to the skills needed to process them. These systems attempt to provision access via separate front-facing software.
“Known processes attempt to manage access to a resource without managing the screened and vetted professionals that install them. Such systems fail to efficiently process the large data associated with these resources and services. They cannot manage multiple resources and the large data associated with them. As such, it is difficult to track progress and establish measurable objectives, making the monitoring processes meaningless. Adaptability and flexibility are a challenge for these systems, as many are custom-made and personalized to different end-users.”
In addition to obtaining background information on this patent, NewsRx editors also obtained the inventors’ summary information for this patent: “The disclosed resource management systems provide rich visualizations. The systems streamline processes across selections, procurement, and services using intelligent caching and proxies that simplify managing remote resources and large data. The systems generate graphically rich interactive screens that dynamically render project information over time through invisible mappings while guaranteeing financial commitments. The mappings establish associations between resource addresses for remote sources and remote destinations to local sources through intelligent caches and proxies. The invisible mappings re-direct what is usually served by remote sources via external requests to local sources. The systems create the impression that content is served independently through containers and computer framing, without the delay and bandwidth consumption that usually come with such technology.
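The invisible mapping the inventors describe can be pictured as a lookup table that associates remote resource addresses with local sources and is consulted before any external request is issued. The TypeScript sketch below is an editorial illustration of that idea only; the names (remoteToLocal, resolveSource, loadContent) and the example URLs are hypothetical and are not taken from the patent.

    // Associations between remote source addresses and local sources,
    // maintained by the cache/proxy layer (illustrative data only).
    type ResourceAddress = string;

    const remoteToLocal = new Map<ResourceAddress, ResourceAddress>([
      ["https://vendor.example.com/catalog/fixtures.json", "/local-cache/catalog/fixtures.json"],
      ["https://imagery.example.com/projects/123/site.jpg", "/local-cache/projects/123/site.jpg"],
    ]);

    // Re-direct a request that would normally be served by a remote source
    // to the local source when a mapping exists; otherwise fall back to the
    // original remote address.
    function resolveSource(requested: ResourceAddress): ResourceAddress {
      return remoteToLocal.get(requested) ?? requested;
    }

    // The interface keeps issuing the same external-looking requests; the
    // mapping silently serves them from the local source, avoiding the extra
    // network round trips and the associated delay.
    async function loadContent(requested: ResourceAddress): Promise<Response> {
      return fetch(resolveSource(requested));
    }

Under this reading, the screens remain unaware of the redirection, which is what makes the mapping effectively invisible to the end-user.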
“The systems provide alerts and status indicators while providing observations that end-users make electronically. In operation, some end-users have access to projects in their domain through desktop software and mobile apps, based on the system’s knowledge of its users. When access is granted, end-users coordinate services, enter observations, request assessments, establish analytics, track outcomes, track quality, and receive guarantees.
“To access objects that render content, connections are usually made between remote resources and local interfaces via remote requests and responses. Establishing network connections for each request/response for materials and services consumes network bandwidth and causes delay, as many data exchanges must occur before a request can be serviced. Further, when content must be collected from multiple remote resources, some resources include deep links that contain the memory location (address) of embedded content that may be served outside of the network domain. Some linked content is served by remote resources that redirect the user to compromised external environments not served by the intended origin server and/or network. Such link surfing may violate a same-origin server policy and/or a common domain security policy that some enterprise systems require. Subjecting users to domains outside of their demilitarized zones can create bottlenecks that cause some origin servers to stop responding to user requests while waiting for remote responses.
“Rather than requiring end-users to access multiple remote external resources when managing desired resources, the nationwide resource management system of FIG. 1 uses a declarative client 102 for data fetching, data retrieving, load tracking, error rate tracking, caching, and updating end-user and expert interfaces 104 and 116. When an end-user’s interface 104 request hits an application server through a secure transfer protocol such as a secure version of Hypertext Transfer Protocol (HTTPS), for example, a load balancer distributes the network traffic across multiple servers and/or server clusters. The request originates from the primary stack that requests services from the resources required to run and service an application. The resources may include a Web server, a database, and networking rules. Using a JavaScript library that supports the end-user interfaces, such as a React-based web user interface framework, the systems serve end-user interfaces (UIs) through UI components and guidelines that cover many components, from interface layouts to language selections, through a UI Toolkit 110 shown in FIG. 1. The systems provide several layout components, including those based on a Flexible Box Module, or flexbox, that may serve as a dimensional layout module that provides accessibility, modularity, responsiveness, and theming and further reflects color selections, option type selection, and layouts. An exemplary React-based framework uses Grommet in some alternate processes.
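As a concrete illustration of the React and flexbox layout components mentioned above, the following TypeScript (TSX) sketch uses the Grommet library’s Box primitive, which exposes flexbox layout through props such as direction, gap, and pad. The component name, theme values, and layout here are assumptions made for the example, not details from the patent.

    import React from "react";
    import { Grommet, Box, Heading, Text } from "grommet";

    // Theming reflects color and font selections, in the spirit of the
    // UI Toolkit description above (values are placeholders).
    const theme = {
      global: {
        colors: { brand: "#2e5bff" },
        font: { family: "Helvetica, sans-serif" },
      },
    };

    export function ProjectDashboard() {
      return (
        <Grommet theme={theme} full>
          {/* Box is Grommet's flexbox primitive: direction, gap, and pad
              map onto flexbox layout behavior. */}
          <Box direction="row" fill gap="small" pad="medium">
            <Box basis="1/4" background="light-2" pad="small">
              <Heading level={3}>Projects</Heading>
            </Box>
            <Box flex background="light-1" pad="small">
              <Text>Project status, alerts, and observations render here.</Text>
            </Box>
          </Box>
        </Grommet>
      );
    }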
“The application programming interface (API) that comprises a set of software routines used by the declarative client 102 is a graphical API. The declarative client 102 uses a normalized, in-memory cache to dramatically speed up the execution of queries. The cache normalizes the query results before saving them to memory by splitting results into individual objects, assigning unique identifiers to each object, and storing the objects in a flattened data structure associated with their unique identifiers in memory. A flattened data structure has no hierarchical order; no two files have the same name, even in different directories. A unique identifier may combine the objects’ names with a sequential operator designation and/or identifier and/or may specify the objects’ path with the associated query. The in-memory cache is a device used to store data temporarily and deliver data at a rate faster than the rate the data is received. It improves system performance by reducing the number of times the declarative client 102 must go through the relatively slow process of reading from and writing to a conventional memory.”
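The normalization step the inventors describe (splitting query results into individual objects, assigning each a unique identifier, and storing them in a flattened structure) can be sketched as follows in TypeScript. This is a minimal editorial sketch assuming objects carry __typename and id fields from which keys are formed, in the spirit of common GraphQL client caches; the function and variable names are hypothetical and not the patent’s implementation.

    // The flattened in-memory store: a single namespace with no hierarchy,
    // where no two entries share the same key.
    type Entity = { __typename: string; id: string; [field: string]: unknown };
    const flatCache = new Map<string, Entity>();

    // A unique identifier that combines the object's name (type) with its id.
    function cacheKey(obj: Entity): string {
      return `${obj.__typename}:${obj.id}`;
    }

    // Walk a query result, store each embedded object individually, and
    // replace nested objects with references to their cache keys.
    function normalize(result: unknown): unknown {
      if (Array.isArray(result)) return result.map(normalize);
      if (result && typeof result === "object") {
        const obj = result as Record<string, unknown>;
        const normalized: Record<string, unknown> = {};
        for (const [field, value] of Object.entries(obj)) {
          normalized[field] = normalize(value);
        }
        if (typeof obj.__typename === "string" && typeof obj.id === "string") {
          const key = cacheKey(obj as Entity);
          flatCache.set(key, normalized as Entity);
          return { __ref: key };   // reference replaces the nested object
        }
        return normalized;
      }
      return result;               // scalars pass through unchanged
    }

    // Example: one query result becomes two flat cache entries plus a
    // reference, so a repeated query can be answered from memory without
    // re-fetching from the remote source.
    normalize({
      __typename: "Project",
      id: "p-1",
      status: "in-progress",
      professional: { __typename: "Professional", id: "pro-9", name: "A. Smith" },
    });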
The claims supplied by the inventors are:
“1. A system that manages remote and local data comprising: a declarative client for retrieving data, tracking data loading, and caching data in response to a transmission of auto-generated queries from an end-user interface; the declarative client sitting on an immutable image served by web services of a secure private cloud platform; a serverless compute engine that receives the immutable image as a template in which a secure container is built and a plurality of tasks that process the immutable image within the secure container; an application programming interface that comprises software executed by the declarative client to extract data in response to the auto-generated queries from the end-user interface; and wherein the declarative client includes a normalized in-memory cache that breaks up results of the auto-generated queries into individual objects that are each associated with a unique identifier across different directories and a unique name within a local cache to speed up the execution of the auto-generated queries; wherein the extracted data comprises data extracted from deconstructed downloaded content in which original computer links between data elements are intercepted and mapped to redirected computer links that locate the downloaded content within a local centralized database.
“2. The system of claim 1 further comprising an autogenerated query builder formed by a query generator.
“3. The system of claim 2 where the query builder is based on a plurality of application models and a plurality of auto-generated queries comprise a GraphQL service.
“4. The system of claim 1 further comprising a software layer that optimizes a voice messaging service, a video messaging service, and a textual messaging service.
“5. The system of claim 1 further comprising a payment platform remote from the declarative client and the application programming interface that communicates directly with the interface and the application programming interface and pushes a token to the interface and the application programming interface against which payment is drawn.
“6. The system of claim 1 further comprising authentication and authorization servers that generate user pools that configure accessibility rights to the remote and local data.
“7. The system of claim 6 where the authentication and authorization servers render a two-way authentication that comprises global positioning data processed by an end-user’s device.
“8. The system of claim 1 where the extracted data comprises a referral of a service professional.
“9. The system of claim 1 where acceptance of a referral of a service professional occurs electronically through an end-user’s mobile device and a Web application.
“10. The system of claim 1 where acceptance of a quote from a service professional is based on a shared cryptographic secret stored on the system and on an end-user’s device.
“11. A non-transitory computer-readable storage medium having stored thereon a plurality of software instructions that, when executed by a hardware processor, causes: retrieving data and caching data in response to a transmission of queries from an end-user interface via a declarative client; the declarative client sitting on an immutable image served by web services of a secure cloud platform; receiving via a serverless compute engine the immutable image as a template in which a secure container is built and a plurality of tasks that process the immutable image within the secure container; and extracting data in response to the queries from the end-user interface via an application programming interface that comprises software executed by the declarative client; wherein the declarative client includes a normalized in-memory cache that breaks up results of the queries into individual objects that are stored individually in a memory and that are each associated with a unique identifier across different directories and a unique name within a local cache to speed up the execution of database queries; wherein the extracted data comprises data extracted from deconstructed downloaded content in which original computer-assigned links between data elements are intercepted and mapped automatically to redirected computer links that locate the downloaded content within a local centralized database.
“12. The non-transitory computer-readable medium of claim 11 further comprising an autogenerated query builder generated by a query generator.
“13. The non-transitory computer-readable medium of claim 12 where the query builder is based on a plurality of application models and a plurality of auto-generated queries comprise a GraphQL service.
“14. The non-transitory computer-readable medium of claim 11 further comprising optimizing a voice messaging service, a video messaging service, and a textual messaging service that is directly linked to an end-user.
“15. The non-transitory computer-readable medium of claim 11 further comprising pushing a nonreducible token to the interface and the application programming interface against which a payment is drawn.
“16. The non-transitory computer-readable medium of claim 11 further comprising generating user pools that configure accessibility rights to a remote data and a local data via authentication and authorization servers.
“17. The non-transitory computer-readable medium of claim 16 where the authentication and authorization servers render a two-way authentication comprising global positioning data rendered by an end-user’s device.
“18. The non-transitory computer-readable medium of claim 11 where the extracted data comprises a referral of a service professional.
“19. The non-transitory computer-readable medium of claim 11 where an acceptance of a referral of a service professional occurs only electronically through an end-user’s mobile device.
“20. The non-transitory computer-readable medium of claim 11 where an acceptance of a quote from a service professional is based on a shared cryptographic secret stored on the non-transitory computer-readable medium and on an end-user’s device.”
For more information, see this patent: Edwards, Sara. Resource management system. U.S. Patent Number 11645623.
(Our reports deliver fact-based news of research and discoveries from around the world.)