“Verification Of Privacy In A Shared Resource Environment” in Patent Application Approval Process (USPTO 20200151351)
2020 MAY 28 (NewsRx) --
This patent application is assigned to
The following quote was obtained by the news editors from the background information supplied by the inventors: “The present embodiments relate to security associated with an application service provider in a shared resource environment. More specifically, the embodiments relate to privacy assessment and privacy preservation as related to the application service provider and associated data.
“A data steward is an entity responsible for the management and proficiency of stored data to ensure fitness of data elements. An example of a data steward is a hospital collecting information from multiple patients and medical professionals, where the collected data must be protected according to privacy and legislative requirements. More specifically, a data steward is responsible for data processing, data policies, data guidelines, and administration of information in compliance with policy and regulatory obligations. A data steward is also known as the data controller in certain legislation, such as the General Data Protection Regulation (GDPR). Thus, the data steward is the entity responsible for ensuring that security and privacy policies comply with regulatory and governance initiatives to manage the privacy and confidentiality of data.
“The role and responsibilities of the data steward may include serving as a data custodian, which includes addressing classification of data and associated risk tolerance. The data steward is responsible for provisioning access to data, including reviewing and authorizing data access requests individually or defining a set of rules to determine eligibility for the access. For example, eligibility may be based on a business function, roles, etc. The data steward is an aspect within the information technology platform to ensure privacy and appropriate access of associated data. However, the data steward is one entity in a system of a plurality of entities. It is understood that with the development and growth of information technology and associated infrastructure, such as shared sources, there are locations where security of data and/or associated services may be compromised in a manner that is beyond the control of the data steward.
“While privacy legislation such as the Health Insurance Portability and Accountability Act (HIPAA) and GDPR imposes obligations on data stewards to protect the privacy of data owners, data stewards also make use of services to maintain data and help provide anonymized queries to private datasets. Because private data can be queried in an anonymized fashion, for example to perform research studies, data stewards need to ensure they detect any potential data leakages and, should any arise, act promptly to stop them.”
In addition to the background information obtained for this patent application, NewsRx journalists also obtained the inventors’ summary information for this patent application: “The embodiments include a system, computer program product, and method for facilitating auditing of private data through the collection and testing of data, computation of a privacy score based on auxiliary information and specific characteristics of the private data and of the service used for anonymization, and notification processing and delivery.
“In one aspect, a system is provided with a computer platform and one or more associated tools for managing privacy, and more specifically for assessment of privacy preservation. A processing unit is operatively coupled to memory and is in communication with a tool in the form of an Auditing and Privacy Verification Evaluator, hereinafter referred to as the Evaluator. The Evaluator functions to receive a preferred level of privacy for a computing resource. In addition, the Evaluator performs a confidence level assessment of candidate inferences, and from this assessment forms a set of inferred entities and selectively assigns individual candidate inferences to an inferred entity set. The Evaluator performs a privacy preservation assessment for the formed set. This assessment returns a privacy score that can be used as a leakage indicator. The Evaluator populates a data container with inferred entities that violate the preferred privacy level.
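The Evaluator flow described above — receive a preferred privacy level, filter candidate inferences by confidence, compute a privacy score as a leakage indicator, and populate a data container with violating entities — can be sketched in Python. The application discloses no implementation; every name, threshold, and the toy scoring formula below are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class CandidateInference:
    """One guessed piece of private information about an entity (hypothetical shape)."""
    entity_id: str
    attribute: str
    value: str
    confidence: float  # assessed confidence, 0.0 .. 1.0

@dataclass
class Evaluator:
    """Sketch of an Auditing and Privacy Verification Evaluator (not the patent's code)."""
    preferred_privacy: float           # minimum acceptable privacy score
    confidence_threshold: float = 0.8  # assumed cutoff for accepting an inference
    violations: list = field(default_factory=list)  # stands in for the "data container"

    def form_inferred_set(self, candidates):
        # Selectively assign candidates to the inferred-entity set based on confidence.
        return [c for c in candidates if c.confidence >= self.confidence_threshold]

    def privacy_score(self, inferred, population_size):
        # Toy leakage indicator: fraction of the population that was NOT inferred.
        if population_size == 0:
            return 1.0
        return 1.0 - len({c.entity_id for c in inferred}) / population_size

    def assess(self, candidates, population_size):
        # Perform the privacy preservation assessment for the formed set.
        inferred = self.form_inferred_set(candidates)
        score = self.privacy_score(inferred, population_size)
        if score < self.preferred_privacy:
            # Populate the data container with entities violating the preferred level.
            self.violations.extend(inferred)
        return score
```

For example, with `preferred_privacy=0.9` and one high-confidence inference against a population of five, the score falls below the preferred level and the inferred entity lands in the container.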
“In another aspect, a computer program device is provided to perform a privacy preservation assessment. The device has program code embodied therewith. The program code is executable by a processing unit to receive a preferred level of privacy for a computing resource. The program code performs a confidence level assessment of candidate inferences, and from this assessment forms a set of inferred entities and selectively assigns individual candidate inferences to an inferred entity set. Program code is also provided to perform a privacy preservation assessment for the formed set. This assessment returns a privacy score directed at a leakage indicator. A data container is provided operatively coupled to the program code and program code device. The program code populates the data container with inferred entities that violate the preferred privacy level.
“In yet another aspect, a method is provided for supporting and performing a privacy preservation assessment. A preferred level of privacy for a computing resource is received, and a confidence level assessment of candidate inferences is performed. From this assessment, a set of inferred entities is formed and individual candidate inferences are selectively assigned to an inferred entity set. Thereafter, a privacy preservation assessment is performed for the formed set. This assessment returns a privacy score directed at a leakage indicator. A data container is populated with inferred entities that violate the preferred privacy level.
“These and other features and advantages will become apparent from the following detailed description of the presently preferred embodiment(s), taken in conjunction with the accompanying drawings.”
The claims supplied by the inventors are:
“1. A system comprising: a processing unit operatively coupled to memory; an auditing and privacy verification evaluator in communication with the processing unit, the evaluator to automatically perform a privacy preservation assessment, the assessment including de-anonymization of an associated dataset and inferring private information, the evaluator to: receive a preferred level of privacy for service and data associated with a computing resource; form a set of inferred entities, including selectively assign individual candidate inferences to an inferred entity set responsive to a confidence level assessment of the individual candidate inferences; and perform a privacy preservation assessment for the formed set of inferred entities, including the performed assessment to return a privacy score, the score being a leakage indicator of private information associated with the service; and a data container operatively coupled to the evaluator, and the evaluator to populate the data container with the formed set of inferred entities violating the preferred privacy level.
“2. The system of claim 1, wherein the data container includes raw data corresponding to the formed set of inferred entities and the one or more candidate inferences, and further comprising the evaluator to compare the set of inferred entities to the raw data, and create an adjusted set of inferences.
“3. The system of claim 2, further comprising the evaluator to adjust the set of inferred entities, including remove a failed prediction from the formed set of inferred entities.
“4. The system of claim 3, further comprising the evaluator to re-compute the privacy score based on the adjusted set of inferred entities, including return an updated privacy score corresponding to the adjusted set of inferred entities.
“5. The system of claim 4, further comprising the evaluator to iteratively evaluate the candidate inferences based on the updated privacy score, and create a modified set of candidate inferences subject to the evaluation.
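Claims 2 through 5 describe an adjustment loop: compare the inferred entities against raw data in the container, drop failed predictions, and recompute the privacy score over the adjusted set. A minimal sketch of that step, with all names and the scoring formula assumed for illustration:

```python
from collections import namedtuple

# Hypothetical shape of a candidate inference; field names are illustrative.
Inference = namedtuple("Inference", ["entity_id", "attribute", "value"])

def adjust_and_rescore(inferred, raw_data, population_size):
    """Remove predictions contradicted by the raw data, then recompute the score.

    raw_data maps entity_id -> {attribute: true_value}. The leakage indicator
    (share of the population not successfully inferred) is a toy stand-in,
    not the patent's scoring method.
    """
    adjusted = [c for c in inferred
                if raw_data.get(c.entity_id, {}).get(c.attribute) == c.value]
    if population_size == 0:
        return adjusted, 1.0
    score = 1.0 - len({c.entity_id for c in adjusted}) / population_size
    return adjusted, score
```

Under claim 5, this function would be invoked iteratively: each updated score drives another evaluation of the remaining candidate inferences until the set stabilizes.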
“6. The system of claim 1, further comprising the evaluator to periodically perform the privacy preservation assessment responsive to a substantive modification of an element selected from the group consisting of: a dataset, a privacy method change, and an explicit request from a cloud service steward.
“7. The system of claim 1, further comprising the evaluator to selectively add an entity to the formed set of inferred entities, the selective addition directed at external data and occurring prior to receipt of the privacy score.
“8. The system of claim 1, wherein the assessment utilizes more than one test, and further comprising the evaluator to select one or more of the tests responsive to a characteristic of the associated dataset.
“9. A computer program product to automatically perform a privacy preservation assessment against a shared pool of configurable computing resources and associated data, the assessment including de-anonymization of an associated dataset and inferring private information, the program code executable by a processing unit to: receive a preferred level of privacy for service and data associated with a computing resource; form a set of inferred entities, including selectively assign individual candidate inferences to an inferred entity set responsive to a confidence level assessment of the individual candidate inferences; and perform a privacy preservation assessment for the formed set of inferred entities, including the performed assessment to return a privacy score, the score being a leakage indicator of private information associated with the service; and a data container operatively coupled to the program code, including the program code to selectively populate the data container with the formed set of inferred entities violating the preferred privacy level.
“10. The computer program product of claim 9, wherein the data container includes raw data corresponding to the formed set of inferred entities and the one or more candidate inferences, and further comprising program code to compare the set of inferred entities to the raw data, and create an adjusted set of inferences.
“11. The computer program product of claim 10, further comprising program code to adjust the set of inferred entities, including remove a failed prediction from the formed set of inferred entities.
“12. The computer program product of claim 11, further comprising program code to re-compute the privacy score based on the adjusted set of inferred entities, including return an updated privacy score corresponding to the adjusted set of inferred entities.
“13. The computer program product of claim 11, further comprising program code to iteratively evaluate the candidate inferences based on the updated privacy score, and create a modified set of candidate inferences subject to the evaluation.
“14. The computer program product of claim 9, further comprising program code to periodically perform the privacy preservation assessment responsive to a substantive modification of an element selected from the group consisting of: a dataset, a privacy method change, and an explicit request from a cloud service steward.
“15. The computer program product of claim 9, further comprising program code to selectively add an entity to the formed set of inferred entities, the selective addition directed at external data and occurring prior to receipt of the privacy score.
“16. A method comprising: automatically performing a privacy preservation assessment against a shared pool of configurable computing resources and associated data, the assessment including de-anonymizing an associated dataset and inferring private information, including: receiving a preferred level of privacy for service and data associated with computing resources; forming a set of inferred entities, including selectively assigning individual candidate inferences to an inferred entity set responsive to a confidence level assessment of the individual candidate inferences; and performing a privacy preservation assessment for the formed set of inferred entities, including returning a privacy score, the score being a leakage indicator of private information associated with the service; and selectively populating a data container with the formed set of inferred entities responsive to the returned privacy score violating the preferred privacy level.
“17. The method of claim 16, wherein the data container includes raw data corresponding to the formed set of inferred entities and the one or more candidate inferences, and further comprising comparing the set of inferred entities to the raw data, and creating an adjusted set of inferences.
“18. The method of claim 17, further comprising adjusting the set of inferred entities, including removing a failed prediction from the formed set of inferred entities.
“19. The method of claim 18, further comprising re-computing the privacy score based on the adjusted set of inferred entities, including returning an updated privacy score corresponding to the adjusted set of inferred entities.
“20. The method of claim 18, further comprising iteratively evaluating the candidate inferences based on the updated privacy score, and creating a modified set of candidate inferences subject to the evaluation.”
For the URL and more information on this patent application, see: Skourtis, Dimitrios; Angel,
(Our reports deliver fact-based news of research and discoveries from around the world.)


