“Verification Of Privacy In A Shared Resource Environment” in Patent Application Approval Process (USPTO 20200151351)

Insurance Daily News

2020 MAY 28 (NewsRx) -- By a News Reporter-Staff News Editor at Insurance Daily News -- A patent application by the inventors Skourtis, Dimitrios (Santa Cruz, CA); Angel, Nathalie Baracaldo (San Jose, CA); Zhang, Rui (San Francisco, CA), filed on November 13, 2018, was made available online on May 14, 2020, according to news reporting originating from Washington, D.C., by NewsRx correspondents.

This patent application is assigned to International Business Machines Corporation (Armonk, New York, United States).

The following quote was obtained by the news editors from the background information supplied by the inventors: “The present embodiments relate to security associated with application service provider in a shared resource environment. More specifically, the embodiments relate to privacy assessment and privacy preservation as related to the application service provider and associated data.

“A data steward is an entity responsible for management and proficiency of stored data to ensure fitness of data elements. An example of a data steward is a hospital collecting information from multiple patients and medical professionals, where the data collected needs to be protected according to privacy legislation requirements. More specifically, a data steward is responsible for data processing, data policies, data guidelines and administration of information in compliance with policy and regulatory obligations. A data steward is also known as the data controller in certain legislation, such as the General Data Protection Regulation (GDPR). Thus, the data steward is an entity responsible for ensuring that security and privacy policies comply with regulatory and governance initiatives to manage privacy and confidentiality of data.

“The role and responsibilities of the data steward may include serving as a data custodian, which includes addressing classification of data and associated risk tolerance. The data steward is responsible for provisioning access to data, including reviewing and authorizing data access requests individually or defining a set of rules to determine eligibility for the access. For example, eligibility may be based on a business function, roles, etc. The data steward is an aspect within the information technology platform to ensure privacy and appropriate access to associated data. However, the data steward is one entity in a system of a plurality of entities. It is understood that with the development and growth of information technology and associated infrastructure, such as shared resources, there are locations where security of data and/or associated services may be compromised in a manner that is beyond the control of the data steward.

“While privacy legislation such as the Health Insurance Portability and Accountability Act (HIPAA) and GDPR imposes obligations on data stewards to protect the privacy of data owners, data stewards also make use of services to maintain data and to provide anonymized queries against private datasets. Because private data can be queried in an anonymized fashion, for example to perform research studies, data stewards need to ensure that they detect any potential data leakage and, should it arise, act promptly to stop it.”

In addition to the background information obtained for this patent application, NewsRx journalists also obtained the inventors’ summary information for this patent application: “The embodiments include a system, computer program product, and method for facilitating auditing of private data through collection and testing, computation of a privacy score based on auxiliary information and on specific characteristics of the anonymization service and the private data, and notification processing and delivery.

“In one aspect, a system is provided with a computer platform and one or more associated tools for managing privacy, and more specifically for assessment of privacy preservation. A processing unit is operatively coupled to memory and is in communication with a tool in the form of an Auditing and Privacy Verification Evaluator, hereinafter referred to as the Evaluator. The Evaluator functions to receive a preferred level of privacy for a computing resource. In addition, the Evaluator performs a confidence level assessment of candidate inferences, and from this assessment forms a set of inferred entities and selectively assigns individual candidate inferences to an inferred entity set. The Evaluator performs a privacy preservation assessment for the formed set. This assessment returns a privacy score that can be used as a leakage indicator. The Evaluator populates a data container with inferred entities that violate the preferred privacy level.
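The workflow attributed to the Evaluator above — filter candidate inferences by a confidence assessment, score the surviving set as a leakage indicator, and collect violating entities in a data container — can be illustrated with a short Python sketch. The application publishes no implementation, so every name here, the 0.5 confidence threshold, and the toy scoring formula are illustrative assumptions, not the patented method:

```python
from dataclasses import dataclass, field

@dataclass
class CandidateInference:
    entity_id: str    # entity the inference is about
    attribute: str    # private attribute inferred (e.g. "diagnosis")
    confidence: float # assessed confidence in [0, 1]

@dataclass
class Evaluator:
    preferred_privacy: float           # preferred privacy level in [0, 1]
    confidence_threshold: float = 0.5  # min confidence for a "real" inference
    container: list = field(default_factory=list)

    def form_inferred_set(self, candidates):
        """Selectively assign candidates to the inferred entity set,
        responsive to their confidence level assessment."""
        return [c for c in candidates if c.confidence >= self.confidence_threshold]

    def privacy_score(self, inferred, population_size):
        """Toy leakage indicator: fraction of the population NOT inferred.
        1.0 means no leakage detected; lower scores mean more leakage."""
        leaked = len({c.entity_id for c in inferred})
        return 1.0 - leaked / population_size

    def assess(self, candidates, population_size):
        inferred = self.form_inferred_set(candidates)
        score = self.privacy_score(inferred, population_size)
        if score < self.preferred_privacy:
            # populate the data container with entities violating the preference
            self.container.extend(inferred)
        return score

# Example: one high-confidence and one low-confidence candidate inference
cands = [CandidateInference("p1", "diagnosis", 0.9),
         CandidateInference("p2", "diagnosis", 0.2)]
ev = Evaluator(preferred_privacy=0.95)
score = ev.assess(cands, population_size=10)
```

With a preferred privacy level of 0.95, the single surviving inference yields a score of 0.9, which falls below the preference, so the container is populated with the violating entity.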

“In another aspect, a computer program device is provided to perform a privacy preservation assessment. The device has program code embodied therewith. The program code is executable by a processing unit to receive a preferred level of privacy for a computing resource. The program code performs a confidence level assessment of candidate inferences, and from this assessment forms a set of inferred entities and selectively assigns individual candidate inferences to an inferred entity set. Program code is also provided to perform a privacy preservation assessment for the formed set. This assessment returns a privacy score directed at a leakage indicator. A data container is provided operatively coupled to the program code and program code device. The program code populates the data container with inferred entities that violate the preferred privacy level.

“In yet another aspect, a method is provided for supporting and performing a privacy preservation assessment. A preferred level of privacy for a computing resource is received, and a confidence level assessment of candidate inferences is performed. From this assessment, a set of inferred entities is formed and individual candidate inferences are selectively assigned to an inferred entity set. Thereafter, a privacy preservation assessment is performed for the formed set. This assessment returns a privacy score directed at a leakage indicator. A data container is populated with inferred entities that violate the preferred privacy level.

“These and other features and advantages will become apparent from the following detailed description of the presently preferred embodiment(s), taken in conjunction with the accompanying drawings.”

The claims supplied by the inventors are:

“1. A system comprising: a processing unit operatively coupled to memory; an auditing and privacy verification evaluator in communication with the processing unit, the evaluator to automatically perform a privacy preservation assessment, the assessment including de-anonymization of an associated dataset and inferring private information, the evaluator to: receive a preferred level of privacy for service and data associated with a computing resource; form a set of inferred entities, including selectively assign individual candidate inferences to an inferred entity set responsive to a confidence level assessment of the individual candidate inferences; and perform a privacy preservation assessment for the formed set of inferred entities, including the performed assessment to return a privacy score, the score being a leakage indicator of private information associated with the service; and a data container operatively coupled to the evaluator, and the evaluator to populate the data container with the formed set of inferred entities violating the preferred privacy level.

“2. The system of claim 1, wherein the data container includes raw data corresponding to the formed set of inferred entities and the one or more candidate inferences, and further comprising the evaluator to compare the set of inferred entities to the raw data, and create an adjusted set of inferences.

“3. The system of claim 2, further comprising the evaluator to adjust the set of inferred entities, including remove a failed prediction from the formed set of inferred entities.

“4. The system of claim 3, further comprising the evaluator to re-compute the privacy score based on the adjusted set of inferred entities, including return an updated privacy score corresponding to the adjusted set of inferred entities.

“5. The system of claim 4, further comprising the evaluator to iteratively evaluate the candidate inferences based on the updated privacy score, and create a modified set of candidate inferences subject to the evaluation.
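Claims 2 through 5 describe a refinement loop: compare the inferred entities against the raw data held in the data container, remove failed predictions, and re-compute the privacy score. A minimal, self-contained sketch under the same caveat — the pair representation, the ground-truth dictionary, and the scoring formula are all illustrative, not taken from the application:

```python
def refine(inferred, raw_data, population_size):
    """Remove failed predictions (claim 3) and re-compute the
    privacy score for the adjusted set (claim 4).

    inferred: list of (entity_id, predicted_value) pairs.
    raw_data: dict entity_id -> true value, as held in the data container.
    """
    # Keep only predictions the raw data confirms; the rest are failed
    adjusted = [(eid, val) for eid, val in inferred
                if raw_data.get(eid) == val]
    leaked = len({eid for eid, _ in adjusted})
    updated_score = 1.0 - leaked / population_size  # toy leakage indicator
    return adjusted, updated_score

# Example: one of two inferences is contradicted by the raw data
inferred = [("p1", "flu"), ("p2", "asthma")]
raw = {"p1": "flu", "p2": "diabetes"}
adjusted, score = refine(inferred, raw, population_size=10)
```

Here the failed prediction for `p2` is dropped, leaving one confirmed inference and an updated score of 0.9 for the adjusted set.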

“6. The system of claim 1, further comprising the evaluator to periodically perform the privacy preservation assessment responsive to a substantive modification of an element selected from the group consisting of: a dataset, a privacy method change, and an explicit request from a cloud service steward.

“7. The system of claim 1, further comprising the evaluator to selectively add an entity to the formed set of inferred entities, the selective addition directed at external data and occurring prior to receipt of the privacy score.

“8. The system of claim 1, wherein the assessment utilizes more than one test, and further comprising the evaluator to select one or more of the tests responsive to a characteristic of the associated dataset.

“9. A computer program product to automatically perform a privacy preservation assessment against a shared pool of configurable computing resources and associated data, the assessment including de-anonymization of an associated dataset and inferring private information, the program code executable by a processing unit to: receive a preferred level of privacy for service and data associated with a computing resource; form a set of inferred entities, including selectively assign individual candidate inferences to an inferred entity set responsive to a confidence level assessment of the individual candidate inferences; and perform a privacy preservation assessment for the formed set of inferred entities, including the performed assessment to return a privacy score, the score being a leakage indicator of private information associated with the service; and a data container operatively coupled to the program code, including the program code to selectively populate the data container with the formed set of inferred entities violating the preferred privacy level.

“10. The computer program product of claim 9, wherein the data container includes raw data corresponding to the formed set of inferred entities and the one or more candidate inferences, and further comprising program code to compare the set of inferred entities to the raw data, and create an adjusted set of inferences.

“11. The computer program product of claim 10, further comprising program code to adjust the set of inferred entities, including remove a failed prediction from the formed set of inferred entities.

“12. The computer program product of claim 11, further comprising program code to re-compute the privacy score based on the adjusted set of inferred entities, including return an updated privacy score corresponding to the adjusted set of inferred entities.

“13. The computer program product of claim 11, further comprising program code to iteratively evaluate the candidate inferences based on the updated privacy score, and create a modified set of candidate inferences subject to the evaluation.

“14. The computer program product of claim 9, further comprising program code to periodically perform the privacy preservation assessment responsive to a substantive modification of an element selected from the group consisting of: a dataset, a privacy method change, and an explicit request from a cloud service steward.

“15. The computer program product of claim 9, further comprising program code to selectively add an entity to the formed set of inferred entities, the selective addition directed at external data and occurring prior to receipt of the privacy score.

“16. A method comprising: automatically performing a privacy preservation assessment against a shared pool of configurable computing resources and associated data, the assessment including de-anonymizing an associated dataset and inferring private information, including: receiving a preferred level of privacy for service and data associated with computing resources; forming a set of inferred entities, including selectively assigning individual candidate inferences to an inferred entity set responsive to a confidence level assessment of the individual candidate inferences; and performing a privacy preservation assessment for the formed set of inferred entities, including returning a privacy score, the score being a leakage indicator of private information associated with the service; and selectively populating a data container with the formed set of inferred entities responsive to the returned privacy score violating the preferred privacy level.

“17. The method of claim 16, wherein the data container includes raw data corresponding to the formed set of inferred entities and the one or more candidate inferences, and further comprising comparing the set of inferred entities to the raw data, and creating an adjusted set of inferences.

“18. The method of claim 17, further comprising adjusting the set of inferred entities, including removing a failed prediction from the formed set of inferred entities.

“19. The method of claim 18, further comprising re-computing the privacy score based on the adjusted set of inferred entities, including returning an updated privacy score corresponding to the adjusted set of inferred entities.

“20. The method of claim 18, further comprising iteratively evaluating the candidate inferences based on the updated privacy score, and creating a modified set of candidate inferences subject to the evaluation.”
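Claims 6 and 14 enumerate the events that trigger a renewed assessment: a substantive dataset modification, a privacy method change, or an explicit steward request. A hedged sketch of that dispatch in Python (the enum and function names are hypothetical):

```python
from enum import Enum, auto

class Trigger(Enum):
    """Events enumerated in claims 6 and 14."""
    DATASET_MODIFIED = auto()        # substantive modification of a dataset
    PRIVACY_METHOD_CHANGED = auto()  # anonymization/privacy method change
    STEWARD_REQUEST = auto()         # explicit request from a cloud service steward

def should_reassess(trigger: Trigger) -> bool:
    """Re-run the privacy preservation assessment when any of the
    enumerated events occurs."""
    return trigger in (Trigger.DATASET_MODIFIED,
                       Trigger.PRIVACY_METHOD_CHANGED,
                       Trigger.STEWARD_REQUEST)
```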

For the URL and more information on this patent application, see: Skourtis, Dimitrios; Angel, Nathalie Baracaldo; Zhang, Rui. Verification Of Privacy In A Shared Resource Environment. Filed November 13, 2018 and posted May 14, 2020. Patent URL: http://appft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220200151351%22.PGNR.&OS=DN/20200151351&RS=DN/20200151351

(Our reports deliver fact-based news of research and discoveries from around the world.)
