Patent Issued for Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance (USPTO 11025675)
2021 JUN 21 (NewsRx) -- By a
Patent number 11025675 is assigned to
The following quote was obtained by the news editors from the background information supplied by the inventors: “Over the past years, privacy and security policies, and related operations, have become increasingly important. Breaches in security, leading to the unauthorized access of personal data (which may include sensitive personal data), have become more frequent among companies and other organizations of all sizes. Such personal data may include, but is not limited to, personally identifiable information (PII), which may be information that directly (or indirectly) identifies an individual or entity. Examples of PII include names, addresses, dates of birth, social security numbers, and biometric identifiers such as a person’s fingerprints or picture. Other personal data may include, for example, customers’ Internet browsing habits, purchase history, or even their preferences (i.e., likes and dislikes, as provided or obtained through social media). While not all personal data may be sensitive, in the wrong hands this kind of information may have a negative impact on the individuals or entities whose sensitive personal data is collected, including identity theft and embarrassment. Not only does such a breach have the potential of exposing individuals to malicious wrongdoing, but the fallout from such breaches may also result in damage to reputation, potential liability, and costly remedial action for the organizations that collected the information and that were under an obligation to maintain its confidentiality and security. These breaches may result not only in financial loss, but also in a loss of credibility, confidence, and trust from individuals, stakeholders, and the public.
“Many organizations that obtain, use, and transfer personal data, including sensitive personal data, have begun to address these privacy and security issues. To manage personal data, many companies have attempted to implement operational policies and processes that comply with legal requirements, such as Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) or the U.S.’s Health Insurance Portability and Accountability Act (HIPAA), which protects a patient’s medical information. The European Union’s General Data Protection Regulation (GDPR) may fine companies up to 4% of their global worldwide turnover (revenue) for not complying with its regulations (companies must comply by
“Many regulators recommend conducting privacy impact assessments, or data protection risk assessments along with data inventory mapping. For example, the GDPR requires data protection impact assessments. Additionally, the United Kingdom ICO’s office provides guidance around privacy impact assessments. The OPC in
“For many companies handling personal data, privacy audits, whether done according to AICPA Generally Accepted Privacy Principles, or ISACA’s IT Standards, Guidelines, and Tools and Techniques for Audit Assurance and Control Professionals, are not just a best practice, they are a requirement (for example,
“Many of these breaches have their roots in vulnerabilities that may be found in software applications, websites, or other computer code that collect, use, and process personal data. The computer code may be an in-house application or solution, or one provided by a third party. When an organization’s auditors or privacy team members conduct a privacy audit or assessment, they typically direct questions to software developers in an attempt to obtain the answers they need to address compliance with privacy standards. Unfortunately, the auditors and developers do not always use the same vernacular or technical language. As an example, auditors might ask a developer, “List for me all the personal data that you collect,” or “Are you using any third-party code?” A developer, when responding, might, for example, not understand that a user’s IP address is considered personal data, especially under some laws. A developer might also not understand that third-party code includes, for example, snippets of HTML that pull in a library from Google’s hosted libraries, or the use of other software development kits (SDKs). With the multitude of questions asked during the audit process, this disconnect or language barrier may leave vulnerabilities in place: auditors may ask many questions, but because the language barrier keeps them from obtaining the right answers, many privacy-related issues are never identified or resolved.
“In light of the above, there is currently a need for improved systems and methods for assessing mobile applications, websites, and other computer code for features and conditions that may have an impact on a company’s compliance with privacy standards.”
In addition to the background information obtained for this patent, NewsRx journalists also obtained the inventors’ summary information for this patent: “According to exemplary embodiments, a system for operationalizing privacy compliance is described herein. The system may comprise one or more servers and client computing devices that execute one or more software modules that perform functions and methods related to the input, processing, storage, retrieval, and display of campaign data related to a privacy campaign. A privacy campaign may be any business function, system, product, technology, process, project, engagement, initiative, campaign, etc., that may utilize personal data collected from one or more persons or entities. Campaign data may be data representative of one or more attributes related to the personal data collected as part of the campaign.
“A computer-implemented data processing system and method is operable for electronically receiving the input of campaign data associated with a privacy campaign, and electronically calculating a risk level for the privacy campaign based on the campaign data.
“The system is operable for displaying on a graphical user interface (GUI) a prompt to create an electronic record for a privacy campaign. The system receives a command to create an electronic record for the privacy campaign, creates an electronic record for the privacy campaign and digitally stores the record. The system presents on one or more graphical user interfaces a plurality of prompts for the input of campaign data related to the privacy campaign. It electronically receives any campaign data input by one or more users. The privacy campaign data may relate to a description of the campaign, one or more types of personal data related to the campaign, a subject from which the personal data was collected, the storage of the personal data, and access to the personal data.
“The system processes the campaign data by electronically associating the campaign data with the record for the privacy campaign, and digitally storing the campaign data associated with the record for the campaign.
“Using a microprocessor, the system calculates a “Risk Level” for the campaign based on the campaign data, electronically associates the risk level with the record for the campaign; and digitally stores the risk level associated with the record for the campaign (e.g., in a storage device such as a networked hard drive, a cloud drive, the hard drive of one or more computing devices, etc.).
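The record-and-storage flow described above can be pictured with a brief sketch. This is purely illustrative and not part of the patent text; every class, field, and function name below is a hypothetical placeholder.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical illustration of the electronic record for a privacy campaign described
# above: campaign data is associated with the record, and a calculated risk level is
# later stored alongside it. Field names are placeholders, not taken from the patent.

@dataclass
class CampaignRecord:
    name: str
    description: str = ""
    personal_data_types: List[str] = field(default_factory=list)  # e.g., names, IP addresses
    data_subjects: List[str] = field(default_factory=list)        # whom the data was collected from
    storage_location: str = ""                                     # where the personal data is stored
    who_has_access: List[str] = field(default_factory=list)
    risk_level: Optional[float] = None                             # filled in after calculation


def store_record(record: CampaignRecord, datastore: dict) -> None:
    """Digitally store the record; the dict stands in for a networked or cloud drive."""
    datastore[record.name] = record
```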
“The users of the system may be an owner of the campaign, who may be a privacy officer (i.e., personnel working in an organization under the Chief Privacy Officer). The privacy officer may input an initial portion of the campaign data, such as the name of the campaign, the description of the campaign, and the business group responsible for administering the privacy operations related to that campaign.
“The system is also operable for accepting an input to add one or more collaborators designated by one or more users (who may be an owner) to input campaign data. Once a user designates a collaborator, who may be another owner or a business office representative, the system sends an electronic message to the collaborator regarding the addition of the collaborator for adding campaign data for the privacy campaign (e.g., letting the collaborator know that he has been added to the campaign, providing him with system login details, responsibilities, and deadlines for providing his portion of the campaign data). The collaborator may be designated to input different portions of the campaign data. The collaborator may be designated to provide input for one or more prompts, including one or more questions. The collaborator may be designated to provide input for part of a question.
“The system is operable for accepting one or more inputs of campaign data from the users, who may be owners or collaborator(s), and for any campaign data that has been added, the system electronically associates the campaign data received from the input of the users with the record of the campaign, and digitally stores the campaign data received from the input of the collaborator with the record for the campaign (again, this may also be in a storage device such as a networked hard drive, a cloud drive, the hard drive of one or more computing devices, etc.).
“The system can collect this campaign data by presenting a plurality of prompts for inputting the campaign data to the users (who may be a privacy officer, a business rep, or other collaborators). The prompts may be presented through a series of computer-generated GUIs (for example, webpages), wherein each GUI displays one or more of the prompts for campaign data, and wherein each GUI page is presented one at a time (e.g., in a screen by screen manner, for example, in five phases as shown in FIGS. 8 through 14). One or more graphical user interface pages may be an online form comprising one or more fields in which a user can input data. The graphical user interface may have a visually displayed shape (such as a circle) that can be filled by the user to input data. The graphical user interface may have a drop-down menu from which a user can select an item.
“Also to facilitate collaboration, a computer implemented method may be operable for instantiating a real-time communication session overlaying a portion of a user interface. One or more GUI pages having prompts may display an indicator (e.g., the “comment button” shown in FIGS. 9 through 13), wherein if the indicator is selected, it retrieves a list of one or more collaborators associated with at least one record related to the information displayed on the online graphical user interface, wherein the list of collaborators includes the user and at least one other person. The system then electronically instantiates a real-time communication session (e.g., an instant messaging session, a chat session, etc.) between the user and the one or more collaborators in a computer-generated window, wherein the window having the real-time communications session overlays the online graphical user interface, covering at least a portion of the graphical user interface.
“When the user responds to the prompts and enters inputs (for example, through fields, drop down menus, check boxes, radial selections), the system may be operable to automatically populate one or more fields based on the data input history of the user. The system may also be operable to automatically populate one or more fields for the entry of data inputs based on the type of campaign data entered from a previous input (e.g., if the input is related to personal data, the check boxes commonly used for personal data can be automatically checked. See, e.g., FIG. 10). Based on the input of campaign data received from the one or more users, the system can present further prompts related to the campaign data that was input (e.g., if a user selects a box that indicates that the personal information being collected includes personal data, the user can be presented with another dialog with more selections related to personal data. See, e.g., FIG. 10).
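One plausible reading of the auto-population and follow-up-prompt behavior described above is sketched below; the function and field names are hypothetical, and the logic is only an assumption about how such conditional prompting could work.

```python
# Hypothetical sketch of conditional prompting: fields are pre-populated from the
# user's input history, and follow-up prompts depend on answers already given
# (analogous to the personal-data dialog referenced in FIG. 10). Names are illustrative.

def next_prompts(answers: dict, input_history: dict) -> dict:
    # Pre-populate any field the user has answered before but not in this session.
    defaults = {
        field_name: previous_value
        for field_name, previous_value in input_history.items()
        if field_name not in answers
    }

    prompts = {}
    if answers.get("collects_personal_data"):
        # Present further selections related to personal data.
        prompts["personal_data_types"] = [
            "name", "date of birth", "IP address", "browsing history",
        ]

    return {"defaults": defaults, "prompts": prompts}


# Example: a user indicates personal data is collected, so a follow-up prompt appears.
print(next_prompts({"collects_personal_data": True}, {"business_group": "Internet"}))
```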
“The system is also operable for sending reminders. If required campaign data has not been received, the system sends one or more electronic notifications that indicates that required campaign data has not yet been provided, thereby facilitating the gathering of different portions of information from one or more collaborators until all the required campaign data for a privacy campaign has been input.
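The reminder step could be sketched roughly as follows; the required-field list and the notify callback are assumptions made for illustration, not details from the patent.

```python
# Hypothetical sketch of the reminder step: find required campaign data that has not
# yet been provided and send an electronic notification to the responsible collaborators.

REQUIRED_FIELDS = ["description", "personal_data_types", "storage_location", "who_has_access"]

def send_reminders(campaign_data: dict, collaborators: dict, notify) -> list:
    """notify(recipient, message) stands in for the system's electronic notification."""
    missing = [f for f in REQUIRED_FIELDS if not campaign_data.get(f)]
    for field_name in missing:
        for person in collaborators.get(field_name, []):
            notify(person, f"Required campaign data '{field_name}' has not yet been provided.")
    return missing
```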
“The system is operable to use the campaign data input into the system to calculate a “Risk Level”. The system electronically retrieves from a database the campaign data associated with the record for the campaign, electronically determines a plurality of “weighting factors” for the campaign, wherein the plurality of weighting factors are based upon a number of factors including the nature of the personal data associated with the campaign, the physical location of the personal data associated with the campaign, the number of individuals having access to the personal data associated with the campaign, the length of time that the personal data associated with the campaign will be retained in storage, the type of individual from which the personal data associated with the campaign originated, and the country of residence of the individual from which the personal data associated with the campaign originated. Each weighting factor is electronically assigned a higher numerical value if the risk associated with the factor is higher.
“In addition to determining the weighting factors, the system electronically assigns a “relative risk rating” for each of the plurality of factors. Based on the weighting factors and the relative risk rating for each of the plurality of factors, the system electronically calculates a risk level for the campaign. The system may use an algorithm to make this calculation; for example, the Risk Level may be electronically calculated as the sum, over all factors, of each weighting factor multiplied by the relative risk rating of that factor (i.e., Risk Level for campaign = (Weighting Factor of Factor 1)*(Relative Risk Rating of Factor 1) + (Weighting Factor of Factor 2)*(Relative Risk Rating of Factor 2) + … + (Weighting Factor of Factor N)*(Relative Risk Rating of Factor N)).
“The system may also determine an Overall Risk Assessment for the campaign and digitally store the Overall Risk Assessment with the record for the campaign, wherein the Overall Risk Assessment is determined based upon a plurality of numerical ranges of risk levels (e.g., (1) a campaign having a Risk Level of 1-7 is “low risk,” (2) campaigns with a Risk Level of 8-15 are “medium risk,” and (3) campaigns with a Risk Level of over 16 are “high risk”).”
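Taken together, the Risk Level formula and the Overall Risk Assessment ranges above amount to a weighted sum followed by a bucketing step. A minimal sketch, assuming illustrative weighting factors and relative risk ratings (none of the numbers below come from the patent), might look like this:

```python
# Minimal sketch of the Risk Level calculation and Overall Risk Assessment described
# above. The factors, weights, ratings, and exact range boundaries are illustrative.

def risk_level(weighting_factors: dict, relative_risk_ratings: dict) -> float:
    """Risk Level = sum over all factors of (weighting factor) * (relative risk rating)."""
    return sum(weighting_factors[f] * relative_risk_ratings[f] for f in weighting_factors)


def overall_risk_assessment(level: float) -> str:
    """Bucket the Risk Level into the example ranges given in the summary."""
    if level <= 7:
        return "low risk"
    if level <= 15:
        return "medium risk"
    return "high risk"


# Example with hypothetical factors:
weights = {"nature_of_data": 3, "storage_location": 2, "retention_period": 1}
ratings = {"nature_of_data": 4, "storage_location": 2, "retention_period": 3}
level = risk_level(weights, ratings)           # 3*4 + 2*2 + 1*3 = 19
print(level, overall_risk_assessment(level))   # 19 high risk
```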
There is additional summary information. Please visit the full patent to read further.
The claims supplied by the inventors are:
“1. A computer system for electronically analyzing computer code to generate a data map, the computer system comprising: one or more computer processors; and computer memory operatively coupled to the one or more computer processors, wherein the computer system is configured for: receiving, from a particular user, by the one or more computer processors, a request to generate a privacy-related data map for particular computer code; at least partially in response to receiving the request: determining, by the one or more computer processors, a location of the particular computer code; automatically obtaining, by the one or more computer processors, the particular computer code based on the determined location; automatically electronically analyzing the particular computer code, by the one or more computer processors, to determine one or more privacy-related attributes of the particular computer code, each of the one or more privacy-related attributes indicating one or more types of personal information that the particular computer code collects or accesses, by: connecting to an application executing on one or more remote computing devices using an application programming interface; scanning one or more data repositories on the one or more remote computing devices to identify one or more data attributes, wherein the one or more data attributes are associated with a processing activity, and wherein the processing activity is associated with the particular computer code; analyzing the one or more data attributes and correlating metadata for the scanned one or more data repositories with particular attributes of the one or more data attributes discovered in the one or more data repositories; and determining, based at least in part on analyzing the one or more data attributes and correlating the metadata for the scanned one or more data repositories with the particular attributes of the one or more data attributes, one or more of the one or more privacy-related attributes of the particular computer code; and electronically generating, by the one or more computer processors, a data map of the one or more privacy-related attributes; digitally storing, by the one or more computer processors, the data map in the computer memory; and electronically displaying, by the one or more computer processors, the data map to the particular user.
“2. The computer system of claim 1, wherein electronically generating the data map comprises: analyzing the particular computer code to identify a storage location of data comprising the one or more types of personal information; retrieving the data from the storage location; and generating a visual representation of the particular computer code that includes the data.
“3. The computer system of claim 1, wherein electronically generating the data map comprises: analyzing the particular computer code to identify a storage location of data comprising the one or more types of personal information; retrieving the data from the storage location; after retrieving the data from the storage location, identifying one or more pieces of the data that comprise a particular type of the one or more types of personal information; and generating a visual representation of the particular type of the one or more types of personal information that includes the one or more pieces of the data that comprise the particular type of the one or more types of personal information.
“4. The computer system of claim 1, wherein the computer system is further configured for: receiving an indication that the particular computer code has been modified; at least partially in response to receiving the indication, analyzing the particular computer code to identify one or more changes in the one or more privacy-related attributes of the particular computer code; and modifying the data map to reflect the identified one or more changes.
“5. The computer system of claim 4, wherein the computer system is further configured for: continuously modifying the data map based at least in part on one or more additional changes identified, by the one or more processors, in response to receiving one or more additional indications that the particular computer code has been modified.
“6. The computer system of claim 1, wherein the one or more privacy-related attributes further identify a storage location of one or more pieces of personal information of the one or more types of personal information that the particular computer code collects or accesses.
“7. A non-transitory computer-readable medium storing computer-executable instructions for: receiving, from a particular user, a request to generate a data map for one or more privacy-related attributes of a piece of computer code, the request comprising one or more criteria; determining a location of the piece of computer code; automatically obtaining the piece of computer code based on the determined location; automatically electronically analyzing the piece of computer code to determine the one or more privacy-related attributes of the piece of computer code, each of the one or more privacy-related attributes indicating one or more types of personal information that the piece of computer code collects or accesses, by: connecting to an application executing on one or more remote computing devices using an application programming interface; scanning one or more data repositories on the one or more remote computing devices to identify one or more data attributes, wherein the one or more data attributes are associated with a processing activity, and wherein the processing activity is associated with the piece of computer code; analyzing the one or more data attributes and correlating metadata for the scanned one or more data repositories with particular attributes of the one or more data attributes discovered in the one or more data repositories; and determining, based at least in part on analyzing the one or more data attributes and correlating the metadata for the scanned one or more data repositories with the particular attributes of the one or more data attributes, one or more of the one or more privacy-related attributes of the piece of computer code; after determining the one or more privacy-related attributes of the piece of computer code, electronically generating a data map of the one or more privacy-related attributes based at least in part on the one or more criteria; digitally storing the data map in computer memory; and electronically displaying the data map to the particular user.
“8. The non-transitory computer-readable medium of claim 7, wherein the steps of automatically analyzing the piece of computer code and electronically generating the data map are executed in response to receiving the request.
“9. The non-transitory computer-readable medium of claim 7, wherein: the one or more criteria comprise one or more criteria to generate a data map based at least in part on a particular type of the one or more types of personal information.
“10. The non-transitory computer-readable medium of claim 7, wherein electronically generating the data map comprises: analyzing the piece of computer code to identify a storage location of data comprising the one or more types of personal information; retrieving the data from the storage location; identifying one or more pieces of the data that comprise a particular type of the one or more types of personal information; and generating a visual representation of the particular type of the one or more types of personal information that includes the one or more pieces of the data that comprise the particular type of the one or more types of personal information.
“11. The non-transitory computer-readable medium of claim 10, wherein: the one or more criteria comprise one or more criteria to generate a data map based at least in part on a plurality of privacy campaigns.
“12. The non-transitory computer-readable medium of claim 7, the method further comprising: receiving an indication that the piece of computer code has been modified; in response to receiving the indication, analyzing the piece of computer code to identify one or more changes in the one or more privacy-related attributes of the piece of computer code; and modifying the data map based at least in part on the identified one or more changes.
“13. The non-transitory computer-readable medium of claim 12, the method further comprising: continuously modifying the data map based at least in part on one or more additional changes identified in response to receiving one or more additional indications that the piece of computer code has been modified.
“14. The non-transitory computer-readable medium of claim 7, wherein the one or more privacy-related attributes further identify a storage location of one or more pieces of personal information of the one or more types of personal information that the piece of computer code collects or accesses.
“15. The non-transitory computer-readable medium of claim 14, wherein the one or more privacy-related attributes further identify one or more access permissions of the one or more pieces of personal information.”
There are additional claims. Please visit the full patent to read further.
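The analysis-and-mapping flow recited in claim 1 (connect to a remote application through an API, scan its data repositories for data attributes tied to a processing activity, correlate repository metadata with those attributes, and generate a data map) might be sketched as follows. This is only an illustration: the connector and scanner callables, the repository layout, and every field name are assumptions rather than details from the patent.

```python
# Rough, hypothetical sketch of the claim 1 flow. The connect_api and scan_repositories
# callables, the repository dictionary layout, and all field names are illustrative only.

def generate_privacy_data_map(code_location: str, connect_api, scan_repositories) -> dict:
    """Build a data map of privacy-related attributes for code at the given location."""
    # Connect to the application executing on the remote computing devices via an API.
    connection = connect_api(code_location)

    data_map: dict = {}
    for repository in scan_repositories(connection):
        metadata = repository["metadata"]
        for attribute in repository["attributes"]:
            # Correlate repository metadata with the discovered attribute to decide
            # whether it indicates a type of personal information collected or accessed.
            pi_type = attribute.get("personal_information_type")
            if pi_type:
                data_map.setdefault(pi_type, []).append({
                    "repository": repository["name"],
                    "storage_location": metadata.get("storage_location"),
                    "attribute": attribute["name"],
                })
    return data_map
```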
For the URL and more information on this patent, see: Barday, Kabir A. Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance.
(Our reports deliver fact-based news of research and discoveries from around the world.)


