Insurers in the crosshairs over AI
On July 11, the New York State Department of Financial Services issued its Final Circular Letter regarding the “Use of Artificial Intelligence Systems and External Consumer Data and Information Sources in Insurance Underwriting and Pricing.”
NYSDFS published its Proposed Circular Letter in January, calling on insurers and others in the state that use external consumer data and information sources (ECDIS) and artificial intelligence systems (AIS) to assess and mitigate bias, inequality and discriminatory decision-making or other adverse effects in the underwriting and pricing of insurance policies.
Although NYSDFS recognized the value of ECDIS and AI in simplifying and expediting the insurance underwriting process, the agency wanted to mitigate the potential for harm. And if the opening section of the final letter is any indication, NYSDFS did not back down.
Proxy assessments
The final letter clarifies the proxy-assessment requirement, which mandates that insurers demonstrate that ECDIS do not serve as a “proxy” for any protected class in a way that results in unfair or unlawful discrimination. It adds a sentence stating that whether ECDIS correlate with a protected class may be determined using data available to the insurer or may be reasonably inferred using accepted statistical methodologies. If such correlations are found, insurers must consider whether the use of the ECDIS is required by a legitimate business necessity.
Quantitative assessments
The final letter clarifies a requirement of quantitative assessments, specifying that these apply only to protected classes for which data are available or may be reasonably imputed using statistical methodologies. Insurers are not expected to collect additional data. The additional requirement of a qualitative assessment of unfair and unlawful discrimination remains largely unchanged.
What data is “available,” however? The issue of data scarcity—defined by Expert.ai as “the lack of data that could possibly satisfy the need of the system to increase the accuracy of predictive analytics”—is growing. Indeed, as aptly noted by AI observers, “While the internet generates enormous amounts of data daily, quantity doesn’t necessarily translate to quality when it comes to training AI models. Researchers need diverse, unbiased and accurately labeled data—a combination that is becoming increasingly scarce.”
Governance and risk management
The final letter keeps “the expectation that insurers take an appropriate, risk-based approach to utilizing ECDIS and AIS,” but NYSDFS declined to provide additional detail on thresholds for sufficiency regarding risk management procedures. “It is up to insurers,” the NYSDFS stated, “to determine the appropriate sufficiency thresholds and standards of proof based on the product and the particular use of ECDIS or AIS.”
Board and senior management oversight
NYSDFS reiterated that senior management and the boards “have a responsibility for the overall outcomes of the use of ECDIS and AIS.”
Third-party vendors
Insurers retain responsibility for understanding any tools used in underwriting and pricing that are developed and deployed by third-party vendors, and for ensuring that those tools comply with all applicable laws and regulations. The final letter contains a new Point 37 specifying that, “where appropriate and available,” insurers should include terms in contracts with third-party vendors that i) provide audit rights or entitle the insurer to receive audit reports by qualified auditing entities; and ii) require the third-party vendor to cooperate with the insurer regarding regulatory inquiries and investigations related to the insurer’s use of the third-party vendor’s product or services.
The NYSDFS Final Letter reiterates:
- Insurers’ use of emerging technologies, such as AIS and ECDIS, must comply with all applicable federal and state laws and regulations.
- An insurer should not use ECDIS or AIS in underwriting and pricing unless the insurer has determined that the ECDIS or AIS does not collect or use criteria that would constitute unfair or unlawful discrimination or an unfair trade practice.
- The responsibility to comply with antidiscrimination laws remains with the insurer at all times.
- Insurers should not use ECDIS or AIS for underwriting or pricing purposes unless they can establish that the data source or model is not based on any class protected pursuant to Insurance Law Article 26, and that such use would not result in or permit any unfair discrimination or otherwise violate the Insurance Law or applicable regulations.
- Insurers should be able to demonstrate that ECDIS are supported by generally accepted actuarial standards of practice and are based on actual or reasonably anticipated experience. They should assess, in a multistep assessment process, whether the use of ECDIS or AIS produces disproportionate adverse effects in underwriting or pricing for similarly situated insureds or insureds of a protected class.
- The requirements for both qualitative and quantitative assessments of whether the use of ECDIS/AIS produces unfair and unlawful discrimination, along with the frequency-of-testing and documentation requirements, remain in place.
Takeaways for insurers
A recent InsuranceNewsNet article notes that “innovative companies are implementing AI-powered technology into the underwriting process, resulting in fast applications, instant offers, and personalized policies.” Yet state and federal regulators alike are looking closely at the potential discriminatory effects of AI. Within the insurance industry specifically, other regulators are sure to follow NYSDFS’s lead. The proposed letter of Jan. 17 followed the National Association of Insurance Commissioners’ Model Bulletin on the use of artificial intelligence systems in insurance, adopted in December 2023. Nearly a dozen states have adopted it thus far.
The NAIC reminds insurers that those using AI systems must comply with all applicable federal and state insurance laws and regulations, including those addressing unfair discrimination. And with the Federal Trade Commission gearing up to examine how AI and other technological tools are being used in personalized pricing, including by issuing subpoenas to financial companies and consultants, it appears that companies using these systems in any industry will need to proceed thoughtfully.
© Entire contents copyright 2024 by InsuranceNewsNet.com Inc. All rights reserved. No part of this article may be reprinted without the expressed written consent from InsuranceNewsNet.com.
Frances M. Green is an attorney with Epstein Becker Green. Contact her at [email protected].