N.Y. regulators hit insurance companies for multiple policy versions
By Doug Bailey
New York regulators have put insurance companies on notice for selling multiple versions of the same policies, saying the practice is discriminatory and can result in less favorable versions of the same product being sold to low-income and minority consumers.
In a letter to life insurance companies warning them against discrimination, New York Department of Financial Services Superintendent Adrienne A. Harris said several insurers have developed multiple versions of products whose terms and conditions vary depending upon which insurance producer sells the product to a consumer in the individual market.
For example, some insurers have developed a different version of a product or have created additional features at the request of a particular producer solely to be offered to that producer’s clients. In other instances, producers have developed products that they then ask an insurer to offer exclusively through that producer. Sometimes there are differences in compensation between producers that sell the different versions and in other cases there is little or no difference. Some insurers also have offered different versions of a product through different producers as a marketing strategy.
“The different versions are sold to consumers with the same expectation of life and with identical needs, goals, or personal or financial circumstances, resulting in similarly situated consumers receiving different terms, conditions, benefits, fees, or premiums for the same policies or contracts in the individual market,” the superintendent’s letter said. “Consumers have no way of knowing that there are other versions of the product – versions that may be more suitable for them – that are offered and sold to other consumers with identical needs, goals, or circumstances and that may be cheaper. These sales practices result in unfair and unlawful discrimination among similarly situated individuals.”
N.Y. official calls for fairness, transparency
Harris accused insurers of using the complexity of insurance products to exploit vulnerable groups and of denying consumers equitable access to products free of prejudice or bias.
"Insurers have a responsibility to ensure fairness and transparency in their products, and discrimination against consumers based on factors such as gender or ethnicity is against the law," said Superintendent Harris.
Harris noted that life insurers are free to set appropriate underwriting standards, which may or may not include different underwriting for different products, without violating discrimination laws as long as the underwriting standards have a factual and rational basis, are grounded in generally accepted insurance and actuarial principles, and are not contrary to law.
But insurers say tailoring a policy to a customer’s specific needs is exactly what consumers want and is frequently done without any discrimination or bias.
N.Y. life insurance group hits back
“The life insurance industry offers New York families financial protection through a full spectrum of products designed for the specific needs of consumers,” said the Life Insurance Council of New York, which represents 75 life insurers in the state. “From Buffalo to Brooklyn, there are many paths to find products that fit individual needs and budgets.”
New York’s action typifies the growing concern among insurance regulators that in this age of data mining, Internet searches, and social media disclosures, companies are accessing and using personal information about consumers acquired without their consent.
Colorado initiated a draft rule to impose oversight and transparency requirements on insurance companies that use big data about consumers or feed such data into predictive models and algorithms. Officials there said they want to make sure that people’s social media habits, credit scores and other less traditional customer data cannot be used against them when they shop for insurance.
“The use of nontraditional factors in extremely complicated predictive models that carriers may be using could lead to unintended consequences, namely unfairly discriminatory practices regarding protected classes,” Jason Lapham, big data and artificial intelligence policy director of the Colorado Division of Insurance, said in a statement.
Officials there say Colorado was the first state in the nation to build big data restrictions into its insurance regulations.
A.I. also cited for potential bias
The Connecticut Insurance Department and the California Department of Insurance have issued similar bulletins about the use of artificial intelligence and the potential for racial bias and discriminatory insurance practices.
The American Academy of Actuaries has said insurance companies regularly consider some protected characteristics – such as sex and disability status, which are considered key risk criteria in underwriting – and has asked regulators to clarify when using such information would be considered an unfair or discriminatory practice.
“In the life insurance space, the current focus is on the use of external consumer data to supplement traditional underwriting,” said an analysis of the issue published in the New York Law Journal. “There is no standard definition of external consumer data, but it can include credit scores, social media habits, location data, purchasing habits, home ownership, educational attainment, occupation, licensures, civil judgments, and court records. The data can be incorporated into automated decision-making systems, such as risk scoring or pricing algorithms.”
The analysis authors, Ellen M. Dunn, Joan M. Loughnane, and Mallory W. Edel, are with the law firm of Sidley Austin. They said that while there is no question AI can achieve efficiencies and benefits for both insurer and client, companies must be ready to defend their use of the new data and underwriting algorithms.
“Insurers must be prepared to explain how complex algorithmic models operate and facilitate transparency into underwriting decisions,” they wrote. “Insurers must also be able to demonstrate that their use of AI incorporates safeguards against compromising health data and other sensitive personal information of policyholders.”
Doug Bailey is a journalist and freelance writer who lives outside of Boston. He can be reached at [email protected].
© Entire contents copyright 2023 by InsuranceNewsNet.com Inc. All rights reserved. No part of this article may be reprinted without the expressed written consent from InsuranceNewsNet.com.