Technology of the future shouldn't trap people in the past - Insurance News | InsuranceNewsNet

March 20, 2023 | Newswires

Technology of the future shouldn't trap people in the past

Berkshire Eagle, The (Pittsfield, MA)

COMMENTARY

If you are a chain smoker applying for life insurance, you might think it makes sense to be charged a higher premium because your lifestyle raises your risk of dying young. If you have a propensity to rack up speeding tickets and run the occasional red light, you might begrudgingly accept a higher price for auto insurance.

But would you think it fair to be denied life insurance based on your Zip code, online shopping behavior or social media posts? Or to pay a higher rate on a student loan because you majored in history rather than science? What if you were passed over for a job interview or an apartment because of where you grew up? How would you feel about an insurance company using the data from your Fitbit or Apple Watch to figure out how much you should pay for your health-care plan?

Political leaders in the United States have largely ignored such questions of fairness that arise from insurers, lenders, employers, hospitals and landlords using predictive algorithms to make decisions that profoundly affect people's lives. Consumers have been forced to accept automated systems that today scrape the internet and our personal devices for artifacts of life that were once private - from genealogy records to what we do on weekends - and that might unwittingly and unfairly deprive us of medical care, or keep us from finding jobs or homes.

With Congress thus far failing to pass an algorithmic accountability law, some state and local leaders are now stepping up to fill the void. Draft regulations issued last month by Colorado's insurance commissioner, as well as recently proposed reforms in D.C. and California, point to what policymakers might do to bring us a future where algorithms better serve the public good.

The promise of predictive algorithms is that they make better decisions than humans - freed from our whims and biases. Yet today's decision-making algorithms too often use the past to predict - and thus create - people's destinies. They assume we will follow in the footsteps of others who looked like us and have grown up where we grew up, or who studied where we studied - that we will do the same work and earn the same salaries.

Predictive algorithms might serve you well if you grew up in an affluent neighborhood, enjoyed good nutrition and health care, attended an elite college, and always behaved like a model citizen. But anyone stumbling through life, learning and growing and changing along the way, can be steered toward an unwanted future. Overly simplistic algorithms reduce us to stereotypes, denying us our individuality and the agency to shape our own futures.

For companies trying to pool risk, offer services or match people to jobs or housing, automated decision-making systems create efficiencies. The use of algorithms creates the impression that their decisions are based on an unbiased, neutral rationale. But too often, automated systems reinforce existing biases and long-standing inequities.

Consider, for example, the research that showed an algorithm had kept several Massachusetts hospitals from putting Black patients with severe kidney disease on transplant waitlists; it scored their conditions as less serious than those of White patients with the same symptoms. A ProPublica investigation revealed that criminal offenders in Broward County, Fla., were being scored for risk - and therefore sentenced - based on faulty predictors of their likelihood to commit future violent crime. And Consumer Reports recently found that poorer and less-educated people are charged more for car insurance.

Because many companies shield their algorithms and data sources from scrutiny, people can't see how such decisions are made. An individual who is quoted a high insurance premium or denied a mortgage can't tell whether it reflects anything other than their underlying risk or ability to pay. Intentional discrimination based on race, gender and disability is illegal in the United States. But in many cases it is legal for companies to discriminate based on socioeconomic status, and algorithms can unintentionally reinforce disparities along racial and gender lines.

The new regulations being proposed in several localities would require companies that rely on automated decision-making tools to monitor them for bias against protected groups - and to adjust them if they are creating outcomes that most of us would deem unfair.

In February, Colorado adopted the most ambitious of these reforms. The state insurance commissioner issued draft rules that would require life insurers to test their predictive models for unfair bias in setting prices and plan eligibility, and to disclose the data they use. The proposal builds on a groundbreaking 2021 state law - passed despite intense insurance industry lobbying efforts against it - meant to protect all kinds of insurance consumers from unfair discrimination by algorithms and other AI technologies.

In D.C., five city council members last month reintroduced a bill that would require companies using algorithms to audit their technologies for patterns of bias - and make it illegal to use algorithms to discriminate in education, employment, housing, credit, health care and insurance. And just a few weeks ago in California, the state's privacy protection agency initiated an effort to prevent bias in the use of consumer data and algorithmic tools.

Although such policies still lack clear provisions for how they will work in practice, they deserve public support as a first step toward a future with fair algorithmic decision-making. Trying these reforms at the state and local level might also give federal lawmakers the insight to make better national policies on emerging technologies.

"Algorithms don't have to project human bias into the future," said Cathy O'Neil, who runs an algorithm auditing firm that is advising the Colorado insurance regulators. "We can actually project the best human ideals onto future algorithms. And if you want to be optimistic, it's going to be better because it's going to be human values, but leveled up to uphold our ideals."

I do want to be optimistic - but also vigilant. Rather than dread a dystopian future where artificial intelligence overpowers us, we can prevent predictive models from treating us unfairly today. Technology of the future should not keep haunting us with ghosts from the past.

Bina Venkataraman is a Washington Post columnist covering the future and a fellow at the Belfer Center for Science and International Affairs at Harvard University. She was previously editorial page editor of The Boston Globe.
