CFPB Director's Notebook: Algorithms, Artificial Intelligence, and Fairness in Home Appraisals
* * *
Today, the CFPB, in conjunction with other federal agencies, proposed a rule to establish quality control standards for automated valuation models used in home appraisals.
Algorithmic appraisals that use so-called automated valuation models can be used as a check on a human appraiser or in place of an appraisal. Unlike an appraisal or broker price opinion, where a person looks at the property and assesses the comparability of other sales, automated valuations rely on mathematical formulas and number-crunching machines to produce an estimate of value.
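To make the mechanics concrete, here is a minimal, hypothetical sketch of how an automated valuation model might work. This toy example is not any agency's or vendor's actual model: it simply estimates a subject property's value from the price per square foot of recently sold comparable homes ("comps").

```python
# Toy AVM: value = subject square footage x median price/sqft of comps.
# Illustrative only; real AVMs use far more variables and data.

def avm_estimate(subject_sqft, comps):
    """Estimate value as subject size times the median comp price/sqft.

    comps: list of (sale_price, sqft) tuples for comparable sales.
    """
    ratios = sorted(price / sqft for price, sqft in comps)
    mid = len(ratios) // 2
    # Median price per square foot is robust to a single outlier comp.
    if len(ratios) % 2:
        median_ppsf = ratios[mid]
    else:
        median_ppsf = (ratios[mid - 1] + ratios[mid]) / 2
    return subject_sqft * median_ppsf

comps = [(300_000, 1_500), (320_000, 1_600), (280_000, 1_400)]
print(avm_estimate(1_550, comps))  # -> 310000.0
```

The key point is that no human ever inspects the property: the estimate is driven entirely by the data fed into the formula.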
While machines crunching numbers might seem capable of taking human bias out of the equation, they can't. Based on the data they are fed and the algorithms they use, automated models can embed the very human bias they are meant to correct. And the design and development of the models and algorithms can reflect the biases and blind spots of the developers. Indeed, automated valuation models can make bias harder to eradicate in home valuations because the algorithms used cloak the biased inputs and design in a false mantle of objectivity.
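The way biased data propagates through an "objective" formula can be shown with a deliberately simplified, hypothetical example. All figures below are made up: if past sale prices in one neighborhood were depressed by discrimination, a model trained on those sales learns to reproduce the gap, even for physically identical homes.

```python
# Hypothetical illustration of how an AVM can embed historical bias.

historical_sales = {
    # neighborhood: list of (sale_price, sqft)
    "A": [(300_000, 1_500), (310_000, 1_550)],
    "B": [(240_000, 1_500), (248_000, 1_550)],  # same homes, depressed prices
}

def ppsf(sales):
    """Average price per square foot observed in past sales."""
    return sum(p / s for p, s in sales) / len(sales)

def biased_avm(neighborhood, sqft):
    # The model "objectively" applies the learned local price level,
    # carrying the historical gap forward into new valuations.
    return ppsf(historical_sales[neighborhood]) * sqft

same_house_sqft = 1_500
print(biased_avm("A", same_house_sqft))  # -> 300000.0
print(biased_avm("B", same_house_sqft))  # -> 240000.0 for an identical home
```

Nothing in the formula mentions the neighborhood's demographics, yet the output still reflects the discrimination baked into the training data.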
Inaccurate or biased algorithms can lead to serious harm. A home valued too high can lock a homeowner into an unaffordable mortgage and increase the risk of foreclosure. A home valued too low can deprive homeowners of access to their equity and limit the mobility of sellers. In addition to harming homeowners, systemic biases in valuations, either too low or too high, hurt neighborhoods, distort the housing market, and impact the tax base. When it comes to buying or selling a home, we all need and deserve fair and nondiscriminatory home valuations.
The proposed rule would, if finalized, create basic safeguards to mitigate the risks associated with automated valuation models. Covered institutions that employ these models to help make home value decisions would have to take steps to boost confidence in valuation estimates and protect against data manipulation. The proposed rule would also require companies to have policies and processes in place to avoid conflicts of interest, to conduct random sample testing and reviews, and to comply with nondiscrimination laws.
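One of the safeguards described above, random sample testing, can be sketched as a back-test of model estimates against later arm's-length sale prices. The sample size, error metric, and data below are illustrative assumptions, not values from the proposed rule.

```python
# Sketch of random-sample back-testing of AVM estimates.
# Draws a random sample of valuations and computes the median
# absolute percentage error against realized sale prices.
import random

def backtest(estimates, sale_prices, sample_size, seed=0):
    """Median absolute percentage error on a random sample of valuations."""
    rng = random.Random(seed)
    idx = rng.sample(range(len(estimates)), sample_size)
    errors = sorted(abs(estimates[i] - sale_prices[i]) / sale_prices[i]
                    for i in idx)
    mid = len(errors) // 2
    if len(errors) % 2:
        return errors[mid]
    return (errors[mid - 1] + errors[mid]) / 2

estimates   = [305_000, 190_000, 420_000, 260_000, 515_000]
sale_prices = [300_000, 200_000, 400_000, 250_000, 500_000]
mape = backtest(estimates, sale_prices, sample_size=4)
print(f"median abs % error: {mape:.1%}")
```

A persistently high error rate on such a sample, especially one concentrated in particular neighborhoods, is the kind of signal a review process would be expected to catch.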
This proposal complements recent work by the CFPB and other federal agencies to address the risks posed by automated systems and artificial intelligence.
Emerging AI-marketed technologies can negatively impact civil rights, fair competition, and consumer protection. Because technology has the power to transform our lives, we must ensure that AI-marketed technologies do not become an excuse for evasion of the law and consumer harm. It is critical that these emerging technologies comply with the law.
We will continue to monitor the development and use of automated systems, and work closely with our federal and state partners to enforce federal consumer financial protection laws. This includes protecting the rights of American homeowners, buyers, and sellers, regardless of whether legal violations occur by a human or by a machine.
Read the proposed rule, Quality Control Standards for Automated Valuation Models (https://files.consumerfinance.gov/f/documents/cfpb_automated-valuation-models_proposed-rule-request-for-comment_2023-06.pdf).
Comments must be received within 60 days after publication of the proposed rule in the Federal Register.
* * *
Original text here: https://www.consumerfinance.gov/about-us/blog/algorithms-artificial-intelligence-fairness-in-home-appraisals/