House Energy & Commerce Subcommittee Issues Testimony From Lawyers' Committee for Civil Rights Under Law Managing Attorney Brody
* * *
I. Introduction
Chair Bilirakis, Ranking Member Schakowsky, and Members of the Subcommittee, thank you for the opportunity to testify today on bipartisan, bicameral legislation that seeks to strengthen data privacy, security, and civil rights. My name is David Brody, and I am a Managing Attorney at the Lawyers' Committee for Civil Rights Under Law.
The Lawyers' Committee uses legal advocacy to achieve racial justice, fighting inside and outside the courts to ensure that Black people and other people of color have voice, opportunity, and power to make the promises of our democracy real. The Lawyers' Committee works at the intersection of racial justice, technology, and privacy to address predatory commercial data practices, discriminatory algorithms, invasions of privacy, disinformation, and online harms that disproportionately affect Black people and other people of color, including people with intersectional identities, like immigrants, women of color, and LGBTQI+ people of color.
We care about privacy because it ensures that who we are cannot be used against us unfairly. That is why privacy rights are civil rights. The "inviolability of privacy," the Supreme Court recognized when it protected the NAACP's membership lists from hostile state officials, "may in many circumstances be indispensable to preservation of freedom of association."1
That is why we are encouraged by the new American Privacy Rights Act (APRA), a bipartisan and bicameral effort to safeguard data privacy and civil rights online. Passing comprehensive privacy legislation would be a major advancement for the public good. I would like to thank Chair McMorris Rodgers and Senator Cantwell for introducing this discussion draft.
* * *
1 NAACP v. Alabama ex rel. Patterson, 357 U.S. 449, 462 (1958).
* * *
Like the American Data Privacy and Protection Act, the American Privacy Rights Act would effectively establish "building codes for the Internet." Just as construction regulations enable us to build safe homes and physically expand upwards and outwards, so too will strong data protection rules establish an infrastructure for American leadership in online commerce. These foundational rules include data minimization, civil rights and consumer protections, transparency, data security, individual rights to control one's data, and multilayered enforcement.
The American Privacy Rights Act, however, makes several key improvements over previous legislation. It prohibits forced arbitration of claims involving discrimination or other "substantial privacy harms." It allows individuals to opt out of algorithms that would make consequential decisions about them based on personal data. It has stronger protections for health data. The Act shortens the time individuals must wait before they can enforce their rights. And it prohibits "dark patterns" that impair an individual's control over their own data.
But the bar for comprehensive privacy legislation has also been raised in the last two years, as states have enacted more privacy and civil rights protections for their citizens. Many states are passing or considering comprehensive privacy laws. Some of these are fairly protective, such as the Maryland Online Data Privacy Act,3 the California Delete Act,4 the Washington My Health My Data Act,5 and the Illinois Biometric Information Privacy Act,6 and other states are weighing strong proposals.7
* * *
3 Maryland Online Data Privacy Act of 2024, S.B. 541, 2024 Leg., 446th Sess. (Md. 2024).
4 Press Release, Cal. Priv. Prot. Agency, CPPA Applauds Governor Newsom for Approving the California Delete Act (Oct. 2023).
5 Washington My Health My Data Act, ch. 191, 2023 Wash. Sess. Laws 867.
6 See 740 ILL. COMP. STAT. 14/ (2008).
7 See, e.g., S. 148, 193d Gen. Ct. (Mass. 2023).
* * *
But we must also recognize that while residents of some states may enjoy data protections, they are the minority. Nationwide, people in most jurisdictions are being left behind.
Any new federal legislation must account for both evolving protections and gaps at the state level. This new bill must be at least as strong as the state laws to justify any form of preemption that would restrict the states from continuing in their role as the "laboratories of democracy." And it needs to extend protections nationally, so the entire country benefits.
The American Privacy Rights Act represents an imperfect but needed bargain to protect everyone's rights amid this patchwork landscape. Who we are and how we behave can be collected, analyzed, and exploited by companies at home and by nations abroad. Nationwide, people are harmed every day by algorithmic discrimination, fraud, stalking, and other abuses fueled by the invasion of our privacy. Consumer data fuels disinformation campaigns by foreign adversaries who seek to undermine American democracy. Comprehensive privacy protection is also necessary to mitigate risks to individuals from artificial intelligence. If personal data is the new oil, AI and other algorithmic technologies represent a new form of combustion. Time is short and the stakes are high; we must require these new technologies to be safe and effective from the outset, lest they blow up in our faces.
Overall, we're highly encouraged by the following provisions in the new bill. But we also urge the Committee to consider key fixes and improvements.
First, the bill would prohibit discriminatory uses of personal data and require companies to test their algorithms for bias. Years of reporting and research show that algorithms used for advertising, pricing, and eligibility decisions frequently produce discriminatory outcomes across critical areas of the economy. A review of over two million mortgage applications found that Black applicants were 80 percent more likely to be rejected by mortgage approval algorithms when compared with similar white applicants.8 Scoring algorithms used by auto insurers judge applicants "less on driving habits and increasingly on socioeconomic factors."9 When an algorithm executes its mission of creating efficiency by finding hidden correlations amid large sets of data, it will often mistake the long-term consequences of discrimination and inequality for an individual's preferences and traits.10 These mistaken shortcuts fail to account for the fact that while a person may be in part a product of their circumstances, that does not mean they necessarily are or should be limited by those circumstances. Expediency is no excuse for segregation.
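To make concrete what this kind of bias self-testing can look like in practice, below is a minimal sketch of one widely used screening metric: the "four-fifths rule" adverse impact ratio drawn from employment-discrimination practice. The data, function names, and 0.8 threshold are illustrative assumptions on our part, not requirements drawn from the bill.

```python
# Illustrative sketch of a disparate impact self-test using the
# "four-fifths rule." All data, names, and thresholds are hypothetical.

from collections import defaultdict

FOUR_FIFTHS = 0.8  # conventional adverse-impact screening threshold

def selection_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> approval rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for group, approved in decisions:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    return {group: approved / total for group, (approved, total) in counts.items()}

def impact_ratios(decisions):
    """Each group's approval rate divided by the most-favored group's rate.
    Ratios below 0.8 traditionally warrant closer scrutiny."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()}

# Synthetic example: group B is approved far less often than group A.
sample = ([("A", True)] * 80 + [("A", False)] * 20
          + [("B", True)] * 55 + [("B", False)] * 45)
for group, ratio in sorted(impact_ratios(sample).items()):
    flag = "REVIEW" if ratio < FOUR_FIFTHS else "ok"
    print(f"group {group}: impact ratio {ratio:.2f} [{flag}]")
```

A real compliance program would go further, examining proxy variables, intersectional groups, and error rates, but even a simple screen like this can flag when an algorithm's outcomes diverge sharply across groups.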
* * *
8 Emmanuel Martinez & Lauren Kirchner, The Secret Bias Hidden in Mortgage-Approval Algorithms, THE MARKUP (Aug. 25, 2021).
9 CONSUMER REPS., The Truth About Car Insurance (2015).
* * *
This discrimination disproportionately impacts Black people and other people of color, including people with intersectional identities, like immigrants, women of color, and LGBTQI+ people of color.11 However, the current language in the anti-discrimination provision contains an exception that could unintentionally allow advertising, marketing, or soliciting that segregates based on racial groups and populations. If a business posts a sign that says "Whites Only," it should not matter whether it is written in ink or pixels, in words or in code. The discrimination and harm are the same. The legal consequences should be the same too. We believe fixing this provision is an easy but important action to secure the core intent of the American Privacy Rights Act.
Second, the bill would require companies to collect and use only as much personal data as is necessary, proportionate, and limited to provide the services that consumers expect, and to safeguard that data. This builds consumer trust and reduces the risk of personal data being exploited for fraud, theft, and deceptive practices. Identity theft and fraud disproportionately impact communities of color, and low-income consumers are less likely to have the resources to bounce back after being ripped off.12 Data limitations and privacy rules create a bedrock level of trust for consumers, who can better make choices about how to interact and allow their data to be used when they know what to expect. However, we are concerned that this bill has backtracked from ADPPA by removing the right of individuals to bring claims based on violations of the rules governing the collection, processing, and retention of sensitive covered data. We also believe that the advertising provisions need clarification to avoid confusion for businesses and consumers alike. Additionally, service providers for government entities need to be covered by the bill, as they were in ADPPA, to avoid potential loopholes and gamesmanship.
* * *
10 See generally WHITE HOUSE OFF. OF SCI. & TECH. POL'Y, BLUEPRINT FOR AN AI BILL OF RIGHTS 24-25 (2022) [hereinafter Blueprint], https://www.whitehouse.gov/wp-content/uploads/2022/10/Blueprint-for-an-AI-Bill-of-Rights.pdf;
11 See CHUNG, supra note 10, at 10; MILLNER & TRAUB, supra note 10.
* * *
Third, the bill would give individuals transparency into and control over how their data is used while prohibiting "dark patterns"--some of the worst practices that trick individuals into unwittingly divulging data and create design barriers that prevent individuals from exercising their rights. The bill clearly empowers individuals with the ability to access, correct, delete, and port their own information. Yet, parts of the bill could undermine protections enforced by other federal agencies. For example, the bill's displacement of the Communications Act is vague and overbroad. This will have serious, possibly unintended consequences, including endangering the communications privacy protections that the Federal Communications Commission enforces today.
Lastly, we applaud the meaningful enforcement authority that this bill vests in federal, state, and individual actors. Having a private right of action allows individuals to obtain relief from a court when a company violates the Act. The ability to bring a private lawsuit is particularly important for communities of color that, historically, could not rely on the government to vindicate their rights. That is why practically every major civil rights law has a private right of action. A right without a remedy is no right at all to people who have been wronged.
In addition, the bill gives many new responsibilities and authorities to the Federal Trade Commission.
We have long said that when it comes to privacy legislation, we cannot afford to make the perfect the enemy of the good. In a strong compromise that can stand the test of time, no one gets everything they want. There are parts of this bill that we do not like. But millions in this nation--disproportionately Black people and other people of color--currently face severe and ongoing harms from the invasion of their privacy, including discrimination, stalking, fraud, and other exploitative data practices. Those individuals, and this nation, cannot afford further delay.
* * *
13 See Nat'l Coal. on Black Civic Participation v. Wohl (NCBCP I), 498 F. Supp. 3d 457, 464 (S.D.N.Y. 2020); Nat'l Coal. on Black Civic Participation v. Wohl (NCBCP III), 661 F. Supp. 3d 78 (S.D.N.Y. 2023).
* * *
Almost sixty years ago, we decided as a nation that our polity is stronger when everyone has a fair chance.
II. Lack of Privacy and Online Civil Rights Protections Enables Algorithmic Discrimination Across the Economy
Equal opportunity, privacy, and civil rights are intertwined with technological advancement. Algorithms use data to make decisions about all aspects of people's lives, determining who can rent a house, who can get a loan, who can get a deal and, consequentially, who cannot. One of the greatest civil rights challenges of our generation is to ensure that our new data-driven economy does not replicate or amplify existing discrimination, and that technology serves all of us. But achieving the full measure of freedom in a data-driven economy also requires freedom from discrimination, which is increasingly amplified online through algorithmic bias and pervasive data collection.
Although algorithmic systems are widely used, they pose a high risk of discrimination, disproportionately harming Black communities and other communities of color. Because these algorithmic technologies are typically built using societal data that reflects generations of discriminatory practices such as redlining and segregation, they often replicate and reinforce past patterns of discrimination. The tools of the future lock us into the mistakes of the past.
Commercial surveillance, data collection, and algorithmic decision-making reinforce a separate and unequal society, and correspondingly an unequal market. Each click, habit, and individual characteristic is collected and cataloged to discern not just preferences, but sensitive data about individuals' race, religion, gender, and other traits--or proxies for them. Algorithmic systems use this information to determine what products consumers see, what price or interest rate they are quoted, and what opportunities they qualify for.
Privacy legislation is a civil rights issue because privacy protections can help ensure that people's identities and characteristics cannot be used against them unfairly. Such protections can empower communities of color and open doors for marginalized populations. They can also provide clarity to businesses and level the playing field for entrepreneurs.
However, there is currently no comprehensive federal privacy law. Existing anti-discrimination laws have many gaps and limitations as well. Some exclude retail or have unresolved questions as to whether they apply to online businesses. Others apply to specific sectors, like housing and employment, but may not cover new types of online services used to match individuals to these opportunities. To give a few examples, under current federal civil rights law it would be legal for an online retailer to charge higher prices to women or to refuse to sell products to Christians.15 A service provider could use discriminatory algorithms to look for workers to target for recruitment so long as the provider does not meet the definition of an "employment agency" under Title VII.16 And it is wholly unclear whether existing laws will apply at all to discrimination in many new online-only economies related to online gaming, influencers, streamers, and other creators.
As a result of gaps in federal law, individuals currently have little recourse against a modern barrage of discriminatory algorithmic tools. The race for more advanced models builds a demand for more data, and correspondingly more surveillance of consumers, regardless of whether these commercial data practices reinforce the structural racism and systemic bias that pervade our society. Tech companies have every incentive to misuse personal data, intentionally or unintentionally, in ways that harm historically marginalized communities through deception, discrimination, exploitation, and perpetuation of redlining. Without a strong privacy standard, companies holding personal data can use it to directly discriminate against people of color or other marginalized groups. They can also make data available to other actors who use it to discriminate, or design data processing practices in a manner that negligently, recklessly, or knowingly causes discriminatory or otherwise harmful results, such as algorithmic bias or promotion of disinformation.17
* * *
15 See 42 U.S.C. §§ 1981, 2000a; Shaare Tefila Congregation v. Cobb, 481 U.S. 615 (1987).
17 See NCBCP I, 498 F. Supp. 3d at 464 (2020 case involving voter intimidation robocalls targeted at Black communities).
* * *
The bottom line is that if these companies and data brokers were not collecting, aggregating, and using vast quantities of personal data in privacy-invasive ways in the first place, many of these harms would not happen or would be far more difficult to execute.
Not surprisingly, extensive documentation cited in the attached appendix demonstrates that exploitative data practices enable worse treatment of Black people and other historically marginalized communities and unequal access to goods and services due to discriminatory algorithms across sectors of the economy that include housing, employment, credit and finance, insurance, healthcare, education, retail, and public accommodations.
For example, online real estate brokerage Redfin was sued for engaging in redlining in violation of the Fair Housing Act. Redfin offered limited or no service for homes under certain price thresholds, a policy that disproportionately excluded homes in communities of color and depressed their sale prices.
In lending, too often a consumer's identity will determine which products they are offered.
Algorithmic discrimination is also increasingly manifesting in retail settings. For eight years, Rite Aid deployed facial recognition technology in its stores that disproportionately generated false shoplifting alerts against women and people of color, conduct that led the Federal Trade Commission to ban the retailer from using the technology.23 Federal testing has likewise found that many facial recognition algorithms misidentify Black and Asian faces at substantially higher rates than white faces.24
* * *
23 Press Release, Fed. Trade Comm'n, Rite Aid Banned from Using AI Facial Recognition After FTC Says Retailer Deployed Technology without Reasonable Safeguards (Dec. 19, 2023).
24 Press Release, Nat'l Inst. of Standards & Tech., NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software (Dec. 19, 2019).
* * *
III. The American Privacy Rights Act
The "American Privacy Rights Act" would establish a national data privacy and data security framework. We are pleased to see that it prioritizes civil rights protections that address data-driven discrimination. The draft legislation is a significant step towards enacting a major comprehensive privacy proposal that can gain bipartisan and bicameral support. We are encouraged by the progress of this legislation and want to address this Committee on the core strengths of this discussion draft and needed improvements to capitalize on those strengths.
A. Welcomed Civil Rights Protections and Needed Fixes
It is past time to enact a comprehensive consumer privacy law that safeguards civil rights online.
We welcome the civil rights provisions of the "American Privacy Rights Act" that will prohibit many common forms of online discrimination. The bill prohibits using personal data to discriminate based on protected characteristics. It would also apply to discriminatory algorithms and technologies that use them, such as commercial uses of biased facial recognition systems.30 The bill allows companies to process protected class data for the purpose of self-testing to root out discrimination, or to expand the pool of applicants, candidates, or customers. The anti-discrimination provision would also preserve free speech; it does not apply to non-commercial activities or to private clubs or groups, which are the same exceptions in the Civil Rights Act of 1964.
However, the addition of a clause in the bill exempting "advertising, marketing, or soliciting economic opportunities or benefits to underrepresented populations or members of protected classes" is ripe for abuse. This provision needs to be deleted. First, underrepresentation is undefined in the Act and completely depends on context. An advertiser could combine multiple identity traits and jerry-rig the relevant target market to make almost any demographic appear "underrepresented." Second, the provision directly allows advertising and solicitations to be targeted to specific protected classes. Because of the nature of online targeted advertising, this means that other groups who are not in the target audience would be excluded from receiving those opportunities. This would allow "whites only" retail ads or "men only" financial services ads, for example. This is precisely the type of conduct that caused federal enforcers to take action against Facebook for enabling discriminatory housing advertisements.31
On a more positive note, the civil rights provision in the bill will also apply to social media platforms. These protections should help increase fairness in recommendation algorithms that have been shown to disadvantage creators and influencers of color.32 These provisions are critical in addressing civil rights online. We are encouraged by the digital access rights and transparency requirements that will help identify discriminatory practices, and we urge this Committee to make the needed light-touch fixes and include strong civil rights protections in legislation going forward.
* * *
31 See Press Release, U.S. Dep't of Just., Justice Department Secures Groundbreaking Settlement Agreement with Meta Platforms, Formerly Known as Facebook, to Resolve Allegations of Discriminatory Advertising (June 21, 2022).
* * *
B. Opportunity to Improve Algorithmic Assessments
In addition, the bill requires pre-deployment algorithmic design evaluations and post-deployment impact assessments for the algorithms employed by covered entities. We have seen algorithms reproduce patterns of discrimination in employment recruiting, housing, education, finance, mortgage lending, credit scoring, healthcare, vacation rentals, ridesharing, and other services.33 We applaud that the bill requires the assessments to test for discrimination in these types of economic opportunities, as well as to explicitly test for disparate impacts on the basis of protected characteristics.
However, there is a large opportunity to improve this section based on lessons learned about data practices and AI development over the past few years. In particular, the Lawyers' Committee prefers a regulatory structure that assigns distinct, detailed requirements to developers and deployers of algorithms, recognizing their different roles. Developers are often better situated to assess design issues, while deployers are often better situated to assess implementation issues. As currently written, it could be difficult for a developer who is not a deployer, or vice versa, to comply with the requirements.
We recommend modifying or replacing this subsection with more precise mandates, such as in the Lawyers' Committee's own model legislation, the "Online Civil Rights Act."34 This will lead to more prescriptive assessments of algorithmic tools both before and after deployment, while also giving clearer guidance to industry and more tailored requirements for those building innovative technologies on top of large personal data sets.
We also support redefining the covered algorithm definition and the consequential decision definition, so that algorithmic tools are evaluated based on their ability and likelihood to impact the critical areas of life that matter to individuals. At present, the definition of covered algorithm is critically broken. It would sweep up any computational process--i.e., any use of a computer. This would subject many extremely basic or unobjectionable forms of computation to provisions that use this term, such as the algorithmic impact assessments. Following the model of our Online Civil Rights Act, we recommend defining "covered algorithm" more narrowly to focus on technologies like AI and machine learning, and tying the definition to circumstances involving "consequential actions." Under these revisions, developers and deployers of algorithmic tools, like AI systems, would be required to evaluate and audit their products for discrimination, bias, and harm both before and after deploying or offering their products in interstate commerce. First, developers and deployers should be required to conduct a short-form evaluation checking whether it is plausible that the use of an algorithm may result in a covered harm under the Act. If harm is not plausible, they should be free from further regulatory requirements.
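As a rough sketch of how such a two-tier structure could operate, the hypothetical screen below first asks whether a tool is a "covered algorithm" under the narrowed definition, then applies the short-form plausibility check. The category list, names, and logic are our own assumptions, not statutory text.

```python
# Hypothetical sketch of a two-tier "covered algorithm" screen, loosely
# modeled on the structure described above. Categories and names are
# illustrative assumptions, not language from any bill.

from dataclasses import dataclass

# Areas of life treated here as "consequential actions" (illustrative list).
CONSEQUENTIAL_ACTIONS = {
    "housing", "employment", "credit", "education",
    "insurance", "healthcare", "public_accommodation",
}

@dataclass
class AlgorithmProfile:
    name: str
    uses_ml: bool               # AI/ML system, not mere computation
    decision_domains: set[str]  # areas where outputs influence decisions

def is_covered_algorithm(profile: AlgorithmProfile) -> bool:
    """Narrowed definition: AI/ML systems whose outputs can affect
    consequential actions, rather than any computational process."""
    return profile.uses_ml and bool(profile.decision_domains & CONSEQUENTIAL_ACTIONS)

def short_form_evaluation(profile: AlgorithmProfile) -> str:
    """Tier 1: is covered harm plausible? If not, no further obligations;
    if so, full audit and assessment duties would attach."""
    if not is_covered_algorithm(profile):
        return "not covered: no further requirements"
    return "covered: pre-deployment audit and annual impact assessments"

# A spell-checker is mere computation; a tenant-screening model is covered.
print(short_form_evaluation(AlgorithmProfile("spell-checker", False, set())))
print(short_form_evaluation(AlgorithmProfile("tenant-screening model", True, {"housing"})))
```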
* * *
33 Letter from Civil Rights, Civil Liberties, and Consumer Protection Organizations.
* * *
If harm is plausible, the legislation should require developers and deployers to engage an independent auditor to evaluate the algorithm's design, how it makes or contributes to decisions about significant life opportunities, how the algorithm might produce harm, and how that harm can be mitigated. Deployers should then annually assess the algorithm as it is used, detailing any changes in its use or any harms it produces, including measuring disparate impacts. Developers should review these assessments to determine if the algorithm needs modifications, and the evaluations, assessments, and reviews should be publicly shared and reported to a federal regulator. Sunlight is the best disinfectant.35 The evaluation should include a detailed review and description so that external researchers can evaluate how the covered algorithm functions, including its risks, uses, benefits, limitations, and other pertinent attributes. Both evaluations and assessments should be reported to the Federal Trade Commission.
C. Important Data Minimization Standards
Pervasive access to people's personal data, often obtained without the knowledge or consent of the individual, can lead to discriminatory, predatory, and unsafe practices. The internet is rife with examples of how the overcollection of data leads to unsafe conditions for people of color or particularly vulnerable individuals. Companies should not collect or use more personal information than is necessary to do what the individual expects them to do. Beyond basic cybersecurity and legal obligations, companies also should not use personal data for secondary purposes that an individual does not expect or has not consented to. The reason is simple: personal data collected by companies can proliferate in a way that maximizes risk for the individual and for society at large. This has become glaringly apparent to those seeking reproductive health care or seeking to protect Black children from online abuses.
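Here is a minimal sketch of what that default looks like in code, assuming hypothetical field names and purposes: data outside an allow-list tied to the requested service is simply never stored.

```python
# Minimal sketch of purpose-based data minimization: collect only fields
# on an allow-list tied to the service the individual actually requested.
# Field names and purposes are hypothetical.

ALLOWED_FIELDS = {
    "ship_order":      {"name", "shipping_address", "email"},
    "process_payment": {"name", "payment_token", "billing_zip"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Drop any field not necessary for the declared purpose.
    Unknown purposes collect nothing by default."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

submitted = {
    "name": "Jane Doe",
    "shipping_address": "1 Main St",
    "email": "jane@example.com",
    "precise_location": "38.9072,-77.0369",  # unnecessary: never stored
    "browsing_history": [],                  # unnecessary: never stored
}
stored = minimize(submitted, "ship_order")
print(stored)  # only name, shipping_address, and email survive
```

Under a regime like the bill describes, anything beyond the allow-list would require an explicit statutory permitted purpose or opt-in consent rather than being collected by default.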
* * *
35 See LOUIS D. BRANDEIS, OTHER PEOPLE'S MONEY AND HOW THE BANKERS USE IT 92 (1914).
* * *
For example, data broker SafeGraph collected, packaged, and sold location data specifically tracking visitors to over 600 Planned Parenthood clinics, including where groups of visitors traveled from and where they went afterward.36
* * *
36 Joseph Cox, Data Broker Is Selling Location Data of People Who Visit Abortion Clinics, VICE (May 3, 2022).
39 BUSINESSWIRE, Social Media Victims Law Center Files Suit Against Social Media Giants for the Race-Driven Anguish Suffered by One Small-Town Family.
* * *
Clear baseline protections for data collection, including both primary and secondary uses of data, should be enacted to help contain these risks and prevent harms. Personal data are the fuel of online commerce. They can be used for good--to create new products, conduct beneficial research, mitigate disparities, or tailor experiences that consumers want. They can also be abused--to steal someone's identity, exclude someone from opportunities, target someone for harassment or abuse, engage in predatory practices and scams, or reinforce legacies of segregation and redlining. Keeping data collection, use, and sharing limited to what is necessary and proportionate to provide expected services is essential to keeping consumers safe.
This Act reduces the amount of data that will fall into the wrong hands, providing a knock-on benefit in combating fraud and identity theft. Data breaches are often especially problematic for people of color living on fixed or low incomes.41 Companies track cell phone location data without consent and sell this data to debt collectors and other predatory actors, which disproportionately harms low-income Black and Brown communities.42 The bill's data minimization requirements will restrict data collection and use to purposes necessary for providing services expected by an individual, and restrict secondary uses or data sharing without explicit opt-in consent. Importantly, the bill imposes tighter protections for "sensitive covered data," such as restricting its use for advertising. However, the data minimization requirements should also be enforceable by a private right of action. The private right of action in the Act should be expanded to cover these core provisions.
We are encouraged that the Act imposes a baseline duty on all covered entities to collect or use covered data only as needed and appropriate. It does not create a "notice and consent" regime in which any practice is allowed so long as a consumer consents after being presented with lengthy and legalistic privacy notices. Notice and consent has repeatedly been shown to be a failure. We do not allow individuals to consent to more arsenic in their drinking water or opt-out of smoke alarms in their homes. We do not expect consumers to inspect a car's engine before driving it off the sales lot. Rather, we impose baseline protections so that consumers can trust that products are safe and functional without having to look under the hood. Consumers should expect no less from digital products.
D. Individual Empowerment through Transparency and Control
Transparency about how companies collect and use data will ultimately shed light on discriminatory practices. Providing individuals with understandable, easy-to-read privacy policies detailing data collection puts the individual in the driver's seat. This transparency, coupled with giving users the ability to access, correct, or delete their data, lets individuals make empowered choices. They can choose to access and correct their data, opening pathways to self-sufficient fixes for inaccurate background check reports, which disproportionately harm Black and Brown people.43 Provisions contained in the American Privacy Rights Act give individuals the power to delete their data and empower them to protect themselves. They can reduce their data footprint, or take away their data from insecure third parties, minimizing the risk of fraud, identity theft, and exploitation. Or they can port their data to another company that will give them better service.
Additionally, new provisions boost individual autonomy, including a prohibition on "dark pattern" user interfaces designed to thwart decision-making and deceive users, depriving them of the choices they hold. This addition is to be applauded and protected. Likewise, the prohibition on pre-dispute arbitration agreements for those under the age of 18 or those alleging a substantial privacy harm--which includes harms from discrimination--also works to empower individuals. Individuals should not be denied access to a court of law when seeking to protect their privacy or the privacy of their children.
Requirements on third parties who buy, sell, and collect data, known colloquially as data brokers, also work to empower individuals and increase transparency. Currently, data brokers continually collect and amass data for sale, some of which may not be accurate. Most individuals are unaware of what information is bought or collected about them. This data is then used to conduct background checks for employment, housing, and other services, as well as for credit scoring. Inaccuracies disproportionately harm people of color, as well as those who have a conviction or arrest record--even if that arrest never resulted in a conviction. This bill rectifies that harm by creating a data broker registry, reporting requirements, and a national opt-out registry. In addition, the bill requires businesses--including data brokers--to minimize the data they collect, restricts selling such data to third parties without consent, and provides individuals with a right to access, correct, or delete their data.
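As a purely hypothetical sketch of how a broker might honor such a national opt-out registry, the example below matches hashed identifiers against a registry file before any record is processed or sold. The registry format, API, and function names are invented for illustration.

```python
# Hypothetical sketch of honoring a national "do not collect" opt-out
# registry like the one described above. Format and names are invented.

import hashlib

def hashed_id(email: str) -> str:
    """Registries typically match on hashed identifiers so the list
    itself does not become another trove of personal data."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

def load_opt_out_registry(path: str) -> set[str]:
    """Assumed format: one hashed identifier per line."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def filter_records(records: list[dict], registry: set[str]) -> list[dict]:
    """Drop any individual who has registered an opt-out before the
    data is processed, shared, or offered for sale."""
    return [r for r in records if hashed_id(r["email"]) not in registry]
```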
However, the American Privacy Rights Act should tread carefully, because we see problems with the current provisions regarding data brokers.
In particular, broad carve-outs for service providers in different contexts could undermine enforcement and empowerment. Data brokers acting as service providers to a government entity are not covered by the current draft of the Act. This was not the case in ADPPA, which did apply to this category of service providers. Ad-tech companies and data brokers also have a recent history of attempting to exploit "service provider" exemptions to continue to exploit individuals' data.
E. Making Privacy Rights Real Through Real Enforcement
As encouraged as we are by some provisions of this Act, we know that data privacy legislation will only live up to its promise if it can be readily enforced. We are encouraged that the American Privacy Rights Act gives clear enforcement authority to federal, state, and individual actors. The best way for individuals to safeguard their rights is to be able to seek a remedy themselves, in a court of law, for the injury they suffer. This private right of action must be protected and expanded to include the core data minimization sections of the Act.
However, there is more to be done to make the promise of this legislation real for the individuals and agencies that would enforce it. As mentioned, the private right of action should reach the Act's core data minimization requirements, and regulators such as the Federal Trade Commission and the Consumer Financial Protection Bureau will need adequate resources and authority to take on their new responsibilities.44
* * *
44 Press Release, Consumer Fin. Prot. Bureau, CFPB Launches Inquiry Into the Business Practices of Data Brokers (Mar. 15, 2023).
* * *
IV. Conclusion
The "American Privacy Rights Act" is a promising piece of legislation aimed at solving critical problems that have lingered for too long. We appreciate this Committee's continued attention to the issue and the opportunity to testify on how to strengthen privacy protections for individuals and curb data-driven discrimination. The Lawyers' Committee looks forward to working on a bipartisan basis to strengthen this bill so that it can cross the finish line, ultimately securing long-overdue data privacy and online civil rights protections for all people, particularly Black people and other people of color who are most often targeted for harm in the digital world.
* * *
Original text here: https://d1dth6e84htgma.cloudfront.net/David_Brody_Testimony_IDC_Privacy_Hearing_2024_04_17_671151ad5b.pdf