
CLARKE, WYDEN AND BOOKER INTRODUCE ALGORITHMIC ACCOUNTABILITY ACT OF 2022 TO REQUIRE NEW TRANSPARENCY AND ACCOUNTABILITY FOR AUTOMATED DECISION SYSTEMS

Legislation Requires Assessment of Critical Algorithms and New Public Disclosures; Bill Endorsed by AI Experts and Advocates; Bill Will Set the Stage For Future Oversight by Agencies and Lawmakers

FOR IMMEDIATE RELEASE:

February 7, 2022

MEDIA CONTACT: 

e: jeanette.lenoir@mail.house.gov 

c: 202.480.5737

Washington, D.C. – Representative Yvette D. Clarke, D-N.Y., together with U.S. Senators Ron Wyden, D-Ore., and Cory Booker, D-N.J., has introduced the Algorithmic Accountability Act of 2022, a landmark bill to bring new transparency and oversight to software, algorithms and other automated systems that are used to make critical decisions about nearly every aspect of Americans’ lives. 

“When algorithms determine who goes to college, who gets healthcare, who gets a home, and even who goes to prison, algorithmic discrimination must be treated as the highly significant issue that it is. These large and impactful decisions, which have become increasingly void of human input, are forming the foundation of our American society that generations to come will build upon. And yet, they are subject to a wide range of flaws from programming bias to faulty datasets that can reinforce broader societal discrimination, particularly against women and people of color. It is long past time Congress acted to hold companies and software developers accountable for their discrimination by automation,” said Rep. Clarke. “With our renewed Algorithmic Accountability Act, large companies will no longer be able to turn a blind eye towards the deleterious impact of their automated systems, intended or not. We must ensure that our 21st Century technologies become tools of empowerment, rather than marginalization and seclusion.”

“If someone decides not to rent you a house because of the color of your skin, that’s flat-out illegal discrimination. Using a flawed algorithm or software that results in discrimination and bias is just as bad. Our bill will pull back the curtain on the secret algorithms that can decide whether Americans get to see a doctor, rent a house or get into a school. Transparency and accountability are essential to give consumers choice and provide policymakers with the information needed to set the rules of the road for critical decision systems,” Wyden said. 

“As algorithms and other automated decision systems take on increasingly prominent roles in our lives, we have a responsibility to ensure that they are adequately assessed for biases that may disadvantage minority or marginalized communities,” said Sen. Booker. “This is why I am proud to reintroduce this legislation and create the transparency needed to prevent unwanted disparities and to hold bad actors accountable.”

The bill requires companies to conduct impact assessments for bias, effectiveness and other factors, when using automated decision systems to make critical decisions. It also creates, for the first time, a public repository at the Federal Trade Commission of these systems, and adds 75 staff to the commission to enforce the law. 

It is co-sponsored by Democratic Sens. Brian Schatz, D-Hawaii, Mazie Hirono, D-Hawaii, Ben Ray Luján, D-N.M., Tammy Baldwin, D-Wis., Bob Casey, D-Pa., and Martin Heinrich, D-N.M.

“Discrimination and bias can’t be left unchecked just because the decision is being made by an automated system and a faulty algorithm,” said Senator Hirono. “This bill will require companies to look at the impact of their automation and provide consumers with the knowledge of when and how they’ve been impacted. Consumers deserve fair and equitable treatment.”

“Too often, Big Tech’s algorithms put profits before people, from negatively impacting young people’s mental health, to discriminating against people based on race, ethnicity, or gender, and everything in between,” said Senator Baldwin. “It is long past time for the American public and policymakers to get a look under the hood and see how these algorithms are being used and what next steps need to be taken to protect consumers. I am proud to support the Algorithmic Accountability Act of 2022 to take the first step in giving the folks of Wisconsin and our country transparency and accountability from Big Tech’s harmful algorithms.”

Clarke, Wyden and Booker updated the 2019 Algorithmic Accountability Act after speaking with dozens of experts, advocacy groups and other stakeholders on how to improve the bill. The 2022 legislation shares the goals of the earlier bill, but includes numerous technical improvements, including clarifying what types of algorithms and companies are covered, ensuring assessments put consumer impacts at the forefront, and providing more details about how reports should be structured. A full summary of the bill is available here.

The Algorithmic Accountability Act is endorsed by a broad array of experts and civil society organizations: Access Now, Accountable Tech, Aerica Shimizu Banks, Brandie Nonnecke, PhD, Center for Democracy and Technology (CDT), Color of Change, Consumer Reports, Credo AI, EPIC, Fight for the Future, IEEE, JustFix, Montreal AI Ethics Institute, OpenMined, Parity AI and US PIRG.

“Big Tech’s problem of algorithmic bias has gone on for too long and we can no longer allow for these issues to go unregulated,” said Arisha Hatch, Vice President of Color Of Change. “When bias in algorithms goes unchecked, Black people are subjected to discrimination in healthcare, housing, education, and employment — impacting nearly all parts of our lives. In order to reduce the impact of this bias, Big Tech and their operations must proactively detect and address discrimination. Companies conducting their own audits is a first step but prevention will be key. The Algorithmic Accountability Act can effectively protect Black people from automated discrimination, equip the FTC with the resources necessary to enforce these protections and create a more equitable digital space. Color Of Change commends Sens. Wyden and Booker and Rep. Clarke for advancing racial justice equities in tech regulation. We hope Congress will pass this instrumental legislation.”

“Poorly designed algorithms can result in inaccurate outcomes, inconsistent results, serious discriminatory impacts, and other harms,” said Nandita Sampath, Policy Analyst at Consumer Reports. “The Algorithmic Accountability Act is an important foundation to provide researchers and policymakers with the tools to identify who can be impacted by these emerging technologies and how. We look forward to continuing to work with the sponsors of the bill to seek out the most effective ways to mitigate algorithmic harm.”

The bill was introduced with more than 30 original cosponsors in the House, including: Reps. Eleanor Holmes Norton (D-DC), Raul M. Grijalva (AZ-03), Gwen Moore (WI-04), Ayanna Pressley (MA-07), G.K. Butterfield (NC-01), Marc A. Veasey (TX-33), Karen Bass (CA-37), Steve Cohen (TN-09), Rick Larsen (WA-02), Lori Trahan (MA-03), Jamaal Bowman (NY-16), Stacey E. Plaskett (D-VI), David J. Trone (MD-06), Bonnie Watson Coleman (NJ-12), Emanuel Cleaver (MO-05), Mondaire Jones (NY-17), Sheila Jackson Lee (TX-18), Adriano Espaillat (NY-13), James P. McGovern (MA-02), Frederica S. Wilson (FL-24), Donald M. Payne (NJ-10), Alma S. Adams (NC-12), Robin L. Kelly (IL-02), Ilhan Omar (MN-05), Ro Khanna (CA-17), Jerry McNerney (CA-09), Brenda L. Lawrence (MI-14), Sean Casten (IL-06), André Carson (IN-07), Dwight Evans (PA-03), and Jared Huffman (CA-02).

The full bill text is available here.

A summary of the bill is available here.

A section-by-section of the bill is available here.

Supporters of the bill can be found here.

A web version of this release is available here.

###

Yvette D. Clarke has been in Congress since 2007. She represents New York’s Ninth Congressional District, which includes Central and South Brooklyn. Clarke is Chair of the Congressional Black Caucus Taskforce on Immigration, a Senior Member of the House Energy and Commerce Committee, and a Senior Member of the House Committee on Homeland Security.
