Association to Testify on DC’s Stop Discrimination by Algorithms Act

The District is the first jurisdiction in the nation to introduce a comprehensive bill that would hold companies accountable if their algorithmic decision-making programs harm vulnerable communities.

A public hearing is scheduled this week on the legislation that would hold businesses responsible for preventing biases in algorithms used in decision-making processes for “important life opportunities” and require them to report and correct algorithms that have shown biases in housing, employment, credit, and health care.

The Association has been engaged in ongoing discussions with stakeholders and has submitted a letter to the DC Council outlining its concerns. While the Association supports the intent of the bill – to protect vulnerable communities – the legislation as drafted is confusing, burdensome, and overreaching.

Businesses relying on a “service provider” – a vendor that performs “algorithmic eligibility determinations or algorithmic information availability determinations” – would be required to execute a written agreement with that provider to comply with the law. This is a key concern for credit unions: it is not feasible for each credit union to require Fannie Mae or the credit bureaus to certify compliance with the bill. Moreover, any company or platform that uses AI, auto-decisioning, or machine learning tools to assist decision-making has spent considerable time and money developing and protecting that information. Simply handing it over to the Attorney General’s Office for inspection would put those investments at significant risk and represents an overreach by the DC government.

The Association will testify on Thursday at the DC Council public hearing. You can view the bill here.
