DC’s Algorithm Bill is Out of Synch
The consumer credit industry is committed to equality and to addressing discrimination in the allocation of credit and financial services. That is why, earlier this week, the American Financial Services Association and a group of eight other consumer credit trade associations shared concerns regarding the District of Columbia’s proposed B24-0558, the “Stop Discrimination by Algorithms Act of 2021,” while highlighting the federal and District laws and policies that already protect consumers and prohibit discrimination in the credit and financial services industries.
As Karima Woods, Commissioner of the DC Department of Insurance, Securities and Banking, stated in her hearing testimony, “An algorithm is simply a process used to perform a calculation. Like any other technology, it is not inherently good nor bad.” But B24-0558 assumes that all automated tools using machine learning and personal data to make predictions and decisions are inherently bad, failing to recognize the everyday benefits that automated decision-making systems provide, and the local and federal policies in place that already protect consumers in such matters.
“B24-558 is a one-size-fits-all regulatory approach that fails to consider the unique legal and regulatory structure of the financial services industry. Financial institutions are already highly regulated and supervised—often by more than one regulator—and existing regulation and examination procedures capture the risks of algorithmic decision-making, making this approach redundant,” the letter notes. “Financial institutions and service providers already provide numerous disclosures in compliance with the laws outlined above and to ensure consumers understand their accounts and products. Adding new sets of consumer disclosures and notices duplicate the information already provided under other laws, but in a slightly different format. These proposed disclosures risk overloading consumers with information so voluminous that it may become meaningless, which may lead to more confusion and undermine the central purpose of such disclosures.”
As the letter notes, the financial services industry believes that technology can make financial services safer, more convenient, and more inclusive. Algorithms make credit decisions more accurate, fairer, faster, and more affordable by judging applicants on their creditworthiness. Algorithms also eliminate some of the risk of bias found in human interactions and can help identify products and services designed to benefit communities (including historically underserved populations), helping close the racial wealth gap. The use of algorithms is also crucial for protecting consumers and financial institutions alike from fraud.
The letter also notes that the use of technology is governed by District laws that provide increased transparency and consumer protections in all credit transactions, regardless of whether a transaction is conducted in person, handled manually, or involves an algorithm or automation. At the federal level, the Consumer Financial Protection Bureau (“CFPB”), Board of Governors of the Federal Reserve System (“FRB”), Office of the Comptroller of the Currency (“OCC”), Federal Deposit Insurance Corporation (“FDIC”), and the National Credit Union Administration (“NCUA”) are all actively engaged on the matter.
Read the full letter here. AFSA, joined by several other consumer credit trade associations, requested that the bill be withdrawn and that the DC Council bring all stakeholders together to identify policy approaches that both protect consumers and ensure their access to the financial services they require.
October 7th, 2022