In many contexts, an algorithm is too high-stakes a tool to be wielded without outside supervision. That’s a view shared by ethics experts and regulators.
Policy conversations are increasingly frequent, says Kak, ranging from when audits should be mandated to the role regulatory agencies should play in doing the mandating.
One early signal: The Algorithmic Accountability Act, introduced by a group of U.S. senators in April 2019, would require corporations to analyze and fix algorithms that result in biased or discriminatory practices against Americans.
“There need to be legal structures around how these are conducted and what standards they need to meet, rather than leave it up to the discretion of companies,” says Kak. “This includes creating pathways that allow external actors, whether researchers, journalists, or regulatory agencies, to audit these systems that are typically shielded by corporate secrecy laws.”
Those processes ultimately add “good” friction into algorithmic systems, she says, and they’re helped along by a wide cast. Auditing is not just a task for those who write the code; the process should include engineers, data scientists, social scientists, lawyers, designers, psychologists, and more, all engaging on “equal footing,” says Chowdhury.
“It’s about recognizing the different forms of expertise that are required to evaluate algorithms,” says Kak. “Lawyers, policy people, and researchers, but also so many of the community advocates that have made the largest strides...in the last few years, in creating a noise when they have been directly impacted by these systems.”
And just like Europe’s data protection laws, says Kak, it’s vital that the next wave of algorithmic accountability policy “create that space” to stop moving forward with a problematic system altogether, no matter the incentives or potential losses.
Already, some cities and companies have deemed certain algorithms too racist, sexist, or otherwise dangerous to use.
One example is the wave of public and private bans on certain uses of facial recognition tech that swept the country this year.
Algorithmic auditing is just one tool in this broader societal push to ensure that the emerging technologies we interact with are safe, fair, and used responsibly. In some cases, no amount of auditing can offset the dangers.