US legislators have introduced a bill that would require large companies to audit machine-learning-powered systems, such as facial recognition or ad-targeting algorithms, for bias.
The Algorithmic Accountability Act
The act is sponsored by Senators Cory Booker and Ron Wyden, both Democrats. In the House, an equivalent bill is sponsored by Representative Yvette Clarke, also a Democrat. If the bill passes, it would direct the Federal Trade Commission to create rules for evaluating highly sensitive automated systems. Companies that use such systems would need to audit the algorithms powering them to determine whether they are biased or discriminatory, and whether they pose a threat to the privacy or security of individuals.
The act targets companies that handle large amounts of information: those making over $50 million per year, holding information on at least 1 million people or devices, or acting primarily as data brokers that buy and sell consumer data. These companies would be required to examine a wide range of algorithms, including those that could affect a consumer's legal rights, attempt to predict or analyze consumer behavior, involve large amounts of sensitive information, or systematically monitor a large, publicly accessible physical place. In theory, this covers a huge swath of the tech economy. If an audit uncovers major risks of discrimination, privacy problems, or other issues, the company involved would have to address them within a reasonable time.
Facebook sued for bias
The bill comes just weeks after Facebook was sued by the US Department of Housing and Urban Development, which alleged that the company's algorithms produced a targeting system that unfairly limited who saw housing ads. The sponsors mentioned this lawsuit in a press release.
The suit is reported in a recent article: "The Department of Housing and Urban Development announced Thursday it is suing Facebook for violating the Fair Housing Act by allowing advertisers to limit housing ads based on race, gender and other characteristics. The agency also said Facebook’s ad system discriminates against users even when advertisers did not choose to do so. ProPublica first reported in 2016 that Facebook allowed housing advertisers to exclude users by race. Then in 2017, ProPublica found that—despite Facebook’s promised changes—the company was still letting advertisers exclude users by race, gender, ethnicity, family status, ability and other characteristics."
Bill covers many controversial AI areas
The bill would also cover training data that could lead to biased outcomes. For example, a facial recognition system trained mostly on white subjects can, as a result, misidentify people of other races.
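The effect described above can be illustrated with a toy simulation (the numbers and the threshold "model" below are entirely hypothetical, not drawn from the bill or any real system): when one group dominates the training data, a model tuned for overall training accuracy can perform well on that group while failing badly on the underrepresented one.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 1-D "recognition score" data: label 1 = correct match, 0 = non-match.
# Group B's score distribution is shifted relative to group A's.
def group(pos_mu, neg_mu, n):
    x = np.concatenate([rng.normal(pos_mu, 0.5, n), rng.normal(neg_mu, 0.5, n)])
    y = np.concatenate([np.ones(n), np.zeros(n)])
    return x, y

xa, ya = group(2.0, 0.0, 500)  # 1000 training samples from group A
xb, yb = group(4.0, 2.0, 25)   # only 50 training samples from group B

x_train = np.concatenate([xa, xb])
y_train = np.concatenate([ya, yb])

# "Training": pick the decision threshold that maximises accuracy
# on the imbalanced training set.
candidates = np.linspace(x_train.min(), x_train.max(), 200)
accs = [((x_train > t) == y_train).mean() for t in candidates]
threshold = candidates[int(np.argmax(accs))]

# Evaluate separately per group on fresh samples.
def accuracy(pos_mu, neg_mu, t, n=2000):
    x, y = group(pos_mu, neg_mu, n)
    return ((x > t) == y).mean()

acc_a = accuracy(2.0, 0.0, threshold)
acc_b = accuracy(4.0, 2.0, threshold)
print(f"group A accuracy: {acc_a:.2f}, group B accuracy: {acc_b:.2f}")
```

Because group A supplies 95% of the training data, the learned threshold sits where group A's matches and non-matches separate, and most of group B's non-matches fall on the wrong side of it. The aggregate accuracy looks fine, which is exactly why the bill's per-system audits matter: the disparity only shows up when results are broken out by group.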
Senator Ron Wyden underlined the importance of auditing for bias, saying that “computers are increasingly involved in the most important decisions affecting Americans’ lives — whether or not someone can buy a home, get a job or even go to jail. But instead of eliminating bias, too often these algorithms depend on biased assumptions or data that can actually reinforce discrimination against women and people of color.”
As reported in an earlier Digital Journal article, the academic world has already noted the importance of ethics in the field of AI: "The advances of artificial intelligence together with machine learning into greater areas of life requires a review of the ethical and human-facing implications, according to the University of Guelph. A new hub has been launched to address such issues."
Back in October 2017, Google's AI chief expressed concerns about algorithms: "John Giannandrea, AI chief at Google, is concerned that bias is being built into many of the machine-learning algorithms by which decisions are made. At a recent Google conference on the relationships between AI systems and humans, Giannandrea said: “The real safety question, if you want to call it that, is that if we give these systems biased data, they will be biased.”"
Previously published in Digital Journal