Important news that deserves serious scrutiny and inclusion in the current debate about law enforcement reform:
IBM is canceling its facial recognition programs and calling for an urgent public debate on whether the technology should be used in law enforcement.
In a letter to Congress on Monday, IBM CEO Arvind Krishna said the company wants to work with lawmakers to advance justice and racial equity through police reform, educational opportunities and the responsible use of technology.
"IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values," he added.
Why? What could be the problem with what should be an objective, data-based approach to keeping communities safe?
The problem of algorithmic bias in data science has become more pronounced, and there's evidence that AI-powered algorithms display bias against women and black people. Federal researchers found widespread evidence of racial bias in nearly 200 facial recognition algorithms in an extensive US government study last year, highlighting the technology's potential for misuse.
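To make concrete what "racial bias" means in this context: audits like the federal study compare how often an algorithm falsely matches two different people, broken down by demographic group. Here is a minimal illustrative sketch (this is not the study's actual methodology, and the data below is entirely synthetic and hypothetical) of computing a per-group false match rate:

```python
# Illustrative sketch of a per-group false match rate calculation.
# NOT the federal study's methodology; all data here is synthetic.
from collections import defaultdict

# Each record: (demographic_group, algorithm_reported_match, truly_same_person)
results = [
    ("group_a", True, False), ("group_a", False, False),
    ("group_a", True, True),  ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False),
    ("group_b", True, True),  ("group_b", False, False),
]

def false_match_rate(records):
    """False matches divided by all impostor (truly different-person) pairs."""
    impostors = defaultdict(int)
    false_matches = defaultdict(int)
    for group, reported, truly_same in records:
        if not truly_same:  # impostor pair: any reported match is a false match
            impostors[group] += 1
            if reported:
                false_matches[group] += 1
    return {g: false_matches[g] / impostors[g] for g in impostors}

print(false_match_rate(results))
```

In this toy data, group_a has 1 false match out of 3 impostor pairs while group_b has 2 out of 3; it is exactly this kind of disparity, measured at scale across real algorithms, that the study flagged.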
Widespread evidence? Oh, you mean like when Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots?
ACLU
In a test the ACLU recently conducted of the facial recognition tool, called “Rekognition,” the software incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime.
The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.).
That’s right. Amazon’s facial recognition software is RACIST AF. Too bad they proceeded to sell it to an undisclosed number of law enforcement agencies across the country.
Business Insider
When asked how many police departments across the US are using Amazon's facial recognition tech, Jassy said, "I don't think we know the total number of police departments that are using [Amazon's] facial recognition technology. We have 165 services in our technology infrastructure platform, and you can use them in any combination you want."
Amazon is notorious for tracking user behavior in minute detail across its website. The company apparently isn't applying that same attention to detail with its facial recognition technology.
What could go wrong.
Please note, both black people AND women face bias. I mean, the racism is a no-brainer. But women, too? Of course that shouldn't be a surprise once you consider the racial and gender disparities endemic to the companies behind facial recognition software:
Who can forget when now-former Google employee James Damore wrote that infamous treatise on the woeful indignity of focusing on gender parity (let's not also ignore that it has a Wikipedia page, even though he's just a rando white dude):
[That] Men and women have psychological differences that are a result of their underlying biology. Those differences make them differently suited to and interested in the work that is core to Google. Yet Google as a company is trying to create a technical, engineering, and leadership workforce with greater numbers of women than these differences can sustain, and it’s hurting the company.
Fast Company (2019)
The makeup of employees at the tech giants, particularly at the management level, remains predominantly white and male. The failure to significantly improve triggered the ire of the Congressional Black Caucus in a visit to Silicon Valley last year.
Well what could go wrong with an industry rife with misogyny and racism whose products could potentially determine who enters the criminal justice system and who doesn’t?
As we discuss the many ways to implement change when it comes to policing, let's not forget that for years now law enforcement has been soliciting more and more help from Big Tech in the area of facial recognition — with some disastrous results that bode poorly for the marginalized and communities of color.
So make no mistake, this news is a BIG. DEAL. That IBM is choosing to immediately highlight facial recognition’s proximity to law enforcement shows just how close we were to the end of the American Experiment (or its fulfillment if you’re a white male supremacist).
Let’s put the pressure on Congress to act and prevent the use and abuse of such practices.