Orlando police once again ditch Amazon’s facial recognition software
Orlando police ditch Amazon’s facial recognition platform a second time
Amazon’s controversial Rekognition platform, its artificial intelligence-powered facial recognition software, is no longer being used by Orlando law enforcement, ending the second attempt to use the technology in a pilot phase in central Florida. The reason: the city didn’t have the necessary equipment or bandwidth to get it properly running and never once was able to test it live.
The news, reported today by Orlando Weekly, marks another high-profile setback for Rekognition, which has been plagued by criticism of its contributions to biased policing, unlawful surveillance, and racial profiling, as well as the clandestine way Amazon has gone about selling it to police departments while it’s still in active development.
The use of Rekognition by law enforcement first came to light in May of last year, thanks only to documents obtained and made public by the American Civil Liberties Union of Northern California. At the time, Amazon was selling the cloud-based facial recognition platform to police departments in Orlando and Oregon’s Washington County, but it had not made its pitches to law enforcement public and in fact took measures like non-disclosure agreements to keep them private.
However, Amazon received significant pushback from the AI community, activists, and civil rights organizations fearful its inherent flaws would contribute to unlawful surveillance and other rights-infringing activities like racial profiling and wrongful arrests. Research showed that Amazon’s system could return significant numbers of false matches, and that it had a harder time accurately identifying the gender of darker-skinned individuals and women.
Amazon remained steadfast in its defense that Rekognition was to be used as an ancillary tool for policing, and that officers were instructed to rely on it only when it returned a match with at least 99 percent confidence. But it’s not clear how actively Amazon is monitoring participating agencies for violations of its terms of service, which the company claims allow it to suspend or ban organizations and individuals that use Rekognition unlawfully or unethically. The company said last year that it would continue to sell the software to US law enforcement, despite significant criticism from both outside and within the company, including from employees, shareholders, and prominent AI researchers.
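For context, the 99 percent figure corresponds to the match-confidence threshold that the Rekognition API exposes to callers. A minimal sketch of how an agency might apply that threshold when searching a face collection with the AWS boto3 SDK is below; the collection, bucket, and file names are placeholders for illustration, not details from the Orlando pilot.

    import boto3

    # Hypothetical example: search a face collection for matches to a probe
    # image, keeping only results at or above the 99 percent similarity
    # threshold Amazon says officers should rely on. All names are made up.
    rekognition = boto3.client("rekognition")

    response = rekognition.search_faces_by_image(
        CollectionId="example-face-collection",
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "probe-photo.jpg"}},
        FaceMatchThreshold=99.0,  # minimum similarity score to report a match
        MaxFaces=5,
    )

    for match in response["FaceMatches"]:
        print(match["Face"]["FaceId"], match["Similarity"])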
As a result of the pressure, it appeared that Orlando let its contract with Amazon expire in late June of last year, The New York Times reported. But the pilot program began again, Orlando Weekly reports, in October of last year, when police tried to get the system running on four cameras stationed around the police department’s downtown headquarters and one camera located outside a community recreation center.
Now, roughly 10 months later, the program is again getting the axe. According to local police, it cost too much money and was far too cumbersome to install, with Amazon employees failing to help the city get even one reliable live stream up and running that could run the software in real time. The company reportedly offered to provide its own cameras, but the city refused to rely on Amazon hardware.
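Running Rekognition on a live camera feed is not simply a matter of pointing the software at a video file: it typically means streaming footage into AWS via Kinesis Video Streams and attaching a Rekognition stream processor to it, which is roughly the configuration Orlando reportedly never got working. A hedged sketch of that setup follows; every ARN, name, and threshold is a placeholder, not a value from the city’s pilot.

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical setup: attach a face-search stream processor to a camera
    # feed already flowing into Kinesis Video Streams. Placeholder values only.
    rekognition.create_stream_processor(
        Name="example-downtown-camera",
        Input={"KinesisVideoStream": {
            "Arn": "arn:aws:kinesisvideo:us-east-1:123456789012:stream/example-camera/1234567890"
        }},
        Output={"KinesisDataStream": {
            "Arn": "arn:aws:kinesis:us-east-1:123456789012:stream/example-matches"
        }},
        Settings={"FaceSearch": {
            "CollectionId": "example-face-collection",
            "FaceMatchThreshold": 99.0,
        }},
        RoleArn="arn:aws:iam::123456789012:role/example-rekognition-role",
    )

    # Once started, match results are written to the Kinesis data stream as
    # the video arrives.
    rekognition.start_stream_processor(Name="example-downtown-camera")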
“At this time, the city was not able to dedicate the resources to the pilot to enable us to make any noticeable progress toward completing the needed configuration and testing,” the city’s Chief Administrative Office wrote in a memo to the City Council. Orlando’s police department has “no immediate plans regarding future pilots to explore this type of facial recognition technology.” The city’s chief information officer, Rosa Akhtarkhavari, told Orlando Weekly of the second trial phase, “We haven’t even established a stream today. We’re talking about more than a year later.” Akhtarkhavari said the system was never tested on a live image even once.
The ACLU’s Matt Cagle, a technology and civil liberties attorney and vocal Rekognition critic who helped publicize Amazon’s work with law enforcement, said in a statement given to The Verge, “Congratulations to the Orlando Police Department for finally figuring out what we long warned — Amazon’s surveillance technology doesn’t work and is a threat to our privacy and civil liberties.” Cagle added, “This failed pilot program demonstrates precisely why surveillance decisions should be made by the public through their elected leaders, and not by corporations secretly lobbying police officials to deploy dangerous systems against the public.”
This is far from the end for Rekognition. The software is still in use in Oregon’s Washington County, with an April article from The Washington Post claiming it has “supercharged” police efforts in the state. That implementation, which mostly consists of a database that can cross-check uploaded photos of faces against known criminal databases, appears to be less invasive than a real-time video feed running the facial recognition tech on unsuspecting citizens.
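The upload-and-search workflow described there generally depends on first indexing known faces into a Rekognition collection, which then serves as the database that later uploads are checked against. A minimal, hypothetical sketch of that indexing step, with placeholder names and IDs, might look like this:

    import boto3

    rekognition = boto3.client("rekognition")

    # Hypothetical example: add a booking photo to a face collection so that
    # later uploads can be cross-checked against it. Names and IDs are made up.
    rekognition.create_collection(CollectionId="example-face-collection")

    rekognition.index_faces(
        CollectionId="example-face-collection",
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "booking-00123.jpg"}},
        ExternalImageId="booking-00123",  # caller-supplied label returned with matches
        MaxFaces=1,
    )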
Still, US cities are beginning to push back against unregulated use of facial recognition, with Oakland, California joining its Bay Area counterpart San Francisco in voting to ban government use of the technology just yesterday. The only other city with such a ban on the books is Somerville, Massachusetts, making three in total. More cities are expected to mount defenses against facial recognition software in the future, even as Amazon plows forward on pitching it to agencies around the country.
In a statement, Amazon defended its sale of the software to law enforcement. “We believe our customers — including law enforcement agencies and other groups working to keep our communities safe — should have access to the best technology. We also believe that facial recognition can materially benefit society, as we’ve seen with Amazon Rekognition’s use to combat human trafficking, as one example,” a spokesperson told The Verge.
“One customer alone has used Rekognition to identify over 9,000 trafficking victims. Over the past several months, we’ve talked to customers, researchers, academics, policymakers, and others to understand how to best balance the benefits of facial recognition with the potential risks,” the spokesperson added. “We outline clear guidelines in our documentation and blog for public safety use, where we also reiterated our support for the creation of a national legislative framework covering facial recognition.”
Source: The Verge