New York City, USA: facial recognition cameras reinforcing racist policing - new research
Neighbourhoods in the Bronx, Brooklyn and Queens subject to greater CCTV surveillance
Digital volunteers mapped more than 25,500 CCTV cameras across the city
‘The shocking reach of facial recognition technology in the city leaves entire neighbourhoods exposed to mass surveillance’ - Matt Mahmoudi
People in the US city of New York are subject to “shocking” mass surveillance through facial recognition technology cameras, with the invasive technology especially trained on areas of the city with greater concentrations of non-white residents, new research by Amnesty International and partners has revealed today.
New analysis - published as part of a global Ban The Scan campaign - shows how the New York Police Department’s vast surveillance operation particularly affects people already targeted for stop-and-frisk across the five boroughs of the city.
In the Bronx, Brooklyn and Queens the research shows that the higher the proportion of non-white residents, the higher the concentration of facial recognition-compatible CCTV cameras.
The findings are based on crowdsourced data obtained by thousands of digital volunteers as part of the Decode Surveillance NYC project. Volunteers mapped more than 25,500 CCTV cameras across New York City and Amnesty worked with data scientists to compare this data with statistics on stop-and-frisk and demographic data.
Facial recognition technologies for identification are systems of mass surveillance that violate the right to privacy, and threaten the rights to freedom of assembly, equality and non-discrimination.
The NYPD used facial recognition technologies in at least 22,000 cases between 2016 and 2019. Data on incidents of stop-and-frisk by the NYPD since 2002 shows Black and Latinx communities have been the overwhelming target of such tactics.
Last year, Amnesty sued the NYPD after it refused to disclose public records regarding its acquisition of facial recognition technologies and other surveillance tools. The case is ongoing. Amnesty is calling for a total ban on the use, development, production, sale and export of facial recognition technologies for mass surveillance purposes by both states and the private sector.
Matt Mahmoudi, Amnesty International’s Artificial Intelligence and Human Rights Researcher, said:
“Our analysis shows that the NYPD’s use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York City.
“We have long known that stop-and-frisk in New York is a racist policing tactic. We now know that the communities most targeted with stop-and-frisk are also at greater risk of discriminatory policing through invasive surveillance.
“The shocking reach of facial recognition technology in the city leaves entire neighbourhoods exposed to mass surveillance. The NYPD must now disclose exactly how this invasive technology is used.
“Banning facial recognition for mass surveillance is a much-needed first step towards dismantling racist policing, and the New York City Council must now immediately move towards a comprehensive ban.”
CCTV used as anti-protest ‘scare tactic’
During the Black Lives Matter protests of mid-2020, people in New York attending protests experienced higher levels of exposure to facial recognition technology surveillance. For example, a protester walking from the nearest subway station to Washington Square Park would have been under surveillance by NYPD Argus cameras for the entirety of their route.
Matt Mahmoudi added:
“When we looked at routes that people would have walked to get to and from protests from nearby subway stations, we found nearly total surveillance coverage by publicly-owned CCTV cameras, mostly NYPD Argus cameras.
“The pervasive use of facial recognition technology is effectively a digital stop-and-frisk. Mass surveillance technology at protest sites is being used to identify, track and harass people who are simply exercising their human rights.
“This is a deliberate scare tactic by the NYPD that has no place in a free society, and must be stopped immediately.”
Amnesty has launched a new website - https://nypd-surveillance.amnesty.org/ - that allows users to discover how much of any potential walking route between two locations in New York City might be exposed to facial recognition technology surveillance. Amnesty’s website also allows users to track how much of the technology is used between any of the major tourist attractions in the city by plotting the distance and possible route taken.
Amnesty is encouraging New Yorkers to take action by sending a letter of protest to their council member demanding the introduction of a bill that prohibits facial recognition technology to help protect their communities.
Amnesty’s research partners on the facial recognition technology project include: Julien Cornebise of the Computer Science Department, University College London; BetaNYC, a civic organisation; and Dr Damon Wischik, an independent data scientist.