
Published on September 12th, 2022


Campaigners call on police to end use of face recognition technology



Fourteen campaign groups have published an open letter calling on the Metropolitan Police Commissioner to stop using “useless and highly invasive” facial recognition technologies.

Fourteen campaign groups have written to Metropolitan Police Commissioner Sir Mark Rowley on his first day in the job, requesting an end to the use of facial recognition technologies by police forces. They called the technology “privacy-eroding, inaccurate and wasteful”.

Big Brother Watch, Liberty and Black Lives Matter UK were among the organisations that signed the letter.

Facial recognition technologies have increasingly been used by police authorities around the world to fight crime. The Metropolitan Police and South Wales Police are among the forces known to have used these technologies. However, their use has led to civil rights challenges and condemnation from human rights groups, who argue that the technology is often inaccurate and biased.

“During observations at deployments, Big Brother Watch has witnessed multiple false positive matches, which have led to innocent individuals being forced to prove their identity to police officers,” the campaigners’ letter said. “If the use of this technology becomes more widespread, these incidents will become commonplace, resulting in further injustices and increased public mistrust of the Met.”

The groups claimed that 87 per cent of the alerts generated by current facial recognition technologies result in misidentifications. Cases cited include the misidentification of a 14-year-old black schoolboy in uniform and of a French exchange student who had been in the country for only a few days.

The groups also said the technology is less accurate for women and people of colour, despite being deployed in areas with a higher proportion of ethnic minorities.

“Public trust in the police has collapsed in the capital and is being further damaged by the Met’s repeated use of Orwellian facial recognition technology which is both useless and highly invasive,” said Silkie Carlo, director of Big Brother Watch.

“These Minority Report-style cameras have done absolutely nothing to reduce high rates of violent crime but risk putting our police on a par with those in surveillance states like China and Russia. They have no place in a democracy.”

In addition to the technology’s failures, campaigners have criticised the breach of citizens’ privacy that comes with its use.

Earlier this year, an independent review of UK legislation, commissioned by the Ada Lovelace Institute, called for the government to pass laws governing biometric technologies and ensuring their ethical use.

Last year, the UK’s Information Commissioner’s Office (ICO) expressed similar concerns regarding the reckless and inappropriate use of facial recognition in public spaces, banning facial-recognition company Clearview AI and demanding that the company delete all the data it held relating to UK citizens. Clearview AI’s business model was based on scraping billions of publicly available images from social media to train its facial-recognition software, which was later sold to law enforcement agencies to help identify people from closed-circuit television footage.

“We all have the right to go about our lives without being surveilled by the police,” said Martha Spurrier, director of Liberty. “But the Metropolitan Police’s use of live facial recognition is violating our rights and threatening our liberties.”

A Met spokeswoman responded by saying: “Live Facial Recognition (LFR) is a technology that has been helping the Met to locate dangerous individuals and those who pose a serious risk to our communities.

“The Met has primarily focused the use of LFR on the most serious crimes: locating people wanted for violent offences, including knife and gun crime, or those with outstanding warrants who are proving hard to find.

“Operational deployments of LFR technology have been in support of longer-term violence reduction initiatives and have resulted in a number of arrests for serious offences including conspiracy to supply Class A drugs, assault on emergency service workers, possession with intent to supply Class A & B drugs, grievous bodily harm and being unlawfully at large having escaped from prison.

“False alert rates across our operational deployments are between 0% and 0.08%.”
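The campaigners’ 87 per cent figure and the Met’s 0.08 per cent figure measure different things: the former is the share of alerts that turned out to be wrong, the latter the share of all faces scanned that wrongly triggered an alert. As a purely illustrative sketch, using assumed round numbers rather than any published Met data, even a very low per-scan false alert rate can mean that most alerts are misidentifications when very few of the people scanned are actually on a watchlist:

```python
# Illustrative only: all figures below are assumptions, not published Met data.
scanned = 100_000          # faces scanned during a deployment
on_watchlist = 10          # people in the crowd who are genuinely wanted
false_alert_rate = 0.0008  # 0.08% of scans of innocent people trigger an alert
true_match_rate = 0.7      # assumed share of wanted people correctly flagged

false_alerts = (scanned - on_watchlist) * false_alert_rate  # ~80 wrong alerts
true_alerts = on_watchlist * true_match_rate                # 7 correct alerts

share_wrong = false_alerts / (false_alerts + true_alerts)
print(f"Share of alerts that are misidentifications: {share_wrong:.0%}")  # ~92%
```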

In 2021, the Council of Europe, a 47-country human rights and democracy organisation, published a set of guidelines [PDF] for governments, lawmakers, providers and businesses laying out its proposals for the use of facial recognition technologies.
