
UK police testing retrospective facial recognition that could identify criminals and missing people


Police forces across the UK are experimenting with retrospective facial recognition (FR) technology to scan video footage and photographs for criminals and missing people, i can reveal.

British company Reveal, which supplies standard body cameras to 30 of the UK’s 43 police forces, said 10 forces had been evaluating its new FR technology’s effectiveness on clips and images of their own officers’ faces. 

Hampshire Constabulary, Humberside Police, North Wales Police and South Yorkshire Police are among the forces that have either started, or are poised to begin, trials of the retrospective software, which uses an algorithm that originated as a science project at a “well-known British university”.

Police could use the retrospective FR software to examine still images or videos submitted by the public, comparing faces against custody records to identify suspects who might otherwise have gone unapprehended.

Reveal also produces K-Series body cameras equipped with live FR to check the faces of passers-by against watch lists of suspected criminals – a practice human rights groups claim is inaccurate and fuels racial profiling, gender bias and mass surveillance. This live capability is not currently being tested by British police.

The new K9-Series body camera is equipped with live facial recognition AI (Photo: Reveal)

While some police forces have expressed “lots of interest in” the technology, they are “painfully aware” of the thresholds they would have to meet to deploy it in conformance with the law and the interests of the public, said Alasdair Field, Reveal’s chief executive.

“I don’t want mass surveillance and I don’t want my kids to inherit a world where you know you’re going to be spied on – that is not what I’m trying to do in any way,” he told i.

He said he was “very satisfied” that the company’s algorithm does not show undue bias, and very confident the technology would not be abused, thanks to Reveal’s close vetting and auditing policies.

Humberside Police said the force was in the “very early stages of working with [Reveal] to assess their suitability for our requirements,” while North Wales Police confirmed it was about to start testing on sample test data that was not live or from the public. 

“The force is exploring whether we could use this technology in the future to assist investigations by using this capability in a retrospective manner, subject to data protection considerations,” a spokesperson for Hampshire Constabulary told i.

“However, we have no plans to explore using this technology in a ‘live’ setting,” they added. 

A spokesperson for South Yorkshire Police said the force was in the very early stages of a pilot trial, adding that they thought it could be useful for identifying outstanding suspects.

Devon and Cornwall Police confirmed it had been shown demonstrations of Reveal’s software and was “open to considering developing technologies if it supports enhancing our service to the public, balanced against the collection of personal data.” 

A spokesperson for the National Police Chiefs’ Council (NPCC) said a national Facial Recognition Technology Board is in place to offer support and guidance to forces.

“It is for each individual force to make their own decisions on what facial recognition software they use and how it is implemented,” they said. 

The retrospective FR identified the writer’s face from a single image (Photo: i)

Reveal also sells non-FR body cams to 173 local councils and eight fire services across the UK, alongside Southern Railway, security services, prisons, retailers and more than 1,500 organisations in more than 40 countries.

The Information Commissioner’s Office (ICO) recently warned that live FR should not be used simply because it’s available, saying that businesses and organisations must demonstrate that it is for a specific purpose, is “reasonably necessary” and that they have considered and rejected other less intrusive monitoring methods “for good reason”.

Improving efficiency, reducing costs or fitting in with an existing business model are not sufficient reasons to use it, the data protection authority cautioned in June.

Elizabeth Denham, the UK’s Information Commissioner, said she was “deeply concerned” about the potential for the technology to be used “inappropriately, excessively or even recklessly”, adding that when sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the effects could be significant.

How Reveal’s FR technology works

Reveal’s FR customers create a watch list of faces for the software to scan for. In the case of the police, they’re most likely to use it to search for suspected criminals or missing persons.

The watch list must be specific to a purpose and limited to as few people as possible in order to satisfy GDPR rules and ICO recommendations.

The company’s live FR AI is built into its K9-Series body cams. It breaks each face it encounters down into biometric descriptors, effectively a unique number for each face that acts a bit like a fingerprint.

It analyses faces in real time and immediately discards the biometric data for any faces it determines are not matches against the watch list it’s been instructed to check against.
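
As a rough illustration of that pipeline – and emphatically not Reveal’s actual implementation – the matching step can be sketched in a few lines of Python. Everything here is an assumption for illustration: the `embed_face` stand-in, the 128-dimensional unit-normalised descriptors, the cosine-similarity comparison and the 0.6 threshold.

```python
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed cut-off; a real system would tune this carefully

def embed_face(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-embedding model: produces the 'biometric
    descriptor' described above. Here it is a deterministic dummy vector so
    the sketch runs without any model."""
    rng = np.random.default_rng(abs(int(face_image.sum())) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)  # unit-normalise so a dot product is cosine similarity

def check_frame(face_image: np.ndarray, watchlist: dict[str, np.ndarray]):
    """Compare one detected face against the watch list; return the matched
    identity or None. The descriptor is deleted either way, mirroring the
    immediate discarding of non-matching biometric data described above."""
    descriptor = embed_face(face_image)
    best_id, best_score = None, -1.0
    for identity, reference in watchlist.items():
        score = float(descriptor @ reference)  # cosine similarity
        if score > best_score:
            best_id, best_score = identity, score
    del descriptor  # biometric data is not retained
    return best_id if best_score >= MATCH_THRESHOLD else None

# Tiny demonstration with dummy 8x8 "images"
watchlist = {"person_a": embed_face(np.ones((8, 8)))}
print(check_frame(np.ones((8, 8)), watchlist))       # "person_a" (identical dummy input)
print(check_frame(np.full((8, 8), 3.0), watchlist))  # None (different input, no match)
```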

The retrospective FR software runs within the company’s desktop management software and is capable of recognising people of interest from a single photograph, including pictures taken of a screen and faces that are partially obscured or covered by masks.
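
The retrospective workflow differs from the live one mainly in direction: instead of checking each new face against a watch list, the descriptor of a single probe photograph is compared against descriptors already extracted from stored footage. A minimal sketch follows, again with an assumed threshold and assumed unit-normalised descriptors rather than anything from Reveal’s product:

```python
import numpy as np
from typing import Iterable

def retrospective_search(probe_descriptor: np.ndarray,
                         archive: Iterable[tuple[str, np.ndarray]],
                         threshold: float = 0.6) -> list[str]:
    """Given the descriptor of a single probe photograph, return the labels
    (e.g. clip name and timestamp) of archived faces that match it.
    Descriptors are assumed unit-normalised, as in the earlier sketch."""
    hits = []
    for label, descriptor in archive:
        if float(probe_descriptor @ descriptor) >= threshold:  # cosine similarity
            hits.append(label)
    return hits
```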

Mr Field said he was confident the system was capable of recognising people wearing face paint deliberately applied to trick FR, including paint that obscures the bridge of the nose and the position of the eyes.

Mr Field said that Reveal planned to ask its customers what their intentions for using FR were to decide whether it was justifiable in line with the rules and ethics in place within Europe and specifically the UK.

“The ICO is absolutely right, FR needs to be carefully managed and carefully thought out,” he said. “The very last thing I want to do is sleep badly at night knowing that we managed to enable nefarious governments to do bad things. 

“We will restrict where we sell on those criteria, our customers will have to demonstrate to us that they’ve got the policy and the training in place and we will review that.

“I do believe we’re the first credible company to combine facial recognition and bodycams – that might be unfair to say that, but I’d say we’re the first platform capable of supporting proper policing.”  

Civil rights groups have long opposed the live use of the controversial software.

Freedom of Information requests from campaign group Big Brother Watch, published in May 2019, found that facial recognition software used by the Metropolitan Police had incorrectly identified members of the public in 96 per cent of matches in trials between 2016 and 2018. A separate study from the University of Essex, commissioned by the Met to observe six of the software’s 10 test deployments, found it had been wrong in 81 per cent of cases.
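
Figures like these describe the share of alerts that turned out to be wrong, which can be very high even when the per-face error rate is low, because genuine watch-list targets are rare among the faces scanned. A worked example with assumed figures (illustrative only, not the Met’s data):

```latex
% Illustrative assumptions, not the Met's data: 100,000 faces scanned,
% 5 watch-list targets present (all correctly flagged), and a per-face
% false-positive rate of 0.1 per cent.
\[
\text{false alerts} = 0.001 \times 100\,000 = 100,
\qquad
\text{true alerts} = 5
\]
\[
\frac{\text{false alerts}}{\text{total alerts}} = \frac{100}{105} \approx 95\%
\]
```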

The UK’s Court of Appeal upheld an appeal brought by former Liberal Democrat councillor Ed Bridges against South Wales Police’s use of live FR in August last year, ruling the deployment unlawful and finding the force had not taken reasonable steps to determine whether the software had a gender or racial bias.

Silkie Carlo, director of Big Brother Watch, said greater police adoption of FR had “eye-watering possibilities for abuse”. 

“This could turn encounters with the police, whether at protests, on the roads or during stop and search, into an Orwellian police lineup resulting in yet more intrusive information gathering,” she said. 

“This Orwellian, deeply flawed spying technology has no place being sold or used in the UK. It is the stuff of dystopian fiction, not a functioning democracy.

“Live FR subjects innocent citizens to a constant, suspicionless police line up and obliterates privacy. This feature is more likely to result in legal challenges than contracts.”

Megan Goulding, lawyer at Liberty, said the “safest, and only, thing to do with facial recognition is to ban it.” 

“Last year in our landmark case the Court agreed that this dystopian surveillance tool violates our rights and threatens our liberty. Adding facial recognition to body-worn cameras would embed the everyday use of this oppressive and discriminatory technology in routine policing and make it impossible to know when we are being tracked and monitored via the secret collection of our sensitive personal data,” she said. 

“Facial recognition will not make us safer, it will turn public spaces into open-air prisons, and entrench patterns of discrimination that already oppress entire communities. 

“If this Government is serious about creating safer communities, it needs to listen to the people who experience discriminatory over-policing, stop relying on coercion, control and punishment and instead look at solutions that rebuild trust. The safest, and only, thing to do with facial recognition is to ban it.”

Mr Field is adamant that body cameras would make “crap mass surveillance devices,” maintaining that the cameras’ wide-angle lenses and relatively limited range would make them poor at recording incidents at anything beyond close quarters.

Research from the ICO in January 2019 found strong public support for the use of live FR for law enforcement purposes.

Of the 2,202 adults surveyed, 82 per cent said it was acceptable for the police to use live FR, with 72 per cent agreeing or strongly agreeing that it should be used on a permanent basis in areas of high crime.

However, research carried out on behalf of the London Policing Ethics Panel, the Mayor’s Office for Policing and Crime and the University College London Institute for Global City Policing found the majority of Asian and black people surveyed were opposed to the technology.

Another survey by the Ada Lovelace Institute found that 55 per cent of respondents believed that government should place limits on police use of live FR, while 29 per cent said they were uncomfortable with the police using it due to concerns including privacy and a lack of trust in the police to use live FR ethically.

“We can limit the number of faces on the camera to a relatively small amount in the terms of the licence granted to an operator, which, coupled with the wide-angle lens, prevents mass surveillance,” Mr Field said.

“We are acutely aware that one of the key problems with South Wales Police was not that the algorithm was biased, but that the force hadn’t done enough to satisfy itself that it wasn’t biased.”

Reveal’s FR scored “extremely highly” against the standards of the US National Institute of Standards and Technology (NIST), which evaluates the efficacy of FR algorithms submitted by agencies, organisations and businesses, and the company plans to submit it for formal accuracy testing before selling the technology, he added.

Mr Field said he had no plans to integrate the company’s live FR into more sophisticated cameras equipped with zoom lenses and wider fields of view, and while he acknowledged it would be possible for the technology to fall into the hands of bad actors, they would have to evade annual audits and lie about their intentions.

Despite being given an image of the Queen in her eighties, Reveal’s retrospective FR identified her in a photograph taken when she was a teenager (Photo: i)

While many countries within the European Union would present a “relatively strong” case for being able to use the technology, many others would not satisfy Reveal’s requirements in terms of human rights records and other indices, meaning the company would not sell to companies operating in countries such as China or Russia, he said.

“We would never sell to a government with a poor human rights record that didn’t have strong local legislation with oversight and that we didn’t know the purpose of.”

“The public appears to support the justifiable use of FR in quite large numbers – a significant proportion say that they think it’s a good idea,” he said.

“That doesn’t mean to say that there won’t be a very noisy bunch of people that think it’s a terrible idea, and I think it’s a terrible idea if it’s abused and used poorly and for mass surveillance. 

“We’re human beings and we’re trying our very best – not just because I’m frightened of Big Brother Watch having a go at us, but because I want to do the right thing. The benefits of this technology are so strong and I want to make sure it is entirely robust.

“The most important thing for me is sleeping well at night.”

