Rights Group Calls for End to Police Facial Recognition
16 May 2018, 03:22 | Kelvin Horton
Facial recognition software has been found to be wrong over 91% of the time
Big Brother Watch (BBW), a civil rights organization from the UK that "works to roll back the surveillance state", released a report in which it reveals that the UK Metropolitan Police's experimental facial recognition system is wrong 98% of the time, thus making it virtually useless.
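The "wrong 98% of the time" figure is a false-positive rate: of all the alerts the system raised, only a small fraction turned out to flag the right person. A minimal sketch of that arithmetic, using illustrative round numbers rather than the report's own counts:

```python
# Illustrative only: how a false-positive rate like the reported 98% is derived.
# The alert counts below are made-up round numbers, not figures from the report.

def false_positive_rate(total_alerts: int, true_matches: int) -> float:
    """Share of system alerts that flagged the wrong person."""
    false_alerts = total_alerts - true_matches
    return false_alerts / total_alerts

# e.g. 100 alerts, of which only 2 correctly identified someone:
rate = false_positive_rate(total_alerts=100, true_matches=2)
print(f"{rate:.0%}")  # prints "98%"
```

Note that this rate says nothing about how many wanted people the system *missed* (false negatives), which is a separate measure of accuracy.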
The privacy group also said that "automated facial recognition technology is now used by United Kingdom police forces without a clear legal basis, oversight or governmental strategy".
"We have been extremely disappointed to encounter resistance from the police in England and Wales to the idea that such testing is important or necessary", Big Brother Watch said in the report.
South Wales Police used the facial recognition software at various events including the Uefa Champions League 2017 final in Cardiff, worldwide rugby matches and concerts.
London's Metropolitan Police has tested AFR at a total of three events, including the city's Notting Hill carnival in 2016 and 2017, and a "Remembrance Sunday" event in November, the watchdog discovered.
Currently, there is no legislation in the United Kingdom that regulates the use of facial recognition systems through CCTV cameras by the police, nor is there any independent oversight for the police's use of these systems.
The UK's Information Commissioner has threatened to take legal action over the use of facial recognition in law enforcement if the police and government cannot prove the technology is being deployed legally.
"When trialling facial recognition technologies, forces must show regard to relevant policies, including the Surveillance Camera Code of Practice and the Information Commissioner's guidance", it said in a statement. The HD cameras detect all the faces in a crowd and compare them with existing police photographs, including mug shots.
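The matching step described above is typically done by reducing each detected face to a numeric feature vector and comparing it against vectors computed from the watch-list photographs. The sketch below is a hypothetical illustration of that idea using cosine similarity; the function names, vectors and the 0.8 threshold are all assumptions for illustration, not details of any police system.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity of two face feature vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_watchlist(face: list[float],
                            watchlist: dict[str, list[float]],
                            threshold: float = 0.8) -> list[str]:
    """Return watch-list entries whose similarity to `face` meets the threshold."""
    return [name for name, ref in watchlist.items()
            if cosine_similarity(face, ref) >= threshold]

# Toy example with 2-D vectors (real systems use hundreds of dimensions):
watchlist = {"suspect_a": [1.0, 0.0], "suspect_b": [0.0, 1.0]}
print(match_against_watchlist([1.0, 0.1], watchlist))  # prints "['suspect_a']"
```

The threshold is the critical tuning knob: set it low and the system raises many alerts, most of them false positives of the kind the report counts; set it high and it misses genuine matches.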
Big Brother Watch's report found that the police's use of the technology is "lawless" and could breach the right to privacy protected by the Human Rights Act.
How have the police forces responded?
"Regarding 'false' positive matches, we do not consider these as false positive matches because additional checks and balances are in place to confirm identification following system alerts", it said in a statement.
The first such match was at Notting Hill, but the person identified was no longer wanted for arrest because the information used to generate the watch list was out of date.
It also raised concerns about racial bias in the kit used, criticising the Met Police for saying it would not record ethnicity figures for the number of individuals identified, either correctly or not.
It said a "number of safeguards" prevented any action being taken against innocent people.
However, the British people will need to decide whether they want to live in a world where they are continuously watched, intrusively surveilled and biometrically tracked, and consider how that may affect their fundamental rights. Most of the people whose faces were scanned were also not notified that the police system had "matched" them as targets.
The Home Office told the BBC it plans to publish its biometrics strategy in June, and it "continues to support police to respond to changing criminal activity and new demands".