Op-Ed: Facial recognition — Big money, a lot of errors and real risks

By Paul Wallis     Jul 6, 2019 in Technology
London - Recent reports of the UK Metropolitan Police achieving an 81% error rate in facial recognition are bad enough. They pale in the face of the sort of money and politics being thrown into what is becoming a real mess.
Let’s start with the Met Police case, the current major disaster for facial recognition, to illustrate how the technology is being taken up everywhere, and so horribly mishandled.
The Metropolitan Police system isn’t just getting it wrong occasionally. It has been failing at that rate on an ongoing basis since 2016. If that’s so, it’s quite likely that any use of facial recognition would be very easy to challenge, and if not actually unlawful, would be subject to serious doubt in terms of reliability as evidence.
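To see how an error rate like 81% can arise even from a system with apparently respectable accuracy, consider the base-rate effect: when almost everyone scanned is not on a watchlist, even a tiny false-positive rate swamps the genuine matches. A minimal sketch, using entirely hypothetical numbers (the crowd size, watchlist count, and accuracy figures below are assumptions for illustration, not the Met’s actual parameters):

```python
# Illustrative only: hypothetical numbers showing how a face-matching
# system can flag mostly innocent people despite high per-face accuracy.

def false_match_share(population, watchlist_count, hit_rate, false_positive_rate):
    """Return the fraction of alerts that are false matches."""
    innocents = population - watchlist_count
    true_alerts = watchlist_count * hit_rate       # wanted people correctly flagged
    false_alerts = innocents * false_positive_rate  # innocent people wrongly flagged
    return false_alerts / (true_alerts + false_alerts)

# Assume 100,000 faces scanned, 10 of them actually on the watchlist,
# a 90% hit rate, and a 0.04% false-positive rate.
share = false_match_share(100_000, 10, 0.90, 0.0004)
print(f"{share:.0%} of alerts are false matches")  # prints "82% of alerts are false matches"
```

With these assumed numbers, roughly four out of five alerts point at the wrong person, even though the system classifies 99.96% of innocent faces correctly, which is broadly the shape of the figure reported for the Met’s trials.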
The story here is that a system using facial recognition can make lots of mistakes, probably through raw data sourcing. Camera lenses may have scratches or dust on them. The picture may not be too good. The imaging software may have bugs. Moisture may create a secondary micro-lens on part of an image, distorting it.
There are other issues, too, notably the fact that facial recognition technologies are particularly bad at recognising black faces compared to white faces. Black women are apparently the hardest for facial recognition systems to get right. It’s probably the first documented case of racial discrimination based on software.
Now consider those facts in terms of such technologies being of any real use at all in security, police, or other scenarios. Biometrics are designed to record unique profiles, not to act as search engines for cameras. This sort of data, however refined, simply cannot be infallible, or necessarily accurate enough to positively identify anyone.
There should also be a lot more attention to the fact that misidentification could be actionable at law, and subject to civil actions or even class actions. Sloppiness in tech products is usually accepted by manufacturers, distributors, and the sad little bastards who peddle this junkware. Elsewhere on Earth, it’s not so accepted, particularly if a person suffers any injury as a result of such pervasive, and clearly half-ass, tech.
Major risk - The Surveillance Society as a cash cow
How did you guess – The other gigantic problem is a total lack of interest in anything related to practical issues, thanks to the vast amount of money in surveillance. The vacuous venal vermin who spruik surveillance systems as the cure for people, whether they are the people you’re looking for or not, are doing well. Facial recognition tech is the new cash cow for them.
All you need is enough hype and paranoia to convince anyone that this tech will solve everything. Call it “innovative”, “disruptive” or any other buzzword, and you’ve sold your Gestapo In A Box. Like the Gestapo, it may or may not work at all, and is guaranteed not to be much fun for anyone.
Surveillance tech covers a vast range of technologies including facial recognition. Your Handy Dandy facial recognition tech is more than likely to be integrated with these other technologies. You can see why regulation around the world is being so downright illiterate. The fewer laws, the less risk for the makers and sellers.
It’s Christmas every day, especially when some new product comes on the market. They’re selling to employers, police, intelligence services, governments and anyone sufficiently motivated to want to identify people for any reason, real or imaginary.
Facial recognition is also used to record sales in shops and the like, so the market is truly gigantic. Meanwhile, the net outcome across the surveillance spectrum is lousy tech, big money, and a total lack of competence or care when it comes to the effects of using facial recognition.
The Surveillance State as a new hazard
China is one of the major suppliers of this whole class of technology, and one of those net outcomes comes straight from China’s version of the Surveillance State. The application of facial recognition technology will sound very familiar to China watchers, particularly those who know China’s Social Credit artificial intelligence-based surveillance system. Protesters in Baltimore were targeted by facial recognition technologies, and arrested not for protesting, but on any other charges available against individuals. Given the notorious inaccuracy of facial recognition in reading black faces, that doesn’t sound like it was much more than an excuse.
Better still, forensic sketches have been used as a basis for arrest on the basis of “reasonable suspicion”. Given that forensic sketches and profiles may be based on the blurry memories of people providing information about suspects, and/or “join the dots” generic imagery, accuracy apparently isn’t an issue.
Fun fact - It seems that half of the American adult public are on police facial recognition databases. The Atlantic has said “You no longer own your face” in many of its articles on surveillance and facial recognition, and it’s looking like a fair description of the facts. (In theory, you do own the rights to your own image, but who’s got the money to fight a case on that basis?)
Taking back privacy and rights
The risk is that a bit of software can wreck your life and invade your privacy, and there’s no real comeback unless a few geriatric politicians pass some laws. These surveillance systems amount to a sort of “presumption of guilt” against everyone. Facial recognition can start a totally erroneous legal procedure against any person on no basis at all but its software’s output.
The quick legislative fix is that verification of facial recognition must be a minimum standard of evidence before the laying of charges. The right to use facial recognition as evidence must be based on the admissibility of that evidence in court, with strict cross-examination if it is admitted. (Like forensic evidence, electronic evidence is easy to challenge using expert witnesses.)
A vague picture of someone cannot be the sole basis for legal procedures, let alone police prosecutions. As usual, the law is way behind the technology, and it’s high time it caught up.
This opinion article was written by an independent writer. The opinions and views expressed herein are those of the author and are not necessarily intended to reflect those of DigitalJournal.com