Op-Ed: Facial recognition — Big money, a lot of errors and real risks

Let’s start with the Met Police case, the current major disaster for facial recognition, to illustrate how the technology is being taken up everywhere and so horribly mishandled.
The Metropolitan Police system isn’t just getting it wrong occasionally; it has been failing at a high rate on an ongoing basis since 2016. If that’s so, any use of facial recognition would be very easy to challenge, and if not actually unlawful, would be subject to serious doubt in terms of its reliability as evidence.
The story here is that a system using facial recognition can make a lot of mistakes, probably starting with raw data sourcing. Camera lenses may be scratched or dusty. The picture may be poor. The imaging software may have bugs. Moisture can form a secondary micro-lens on part of the frame, distorting the image.
There are other issues, too, notably that facial recognition technologies are particularly bad at recognising black faces compared with white ones. Black women are apparently the hardest for facial recognition systems to get right. It’s probably the first documented case of racial discrimination carried out by software.
Now consider those facts in terms of whether such technologies are of any real use at all in security, policing, or other scenarios. Biometrics are designed to record unique profiles, not to act as search engines for cameras. This sort of data, however refined, simply cannot be infallible, or necessarily accurate enough to positively identify anyone.
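To make the false-positive problem concrete, here is a minimal sketch in Python (with a hypothetical embedding size, watchlist and threshold, not any vendor’s real pipeline) showing why a one-to-many search against a large watchlist inevitably throws up spurious “matches” at a fixed similarity threshold:

# Hypothetical illustration only: faces are stand-in 128-dimensional
# embedding vectors, "matched" by a cosine-similarity threshold.
import numpy as np

rng = np.random.default_rng(42)
DIM = 128          # embedding size, purely illustrative
GALLERY = 10_000   # enrolled faces on the hypothetical watchlist
THRESHOLD = 0.25   # illustrative operating point, not a real system's value

def unit(v):
    # Normalise to unit length so dot products are cosine similarities.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

watchlist = unit(rng.normal(size=(GALLERY, DIM)))  # stand-ins for enrolled faces
probe = unit(rng.normal(size=DIM))                 # someone NOT on the watchlist

scores = watchlist @ probe                         # similarity against everyone
false_hits = np.flatnonzero(scores >= THRESHOLD)

print(f"best impostor score: {scores.max():.3f}")
print(f"false 'matches' above threshold: {false_hits.size} out of {GALLERY}")

Even with completely unrelated vectors, a large enough gallery produces spurious near-matches at any fixed threshold, and real-world errors are worse because they are correlated with lighting, camera quality and demographics. That is how a system ends up “identifying” people who were never there.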
There should also be a lot more attention to the fact that misidentification could be wrongful at law, and subject to civil actions or even class actions. Sloppiness in tech products is usually accepted by the manufacturers, the distributors, and the sad little bastards who peddle this junkware. Elsewhere on Earth, it’s not so accepted, particularly if a person suffers any injury as a result of such pervasive, and clearly half-assed, tech.
Major risk – The Surveillance Society as a cash cow
How did you guess – The other gigantic problem is a total lack of interest in anything related to practical issues, thanks to the vast amount of money in surveillance. The vacuous, venal vermin who spruik surveillance systems as the cure-all for finding people, whether they are the people you’re looking for or not, are doing well. Facial recognition tech is the new cash cow for them.
All you need is enough hype and paranoia to convince anyone that this tech will solve everything. Call it “innovative”, “disruptive” or any other buzzword, and you’ve sold your Gestapo In A Box. Like the Gestapo, it may or may not work at all, and is guaranteed not to be much fun for anyone.
Surveillance tech covers a vast range of technologies, including facial recognition, and your Handy Dandy facial recognition tech is more than likely to be integrated with those other technologies. You can see why regulation around the world is so downright illiterate: the fewer the laws, the less risk for the makers and sellers.
It’s Christmas every day, especially when some new product comes on the market. They’re selling to employers, police, intelligence services, governments and anyone sufficiently motivated to want to identify people for any reason, real or imaginary.
Facial recognition is also used to record sales in shops and the like, so the market is truly gigantic. Meanwhile, the net outcome across the surveillance spectrum is lousy tech, big money, and a total lack of competence or care when it comes to the effects of using facial recognition.
The Surveillance State as a new hazard
China is one of the major suppliers of this whole class of technology, and one of the outcomes comes straight from China’s version of the Surveillance State. The application of facial recognition technology will sound very familiar to China watchers, particularly those who know China’s Social Credit artificial intelligence-based surveillance system. Protesters in Baltimore were targeted by facial recognition technologies and arrested not for protesting, but on whatever other charges were available against individuals. Given the notorious inaccuracy of facial recognition at reading black faces, that doesn’t sound like much more than an excuse.
Better still, forensic sketches have been used as grounds for arrest on the basis of “reasonable suspicion”. Given that forensic sketches and profiles may be based on the blurry memories of people providing information about suspects, and/or “join the dots” generic imagery, accuracy apparently isn’t an issue.
Fun fact – It seems that half of the American adult public are on police facial recognition databases. The Atlantic has said “You no longer own your face” in its many articles on surveillance and facial recognition, and it’s looking like a fair description of the facts. (In theory, you do own the rights to your own image, but who’s got the money to fight a case on that basis?)
Taking back privacy and rights
The risk is that a bit of software can wreck your life and invade your privacy, and there’s no real comeback unless a few geriatric politicians pass some laws. These surveillance systems impose a sort of “presumption of guilt” on everyone. Facial recognition can start a totally erroneous legal procedure against any person on no basis at all but its own software.
The quick legislative fix is to make verification of facial recognition a minimum standard of evidence before charges are laid. The right to use facial recognition as evidence must be based on its admissibility to the court, and on strict cross-examination if it is admitted. (As with forensics, electronic evidence is easy to challenge using expert witnesses.)
A vague picture of someone cannot be the sole basis for legal procedures, let alone police prosecutions. As usual, the law is way behind the technology, and it’s high time it caught up.

Written By

Editor-at-Large based in Sydney, Australia.
