Op-Ed: Emotion detection software — Out of date, irrational, or naive?

By Paul Wallis     Jul 18, 2019 in Technology
Sydney - The Surveillance Society has achieved another biometric milestone as scientists try to debunk unreliable “emotion detection” software which they say is based on old science. This software is used for employment, market research, and other fun things.
A multi-university study has found some odd things. It seems some scientists believe facial expressions are reliable identifiers of emotions. The researchers say that's too simple, and that expressions like scowls may mean something quite different from anger. You might be trying to focus intensely on something, or have a stomach ache, for example, hence the expression. The researchers have been polite enough to call emotion detection "inexact". Ah…yeah…
The researchers are saying that while facial expressions do convey information, the information can be easily misinterpreted. The general finding is that a different approach, using body language, voice changes and visual cues, would be more comprehensive.
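The one-to-one assumption the researchers are attacking can be made concrete. Below is a purely illustrative sketch (the mappings and names are hypothetical, not drawn from any real product): the naive model treats each expression as a single emotion, while the study's point is that the same expression admits several readings.

```python
# Hypothetical illustration only -- no real emotion-detection product works
# from these exact tables. The naive model assumes one expression maps to
# exactly one emotion.
NAIVE_MODEL = {
    "scowl": "anger",
    "smile": "happiness",
    "wide_eyes": "fear",
}

# The researchers' counterpoint: each expression has many plausible readings.
AMBIGUOUS_READINGS = {
    "scowl": {"anger", "intense concentration", "stomach ache"},
    "smile": {"happiness", "politeness", "masked hostility"},
    "wide_eyes": {"fear", "surprise", "interest"},
}

def naive_read(expression: str) -> str:
    """One expression, one emotion -- the assumption under fire."""
    return NAIVE_MODEL[expression]

def plausible_readings(expression: str) -> set:
    """The same expression can signal several different internal states."""
    return AMBIGUOUS_READINGS[expression]

if __name__ == "__main__":
    for expr in NAIVE_MODEL:
        print(f"{expr}: naive={naive_read(expr)}, "
              f"plausible={sorted(plausible_readings(expr))}")
```

The gap between the two tables is the article's whole argument in miniature: a scowl that the naive lookup calls "anger" might just as well be concentration or indigestion.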
Out of date?
The “outdated” element in this situation is that for many years facial expressions have been used by security agencies and in a range of diagnostic fields. “Old tech” doesn’t necessarily mean totally out of date, but it can mean the tech and the theories are limited in scope and practical value. If facial expressions were a reliable indicator of terrorism, for example, how many people would be considered terrorists when driving?
Irrational? Try “naïve” instead.
Most people grow a whole working frame of reference for facial expressions from birth. Most people know at least a few other people who are very hard to read. Most psychotics have well-adapted social behaviours, including their facial expressions. The nice guy con man who tries to steal your bone marrow will always have a friendly expression. The average poker player knows how to control facial expressions, just on principle.
…So using software rather than your own judgment may be a rather unreliable way of hiring people or predicting your sparkling new psychopath’s next move. Consider for a moment the staggering lack of management skill implied by needing software to read expressions and understand them. Or do managers need to document in absurd detail the fact that so-and-so had good facial expressions to justify hiring someone?
The Human Resources and security hardware rackets have obviously found a new toy. That toy is now worth about $20 billion annually. That’s a great basis for trust. It makes money, it probably doesn’t work, and if it does, it’s also legally pretty iffy. According to The Guardian, use of the technology resulted in air travellers being selected for interrogation, basically at random.
Just plain dumb? Bet your idiotic overweight tech budget it’s dumb.
Adding to the hilarity and high dollar values, developers claim that this technology will be able to read people’s innermost emotions. Just think: all these nice two-dimensional screen-brains will be able to claim the sort of psychological insights no real psychologist would claim without serious, and lengthy, diagnosis.
For those of you who haven’t been paying attention since the Stone Age, there’s a whole new human habit called “evasion”. It exists in all social interactions. People avoid conflict, send reassuring signals and reach for a gun.
For example, using very basic psychology which has been around since about the Stone Age:
• An expression of fear may mean a very violent response: fight or flight.
• A smile may mean pure hatred.
• Deference to people in authority may mean total untrustworthiness.
• Assertive expressions may mean you are deeply in the proverbial, and no amount of prediction by some two-bit piece of software will save your sorry carcass from that assertion.
You’d have to be out of your mind (and possibly your other several minds as well) to believe people would be so socially gullible as to instantly deliver any old number of useful indicators. The whole idea is basically counterintuitive and wrong. Do you go around telling people how you feel about every little thing? Do you go poker-faced in a new social environment, like a workplace? Any actor can put on a show for a camera, and most people, rather annoyingly, think themselves good actors. Good or bad, they’ll be at least a bit misleading, and doing so deliberately.
This tech seems to be based on the general – and generally disproven – theory that everyone is so transparent you can simply take a picture of them and know everything there is to know about them. Emotion detection is exactly the sort of intrusion everyone will deeply resent. Most people will also be only too happy to deceive it as much as possible.
Nor is emotion detection much of an asset in practical terms, particularly in the workplace. Are you prepared to believe that a bit of software which is by definition simplistic, rigid and not designed to factor in current stress levels will predict a workplace shooting, for example?
Job interviews, for example, are already bureaucratic, heavily documented, and basically inefficient. The entire convoluted, paranoid process of hiring can be called into question by any competent manager. Workplace psychology has loaded itself up with any amount of tests and rituals, and this is one more. The net result is that both businesses and potential employees are barely able to tolerate the hiring process at all.
Emotion detection is just stacking on more highly debatable dead weight in areas where real judgment in real time is required. Your friendly local homicidal maniac is unlikely to sit still for a happy snap and biometrics.
Just read the current news about emotion detection, and you’ll see how far-reaching this technology is. Note the conspicuous lack of altruism, social sensitivity and other quaint ideas. Your phone and your selfies can be used as analytical tools, for or against you, mainly against, apparently.
Even the revered, sometimes justifiably revered, MIT is getting in on this fools’ crusade to further devalue human experience. Suggestion, researchers – When researching anything, consider what the research is likely to deliver to the happy collection of nutcases you’re working for. All else follows.
This opinion article was written by an independent writer. The opinions and views expressed herein are those of the author and are not necessarily intended to reflect those of DigitalJournal.com