
Op-Ed: Emotional recognition is coming to your life. You won't like it.

By Paul Wallis     Sep 4, 2019 in Technology
Sydney - You have to wonder about the aims of the seemingly endless digging into people’s minds by so many kinds of research. One of the least impressive, and riskiest, is emotion recognition. What use is it, and to whom?
The latest ideas in emotion recognition are being applied to Virtual Reality (VR) gaming. VR is the classic current “immersive” experience, and many people do love it. Using neural networks, it’s now possible to recognise emotions without many of the usual facial cues, such as the eyes, being included in the assessment.
One of the reasons for developing emotional recognition is that a game could respond to user emotions in real time. Well, hallelujah. Most people try to win games, not emotionally interact with them, so where’s the need?
A few points to make here:
1. Exactly how would the game respond to emotions in real time?
2. When else would it be responding, incidentally? Last Friday?
3. For what actual gaming purpose?
Imagine the interactions:
VR character to player: “You’re upset. I can tell.”
Player: “!!!&))# off, you no-dimensional (*&(*(. “
Sounds productive, doesn’t it? Imagine that scenario in a rage quit!
Call me just a bit infinitesimally skeptical, but “You wanna play VR? Hand over your emotions,” doesn’t sound like a great deal to me.
Security, the other side of emotion recognition
This kind of data has ramifications. You may or may not know that intelligence and security agencies use “emotion-reading” all the time. There are actually people walking around cities just watching other people, for that reason. Sometimes for a good reason, too. The very jumpy guy at the airport may well be a terrorist. The facial twitch of someone on the street might be the only warning of a stabbing attack.
As hard science, emotion recognition has a very long way to go. People can read emotions, and some do it well, but it’s tricky. Creating an A.I. version of emotion reading will be quite difficult, mainly because there are so many ways of making mistakes.
It seems very optimistic to assume that this tech can deliver practical value just out of a very badly defined hat, for spurious and equally badly defined reasons. Gamers are not famous for their patience. They are famous for their emotions, and also for making money out of gaming.
Imagine an emotional recognition use where you’re not reading the game, but reading your competitors. The hack psych’s dream; in practice, cheating, of course, and almost certain to be tried out soon enough. What’s in this for gamers?
The current situation
The current state of “play” is that multiple neural networks have been put to work on deep learning. One of them, DenseNet, achieved a 90% accuracy rate. Another excelled at registering “fear and disgust”.
Disgust? Some games are deliberately disgusting, but that’s their appeal, too. What if this tech miraculously turns into a way of hyping up the selling points? Seems pretty likely. Another breakthrough for tacky entertainment? Probably.
Privacy issues
Another point needs to be made very clearly. Emotions are private. People don’t willingly broadcast their emotions, most of the time. If emotional recognition can be used to someone’s detriment, it needs to be seen as a threat to basic privacy, on an existential level.
Now the good news, perhaps:
• If emotion recognition is encoded in any way, it becomes data. That data can be considered private by definition, because emotions are personal “property”. (What a hideous expression to use about deep feelings.)
• Behavioural information can also be construed as medical information, also private, depending on the circumstances.
• You could demand that emotional recognition features be turned off, citing privacy.
Nobody can or should be forced to undergo what is effectively a psychoanalysis by stealth. There is no basis for claiming that emotional recognition isn’t a clear intrusion into personal privacy.
Meanwhile, I’ll just keep my usual expression of extreme hostility parked in its usual spots on my face, and let them try to figure it out. Could take millennia.
This opinion article was written by an independent writer. The opinions and views expressed herein are those of the author and are not necessarily intended to reflect those of DigitalJournal.com