
Microsoft's Cortana digital assistant gets sexually harassed?

By James Walker     Feb 7, 2016 in Technology
Microsoft has explained how it trains its Cortana digital assistant, available on Windows, Android and iOS, to respond to the "inappropriate" questions that users often ask. A "good chunk" of early queries asked Cortana about her sex life.
With a female voice and character, Microsoft knew Cortana would be subject to some degree of harassment from day one. It's a problem faced by other virtual assistants too, many of which are female by default or can be changed to a female persona. Some assistants, such as Amazon's Alexa, even have obviously female names.
In a CNN Money article, Microsoft's Deborah Harrison explained how the company scripts Cortana to respond to questions that would be frowned upon in real life. The digital environment and "private" atmosphere of talking to Cortana lead many people to talk very freely, asking "her" for details she may not be comfortable revealing.
According to CNN, Harrison told the ReWork Virtual Assistant Summit in San Francisco this week that a "good chunk" of questions pitched to Cortana after her initial launch in 2014 — then available only to U.S. Windows Phone 8.1 users — asked her about her sex life or relationship details.
People also tried to discuss their own relationships, extract explicit language from the assistant or just hurl insults at the female persona. Microsoft realised it would need to sculpt Cortana's personality to respond to these queries in a gender-neutral manner.
Harrison and her team of seven other script-writers began to work out efficient ways of shutting down inappropriate conversations. As well as adding replies to actual commands — such as "Hey Cortana, add an appointment" or "Hey Cortana, what's the weather" — the team had to script for the casual conversation and playful mindset that is a key component of Cortana's personality.
The team realised this personality is incompatible with vulgar requests from users and gave Cortana a new characteristic — fierce independence. Despite having a female voice, avatar and persona — Cortana derives from the semi-nude character of the same name in Microsoft's Halo video game series — the virtual assistant refuses to adhere to female-assistant stereotypes.
Harrison said "We wanted to be very careful that she didn't feel subservient in any way … or that we would set up a dynamic we didn't want to perpetuate socially." Today, Cortana refuses to belittle herself and only says sorry if it's actually required.
The team changed her into a character that remains accessible but is also distanced from the user. Like a real personal assistant, Cortana is available to help with "work" requests but users shouldn't expect anything more.
Users are still encouraged to build up a relationship with the assistant. Frequent events around holiday seasons introduce users to new casual commands and banter but don't compromise the "safe for work" mindset of Cortana's personality. Harrison said "if you say things that are particularly a**holeish to Cortana, she will get mad."
Her comments are an interesting insight into the social challenges companies like Microsoft face when building virtual assistants. In this case, it isn't dodgy voice recognition, slow loading times or limited functionality that is the issue. Instead, it's a general female stereotyping problem that has spread from real society into the digital world.