Microsoft’s Cortana digital assistant gets sexually harassed

Microsoft knew from day one that Cortana, with her female voice and character, would be subject to some degree of harassment. It’s a problem faced by other virtual assistants too, many of which are female by default or can be switched to a female persona. Some assistants, such as Amazon’s Alexa, even have obviously female names.
In a CNN Money article, Microsoft’s Deborah Harrison explained how the company scripts Cortana to respond to questions that would be frowned upon in real-life conversation. The digital environment and “private” atmosphere of talking to Cortana lead many people to speak very freely, asking “her” for details she may not be comfortable revealing.
According to CNN, Harrison told the ReWork Virtual Assistant Summit in San Francisco this week that a “good chunk” of questions pitched to Cortana after her initial launch in 2014 — then available only to U.S. Windows Phone 8.1 users — asked her about her sex life or relationship details.
People also tried to discuss their own relationships, extract explicit language from the assistant or just hurl insults at the female persona. Microsoft realised it would need to sculpt Cortana’s personality to respond to these queries in a gender-neutral manner.
Harrison and her team of seven other script-writers began to work out efficient ways of shutting down inappropriate conversations. As well as scripting replies to actual commands, such as “Hey Cortana, add an appointment” or “Hey Cortana, what’s the weather?”, the team has to write for the casual conversation and playful mindset that is a key component of Cortana’s personality.
The team realised this playful personality couldn’t extend to vulgar requests from users, and gave Cortana a new characteristic: fierce independence. Despite having a female voice, avatar and persona (Cortana takes her name from the semi-nude character in Microsoft’s Halo video game series), the virtual assistant refuses to adhere to female-assistant stereotypes.
Harrison said, “We wanted to be very careful that she didn’t feel subservient in any way … or that we would set up a dynamic we didn’t want to perpetuate socially.” Today, Cortana refuses to deprecate herself and apologises only when it’s actually warranted.
The team shaped her into a character that remains approachable but keeps a degree of distance from the user. Like a real personal assistant, Cortana is available to help with “work” requests, but users shouldn’t expect anything more.
Users are still encouraged to build a relationship with the assistant. Frequent events around holiday seasons introduce users to new casual commands and banter but don’t compromise the “safe for work” mindset of Cortana’s personality. Harrison said, “If you say things that are particularly a**holeish to Cortana, she will get mad.”
Her comments are an interesting insight into the social challenges companies like Microsoft face when building virtual assistants. In this case, the issue isn’t dodgy voice recognition, slow loading times or limited functionality. Instead, it’s the stereotyping of women that has spread from real society into the digital world.
