Google to issue chatbot warning

By Tim Sandle     May 11, 2018 in Technology
Next time you make a phone call to a company, will you be listening to a human or a chatbot? It's becoming harder to tell. As part of a review of ethics, Google will warn people using its own services.
For some companies it might be quite convenient not to let on whether someone is listening to a human operator or a chatbot, especially if they are trying to save money by reducing human labor but wish to maintain the impression that a 'real person' is attending to the needs of the customer. For Google, this type of deception crosses an ethical line that the technology company is not prepared to cross.
Instead, as the BBC reports, Google has put in place measures to ensure that people are not fooled when they are called by the types of software bots that can convincingly copy the human voice. The aim is to counter this form of mimicry where Google's own services are involved.
The outcome of Google's pledge, as The Verge reports, means that any person telephoned by a chatbot will be immediately informed that they are conversing with a machine.
Google has raised this issue following a demonstration of the Duplex bot. The bot, according to Techradar, is an artificial intelligence agent that can make phone calls on behalf of a person. This is not simply dialing the number; this is about engaging in full conversation.
When the demonstration was run, several of those present praised Google for the technology but stated they were concerned that the bot's vocals were indistinguishable from those of a human. The voice was under the control of Google's DeepMind WaveNet software.
In a statement, reported by CNET, Google said: "We are designing this feature with disclosure built-in, and we'll make sure the system is appropriately identified."