What AI means for brand tone of voice

AI is reshaping the world around us. Robots now pick our fruit, spot fraudulent transactions on our credit cards, and keep our trains running smoothly. It’s easy to notice when machines make mistakes, but we often overlook their many quiet successes.

Now, the robots are coming for our words.

We think of language as inherently human – the thing that lifts us above both animals and machines. But day-to-day, much of what we say is predictable. When we see someone for the first time, we say ‘hello’. When a waiter comes over to us in a restaurant, we name some things from the menu. Lots of reporting journalism – election results, finance, sports – is the same. That’s why the Washington Post has successfully ‘employed’ Heliograf, a robot journalist, since 2016.

For companies, the growth of automated writing points to a tantalising future. Perfectly search-optimised websites. Ultra-effective ads. Customer service that always lives up to the brand’s tone of voice ideals, and involves far less human effort.

In fact, this bright future is already here. We can now order pizza by talking to Alexa, and stash money in our savings accounts with a Facebook Messenger chat. These experiences may be limited, but they are reliable, and often very convenient for users. AI also has applications in health: Babylon, a diagnostic AI, outperforms the average doctor in some tests.

But there’s a problem: us. Digital assistants like Alexa and Siri are slowly normalising the idea of automated conversations, but for a good while at least, many of us will remain cautious about them. When Google demoed its Duplex technology, which undertakes ‘real world’ conversations on behalf of busy users – think calling to book a haircut – people were shocked and unnerved by how natural it sounded.

This is partly a fear of big tech and poor data privacy. But it’s not just that. In computer animation, people talk about the ‘Uncanny Valley’: when animated characters look nearly human, but not quite human, we find them creepy. I think we can apply the same concept to language.

Duplex deliberately adds ‘ums’ and ‘errs’ for extra realism; Amazon is experimenting with changing Alexa’s tone when delivering bad news (your team lost; your birthday dinner has been cancelled). The other day, I started an online chat with my broadband provider, and for the first few messages I wasn’t sure if I was talking to a person or not. It wasn’t a great feeling. Until we reach the other side of this verbal Uncanny Valley, those experiences are inevitable, and they are a big problem for companies who dabble in automated writing.

Ultimately, this all comes down to the biggest question for any brand: trust. For now at least, companies need to clearly signpost when you’re talking to a computer, and give you a way to reach a human if you’d prefer one.

What happens next? I think the smart approach is to combine automation with human expertise. I can’t write 100 versions of an email and find out which gets the best clickthrough rate, but I can create three routes for a robot to start from. A computer may well be able to write a credible investment report, or help me troubleshoot an issue with my laptop, but I’m not sure it will nail a brand manifesto anytime soon.

When it comes to customer service, it may be a mistake to see a pitch-perfect tone of voice as the ultimate goal. Most of us enjoy talking to fickle humans, precisely because they go off-script every now and then. We don’t have a great experience calling British Gas; we have a great experience calling Sally at British Gas, who tells us about her cat, and that makes us feel better about using British Gas in the future. It’s telling that many recent advances in conversational AI have been about making the conversation ‘less perfect’ (see Duplex’s ‘ums’).

In the short term, this may be the most powerful thing conversational AI can offer companies: a reminder that a ‘consistent tone of voice’ isn’t really compatible with normal human conversation. You’re either perfect, or you’re human. You can’t be both.

There’s a lesson for all of us there.


A version of this article was published by Digital Arts.