Hi, Hey Or Hello? How To Brand Your Voice Assistant

Building a brand in a voice-controlled world

Voice branding has always been commercially important. Whether via television, radio ads, or recorded messages on customer helplines, businesses have always had to consider the tone of their communications. However, the growth of the modern voice economy has placed far greater emphasis on the voice alone – isolated from images, music, or any other supporting media.

In an increasingly voice-controlled world, what companies say and how they say it are equally important. Personal digital assistants can be tailored to individual use cases, dictating how the customer interacts with a brand. This gives rise to a whole host of questions about the design of voice-controlled devices. What accents should they have, what wake words should be used, and what kind of relationships are they intended to create?

Let’s just say I’m judging you

The design of a digital assistant is complicated by the inherent prejudices we hold about different voices. According to Jez Groom, Founder and Chief Choice Architect at Cowry Consulting, it only takes a fraction of a second for us to identify a person’s accent, and once we have done so the judgement begins…

“Studies have shown that we identify an accent in just 30 milliseconds,” he says, “and the accent someone talks in plays a crucial role in the way we judge this person. According to psychologists, the accent is much more important than the way a person looks.”

Groom states that accents provide valuable information about a speaker’s geographical, socio-economic, and ethnic background. Applied psychology and sociolinguistics suggest that we generally favour our own accent over other varieties of our native language and attribute more positive traits to it. Simply put: we instinctively prefer people who sound like us. This bias is borne out in our actions, meaning that the accents of those we do business with can impact our economic decisions.

“In a further study,” says Groom, “researchers asked the question ‘How do people act in economically relevant contexts when they are confronted with regional accents?’ Participants in an experiment conducted cognitive tests where they could choose to either cooperate or compete with a randomly matched male opponent identified only via his rendering of a standardised text in either a regional accent or a standard accent.”

“The psychologists found a strong connection between the linguistic performance and the cognitive rating of the opponent. When matched with an opponent who speaks the accent of the participant’s home region, individuals tend to cooperate significantly more often. By contrast, they are more likely to compete when matched with an accent speaker from outside their home region.”

We ain’t stupid

Add to this the fact that we perceive people with certain regional accents to be less intelligent, and there’s a clear need to choose the right accent to represent your brand. According to Groom, “it makes sense for voice platforms to select neutral voices and accents to avoid any potential negative stereotyping.”

While it is risky for brands to treat RP accents as the only ‘neutral’, intelligent-sounding option, very strong regional accents would evidently be off-putting to many. Given the evidence that we favour our own accents over others, there may also be a case for a brand to pick the accent most likely to be found amongst its target audience. As a consequence, voice branding decisions could feasibly be made on a regional basis. For example, self-checkout machines in localised areas around the UK could be given Brummie, Glaswegian or Geordie accents, and responses from automated helplines adjusted according to the caller’s location.

A matter of taste

This kind of personalisation is becoming an important feature of voice-based personal assistants, as the growing sophistication of AI and voice recognition technology sees them offer enhanced functionality and spread into more areas of our lives.

One company taking voice assistants beyond the home is Nuance Communications, whose technology is used by over 60 car brands and can currently be found in more than 200 million vehicles around the world. Nils Lenke, Director of Corporate Research at Nuance, sees personalisation in voice assistants as an asset to both the end user and the brand.

“An assistant can be more helpful the more it knows about the user,” he says. “For example, it can take your preferences into account when searching for a restaurant, or adapt the trip plan to your schedule. The assistant may also adapt to how you feel right now. With features like emotion detection, it can adapt its communication style to your mood, choose the best moment to talk to you, suggest some music that suits that mood, and so on.”

How does the technology achieve this? Artificially intelligent voice assistants can detect our emotions from our speech. They can also learn from experience – for example, identifying the brand of petrol station or the type of restaurant the user searches for – and remember these preferences for next time.

If multiple drivers use the same car, voice biometrics technology can also identify who is driving. As such, each driver receives his or her own personalised version of the voice assistant when entering the car. This ensures that users have effective, individualised interactions with the voice assistant and that the brand delivers a positive experience.
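
To make that concrete – purely as an illustrative sketch, not anything resembling Nuance’s own code – the short Python example below shows the general shape of such a system: a hypothetical identify_speaker() placeholder stands in for voice biometrics and selects a driver profile, and the profile simply tallies past choices (petrol brand, restaurant type) so the assistant can favour them next time.

# Illustrative sketch only: per-driver preference memory keyed by speaker identity.
# identify_speaker() is a hypothetical placeholder for a real voice-biometrics model.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class DriverProfile:
    name: str
    preferences: dict = field(default_factory=dict)  # e.g. {"petrol_brand": Counter()}

    def remember(self, category: str, choice: str) -> None:
        # Record a choice so it can bias future searches.
        self.preferences.setdefault(category, Counter())[choice] += 1

    def favourite(self, category: str):
        # Return the most frequent past choice in a category, if any.
        counts = self.preferences.get(category)
        return counts.most_common(1)[0][0] if counts else None

def identify_speaker(voice_sample: bytes, profiles: dict) -> DriverProfile:
    # Placeholder: a real system would match a voiceprint against enrolled drivers.
    return profiles["driver_a"]

profiles = {"driver_a": DriverProfile("Driver A")}
driver = identify_speaker(b"...", profiles)

driver.remember("petrol_brand", "BrandA")      # learned from past searches
driver.remember("restaurant_type", "Italian")
driver.remember("petrol_brand", "BrandA")

print(driver.favourite("petrol_brand"))        # BrandA
print(driver.favourite("restaurant_type"))     # Italian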

“Our Dragon Drive mobility assistant does just this, supporting automakers with capabilities that maximise the safety, productivity and enjoyment of customer journeys while elevating the identity of each unique brand,” Lenke notes.

Intelligent by design

With so many different aspects to consider, designing a voice assistant is no small undertaking. For Lenke, it is important to create the assistant’s persona at the very beginning of the process, as it impacts many other aspects of the system. Vital considerations include how to phrase the system’s prompts, which text-to-speech voice and earcons should be used, and how to combine all of this with the visual elements of the product – such as, in Nuance’s case, the inside of the car.

“It all has to do with your target audience, the brand image you want to create and the intended experience for your users,” Lenke says. “For example, should they think they are talking to a robot or a nearly human being?”

“An important aspect is anthropomorphism, an effect that always happens when voice and language are involved. This is because as humans we automatically project human-like qualities onto things that speak, which increases our trust, but also our expectations – and we tend to overestimate the intelligence of such a system. If you choose a human name for an assistant (for example, ‘Alexa’) you increase that effect, whereas if you choose a neutral name (Google Assistant or Dragon Drive) you can tone that down.”

Wake up and pay attention

Another consideration is the choice of wake words used to preface a command and signal to an assistant that we want it to do something. From a simple “Alexa” and a curt “OK, Google” to the smooth “Hey, Mercedes…”, wake words set the tone of the user-assistant relationship right from the start. It is crucial for brands to choose wake words that resonate with the user, that they feel comfortable saying, and which – as a consequence – probably form part of their normal daily vocabulary.
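
As a toy illustration of that contract – real assistants spot wake words acoustically and on-device, long before full speech recognition, so the Python sketch below only captures the basic idea – the wake phrase prefixes the command and is stripped off before the rest of the utterance is handled:

# Toy sketch: a wake phrase prefixes the command and is stripped off.
# Real systems use on-device acoustic keyword spotting, not string matching.
WAKE_PHRASES = ("alexa", "ok google", "hey mercedes")  # illustrative examples

def split_wake_word(utterance: str):
    # Return (wake_phrase, command) if the utterance starts with a wake phrase.
    text = utterance.strip()
    lowered = text.lower()
    for phrase in WAKE_PHRASES:
        if lowered.startswith(phrase):
            command = text[len(phrase):].lstrip(" ,.!?")
            return phrase, command
    return None, None  # no wake phrase heard: the assistant stays silent

print(split_wake_word("Hey Mercedes, find the nearest petrol station"))
# -> ('hey mercedes', 'find the nearest petrol station')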

The selection of wake words, like that of the assistant’s persona, will therefore typically be matched to a brand’s intended audience. As Lenke states, these features are an important aspect of marketing in the voice economy.

“Mercedes has stated that one goal of the new MBUX (Mercedes-Benz User Experience) system was to lower the average age of an A-class buyer, and you may see a relatively informal and cool choice like “Hey, Mercedes” in that context. That MBUX came out in the A-class first and not in the S-class is a statement in itself. And of course it carries Daimler’s car brand, which would not be the case if they had gone with an off-the-shelf assistant from one of the internet giants – those bring their own wake words and insert the internet giant’s brand between the OEM (original equipment manufacturer) and the driver.”

Where does all of this leave brands? After considering the accent, style, persona, intended audience, wake words, and human- or robot-like qualities of a voice assistant, they will hopefully have created something the user will warm to. After that, it’s fingers crossed the assistant will actually do what it is told…
