Authentication And Fraud Prevention With Biometrics

DISRUPTIONHUB interviews Brett Beranek of Nuance 

When you think of biometrics, what comes to mind? Thanks to the movies, the chances are fingerprint and iris scanners feature high on the list. But along with fictional thieves using synthetic thumbs or contact lenses to break into bank vaults on the big screen, there remains a very real – and important – place for this technology in our everyday lives.

Biometric technology can play a starring role in identity authentication processes and fraud prevention, by ensuring that account access is only granted to its owner. With a range of biometric modalities in existence, relying on physical or behavioural attributes, there are various different processes that organisations can employ to protect their users. DISRUPTIONHUB spoke to Brett Beranek, Vice President of Security and Biometrics at Nuance Communications, to find out how they are being put into practice.

A brief history of voice tech

Nuance’s journey into biometrics began in the 1990s, when the company’s researchers were developing algorithms for speech recognition. At a certain point, focus diverged to also identify the speaker from the characteristics of their voice, through a voice biometric algorithm. The first deployment of the technology was at the contact centre of a financial institution, where criminals were repeatedly calling in and committing fraud. The system – although rudimentary by today’s standards – was able to detect that the same voices were calling back and alert agents that a fraud was underway.

Since then, as Beranek explains, Nuance has expanded beyond fraud prevention into the field of authentication. This is where organisations can verify a person’s identity from the characteristics of their voice.

My voice is my password

Unlike fraud prevention capabilities, which happen in the background and pass unnoticed by a caller, most consumers are now aware of voice authentication. In the UK, large banks, telecoms companies and government institutions such as HMRC have instituted these mechanisms, so the phrase ‘my voice is my password’ is familiar to many. But how does it work?

“When you call in to your bank and you have a voiceprint on file, the system takes a short audio sample of your voice and compares it to what it has stored,” says Beranek. “So you might say ‘my voice is my password’, or if you’re speaking to a live agent the algorithm would just automatically analyse your voice. If everything matches then the system authenticates your identity, or gives a green light to the agent.”
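The comparison Beranek describes – scoring a fresh audio sample against a stored voiceprint – can be sketched in miniature. The sketch below is illustrative only: it assumes each voice has already been reduced to a fixed-length embedding vector (the details of Nuance’s actual models and thresholds are not public), and simply checks a cosine-similarity score against a tunable threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two fixed-length voice embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify_caller(sample, enrolled, threshold=0.8):
    """Accept the caller only if the fresh sample scores at or above
    the threshold against the enrolled voiceprint. The threshold is
    a tunable trade-off between false accepts and false rejects."""
    return cosine_similarity(sample, enrolled) >= threshold
```

In a real deployment the threshold would be calibrated against measured impostor and genuine-speaker score distributions, rather than fixed at an arbitrary value.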

This technology is understandably game-changing in authentication processes that have relied on usernames and passwords for many years.

“Why this has been so transformational to a number of contact centres,” says Beranek, “is that they don’t need to ask any security questions any more so it reduces the length of the call. But it also delivers a much better customer experience. If the agent can handle a customer’s request very quickly, then the level of satisfaction goes up.”

Biometrics for fraud prevention

As for fraudsters attempting to get around these systems, a couple of obvious approaches will be detected by the technology.

“Synthetic voices are a way that a fraudster could try to compromise a voice biometric system,” Beranek notes. “The other way would be to record your voice when you’re saying ‘my voice is my password’ and play it back. In both cases we have algorithms to detect that that is taking place.”

If a mismatch does occur between the voiceprint on file and the voice of the person on the call, not only will the agent be notified of that fact, but the technology is also able to match the speaker’s voice to a list of known fraudsters. As Beranek notes, once a criminal successfully breaches a system they will keep going until something stops them. Identifying individual fraudsters is therefore an important step in the crackdown on crime.
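Matching an unknown caller against a list of known fraudsters is essentially the same scoring problem run against a watchlist instead of a single enrolled voiceprint. A minimal sketch, again assuming voices are represented as fixed-length embedding vectors (a simplification of any production system):

```python
import math

def similarity(a, b):
    """Cosine similarity between two fixed-length voiceprint vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def screen_watchlist(sample, watchlist, threshold=0.85):
    """Return the id of the best-matching known fraudster, or None
    if nobody on the watchlist scores at or above the threshold.
    `watchlist` maps fraudster ids to stored voiceprints."""
    best_id, best_score = None, threshold
    for fraudster_id, voiceprint in watchlist.items():
        score = similarity(sample, voiceprint)
        if score >= best_score:
            best_id, best_score = fraudster_id, score
    return best_id
```

A hit on the watchlist would typically raise an alert for the agent rather than block the call outright, since the match is probabilistic.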

“There’s a very large volume of fraud attempts that are actually perpetrated by a small number of people,” says Beranek. “We had one of our customers in the UK share with us that their most prolific fraudster was responsible for 18 per cent of all fraud losses. You think that there’s an infinite amount of fraudsters out there but it’s actually not the case.”

Conversation print

The analysis of the sound of a person’s voice through voice biometrics can be augmented by another kind of biometric modality, which assesses how we use language. A conversation print is a profile of the vocabulary, grammar, and sentence structure we use, which can also be employed for identification purposes.

In the case of an individual fraudulently attempting to access an account, conversation print provides another layer of protection.

“When a system identifies that it is not the right person speaking,” says Beranek, “it will start transcribing what it hears into text, applying the conversation print algorithms on what had been said. That will enable it to determine that this is the way a previous fraudster has spoken – very similar vocabulary, very similar sentence structure, very similar grammar – and highlight that this is a known fraudster.”
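The idea of a conversation print – profiling how someone speaks rather than how they sound – can be illustrated with a deliberately toy model. The sketch below reduces a transcript to a word-frequency profile and compares profiles by cosine similarity; a real system would also model grammar and sentence structure, as Beranek describes, so this is a simplification for intuition only.

```python
import math
from collections import Counter

def conversation_print(transcript):
    """Toy 'conversation print': a word-frequency profile of a
    transcript. Production systems model far richer features."""
    return Counter(transcript.lower().split())

def print_similarity(a, b):
    """Cosine similarity between two word-frequency profiles."""
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)
```

Even this crude profile captures why a scripted mule gives the game away: two calls read from the same script score far closer to each other than to an unrelated conversation.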

“The reason why this has been quite transformational is not only are we able to identify the fraudster themselves but we can prevent the fraudster from hiring mules, and basically providing a script to somebody else to call on their behalf,” Beranek adds. “They’ve realised that they won’t be able to access accounts with their voice any more, so they hire people to do it for them. But they need to instruct those people how to conduct the call – that’s why they give them a script – and that’s how we catch them.”

Big on biometrics

Voice biometrics, then, and the security practices it is linked to, offer an effective way of identifying individuals and protecting their accounts. However, verifying a voice isn’t always the most practical way of granting access. Branching out from the contact centre, companies such as Nuance now have business customers who want to authenticate users and detect fraud in mobile apps and websites. Here, other biometric modalities can also be leveraged, the most impactful of which is behavioural biometrics.

“Voice biometrics – out of the biometric modalities that are available – is one of the most robust, and most secure, but for certain applications it may not be the appropriate interface with the customer,” says Beranek. “We do have customers that have voice biometrics on the web and in mobile apps but it’s usually to secure high risk transactions – transactions over a certain amount. But if you’re just logging in to your banking portal, you’re just going to check your account balance, then it may not be the most convenient thing to pick up your phone and say ‘my voice is my password’.”

“Here we can use behavioural biometrics, where we analyse how you’re typing on the keyboard, how you’re holding the phone, how you’re tapping and swiping – and determine if it actually is you based on those characteristics.”
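One of the signals Beranek mentions – typing rhythm – can be sketched very simply. The example below is a hypothetical keystroke-dynamics check, not Nuance’s method: it summarises a session’s inter-key timing and flags a login whose rhythm deviates too far from the user’s enrolled profile.

```python
import math

def keystroke_features(intervals):
    """Mean and standard deviation of inter-key intervals (seconds)."""
    mean = sum(intervals) / len(intervals)
    var = sum((t - mean) ** 2 for t in intervals) / len(intervals)
    return mean, math.sqrt(var)

def matches_profile(session_intervals, enrolled_mean, enrolled_std,
                    tolerance=2.0):
    """Accept the session only if its average typing rhythm lies
    within `tolerance` standard deviations of the enrolled profile."""
    session_mean, _ = keystroke_features(session_intervals)
    return abs(session_mean - enrolled_mean) <= tolerance * enrolled_std
```

Production behavioural biometrics combine many such signals – swipe curvature, device tilt, tap pressure – so no single feature needs to be decisive on its own.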

Fighting crime – and password sharing…

It’s clear that the effectiveness of biometrics in authenticating identity is not in doubt. What remains, however, is a procedural question: what should happen when a high risk login is identified? To put this into context, company systems must not only combat fraudsters attempting to hack into accounts, but also deal with widespread password sharing – people legitimately trying to log into accounts on behalf of their friends or family.

“When a high risk login occurs, it’s up to our customers what they do but we recommend that they lock the account,” says Beranek. “The issue is that there’s a transition that needs to take place. When we deploy these systems a lot of the alerts are for non-fraudulent individuals. There’s a lot of password sharing. So in a contact centre, we see that about five per cent of all calls are from non-fraudulent, non-account holders. And on the web it’s even higher than that – it’s closer to 10 per cent.”

“What this highlights is that organisations have completely lost control over who is accessing their services. They have no level of certainty when a transaction takes place if it is actually the account holder, or somebody else, or if it is a fraudster. With just usernames and passwords and security questions they have no idea, no way of knowing. With biometrics we are giving them a level of certainty that yes it is the correct individual, or no it’s not.”

Staying one step ahead

Interestingly, unlike many areas of fraud where criminals have the upper hand, biometrics technology provided by companies such as Nuance currently gives organisations an advantage.

“Right now we are light years ahead of the fraudsters,” says Beranek. “If you think about their operations, they have become experts at data mining – so going on the dark web, purchasing data on individuals… In the United States, with the Equifax breach you can basically have all the information you could ever desire on virtually all citizens. So criminals create synthetic identities or they take over identities and then they steal funds or perpetrate fraudulent transactions.”

“That modus operandi has been fine-tuned over many years. Therefore, organisations that deploy voice biometrics, or behavioural biometrics, or that deploy conversation print, they’re throwing a wrench into these activities. All of a sudden, the criminals don’t know how to deal with this. So often what we’ve seen is that the fraudsters go to another organisation.”

“But eventually, of course, all the organisations will be protected in this way so the fraudsters will adapt. That’s why we’ve been thinking not about how fraudsters are attacking systems today, but how they will attack them in the future.”

The resistance to change

The effectiveness of biometrics notwithstanding, issues with this technology do remain. As with all areas of business, a resistance to change characterises the authentication space, with some organisations sticking to the status quo of usernames and passwords in spite of their flaws. For Beranek, this goes against the proven advantages that biometrics offers.

“All the data is extremely compelling,” he says. “Forrester did an assessment of one of our customers and looked at the business benefits of transitioning to biometrics. There’s better customer experience, better customer loyalty, reduced operating costs, significant fraud loss savings…”

“As an executive, if you look at the data, it’s very clear that this is something that should take place. And it’s a mature technology, this is something that’s been around since the 1990s. Really, the only barrier to adoption is a resistance to change.”

Fake news, authentic concerns

Today, the creation and circulation of deepfakes is on the rise, illuminating clear applications for biometric technologies outside of the customer call centre. With the ability to make any public figure say or do anything, and thereby manipulate everything from election results to stock market investments, comes a need to verify the accuracy of information.

Does Beranek see a future for this technology in the fight against fake news? Absolutely – but there’s no time like the present…

“I think it’s required now,” he says. “I’m a firm believer that every news agency should have technology such as ours to validate whether what they’re viewing, be it audio or video, is real.”

“The amount of fake news we have in our society is just going to increase over time. Individuals of all kinds will use these technologies. What if your neighbour is frustrated with you and they want to retaliate, so they create some fake news about you? That can be extremely damaging to a person’s reputation.”

“Democratising these technologies and using them to validate whether something you’re listening to or watching is real or not is incredibly important. I think it will become commonplace but it’s not an instinct that we currently have.”

As the need to protect our digital identities grows, it’s likely that we will become ever more familiar with biometric technologies. Our bank balances, and the stability of the world’s fragile social and political systems, will thank us for it. Long live biometrics.
