What are voice deepfakes, the new banking scam?

This spring, Florida investor Clive Kabatznik called his local Bank of America representative to discuss a large cash transfer he was planning to make. Then he called again.

Except the second call didn't come from Kabatznik. Instead, a computer program had artificially generated his voice and tried to trick the bank employee into transferring the money.

Kabatznik and his banker were the targets of a sophisticated fraud attempt that has caught the attention of cybersecurity experts: the use of artificial intelligence to generate voice deepfakes, audio imitations of real people's voices.

The problem is still new enough that there is no comprehensive count of how often it occurs. But Pindrop, a company that monitors audio traffic for many of the largest US banks, said banks have seen a rise this year in the prevalence and sophistication of scammers' voice-fraud attempts. Late last year, Nuance, another major voice technology provider, saw its first successful deepfake attack against a financial services customer.

In Kabatznik's case, the fraud was detected. But the speed of technological development, the falling cost of building artificial intelligence programs and the wide availability of recordings of people's voices on the internet have created perfect conditions for voice-based AI scams.

Customer data such as bank account details, stolen by hackers and available on underground markets, help fraudsters carry out these attacks. The attacks are even easier against wealthy clients, whose public appearances, including speeches, are often widely available online. Finding audio samples of everyday customers can be as easy as running an online search, for example on social media apps like TikTok and Instagram, for the name of someone whose bank details the fraudsters already have.

"There's a lot of audio content out there," said Vijay Balasubramanian, CEO and founder of Pindrop, which reviews the automated voice-verification systems of eight of the 10 largest lenders in the United States.

Over the past decade, Pindrop has reviewed recordings of more than 5 billion calls received by the call centers of the financial institutions it serves. The centers handle products such as bank accounts, credit cards and other services offered by large retail banks. Each call center fields calls from fraudsters, typically between 1,000 and 10,000 a year; Balasubramanian said it is common for 20 such calls to come in each week.


So far, fake voices created by computer programs account for only a handful of those calls, according to Balasubramanian, and they began appearing only in the past year.

Most of the fake voice attacks Pindrop has seen have occurred at credit card call centers, where human representatives serve customers who need help with their cards.

Balasubramanian played a journalist an anonymized recording of one of these calls in March. Although it is a very basic example (in this case the voice sounds robotic, more like an e-book reader than a person), the call illustrates how such scams can unfold as AI makes it easier to imitate human voices.

A bank employee is heard greeting the customer. Then a robotic-sounding automated voice says: "My card has been declined."

"May I ask who I'm speaking with?" the bank employee replies.

"My card has been declined," the voice repeats.

The bank employee again asks for the customer's name. There is a silence in which the faint clicking of keys can be heard. According to Balasubramanian, the number of keystrokes matched the number of letters in the customer's name: the fraudster was typing words into a program that then read them aloud.

Balasubramanian explained that in this case, the caller's artificial speech prompted the employee to transfer the call to another department, where it was flagged as fraud.

Calls like these, which rely on text-to-speech technology, are among the easiest attacks to combat: call centers can use screening software to detect the technical clues that a speech engine leaves behind.

"Synthetic speech leaves traces, and many anti-spoofing algorithms can detect them," explained Peter Soufleris, CEO of IngenID, a provider of voice biometrics technology.
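As a toy illustration of the kind of artifact detection Soufleris describes, the sketch below flags audio whose frame-to-frame loudness is unnaturally steady, one simple trace that crude text-to-speech output can leave. Everything here (the threshold, the frame length, the synthetic test signals) is invented for illustration; real anti-spoofing systems rely on far richer spectral and prosodic features.

```python
import numpy as np

def frame_rms(signal, frame_len=400):
    # Split the signal into fixed-length frames and compute per-frame RMS energy.
    n = len(signal) // frame_len
    frames = signal[: n * frame_len].reshape(n, frame_len)
    return np.sqrt((frames ** 2).mean(axis=1))

def energy_variation(signal, frame_len=400):
    # Coefficient of variation of frame energy: natural speech fluctuates in
    # loudness, while crude synthetic audio is often unnaturally steady.
    rms = frame_rms(signal, frame_len)
    return rms.std() / rms.mean()

def looks_synthetic(signal, threshold=0.05):
    # Hypothetical decision rule: very low energy variation -> likely machine-made.
    return bool(energy_variation(signal) < threshold)

# Two stand-in signals at a 16 kHz sample rate (not real speech):
t = np.linspace(0, 1, 16000, endpoint=False)
robotic = np.sin(2 * np.pi * 220 * t)             # perfectly steady tone
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)  # loudness wobbling at 3 Hz
natural = envelope * np.sin(2 * np.pi * 220 * t)

print(looks_synthetic(robotic))  # True
print(looks_synthetic(natural))  # False
```

The point is not the specific statistic but the pattern: a detector measures properties that human speech reliably has and machine speech often lacks, which is also why this is an arms race, since better generators erase each trace that defenders learn to measure.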

However, like many security measures, this is an arms race between attackers and defenders, and one that has recently shifted. Now a scammer can simply speak into a microphone, or type a message, and have it quickly converted into the target's voice.


Balasubramanian noted that Microsoft's VALL-E, an artificial intelligence system, can recreate a person's voice from a three-second audio sample and make it say whatever the user wants.

In May, in a 60 Minutes episode, security consultant Rachel Tobac used software to clone the voice of Sharyn Alfonsi, one of the program's correspondents, so convincingly that she tricked a 60 Minutes employee into handing over Alfonsi's passport number.

Tobac, CEO of Social Proof Security, said the attack took only five minutes to put together. The tool she used became available for purchase in January.

Brett Beranek, general manager of security and biometrics at Nuance, a voice technology provider that Microsoft acquired in 2021, said that while frightening deepfake demonstrations are common at security conferences, actual attacks remain extremely rare. The only successful attack on a Nuance customer came in October, and it took the attacker more than a dozen attempts to pull off.

Beranek's biggest concern is not attacks on call centers or automated systems, such as the voice-biometric systems many banks have deployed. He worries about scams in which a caller reaches an individual directly.

"I had a conversation with one of our customers earlier this week," he said. "They said to me, hey, Brett, it's great that we keep our contact center secure, but what if someone calls our CEO directly on his cell phone and pretends to be someone else?"

That is what happened in Kabatznik's case. According to the banker's description, the fraudster seemed to be trying to get her to transfer the money to a new destination, but the voice was repetitive, talked over her and used garbled phrases. The banker hung up.

"It was like I was talking to him, but it made no sense," Kabatznik said the banker told him. (A bank spokeswoman declined to make the employee available for an interview.)


Kabatznik said that after she received two more similar calls in quick succession, the banker notified Bank of America's security team. Concerned about the safety of Kabatznik's account, she stopped responding to calls and emails, even those from the real Kabatznik. It took about 10 days for the two to reconnect, after Kabatznik arranged a visit to her office.

"We continually train our team to recognize fraud and to help our customers avoid it," said Bank of America spokesman William Haldin, who declined to comment on specific customers or their experiences.

Although the attacks are growing more sophisticated, they stem from a basic cybersecurity threat that has existed for decades: data breaches that expose the personal information of bank customers. Between 2020 and 2022, the personal data of more than 300 million people fell into the hands of hackers, resulting in $8.8 billion in losses, according to the Federal Trade Commission.

Once a batch of stolen data is collected, hackers comb through the information and match it to real people. Those who steal the data are almost never the ones who use it; instead, the thieves put it up for sale. Specialists can then use any of a number of easily available programs to spoof the phone numbers of the customers they want to scam, which is what happened in Kabatznik's case.

Finding recordings of his voice is easy: there are videos on the internet of him speaking at a conference and participating in a fundraiser.

"I think it's very scary," Kabatznik concluded. "The problem is, I don't know what to do about it. Do you go underground and hide?"

Audio produced by Tally Abecassis.

Emily Flitter covers finance. She is the author of "The White Wall: How Big Finance Bankrupts Black America."

Stacey Cowley is a finance reporter with a focus on consumer issues and data security. She previously reported on a variety of business topics, including technology and the economy, for CNN Money, Fortune Small Business and other magazines and websites.
