How Fraudsters Could Hack Audio Recordings to Conduct Social Engineering Attacks

Back in the spring of 2015, I wrote about five types of social engineering attacks against which users should protect themselves. One of the techniques I discussed is called pretexting. It’s when an attacker creates a pretext, or an invented scenario, to trick a target.

Attackers will assume any pretense to reach their nefarious ends. Some of their schemes can be quite elaborate. Take business email compromise (BEC) scams, otherwise known as CEO fraud. For an attacker to set up this pretext, they must hack a business executive’s email account. Attackers will usually pull off this prior step by subjecting a target to a whaling attack.

If the attack proves successful, they can leverage their target’s email to impersonate the executive, contact the HR department, and request employees’ W-2 forms and other personally identifiable information. Alternatively, they can contact someone in finance and request that they carry out a fraudulent wire transfer to an account under their control. This latter form of fraud has cost individual businesses tens of millions of dollars. According to the FBI, BEC scams have victimized a total of 22,000 companies and caused more than $3 billion in losses since 2013.
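
For a sense of scale, here is a quick back-of-the-envelope calculation based on those FBI figures. It is only a sketch: it assumes losses were spread evenly across victims, which real incidents certainly are not.

```python
# Rough arithmetic on the FBI figures cited above. The even split across
# victims is an illustrative assumption; actual losses vary widely.
total_losses = 3_000_000_000   # more than $3 billion since 2013
victim_companies = 22_000      # companies victimized
print(f"Average loss per victimized company: ${total_losses / victim_companies:,.0f}")
# Prints: Average loss per victimized company: $136,364
```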

CEO fraud assumes an attacker has compromised a target’s email account. Were a computer criminal to pose as the executive outside of a text-based medium, the jig would be up. That’s still the case as of January 2017. But a new technology could change all of that.

Project VoCo: The Excitement and Concern

Meet Project VoCo. It’s short for “Photoshopping Voiceovers,” one of 11 experimental technologies demoed at Adobe MAX 2016. VoCo is a sound engineer’s dream in that it allows an editor to edit or insert words into an audio recording without having to bring the voiceover artist back into the studio. All the software needs is about 20 minutes of a person’s speech to make the process work.
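
That 20-minute figure is a surprisingly low bar. As a rough illustration of what it takes to check whether a set of recordings reaches that threshold, here is a minimal Python sketch using only the standard library. The folder name and the WAV-only assumption are illustrative, and nothing here touches any real VoCo API.

```python
# Minimal sketch: total up the duration of WAV recordings in a folder to see
# whether they reach the roughly 20 minutes of speech VoCo reportedly needs.
import wave
from pathlib import Path

REQUIRED_MINUTES = 20  # figure cited for Project VoCo

def total_minutes(folder: str) -> float:
    """Sum the playing time of every .wav file in the folder, in minutes."""
    seconds = 0.0
    for path in Path(folder).glob("*.wav"):
        with wave.open(str(path), "rb") as clip:
            seconds += clip.getnframes() / clip.getframerate()
    return seconds / 60

collected = total_minutes("voice_samples")  # hypothetical folder of recordings
status = "enough" if collected >= REQUIRED_MINUTES else "not enough"
print(f"{collected:.1f} minutes of speech collected: {status} for the 20-minute threshold.")
```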

Project VoCo lives up to that expectation in the demo video embedded below.

Clearly, lots of people are excited about the prospect of being able to alter audio recordings. But not everyone is jumping on the bandwagon. Dr. Eddy Borges Rey, a lecturer in media and technology at the University of Stirling, is concerned by the development. He revealed as much to BBC News:

“It seems that Adobe’s programmers were swept along with the excitement of creating something as innovative as a voice manipulator, and ignored the ethical dilemmas brought up by its potential misuse. Inadvertently, in its quest to create software to manipulate digital media, Adobe has [already] drastically changed the way we engage with evidential material such as photographs. This makes it hard for lawyers, journalists, and other specialists who use digital media as evidence. In the same way that Adobe’s Photoshop has faced legal backlash after the continued misuse of the application by advertisers, Voco, if released commercially, will follow its predecessor with similar consequences.”

That’s a fair point. If proper safeguards aren’t implemented, Project VoCo could erode the authenticity of audio recordings. Attackers could then exploit the technology to fool others into thinking someone said something they did not, all in pursuit of a nefarious end like CEO fraud. All they would need to do is conduct a bit of research beforehand.

Laura V. illustrates in the Social-Engineer Newsletter how one such attack might proceed:

  1. An attacker conducts OSINT and discovers an organization’s CEO will be away on business for a few days or a week.
  2. The bad actor uses VoCo to record a fake message from the CEO that asks the head of finance to call them back for instructions regarding an upcoming payment. They leave that message as a voicemail for the head of finance.
  3. The head of finance listens to the message, thereby establishing the attacker’s pretext.
  4. The attacker receives a call from the head of finance. Using VoCo, the former instructs the latter to transfer funds to an account under their control.

The scenario above doesn’t account for all the attacks that Project VoCo might facilitate. Bad actors could use the technology to target voice-activated assistants like Amazon Echo and Google Home so that they can break into a person’s home. They could also create damaging recordings that undermine the public standing of executives and politicians.

Conclusion

It’s unclear when Photoshopping Voiceovers will become publicly available. When it does, it’ll take even more time to determine how easy it is for people to identify an audio recording that someone has modified using the technology. With that in mind, organizations’ best hope of preventing attacks such as those described above is to train their employees to be on the lookout for vishing and spear-phishing attacks. If an attacker can’t build a pretext, they won’t be able to leverage VoCo to make fraudulent wire transfers or steal sensitive information.
