
Deepfake CEO Scam: Voice Cloning Is the New BEC

  • Writer: DH Solutions
  • Mar 2
  • 5 min read

For years, the biggest email scam targeting businesses was Business Email Compromise (BEC): a fraudster impersonates the CEO via email and tricks someone in accounting into wiring money. It was crude but effective, costing businesses billions annually.


Now, BEC has a terrifying upgrade. Using AI voice cloning, attackers can replicate your CEO's voice from as little as three seconds of audio and place a phone call that sounds exactly like them. According to Keepnet Labs, deepfake-related fraud losses in the United States reached $1.1 billion in 2025, triple the $360 million recorded the year before. Deepfake fraud in North America surged 1,740% between 2022 and 2023.


This is not science fiction. These scams are already hitting businesses in Metro Detroit and across the country. Here is what you need to know to protect your team.



Key Takeaway

If your verification process for wire transfers is "My boss called and told me to do it," your process is broken.



The Anatomy of a Deepfake CEO Scam

Here is how these attacks typically unfold:

 

Step 1: Harvest the Voice

The attacker finds a video of your CEO on LinkedIn, YouTube, a podcast interview, or even a company webinar. AI voice cloning tools can create a convincing replica from just 3 to 5 seconds of sample audio.


Step 2: Build the Pretext

The attacker researches your company. They learn who handles wire transfers, who the CEO typically contacts, and what a plausible request would look like ("We are acquiring a vendor," "I need an emergency payment processed").


Step 3: Make the Call

Using the cloned voice, the attacker calls the target employee directly. The voice sounds exactly like the CEO, and the caller ID may even be spoofed to show the CEO's number. The request is always urgent, always confidential, and always financial.


Step 4: Cash Out

The money is wired to a fraudulent account and moved offshore within hours. Recovery is nearly impossible.



The $25 Million Warning

In February 2024, a finance employee at UK engineering giant Arup was tricked into wiring $25 million to fraudsters after participating in a video conference where every other participant, including the company's CFO, was an AI-generated deepfake. As CNN reported, the employee initially suspected phishing, but the live video call with convincing AI clones of his colleagues overcame his doubt.


This case shattered the assumption that video calls are inherently trustworthy. If you are using a Zoom call as your "verification step," attackers have already accounted for that.



Why This Threat Is Growing


  • CEO fraud now targets at least 400 companies per day using deepfakes.

  • 77% of people targeted by an AI voice cloning scam reported losing money as a result.

  • 80% of companies have no established protocols or response plans for handling a deepfake-based attack.

  • Human detection rates for high-quality video deepfakes are just 24.5%.


The tools to create deepfakes are cheap, widely available, and improving rapidly. Searches for "free voice cloning software" rose 120% between 2023 and 2024. This is a democratized threat.



How to Protect Your Business: The "Two-Channel" Rule

The single most effective defense is simple: Never authorize a financial transaction based on a single communication channel.

 

The Rule: If you receive a phone call requesting a wire transfer, you must verify via a completely separate channel before acting. Call the person back at their known number (not the one that just called you). Send a Teams message. Walk down the hall.



Build a Verification Protocol:

  1. Establish a code word. Have a pre-agreed verbal passphrase that only the real CEO and the finance team know. Rotate it monthly.

  2. Require dual authorization. No single employee should be able to initiate a wire transfer over a certain threshold (e.g., $5,000) without a second sign-off from a different person.

  3. Ban "urgency" as a justification. Train your team that any request framed as "Do this immediately and don't tell anyone" is a red flag, not a command to obey.

  4. Document the chain. Every wire transfer request must have a paper trail: email confirmation, ticket number, or approval form. "My boss called me" is not documentation.
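To make the protocol concrete, here is a minimal sketch of those rules expressed as a pre-transfer gate. All names, field types, and the $5,000 threshold are illustrative assumptions drawn from the example above, not a real payment system's API:

```python
from dataclasses import dataclass, field

# Illustrative threshold, matching the $5,000 example above (an assumption).
DUAL_AUTH_THRESHOLD = 5_000

@dataclass
class WireRequest:
    """Hypothetical record of a wire transfer request and its verification state."""
    amount: float
    requested_via: str                                  # channel the request arrived on, e.g. "phone"
    verified_via: set = field(default_factory=set)      # channels used to confirm the request
    approvers: set = field(default_factory=set)         # employees who signed off
    documentation: list = field(default_factory=list)   # ticket numbers, emails, approval forms

def may_execute(req: WireRequest) -> tuple[bool, str]:
    """Apply the verification protocol as a simple yes/no gate."""
    # Two-Channel Rule: at least one confirmation on a channel other than
    # the one the request came in on (call back, Teams, walk down the hall).
    if not (req.verified_via - {req.requested_via}):
        return False, "Two-Channel Rule: verify via a separate channel first"
    # Dual authorization above the threshold: two different people must sign off.
    if req.amount > DUAL_AUTH_THRESHOLD and len(req.approvers) < 2:
        return False, "Dual authorization required above threshold"
    # Document the chain: "my boss called me" is not documentation.
    if not req.documentation:
        return False, "Missing paper trail (ticket, email, or approval form)"
    return True, "OK"

# A phone-only request with one approver and no paper trail is rejected.
suspect = WireRequest(amount=12_000, requested_via="phone",
                      verified_via={"phone"}, approvers={"alice"})
print(may_execute(suspect))
```

Note that the gate checks process, not voice: nothing in it tries to detect whether the caller "sounds real," because human detection of high-quality deepfakes is unreliable. The controls work even when the clone is perfect.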


What’s at Risk in Southeast Michigan?

Small Law Firms (Southfield/Troy): Attorneys regularly handle client escrow funds. A deepfake call impersonating a senior partner could trick a paralegal into wiring trust funds to a fraudulent account, creating both financial and ethical liability.

 

Auto Suppliers (Wayne County): Manufacturing companies process large vendor payments routinely. A fake call from the "CFO" requesting a rush payment to a "new supplier" is highly plausible in this industry.

 

Dental/Medical Practices (Livonia/Farmington Hills): Smaller practices with lean office staff are especially vulnerable because one person often handles finances, scheduling, and administration. There is no second set of eyes on a suspicious request.

 

Heads up: This week (March 1-7) is National Consumer Protection Week. It is a great time to remind your team and clients about emerging scams like deepfake voice cloning.



Deepfake Defense Checklist

  1. Implement the Two-Channel Rule for all financial transactions.

  2. Establish a rotating code word for verbal authorization.

  3. Require dual authorization on wire transfers above your threshold.

  4. Train your team with real examples of voice cloning scams.

  5. Audit your public media. Reduce the amount of CEO audio/video publicly available (podcasts, webinars, social media).

  6. Contact DH Solutions for help implementing verification protocols and training your team on deepfake awareness.

 

Is your finance team prepared for the next generation of AI fraud?

Schedule a quick Cyber Awareness Chat to discuss how to protect your business from deepfake CEO scams.






Frequently Asked Questions (FAQs)


Can deepfakes really be created from just a few seconds of audio?

Yes. According to research cited by Keepnet Labs, modern AI voice cloning tools can create a convincing voice replica from as little as three to five seconds of sample audio, and the technology is improving rapidly. Any public video, podcast, or voicemail recording can be used as source material. 

Are deepfake scams only targeting large corporations?

No. While headline cases like the $25 million Arup scam involve large enterprises, attackers increasingly view smaller businesses as easier targets with weaker controls. A single successful deepfake call can trigger a large payment at a small firm with no dual-authorization policy.

What should I do if I suspect a deepfake call?

Hang up without acting on the request. Immediately call the person back using a known, verified phone number, not the number that appeared on your caller ID. Report the incident to your IT provider and, if money was transferred, contact your bank and law enforcement immediately.



Republished with Permission from The Technology Press

Contact Us Today


Office: 734-743-2720

Westland: PO Box 851135, Westland, MI 48185

Livonia: 13321 Stark Road, Suite #2, Livonia, MI 48150


Copyright DH Solutions LLC, 2023  |  Privacy Policy  |  Terms of Use
