FBI: Scammers now using AI voices to impersonate government officials

The FBI has issued a fresh warning about an ongoing scam that uses convincing AI-generated voice messages and fake texts to rip people off. Since April 2025, scammers have been impersonating senior US officials to target people, including current and former government officials and their contacts.

These malicious actors hit you up with texts, known as smishing, or voice messages, called vishing. Their goal is to build up trust before trying to sneak into your personal accounts or sweet talk you out of money or information. The feds say one trick is sending a sketchy link that claims to move the chat to a different platform. If they get into your accounts, they can use contact info to target others you know, pulling off even more scams or digging for sensitive details.

As the FBI's advisory puts it: "Traditionally, malicious actors have leveraged smishing, vishing, and spear phishing to transition to a secondary messaging platform where the actor may present malware or introduce hyperlinks that direct intended targets to an actor-controlled site that steals log-in information, like user names and passwords."

As AI voice technology becomes increasingly sophisticated, these crooks are now making fake calls or voicemails that sound freakishly like the people they are pretending to be, which makes the lies much harder to spot. Generative tools are already being used to target public figures like politicians and celebrities, not just average folks. We have already seen platforms wrestling with this; OpenAI said last year that ChatGPT blocked over 250,000 requests attempting to generate fake images of presidential candidates.

More recently, Google integrated its SynthID tool, which watermarks AI-generated images, into the Magic Editor for photos. Meanwhile, platforms like YouTube are backing legislation such as the bipartisan NO FAKES Act of 2025 in an effort to rein in the growing chaos around synthetic media.

To avoid getting fooled, the FBI says you've got to verify who is contacting you. Do not just trust an out-of-the-blue message or call. Hang up, find the person's real number independently, and call them back. Look closely at email addresses, phone numbers, and URLs, and watch for spelling errors; scammers often make tiny changes you might miss. Pay close attention to voices, too. AI can replicate voices well now, and while there can still be odd glitches, those are getting tougher to catch.
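To get a feel for how small those URL changes can be, here is a minimal Python sketch. It is purely illustrative: the trusted domains, the example links, and the 80% similarity threshold are made-up values for this example, not anything the FBI prescribes. It compares a link's hostname against a short list of domains you already trust and flags near-matches, such as a single swapped character.

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Example-only list of domains the reader actually trusts (hypothetical values).
TRUSTED_DOMAINS = ["fbi.gov", "irs.gov", "mybank.com"]

def hostname_of(url: str) -> str:
    """Extract the bare hostname from a URL, ignoring scheme, path, and port."""
    host = urlparse(url if "//" in url else "//" + url).hostname or ""
    return host.lower().removeprefix("www.")

def check_link(url: str, threshold: float = 0.80) -> str:
    """Flag hostnames that are close to, but not exactly, a trusted domain."""
    host = hostname_of(url)
    for trusted in TRUSTED_DOMAINS:
        if host == trusted:
            return f"{host}: matches a trusted domain"
        similarity = SequenceMatcher(None, host, trusted).ratio()
        if similarity >= threshold:
            return f"{host}: suspiciously close to {trusted} ({similarity:.0%} similar)"
    return f"{host}: unknown domain - verify independently before clicking"

if __name__ == "__main__":
    for link in ["https://www.fbi.gov/contact",
                 "https://fbl.gov/verify",              # one character swapped
                 "http://mybank.com.example.net/login"]:  # trusted name buried in a fake domain
        print(check_link(link))
```

Real lookalike addresses can also rely on homoglyphs or extra subdomains, so a check like this only illustrates the idea; the FBI's advice to verify contacts through a known-good channel still stands.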

If someone you know contacts you from a new number or app, check with them through their old, trusted contact method before doing anything they ask. Never, and the FBI means never, send money, gift cards, crypto, or anything of value to someone you only know online or from a call, especially if it is a weird request. Do not click links or download files if you are not absolutely sure they are legitimate and from a verified source.

Also, set up two-factor authentication on all your accounts if you can, and do not ever give anyone the codes you receive. The FBI says if you believe you have been targeted by this campaign, contact your relevant security officials or report it to your local FBI Field Office or the Internet Crime Complaint Center at ic3.gov.
