AI Scams Are on the Rise

September 21, 2023
by Team SESLOC

Artificial intelligence (AI) tools for creating images, videos, recordings and text are easily available online and growing in popularity — and scammers are taking full advantage. Here are a few ways scammers are utilizing AI:

Voice Cloning

A Family Emergency Scam is a classic scheme that relies on urgency and emotional appeal to separate you from your money: the scammer calls and pretends to be a family member in crisis. In the most common version, they pose as a grandchild calling from jail who needs immediate financial assistance. Phone spoofing has been around for a while and may allow them to display the number of a real jail on your caller ID. Now, with the help of AI, scammers have successfully cloned family members' voices to add an extra layer of authenticity.

The Federal Trade Commission (FTC) recommends a few tips to protect yourself:

  • Don’t trust the voice — call the person to confirm the story. If you can’t get ahold of them, call others who may have been in recent contact or can verify the story.
  • Establish a family password that you can ask for to verify identity over the phone.
  • If the caller claims to be from a business, financial institution, or government agency, hang up and call the organization directly using a phone number you know is legitimate.
  • Be extra suspicious if the caller asks you to send money via wire transfer, cryptocurrency, or gift card — these methods make it extremely difficult to get your money back.

ChatGPT

It’s also been reported that scammers are using text-based AI tools like ChatGPT to make phishing emails more believable. These tools can remove classic red flags like poor grammar and spelling and more closely mimic natural, native-sounding language. However, you can still be on the lookout for these other red flags:

  • Request for urgent action or threats.
  • Request for payments, in particular via wire transfer, cryptocurrency, or gift card.
  • Request for sensitive personal information, like bank login details, credit card information, or Social Security Number.
  • An offer that seems too good to be true.
  • When in doubt, contact the sender directly through another means of communication to confirm the request.

Additionally, scammers have developed ChatGPT clones hoping to take advantage of people interested in trying out AI tools. These clones may charge money to use, or may be infected with malware, malicious software designed to secretly mine your data. If you plan to use an AI tool, be sure you are accessing a legitimate one.
