AI voice phone scams are on the rise. Here’s how to avoid them – USA TODAY

Posted: May 18, 2023 at 2:01 am

Jennifer Jolly | Special to USA TODAY

The most powerful people on the planet don't quite know what to make of AI as it quickly becomes one of the most significant new technologies in history.

But criminals sure do.

In the six months since OpenAI first unleashed ChatGPT on the masses and ignited an artificial intelligence arms race with the potential to reshape history, a new strain of cybercriminals has been among the first to cash in.

These next-gen bandits come armed with sophisticated new tools and techniques to steal hundreds of thousands of dollars from people like you and me.

"I am seeing a highly concerning rise in criminals using advanced technology, AI-generated deepfakes and cloned voices, to perpetrate very devious schemes that are almost impossible to detect," Haywood Talcove, CEO of LexisNexis Risk Solutions' Government Group, a multinational information and analytics company based in Atlanta, told me over Zoom.


"If you get a call in the middle of the night and it sounds exactly like your panicked child or grandchild saying, 'Help, I was in a car accident, the police found drugs in the car, and I need money to post bail' (or for a retainer for a lawyer), it's a scam," Talcove explained.

Earlier this year, law enforcement officials in Canada said one man used AI-generated voices, likely cloned from social media profiles, to con at least eight senior citizens out of $200,000 in just three days.


Similar scams preying on parents and grandparents are also popping up in nearly every state in America. This month, several Oregon school districts warned parents about a spate of fake kidnapping calls.

The calls come in from an unknown caller ID (though even cellphone numbers are easy to spoof these days). A voice comes on that sounds exactly like your loved one saying they're in trouble. Then they get cut off, you hear a scream, and another voice comes on the line demanding ransom, or else.

The FBI, FTC, and even the NIH warn of similar scams targeting parents and grandparents across the United States. In the last few weeks, it's happened in Arizona, Illinois, New York, New Jersey, California, Washington, Florida, Texas, Ohio, Virginia, and many other states.

An FBI special agent in Chicago told CNN that families in America lose an average of $11,000 in each fake-kidnapping scam.

Talcove recommends having a family password that only you and your closest inner circle share. Don't make it anything easily discovered online, either: no names of pets, favorite bands and the like. Better yet, make it two or three words that you discuss and memorize. If you get a call that sounds like a loved one, ask them for the code word or phrase immediately.

If the caller pretends to be law enforcement, tell them you have a bad connection and will call them back. Ask the name of the facility they're calling from (campus security, the local jail, the FBI), and hang up (even though scammers will say just about anything to get you to stay on the line). If you can't reach your loved one, look up the phone number of that facility, or call your local law enforcement and tell them what's going on.


Remember, these criminals use fear, panic, and other proven tactics to get you to share personal information or send money. Usually, the caller wants you to wire money, transfer it directly via Zelle or Venmo, send cryptocurrency, or buy gift cards and give them the card numbers and PINs. These are all giant red flags.

Also, be more careful than ever about what information you put out into the world.

An FTC alert also suggests calling the person who supposedly contacted you to verify the story, using a phone number you know is theirs. "If you can't reach your loved one, try to get in touch with them through another family member or their friend," it says on its website.

"A criminal only needs three seconds of audio of your voice to clone it," Talcove warns. "Be very careful with social media. Consider making your accounts private. Don't reveal the names of your family or even your dog. This is all information that a criminal armed with deepfake technology could use to fool you or your loved ones into a scam."

Talcove shared a half dozen how-to video clips he says he pulled from the dark web showing these scams in action. He explained that criminals often sell information on how to create these deepfakes to other fraudsters.

"I keep my eyes on criminal networks and emerging tactics. We literally monitor social media and the dark web and infiltrate criminal groups," he added. "It's getting scary. For example, filters can be applied over Zoom to change somebody's voice and appearance. A criminal who grabs just a few seconds of audio from your [social media feeds], for example, can clone your voice and tone."

I skipped all the organized crime parts and just Googled "AI voice clone." I won't say exactly which tool I used, but it took me less than 10 minutes to upload 30 seconds of my husband's voice from a video saved on my smartphone to a free AI audio generator online. I typed in a few funny lines I wanted him to say, saved the result on my laptop, and texted it to our family. The most challenging part was converting the original clip from a .mov to a .wav file (and that's easy, too).

It fooled his mom, my parents, and our children.

"We're all vulnerable, but the most vulnerable among us are our parents and grandparents," Talcove says. "Ninety-nine in 100 people couldn't detect a deepfake video or voice clone. But our parents and grandparents, categorically, are less familiar with this technology. They would never suspect that the voice on the phone, which sounds exactly like their child screaming for help during a kidnapping, might be completely artificial."


Jennifer Jolly is an Emmy Award-winning consumer tech columnist. The views and opinions expressed in this column are the author's and do not necessarily reflect those of USA TODAY.
