
GUEST ESSAY: A DIY guide to recognizing – and derailing – Generative AI voice scams – Source: www.lastwatchdog.com


Source: www.lastwatchdog.com – Author: bacohido

By Alexander Konovalov

Americans lost a record $10 billion to scams last year — and scams are getting more sophisticated.

Related: Google battles AI fakers

AI voice cloning, recently used to impersonate Joe Biden and Taylor Swift, is powering a growing wave of scams, and one in three adults admit they aren’t confident they could tell a cloned voice from the real thing.

Google searches for ‘AI voice scams’ soared by more than 200 percent over the course of a few months. Here are a few tips on how to avoid falling prey to voice cloning scams.

•Laugh. AI has a hard time recognizing laughter, so crack a joke and gauge the person’s reaction. If their laugh sounds authentic, chances are there’s at least a human on the other end of the line.

•Test their reactions. Say something a real person wouldn’t expect to hear. For instance, if scammers are using artificial intelligence to imitate an emergency call from your relative, say something out of place, such as “Honey, I love you.” Whereas a real person would react with panic or confusion, AI would simply reply “I love you too.”


•Listen for anomalies. While voice cloning technology can be convincing, it isn’t yet perfect. Listen for unusual background noises and unexpected changes in tone, which may result from the variety of data used to train the AI model. Unusual pauses and speech that sounds like it was generated by ChatGPT are also clear giveaways that you’re talking to a machine.

•Verify their identity. Don’t take a familiar voice as proof that a caller is who they say they are, especially when discussing sensitive subjects or financial transactions. Ask them to provide as many details as possible: the name of their organization, the city they’re calling from, and any information that only you and the real caller would know.

•Don’t overshare. Avoid sharing unnecessary personal information online or over the phone. Scammers often phish for private details they can use to impersonate you by pretending to be from a bank or government agency. If the person on the other end seems to be prying, hang up, find a number on the organization’s official website, and call back to confirm their legitimacy.

•Treat urgency with skepticism. Scammers often use urgency to their advantage, pressuring victims into acting before they have time to spot the red flags. If you’re urged to download a file, send money, or hand over information without carrying out due diligence, proceed with caution. Take your time to verify any claims (even if they insist there’s no time).

About the essayist: Alexander Konovalov is the Co-Founder & Co-CEO of vidby AG, a Swiss SaaS company focused on Technologies of Understanding and AI-powered voice translation solutions. A Ukrainian-born serial tech entrepreneur and inventor, he holds patents in voice technologies, e-commerce, and security. He is also a co-founder of YouGiver.me, a service that offers easy and secure communication through real gifts, catering to individual users and e-commerce businesses.

March 11th, 2024 | Best Practices | Essays | Privacy | Top Stories

Original Post URL: https://www.lastwatchdog.com/guest-essay-a-diy-guide-to-recognizing-and-derailing-generative-ai-voice-scams/

Category & Tags: Best Practices,Essays,Privacy,Top Stories – Best Practices,Essays,Privacy,Top Stories


