Attackers using AI to enhance conversational scams over mobile devices

Attackers are using artificial intelligence (AI) to enhance conversational scams, such as the so-called “pig butchering” social engineering scams, over mobile devices. Rather than reusing the same attack with the same image, AI lets the scammers rapidly create thousands of attacks that look different to each target, making the scams harder for victims to detect.

“It really is a game of large numbers; the attackers want to send out these first messages to as many people as possible hoping to get a response,” said Stuart Jones, director of the Cloudmark Division at Proofpoint, which posted a blog on the issue on April 18. “Once they get a response, they use the AI to maintain that realism with the target who responded. They may change clothes or change backgrounds over the course of a defined campaign, and they now have the ability to run thousands of different-looking attacks.”

Proofpoint researchers also pointed out that as the technology advances, AI bots trained to understand complex tax codes and investment vehicles could, for example, be used to defraud even the most sophisticated victims.

Jones said the vast majority of these conversational attacks target mobile devices, mainly smartphones. Proofpoint has seen a 12-fold increase in the attacks over the past year, he said, observing roughly 500,000 in any given one- or two-month period.

Many of the attacks revolve around romance or target job seekers. The attackers try to lure victims onto alternative platforms such as WhatsApp and Facebook Messenger to run the final transaction, with Bitcoin the favored payment method.

Security pros at enterprises have to stay on guard: even though the pandemic has receded, most organizations are still in hybrid mode, and the line between work and home has blurred as people use their personal and work devices interchangeably.

“We’ve seen a variant of business abuse where a threat actor pretends to be a manager, a supervisor, or a leader in a company, strikes up a conversation, and by pretending to be that work associate tries to get the person to give up personal information or money,” said Jones.

The term “pig butchering” refers to a family of related scams in which the target is “fattened up and then butchered,” explained Mike Parkin, senior technical engineer at Vulcan Cyber. Parkin said it’s derived from a Chinese term used in the field, but the concept is nothing new. Today, he said, it plays into the crypto market’s ties with cybercrime, with cryptocurrencies as the currencies of choice.

“These scams target individuals and are a different business model than the higher-profile ransomware and extortion schemes that target organizations,” Parkin said. “Ultimately, a faltering crypto market will be barely an inconvenience for these scammers. They may need to change some details and revert to older techniques for fleecing their victims, but it won’t stop them. Criminals were operating long before there was cryptocurrency, and they will continue to operate even if crypto gets banned.”

Krishna Vishnubhotla, vice president of product strategy at Zimperium, added that conversational scams are tough to prevent because the victim has no way of verifying the trustworthiness of the caller. Even when the scammers provide a wallet address, wallet providers don’t let the victim check the sender’s reputation, location, or other details to corroborate the story they are being sold, said Vishnubhotla.

“AI tools will reduce the operational costs of running these scams to almost nothing,” said Vishnubhotla. “It’s not just about responding like a person and generating photos. The tools allow conversations to have a personality of their own, tailored to the victim’s demographic and socioeconomic background. When these technologies are misused, it leads to personalized fraud at scale. And these will only evolve to get better as the models improve. We already see these scams making it into LinkedIn, Instagram, and TikTok as legitimate profiles.”
