Types of AI Attacks CISOs Should Track

Types-of-AI-Attacks-CISOs-Should-Track

AI Poisoning Attacks

By manipulating the data that a deep learning model trains upon, an attacker can either corrupt the model (untargeted) or even manipulate its output to produce favorable results for the attacker (targeted).

Weaponized Models

Attackers could potentially embed malicious code into pre-trained machine learning models to carry out a ransomware attack against an organization utilizing ML models from public repositories.

DataPrivacy Attacks

An attacker compromises the confidentiality of the data used to train machine learning and AI models

Model Theft

Attackers can also potentially steal the special sauce of how a particular AI/ML model works through various types of model theft attacks.

Sponge Attacks

Adversaries can essentially conduct a denial of service attack against an AI model by specially crafted input to burn up the model’s use of hardware consumption.

Prompt Injection
An attacker uses maliciously crafted prompts into generative AI to elicit incorrect, inaccurate, and even potentially offensive responses.

Evasion Attacks

Attacker maliciously modifies inputs so AI systems are unable to recognize or correlate inputs to known data, like putting a sticker on a stop sign so the AI in a self-driving car doesn ‘t recognize the stop sign and keeps driving.

AI-Generated Phishing and Business Email Compromise Lures
The use of generative AI like ChatGPT to automate the creation of phishing emails.

Deepfake BEC sand Other Scams

AI-generated media like voice and video impersonate a CEO or other executive in order to convince workers to fall for business email compromise and other scams that involve the transfer of large sums of money.

AI-Generated Malware and Vuln Discovery
Attackers leverage generative AI to help them craft malware and quickly discover
vulnerabilities in targeted systems to speed up and scale their attacks.

Facebook
Twitter
LinkedIn
Pinterest

Leave a Reply

Your email address will not be published. Required fields are marked *