Source: www.schneier.com – Author: Bruce Schneier
Jailbreaking LLM-Controlled Robots
Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.
Tags: hacking, LLM, robotics, social engineering
Comments
tom • December 11, 2024 8:11 AM
Autocomplete lacks “real understanding of context or consequences”; shock.
Original Post URL: https://www.schneier.com/blog/archives/2024/12/jailbreaking-llm-controlled-robots.html