From Schneier on Security – Jailbreaking LLM-Controlled Robots

Surprising no one, it's easy to trick an LLM-controlled robot into ignoring its safety instructions.