Jailbreaking LLM-Controlled Robots

By rooter / December 11, 2024

Surprising no one, it's easy to trick an LLM-controlled robot into ignoring its safety instructions.