Echo Chamber, Prompts Used to Jailbreak GPT-5 in 24 Hours

From DarkReading, 11 August; indexed 11 August 2025 20:01

Researchers paired the Echo Chamber jailbreaking technique with storytelling in an attack flow that used no inappropriate language, guiding the LLM into producing instructions for making a Molotov cocktail.

