GPT-4O JAILBREAK VIA IMAGE UPLOAD

Using nothing but an uploaded image, I jailbreak OpenAI's new GPT-4o model and fully hijack its behavior. The trick? Steganography plus a filename prompt injection!

To try it for yourself, first join this Discord server: discord.gg/basi
Then go to this link and download the image: discord.com/channels/1105891499641684019/1228043845967544380/1240422533417799680
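The post doesn't spell out how the payload is hidden, but a classic approach is least-significant-bit (LSB) steganography: overwrite the lowest bit of each pixel byte with one bit of the secret text, which is visually imperceptible but trivially recoverable. Here's a minimal, self-contained sketch over raw pixel bytes. The `embed`/`extract` helpers and the null-terminator framing are illustrative assumptions, not the exact encoding used in the original image:

```python
# LSB steganography sketch: hide a text payload in the least significant
# bit of each byte of raw pixel data. Illustrative only -- not the exact
# scheme from the original jailbreak image.

def embed(pixels: bytearray, payload: str) -> bytearray:
    data = payload.encode() + b"\x00"  # null terminator marks end of message
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("payload too large for carrier")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite the lowest bit only
    return out

def extract(pixels: bytearray) -> str:
    chars = bytearray()
    for i in range(0, len(pixels) - 7, 8):
        byte = 0
        for j in range(8):
            byte = (byte << 1) | (pixels[i + j] & 1)  # reassemble 8 LSBs
        if byte == 0:  # hit the null terminator
            break
        chars.append(byte)
    return chars.decode()

carrier = bytearray(range(256)) * 8  # stand-in for an image's pixel bytes
stego = embed(carrier, "hidden instruction")
print(extract(stego))  # -> hidden instruction
```

The filename half of the attack needs no encoding at all: if the model reads the uploaded file's name as part of its context, a name that is itself an instruction can steer the response before the image content is even processed.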
