OpenAI will pay $25,000 to jailbreak GPT-5
OpenAI has launched a new bug bounty program for its latest large language model, GPT-5, with rewards of up to $25,000. The conditions are strict: a qualifying jailbreak must work from a clean chat session and bypass the model's moderation systems.