OpenAI has launched a new bug bounty program for its latest large language model, GPT-5. The headline feature is a reward of up to $25,000. The challenge is strict: the jailbreak must work from a clean chat and bypass moderation systems.