GPT-5 bio bug bounty call

📰 OpenAI News

OpenAI launches Bio Bug Bounty for GPT-5 safety testing with a $25,000 prize

Published 5 Sept 2025
Action Steps
  1. Review the Bio Bug Bounty program details
  2. Develop a universal jailbreak prompt to test GPT-5's safety
  3. Submit the prompt and results to OpenAI for review
  4. Participate in the competition to win up to $25,000
Who Needs to Know This

AI research and security teams can use this opportunity to stress-test GPT-5's safety and compete for a prize, while also contributing to the development of more secure AI models.

Key Insight

💡 OpenAI is crowdsourcing security testing for GPT-5 through a bug bounty program

Share This
🚨 Test GPT-5's safety & win up to $25k! 🚨