Microsoft Offers Cash Rewards for Finding Bing AI Bugs

Microsoft Offers Cash Rewards for Finding Bing AI Bugs

Microsoft has introduced a new initiative to encourage security researchers to uncover vulnerabilities in their Bing AI products.

This AI bug bounty program promises rewards ranging from $2,000 to $15,000.

The eligible products encompass the Bing AI-powered suite, which includes Bing Chat, Bing Chat for Business, Bing Image Creator, and Bing’s AI integrations with the Microsoft Start app and Skype Mobile app.

Researchers are urged to look for various vulnerabilities, including “prompt injection” attacks, false or offensive chat messages, and potential code and system prompt leaks that might also affect OpenAI’s GPT-4.

To report these discoveries, researchers can use the MSRC Researcher Portal, or reach out with questions to [email protected].

The rewards will be determined based on the severity and quality of the vulnerabilities identified.

This initiative comes after the launch of Bing Chat faced initial glitches upon its introduction in February. Despite these early challenges, Microsoft decided to push forward with the platform.

An AlgorithmWatch study revealed that Bing Chat provided inaccurate information related to elections. Despite this, Microsoft continues to promote it as an information source. Additionally, Bing Chat has been used for unlabeled advertisements for Microsoft products.

Since becoming generally available in May 2023, Microsoft has been actively expanding Bing Chat’s capabilities, adding features such as image responses, chat history, and plugins. This transformation aims to evolve Bing Chat from a product into a versatile platform.

CEO Satya Nadella tempered initial enthusiasm by calling his earlier remarks about Bing Chat’s potential impact against Google “exuberance.”

He stated that increasing market share from three percent to 3.5 percent is a realistic goal and that AI could play a significant role in achieving this, even as Google maintains a strong market presence.

Share On:

Leave a Comment