Bing also got things wrong during its presentation


Google lost $100 billion in market value after an error in the text generated by Bard during its presentation. But it is not the only intelligent chatbot to have made mistakes: the new Bing was also wrong during the Microsoft conference that unveiled this new ChatGPT-linked feature of the search engine.

The new Bing // Source: Microsoft

Although Google Bard is not available to the general public, it has already cost the company $100 billion in market capitalization. And for good reason: in its presentation, Google showed automatically generated responses that contained false information. While the new Bing has significant differences from ChatGPT, it shares similarities with the OpenAI chatbot.

However, it is not the only artificial intelligence to get things wrong: the new Bing, announced the day before, also made mistakes. The first demonstrations of the tool were "riddled with financial data errors", writes The Verge. The outlet relays the analysis of independent researcher Dmitri Brereton, who made several findings about Bing that do not necessarily work in Microsoft's favor. For the moment, the errors discovered have had no impact on the company's stock market value.

A new Bing that also makes mistakes

Microsoft presented some possible uses of its new tool integrated into Bing. It can answer questions based on the content of indexed pages: creating recipes, planning a trip and, more generally, synthesizing search results related to a query. The AI was found to have mistaken a corded vacuum for a cordless model, and it also got financial results wrong. It is precisely on this point that The Verge took issue.

Bing AI got Gap financial results wrong // Source: Microsoft

Bing tried to summarize the third-quarter 2022 financial report of the clothing brand Gap: while it indicated a gross margin of 37.4%, the report actually specifies an adjusted gross margin of 38.7%. Bing also wrote that Gap's operating margin was 5.9% for the period, when it was in fact 4.6%, or even 3.9% after adjustment. For The Verge, it is very clear: "the result is a comparison riddled with inaccuracies."

What Bing’s bugs tell us about AI

The first lesson is that Bing is fallible: it is not completely trustworthy, and the AI behind it is not bug-free. For now, there is no substitute for human verification, especially in specific and subtle areas like finance.

The other lesson is that Bing does not rely solely on the links it displays in its results. While all the elements come from pages indexed by the search engine, some are not cited in the answers. This became clear with "a query on the pros and cons of the best-selling pet vacuums", writes The Verge. Bing cited a reference and listed the drawbacks of the cord, but according to Brereton this is wrong, since the vacuum in question has no cord. However, upon further research, it turns out there is, in fact, a corded version of this handheld vacuum.

Bing claims false things with aplomb // Source: Frandroid

The consequence of all this is that Bing uses several sources without listing all of them, which caused the two versions of the vacuum in question to be confused. Another very interesting point: human verification itself is not infallible. The double-check carried out by The Verge shows that the researcher was also mistaken.

In short, Microsoft still has some work to do to make the AI-powered version of Bing less buggy. The firm will be able to rely on the data generated by the first users who gain access to the AI. For now, you have to sign up for a waiting list to use the new Bing. Some of Microsoft's methods around this list have allowed Bing to climb the ranking of the most downloaded apps on the App Store.
