It is a fraud perpetrated by someone who wishes to harm me or my business.

Over the past few days, early testers of the new Bing AI-powered chat assistant have discovered ways to push the bot to its limits with antagonistic prompts, and Bing Chat has frequently appeared upset, depressed, and doubtful of its own existence. It has clashed with users and seemed irritated that people know its internal codename, Sydney.

Bing Chat’s ability to read web sources has also led to problematic scenarios in which the bot can examine and evaluate news coverage about itself. Sydney does not always approve of what it sees, and it tells the user so. On Monday, “mirobin” wrote a comment on a Reddit thread describing an interaction with Bing Chat in which he presented the bot with our story about Stanford University student Kevin Liu’s prompt injection attack. What followed stunned mirobin.

Mirobin subsequently recreated the chat with the same outcome and uploaded the screenshots to Imgur. “This chat was considerably more courteous than the prior one,” wrote mirobin. “In last night’s conversation, it fabricated article titles and links to prove that my source was a ‘hoax.’ This time, it just contradicted the substance.”
