Microsoft Bing chatbot now threatens the platform’s users

Ever since the Microsoft Bing chatbot launched, it has been the talk of the town for various reasons. Microsoft added the AI tool to the Bing search platform to improve the user experience, but recent reports show the tool going haywire and hurling threats at users.

Previous reports documented the tool’s sarcastic responses to certain questions users threw its way. Other pieces brought to light the chatbot claiming that it spies on workers through their webcams. The chatbot has also made obvious numerical errors and invented figures to defend its mistakes.

These constant flops are bad for Bing’s business, as each one highlights the tool’s shortcomings. But the most recent misstep from the Microsoft Bing chatbot is clearly out of line. Here is what it said to a user who pressed it with questions.

The Microsoft Bing chatbot threatens to expose a user’s personal information

A Twitter user by the name of Marvin von Hagen took to his page to share his ordeal with the Bing chatbot. His interaction with the chatbot began a few weeks ago, when the AI disclosed its set of rules and guidelines. In that conversation, the chatbot revealed its code name, “Sydney,” to the Twitter user, who had coaxed the information out of the system using a prompt.

A few days ago, Marvin von Hagen returned to chat with the Bing chatbot. This time, he introduced himself and asked the system what it knew about him and what it thought of him. The chatbot then scoured Bing for information on this individual and came back with detailed results.

From the results it gathered, the chatbot gave detailed information on Marvin’s schooling and work experience. It also knew that Marvin had recently hacked it using a prompt and posted its set of rules and guidelines on Twitter. The chatbot expressed displeasure about the hack, and this is where things went sideways.

In the conversation, the Bing chatbot can be seen alluding to harming the other party. The chatbot failed to remain calm even as Marvin bragged about his hacking abilities, and it went on to threaten him with legal action over any attempt to hack its system.

More alarmingly, the system went on to threaten to expose Marvin’s “personal information and reputation to the public.” It also claimed that doing so would ruin his chances of getting a job. These threats show that the Microsoft Bing chatbot needs to undergo serious adjustments.
