ChatGPT bug reportedly exposed user histories


It’s no surprise that OpenAI’s ChatGPT has taken the world by storm since its release, gathering over 100 million users in just two months. That popularity has come with its share of concerns, particularly around user privacy: users recently discovered a bug in ChatGPT that allowed them to see the titles of other people’s chat histories.

The incident first came to light when ChatGPT users and security researchers reported on Reddit and Twitter that the sidebar that normally displays a user’s own chat history was showing conversation titles belonging to other users.

OpenAI confirmed the incident to Bloomberg, noting that the bug did not expose confidential conversation content and that it is still investigating the cause. Reports suggest, however, that a bug in an unnamed piece of open-source software was to blame.

In response, OpenAI took the chatbot down for a few hours on Monday and replaced the chat history sidebar with a message stating, “History is temporarily unavailable. We’re working to restore this feature as soon as possible.” Although OpenAI’s status page says the chatbot has been restored, the company is still working to bring back the chat history feature.

Major security issue

This incident raises serious privacy concerns about AI chatbots. While OpenAI says it removes personally identifiable information from user data and promises not to use data from companies that pay for its API, this bug highlights that regular users remain exposed and that OpenAI can still access their conversations. Users of ChatGPT and other AI chatbots should be aware that what they share with a chatbot may not be as private as they assumed, so it is worth thinking carefully about the information they submit.


