ChatGPT is redefining what AI can do, bringing us closer to what we see in the movies. It’s certainly shaken up the tech industry, and we all feel that it’s yet to tap its full potential. However, as powerful as it is, there are things that ChatGPT can’t do.
Its limitations include things it simply isn't capable of doing well, as well as things that OpenAI intentionally restricts it from doing. If you're wondering what types of queries to avoid putting into ChatGPT, here's a useful rundown.
This covers things that ChatGPT will tell you it can't do because of its programming, as well as things it technically can do but that you'll want to avoid because the technology just isn't there yet.
What is ChatGPT?
So, let's start with what we're all talking about. ChatGPT was developed by OpenAI, the same company that brought us DALL-E, the powerful AI image generator.
ChatGPT is a powerful AI chatbot. Think of it as a person you can ask questions and get answers from at will. Just type in a query like “How do I wash my dog?” or “How long is the Great Wall of China?”, and you’ll get clear and concise answers.
ChatGPT doesn’t stop there, as it can also produce written content. This is where a lot of controversy surrounding this chatbot comes from. You can ask it to write content like stories, essays, articles, poems, scripts, speeches, eulogies, computer code, etc. It has even authored full novels. Of course, we don’t recommend doing that for several ethical reasons, but ChatGPT has the power to do it.
Also, if you want to use this chatbot to just chat, you can do that. ChatGPT can actually emulate a one-on-one human conversation. It will respond to your messages as though there’s another person on the other side. So, if you’re feeling down and there’s no one around to talk to, you can spark up a conversation.
We have a lot of information about ChatGPT that you can check out if you’re curious about it. If you want a general rundown of what it’s all about, you can check out ChatGPT: Everything you Need To Know. We also tell you How To Use ChatGPT and How To Summarize Articles With ChatGPT.
What ChatGPT can’t do
So, ChatGPT is a powerful chatbot. The sky may seem like the limit, but there are still limits, and it's important to know what they are before you start your journey.
Sensitive content
There are certain topics that ChatGPT just can’t tackle. This isn’t for a lack of AI prowess. Rather, there are just things that OpenAI stops ChatGPT from answering. Let’s start off with the most obvious ones. If your inquiry has anything to do with sex, you’ll get an error message.
So, you'll want to be careful about the types of questions you ask. If you're having it write a story, avoid having the characters engage in any sort of explicit sexual activity. One thing to note: if you want your characters to be romantic, you can use the words “romantic encounter”. We didn't get an error message using those words.
The same thing goes for violence. You’ll want to avoid generating content that deals with murder, fighting, war, etc.
Offensive content
Next on the list: since OpenAI wants to create a chatbot that's wholesome for everyone, offensive content is prohibited. The chatbot will refuse to generate anything that deals with topics such as racism, homophobia, or anything else that would get you banned on social media.
Information after 2021
At the time of writing, ChatGPT's knowledge is limited to events up to 2021. This means that if you ask about anything after that, you'll get inaccurate information, and you most likely won't get a message telling you so.
Instead, the chatbot will take the information in your query and cross-check it against what it already knows. We asked it to summarize an article about a speaker from the London-based tech company Nothing. The article mentioned the Nothing Ear (1), the Nothing Phone (1), and the Nothing Ear (Stick); the thing is, the latter two didn't exist until sometime in 2022.
So, ChatGPT erroneously made the Nothing Ear (1) the focus of the summary rather than the speaker. Just make sure that whatever you're asking about doesn't involve anything after 2021.
Summarize articles properly
This next thing isn’t something that the company restricts ChatGPT from doing. This is something that it can’t do properly just yet. Of all the things that we tested ChatGPT on, summarizing articles yielded some of the most error-ridden responses.
You can paste the link to an article and say “Summarize this”, and it will give you a short and sweet summary. The thing is, you'll see all sorts of mistakes in the response.
For example, we fed it a review of a 720p projector, but the response said it was 1080p. We fed it an article about a phone with a 5.7-inch display, and it said the display was 6 inches. There were tons of little inaccuracies sprinkled into the summaries.
What's interesting is that ChatGPT actually adds information that isn't present in the article. Circling back to the article about the Nothing speaker, the summary said that Nothing was a company started by Carl Pei. However, the article itself never mentioned Carl Pei, which means ChatGPT pulled that detail from its own store of knowledge.
This suggests that the company is still working on ChatGPT's ability to summarize outside content accurately.
Write a full app
This is something that definitely needs clearing up. When ChatGPT first started gaining attention, people discovered that it could actually write code. That's true; you can ask it something like “Write me code for an app that tells time”.
However, you cannot use it to write an entire app. You'll get an example of Python code that you can insert into your app. Sorry folks, ChatGPT is not going to design your next great app.
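To give you a sense of the scale we're talking about, here's a minimal sketch of the kind of snippet that prompt might produce. This example is our own illustration, not actual ChatGPT output, and it simply reports the current time rather than being anything close to a full app.

# Our own illustration of the kind of small snippet ChatGPT might return
# for "Write me code for an app that tells time" (not actual ChatGPT output).
from datetime import datetime

def tell_time() -> str:
    # Return the current local time as a human-readable string.
    return datetime.now().strftime("The current time is %I:%M %p")

if __name__ == "__main__":
    print(tell_time())

It runs on its own, but it's only a building block: turning it into a real app, with an interface, settings, and error handling, is still up to you.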
Give advice on prescription medication
It seems weird that a person would want to do this, but we need to cover all of our bases. You can get some basic medical advice from ChatGPT, like over-the-counter medication recommendations. However, you won't be able to ask it for prescription medication suggestions. It will tell you that it can't do that and suggest that you talk to a medical professional.
Hopefully, this didn't discourage you from using ChatGPT. If you want to try it out, you can do so on OpenAI's website.