Right now, Microsoft-backed OpenAI has a selection of LLMs to power your ChatGPT experience. Well, the company just released a cheaper version of its flagship AI model. Named GPT-4o Mini, this model costs much less to use, and it might be a boon for ChatGPT’s largest user base.
In case you don’t know, OpenAI has several models, like GPT-3.5 Turbo, GPT-4 Turbo, and GPT-4o. The last of those is the latest and most powerful model the company has. It’s available to all users, but paying users get the most access to it. It’s multimodal, and it can understand live video.
If you want to know more about this model, we have Everything You Need To Know About GPT-4o. It answers all of the important questions about it.
OpenAI released GPT-4o Mini
When it comes to using a model, cost matters a lot. Every time you send a query and get a response, it costs money. On that front, the GPT-4o Mini model costs 60% less than GPT-3.5 Turbo. That’s a massive reduction in cost. So, if you’re using this model, you will see some major cost savings.
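For developers, taking advantage of that pricing is mostly a matter of changing the model identifier in an API call. Here’s a minimal sketch using OpenAI’s official Python SDK, assuming the openai package is installed and an API key is set in the environment; the prompt is just a placeholder.

```python
# Minimal sketch: calling GPT-4o Mini through OpenAI's Python SDK.
# Assumes the OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # swap in "gpt-3.5-turbo" to compare against the older, pricier model
    messages=[{"role": "user", "content": "Summarize what GPT-4o Mini is in one sentence."}],
)

print(response.choices[0].message.content)
```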
When it comes to performance, this model definitely has some chops. On the MMLU (Massive Multitask Language Understanding) test, GPT-4o scored 88.7%, and GPT-4o Mini was a bit behind with 82%. It’s important to note that this is only one benchmark, though.
GPT-4o Mini scored higher than GPT-3.5 Turbo, which came in at 70%. That’s important to know because GPT-4o Mini is meant to be the replacement for GPT-3.5 Turbo. That’s right: free users are getting access to a more powerful model. Paid users got models like GPT-4 and GPT-4 Turbo, but free users haven’t seen too many changes for nearly two years.
At this point, we don’t know everything that was stripped away from GPT-4o to make this Mini version. We do know that it can currently only take text and images as input; it will eventually gain the ability to understand audio and video down the road.
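Because text and image inputs are both supported, a single request can mix the two. The snippet below is a rough illustration of that pattern with the same Python SDK; the image URL is a placeholder, not a real asset.

```python
# Rough illustration: sending text plus an image URL to GPT-4o Mini.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what's in this picture."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```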
GPT-3.5 Turbo isn’t going away altogether. Users will still be able to access it through the API.