Microsoft unveiled its Phi-3 family of AI models

Ever since the birth of the computer, we’ve seen groundbreaking technology shrink in size while growing more powerful. The computers in our pockets are exponentially more powerful than the room-sized machines used back in the 1960s. The same thing is now happening with AI models. Microsoft just released the Phi-3 series of AI models, and they come in three sizes.

If this sounds familiar, Google released Gemini, a family of models that comes in three sizes. Recently, Meta launched its Llama 3 family of models, which also comes in different sizes. Companies are offering their LLMs in multiple sizes, which makes them more versatile.

Where this really shines is on-device computing. Many developers simply don’t need large, internet-connected LLMs, and plenty of people are perfectly happy using smaller models that can easily fit on a computer.

Microsoft just released its Phi-3 AI models

Microsoft announced that these models are more capable than the previous iteration. One of the more interesting models in the family is Phi-3 Mini. It’s the smallest of the three, and it’s designed to run on smaller devices such as laptops and smartphones. If it’s small enough to fit on smartphones, it could be a competitor to Google’s Gemini Nano.

The corporate vice president of Microsoft Azure, Eric Boyd, told The Verge that Phi-3 Mini is actually as capable as GPT-3.5, but in a much smaller form factor. According to the report, Phi-3 Mini has 3.8 billion parameters. The next model in the family, Phi-3 Small, has 7 billion parameters, and Phi-3 Medium has 14 billion parameters.

If you’re using platforms like Microsoft’s Azure, Hugging Face, or Ollama, you already have access to Phi-3 Mini. We’re sure these new AI models will help push Microsoft forward in the rapidly growing AI space. It’s one of the leaders in artificial intelligence, so we’re excited to see what the company does next.
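
For developers who want to kick the tires, here is a minimal sketch of loading Phi-3 Mini from Hugging Face and generating text with the transformers library. The repository name "microsoft/Phi-3-mini-4k-instruct", the prompt, and the generation settings are assumptions for illustration rather than an official quickstart, so check the model card on Hugging Face for the exact details.

```python
# Minimal sketch: run Phi-3 Mini locally via Hugging Face transformers.
# The model ID below is an assumption for illustration; confirm it on the
# model card before use.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # requires the accelerate package; places weights on GPU/CPU
    trust_remote_code=True,   # may be needed depending on your transformers version
)

# Wrap the model and tokenizer in a simple text-generation pipeline.
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Quick smoke test to confirm the model loads and responds.
output = generator(
    "Explain in one sentence why small language models matter.",
    max_new_tokens=64,
)
print(output[0]["generated_text"])
```

Because the Mini model only has 3.8 billion parameters, this kind of local setup is realistic on a consumer laptop, which is exactly the niche Microsoft is targeting.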
