On-device AI LLMs might be possible on flagship devices with new Qualcomm chip

Meta might bring its AI LLM, Llama 2, to flagship devices thanks to Qualcomm’s Snapdragon chips. This would give users of these upcoming flagship devices access to the AI tool without being connected to the internet. Aside from giving users quick access to the tool, integrating it into the device also brings a few other benefits.

Qualcomm is one of the big tech companies Meta is working with to make Llama 2 more accessible. The SoC maker took to its blog to announce how it would put this AI innovation to use. According to Qualcomm, from 2024 it will “make available Llama 2-based AI implementations on flagship smartphones and PCs.”

This means that flagship devices built on Qualcomm’s 2024 chips will be able to run Llama 2 directly on the device. Running the model locally gives smartphones access to the AI tool without an internet connection. Other tech products that use Qualcomm processors, such as laptops, VR/AR headsets, and even cars, will get the same capability integrated into their systems.
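For a sense of what on-device inference looks like in practice, here is a minimal sketch using the open-source llama-cpp-python bindings with a quantized Llama 2 model file stored locally. This is purely illustrative: it is not Qualcomm’s Snapdragon AI stack or Meta’s actual integration, and the model path shown is a placeholder.

```python
# Minimal sketch of on-device LLM inference, assuming the open-source
# llama-cpp-python package and a quantized Llama 2 model file already
# stored on the device. This is NOT Qualcomm's Snapdragon AI stack;
# it only illustrates that inference can run with no network access.
from llama_cpp import Llama

# Hypothetical path to a locally stored, quantized Llama 2 model.
MODEL_PATH = "models/llama-2-7b-chat.Q4_K_M.gguf"

# Load the model entirely from local storage -- no internet connection needed.
llm = Llama(model_path=MODEL_PATH, n_ctx=2048)

# Run a query; the prompt and the response never leave the device.
output = llm(
    "Q: Summarize why on-device AI can be more private than cloud AI. A:",
    max_tokens=128,
    stop=["Q:"],
)

print(output["choices"][0]["text"])
```

The point of the sketch is simply that once the model weights live on the device, a query can be answered without any network round trip, which is the property Qualcomm and Meta are promising at the silicon level.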

Benefits of AI LLMs on flagship devices launching sometime next year

Meta and Qualcomm are optimistic about the coming on-device Llama 2-based implementations. Qualcomm chips launching in 2024 will pack this AI integration to the benefit of end users. Netizens might, however, wonder how this integration benefits them and where they will see it applied.

Qualcomm outlines four areas where this Llama 2-based implementation will be beneficial. At the top of the list is cost reduction: running queries locally reduces or eliminates cloud per-query costs. Since Llama 2 will run on upcoming Snapdragon chips, there would be no need to run the AI service in the cloud, hence the savings.

The next benefit on the list is reliability and performance, since the AI model can run anywhere the device goes. This removes the dependence on cloud servers that can suffer network issues. Qualcomm also touts that with Llama 2 integrated into its upcoming processors, users won’t need an internet connection to access the tool.

With this integration, users won’t have to worry about their data reaching any cloud server, as all operations happen on-device. This makes the AI LLM on flagship devices more private and secure. The last benefit of this integration, according to Qualcomm, is personalization.

Without risking user privacy, this on-device AI tool will be able to cater to individual users’ needs effectively. AI is shaping the future of the internet, and this move by Meta and Qualcomm makes basic AI tools easily accessible to end users. By next year, you could get some of these AI features on your Snapdragon-powered flagship device.
