
Meta’s Llama 2 on Qualcomm chips

19 July 2023


Qualcomm and Meta reach an agreement

Qualcomm and Meta will enable the social networking company’s new large language model, Llama 2, to run on Qualcomm chips on phones and PCs starting in 2024, the companies announced today.

Until now, Meta’s large language models have run on large server farms powered by Nvidia graphics processors, because of the models’ heavy computational and data requirements. That demand has been lucrative for Nvidia, but it has left a hole in the market for companies that make leading-edge processors for phones and PCs, such as Qualcomm.

This latest announcement suggests that Qualcomm wants to position its processors as well-suited for AI but “on the edge,” or on a device, instead of “in the cloud.”

If large language models can run on phones instead of in large data centres, it could reduce the cost of running AI models and lead to better and faster voice assistants and other apps.
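As an illustration of what on-device inference can look like today, the sketch below runs a quantized Llama 2 chat model locally using the open-source llama-cpp-python bindings. The library, the model file name, and the parameters are illustrative assumptions; the Qualcomm-Meta announcement does not say which runtime Qualcomm hardware will use.

    # A minimal sketch, assuming the open-source llama-cpp-python bindings
    # and a locally downloaded, quantized Llama 2 7B chat model file.
    # (Both are illustrative choices, not the runtime named in the announcement.)
    from llama_cpp import Llama

    # Load a quantized model small enough for laptop- or phone-class memory.
    llm = Llama(model_path="llama-2-7b-chat.q4_0.gguf", n_ctx=512)

    # The prompt is answered entirely on the local device,
    # with no round trip to a data centre.
    result = llm(
        "Q: Name one benefit of running AI models on a phone. A:",
        max_tokens=64,
        stop=["Q:"],
    )
    print(result["choices"][0]["text"].strip())

The appeal for chipmakers is exactly the point above: the per-query cost is borne by the phone or PC itself rather than by a cloud provider’s GPUs.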

The shift resembles the earlier move from mainframes with dumb terminals to personal computers such as the IBM PC. Yet, after years of momentum behind cloud computing, pushing work back onto devices can look like a step backward. It will be interesting to see which approach eventually wins.
