Meta’s large language models have so far run in large server farms on Nvidia graphics processors, which supply the computational power and memory the technology demands. That has been lucrative for Nvidia, but it has left an opening in the market for companies that make leading-edge processors for phones and PCs, such as Qualcomm.
This latest announcement suggests that Qualcomm wants to position its processors as well suited to AI “on the edge,” that is, on a device, rather than “in the cloud.”
If large language models can run on phones instead of in large data centres, it could cut the cost of running AI models and enable faster, more capable voice assistants and other apps.
The idea echoes the shift from mainframes with dumb terminals to the IBM PC. Yet given the industry’s heavy investment in cloud computing, moving work back onto devices can look like a step backward. It will be interesting to see which approach ultimately wins.