Dell is expanding its AI Factory with Nvidia to include new server, edge, and workstation offerings, along with solutions and service advancements designed to speed up AI adoption and innovation.
Dell’s grand pooh-bah Michael Dell said that organisations are moving quickly to capture the AI opportunity, which is why collaborating with Nvidia is so important.
“Our expansion of the Dell AI Factory with NVIDIA continues our joint mission – we’re making it easy for organisations to implement AI so they can move boldly into this next technological revolution.”
Nvidia CEO Jensen Huang said generative AI requires a new type of computing infrastructure – an AI factory that produces intelligence.
“Together, Nvidia and Dell are providing the world’s industries with a full-stack offering – including computing, networking, and software – that drives the co-pilots, coding assistants, virtual customer service agents, and industrial digital twins of the digital enterprise.”
Dell’s AI Factory will integrate Dell’s leading AI portfolio with Nvidia’s AI Enterprise software platform, underpinned by Nvidia Tensor Core GPUs, the Spectrum-X Ethernet networking fabric, and BlueField DPUs.
This means customers can buy either integrated capabilities tailored to their needs or pre-validated, full-stack solutions to get started on AI use cases that demand accelerated performance, such as retrieval-augmented generation (RAG), model training, and inferencing.
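To make the RAG use case concrete, here is a minimal, library-agnostic sketch of the pattern in Python. It is not Dell or Nvidia code: the embedding step is a toy bag-of-words vectoriser and `call_llm` is a hypothetical stand-in for whatever GPU-accelerated inference endpoint a deployment actually exposes. The point is the shape of the workload, a retrieval step over embeddings followed by a generation step, both of which are what the accelerated hardware is being validated against.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Toy example only: a real deployment replaces the bag-of-words "embedding" and the
# hypothetical call_llm() stub with GPU-accelerated embedding and inference services.
import math
from collections import Counter

DOCUMENTS = [
    "The Dell AI Factory pairs Dell servers with Nvidia GPUs and networking.",
    "Spectrum-X is Nvidia's Ethernet fabric for AI workloads.",
    "BlueField DPUs offload networking, storage and security tasks.",
]

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; stands in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for an inference call to a hosted model."""
    return f"[model answer based on prompt of {len(prompt)} characters]"

def answer(query: str) -> str:
    """Retrieve relevant context, then ask the model to answer from it."""
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

if __name__ == "__main__":
    print(answer("What does Spectrum-X do?"))
```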
However, the announcement has also sparked rumours that the pair will put out a Dell/Nvidia AI-based laptop next year.
Since October 2023, the dark satanic rumour mill has been all aflutter with yarns about Nvidia making ARM-based CPUs after news emerged that Microsoft was encouraging companies to enter the market.
The Tame Apple Press insisted that it was because Microsoft was terrified by Apple's M-series chips, but we have since discovered that is not the case.
Nvidia has been using AI for years, notably through deep learning features such as DLSS, which is exclusive to its RTX graphics cards, alongside hardware-accelerated ray tracing. The company's AI research division is advancing areas like computer vision and natural language processing, while its enterprise GPUs power data centres and cloud AI services. On the CPU front specifically, Nvidia already pairs its ARM-based Grace CPU with a Hopper GPU in the Grace Hopper Superchip for AI and high-performance computing, and previously we've seen the Tegra series of SoCs.
While there's a significant chance that Nvidia will opt for a cautious approach, a low-powered SoC for handheld consoles or compact gaming laptops would be quite welcome.
With major computing brands focused on building an AI-enhanced PC ecosystem, this development matters for meeting the growing demand for intelligent computing. AI-specific chips can handle natural language processing, real-time data analysis, and advanced image and video processing. These capabilities boost performance and enable new features that improve the user experience in both consumer and professional environments.
This presents a substantial opportunity for Nvidia. The company could not only reinforce Windows on ARM as a platform, in line with the recently announced Copilot+ PCs, but also expand beyond it to create a more integrated ecosystem that combines AI and gaming. Solid control over the CPU could unlock new gaming experiences, extending beyond laptops to consoles and desktops.