"Interest in the inference-focused Mach-1 is on the rise. Some clients are eager to deploy Mach in expansive applications with over a trillion parameters, which necessitates the swifter-than-anticipated progression of Mach-2. It's time we commenced preparations," Kyung stated.
Samsung Electronics first introduced the Mach-1 chip at its recent shareholder meeting, offering few details beyond its intended use for inference and an expected launch early next year.
It was the first time the company had publicly disclosed a specific name for its in-house AI chip.
At the same time, Samsung is pressing its claim to leadership in the memory chip race, pointing to a dedicated team working to advance high bandwidth memory (HBM) chips.
"For AI applications, high-capacity HBM is a competitive edge, hence clients' demand for 12H (12-layered) HBM3 and HBM3E," Kyung remarked.
Samsung Electronics developed the 12-layer HBM3E earlier this year and aims to begin mass production in the first half of the year.
Industry watchers expect Samsung's 12-layer HBM3 and HBM3E to be the preferred choices for AI applications, giving the company an edge over rivals SK Hynix and Micron, which have focused on 8-layer HBM3E chips.