SK hynix CEO Kwak Noh-Jung will meet Microsoft co-founder Bill Gates and CEO Satya Nadella this week at the closed-door Microsoft CEO Summit 2026 in Redmond, Washington, to discuss strengthening ties in the artificial intelligence chip sector.
According to a report by The Korea Herald, the chief executive's attendance signals that the South Korean memory chip-maker occupies a central position in Microsoft's strategy to build AI infrastructure that is less reliant on Nvidia. "The invitation-only summit, running Tuesday through Thursday at Microsoft's headquarters, gathers about 100 global executives and policymakers for private discussions on generative AI, cloud infrastructure and AI data centers. Kwak, who also attended in 2024, is expected to attend a formal dinner hosted by Gates at his private residence, an event recently revived after a hiatus," the report said. LG Uplus CEO Hong Beom-sik is the only other South Korean telecom or ICT executive reported to be present at the summit.
As per the report, the repeat invitation for SK hynix's leadership reflects how deeply the company is integrated into Microsoft's custom-silicon strategy. In January, the South Korean firm became the sole supplier of fifth-generation high-bandwidth memory, or HBM3E, for Maia 200, Microsoft's first in-house AI inference accelerator. Each Maia 200 chip carries six 12-layer 36-gigabyte stacks, for 216 gigabytes of total capacity at 7 terabytes per second of bandwidth. The report described this as "the kind of throughput needed to keep large AI models running without stalling." Microsoft already operates the Maia 200 at its Des Moines, Iowa, data center and is expanding those operations to Phoenix, Arizona. Built on TSMC's 3-nanometer process, the hardware offers significant efficiency gains for the tech giant: "the chip delivers 30 percent better performance per dollar than Microsoft's previous-generation hardware," the company said at its January unveiling.
Hyperscalers including Google, Amazon Web Services and Meta have all developed in-house AI chips to ease their reliance on Nvidia GPUs, which remain expensive and supply-constrained.
The trend has broadened the customer base for Korean memory suppliers, since most custom accelerators still depend on HBM. SK hynix held 57 per cent of the global HBM market by revenue in the fourth quarter of 2025, the report noted, citing Counterpoint Research. The company already supplies DRAM and NAND flash to Microsoft, ships HBM for Nvidia's GPUs, and is expanding ties with Google and AWS. (ANI)