Industry Findings: Demand for memory tuned to large-model inference rose as national compute capacity plans matured and public research infrastructure expanded. A notable policy milestone came in Nov-2023, when the UK government committed £225m to a national AI supercomputer programme intended to accelerate academic and industrial AI research. This investment altered procurement patterns: universities and research consortia now prioritise denser DRAM and larger on-node memory pools to avoid network-bound training phases. The policy push shortened time-to-prototype for memory-intensive models and strengthened the business case for localised AI racks that reduce cross-border data flows while improving latency for domestic services.
Industry Player Insights: The UK's market performance is shaped by players such as Graphcore, Arm, Imagination Technologies, and NMI (New Model Innovations). Graphcore’s corporate restructuring and ownership change in Jul-2024 renewed investor focus on IPU-class compute for model training, which in turn accelerated customer evaluations of tightly integrated memory+processor boards. Later, in Nov-2024, the company increased headcount and expanded engineering capacity to speed product iterations and partner integration. These vendor moves raised the visibility of UK-designed accelerator-memory co-design, helping system integrators validate alternatives to traditional GPU+HBM stacks and tightening the feedback loop between design iterations and memory subsystem requirements.