San Francisco: Chip maker AMD has unveiled new details of its Instinct MI300 Series accelerator family, including the introduction of the Instinct MI300X, an advanced accelerator for generative AI.
“We are laser-focused on accelerating the deployment of AMD AI platforms at scale in the data centre, led by the launch of our Instinct MI300 accelerators planned for later this year and the growing ecosystem of enterprise-ready AI software optimised for our hardware,” AMD Chair and CEO Dr Lisa Su said in a statement.
The MI300X is based on the next-gen AMD CDNA 3 accelerator architecture and supports up to 192 GB of HBM3 memory to provide the compute and memory efficiency needed for large language model training and inference for generative AI workloads, according to the company.
With the large memory of the AMD Instinct MI300X, customers can now fit large language models such as Falcon-40B, a 40-billion-parameter model, on a single MI300X accelerator.
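The single-accelerator claim can be sanity-checked with back-of-envelope arithmetic: the memory needed just to hold a model's weights is roughly the parameter count times the bytes per parameter. The sketch below is illustrative only (the precision labels and the decision to count weights alone are assumptions, not AMD's methodology), and real deployments also need memory for activations and the KV cache.

```python
# Rough estimate (illustrative, not from AMD): memory to hold the weights
# of a 40B-parameter model at common precisions, versus 192 GB of HBM3.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1}
PARAMS = 40e9       # Falcon-40B parameter count
HBM3_GB = 192       # MI300X memory capacity

for precision, nbytes in BYTES_PER_PARAM.items():
    weights_gb = PARAMS * nbytes / 1e9
    print(f"{precision}: ~{weights_gb:.0f} GB of weights "
          f"-> fits in {HBM3_GB} GB: {weights_gb <= HBM3_GB}")
```

At half precision the weights alone come to roughly 80 GB, comfortably inside 192 GB, which is consistent with the claim that the model fits on one accelerator.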
The chip maker also introduced the AMD Instinct Platform, which brings together eight MI300X accelerators into an industry-standard design for the ultimate solution for AI inference and training.
AMD said that it will begin sampling the MI300X to key customers in the third quarter.
The company also announced that the AMD Instinct MI300A, which it describes as the world’s first APU accelerator for HPC and AI workloads, is now sampling to customers.