Samsung investing over £55 billion on AI chip expansion

Samsung is planning to spend over £55 billion on AI chip expansion.

The South Korean tech giant is aiming to increase its investment in production and research by 22 percent this year, as it looks to secure the top spot as Nvidia’s main memory provider.

The company has announced plans to invest over 110 trillion Korean won in facilities and in research and development.

Samsung is aiming to secure its leadership in the “AI semiconductor era” by using its memory, foundry and advanced packaging capacity.

The increased funds will be directed at “future-oriented” sectors, including AI and advanced robotics, and the company will pursue potential merger and acquisition deals in those areas.

Meanwhile, Samsung has announced a Memorandum of Understanding (MOU) with AMD to expand their strategic collaboration on next-generation AI memory and computing technologies.

Young Hyun Jun, Vice Chairman and CEO of Samsung Electronics, said in a statement: “Samsung and AMD share a commitment to advancing AI computing, and this agreement reflects the growing scope of our collaboration.

“From industry-leading HBM4 and next-generation memory architectures to cutting-edge foundry and advanced packaging, Samsung is uniquely positioned to deliver unrivalled turnkey capabilities that support AMD’s evolving AI roadmap.”

The two parties will “align on primary HBM4 supply for the next-generation AMD AI accelerator, the AMD Instinct MI455X GPU, as well as advanced DRAM solutions for 6th Gen AMD EPYC CPUs”, which have the codename Venice.

The press release adds: “Samsung and AMD are closely collaborating on advanced memory technologies for AI and data centre workloads.

“As memory bandwidth and power efficiency become increasingly critical to system-level performance, this collaboration will help deliver more optimized AI infrastructure for customers.”

Dr. Lisa Su, Chair and CEO of AMD, commented: “Powering the next generation of AI infrastructure requires deep collaboration across the industry.

“We are thrilled to expand our work with Samsung, bringing together their leadership in advanced memory with our Instinct GPUs, EPYC CPUs and rack-scale platforms.

“Integration across the full computing stack, from silicon to system to rack, is essential to accelerating AI innovation that translates into real-world impact at scale.”
