Microsoft is allegedly working with AMD to expand into AI chips


Microsoft has allegedly teamed up with AMD to bolster the chipmaker’s expansion into artificial intelligence processors. According to a report by Bloomberg, Microsoft is providing engineering resources to support AMD’s efforts as the two companies join forces to compete against Nvidia, which controls an estimated 80 percent of the AI processor market.

In turn, Bloomberg’s sources also claim that AMD is helping Microsoft develop its own in-house AI chips, codenamed Athena. Several hundred employees from Microsoft’s silicon division are reportedly working on the project, and the company has apparently already sunk around $2 billion into its development. Microsoft spokesperson Frank Shaw has, however, denied that AMD is involved with Athena.

We have contacted AMD and Microsoft for confirmation and will update this story should we hear back.

Nvidia’s commanding market share for GPUs has helped it dominate the AI chip industry

The explosive popularity of AI services like OpenAI’s ChatGPT is driving the demand for processors that can handle the huge computational workloads required to run them. Nvidia’s commanding market share for graphics processing units (GPUs) — specialized chips that provide the required computing power — allows it to dominate this space. There’s currently no suitable alternative, and that’s a problem for companies like Microsoft, which needs Nvidia’s expensive processors to power the various AI services running on its Azure cloud platform.

Nvidia’s CUDA libraries have driven most of the progress in AI over the past decade. Despite AMD being a major rival in the gaming hardware industry, the company still doesn’t have a suitable alternative to the CUDA ecosystem for large-scale machine learning deployments. Now that the AI industry is heating up, AMD is seeking to place itself in a better position to capitalize. “We are very excited about our opportunity in AI — this is our number one strategic priority,” Chief Executive Officer Lisa Su said during the chipmaker’s earnings call Tuesday. “We are in the very early stages of the AI computing era, and the rate of adoption and growth is faster than any other technology in recent history.”

Su claims that AMD is well positioned to create partly customized chips for its biggest customers to use in their AI data centers. “I think we have a very complete IP portfolio across CPUs, GPUs, FPGAs, adaptive SoCs, DPUs, and a very capable semi-custom team,” said Su, adding that the company is seeing “higher volume opportunities beyond game consoles.”

AMD is also confident that its upcoming Instinct MI300 data center chip could be adapted for generative AI workloads. “MI300 is actually very well-positioned for both HPC or supercomputing workloads as well as for AI workloads,” said Su. “And with the recent interest in generative AI, I would say the pipeline for MI300 has expanded considerably here over the last few months, and we’re excited about that. We’re putting in a lot more resources.”

In the meantime, Microsoft intends to keep working closely with Nvidia as it attempts to secure more of the company’s processors. The AI boom has led to a growing shortage of specialized GPU chips, a shortage made worse by Nvidia’s near monopoly on the supply of such hardware. Microsoft and AMD aren’t the only players trying to develop in-house AI chips: Google has its own TPU (Tensor Processing Unit) for training its AI models, and Amazon has similarly created Trainium chips to train machine learning models.

