As AMD CEO Lisa Su said on an earnings call Tuesday, Meta is using AMD’s MI300X AI chips to serve its largest Llama model through its Meta AI service, and Microsoft is using them to power OpenAI-based ...
Despite a 37% y/y drop in AMD's stock price ... On the call, Su said: "Looking at the fourth quarter, MI300X production deployments expanded with our largest cloud partners. Meta exclusively used MI300X to serve ..."
Advanced Micro Devices (AMD) reported fourth-quarter 2024 non-GAAP ... In the Data Center AI business, MI300X deployment increased with cloud partners, including Meta Platforms (META), Microsoft ...
AMD followed the MI300X with the MI325X last year, which was designed to match Nvidia's newer H200. According to AMD, some customers are realizing significant performance and cost advantages by ...
AMD outlined its AI roadmap during its recent earnings call: the new Instinct MI350 is now set to launch in mid-2025, with the next-gen MI400 ...
Most of this growth was driven by the ramp-up of AMD's Instinct accelerators (MI300X and MI325X) and EPYC processors. I argue that data center revenue (unlike the client segment ...
AMD’s data center revenue for Q4 2024 missed analysts' expectations ... Turning to the company’s cloud partners, Su also noted that Meta exclusively used MI300X to serve its Llama 405B frontier ...
Meanwhile, AMD said Microsoft and Meta Platforms are both using its MI300X GPUs, and that it is seeing strong interest in its next-generation MI350 series GPUs, which it plans to ramp up around mid ...