“The new Xeon 6 AI performance, which Intel claims is over 5 times faster than the current AMD EPYC CPU, is due to further exploitation of the matrix processors on the Xeon.”

    https://www.forbes.com/sites/bernardmarr/2024/09/25/how-to-write-amazing-generative-ai-prompts/

    “Intel is betting the success of its Gaudi 3 on its lower price and lower total cost of ownership […] Intel indicated that an accelerator kit based on eight Gaudi 3 processors on a baseboard will cost $125,000, which means that each one will cost around $15,625. By contrast, an Nvidia H100 card is currently available for $30,678, so Intel indeed plans to have a big price advantage over its competitor”

    https://www.tomshardware.com/tech-industry/artificial-intelligence/intel-launches-gaudi-3-accelerator-for-ai-slower-than-h100-but-also-cheaper
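The quoted arithmetic is easy to sanity-check; a minimal sketch using only the prices reported above (the kit and H100 figures are the article's, the ratio is derived):

```python
# Per-accelerator price from the quoted Gaudi 3 kit price, and the
# resulting price ratio against the quoted H100 street price.
gaudi3_kit_price = 125_000   # eight Gaudi 3 processors on one baseboard
chips_per_kit = 8
h100_price = 30_678          # one Nvidia H100 card, as quoted

per_chip = gaudi3_kit_price / chips_per_kit
ratio = h100_price / per_chip

print(f"Per Gaudi 3 accelerator: ${per_chip:,.0f}")   # $15,625
print(f"H100 costs {ratio:.2f}x as much per chip")    # 1.96x
```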

    Re-posted to reformat.

    Intel Releases New AI Chips That Beat AMD In Performance and Nvidia in Cost
    by u/plebbit0rz in r/wallstreetbets




    1. Lol suuuuuuurrreee Intel…

      Nvidia has already said that if you do the cost analysis including energy, then even if competitors’ chips were given away for free, they would still be more expensive to run than Nvidia’s chips.

    2. Nobody cares if the chip is 600 times faster than an EPYC processor in AI performance, because nobody uses EPYC processors for AI. They use GPUs with tens of thousands of vector processing cores. And nobody will invest in rewriting AI tooling to support Intel’s new babies unless it is significantly faster per watt and they can trust it will be around for a decade. Intel currently doesn’t inspire that confidence.

      Also, all of the big AI companies either have their own tensor processing units (Google) or are working on them (Meta, Amazon, Apple, Microsoft). Why should they jump out of Nvidia’s grubby maws and right back into another ecosystem they would be caught in? AMD has the biggest chance to take some of that pie because its GPUs are almost identical to Nvidia’s in many ways and have CUDA emulation. Intel’s Gaudi stuff forces you to run a tool to rewrite your Python code. Sure, that works great and fits really well into their development processes.

    3. But how many watts of electricity will you need to run it? Upfront hardware cost is unimportant; cost to operate is what matters.
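The commenter's point can be made concrete with a back-of-envelope total-cost-of-ownership calculation. Every number below is a hypothetical placeholder for illustration, not a measured figure for any real chip (only the two purchase prices echo the quoted figures):

```python
# Back-of-envelope TCO: purchase price plus lifetime energy cost.
# All inputs are hypothetical placeholders, not real chip specs.
def tco(purchase_price, watts, years=4, usd_per_kwh=0.10, utilization=0.8):
    """Total cost of ownership over the service life, in dollars."""
    hours = years * 365 * 24 * utilization
    energy_kwh = (watts / 1000) * hours
    return purchase_price + energy_kwh * usd_per_kwh

# A cheaper but hungrier part vs. a pricier but more efficient one
# (prices echo the quoted figures; wattages are made up).
print(f"${tco(15_625, 900):,.2f}")   # cheaper chip, higher draw
print(f"${tco(30_678, 700):,.2f}")   # pricier chip, lower draw
```

With these made-up inputs the energy bill is a fraction of the purchase price; the commenter's argument turns on how large that fraction gets at datacenter scale, power draw, and electricity prices.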

    4. GreenFuturesMatter on

      This is a nothing burger imo. Nvidia has the full stack, and while yes, Intel’s chip may be cheaper… we’re also comparing H100s to Intel’s newest chip when NV is set to release the GB200 and is far ahead of the competition. On top of that, NV has another chipset set to succeed the GB platform… It’s been said before: the competition could give their chips away for free and it wouldn’t matter, because of CUDA.

    5. North-Examination715 on

      Kind of an uneducated and stupid question, but wouldn’t the ideal be vice versa? Isn’t Nvidia the best performer and AMD the cheaper option? So wouldn’t you want the opposite result? And if that’s true, Intel has successfully created a chip that is not cheap enough to compete with AMD for the people/companies that sort by low price, and not good enough to compete for the people/companies that sort by best performance? idk, again I’m not an expert, but that is what I recall from when I looked into this stuff.

    6. No they don’t.

      And they don’t have anything like NVLink. In fact, there’s a consortium that is still years away from coming close to the bandwidth NVLink provides.

      The biggest chess move Nvidia has made so far was acquiring Mellanox.

    7. Syab_of_Caltrops on

      Ah, some good news. Watch INTC tank tomorrow

    8. Why have I read this type of headline every year for 5 years straight while Intel continues to see shrinking revenue and underperformance lol.
