
Nvidia reportedly makes up to 1,000% profit on every H100 GPU sold


The big picture: AI acceleration is set to become one of the most profitable hardware businesses in the coming months and years, and Nvidia is in a prime position to capture a significant slice of this market. The H100 data center GPU is already proving to be a major revenue generator for the Santa Clara-based company.

For every H100 GPU accelerator sold, Nvidia appears to be making a substantial profit, with reported margins reaching 1,000 percent of production costs. Tae Kim, a senior technology writer at Barron’s, recently highlighted that Nvidia spends around $3,320 to build a single H100 unit, which is then sold to customers for a price ranging from $25,000 to $30,000. These estimates come from consulting firm Raymond James and reportedly include the cost of the onboard HBM memory chips as well.
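As a rough sanity check on those numbers, the sketch below recomputes the implied margin from the two figures cited above; the percentage formulas are assumptions about how such a margin might be calculated, not part of Kim's report.

```python
# Back-of-the-envelope check of the reported H100 economics.
# Inputs are the estimates cited above; the formulas are illustrative assumptions.

COST_PER_UNIT = 3_320          # estimated build cost per H100, USD (Raymond James)
PRICES = (25_000, 30_000)      # reported selling price range, USD

for price in PRICES:
    markup = (price - COST_PER_UNIT) / COST_PER_UNIT * 100   # markup over cost
    ratio = price / COST_PER_UNIT * 100                      # price as % of cost
    print(f"${price:,}: markup ≈ {markup:.0f}%, price-to-cost ratio ≈ {ratio:.0f}%")

# $25,000: markup ≈ 653%, price-to-cost ratio ≈ 753%
# $30,000: markup ≈ 804%, price-to-cost ratio ≈ 904%
```

Read as a price-to-cost ratio at the top of the quoted price range, the result approaches the 1,000 percent figure in the headline.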

If the estimates prove accurate, this could mark the onset of an unprecedented golden era for Nvidia's GPU business. Demand for H100 GPUs is so high that they are effectively sold out until 2024, Kim noted. Meanwhile, AI companies are scrambling to secure enough GPU accelerators to power their generative models and AI-based services. Foxconn predicts that the AI server market will reach $150 billion by 2027, and these modern AI servers rely heavily on the computing power of the latest Nvidia hardware.

The H100 GPU is based on the Hopper microarchitecture, designed as the data center counterpart to the Ada Lovelace architecture that powers the latest generation of GeForce RTX gaming GPUs. However, Hopper's primary focus is not gaming, as its performance benchmarks indicate. The H100 accelerator is equipped with a cut-down GH100 GPU featuring 14,592 CUDA cores, along with 80GB of HBM3 RAM on a 5,120-bit memory bus.
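To put the 5,120-bit bus in perspective, the short sketch below estimates theoretical peak memory bandwidth; note that the per-pin HBM3 data rate used here is an assumed value, not a figure from the article.

```python
# Theoretical peak memory bandwidth from the bus width quoted above.
# The 5,120-bit bus comes from the article; the ~5.2 Gb/s per-pin HBM3
# signaling rate is an assumed value and not part of the original report.

BUS_WIDTH_BITS = 5_120
DATA_RATE_GBPS = 5.2           # assumed HBM3 data rate, Gb/s per pin

bandwidth_gbs = BUS_WIDTH_BITS * DATA_RATE_GBPS / 8   # convert bits to bytes
print(f"Theoretical peak bandwidth ≈ {bandwidth_gbs / 1000:.2f} TB/s")
# ≈ 3.33 TB/s, close to the bandwidth Nvidia quotes for the SXM version of the H100
```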

Nvidia is certainly riding high on the current AI boom, though the $3,320 estimated cost per H100 GPU cited by Kim deserves some context. Developing a new GPU is an expensive and time-consuming endeavor, involving large teams of hardware engineers and other highly specialized workers. Those workers require compensation, and recent estimates suggest that Nvidia's average salary for a hardware engineer is around $202,000 per year.

On the one hand, producing and selling H100 GPUs may well be far more expensive than the reported $3,320 figure suggests. On the other hand, Kim has followed Nvidia for 30 years and is even writing a book billed as the “definitive history” of the company. When it comes to Nvidia's inner workings and cost structure, he likely knows what he is talking about.
