Nvidia Unveils Enhanced AI Chip H200

2023.11.14


Key Takeaways

  • Nvidia’s new H200 chip surpasses the H100 with advanced features for AI applications.
  • The H200 will be available through major cloud providers, expanding access to advanced AI capabilities.

Nvidia’s Latest Advancement in AI

Nvidia, a leader in the AI chip market, has introduced its latest innovation, the H200 chip, enhancing its top-tier offerings in artificial intelligence technology. Slated for release next year, the H200 will succeed the company’s current flagship H100 chip, boasting significant upgrades in processing power and memory.


The key enhancement in the H200 is its expanded high-bandwidth memory, a crucial component that governs how much data the chip can hold and how quickly it can feed that data to its processors. The jump from 80 gigabytes in the H100 to 141 gigabytes in the H200 marks a substantial leap in performance, enabling faster and more efficient AI computations.

Impact on Generative AI Services

Nvidia’s dominance in the AI chip market is exemplified by its role in powering services like OpenAI’s ChatGPT and similar generative AI platforms. The H200’s improved memory and processing capabilities mean these AI services can deliver responses more swiftly and effectively, enhancing user experience and expanding potential applications.


Nvidia has partnered with leading cloud service providers, including Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure, to integrate the H200 into their offerings. This collaboration ensures that the advanced capabilities of the H200 will be widely accessible, benefiting a range of cloud-based AI applications.

Memory Suppliers and Market Response

While Nvidia has not disclosed the memory suppliers for the H200, Micron Technology has expressed intentions to become a supplier. Nvidia also works with South Korea’s SK Hynix, which reported a boost in sales due to the rising demand for AI chips. These collaborations indicate a robust supply chain supporting Nvidia’s ambitious AI chip endeavors.

Broadening AI Accessibility

With the inclusion of the H200 in services provided by cloud giants and specialty AI cloud providers like CoreWeave, Lambda, and Vultr, Nvidia is set to broaden the accessibility and utility of advanced AI technology. This move is poised to accelerate innovation and efficiency in various industries relying on AI capabilities.
