NVIDIA Introduces Next-Gen AI Chip for Advanced Generative AI and Computing



Mett.Ai Tech Desk

It promises breakthroughs in generative AI and accelerated computing capabilities

NVIDIA, the technology company renowned for its high-performance GPUs, has revealed its latest innovation in generative AI and accelerated computing: the GH200 Grace Hopper™ platform. The platform is built around the state-of-the-art Grace Hopper Superchip, featuring the world's first HBM3e processor. Geared toward the growing demand for advanced generative AI applications, the GH200 Grace Hopper platform is designed to handle complex workloads such as large language models, recommender systems, and vector databases.

A standout feature of the platform is its dual configuration, which delivers 3.5 times the memory capacity and three times the bandwidth of the current-generation offering. In this setup, a single server comprises 144 Arm Neoverse cores, eight petaflops of AI performance, and 282GB of the latest HBM3e memory technology.

Jensen Huang, founder and CEO of NVIDIA, emphasized the platform's ability to meet the surging demand for generative AI solutions in data centers. "The new GH200 Grace Hopper Superchip platform delivers this with exceptional memory technology and bandwidth to improve throughput, the ability to connect GPUs to aggregate performance without compromise, and a server design that can be easily deployed across the entire data center," Huang stated.

The Grace Hopper Superchip forms the core of the platform and can be connected to additional Superchips through NVIDIA NVLink™. This interconnect technology lets the Superchips work in concert, enabling the deployment of the giant models crucial for generative AI applications. NVLink also gives the GPU direct access to CPU memory, for a combined fast memory capacity of 1.2TB in the dual configuration.

The incorporation of HBM3e memory, which is 50% faster than the current HBM3, gives the platform a total combined bandwidth of 10TB/sec. This enhancement allows the GH200 platform to run models 3.5 times larger than its predecessor while delivering three times the memory bandwidth.

The market has already shown significant demand for Grace Hopper technology, with several leading manufacturers offering systems based on the previous-generation Grace Hopper Superchip. The new GH200 Grace Hopper Superchip platform is fully compatible with the NVIDIA MGX™ server specification, facilitating its integration into over 100 server variations, a move aimed at fostering broader adoption of the technology.

Leading system manufacturers are expected to roll out systems based on the GH200 platform in the second quarter of calendar year 2024. As NVIDIA continues its market dominance in high-end processors for generative AI, the GH200 Superchip represents a major leap forward, catering to the escalating need to run complex AI models efficiently.

NVIDIA's unveiling of the GH200 Grace Hopper Superchip platform marks a significant stride in AI and computing technology. With its enhanced capabilities for generative AI workloads, the GH200 platform is set to shape the future landscape of accelerated computing and AI applications. As the industry awaits the release of GH200-powered systems, NVIDIA solidifies its position as a pioneer in advancing the boundaries of AI and high-performance computing.
