Nvidia and Google Cloud to Launch AI Hardware Instances
AI News

Nvidia has partnered with Google Cloud to launch a new cloud hardware offering, the L4 platform, designed to run video-focused applications and accelerate AI-powered video workloads. Through the partnership, enterprise customers will get access to the L4 platform via Vertex AI, Google's managed machine learning service.

L4 is currently available on Google Cloud in private preview through Google's G2 virtual machines. It is a general-purpose GPU that delivers video decoding, transcoding, and streaming capabilities. Nvidia plans to release L4 later this year through its hardware partners, including Cisco, Dell, Asus, Hewlett Packard Enterprise, and Lenovo.

Nvidia has also announced other AI-focused hardware, including the L40, the H100 NVL, and Grace Hopper for Recommendation Models. The L40 is optimized for graphics and AI-enabled 2D, video, and 3D image generation; the H100 NVL is built to serve large language models like the one behind ChatGPT; and Grace Hopper for Recommendation Models, as its name suggests, targets recommendation systems.

The L40 becomes available through Nvidia's hardware partners this week, while Nvidia expects the H100 NVL and Grace Hopper to ship in the second half of the year.

In addition to the L4 platform, Nvidia has launched the DGX Cloud platform, which offers infrastructure and software for training models for generative and other forms of AI. DGX Cloud features eight Nvidia H100 or A100 80GB Tensor Core GPUs per node, along with storage. Subscribers of DGX Cloud also have access to Nvidia’s software layer, AI Enterprise, which contains AI frameworks, pretrained models, and accelerated data science libraries.

Nvidia plans to partner with other cloud service providers to host DGX Cloud infrastructure. Oracle Cloud Infrastructure is the first partner, while Microsoft Azure is expected to begin hosting DGX Cloud next fiscal quarter. The service will eventually expand to Google Cloud as well.

Nvidia's move into AI compute is part of its strategy to offset declining returns from its gaming and professional visualization businesses. The company's latest earnings report showed that its data center business, which includes AI chips, continued to grow, suggesting Nvidia stands to benefit from the generative AI boom.

Sam Wilson
Sam is a data scientist based in Berkeley, California. He has a passion for AI and has been working in the field for several years. In his free time, he enjoys hiking and exploring new trails.
