Today, NVIDIA announced that its first DGX H100 systems are shipping to customers all over the globe to advance AI workloads.
Announced back at GTC 2022, the NVIDIA Hopper H100 was billed as the world's first and fastest 4nm data center chip, delivering up to 4000 TFLOPs of AI compute power. Featuring a brand-new GPU architecture with 80 billion transistors, blazing-fast HBM3 memory, and NVLink connectivity, the chip was targeted specifically at workloads including Artificial Intelligence (AI), Machine Learning (ML), Deep Neural Networks (DNNs), and various HPC-focused compute tasks.
Now, the first DGX H100 Hopper systems are shipping to customers from Tokyo to Stockholm, NVIDIA says. In its press release, the company states that the Hopper H100 GPUs were born to run generative AI, which has taken the world by storm with tools such as ChatGPT (GPT-4). CEO Jensen Huang has already called ChatGPT one of the greatest things ever done for computing, describing it as the iPhone moment of AI. NVIDIA has seen major demand for its AI GPUs, sending its stock skyrocketing.
NVIDIA is leading the charge in supplying some of the world's top data center clients behind these generative AI tools. Some of those companies are listed below:
Boston Dynamics AI Institute (The AI Institute), a research organization that traces its roots to Boston Dynamics, the well-known pioneer in robotics, will use a DGX H100 to pursue its vision of dexterous mobile robots helping people in factories, warehouses, disaster sites, and eventually homes.
Scissero, which has offices in London and New York, employs a GPT-powered chatbot to make legal processes more efficient. Its Scissero GPT can draft legal documents, generate
Read more on wccftech.com