NVIDIA is pushing what's possible on the AI PC platform even further with the latest RTX technologies it is announcing today.
The difference between NVIDIA and others who have just started their journey in the AI PC realm is evident from the get-go. While others mostly talk about how their NPU hardware is faster than the competition's, NVIDIA is making the AI PC platform vibrant by introducing a steady stream of new features. The company already offers a list of technologies to AI PC users running its RTX platform, most prominently DLSS (Deep Learning Super Sampling), which has received countless updates to its neural network that make games run and look better.
The company also offers Chat with RTX, a chatbot that runs locally on your PC and acts as your assistant. There's also TensorRT and TensorRT-LLM support on Windows, which accelerates GenAI and LLM models on client platforms without needing to go to the cloud, and several upcoming gaming technologies that will make use of AI enhancements, such as ACE (Avatar Cloud Engine), which also gets a new update today.
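To give a feel for what local, cloud-free LLM acceleration looks like in practice, here is a minimal sketch using TensorRT-LLM's high-level Python LLM API. The class names, the generate-call pattern, and the model name are assumptions based on the library's documented quick-start style and can differ between releases, so treat this as illustrative rather than NVIDIA's reference code:

```python
# Illustrative sketch: running an LLM locally on an RTX GPU via TensorRT-LLM's
# high-level LLM API (availability and exact signatures vary by release).
from tensorrt_llm import LLM, SamplingParams

# Example model name only; any TensorRT-LLM-supported checkpoint could be used here.
llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

prompts = ["Explain what DLSS does in one sentence."]
sampling = SamplingParams(temperature=0.7, max_tokens=64)

# Inference runs entirely on the local GPU; no cloud round-trip is involved.
for output in llm.generate(prompts, sampling):
    print(output.outputs[0].text)
```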
NVIDIA also lays out the current landscape of AI computational power, showing how its GeForce RTX 40 desktop GPUs scale from 242 TOPS at the entry level all the way up to 1321 TOPS at the high end. That's a 4.84x increase at the lowest end and a 26.42x increase at the very top compared to the latest 45-50 TOPS AI NPUs that we will be seeing on SoCs this year.
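Those multipliers follow directly from dividing the quoted GPU TOPS figures by the roughly 50 TOPS NPU baseline at the top of that range; a quick sketch of the arithmetic (the TOPS numbers come from NVIDIA's figures, the variable names are just for illustration):

```python
# Compare NVIDIA's quoted RTX 40 desktop GPU TOPS against a ~50 TOPS NPU baseline.
NPU_BASELINE_TOPS = 50  # upper end of the 45-50 TOPS NPU range mentioned above

RTX_40_DESKTOP_TOPS = {
    "entry-level RTX 40": 242,   # entry-level desktop GPU figure quoted by NVIDIA
    "high-end RTX 40": 1321,     # high-end desktop GPU figure quoted by NVIDIA
}

for gpu, tops in RTX_40_DESKTOP_TOPS.items():
    ratio = tops / NPU_BASELINE_TOPS
    print(f"{gpu}: {tops} TOPS -> {ratio:.2f}x a {NPU_BASELINE_TOPS} TOPS NPU")

# Output:
# entry-level RTX 40: 242 TOPS -> 4.84x a 50 TOPS NPU
# high-end RTX 40: 1321 TOPS -> 26.42x a 50 TOPS NPU
```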