NVIDIA has showcased a new real-time neural materials approach that delivers a 12-24x speedup in shading performance versus traditional methods.
At SIGGRAPH, NVIDIA is showcasing a new real-time rendering approach called "Neural Appearance Models," which aims to leverage AI to accelerate shading. Last year, the company unveiled its Neural Texture Compression technique, which unlocks 16x more texture detail, and this year it is turning to texture rendering and shading performance with a similarly large leap.
The new approach is intended as a universal runtime model for materials from multiple sources, whether authored by artists, captured from measurements of real objects, or generated from text prompts using generative AI. These models are scalable across quality levels ranging from PC and console gaming to virtual reality and even film rendering.
The model captures the fine details and visual intricacies of a rendered object, such as dust, water spots, and lighting, including the interplay of rays cast by multiple light sources and colors. Traditionally, such materials are rendered using shading graphs, which are not only costly to evaluate in real time but also complex to author.
With NVIDIA's "Neural Materials" approach, the traditional material rendering model is replaced with a cheaper, more computationally efficient neural network, which the company says enables up to 12-24x faster shading calculations. The company offers a comparison between a model rendered using a shading graph and the same model rendered with the Neural Materials approach.
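Conceptually, the idea can be sketched as a small neural network that maps shading inputs (such as UV coordinates and view and light directions) directly to an RGB value, standing in for a full shading-graph evaluation. The sketch below is illustrative only: the input features, layer sizes, and activations are assumptions for demonstration, not NVIDIA's actual Neural Appearance Model architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shading inputs per sample: 2D UV + 3D view direction + 3D light direction.
IN_DIM, HIDDEN, OUT_DIM = 8, 32, 3  # OUT_DIM = RGB

# In practice the weights would be trained to match a reference material;
# here they are random, so the colors are meaningless, but the evaluation
# cost (two small matrix multiplies) illustrates why inference can be far
# cheaper than walking a deep shading graph node by node.
W1 = rng.standard_normal((IN_DIM, HIDDEN)) * 0.1
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((HIDDEN, OUT_DIM)) * 0.1
b2 = np.zeros(OUT_DIM)

def neural_material(uv, view_dir, light_dir):
    """Evaluate the toy neural material for a batch of shading points."""
    x = np.concatenate([uv, view_dir, light_dir], axis=-1)   # (N, 8)
    h = np.maximum(x @ W1 + b1, 0.0)                         # ReLU hidden layer
    rgb = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))               # sigmoid keeps RGB in [0, 1]
    return rgb

# Shade a batch of 4 sample points.
n = 4
rgb = neural_material(rng.random((n, 2)),
                      rng.standard_normal((n, 3)),
                      rng.standard_normal((n, 3)))
print(rgb.shape)  # one RGB triple per shading point
```

A production version would run such a network per pixel on the GPU with trained weights, but the core structure, a fixed, shallow stack of matrix multiplies in place of an arbitrarily deep graph of shader nodes, is what makes the cost predictable and fast.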
The model matches the details of the reference image in all regards while, as mentioned above, shading up to 12-24x faster.
Read more on wccftech.com