Facebook’s rebranded parent company Meta is working with Nvidia to build a massive supercomputer for AI research that probably should terrify us all.
According to an Nvidia blog post, the two companies are working together on what’s been dubbed the AI Research SuperCluster, or RSC, which will be the largest NVIDIA DGX A100 customer system yet. The beast is said to deliver 5 exaflops, or 5,000,000 teraflops, of AI performance across thousands of GPUs.
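For a sense of just how big that number is, the exaflop-to-teraflop conversion is plain arithmetic; here’s a quick back-of-the-envelope check in Python (the 5 exaflop figure comes from the blog post, the script is purely illustrative):

```python
# Quick unit check: 1 exaflop = 1,000 petaflops = 1,000,000 teraflops.
EXAFLOP_IN_TERAFLOPS = 1_000_000

rsc_exaflops = 5  # AI performance claimed for the fully built-out RSC
print(f"{rsc_exaflops} exaflops = {rsc_exaflops * EXAFLOP_IN_TERAFLOPS:,} teraflops")
# -> 5 exaflops = 5,000,000 teraflops
```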
This big synthetic brain will be set to work training AI models with more than a trillion parameters. A Facebook blog post explains this will be used for things like learning language and identifying harmful content. Given some of Facebook’s questionable decisions in the past, the company could probably use the help. A few years back it banned this Fallout 76 group and an advert for the beautiful watercolour videogame Gris for being too sexy. I am still waiting for my own ban for that one, Facebook.
One very neat use for this absolute unit of a supercomputer would be helping to translate languages. It sounds like the company is aiming for real-time translation, like having your own Babel fish, which is very cool.
“We hope RSC will help us build entirely new AI systems that can, for example, power real-time voice translations to large groups of people, each speaking a different language, so they could seamlessly collaborate on a research project or play an AR game together,” Meta says in the blog post.
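To make the idea a little more concrete, here’s a minimal sketch of machine translation using the open-source Hugging Face transformers library and an off-the-shelf Helsinki-NLP model. This is not Meta’s system or anything running on RSC, and it handles text rather than live voice; the model and language pair are assumptions chosen purely for illustration:

```python
# Illustrative text translation sketch (not Meta's RSC workload).
# Assumes: pip install transformers sentencepiece torch
from transformers import pipeline

# Off-the-shelf English -> German model, chosen only as an example.
translator = pipeline("translation_en_to_de", model="Helsinki-NLP/opus-mt-en-de")

result = translator("Want to play an AR game together?")
print(result[0]["translation_text"])
```

The real-time, many-speakers scenario Meta describes would also need speech recognition and synthesis on top of models like this, which is where the extra compute comes in.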
The blog post also touches on how the data used with the RSC will be protected, stating that there are processes in place to anonymise and encrypt it.
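As a very rough illustration of what encrypting data at rest can look like, here’s a tiny sketch using the open-source Python cryptography package; Meta’s actual tooling and pipeline are not described in the blog post, so this is only a generic example:

```python
# Generic symmetric-encryption example (not Meta's pipeline).
# Assumes: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # secret key, kept separate from the data
f = Fernet(key)

token = f.encrypt(b"training example to protect")  # ciphertext stored/shipped
print(f.decrypt(token))                            # original bytes recovered with the key
```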