
Meta is Working on its Own AI Supercomputer

Introducing Meta's Next-Generation AI Supercomputer


Meta has joined the list of tech giants building their own AI supercomputers, high-speed machines designed specifically for training machine learning systems. According to Meta, its new AI Research SuperCluster (RSC) is already one of the fastest machines of its kind in the world, and by the middle of this year, when all work on it is completed, it will be the fastest.

“Meta has developed what we believe is the world's fastest AI supercomputer. We named it RSC, which stands for AI Research SuperCluster, and work on it will be completed later this year,” said CEO Mark Zuckerberg.

The RSC will be used to train a variety of Meta systems, ranging from the content moderation algorithms that detect hate speech on Facebook and Instagram to augmented reality tools. In addition, the RSC will be used in Meta's work to build the metaverse.

“RSC will help Meta AI researchers create new and improved AI models that can learn from trillions of examples, work with hundreds of different languages, easily parse text, images, and videos, develop new augmented reality tools, and more. We hope that RSC will help us build new AI systems that can, for example, provide real-time voice translation for large groups of people who speak different languages, so that they can seamlessly collaborate on research projects or play augmented reality games together,” said Meta engineers Kevin Lee and Shubho Sengupta.

Work on the RSC began a year and a half ago. The company's engineers designed various parts of the computer from scratch - cooling, power, networking, and cabling. Phase one of the RSC has now been completed, and the computer currently consists of 760 NVIDIA DGX A100 systems containing 6,080 linked GPUs. According to Meta, the computer already delivers up to 20 times faster performance on standard machine vision research tasks than the company's previous infrastructure.
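Those two figures are consistent with the standard DGX A100 configuration, which packs 8 A100 GPUs per system. A quick back-of-the-envelope check (the per-node GPU count comes from NVIDIA's DGX A100 spec, not from Meta's announcement):

```python
# Sanity check: 760 DGX A100 systems at 8 GPUs each should give 6,080 GPUs.
DGX_A100_SYSTEMS = 760   # phase-one node count reported by Meta
GPUS_PER_SYSTEM = 8      # A100 GPUs in a standard NVIDIA DGX A100

total_gpus = DGX_A100_SYSTEMS * GPUS_PER_SYSTEM
print(total_gpus)        # 6080, matching the reported phase-one total
```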

By the end of 2022, phase two will be completed. By then, the RSC will have 16,000 GPUs and will be able to train AI systems with more than a trillion parameters on datasets as large as an exabyte.
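If phase two keeps the same 8-GPU DGX A100 building block (an assumption; Meta only quotes the GPU total), 16,000 GPUs works out to roughly 2,000 systems, and an exabyte-scale dataset is on the order of 10^18 bytes. A rough sketch of that scale-up:

```python
# Rough phase-two scale figures (assumes the same 8-GPU DGX A100 nodes as phase one).
PHASE_TWO_GPUS = 16_000
GPUS_PER_SYSTEM = 8              # assumption: unchanged node type
EXABYTE_BYTES = 10 ** 18         # one decimal exabyte

systems_needed = PHASE_TWO_GPUS // GPUS_PER_SYSTEM
print(systems_needed)            # roughly 2,000 DGX A100 systems
print(f"{EXABYTE_BYTES:,} bytes per exabyte of training data")
```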
