Nvidia is entering the fight for the future of the Metaverse – and with a bang. Global remote work from any device and a common space for creating virtual 3D worlds: that is the US company’s vision for Omniverse. It is meant to make it possible to simulate entire production processes in real time – robots, environmental disasters, avatars with realistic facial expressions and gestures, created with the help of artificial intelligence. “The metaverse from science fiction is within reach,” promises graphics card manufacturer Nvidia. It sounds a little like Christopher Nolan’s Inception – just in virtual reality.
The Omniverse could revolutionize gaming in particular. Gaming is today the highest-grossing cultural sector in the world, marked by fierce competition among developers and already closely linked to NFTs. Players’ expectations of virtual worlds have been rising for years, driven in part by the marketing departments of the Silicon Valley giants, who never miss an opportunity to loudly herald the coming VR/AR revolution. The new worlds in cyberspace are supposed to become even more immersive and even more breathtaking.
Who will bring the next box-office hit to market and keep gamers glued to their screens for hours – or, more fittingly, to their headsets? Many video games are developed over years, by huge teams and with budgets in the hundreds of millions of dollars. Realistic lighting, lifelike physics and AI-powered technologies make 3D development a labor-intensive, time-consuming process. A flop often means ruin for the studio. This is another process Nvidia wants to simplify with its Omniverse.
Nvidia’s vision: Huge global teams of artists, designers and developers working simultaneously to create massive libraries of 3D content to be used in the Metaverse.
Strong competition: Unreal and Unity engines
But can Nvidia keep up with the strong competition? Zilliqa’s Metaverse Metapolis, for example, is currently being built on Epic Games’ Unreal Engine. Photo-realistic graphics, buttery-smooth animations and complex light simulation have long been standard there. The engine has been under continuous development for over 20 years. Hundreds of blockbuster games and series such as Unreal Tournament, BioShock, Splinter Cell, Borderlands, Mass Effect and Fortnite were built on it. It is the gold standard of the industry. The biggest gaming companies and even Hollywood (e.g. the Disney series The Mandalorian) rely on the technology. Epic Games recently launched the even more powerful Unreal Engine 5, ushering in a new era.
Epic Games was one of the first companies to recognize the Metaverse trend and has been working on it since 2017. Since then, it has expanded its portfolio to include many important companies. Just yesterday, April 7th, Epic Games announced a long-term cooperation with the LEGO Group. A number of games have also been created with the Unity Engine; the company behind it has existed since 2005. At the end of 2021, Unity acquired Weta Digital, the effects studio behind films like The Lord of the Rings, Avatar and Planet of the Apes.
The Omniverse components
Omniverse, on the other hand, is based on Pixar’s Universal Scene Description (USD). This is an easily extensible, open-source framework for exchanging 3D computer graphics data. It is intended to be for the Metaverse what HTML was for the Internet: a common language that everyone speaks.
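To give a rough idea of what this “common language” looks like, here is a minimal, hand-written USD scene in the human-readable `.usda` format (the prim names `Hello` and `World` are arbitrary placeholders): it defines a transform containing a sphere, and any USD-aware tool can open and extend it.

```
#usda 1.0

def Xform "Hello"
{
    def Sphere "World"
    {
        double radius = 2
    }
}
```

Because the format is plain text and layered, several applications (and several artists) can contribute to the same scene without overwriting each other’s work – which is exactly the collaboration story Nvidia tells about Omniverse.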
The Omniverse platform is designed for maximum flexibility, openness and scalability. These traits fit the ideals of the crypto community. Most notably, the various disparate components of the Omniverse platform have the potential to make the Metaverse more collaborative, decentralized, and more realistic than ever.
- Audio2Face enables realistic facial expressions and animation, which provide a stronger emotional connection within the virtual world.
- Thanks to NVIDIA RTX graphics technologies, real physical lighting can be simulated: the technology computes in real time how each ray of light bounces through a virtual world. The course of the sun and the behavior of shadows can also be reproduced.
- With NVIDIA PhysX, physical behavior can be faithfully recreated within Omniverse, while materials are described using the NVIDIA Material Definition Language, or MDL for short. Every blade of grass, every stone, every object in the virtual world looks deceptively real.
- NVIDIA CloudXR is client and server software for streaming content from OpenVR applications. It forms an interface between the real and the virtual world via VR and AR – a kind of portal to the Metaverse.
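At its core, the ray tracing described above comes down to one question asked billions of times per frame: where does a given ray of light first hit the scene geometry? As a minimal sketch (plain Python for illustration, not Nvidia code; the function name and scene values are invented), here is the classic ray–sphere intersection test that sits at the heart of any such renderer:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to its first sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for the smallest t > 0.
    """
    # Vector from the sphere center to the ray origin
    oc = [origin[i] - center[i] for i in range(3)]
    # Quadratic coefficients (a = 1 because direction is unit length)
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray from the origin along +z toward a sphere centered at z=5 with radius 1:
t = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
# hits the near surface at t = 4.0
```

A real-time ray tracer repeats this kind of test for every ray and every bounce; RTX hardware accelerates exactly this workload with dedicated cores, which is why effects like moving sunlight and soft shadows become feasible at interactive frame rates.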
Fantasy worlds are as old as humanity itself
The human urge to create virtual parallel worlds in the form of art is basically as old as humanity itself. “For millennia we have used our senses to create virtual realities through music, art and literature,” writes Rev Lebaredian, the head of Omniverse, in a press release. Technologies like Meta’s Oculus Quest, Microsoft’s HoloLens or Google Glass are just the next step in this long evolution. All of these technologies will probably continue to develop. But VR glasses alone are not enough. The most important piece is already there: a realistic simulation of our world that can be displayed on a screen and experienced with almost all of our senses. That is what the Nvidia Omniverse is meant to be – and, Nvidia is convinced, will be.