[Disclaimer: This article has been written as part of a paid collaboration with NVIDIA. It’s been written by me, on a topic I had an interest in, and trying to convey useful information to the community, so I hope you’ll like it]
Reading the news about SIGGRAPH a few weeks ago, I saw NVIDIA tease the release of the Omniverse Connector for Unity, and as a Unity developer, I found it fascinating. I asked for the opportunity to get more information about this and had the pleasure of speaking to Dane Johnston, Director of Omniverse Connect at NVIDIA to ask for more details about it.
The article is a summary of the most interesting information that came out of our chat… including the overwhelming moment when I realized that with this technology people in Unity and Unreal Engine could collaborate on the same project :O
NVIDIA Omniverse
Omniverse is NVIDIA’s collaboration and simulation tool. I had some trouble understanding what it does until some people in the company explained it to me in detail. In short, Omniverse is a system made up of three parts:
- A central hub, called the Nucleus, which holds a scene in USD format in the cloud and takes care of integrating all the distributed modifications into this common scene;
- Some Connectors, used by the people working remotely on the scene. A connector links a specific local application (e.g. Blender) to the Nucleus in the cloud and sends it the work done in that application. There are connectors for many uses: people who create 3D models can use the connector for 3ds Max, while people who work on materials can use the connector for Substance. Nucleus takes care of merging all the assets created by the different users in the different applications into a common scene;
- Some NVIDIA modules, which can be run on Nucleus to perform some operations in the scene. For example, you can have a module to perform a complex physics simulation on the scene the team has sculpted.
Omniverse allows the people on a team to collaborate remotely on the same scene: in that sense, it’s a bit like Git, but for 3D scenes. It also offers the possibility to run NVIDIA AI services (e.g. for digital twins) on the scene you create.
Unity Connector for Omniverse
At launch, Omniverse was made compatible with Unreal Engine, while support for Unity was missing. I asked Dane why, and he said there was no particular reason: NVIDIA actually started developing both connectors together, but the Unreal Engine one was developed much faster, likely due to greater expertise with that engine within NVIDIA.
As a Unity developer, this was disappointing, because it made Omniverse a lot less interesting for my professional use. But now NVIDIA has finally announced the development of a Unity connector for Omniverse at GTC 2022. It will be released in beta before the end of the year, so Unity developers will soon be able to jump into the world of Omniverse and create scenes together with other professionals.
How to use Unity with Omniverse
I asked Dane how Unity works with Omniverse, and I feel a little sorry for him because I probably asked for too many technical details 🙂 Anyway, here is what I was able to understand.
The first thing needed to use Omniverse is to set up a shared Nucleus server (the “repository”, in Git terms). To do this, follow the instructions at this link: https://docs.omniverse.nvidia.com/prod_install-guide/prod_nucleus/enterprise/installation/quick_start_tips.html
Then you must install Omniverse on your PC, open the Omniverse Launcher, and look for the Unity connector in the Connectors section of the Exchange. Installing it basically installs a plugin for Unity.
This plugin will give you an Omniverse dashboard in Unity, which lets you choose how you want to collaborate with your colleagues. There are two ways to start a collaboration, one offline and the other online (I made up these terms…they aren’t official).
The offline collaboration works a bit like sharing documents via Dropbox. You open the shared project, make some changes, and then save them. When your colleague opens the project, he/she gets your modified scene from the server, edits it, then saves it again for the others to use.
Online collaboration works a bit like Google Docs. You and your colleagues decide to collaborate and start a live session together. While in the live session, you can work on the scene together, and any change made by one professional is reflected in real time in the scene seen by the others. So a material artist could create a new material for a sofa in the scene in Substance, push it to Omniverse, and all the other workers in the live session would instantly see it change in their local version. At the end of the session, the team can see the list of changes made to the scene, decide whether to keep all of them or just a part, and then confirm the choice to Omniverse. After confirmation, the scene is updated on the servers for all the other collaborators.
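To make that end-of-session step concrete, here is a tiny sketch in plain Python. The change list and its fields are made up for illustration and are not the real Omniverse API; the point is just the “review the changes, keep some, confirm” flow described above.

```python
# Hypothetical change list from a live session (invented for illustration,
# NOT the real Omniverse API)
changes = [
    {"id": 1, "author": "material artist", "desc": "new sofa material"},
    {"id": 2, "author": "game designer",   "desc": "moved spawn point"},
    {"id": 3, "author": "material artist", "desc": "test cube, to discard"},
]

# The team reviews the list and decides to keep only changes 1 and 2
keep_ids = {1, 2}

# Only the confirmed changes would be committed to the shared scene
confirmed = [change for change in changes if change["id"] in keep_ids]

for change in confirmed:
    print(change["desc"])
# prints:
# new sofa material
# moved spawn point
```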
USD and Unity
Omniverse exploits Unity’s new USD support to offer its functionalities. Behind the scenes, for every change, the system sends a “delta” of changes to the Nucleus server, which integrates it into the common scene. USD provides this ability to work with deltas, making it perfect for the purpose of working on a shared 3D environment. Also, since only deltas are sent and not the whole scene, the network collaboration system is very lightweight.
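As a rough mental model of this delta mechanism, here is a minimal sketch in plain Python. The dictionary-based scene and the `apply_delta` function are invented for illustration (the real system works with USD layers, not dictionaries): each client sends only the attributes it changed, and the server merges them into the shared scene.

```python
def apply_delta(scene: dict, delta: dict) -> dict:
    """Merge a delta (attribute path -> new value) into the shared scene."""
    merged = dict(scene)
    merged.update(delta)
    return merged

# The shared scene held on the (hypothetical) Nucleus server
scene = {
    "/World/Sofa.material": "default_gray",
    "/World/Sofa.size": (2.0, 0.9, 0.8),
}

# A material artist changes only the sofa's material:
# this tiny delta travels over the network, not the whole scene
delta = {"/World/Sofa.material": "red_velvet"}

scene = apply_delta(scene, delta)
print(scene["/World/Sofa.material"])  # red_velvet
```

Sending only the changed attributes is what keeps the collaboration lightweight: the untouched parts of the scene never leave the server.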
How teams can use it
I know Omniverse is mostly used for simulations, but I was wondering if it might also be useful for small game studios to work together on a common Unity game. Dane told me yes, it’s a possible use: Omniverse is good both for enterprise applications and for game building.
With Omniverse, a 3D artist and a game designer could work live on the same scene to create a level together, and then save everything when the level is complete. As someone who works on creative projects with remote designers and artists, I can tell you that this would be a fantastic tool for collaboration, because the current workflow doesn’t allow us to really work on a scene at the same time.
NVIDIA is working with developers to understand how to evolve Omniverse to support them. For example, Dane told me that some developers like using Omniverse because it makes it easy to connect NVIDIA AI services such as Audio2Face (which generates facial expressions from a voice) to NPCs. Another feature the company is working on is offering a “packaging process” for the scenes created with Omniverse: before you build your game, Omniverse “converts” the scene to your engine’s native format, so the game’s build process can flow exactly as if you had done it all in Unity, without using Omniverse at all.
An open system
I asked Dane which features of Omniverse he liked the most. He said that, at the end of the day, one of his favorite things is that everyone on a team can work with the tool he/she knows best, and everyone’s work is seamlessly integrated into a shared scene. So someone working in Substance can create a material and add it to the scene, and the developers working in Unity see that material automatically converted into a Unity material by the system. Everything integrates seamlessly, so a team of diverse professionals can collaborate, each using the tool he/she works best with.
And one consequence of this openness is that… people in Unity and Unreal could work together! Once the Unity connector is available, people using Unity will be able to modify a scene that is automatically updated for the designers working in Unreal, and vice versa. It’s a unique thing: for the first time, people working on different engines could work together on the same project. This shows the power of Omniverse and of the USD format.
He added that the idea of Omniverse is to be open and offer many functionalities, and then let teams decide how they want to use it and how it can improve their current production processes. In the case of Unity, the vision is to blend the benefits of using Unity with those of using Omniverse.
Speaking about VR, he also told me that he loves the fact that Omniverse now lets you enjoy amazing scenes rendered in VR with real-time ray tracing.
How you can try it
The Unity Connector for Omniverse was announced at GTC 2022 and will be released in beta in late 2022. Follow Omniverse on Twitter to be notified when it’s released. NVIDIA warns that this is a beta, and it’s looking for studios interested in using it and providing feedback not only on the bugs, but also on what features game studios need from it.
And if you try it, please let me know what you think! I’m very curious to hear what gaming professionals think about using Omniverse to collaborate with their peers.
A final word…
This post is part of a promotion for the GTC 2022 event taking place from September 19th to 22nd.
If you register for it with my unique code, you enter a raffle to win an RTX 3080 Ti! To participate, go through the following procedure:
Step-1: Register for NVIDIA GTC from this link: https://www.nvidia.com/gtc/?ncid=ref-crea-201724. To qualify, registrants must be located in the EMEA region (Europe, Middle East or Africa).
Step-2: Join GTC sessions, there is even one about VR in Omniverse which you can watch here. NOTE: Prizes will only be awarded to those who register for GTC using the link above and attend at least one session.
Step-3: Wait and hope to win a graphics card!
Good luck 😉
(Header image by NVIDIA)
Disclaimer: This blog contains advertising and affiliate links to sustain itself. If you click on an affiliate link, I am very happy because I receive a small commission on your purchase. My boring full disclosure is here.