NVIDIA GTC 2021: The Omniverse

The biggest news out of this year’s GTC, the annual conference that NVIDIA, best known for its GPUs (graphics processing units) for gaming and for professional markets such as AEC, puts together to showcase the latest developments in its products, was the Omniverse. It’s an intriguing name, and while most of us have an inkling of what the term means (universe > multiverse > omniverse), I found this definition particularly apt in light of what NVIDIA is trying to do with it: the Omniverse is a container that contains everything that exists.

From a computational perspective, this translates to a 3D virtual world that can contain models created by many different people in many different applications in diverse locations around the world. It might seem too good to be true, but we saw it in action at GTC 2021, where the Omniverse was undoubtedly the “crown jewel” of all the products and developments that NVIDIA showcased. There were several sessions dedicated to describing the technology in more detail, including some that were specific to AEC (Figure 1). Up until now, NVIDIA has not been a key player in AEC technology (as highlighted in my articles on GTC 2018 and GTC 2019). But its Omniverse technology has the potential to change that.

Let’s look at it in more detail, starting with the essential facts: what it is, how it works, and where it came from.

Overview

The Omniverse is an open platform for real-time 3D visualization and simulation that can host multiple assets created in different applications, allowing design professionals using different tools to work together in one environment. Thus, in an AEC project, individual designers could each be using a different application such as Revit, Archicad, SketchUp, Rhino, 3ds Max, or ArcGIS for different parts of the design, yet work together and see, in real time, the changes each of them is making to the overall project in Omniverse (Figure 2). This kind of synchronous collaboration is useful not just in AEC but in all design fields that rely on visualization, including movies, gaming, manufacturing, product design, and so on (Figure 3).

Prior to the availability of such a platform, design professionals who needed to see their work in the context of what others on the project were doing had to exchange files by importing and exporting them in specific formats. Not only was this tedious and time-consuming, it also constrained the workflow and likely inhibited creativity, given that they could not immediately see the impact of their parts on the larger whole. With the Omniverse, each of them can “plug” their design into the larger context and continue working on it, while simultaneously being able to see what the other team members are doing. The workflow is less restrictive and more free-flowing.

Of course, something is needed to make this “magic” happen, and in the case of the Omniverse, each application feeding into it needs a “connector” that maintains the live link. Once such a connector exists, the application can connect to Omniverse in real time, without the need for any file import/export. A large number of applications have already developed this connector, and depending upon the success of this initiative, I would imagine the list continuing to grow as more and more applications want in on the opportunity. Firms can also use APIs to develop custom connectors for any in-house software they may have developed.
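
To make the idea concrete, here is a minimal sketch of what a custom connector has to do at its core, written with Pixar’s open-source USD Python bindings (pxr). The function name, prim path, and mesh data here are hypothetical; the actual Omniverse Connect SDK layers live sync and server transport on top of this kind of USD authoring.

    # A minimal sketch of the core job of a custom connector: translate
    # in-house geometry into USD, the format Omniverse consumes. Uses
    # Pixar's open-source USD Python bindings (pxr); names and data are
    # hypothetical, not the actual Omniverse Connect SDK.
    from pxr import Usd, UsdGeom

    def publish_mesh(usd_path, points, face_counts, face_indices):
        """Write a single mesh from an in-house app into a USD file."""
        stage = Usd.Stage.CreateNew(usd_path)
        UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

        mesh = UsdGeom.Mesh.Define(stage, "/World/Building/Massing")
        mesh.CreatePointsAttr(points)                  # vertex positions
        mesh.CreateFaceVertexCountsAttr(face_counts)   # vertices per face
        mesh.CreateFaceVertexIndicesAttr(face_indices) # face topology
        stage.GetRootLayer().Save()

    # Hypothetical data: a single triangle standing in for real geometry.
    publish_mesh("massing.usda",
                 [(0, 0, 0), (10, 0, 0), (0, 10, 0)], [3], [0, 1, 2])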

While the Omniverse platform may seem like it has suddenly emerged on the technology scene out of nowhere, it has actually evolved from NVIDIA’s Holodeck technology, a VR-based collaborative design environment I had a chance to try out at GTC 2018. While the Holodeck technology allowed designers to enter a virtual design domain with other users and make interactive design decisions jointly, it required them to wear VR headsets, which, to this day, remain cumbersome. The Omniverse is definitely a much better way for multi-user design collaboration, the biggest advantage being that the individual designers can continue to work in their application of choice.

From NVIDIA’s perspective, it has not just developed a technology that any industry relying on visual computing can use (Figure 4); the technology is also a “killer app” for the company’s own continually developing graphics technologies. Not only will the Omniverse generate its own separate revenue stream, it is optimized for NVIDIA-certified systems, which means that wider adoption of Omniverse will increase the demand for NVIDIA’s other products as well. Currently in beta, the commercial version of Omniverse is expected to be released in the summer, along with an Enterprise version for larger firms.

Additional Details

The main technology enabling the Omniverse platform — the foundation on which it is built — is an open file format called USD (Universal Scene Description) that was originally developed by Pixar to simplify entertainment industry workflows and allow artists to work collaboratively on a scene. It is now being extended to other industries. NVIDIA sees USD as the 3D equivalent of HTML, providing a common language for describing models, scenes, materials, lighting, and other relevant attributes for visualization. Extending the HTML analogy, it has developed Omniverse to be a “browser” for USD.
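
To illustrate the analogy, here is a small sketch, assuming Pixar’s open-source pxr Python bindings and hypothetical file names, of the USD composition mechanism that makes this collaborative “common language” possible: each collaborator authors their own layer, and USD composes all the layers into a single scene.

    # An illustration (assuming Pixar's pxr Python bindings) of the USD
    # feature that enables collaborative workflows: layer composition.
    # Each collaborator works in their own layer; USD composes them into
    # one scene, with stronger layers overriding weaker ones.
    from pxr import Sdf, Usd

    # Hypothetical per-discipline layers, e.g. published from different apps.
    for name in ("architecture.usda", "structure.usda", "lighting.usda"):
        Sdf.Layer.CreateNew(name).Save()

    root = Sdf.Layer.CreateNew("project.usda")
    root.subLayerPaths = ["lighting.usda", "structure.usda",
                          "architecture.usda"]
    root.Save()

    # Opening the root layer composes every contribution into one stage.
    stage = Usd.Stage.Open("project.usda")
    print(stage.GetLayerStack())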

The graphic in Figure 5 shows the large number of NVIDIA technologies on top of which the Omniverse platform is built. In addition to USD, there are two additional open standards: MDL (Material Definition Language), which handles material definitions and visualization; and PhysX, a physics technology that ensures the virtual world follows the laws of physics of the real world, such as gravity and solidity.

Figure 6 shows the platform technologies in more detail. One of the key technologies is the Nucleus server, which stores the entire project including all the assets feeding into it from different applications, shares them between the different collaborators, and maintains the live link between them. It acts like a data traffic hub and keeps track of the delta changes for each asset, communicating only those so that the combined project can be updated in (seemingly) real time. It also manages and resolves any conflicts between the actions of the individual collaborators that may come up. Another key platform technology is Connect, which includes the plug-ins and SDK for developing the connectors between the Omniverse and the applications that link to it. Also noteworthy for AEC is the Simulation technology, which currently enables real-life simulations in Omniverse such as sun studies and wind patterns. (An example is shown in the next section in Figure 13).
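
Conceptually, the delta tracking described above can be sketched as follows. This is purely illustrative Python, not Nucleus’s actual implementation or API: it hashes each asset’s properties and reports only the ones that have changed since the last publish.

    # A purely conceptual sketch (not Nucleus's actual code or API) of
    # delta tracking: hash each asset's properties and broadcast only
    # the ones that changed since the last publish.
    import hashlib
    import json

    class DeltaTracker:
        def __init__(self):
            self._last = {}  # asset path -> content hash of last publish

        def changed(self, assets):
            """Return only the assets whose content differs from last time."""
            deltas = {}
            for path, props in assets.items():
                digest = hashlib.sha256(
                    json.dumps(props, sort_keys=True).encode()).hexdigest()
                if self._last.get(path) != digest:
                    self._last[path] = digest
                    deltas[path] = props
            return deltas

    tracker = DeltaTracker()
    scene = {"/World/Tower": {"height": 320}, "/World/Podium": {"height": 24}}
    print(tracker.changed(scene))   # first publish: everything is new
    scene["/World/Tower"]["height"] = 340
    print(tracker.changed(scene))   # update: only the tower is sent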

The Nucleus server that hosts Omniverse projects can be installed on local machines or in the cloud. It is even possible to have multiple Nucleus servers at different levels as shown in Figure 7, with the lower-level servers catering to smaller, local teams, which are, in turn, connected to higher-level servers for larger team collaboration. While the details of how exactly the project and the connections would be set up and managed were not shared, it would presumably involve the IT department of a firm as well as one or more dedicated Omniverse administrators.

At the top of the technology stack of the Omniverse (shown earlier in Figure 5) are the actual applications for using it. While custom applications can be developed for the platform, NVIDIA provides several that can be used out of the box (Figure 8). Of these, the ones most relevant to AEC are Create, which allows designers to put a project together and assemble, light, simulate, and render scenes in real time; and View, which allows users to bring in CAD and BIM models, apply materials, perform sun studies, etc. Two of the four screenshots in Figure 1 earlier, which showed real-time multi-user collaboration on a building design, were of Omniverse View.

Finally, the Omniverse data itself can be viewed on a local computer, remotely in a web browser, or with AR (augmented reality) and VR (virtual reality).

The Omniverse in AEC

Given that the AEC industry is so multidisciplinary, with many different applications and collaborative workflows, the Omniverse is not only a significant technological development but also one that has generated a lot of excitement and interest. It is hardly surprising, therefore, to find that many of the applications commonly used in AEC, including Revit, Archicad, SketchUp, Rhino with Grasshopper, 3ds Max, and CityEngine (from ESRI), have already developed connectors for a live link to Omniverse. Other vendors such as Bentley have also announced that they will be developing connectors. Unfortunately, there were no live demos of many of these applications actually connecting to Omniverse in real time, so it was difficult to gauge how much actual progress has been made in enabling multi-disciplinary collaboration with them.

There were, however, demos of Rhino and Grasshopper live-syncing with Omniverse, provided by two leading architectural firms, Foster + Partners and KPF (Kohn Pedersen Fox), as part of their presentations at GTC on how Omniverse was improving their workflows. (Both firms were early adopters of the technology.) Foster + Partners showed their use of Omniverse on one of their recent projects, China Merchants Bank’s Global Headquarters in Shenzhen. Figure 9 shows a massing model of the project in Rhino, with the Omniverse plug-in installed and the Live Sync option turned on. Once the project is published as a USD file to the Omniverse server and brought into the master project file, with Live Sync activated for it in Omniverse, any change made to the file in Rhino can be seen in real time in Omniverse, as shown in Figure 10.
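
At the USD level, “publishing” a model and bringing it into the master project file corresponds to the reference mechanism sketched below, again assuming the open-source pxr bindings and hypothetical file names. Because the master stage references the published file rather than copying it, every change the connector writes to that file flows through to anyone viewing the master project.

    # A sketch of the "publish and reference" pattern described above,
    # using the open-source pxr bindings and hypothetical file names.
    # The master project does not copy the published model; it references
    # it, which is what lets updates from Rhino flow through to viewers.
    from pxr import Usd, UsdGeom

    master = Usd.Stage.CreateNew("master_project.usda")
    site = UsdGeom.Xform.Define(master, "/World/CMB_Headquarters")

    # Reference the USD file published from Rhino into the master scene.
    site.GetPrim().GetReferences().AddReference("massing.usda", "/World")
    master.GetRootLayer().Save()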

The live sync also works with the Grasshopper plug-in for Rhino, as shown in Figure 11, where a change to the Grasshopper script makes the corresponding changes to the generated model in Rhino, which can also be seen in Omniverse.

KPF (Kohn Pedersen Fox) also showed an example of its workflow with Rhino and Grasshopper linking to Omniverse (Figure 12) for real-time design and collaboration between its multiple offices in different locations across the globe. The workflow enables the designers to experiment with the design, adjust parameters, change massing models, etc., and immediately see the impact of the changes in Omniverse. In addition to the visual changes, the environmental impact can also be seen, since solar studies and shading analysis have been activated in the example in Figure 12. An additional example of the use of Omniverse’s simulation technology is shown in Figure 13, where a wind flow analysis of the same project is being done. KPF also uses Omniverse for the quality of its renderings, which are spectacular (as shown earlier in Figure 1).

Analysis and Conclusions

Prior to the Omniverse technology, I could not even have conceived of a system into which multiple applications could plug and be accessible to others using that system in real time, so that a designer using Revit or Archicad for a building could collaborate in real time with a designer using Bentley OpenSite for the civil infrastructure that houses that building. So far, synchronous multi-disciplinary collaboration in AEC has been limited to those on the same platform, such as Revit or Bentley, and even that is far from seamless. Omniverse’s cross-platform synchronous design capability seems too good to be true, and had I not seen a live demo of it in the presentation by Foster + Partners, I would have remained skeptical.

At the same time, it is important to keep in mind that, from an AEC perspective, this real-time, multi-disciplinary, multi-application collaboration is, for now, primarily visual, and therefore useful for the design phase only. There is no “BIM” information as such, making it similar to a general-purpose design tool like SketchUp rather than a building-specific BIM application. There are no analysis capabilities of the kind needed for structural and MEP design, and when it comes to construction and operations/FM, it is hard to see the Omniverse being useful at all. In one of the technical talks, NVIDIA did talk about trying to incorporate a BIM open standard into Omniverse, but what would this be? The IFC format? And even if this could be done, how would it help?

The AEC industry, in contrast to the M&E (media and entertainment) industry from which the Omniverse concept came (the USD format originated at Pixar), deals with the real world rather than imaginary worlds. Our biggest concern, as Lori Hufford of Bentley put it in a panel discussion at GTC, is “how to make the world a better place.” The Omniverse is an amazing technology and could make NVIDIA a significant player in the AEC technology industry, but, as of now, I don’t see it as a game-changer in AEC.

About the Author

Lachmi Khemlani is founder and editor of AECbytes. She has a Ph.D. in Architecture from UC Berkeley, specializing in intelligent building modeling, and consults and writes on AEC technology. She can be reached at lachmi@aecbytes.com.


Related Articles

NVIDIA GTC 2019 Conference

This article summarizes the key advances NVIDIA has made in the broader field of graphics-enabled visualization that are relevant to AEC in addition to other industries such as gaming, media and entertainment, manufacturing, and industrial and product design.

Technology at Foster + Partners

Han Shi, Head of BIM & Design Systems at Foster + Partners, describes how technology forms an integral part of the firm’s workflow, with several interdisciplinary groups involved in computational design, building physics, performance analysis, optimisation, fabrication, and interaction design.

NVIDIA GTC 2018 Conference

A discussion of NVIDIA developments unveiled at GTC 2018 including AI denoising, predictive rendering, virtual reality, the Holodeck for multi-user design collaboration, and intelligent video analytics.

Merdeka 118: Project Profile

Fender Katsalidis describes the implementation of AEC technology on the “Merdeka 118” project, a 118-storey, mega-tall skyscraper under construction in Kuala Lumpur, Malaysia.