Introduction

In an age of augmented reality (AR), virtual reality (VR) and mixed reality (MR), where everything from flight simulators to automotive displays to product ads is rendered in interactive 3D, engineering companies are investing in immersive, real-time 3D experiences that differentiate their brands and connect with their customers.

Lucky for them, they’ve already taken a big step in that direction—without even planning for it. Most engineering companies are sitting on a treasure trove of 3D models in the form of computer-aided design (CAD) data. It may seem that all they have to do is drag and drop their CAD models into a real-time 3D platform. In reality, raw CAD data first needs to be optimized for real-time 3D: converted into a form that is less taxing on hardware such as smartphones, tablets, laptops and AR/VR headsets. Done well, that optimization pays off in faster time to deployment, better performance at runtime, and higher-fidelity models in the final 3D experience.

This whitepaper will explain the fundamentals of optimizing CAD data for real-time 3D applications. It will explore the role of CAD designers in preparing data, the importance of optimization, the best practices for ensuring smooth performance no matter your use case, and what engineering companies need to know to get started with real-time 3D.

The role of CAD designers

Whether modeling brackets in SOLIDWORKS, high-rises in Revit or anything in between, CAD designers can take heart: the design workflow doesn’t need to change much to ensure models are ready for real-time 3D. The bulk of the effort comes later, during the optimization process. However, there are certain things that designers can do to make optimizing less painful.

Take naming conventions, for example. It’s always good practice to adhere to consistent naming schemes but this is especially important when you plan to repurpose CAD models for real-time 3D. It’s a common issue with a straightforward solution, according to Franck Elisabeth, senior technical artist at Unity. He says that all CAD users in a company should use the same naming conventions across all their data. Some designers may see this as an afterthought but it’s an important habit to enforce.
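
To make “consistent” concrete, here is a minimal Python sketch that flags part names violating a naming convention before they enter an optimization pipeline. The pattern shown is a hypothetical example, standing in for whatever scheme your team actually agrees on.

    import re

    # Hypothetical convention: <PROJECT>-<NUMBER>_<part-name>_<material>,
    # e.g. "BRK-001_bracket_steel". The pattern is illustrative only.
    NAME_PATTERN = re.compile(r"^[A-Z]{2,5}-\d{3}_[a-z0-9\-]+_[a-z]+$")

    def nonconforming(part_names):
        """Return the part names that do not follow the agreed convention."""
        return [name for name in part_names if not NAME_PATTERN.match(name)]

    print(nonconforming(["BRK-001_bracket_steel", "screw m4", "HSG-014_housing_abs"]))
    # ['screw m4']  -> rename before the model is handed off for optimization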

“If those constraints are respected, it’s a gain for everyone,” he says.

The gain comes from fostering the ability to automate data optimization. It’s one thing to take one CAD model and bring it into a real-time 3D engine once. But to do it over and over, for many different kinds of parts and with any sort of efficiency, requires automation. To truly scale up the use of real-time 3D, designers must bow down to one critical automation enabler.

“Metadata is king,” says Brad Scott, technical art manager at Unity.

Metadata enriches an abstract geometric form with concrete real-world info—such as the purpose it serves, the material it’s made of, how to make it or where to find it, and so on. The more metadata that is available in your CAD models, the easier it will be to program automations to optimize that data for real-time 3D.

It’s typically the first and longest conversation Brad has with partners looking to scale up their real-time 3D usage. If there is little or no metadata in a catalog of 3D models, you can’t write rules for how to treat different kinds of parts. For example, if you wanted to bring a big machine into a real-time 3D engine, you might want to optimize it by removing small fasteners. A simple rule for automating that process would find all parts with “screw” in the metadata and discard them. Lacking this data, those parts would need to be removed manually, or through workarounds that lack precision and efficiency. You can see how tedious this process would become and how it might inhibit a company from fully exploring real-time 3D.
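
To make the idea of such a rule concrete, here is a minimal sketch written against a plain Python list of parts rather than any particular CAD or Pixyz API. The part structure and field names are assumptions for illustration only.

    # Each dict stands in for a CAD occurrence plus its metadata.
    # The keys ("name", "metadata", "category") are illustrative assumptions.
    parts = [
        {"name": "housing_main", "metadata": {"category": "casting",  "material": "aluminium"}},
        {"name": "screw_m4_x12", "metadata": {"category": "screw",    "material": "steel"}},
        {"name": "screw_m6_x4",  "metadata": {"category": "screw",    "material": "steel"}},
        {"name": "drive_shaft",  "metadata": {"category": "machined", "material": "steel"}},
    ]

    def drop_by_metadata(parts, key, value):
        """Keep only parts whose metadata does NOT match the given rule."""
        return [p for p in parts if p["metadata"].get(key) != value]

    optimized = drop_by_metadata(parts, "category", "screw")
    print([p["name"] for p in optimized])   # ['housing_main', 'drive_shaft']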

“If there’s no information around a certain part, it either has to fall under one rule and we can't treat it any differently, or we need to find another way to annotate that data,” Brad says.

Note that while many CAD systems have built-in methods of handling metadata, some companies use external tools such as a database, PLM platform or Excel spreadsheet to manage metadata. That’s fine, as long as that information can be tied to the 3D geometry. Brad calls this “enriching the CAD model.” Afterwards, you can still craft recipes for automation.
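
A hedged sketch of that enrichment step is shown below, assuming the external source is a simple CSV keyed by part name; a PLM export or spreadsheet would work the same way. The column names and part structure are assumptions for illustration.

    import csv
    import io

    # Stand-in for an exported spreadsheet or PLM report; in practice a file on disk.
    external_metadata = io.StringIO(
        "part_name,category,material\n"
        "housing_main,casting,aluminium\n"
        "screw_m4_x12,screw,steel\n"
    )

    # Geometry as it arrives from CAD: names only, no metadata attached.
    geometry = [{"name": "housing_main"}, {"name": "screw_m4_x12"}, {"name": "unknown_part"}]

    lookup = {row["part_name"]: row for row in csv.DictReader(external_metadata)}

    for part in geometry:
        row = lookup.get(part["name"])
        part["metadata"] = {"category": row["category"], "material": row["material"]} if row else {}

    print(geometry)  # parts now carry metadata that automation rules can key on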

For companies with scant metadata, automation options may be limited but that doesn’t mean they’re shut out of the world of real-time 3D. It’s still easy to explore manually and it’s never too late to start planning for future usage by enriching new CAD models with appropriate metadata.

The optimization process

Why is it important to optimize data for real-time 3D?

Designers create a CAD model as a means to an end: CAD is digital documentation for how to manufacture a part. What’s needed in that context is often different from what’s needed in real-time 3D applications.

Take for example a CAD model of a car. A manufacturer would need to know the size and placement of every nut and bolt in the assembly, so the level of detail in the model must be extremely high. But if that same car were in a video game, rendering every part under the hood would be useless to the player as well as a burden to their graphics card. Then again, a different real-time 3D application—such as a digital twin—may require a higher level of model detail. These requirements will also be dependent on the target device, whether it be a smartphone, AR/VR headset, laptop or workstation. Each has a varying measure of computing resources.

Data optimization is necessary to ensure that your CAD model is suited to your real-time 3D application. From a single CAD source, good optimization processes can feed multiple purposes and teams within a company. There are six main reasons for optimization, according to Brad.

  1. Performance: Data optimization is necessary to provide a smooth and immersive experience—particularly in high frame rate applications like VR, where choppiness is not just annoying, it’s nauseating.
  2. Responsiveness: Real-time 3D applications require quick response to user inputs and environment changes, which means 3D assets must be as lightweight as possible.
  3. Memory efficiency: Optimized models mean lower memory requirements, which is especially important for compatibility across different devices.
  4. Bandwidth and storage: Optimizing data reduces the size of 3D assets, textures and animation data, which lowers bandwidth and storage space requirements.
  5. Cross-platform compatibility: Data optimization can ensure a consistent experience across different devices.
  6. Scalability: Using optimized CAD data in real-time 3D applications enables larger and more complex scenes, with higher polygon counts and higher-resolution textures, while maintaining real-time performance.

Tools to optimize CAD data for real-time 3D

Just as there are a variety of real-time 3D platforms, there are a variety of tools to optimize CAD data for those platforms. This whitepaper will focus on the Pixyz (pronounced “pixies”) data preparation suite. Though owned and developed by Unity, one of the most popular real-time 3D platforms, Pixyz can be used to optimize CAD data from any engineering or design tool to target any real-time 3D platform.

Pixyz can import a wide variety of CAD file formats, including CATIA, NX, SOLIDWORKS, Creo, STEP, IGES, FBX, OBJ, JT, USD, glTF, and many others. It can then export the optimized mesh data in several formats, including FBX, OBJ, glTF, glb, USD, and more.

There are three Pixyz products for CAD data optimization: Pixyz Studio, a standalone tool for data preparation and optimization; Pixyz Plugin for Unity, a fully integrated 3D and CAD data importer for the Unity Engine; and Pixyz Scenario Processor, software to automate data optimization at scale. Unless otherwise noted, examples and details in this whitepaper pertain to Pixyz Studio.
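
For a feel of what a scripted preparation step looks like, here is a rough sketch in the style of a Pixyz Python script. The module and function names (io.importScene, scene.getRoot, algo.tessellate, io.exportScene) reflect the Pixyz scripting API as the author understands it; exact names, signatures and defaults vary between product versions, so treat every call and parameter below as an assumption to verify against the documentation for your release.

    # Sketch only: verify module/function names against your Pixyz version.
    from pxz import algo, io, scene  # available inside the Pixyz scripting environment

    def prepare(cad_path, out_path, max_sag=0.2, max_length=100.0, max_angle=30.0):
        io.importScene(cad_path)                                  # bring the CAD assembly in
        root = scene.getRoot()
        algo.tessellate([root], max_sag, max_length, max_angle)   # B-rep -> polygons
        io.exportScene(out_path, root)                            # write an engine-friendly format

    prepare("gearbox.stp", "gearbox.fbx")                         # file names are placeholders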

Best practices for data optimization

In principle, optimizing CAD data for real-time 3D is straightforward: you need to adjust the original 3D model enough to ensure good performance in the target application but not so much that quality suffers. You don’t need to include every tread on your tire but if it turns into a square, you’ve gone too far.

There are several techniques for setting this balance efficiently and the first among them is proper tessellation, the conversion of B-rep (boundary representation) CAD data to a polygonal mesh.

“It is much easier to get the balance between performance and quality correct during the tessellation stage from CAD to polygons,” Brad says. “Decimation can be used to further reduce the polycount later on but tessellating at a lower setting will give better results than decimating a high poly model.”

Pixyz has a tessellate function that allows users to adjust several meshing parameters, or simply select between default presets: low, medium, high and very high mesh accuracy, density, and quality. The visual difference between these settings may not always be apparent but the higher the polygon count, the higher the computational burden. There is also a “tessellate relatively” function that takes the overall size of the CAD model into account to adjust the preset’s values. In general, you should pick the lowest setting that produces acceptable quality.

Comparison of low, medium and high tessellation presets with and without wireframe. Note the similar appearance of the medium and high variants, despite the difference in polygon count. (Image: Pixyz.)
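
The cost of tighter accuracy grows quickly. As a rough, self-contained illustration (plain Python, not Pixyz code), the sketch below computes how many straight segments are needed to approximate a circular edge within a given sag (chord height) tolerance; the same effect compounds across every curved face of a model, which is why the loosest acceptable setting pays off.

    import math

    def segments_for_sag(radius, sag):
        """Segments needed so a circle of `radius` deviates from its
        polygonal approximation by at most `sag` (chord height)."""
        theta = 2.0 * math.acos(1.0 - sag / radius)   # angle subtended by one chord
        return math.ceil(2.0 * math.pi / theta)

    radius = 50.0  # mm, illustrative value
    for sag in (1.0, 0.1, 0.01):
        print(f"sag {sag:>5} mm -> {segments_for_sag(radius, sag)} segments")
    # Tightening the tolerance tenfold roughly triples the segment count here,
    # and triangle counts on curved surfaces grow even faster.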

The tessellate function also provides the option to generate what’s called a UV map for a model: a 2D unwrapping of the model’s surface onto which texture information can be stored. If you don’t need dynamic, real-time lighting in your target application, Brad suggests generating a UV lightmap at this stage. With this approach, called baked lighting, the lighting computation is done up front so it doesn’t consume resources at runtime.

Example of baked lighting with a UV lightmap. (Image: Unity.)

Another technique to boost real-time 3D performance is to simplify your data by merging it. For example, a CAD assembly of an airplane comprises many discrete parts but keeping track of each of these separately would tax a 3D application. Merging them all together eases this burden considerably.

“It’s faster to compute and render in real-time a model made of one transform (change of position, rotation and scale) instead of millions of mechanical and electronic pieces,” Franck says.
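
A minimal sketch of what merging means at the mesh level, independent of any particular tool: the vertex and triangle lists of separate parts are concatenated into one mesh, so the engine handles one transform and one draw call instead of thousands. The vertex layout and part structure here are illustrative assumptions.

    def merge_meshes(meshes):
        """Concatenate per-part vertex and triangle lists into one mesh.
        Each mesh: {"vertices": [(x, y, z), ...], "triangles": [(i, j, k), ...]}."""
        vertices, triangles = [], []
        for mesh in meshes:
            offset = len(vertices)                      # re-base triangle indices
            vertices.extend(mesh["vertices"])
            triangles.extend((i + offset, j + offset, k + offset)
                             for i, j, k in mesh["triangles"])
        return {"vertices": vertices, "triangles": triangles}

    part_a = {"vertices": [(0, 0, 0), (1, 0, 0), (0, 1, 0)], "triangles": [(0, 1, 2)]}
    part_b = {"vertices": [(2, 0, 0), (3, 0, 0), (2, 1, 0)], "triangles": [(0, 1, 2)]}
    merged = merge_meshes([part_a, part_b])
    print(len(merged["vertices"]), len(merged["triangles"]))  # 6 vertices, 2 triangles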

Similarly, many CAD assemblies consist of nested hierarchies of parent and child parts, sometimes many layers deep. You should flatten these hierarchies whenever possible. Deeply nested hierarchies incur unnecessary computations in real-time 3D applications, while flatter, simpler hierarchies enable more efficient multithreading.
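
A generic sketch of the flattening step follows, again using plain Python rather than any specific tool. It collapses a nested assembly into a flat list of leaf parts while composing their placements (reduced here to translation offsets for brevity); the node structure is an assumption for illustration.

    def flatten(node, parent_offset=(0.0, 0.0, 0.0), out=None):
        """Collapse a nested part hierarchy into a flat list of leaf parts."""
        if out is None:
            out = []
        offset = tuple(p + o for p, o in zip(parent_offset, node.get("offset", (0, 0, 0))))
        if node.get("mesh"):                       # leaf geometry keeps its world-space placement
            out.append({"name": node["name"], "offset": offset})
        for child in node.get("children", []):
            flatten(child, offset, out)
        return out

    assembly = {
        "name": "engine", "offset": (0, 0, 0), "children": [
            {"name": "block", "offset": (0, 0.2, 0), "mesh": True},
            {"name": "head_group", "offset": (0, 0.5, 0), "children": [
                {"name": "head", "offset": (0, 0.1, 0), "mesh": True},
            ]},
        ],
    }
    print(flatten(assembly))   # two leaf parts, no nesting, correct combined offsets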

On the other hand, sometimes it makes sense to split models apart. A large building model, for example, could be divided up by floors. That way, small parts of the model can be loaded as needed in real-time 3D applications, saving memory space and improving performance.
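
A hedged sketch of that grouping step, assuming each part carries a “level” value in its metadata (the field name is an assumption); each resulting chunk can then be loaded and unloaded independently at runtime.

    from collections import defaultdict

    parts = [
        {"name": "slab_01",   "metadata": {"level": "floor_01"}},
        {"name": "wall_0101", "metadata": {"level": "floor_01"}},
        {"name": "slab_02",   "metadata": {"level": "floor_02"}},
    ]

    def split_by_level(parts, key="level"):
        """Group parts into independently loadable chunks by a metadata key."""
        chunks = defaultdict(list)
        for part in parts:
            chunks[part["metadata"].get(key, "unassigned")].append(part["name"])
        return dict(chunks)

    print(split_by_level(parts))
    # {'floor_01': ['slab_01', 'wall_0101'], 'floor_02': ['slab_02']}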

One of the main advantages of data optimization software like Pixyz is that it allows users to adapt these best practices easily for different types of data, applications and devices.

“Fundamentally, tools such as Pixyz allow a common workflow regardless of the use case,” Brad says. “So a lot of these problems can be tackled in the same way using the same functionality.”

Getting started with real-time 3D

Data maturity

Leveraging your existing 3D models for use in real-time 3D applications can range in difficulty, depending on the quality of your data (and your metadata), your target applications and devices, and your technical expertise. While it’s easy to access the tools and start experimenting with them, or even complete a one-off project, developing scalable workflows is another matter.

“If you want to really take advantage of your CAD and PLM data, you need to put some effort in it. Then you can do some amazing, powerful things,” Axel Jacquet, technical product manager at Unity, told engineering.com.

Franck agrees, acknowledging that “it’s a tricky process to update old models and pipelines.” But like Axel, he believes that the investment is well worth the cost and can “give better reactivity to any company when that transition to a metadata-driven pipeline is done.”

For newcomers to real-time 3D, the takeaway is positive: there’s little harm in trying it and much reward if you’re willing to put in the effort. The level of effort depends on several factors and the most important is a concept that Brad calls data maturity. Mature data is well-organized and replete with metadata that can be used to automate data optimization. Immature data is scattered across multiple systems and lacking in metadata. It’s a spectrum and it’s important to understand where on it your company lies.

“There are levels of automation there,” Brad says. “If the data is not mature enough, we can’t really unlock a full automation pipeline. But we could do anywhere from 50 percent and then maybe there’s a bit of manual work left.”

Even if your data is not yet mature enough to unlock full automation, you can always improve it for new models by ensuring the proper metadata is in place. In fact, it’s never too early to establish best practices and repeatable patterns for naming conventions and metadata use.

“Start annotating data at that design stage,” Brad says. “It has this knock-on effect. You might not want to add an extra 10 minutes to the designer to add metadata but when you start to understand that 10 minutes here saves thousands of hours down the end, over repetition, that’s where you start to see that payoff.”

Understanding your target application and platform

The optimal form of real-time 3D data depends on the end use case—both the application and the device that will power it. A 3D model viewer on a smartphone will have different requirements from a factory simulation running on a high-end workstation. To create an effective real-time 3D experience, you need to understand both the hardware and software constraints.

In terms of hardware, Axel says that looking up the specs of your target device (e.g., the Meta Quest 3 VR headset) is a good place to start. With a bit of trial and error, you’ll figure out the best way to optimize your models in terms of mesh size, part merging, and the other techniques described in this whitepaper. Pixyz may soon include hardware presets that make optimizing for a target device even easier.
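
As a sketch of how those specs feed back into optimization decisions, the snippet below checks a scene against per-device triangle budgets. The budget figures are hypothetical placeholders for illustration only; real numbers come from the target device’s specifications and your own profiling.

    # Hypothetical budgets used only to illustrate the check; replace with
    # figures from your target device's specs and your own testing.
    TRIANGLE_BUDGETS = {
        "standalone_vr_headset": 750_000,
        "smartphone": 1_500_000,
        "workstation": 20_000_000,
    }

    def fits_budget(scene_triangles, device):
        budget = TRIANGLE_BUDGETS[device]
        return scene_triangles <= budget, budget

    ok, budget = fits_budget(scene_triangles=2_300_000, device="standalone_vr_headset")
    print("within budget" if ok
          else f"over budget ({budget:,} triangles) - tessellate lower or decimate further")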

In terms of software, it’s also crucial to understand what end-users will do with your models. For instance, you can’t merge every part of a car assembly if you expect users to open the door and you can’t remove parts like the engine if you expect users to lift the hood. Your approach to optimizing any CAD model should account for its role in the real-time application.

“You need to know what your final goal is, what your final scenario is, and from there you will be able to find a good recipe,” Axel says.

Some organizations like to start out with what Axel calls a “150 percent model,” a master version of the data with as much detail as possible. “It’s really the perfect digital twin of their actual product,” he says. This model can be used as the starting point for any other real-time 3D application. With nothing left to add, it’s just a matter of deciding what to remove for any given application and device.

“To me that’s the perfect way to go, because you need to manage one big database of perfect data and then you can create some export or optimization scenarios to push to your channels,” Axel says.
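
A minimal sketch of that idea: start from the full master list and derive a per-channel subset by dropping detail the channel doesn’t need. The detail ranking, channel names and thresholds are assumptions for illustration.

    # Master ("150 percent") parts list; the "detail" ranking is an illustrative assumption.
    master = [
        {"name": "body_shell",  "detail": 1},
        {"name": "door_hinge",  "detail": 2},
        {"name": "wiring_loom", "detail": 3},
    ]

    # Each delivery channel keeps only the detail it actually needs; values are assumptions.
    CHANNEL_MAX_DETAIL = {"mobile_configurator": 1, "vr_training": 2, "engineering_review": 3}

    def derive_variant(master, channel):
        """Start from everything and decide what to remove for a given channel."""
        return [p["name"] for p in master if p["detail"] <= CHANNEL_MAX_DETAIL[channel]]

    print(derive_variant(master, "vr_training"))   # ['body_shell', 'door_hinge']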

Of course, whether the 150 percent route is viable depends on a company’s goals, culture and level of data maturity. There are many valid approaches to optimizing real-time 3D data but the most effective ones all make use of automation to some degree.

“Pixyz lets a company optimize data on demand, giving reactivity and agile development. With scripting, you can automate most of it to adapt to your needs,” Franck says.

Conclusion

Many engineering companies have a deep repository of CAD models. These companies are in a prime position to jump-start development of real-time 3D applications using these assets. Optimizing the CAD data for real-time 3D takes some effort and, depending on the level of data maturity and willingness to adapt workflows, some companies may find it easier than others. But the investment is well worth it. Real-time 3D unlocks new possibilities to develop design insights, expand your product offerings, engage your employees and clients, and much more.

To learn more about using Pixyz for data optimization, visit unity.com.

Try the ultimate CAD data optimization software

Pixyz Studio enables CAD experts, engineering departments, interactive 3D developers, 3D artists, architecture firms, agencies, and manufacturing, communication and marketing departments to unlock the full potential of their 3D and CAD data.
