Creating Multiplatform Assets For Augmented Reality
Even though USDZ Quicklook has been around for a while, honing a platform-agnostic workflow for AR is still very much an evolving process. The new landscape of ARKit 3, Reality Composer, and Google's ARCore Depth API has forced a rethink of our processes from a content-production standpoint.
Evolving AR Space
At Taylor James Interactive, we have many conversations with prospective clients and brands about the potential for interactive technologies. Much of our R&D into this area has been driven by the need to provide meaningful examples of prospective content. It’s simply not enough to talk about AR — as a visual medium, you need to show it too. The USDZ framework has provided us with a proactive way to demonstrate the capabilities of Augmented Reality to a wider audience.
We are constantly trying to push the quality level of our work and allow a richer interactive experience. To do this, we need to examine how traditional CG skills combine with shader workflows, rigging techniques and model formats. We also need the content to work across multiple platforms and look visually consistent. Since the launch of ARKit 3 we have been following a workflow we devised for it, so we will outline how it differs from what came before and, more importantly, the benefits of adopting a GLB-centric approach.
An ARKit Refresher
USD (short for Universal Scene Description) is an open-source scene and asset format developed by Pixar, and the format Apple has made pivotal to its Quicklook AR system. Models are packaged into USDZ, an uncompressed zip archive containing the USD binary and its PBR texture files.
This is how we would compile a USDZ asset under the original toolset:
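For reference, the command-line step at the heart of that workflow looked roughly like the sketch below. It is written from memory of Xcode's usdz_converter, so treat the file names as placeholders and check the exact flag names against the tool's own help output.

import subprocess

# Legacy (pre-ARKit 3) conversion: Xcode's usdz_converter bound each
# PBR map to the model on the command line. Paths are placeholders.
subprocess.run([
    "xcrun", "usdz_converter", "asset.obj", "asset.usdz",
    "-color_map",     "asset_BaseColor.png",
    "-metallic_map",  "asset_Metallic.png",
    "-roughness_map", "asset_Roughness.png",
    "-normal_map",    "asset_Normal.png",
], check=True)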
The reality of this scenario was manageable, but it was limited to models with transform animation. Animation examples were few and far between, and exporting them was similarly involved. Getting a skinned character into Quicklook with a hybrid toolset was a difficult enterprise. Although we achieved it, we conceded that better tooling would have to arrive soon if this was going to become more mainstream and consistent.
ARKit 3 and usdzconvert
Apple announced some interesting developments in AR at WWDC 2019 with the launch of Reality Composer. This could loosely be called Quicklook on steroids: a more sophisticated platform for delivering more interactive experiences. Alongside this, the need to create animated content will only grow as people build more engaging experiences.
ARCore now provides a consistent experience for Android
When we first looked into this well over a year ago, Android wasn't really a consideration. The number of devices it could run on was limited, and they tended to be a small subset of the overall user base. We still see subtle differences in rendering and plane detection, but providing AR support on both iOS and Android is something we were always keen to figure out.
Google specifies the GLTF/GLB file format for use in ARCore.
GLTF comes in two flavours: a serialized format and a binary one. A .gltf file is plain-text JSON, with the mesh and texture assets referenced as secondary files; GLB is the binary version, where everything is contained within a single file.
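As a quick illustration of the difference, a GLB file is just a 12-byte header followed by a JSON chunk and a binary buffer chunk, which a few lines of Python can confirm (the file name here is only an example):

import struct

# Read the GLB header (magic "glTF", container version, total length),
# then walk the chunks: the JSON scene description and the binary buffer.
with open("tag-globe.glb", "rb") as f:
    magic, version, length = struct.unpack("<4sII", f.read(12))
    print(magic, version, length)        # b'glTF' 2 <file size>
    while f.tell() < length:
        chunk_len, chunk_type = struct.unpack("<I4s", f.read(8))
        print(chunk_type, chunk_len)     # b'JSON' first, then b'BIN\x00'
        f.seek(chunk_len, 1)             # skip the chunk payload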
The main thing that stood out in the new version of the USDZ converter was the ability to convert from GLTF to USDZ. Potentially, we now had parity between the models for both iOS and Android. Unit scales would in theory be the same, and we would be set up for one export and a transcode.
So if you make the GLB, everything else comes for free.
All that’s needed in this scenario is a way of exporting a PBR shaded, potentially skinned animation from 3D into GLB or GLTF.
Here’s the critical thing — we need a reliable way to build PBR shaded animated models to pass to the GLTF export.
Substance Painter
We use Substance Painter for our PBR workflow. It's fast, and gets great results. Substance now even provides export presets for GLTF/GLB and USDZ, so it's a one-click solution for static AR assets.
However, animated assets are where we had to rethink. We needed a way to rebind the PBR shader maps onto a rigged asset in our DCC of choice, 3dsMax. We use a variety of software in-house, including Maya and Houdini, but in this case we developed the process for 3dsMax, mainly down to the GLTF exporter workflow required.
3dsMax to GLB Workflow
The main reason we need to rebuild the PBR maps in 3dsMax is that Substance Painter only imports and exports static geometry. So if you want animation on your AR export, the shaders need to be re-created in the DCC on the animated asset and exported from there.
This also aligns with the classic CGI asset-creation pipeline: you may wish to hand the texturing stage to a different artist, which allows rigging and animation to happen concurrently if needed. If you want to use the model for WebGL, you may want rigged elements to be driven programmatically, so you'll additionally need to specify a rigged hierarchy for export.
To make the trip into GLB world, we found we could use the Babylon Exporter for 3dsMax. It has prebuilt binaries and you are up and running pretty quickly. This is a great piece of open-source software — the docs are amazingly thorough, and detail how you can set up a 3dsMax Physical material to export correctly as a GLTF/GLB PBR shader for real-time.
The texture bindings above create a template for the Babylon exporter to build a real-time PBR shader on export. To hook the shaders up in a Physical material, you need to build one material per texture set and connect up to five texture maps per set. We chose the Physical material route as it has better support for occlusion, bump and emissive maps.
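As a rough sketch of what one of those bindings looks like when scripted, here is the kind of Physical material we build per texture set. It is written in pymxs rather than Maxscript purely for illustration; the object name and map paths are placeholders, and the slot names should be checked against the PhysicalMaterial MAXScript reference and the Babylon exporter docs for your version.

from pymxs import runtime as rt

# One Physical material per texture set, with the real-time PBR maps
# bound to the slots the Babylon exporter reads.
mat = rt.PhysicalMaterial(name="Globe_Body")
mat.base_color_map = rt.BitmapTexture(fileName=r"maps\Globe_Body_BaseColor.png")
mat.roughness_map  = rt.BitmapTexture(fileName=r"maps\Globe_Body_Roughness.png")
mat.metalness_map  = rt.BitmapTexture(fileName=r"maps\Globe_Body_Metallic.png")
mat.bump_map       = rt.Normal_Bump(normal_map=rt.BitmapTexture(fileName=r"maps\Globe_Body_Normal.png"))
mat.emit_color_map = rt.BitmapTexture(fileName=r"maps\Globe_Body_Emissive.png")

# Assign to the rigged mesh so the exporter picks the material up.
rt.getNodeByName("Globe_Body").material = mat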
Scripting The Shader Rebuild
With models containing multiple texture sets, rebuilding the shaders after exporting the final maps from Substance can be a time-consuming process. We wrote a simple Maxscript that sorts the maps into texture sets and creates the Physical materials the exporter needs, based on the Babylon exporter's specification above.
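A condensed pymxs version of that sorting step might look something like the following. It assumes Substance Painter's default "asset_textureSet_channel" naming and that scene objects are named after their texture sets; the folder path is a placeholder, and the normal/bump and occlusion hookup is omitted for brevity (see the binding sketch above).

import os
from collections import defaultdict
from pymxs import runtime as rt

# Group Substance exports by texture set, then build and assign one
# Physical material per set.
MAPS_DIR = r"C:\project\maps"                    # placeholder folder
SLOTS = {                                        # channel suffix -> material slot
    "BaseColor": "base_color_map",
    "Roughness": "roughness_map",
    "Metallic":  "metalness_map",
    "Emissive":  "emit_color_map",
}

texture_sets = defaultdict(dict)
for name in os.listdir(MAPS_DIR):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in (".png", ".tga"):
        continue
    for channel, slot in SLOTS.items():
        if stem.endswith("_" + channel):
            tex_set = stem[: -(len(channel) + 1)].rsplit("_", 1)[-1]
            texture_sets[tex_set][slot] = os.path.join(MAPS_DIR, name)

for tex_set, maps in texture_sets.items():
    mat = rt.PhysicalMaterial(name=tex_set)
    for slot, path in maps.items():
        setattr(mat, slot, rt.BitmapTexture(fileName=path))
    node = rt.getNodeByName(tex_set)             # assumes objects share set names
    if node:
        node.material = mat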
It's definitely a toolset we keep adding to; more of a Swiss Army knife of GLTF processes. We added some common tools to facilitate the pre-export steps for 3D assets:
- Correct setup of skin modifiers for vertex weighting
- Creating Neutral TM Root helpers
- Default Physical Materials for quick colour shading
- A one-click reference object to check the real-world size of the asset before export (see the sketch below).
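The scale-reference helper, for instance, boils down to something like this hypothetical pymxs snippet: assuming a centimeter scene, it drops a roughly human-sized box next to the asset so its real-world size can be sanity-checked at a glance.

from pymxs import runtime as rt

# Hypothetical scale reference: a ~180 cm box placed beside the asset.
ref = rt.Box(length=30, width=50, height=180, name="AR_scale_reference")
ref.position = rt.Point3(100, 0, 0)
ref.wirecolor = rt.Color(255, 0, 0)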
But ultimately the biggest time saver is to recreate the PBR shaders from Substance, ready for animation export.
Here's a demo of it in action. This animated globe was an AR piece we did that could be activated from a card at a healthcare event. You can see how the materials are configured automatically from a folder location and added to the Slate editor.
We have this script as a download, so feel free to pull this repository and see some of the other tools we’ve added to help you get up and running with asset creation.
Scale Considerations
With AR, you need to get the scale right before you start. We work in centimeters in 3dsMax, so we wanted to make sure that we had a 1:1 correlation between our scene setups and the exported geometry.
Our studio default is centimeters as the base system unit. (3dsMax also has the concept of a display unit, which can be anything you like, but the system unit dictates the actual export scale.) The best workflow we found was to keep the scene in centimeters and export via Babylon using a scale factor of 0.01. This allowed us to keep our studio working unit scale and export with predictable results.
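Checking this is a one-off scene setting. Something like the pymxs snippet below is enough to confirm the unit setup before the Babylon export dialog's scale factor is set to 0.01; the units interface properties are quoted from memory, so verify them against your Max version's MAXScript reference.

from pymxs import runtime as rt

# Make sure the system unit really is centimeters before export;
# the display unit can stay whatever the artist prefers.
print(rt.units.SystemType)                    # expect #centimeters
rt.units.SystemType = rt.Name("centimeters")  # enforce it if not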
The Final USDZ Frontier
Once we have the GLB model, it's pretty straightforward to convert it into a USDZ file using usdzconvert. If you've built USDZ files on the command line before, you would have used usdz_converter; it's important to note the name change, as the process for ARKit 3 is different. (You'll need to download the latest USD Python tools to get access to the improved toolset.)
The benefits of this workflow are obvious:
- Reliable animation exports, for both transform animation and skinned meshes.
- Great-looking shaders without having to re-bind textures on the command line
- Use the precompiled USD bindings for Mac, without having to set up a dedicated Python environment or build USD from source
Running the usdz.command file sets up the environment for USDZ export. Just double-click the file and it registers the correct variables in the Python path.
One thing to note: you can additionally download the FBX SDK and update the path location in the command file, which adds FBX support to the USDZ export. FBX support was missing before, but we still prefer the GLB-to-USDZ workflow.
We can see why Apple has adjusted the workflow; it makes the whole process much more straightforward. Our GLB-to-USDZ workflow is a little more to set up and understand, but it gives you the ability to create really great animated content for both Android and iOS.
usdzconvert tag-globe.glb
This will write a USDZ file of the same name into the folder where the GLB is located. Ultimately, you end up with a dual-platform AR model that is identical on both, and one that also has the potential to be used in WebGL, where GLB/GLTF is the preferred mesh format.
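Because the conversion is a straight transcode, it is also trivial to batch. A minimal sketch, assuming the usdz.command environment has already been set up so usdzconvert is on the path, and using a placeholder export folder:

import pathlib
import subprocess

# Convert every GLB in a folder to a USDZ sitting alongside it.
for glb in pathlib.Path("export").glob("*.glb"):
    usdz = glb.with_suffix(".usdz")
    subprocess.run(["usdzconvert", str(glb), str(usdz)], check=True)
    print("wrote", usdz)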
Inevitably, the renderers in ARCore and ARKit will handle things differently: different devices will acquire the planar surface at different rates, and the render will differ in how the lighting estimation resolves.
The cool thing about Google's model-viewer web component is that it works in any browser, so you can set it up to serve the USDZ model on iOS (using the ios-src attribute) and the GLB model on Android using the standard src attribute. You also get an inline 3D viewer rather than a thumbnail.
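A minimal page along these lines is all it takes. The CDN include and attribute names below are taken from the model-viewer documentation as we understand it, so double-check them against the current release; the asset names match the globe example above.

# Write a small page that serves the GLB to Android/desktop and the
# USDZ to iOS Quicklook via the <model-viewer> web component.
page = """<script type="module"
  src="https://unpkg.com/@google/model-viewer/dist/model-viewer.min.js"></script>

<model-viewer src="tag-globe.glb"
              ios-src="tag-globe.usdz"
              alt="Animated AR globe"
              ar camera-controls autoplay></model-viewer>
"""
with open("index.html", "w") as f:
    f.write(page)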
If you want to invest in more sophisticated animated content for real time, we believe adopting a GLB workflow is the currency you'll need.