Unstuffed Bears

Taylor James Interactive
Dec 9, 2020


WebGL as part of a dynamic user experience

Taylor James Interactive were recently approached to build a digital user experience for a Christmas campaign for Dole. The client had teamed up with the creative agency St Luke’s and the charity Save The Children to launch a campaign to raise money and awareness for child hunger.

With our parent company Tag, and as part of a wider campaign that included an online video and a donation route, we collectively worked towards an activation that delivered a personalized Augmented Reality experience via a QR code after the user had donated.

The end screen gave a code to scan to activate the AR

We always try to look at any experience from the user’s perspective: if users feel part of the experience in any way, it has a far better chance of resonating with them. Like most projects in our industry, we had to find inventive ways to deliver AR on a short timescale. We’ve worked on previous campaigns using Spark AR, which have been great, but since approval times for filters can be so varied, it wasn’t something we could risk with a live launch date only a handful of days down the line.

You don’t have that many options when you decide to go direct to the web. The JavaScript libraries are still evolving, and WebXR, while poised to become the standard for immersive technologies on the web, does not yet have the device footprint required to be viable.

We used 8th Wall’s platform for our Halloween game — Army of Darkness

Taking all of that into consideration, we chose 8th Wall, a service we have become quite familiar with over the past few months through some of our R&D projects (give it a try here).

The Dole campaign centred on donating to fight child hunger. The more you donated, the more the bear could be filled with stuffing.

We worked with the creatives on a few routes, but decided to centre on the idea that the AR was a reward gifted back to the donor to say thank you. We created an animated narrative where you tap to wake the bear up and then receive a personalized message. Extra dance moves can be triggered by tapping the experience, and you can capture an image of your personal message to share with your friends.

Development Considerations

As the project had a quick turnaround, we didn’t want to be tied to any process where deployment was outside our hands. This had happened before with the Spark AR approval process, so we moved to the browser as a more predictable medium for our experience. The browser placed deployment control firmly in our hands, giving us more time to make the content. It also made the experience available to more users, the only requirement being a device with AR capability. Users simply don’t want to install apps for something small and immediate like this; experiences of this kind have to work natively on the web to stand a chance of capturing the user’s interest.

8th Wall provides an API that abstracts the SLAM tracking, and you can use whichever stack you are most familiar with for the WebGL part of your project. As we have been using Three.js for most of our internal and commercial projects, this was our framework of choice. We have been happily using the GLB workflow from Babylon.js for some time now, and we have been impressed with our tests of Babylon.js straight out of the box, but in this case we hadn’t completed the testing phase needed to make the switch. So we favoured the union of 8th Wall’s API and our tried and tested Three.js development pipeline.

Building the 8th Wall

8th Wall is excellent: they provide a great coding environment, and they handle the hosting and source control for the project. They have support and repository examples for most of the frameworks out there, and the 8th Wall team and developers’ Slack channel are very responsive and helpful.

There are a few considerations when choosing 8th Wall. The prices can escalate quite quickly, and the client has to be aware they could receive an extra bill if the campaign is a success. I guess you could say that’s a good problem to have, in some regards.

Scale is the main issue to resolve with SLAM tracking. If you need objects to sit on flat surfaces at the correct size (as they would with Vuforia or ARKit/ARCore), it’s not quite the same. You can get proper scale with 8th Wall and a traditional marker, but not really with planar tracking alone. Given the nature of this project, a markerless approach was the only viable option, so again we felt the immediacy of the content meant we could concede some reliability and scale in the tracking environment.

Going with the Flow

Even a simple linear experience like this needs to outline the flow and interactions required

Locking into our GLB workflow, we could easily handle the pre-defined animation loops and code these into the experience. It also allowed us to refine the animation, textures and shaders while the asset was still being worked on. By defining the takes that the experience uses, we could plan the framework of the app before we had any content to add to it.
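
As a rough illustration of the code side of this (a minimal sketch assuming an existing Three.js scene, camera and renderer; the file name and take name are made up for this example):

import * as THREE from 'three';
import {GLTFLoader} from 'three/examples/jsm/loaders/GLTFLoader.js';

const clock = new THREE.Clock();
let mixer;

new GLTFLoader().load('bear.glb', (gltf) => { // 'bear.glb' is illustrative
  scene.add(gltf.scene);
  mixer = new THREE.AnimationMixer(gltf.scene);

  // Each pre-defined take exports as a named AnimationClip in the GLB.
  const wake = THREE.AnimationClip.findByName(gltf.animations, 'wake'); // illustrative take name
  const action = mixer.clipAction(wake);
  action.setLoop(THREE.LoopOnce, 1);
  action.clampWhenFinished = true; // hold the final pose when the take ends
  action.play();
});

// Advance the mixer by the frame delta inside the render loop.
function tick() {
  requestAnimationFrame(tick);
  if (mixer) mixer.update(clock.getDelta());
  renderer.render(scene, camera);
}
tick();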

We use Adobe XD to plan the design and UX of the app before we do anything else, and this shows both the animators and developers the key points we need to hit. It also has a great online sharing feature where you can create animated flows for clients, as well as design and development guides for the creative team.

Asset Creation and Animation

We model the asset and create textures in Substance Painter to give a fur-like appearance without the rendering overhead

If you’ve read our previous article, you can tell we are serious about how we structure our asset creation pipeline, using a well-tested GLB asset and PBR shader workflow. Not only is this optimised for web frameworks in general, it also allows simple asset integration for the development team.

Here we showcase the setup stages on the character — from bones, to rig controls and final animation

For the animator to be able to move the character in a sophisticated way, we build a production-level animation rig with a full set of controls at the animator’s disposal. Character animators use forward and inverse kinematics to gain fine control over posing and movement. We wanted to create an asset that allowed for sophisticated movement but would still be compatible with our web framework of choice.

We used a script in 3ds Max to take the rig setup and perform a step known as “baking” the animation. This means we take the deforming bones and collapse their transforms, frame by frame, into positions in 3D space. With the incompatible kinematic chain removed, the exporter can process and display the animation in full keyframed glory.

We opted for a physical particle system rather than a procedural one

You may have noticed in the preview above that we opted to use a physically simulated particle system in our animation. Whilst Three.js does have particle systems, we found the control we got from a baked particle animation was just as viable here. It also created less development overhead, as our Technical Artist could animate it and collapse it to a series of meshes. Again, our workflow was to bake the particle animation out to null objects and link a mesh to each one, as sketched below. The result is a fast implementation of particles with physics like drag and gravity. Since the particles don’t need to follow or be influenced by any of the character animation, this was a good way to get something that looks great in a fast time frame.
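
The linking itself happens in 3ds Max before export, but conceptually it amounts to parenting geometry under animated nulls, which would look like this at load time in Three.js (the node naming is hypothetical):

// Conceptual sketch: each baked null arrives in the GLB as an animated,
// empty Object3D, and parenting a mesh under it lets the baked per-frame
// transforms drive the mesh. The 'stuffing_null' naming is hypothetical.
gltf.scene.traverse((node) => {
  if (node.name.startsWith('stuffing_null')) {
    node.add(stuffingMesh.clone()); // the clone inherits the null's baked motion
  }
});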

Now the Science Bit

Three.js has a wealth of incredible interactive projects

Three.js is our framework of choice for most of our WebGL projects. The technical goal of this project was getting it all to integrate and driving a real-time update of the animated asset.
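
For context, the skeleton of an 8th Wall and Three.js integration looks something like the sketch below. This is a minimal version based on 8th Wall’s public Three.js examples, not the production code from this project; the module name, lighting and canvas id are illustrative.

// Custom pipeline module: sets up scene content once 8th Wall has created
// the Three.js scene, camera and renderer.
const scenePipelineModule = () => ({
  name: 'unstuffedbears',
  onStart: () => {
    const {scene, camera} = XR8.Threejs.xrScene(); // objects managed by 8th Wall
    scene.add(new THREE.AmbientLight(0xffffff, 0.7)); // illustrative lighting
    camera.position.set(0, 2, 2);
    // Sync the virtual camera with the tracked device camera.
    XR8.XrController.updateCameraProjectionMatrix({
      origin: camera.position,
      facing: camera.quaternion,
    });
  },
});

XR8.addCameraPipelineModules([
  XR8.GlTextureRenderer.pipelineModule(), // draws the camera feed
  XR8.Threejs.pipelineModule(),           // creates the Three.js scene
  XR8.XrController.pipelineModule(),      // enables SLAM tracking
  scenePipelineModule(),
]);

XR8.run({canvas: document.getElementById('camerafeed')}); // start the camera and render loop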

Following the basic 8th Wall examples, we enabled the renderer to handle PBR materials and give the most lifelike integration into the AR stage.

renderer.physicallyCorrectLights = true; // use physically based lighting units and falloff
renderer.toneMapping = THREE.ACESFilmicToneMapping; // filmic tone mapping for a more natural dynamic range

Rendering the Realtime Texture

We worked up a novel solution to personalise this experience: rather than adding some text on the screen, we conceived a way to write the message onto the texture map of the model in the scene.

We take the text input from the user and add it into the diffuse map of the model itself at runtime.

To print the user’s name into the texture, we used a Three.js canvas texture. First, we draw the diffuse map into an HTML5 canvas and render the text on top of it. We then feed that canvas back into the diffuse channel of the model’s material.
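
A rough sketch of that drawing step, assuming the diffuse map image is already loaded and userName holds the text from the input field (the canvas size, font and coordinates are illustrative):

// Draw the original diffuse map into a 2D canvas, then write the
// user's message on top of it in UV space.
const canvas = document.createElement('canvas');
canvas.width = 1024;
canvas.height = 1024;
const ctx = canvas.getContext('2d');
ctx.drawImage(image, 0, 0, canvas.width, canvas.height); // base texture first
ctx.font = 'bold 64px sans-serif';
ctx.fillStyle = '#ffffff';
ctx.textAlign = 'center';
ctx.fillText(userName, canvas.width / 2, 300); // positioned over the message area of the UV layout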

const texture = new THREE.Texture(canvas, THREE.UVMapping); // create a new texture from the canvas
texture.wrapS = THREE.RepeatWrapping; // set the horizontal wrapping mode
texture.wrapT = THREE.RepeatWrapping; // set the vertical wrapping mode
texture.flipY = false; // Three.js flips textures by default, but GLB UVs expect no flip
texture.needsUpdate = true; // upload the canvas contents to the GPU
mesh.material.map = texture;
mesh.material.color.set('#7f7f7f'); // base colour, multiplied with the map
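
One caveat with this approach: Three.js only uploads the texture when needsUpdate is set, so if the message changes after the first render you have to redraw the canvas and set texture.needsUpdate = true again before the new pixels appear on the model.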

You can see the dynamic aspect in action here as part of the full experience. Any text entered becomes part of the augmented reality model in your environment.

It’s a nice solution to make the experience more personal. We talk a lot about user empowerment as the cornerstone of meaningful experiences, and involving the user with some personalized content is a great place to start.

Thanks for reading, and if you want to try this AR experience for real (and make a donation to Save The Children) you can view it live through December 2020.
