Augmented Reality and Visualization Frameworks

Taylor James Interactive
Jan 7, 2020


At Taylor James Interactive, we've heard plenty of predictions about where the industry is heading and which technology will drive it. None of that changes the fact that any platform needs meaningful content, irrespective of the delivery method.

User Driven Experiences

Interactive platforms have a great capacity to deliver meaningful content, but they need to empower the user with the ability to visualize to their own specification and taste. The main focus for us is how we can deliver high-quality content, tailor-made for the user, across a variety of scenarios. Put simply: if the user feels empowered to drive the experience rather than being led through it, you have a greater chance of creating a meaningful connection.

VR, AR and (to a lesser extent) XR are terms we hear a lot, and frequently see used in the wrong context. If we treat them as pieces of the same visualization framework, we can begin to use each of them to its full potential, because we are being objective about the needs of the content rather than about a particular piece of hardware.

We understand that our job is to guide potential clients through this evolving world into the territory they are comfortable with — ultimately delivering the messages that mean something to them.

Recently, we have discussed a notable shift away from apps and towards web technologies. This is something our clients are asking for, but is it something you could realistically define as a trend? In our experience, there is no less need for apps where the content demands a more complex experience. Where we see a requirement for solutions outside custom apps is in capturing a potential connection with people who are not already invested in the brand or product. So, in order to create viable interactive content, we need to be mindful of how to deploy across all channels.

Advances in Integration

Last year we delivered a browser-centric healthcare project for McCann using WebGL, which was a first for them and a good indication of the potential of web technologies. Whilst this is advancing at pace, AR's native web support isn't quite there yet, which is why Apple and Google have built their visualization frameworks (ARKit and ARCore) into their operating systems out of the box.

Previous limitations, such as needing to print out a marker to activate the experience, have been removed. Markerless plane detection means we can use any flat surface as an activation point, and facial recognition on newer-generation phones is implemented to a level that allows us to use our own face, or someone else's, as a target, including using expressions and gestures as triggers.
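For those curious about the mechanics, here is a minimal sketch of how markerless plane detection is switched on, assuming Apple's ARKit on iOS; the class name and placement logic are placeholders, not code from any of our projects:

```swift
import UIKit
import SceneKit
import ARKit

// Minimal sketch: detect flat surfaces with no printed marker required.
final class PlaneDetectionViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Ask ARKit to look for horizontal surfaces such as floors and tables.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it anchors a newly detected plane;
    // this is where the experience can be activated on the surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Place the 3D content relative to the detected plane here.
    }
}
```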

However, for the experience to work, we have to feel like the object is meant to be there. Integration of AR content is one area where we have seen a huge leap forward. Two game changers allow Augmented Reality content to feel like it sits within our world: lighting estimation and segmentation.

Lighting estimation samples the real-world light from the camera feed and adjusts the ambient levels on the renderer to match, allowing tight integration with the scene. If we believe the content is there, we are already in a place where the message can be conveyed without interference.
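As a rough illustration of the mechanism, ARKit exposes a per-frame light estimate that can be fed into the renderer's ambient light; ARSCNView can apply this automatically, and the helper function below (our own name, not part of the framework) simply makes the step explicit:

```swift
import ARKit
import SceneKit

// Minimal sketch: match the virtual ambient light to the real-world light estimate.
func applyLightEstimate(from frame: ARFrame, to ambientLight: SCNLight) {
    guard let estimate = frame.lightEstimate else { return }

    // ambientIntensity is in lumens (about 1000 for neutral indoor lighting);
    // ambientColorTemperature is in kelvin (about 6500 for daylight white).
    ambientLight.intensity = estimate.ambientIntensity
    ambientLight.temperature = estimate.ambientColorTemperature
}
```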

Lighting estimation in action: this AR model is being lit by the light present in the environment. You can see that the shadows (generated in real time) align with the direction of the light source behind the model.

Segmentation is a technique that separates objects in the camera feed from the background, allowing us to build an idea of the space.

This allows the content to integrate further with the environment by compositing any occluding objects, like arms and bodies, on top of the 3D content in real time. This creates the impression of the content being located inside the environment rather than overlaid on top of it.
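For context, this is roughly how people occlusion is requested from ARKit on supported devices; the helper name is ours, and the same idea exists in ARCore and Spark AR under different APIs:

```swift
import ARKit

// Minimal sketch: let people in the camera feed occlude virtual content.
func makeOcclusionConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // Only newer devices support person segmentation, so check first.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        // Request a per-frame segmentation mask plus estimated depth,
        // so arms and bodies composite over the 3D content at the right distance.
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    return configuration
}
```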

First image: without occlusion. Second image: with occlusion. (Images courtesy of Google)

This has opened up many potential ways to showcase products for brands. Whether it's more functional content, like being able to visualise a product within your environment, or more social-driven filters built with Spark AR, we've witnessed first-hand the need to develop our AR process into a legitimate channel for brands to leverage.

Our Spark AR filters for the Lovefit Festival used face recognition and segmentation technology to make the particles appear behind the user's head

Apple has been developing its AR Quick Look system for some time now via the ARKit framework, and Google has used ARCore to implement similar functionality by adding AR to Google Maps, web search and the camera app.
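As a sketch of the Apple side of this, handing a USDZ model to AR Quick Look from inside an app takes only a few lines; the class name and the model.usdz asset below are placeholders:

```swift
import UIKit
import QuickLook
import ARKit

// Minimal sketch: present a bundled USDZ model in Apple's AR Quick Look viewer.
final class ModelPreviewer: NSObject, QLPreviewControllerDataSource {
    // Keep a strong reference to this object while the preview is on screen,
    // because QLPreviewController only holds its dataSource weakly.
    func present(from presenter: UIViewController) {
        let previewController = QLPreviewController()
        previewController.dataSource = self
        presenter.present(previewController, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        // Placeholder asset name; in practice the model would be bundled or downloaded.
        let url = Bundle.main.url(forResource: "model", withExtension: "usdz")!
        return ARQuickLookPreviewItem(fileAt: url)
    }
}
```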

Not only this, Facebook, Snapchat and Instagram feature sophisticated (and fun) augmented reality filters. This now gives us dynamic, immersive content opportunities without installing apps. That's only partially true, of course, since these experiences are still app driven; it's just that the apps that support them (Instagram, Snapchat and Facebook) happen to be on most smartphones already. This tells us that true, native experiences within the browser are still something to develop for the future.

However, the good news is that AR is now built into the devices we use on a daily basis, so all we need now is the content.

We made an AR activation point for the launch of the Volvo XC40

We recently delivered a small project for Volvo UK, where we built an AR model of their new XC40 in time for the UK release.

Given the nature of the platform it was being deployed to, we were faced with the challenge of optimisation. If someone wanted to preview AR content from Volvo's site, they couldn't be expected to wait three minutes for it to download. As a static model it couldn't be streamed either, so how small does a 3D model of a car need to be to deliver this content over a normal data connection?

A combination of measured optimization and high-quality, physically accurate shaders gave us a premium finish without the download cost. The iOS version weighed in at just over 5 MB for an entire vehicle. Sure, we could see areas where we had to make concessions (notably the interior), but it was a highly optimized solution that struck a great balance between fidelity and overhead. It's much like the old days of CGI for games, where content for game engines had a triangle budget. Nowadays for AR we can employ the same mindset, updated for newer approaches to asset preparation and shading. As a result, our team conceived a new workflow for this task.

Volvo used this content as “the Volvo on your driveway”: a way for prospective buyers to visualise the car in their own environment. That link between the product and a user's personal space creates a tangible connection with the prospective buyer.

Dual Platform Support

The metrics from the Volvo UK deployment give a good indication of the platform landscape. Usage is still weighted in Apple's direction, but this is partly because iPhone and iPad are grouped together into one generic category, rather than reported per device as Android is. We look to develop content regardless of platform, as Android still represents more than 20% of the overall base.

Supporting the Transition

One thing we take as seriously as the work itself is helping people understand the process behind interactive projects. Given that this is an emerging field where the landscape is constantly evolving, we spend time making sure we understand the needs before advising on a potential route. We think this is how you make better judgments about content and the message it carries. Regardless of the size of the brand or the agency, we work on the premise that any move into this market needs solid foundations and a reason to exist.


Written by Taylor James Interactive

Our team of Creatives, Technologists and 3D Artists combine to allow us to bring rich narrative and innovative digital content alive.
