How We Grounded Angry Birds AR: Isle of Pigs in Your Living Room (Part 1)
Hi, I’m Magnus, Technical Director of the AR and MR team at Resolution Games, and today I’m taking over the blog to share some of the tricks we used to create the illusion that Angry Birds AR: Isle of Pigs really took place in your living room. I will describe how we achieved the effects in Unity, as well as the general principles that apply regardless of which engine you are using to create an AR game or app. I will also describe them from an ARKit point of view, because that was our primary development platform at the time; the majority of these tricks carry over to Android as well. Angry Birds AR: Isle of Pigs was released over two years ago, but the general principles still apply to any AR project made today.
If you haven’t played the game, here’s our release trailer. You can also download the game on the App Store or the Google Play Store.
An important aspect of selling an AR experience is properly placing virtual objects in the real world. To do so, you get a bunch of information from your device. For example, you get a video feed of the real world, and most devices give you spatial information, like how the device is oriented. Some platforms, like iOS and Android, also provide information about the space the device is in, like horizontal and vertical surfaces and an understanding of light, and sometimes even a full mesh of the room.
This information can be used to place virtual objects in the real world more or less convincingly. In Angry Birds AR: Isle of Pigs we applied a few different tricks and effects to achieve this.
The effects are:
Ambient occlusion on physics objects
Ambient occlusion on props
Light intensity estimation
Light temperature estimation
Environment texture as light information
Color tint on video feed
Particle effects in the air
Refraction in virtual objects
Occluder surfaces on peripheral planes
All these tricks combined make for quite a convincing visual effect. We will look at each effect individually and see how it contributes to the look of the game and really sells the fantasy that the Angry Birds are in your living room.
Among all of these effects, two major groups appear: shadows, and light and color.
Shadows connect the virtual world with the real world. This is a well-known principle among artists.
Consider this pig.
It’s a nice-looking pig. But it looks like an illustration inserted into a blog post. Now look at this pig!
This one looks a lot more like it is posing in a white room. The difference is that the second pig has a subtle little ambient occlusion effect: a little blob shadow underneath. This is the oldest trick in the book - maybe? The blob shadow tells the viewer that the pig is actually standing on a physical surface that receives shadows, instead of being suspended in mid-air. If the pig moved vertically away from the blob shadow, it would seem to hover above the ground.
In our game we have three different kinds of shadows:
Directional shadows
Ambient occlusion from static objects
Ambient occlusion from dynamic objects
Directional shadows
Directional shadows are simply the built-in shadow casters in Unity. There is a shadow receiver on the ground level that picks up the directional shadows. The shadow receiver is simply a material that is completely transparent except where a shadow is received. Where a shadow is received, we darken the real world a little bit.
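Conceptually, the receiver multiplies the camera-feed pixel by a shadow attenuation factor, and everywhere the shadow map is empty the surface stays fully transparent. Here is a minimal Python sketch of that per-pixel idea (the function name and the 0.4 darkening factor are illustrative, not from the game’s actual shader):

```python
def composite_shadow(video_rgb, shadow_amount, darkening=0.4):
    """Darken a camera-feed pixel where a shadow is received.

    video_rgb: (r, g, b) from the camera feed, each channel in 0..1.
    shadow_amount: 0.0 = no shadow (receiver is fully transparent here),
                   1.0 = fully shadowed.
    darkening: how strongly a full shadow darkens the real world.
    """
    factor = 1.0 - darkening * shadow_amount
    return tuple(c * factor for c in video_rgb)

composite_shadow((0.8, 0.7, 0.6), 0.0)  # no shadow: pixel passes through unchanged
composite_shadow((0.8, 0.7, 0.6), 1.0)  # full shadow: pixel darkened by 40%
```

In an engine this would live in the fragment shader of the receiver material, but the math is the same: unshadowed pixels are untouched, so the quad is invisible except where shadows land.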
In this game we really wanted to push the feeling of immersion, so everything was in AR, even the majority of our UI. This choice is arguably not great, but that’s what we did at the time. And since our UI was in AR as well, it too had to cast shadows.
If you look closely you’ll notice that the UI is actually floating a little bit above ground level.
And of course the birds that are running around the level also need to cast shadows.
Ambient occlusion from static objects
There are two types of objects in a level: static and dynamic objects. Static props are used to sell the theme of the level the player is currently playing. For example, in the ice levels the static props are snow-covered trees, signs and heaps of snow.
By default our static objects don’t have any ambient occlusion on them. Ambient occlusion is normally achieved automatically by baking lights or similar methods, but instead we built geometry around our static props, and that geometry became the ambient occlusion. All our static props have what we call “ambient occlusion skirts” around them. That way we could tweak the ambient occlusion to make it look exactly how we wanted in every situation.
In the clip below you can see the ambient occlusion around the pile of snow to the right, with the sign and the tree. The ambient occlusion skirt fades in and out in the video to illustrate what it looks like with and without it.
Ambient occlusion from dynamic objects
Dynamic objects in this case are the blocks that the level is made up of. Our second ambient occlusion type comes from these dynamic objects. The floating pig you saw at the top had a tiny, subtle ambient occlusion shadow underneath it. This is a really powerful way to ground objects in the real world.
We achieved this type of shadow by drawing the level blocks from below, looking up with an orthographic camera. This essentially creates a depth buffer where black indicates shadow: the darker the pixel, the closer that block is to the ground. We drew the shadow into a 32x32 pixel texture, which we then transformed into a 128x128 black-and-white texture with a simple blur. This creates a nice, soft ambient occlusion shadow. We then drew this 128x128 texture on a transparent quad on the ground plane.
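The low-resolution-then-blur step is what gives the shadow its softness, and it can be sketched outside the engine. Below is a minimal Python sketch (not the actual game code; function names, the blur radius and the sample “block” are made up for illustration) that takes a 32x32 shadow map, upsamples it to 128x128 and softens it with a box blur:

```python
def upsample_nearest(src, factor):
    """Scale a square grayscale grid up by an integer factor (nearest neighbour)."""
    n = len(src)
    return [[src[y // factor][x // factor]
             for x in range(n * factor)]
            for y in range(n * factor)]

def box_blur(img, radius):
    """Naive box blur; averages the neighbourhood, clamping at the edges."""
    n = len(img)
    out = [[0.0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < n and 0 <= xx < n:
                        total += img[yy][xx]
                        count += 1
            out[y][x] = total / count
    return out

# A 32x32 shadow map as rendered from below: 0.0 = fully shadowed
# (block touching the ground), 1.0 = no shadow. Here a single dark
# 4x4 "block" sits in the middle of the map.
shadow = [[1.0] * 32 for _ in range(32)]
for y in range(14, 18):
    for x in range(14, 18):
        shadow[y][x] = 0.0

# 128x128 softened shadow, ready to be drawn on the ground quad.
soft = box_blur(upsample_nearest(shadow, 4), 3)
```

In the game this happens on the GPU (the orthographic render produces the 32x32 texture directly), but the principle is identical: hard block silhouettes in, soft gradient shadow out.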
To be continued:
There are many tricks you can use to merge the virtual and the real world when developing AR experiences, and these are just a couple of them. I’ll be back soon to dive deeper into this topic. Stay tuned!