Cinematic Animation and Previs Supervisor, Mocap Director and Unreal Integration Specialist

In October 2018 I joined Dreamcraft Attractions, based in Victoria on Vancouver Island, a company making theme park rides with VR/AR and projection mapping using video game tech.

Over the two years I was with the studio, I got to make several prototypes, which I am fortunate to be able to show you here. The quality of the work varies depending on the focus of the project, as budgets were often kept small to control costs.

There were other projects we did but sadly I am unable to show those at this time. You’ll just need to take my word for it. :)

Dragon Valley

Over a 4-month span, a small team of us created this dragon flying experience. I was responsible for much of the intro's pacing and all of the animation in the experience, including some of the motion base movement.

This dragon ride prototype is what we built in that 4-month span. The experience runs at 90fps in Unreal 4 on an i7 with a 1080 Ti, using a Vive, a Leap Motion camera for guest embodiment (tracking the hands), and a 6-DOF motion base nicknamed "the Beast."

Here's a work-in-progress animation that never got completed for the experience. This represents about two days of animation work on the dragon that lands. It still has a ways to go, but it showed promise.

During some downtime I started reanimating some of the dragons in the experience to raise the quality level. I figured it would be cool if the dragon to the left of the guest wasn't there at the start, but came flying in and landed very close to the guest. I got pulled onto something else before long, so I never completed this. There's a solid 16 hours of work in it to get it to this point, and it needs at least another 16 to get to a place I'd be happy with. The turn needs some finessing in how the wings flap, to get a more physically accurate sense of the dragon augmenting its wing flaps to pivot in the air. I wanted to keep the landing hard hitting, but I was aiming for a little more of a slowdown reaction on the last two flaps. The tail needs a whole keyframing pass to get that snake-like secondary motion going. Beyond that, there's the continuation of this shot, where the dragon needs to turn on the platform so it's facing you, then take off when your dragon lifts off. Not to mention the dragon you're riding still needs a lot of love; that was a rush job.

An interesting note on that: the dragon you're riding needs to be slower than the ones you see around you, for a few reasons:

1. To sell scale and size; a slightly slower-moving dragon helps sell that.
2. VR can often be motion-sickness inducing, so dialing back the amount of motion translated into the seat, and ultimately your headset, needed to be factored into the animation.
3. The 6-DOF motion base this animation powers can only move so fast and so far. Some movements it still can't do, so tailoring the dragon's motion to feel good on the motion platform affects the visual performance.
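The idea in points 2 and 3 above — dialing animation motion back for comfort and keeping it inside the platform's physical envelope — can be sketched roughly as scaling and clamping a 6-DOF pose before it reaches the base. This is a hypothetical illustration only; the axis names, limits, and comfort factor are all assumptions, not the studio's actual pipeline:

```python
# Hypothetical sketch: scale a 6-DOF pose down for VR comfort, then clamp it
# to an assumed motion-base envelope. Real bases publish their own limits.
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    surge: float   # forward/back (m)
    sway: float    # left/right (m)
    heave: float   # up/down (m)
    roll: float    # degrees
    pitch: float   # degrees
    yaw: float     # degrees

# Assumed platform limits (illustrative values, not real specs).
LIMITS = Pose6DOF(0.25, 0.25, 0.2, 10.0, 12.0, 15.0)
COMFORT_SCALE = 0.6  # dial back translated motion to reduce motion sickness

def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))

def tailor_for_base(animated: Pose6DOF) -> Pose6DOF:
    """Scale the animated pose for comfort, then clamp to the envelope."""
    return Pose6DOF(*(
        clamp(getattr(animated, axis) * COMFORT_SCALE, getattr(LIMITS, axis))
        for axis in ("surge", "sway", "heave", "roll", "pitch", "yaw")
    ))

# A big animated dive: surge and pitch get clamped, the rest just scale down.
pose = tailor_for_base(Pose6DOF(0.5, 0.0, 0.1, 4.0, 30.0, 5.0))
print(pose)
```

Because the clamp happens after the comfort scale, the animation can stay visually dramatic in-headset while the seat only ever receives motion the hardware can actually perform.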

 

Project Eve

Eve started off as an attempt by our small content team to make the most lifelike digital human we could that would run at 90fps in VR. Working with Pixel Light Effects, Beyond Capture, SnappersTech, XAC, and a senior character rigger, we produced Eve over a 5-6 month span with an internal team of 3 at Dreamcraft.

Once we got her visuals sorted out, we created a little demo experience where you can play tic-tac-toe against her. Again, I was responsible for all things animation and handled all of the tech work around getting her rig running correctly in Unreal.

Dreamcraft Attractions embarked on making a photoreal digital human that runs at 90fps on a Vive Pro. Pixel Light Effects did the photogrammetry scanning, SnappersTech created the face rig, and seasoned rigger Jim Su created her body rig. Jason Buchwitz, Dreamcraft's Art Director, worked through the look dev of the character, getting her hair build, textures, and shaders all working together with the scene's lighting. Costumes were done by XAC and Dreamcraft's junior artist, Meagan Tiede. Meanwhile, I worked on integrating the rigged character into Unreal, getting all the body and face animation with the 360+ blendshapes working correctly.

We approached our friends at Beyond Capture for all our motion capture needs. I processed and improved all of the mocap, along with all the face animation once Beyond had done an initial solve through Faceware. For the opening animation, I worked with our outsourcing partner XAC to push the face animation as far as we could. Seasoned engineer Mike Krazanowski supported most of the engineering needs, while Ben Wilson, Dreamcraft's embodiment expert, got the guest's arms and head tracking correctly so we could play tic-tac-toe without any controllers or wands. Jordan Ivey layered in some supportive audio for the experience.

While good, we could not punch through the uncanny valley. Taking 23 scans of FACS face shapes and building a complete set of facial poses from them cannot match a human's ability to create thousands of subtly unique face expressions. A person can smile hundreds of different ways, but the setup could only do 3 or 4, so the subtlety that could punch through the uncanny valley was flattened out.
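The blendshape setup described above boils down to linear blendshape evaluation: each scanned FACS shape stores per-vertex deltas from the neutral scan, and the final face is the neutral mesh plus a weighted sum of those deltas. Here is a minimal sketch of that idea (a toy illustration; the real rig had 360+ shapes plus correctives, all running on the GPU in Unreal):

```python
# Simplified sketch of linear blendshape evaluation: final mesh = neutral scan
# plus a weighted sum of per-vertex deltas, one delta set per face shape.
# Real rigs layer corrective shapes on top of this.

def evaluate_blendshapes(neutral, shapes, weights):
    """neutral: list of (x, y, z) vertex positions.
    shapes: dict of shape name -> per-vertex (dx, dy, dz) deltas.
    weights: dict of shape name -> weight, typically in [0, 1]."""
    result = [list(v) for v in neutral]
    for name, weight in weights.items():
        if weight == 0.0:
            continue  # skip inactive shapes entirely
        for i, (dx, dy, dz) in enumerate(shapes[name]):
            result[i][0] += weight * dx
            result[i][1] += weight * dy
            result[i][2] += weight * dz
    return [tuple(v) for v in result]

# Toy example: a 2-vertex "mesh" with a single smile shape at half strength.
neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
shapes = {"smile": [(0.0, 0.2, 0.0), (0.0, 0.4, 0.0)]}
print(evaluate_blendshapes(neutral, shapes, {"smile": 0.5}))
```

This linearity is also the limitation mentioned above: every expression is a blend of the same fixed delta sets, so the rig can only reach the subspace spanned by its scanned shapes, not the thousands of subtle variations a real face produces.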

 

4 Day Proof of Concept

Towards the end of my time at Dreamcraft, a couple of us decided to take some art that our Env/VFX lead, Alex Jenyon, was putting together and make a proof-of-concept video with it for what could be a VR or projection-mapped room experience. We started on a Monday and delivered it to the studio Friday morning. My part of this was the first-person character animation, the spotlight animation, and the general pacing/flow of the demo. The jump scare was dropped in during the final 30 minutes of production. Our seasoned Audio Director jumped in on the final evening and busted out an audio pass to complete the experience and sell the spookiness of it, giving it a bit of a Stranger Things vibe.

Environmental/VFX lead Alex Jenyon, Audio Director Jordan Ivey, and I had some downtime. Alex was prototyping some weird-looking growth using his Houdini magic that reacted to lights in the room, so we decided to make an experience out of it as a proof of concept. The idea is that this could be done in VR, or even in a projection-mapped room. I mocked up the player's point of view and light to mimic what a guest would do in this experience, then Jordan layered in his audio expertise to complete it. We started with the idea Monday and delivered this to the studio Friday morning. It took on a much more horror, Stranger Things vibe as it progressed, but it shows off what a small team can do very quickly to prototype an experience.