Seeing Is Believing

A slight deviation from the normal post this time, as I'd like to share some thoughts on the more technical and dev side of things.  Maybe that's what a blog should really be about, rather than constantly explaining why things are taking so long! So, I think a slightly overlooked aspect of character interaction is the way characters use their arms and hands to interact with objects and their virtual environment.  It is customary for a character to get relatively close to an object and then, through a momentary act of supernatural prowess, the object will magically appear in their grasp - bing!

It really is a no-brainer as a developer, as it is just so much easier to code a quick teleport than to worry about actually reaching for and interacting with the object in a real-world manner.  Anyway, gamers expect this kind of process, and all that effort spent worrying about realistic animations won't really add too much value to the final experience... right?
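For anyone curious what that lazy approach actually looks like, here's a minimal sketch of the classic "teleport grab" - the types and names are invented stand-ins for whatever your engine provides, not code from our actual project:

```cpp
struct Vec3 { float x = 0, y = 0, z = 0; };

struct Object {
    Vec3    localPosition;   // position relative to the parent
    Object* parent = nullptr;
};

// The moment the player presses "use", the object simply re-parents to
// the hand and snaps to a zero offset -- no reach, no animation, just "bing!".
void teleportGrab(Object& item, Object& handSocket) {
    item.parent        = &handSocket;
    item.localPosition = Vec3{}; // appears exactly in the palm
}
```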

Check out this video by a highly talented animator named James Benson (currently working on the fantastic-looking Firewatch), who has replaced the standard Half-Life character animations with his own.

I think this clearly demonstrates the power of realistic hand behaviours and how great a sense of presence and immersion they generate, especially given the first-person perspective.  Speaking of first-person - this aspect suddenly takes on a whole new level of importance in the amazing world of Virtual Reality.

To look down and see your body is one thing, but to see your hands reach out and interact with an object the way you would in reality creates a really interesting bond with your character.  This connection to the avatar brings with it a new problem, whereby the movement of the arms feels driven by an external force - almost like there is a puppet master controlling how you move.  Not great for immersion.  The only way to overcome this is to tie the movement of your arms directly to the physical input of the player.  I think we all dream of the day this will be your actual 1:1 movements (Oculus Touch!), but the standard controller will have to do for now.  It's amazing how well this works, though.  You obviously know that a small squeeze of your left index finger is not comparable to a full reach of the left arm, but that tactile link is enough to make the animation feel like your own.
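As a rough illustration of what tying the arm to the controller can look like, here's a hedged sketch; the Controller and Animator types are hypothetical stand-ins for the engine's real APIs, and the core idea is simply that the trigger axis directly scrubs the blend weight of the reach pose, so the arm only moves when the player's real finger does:

```cpp
#include <algorithm>
#include <cstdio>
#include <string>

// Invented stand-ins for the engine's controller and animator interfaces.
struct Controller { float leftIndexTrigger = 0.0f; }; // 0 = open, 1 = full squeeze
struct Animator {
    void setBlendWeight(const std::string& layer, float w) {
        std::printf("%s -> %.2f\n", layer.c_str(), w);
    }
};

// Called once per frame: smooth the raw trigger value a little so the arm
// doesn't jitter, then use it as the blend weight between the idle pose
// and the full-reach pose.
void updateReach(const Controller& pad, Animator& anim,
                 float& reachWeight, float deltaTime) {
    const float target    = std::clamp(pad.leftIndexTrigger, 0.0f, 1.0f);
    const float smoothing = 10.0f; // higher = snappier response
    reachWeight += (target - reachWeight) * std::min(1.0f, smoothing * deltaTime);
    anim.setBlendWeight("LeftArmReach", reachWeight);
}
```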

So we have continually been trying to ensure the player avatar behaves in this way.  In our first demos the player was locked down to a single location, so it was relatively easy: you knew the approach path the reach would take, which enabled very fine tuning in the editor to ensure it looked just right.  However, when we re-developed the game to allow the player to move around freely, the animation system had to work when reaching for assets from any position in the scene.  This was a tough nut to crack and has been what the majority of the last month has been about, but we got there.  We are now thinking of moving this forward one more step and introducing physical key objects.  This changes everything, as an asset could end up in any position/rotation in the scene.  The final system may be slightly less accurate than the original stationary grabbing animations, but the wins from physicality are just too great to dismiss.
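To give a flavour of what "reaching from any position" involves, here's a small sketch of the standard two-bone (law-of-cosines) IK solve that this kind of system typically sits on.  The Vec3 type and function names are illustrative, not our actual implementation; note the target is clamped to maximum reach so the arm can't hyper-extend when a physics object drifts slightly too far away:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3  operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    float length() const { return std::sqrt(x * x + y * y + z * z); }
};

// Given the shoulder position, the upper/lower arm lengths, and a grab
// target that can now be anywhere in the scene, return the elbow bend
// angle (radians) that places the hand at the target.
float solveElbowAngle(const Vec3& shoulder, const Vec3& target,
                      float upperLen, float lowerLen) {
    const float maxReach = upperLen + lowerLen;
    float dist = (target - shoulder).length();
    dist = std::clamp(dist, 0.01f, maxReach - 0.001f); // never hyper-extend

    // Law of cosines: interior elbow angle for this shoulder-to-target distance.
    float cosElbow = (upperLen * upperLen + lowerLen * lowerLen - dist * dist)
                   / (2.0f * upperLen * lowerLen);
    cosElbow = std::clamp(cosElbow, -1.0f, 1.0f);
    return std::acos(cosElbow); // pi = straight arm, smaller = more bent
}
```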

Ultimately, we see this extra time spent on the arm/hand animations as something that will hopefully become one of Private Eye's strongest points.  It has sprouted many new challenges and some tough gameplay decisions (the target object is out of reach and you can't move closer - then what!?), but we hope these can be overcome in a way that preserves the level of presence we are trying very hard to achieve.  Realistic grabbing of physical objects coming very soon (hopefully)!
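For that out-of-reach case, one possible (purely illustrative) fallback is to check the target against maximum reach before committing to the grab, and surface a hint rather than stretching the arm unnaturally or falling back to the teleport:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; }; // same minimal vector as the sketches above

static float distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Hypothetical pre-grab test: refuse the grab if the object can't be reached.
bool tryBeginGrab(const Vec3& shoulder, const Vec3& target, float maxReach) {
    if (distance(shoulder, target) > maxReach) {
        std::puts("Too far away"); // stand-in for an in-game UI hint
        return false;              // leave the arm in its idle pose
    }
    return true; // within reach: hand off to the reach/IK system
}
```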

Thanks! Jake