Week 4, Part 3: 9 - 16 March [Kinect Prototypes]

I began building basic prototypes to test functionality, using the example scenes provided in the Windows example package as a starting point to understand how the data is collected and used.
The package comes with documentation explaining top-level user interaction with the sensor, and the example scenes provide working reference implementations.

Throughout development of each test scene I found several incompatibility issues. A significant number of the commands and Unity features used by the Kinect v1 package are deprecated. In addition, Unity would often crash when playing a scene. From reading archived forums I found that this is a common problem in newer versions of Unity. After installing earlier Unity versions and following forum consensus, I found Unity 3.2 to be the most reliable (it still occasionally crashes, but far more rarely). I was unable to trace the cause of the crashes, though they are likely the result of a memory leak within the example project.

My first attempts were scenes where the user interacts directly with other game objects. I initially used joint tracking to produce a basic skeleton, then moved on to using the depth sensor to give a point-cloud representation of the user. The point cloud not only allowed more accurate interactions, but also shows a recognisable figure in-game – which, if used at an exhibition, may be more engaging than a generic skeleton.
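As a rough illustration of how a depth frame becomes a point cloud, the sketch below back-projects each depth pixel through a pinhole camera model. The intrinsics (`FX`, `FY`, `CX`, `CY`) are assumed placeholder values, not the real Kinect calibration, and the function names are mine rather than anything from the SDK.

```python
import numpy as np

# Assumed intrinsics loosely shaped like a 320x240 Kinect v1 depth stream;
# real values would come from the sensor's calibration.
FX, FY = 285.0, 285.0        # focal lengths in pixels (placeholder)
CX, CY = 160.0, 120.0        # principal point (placeholder)

def depth_to_point_cloud(depth_mm):
    """Back-project a depth image (millimetres) into camera-space points.

    Returns an (N, 3) array of XYZ points in metres, skipping pixels
    with no depth reading (value 0).
    """
    h, w = depth_mm.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float64) / 1000.0   # mm -> metres
    valid = z > 0
    x = (us - CX) * z / FX
    y = (vs - CY) * z / FY
    return np.stack([x[valid], y[valid], z[valid]], axis=1)
```

In Unity the resulting points would then be rendered as a particle system or mesh so the user sees their own silhouette in the scene.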

The first scene is a simple game where the user must keep the ship afloat. I included the airship model, to which I added Rigidbody and Collider components. If the user's point cloud enters the collider, a force is applied, lifting the ship into the air with a random degree of horizontal movement. If the ship's y-axis position drops too low, the ship is destroyed and another spawned.
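The core loop of that mechanic can be sketched outside Unity as a simple Euler integration: gravity pulls the ship down, a point-cloud hit applies an upward impulse with a random sideways nudge, and falling below a threshold triggers a respawn. All constants here are illustrative, not taken from the project.

```python
import random

GRAVITY = -9.81
LIFT_FORCE = 25.0          # upward impulse when the point cloud hits the ship
HORIZONTAL_JITTER = 3.0    # random sideways component of the hit
KILL_HEIGHT = -5.0         # below this, the ship is destroyed and respawned

class Ship:
    def __init__(self):
        self.respawn()

    def respawn(self):
        self.pos = [0.0, 0.0]   # (x, y)
        self.vel = [0.0, 0.0]

    def hit_by_point_cloud(self):
        # Upward force plus a random horizontal nudge, as in the scene.
        self.vel[1] += LIFT_FORCE
        self.vel[0] += random.uniform(-HORIZONTAL_JITTER, HORIZONTAL_JITTER)

    def step(self, dt):
        # Basic Euler integration standing in for Unity's physics step.
        self.vel[1] += GRAVITY * dt
        self.pos[0] += self.vel[0] * dt
        self.pos[1] += self.vel[1] * dt
        if self.pos[1] < KILL_HEIGHT:
            self.respawn()
```

In the actual scene, Unity's Rigidbody handles the integration and the hit is detected via the collider, but the behaviour is the same shape as this sketch.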

Depth tracking to interact with game objects.
The second scene uses the same depth-generated point cloud as the first for visual representation, but tracks user joints for the gesture functionality. The user's gestures affect the rotation of the model. This could be improved by either 'lerping' the ship's transform or 'slerping' the ship's rotation to smooth transitions.
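For reference, the smoothing mentioned above amounts to interpolation each frame. A minimal sketch of lerp (for positions) and quaternion slerp (for rotations, in the spirit of Unity's `Quaternion.Slerp`) follows; the quaternion convention `(w, x, y, z)` is my assumption here.

```python
import math

def lerp(a, b, t):
    """Linear interpolation between two scalars (or applied per component)."""
    return a + (b - a) * t

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                      # flip one input to take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)
    theta = math.acos(dot)
    if theta < 1e-6:                   # nearly identical: lerp is fine
        return tuple(lerp(a, b, t) for a, b in zip(q0, q1))
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))
```

Calling this each frame with a small `t` (e.g. `t = speed * dt`) eases the ship towards the gesture-driven target rotation instead of snapping to it.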

Gesture controls.
The third, and most promising, scene attempts to create a 'virtual 3D' perspective by altering the scene camera based on the user's head position. I initially considered using the Kinect data to affect the camera matrix directly, but realised that by including a tracked skeleton within the Unity scene, I could update the camera's position to that of the tracked skeleton's head 'joint'.

Joint tracking to alter perspective.
My test scene originally included only the airship model, which did demonstrate the shifting camera position but did not make the effect feel very significant.

To exaggerate the effect, I added objects (with exclusively 90-degree angles) extending along the z-axis, and added a grid texture to the background planes – both of which provide reference points so the user can see how much the camera moves relative to the scene.

I had only been testing the virtual 3D scene myself, and recognised that if the camera was positioned higher or lower, or if someone else tested the scene, the y-axis values were never within an appropriate range to experience the full shift in perspective.

To overcome this, when the user is first detected by the Kinect, the y-axis position of their head 'joint' is recorded and treated as zero relative to the camera's y-axis position. While this resolved the initial problem, it does mean that users should enter the Kinect's view slightly crouched, so that they can then stand fully to raise the view and see over the airship.
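The calibration described above can be sketched as follows: the first head position seen becomes the zero point, and every later frame positions the camera by the offset from it. The class name, `base_height`, and `scale` are illustrative assumptions, not names from the project.

```python
# Minimal sketch of head-height calibration, assuming head positions arrive
# as (x, y, z) tuples in metres from the skeleton tracker.

class HeadTrackedCamera:
    def __init__(self, base_height=1.2, scale=1.0):
        self.base_height = base_height   # camera y when the user first appears
        self.scale = scale               # real-world metres -> scene units
        self.y_offset = None             # captured on first detection

    def update(self, head_pos):
        """Return the camera position for this frame's head position."""
        x, y, z = head_pos
        if self.y_offset is None:
            # First detection: record the head height as the zero point.
            self.y_offset = y
        cam_y = self.base_height + (y - self.y_offset) * self.scale
        return (x * self.scale, cam_y, z * self.scale)
```

This also makes the crouch trick explicit: entering the frame crouched sets a low zero point, so standing up yields a positive offset and lifts the view.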

During production of the three prototypes I also noted instances where the Kinect would automatically adjust its viewing angle at inappropriate times, pivoting up or down during play. From reading I know it is possible to limit the Kinect's allowed range of motion, so in future I will restrict this while a scene is playing.

Having dedicated some time to exploring the implementation of the Kinect, I no longer think it suitable for the airship exhibition. Compatibility issues and deprecated packages make it relatively unreliable, which will slow development and would cause significant problems if a failure occurred at the exhibition itself.
