Week 9: 13 - 20 April [Raycast Movement, Project Polish]


Venue Change

Basil emailed to advise that the original venue is no longer able to host the event. He has already begun arranging a replacement venue, which will be a large, open area – though it will be entirely indoors.

We had been developing with the understanding that we had all the vertical space and floor area within the hangar, as well as the exterior, since the hangar doors would be open for the exhibition.

The new venue has significantly less overhead space, and the interior will contain all visitors at all times, which may make my current app design (navigating the scene through physical movement) impractical and liable to interfere with other displays/people at the event.

This necessitates a design change. I will keep the fundamental design of the app and experiment with alternative methods of moving within the scene, replacing physical movement with a user action that selects predefined viewpoints within the scene.

Responding to Change

Movement

Of the possible solutions, selecting a new position to move to with a raycast from the controller would be the most intuitive (versus a separate button or a new input).
I used markers to denote the positions the player could move between, and added particle systems to draw further attention to the available positions. Thanks to the lessons learned with particle systems during week 5, this was a very short task by comparison.
Testing the size/position of the marker colliders was then necessary to ensure usability.
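
A minimal sketch of the selection logic, assuming a transform placed at the controller tip and a "MoveMarker" tag (both illustrative names rather than the project's actual identifiers):

```csharp
using UnityEngine;

// Sketch: cast a ray from the controller tip and detect movement markers.
public class MarkerSelector : MonoBehaviour
{
    public Transform controllerTip;   // empty object placed at the controller's tip
    public float maxDistance = 20f;   // illustrative selection range

    void Update()
    {
        Ray ray = new Ray(controllerTip.position, controllerTip.forward);
        RaycastHit hit;

        // The marker colliders give the ray something to hit; their
        // size/position is what the usability testing above tuned.
        if (Physics.Raycast(ray, out hit, maxDistance)
            && hit.collider.CompareTag("MoveMarker"))
        {
            Debug.Log("Selected marker: " + hit.collider.name);
            // ...trigger the move to hit.collider.transform.position
        }
    }
}
```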

This method of navigating the scene eliminates the possibility of the ground plane detection or GPS-based movement that would have functioned before, since the perspective is changed instantly without device movement. The ‘Spawn Airship’ controller button will also be removed, as that feature depends on either plane detection or GPS position.

User movement between markers.
To preserve some sense of movement, and to let the user know where they are relative to the airship and remain aware of its scale, I changed this to a transitional movement, lerping the AR camera's position from its current position to the desired one.
Through a script I rotated the camera about the Y-axis to keep an element of the airship model in view at all times.
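
A sketch of that transition as a coroutine, assuming the script sits on the camera (or its parent) and holds a reference to the airship; names and timings are illustrative:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: lerp between the current and desired positions while rotating
// about the Y-axis so the airship stays in view throughout the move.
public class CameraTransition : MonoBehaviour
{
    public Transform airship;     // the airship model to keep in view
    public float duration = 2f;   // seconds per transition

    public IEnumerator MoveTo(Vector3 target)   // StartCoroutine(MoveTo(marker.position))
    {
        Vector3 start = transform.position;
        float t = 0f;

        while (t < 1f)
        {
            t += Time.deltaTime / duration;
            transform.position = Vector3.Lerp(start, target, t);

            // Y-axis only: aim at the airship while ignoring the height
            // difference, so the horizon stays level during the move.
            Vector3 toAirship = airship.position - transform.position;
            toAirship.y = 0f;
            if (toAirship.sqrMagnitude > 0.001f)
                transform.rotation = Quaternion.LookRotation(toAirship);

            yield return null;
        }
    }
}
```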

Feedback from testers showed they were confused about what was happening and why their input to the scene view was being ignored.

Indicating User Action

Once the movement was in place, I tested the functionality of the remainder of the scene. While functional, it wasn't clear whether the button press was effecting a change in the environment.

Highlighting the selection to make the user's input more easily recognised was a clear choice. I trialled the use of a shader to highlight the object's border; I had to research how to do this, as I had never attempted a shader before.
Using information from various tutorials and documentation, I began making a shader which would give the selected model a glowing outline.
I was surprised by how little feedback shaders give you in response to code, only showing errors at compile time.


Border shader applied to airship
Testing use of border shader



The effect was applied successfully to shapes whose geometry on every axis is proportionate to the centre mass of the object. Consequently, the effect distorted the outline of the airship model, and when I attempted to alter the shader to overcome this, the resulting border thickness scaled so drastically that selected gondolas appeared as solid objects (no windows). I chose the detail of the model over the highlight effect.
I found the best, and also simplest, way of showing the selected object was to change the colour of the object's material.
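
A minimal sketch of the highlight, assuming for the moment a single-material object (the multi-material complication is covered below):

```csharp
using UnityEngine;

// Sketch: tint the material on selection, restore the original colour after.
public class SelectionHighlight : MonoBehaviour
{
    public Color highlightColour = Color.yellow;

    Renderer rend;
    Color originalColour;

    void Awake()
    {
        rend = GetComponent<Renderer>();
        originalColour = rend.material.color;   // .material instantiates a per-object copy
    }

    public void SetSelected(bool selected)
    {
        rend.material.color = selected ? highlightColour : originalColour;
    }
}
```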


Object highlight

R34 3D Model

Unfortunately, when making the model I was oblivious to the importance of how materials were applied. When attempting to assign alternate materials within the Unity scene, only portions of the intended model would be affected. After much reading I learned the cause: multiple materials exist on the same object, so it was a simple matter of accessing each material in the array to alter them all.
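
The fix, sketched: Renderer.materials returns the full array of materials on the object, so each element must be altered for the change to cover the whole mesh.

```csharp
using UnityEngine;

public static class MaterialUtil
{
    // Recolour every material on the renderer, not just the first.
    public static void SetColourOnAllMaterials(Renderer rend, Color colour)
    {
        foreach (Material mat in rend.materials)
        {
            mat.color = colour;
        }
    }
}
```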

While trying to identify the cause of these issues, I revisited my model. Although this didn't resolve my confusion, I was able to amend the model to add doors to the main front gondola and separate engine propellers, both of which can be animated to give more depth to the application.

Incorrect materials application.

Vuforia Issues

One significant issue I experienced during testing was that none of my switch statement cases were being entered, despite seeing the raycast pass directly through another model.
After rewriting code and many attempts to rectify this, I found the cause: while the ARCamera prefab offers the choice of attributes and child objects being assigned relative to world space or screen space, the camera itself is repositioned to the world origin when play starts. This meant my view on play was not what I had intended, and the line renderer I was drawing from the camera position did not correspond to the raycast coming from the controller tip.
Once I was aware of this behaviour, I made an empty game object and made the ARCamera a child of it. This gave me control over the parent object, allowing me to update the ARCamera's position to that of its parent and move it where I wished.
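
A sketch of that workaround, with illustrative names: the empty parent acts as a rig, and the camera is pinned to it each frame.

```csharp
using UnityEngine;

// Sits on the empty parent object; arCamera is the Vuforia ARCamera child.
public class CameraRig : MonoBehaviour
{
    public Transform arCamera;

    void LateUpdate()
    {
        // LateUpdate runs after other scripts, so the camera ends each
        // frame matched to the rig, wherever the rig has been moved.
        arCamera.position = transform.position;
        arCamera.rotation = transform.rotation;
    }
}
```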

Testing has also highlighted the limited effective range of the virtual buttons. The Vuforia documentation advises that virtual buttons (approximately 5” in size) should give reliable responses up to 2’ from the device camera. My own tests have shown behaviour becomes inconsistent after approximately 40cm, though if the controller image is first recognised within that 40cm range, its distance from the camera can then be increased to approximately 50cm before tracking becomes too inaccurate to use. While not a crucial problem, I have occasionally seen this cause frustration in users, and some awkward poses in order to recapture the controller. I have considered enlarging the controller to increase the effective detection range; however, a bigger controller would not only be difficult to hold, but also impractical in the scene, as it would hide even more of the game view.


Now that more features have been added, the last object to be recognised and rendered by the ARCamera (the controller) has the least priority. This has contributed to the interference seen in the image below, which is primarily caused by objects competing for a similar Z-axis position (the controller cannot be altered, as virtual buttons are permanently locked to the Z-axis of the database image at which they are created). Altering the camera's near and far clip planes can reduce the clash, but does not solve it.
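
Where it helps, the clip planes can be narrowed from a script rather than in the Inspector; a small sketch, with illustrative values that would need tuning per scene:

```csharp
using UnityEngine;

// Attach to the AR camera: narrowing the near/far range wastes less
// depth-buffer precision, reducing (but not solving) the Z-axis clash.
public class ClipPlaneSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.05f;   // illustrative values only
        cam.farClipPlane = 50f;
    }
}
```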




Next Week

The most troubling piece of feedback received this week related to the updated user movement. Users have advised that without being able to look around the scene themselves, it feels like a ‘click-adventure’ and all the engagement the interactivity brings is lost.
For this reason, next week I will work towards including Google Cardboard within the project, to let users look around freely as before.
