Week 8, Part 1: 6 - 13 April [Vuforia Controller]

As discussed in the last blog post, using the Leap Motion with this application design proved impractical, though I intend to preserve as much of the interaction mechanic I developed as possible.

Creating a controller that uses virtual buttons, as tested previously, would give the same result as raycasting from the Leap Motion hand prefab.
I trialled several prototype designs – ranging from a simple A5 piece of patterned paper, through a paper cube, to a tube-shaped roll of paper.
The latter was the most promising design: it gives the user a handle to hold the controller, leaving the top half of the tube free for virtual buttons that can trigger a variety of functions.
For the exhibition, I was keen to make the controller cheap and easy to replicate, and paper was an ideal material.

The first controller attempt went unrecognised by Vuforia. Building on my earlier research into recognition issues, I believed this was because the rolled paper surface presented to the ARCamera was not flat, skewing the database feature points.
The test was repeated after folding the paper into a rectangular cuboid. This confirmed my assumption as the database image was now recognised.
After repeated testing, I found that handling the paper deformed the structure and distorted the feature points as before. To solve this, I gave the paper a more robust structure, measuring it and spacing the columns of images so that, when folded, it produced a triangular prism. This shape presented a more reliably flat surface and eliminated any further deformation during testing.






When designing the controller, I ensured the selected database image was rated 5-star, so that it would be recognised as accurately as possible, given that users may move the controller erratically and at varying distances from the camera.

Repeating database pattern
Although rated 5-star, the first image design caused the augmented controller to continuously snap to positions down the length of the paper controller. Further testing revealed this was due to the repeating pattern on the controller: as it was moved and rotated, Vuforia would lose tracking of the database image for fractions of a second (normally unnoticeable), but would then reacquire it incorrectly, locking onto the top-most ‘R34’ text and using that as the position to instantiate the augmented controller.
I selected new images, taking care not to create another repeating pattern, to allow the database image to be tracked from a range of angles, and to ensure a high concentration of image features at the virtual button locations. I was also careful not to select images, or portions of images, used elsewhere in the project, as this could result in an incorrect model being instantiated, as I learnt on a few careless occasions.


Demonstration of controller and buttons
Final controller database image design.



A virtual button was added to the controller that, when pressed, raycasts out from the controller, detecting models in its path.
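In the Vuforia Unity SDK, a button like this is wired up through the `IVirtualButtonEventHandler` interface. Below is a minimal sketch of the approach rather than my exact script; names such as `ControllerRaycastButton`, `controllerTip`, and the 10 m range are placeholders of my own:

```csharp
using UnityEngine;
using Vuforia;

// Sketch: a virtual button on the paper controller that, when pressed,
// raycasts forward from the controller to detect models in its path.
public class ControllerRaycastButton : MonoBehaviour, IVirtualButtonEventHandler
{
    public Transform controllerTip;   // empty child marking the ray origin
    public float maxDistance = 10f;   // assumed range

    void Start()
    {
        // Register for press/release events from the button on this target
        var button = GetComponentInChildren<VirtualButtonBehaviour>();
        button.RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonBehaviour vb)
    {
        // Cast a ray out of the top of the controller
        if (Physics.Raycast(controllerTip.position, controllerTip.forward,
                            out RaycastHit hit, maxDistance))
        {
            Debug.Log("Controller ray hit: " + hit.collider.name);
        }
    }

    public void OnButtonReleased(VirtualButtonBehaviour vb) { }
}
```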
After some testing, a second button was added to spawn another airship. Because ground plane tracking is used to instantiate the ship model, it is possible that, if tracking is lost or the user moves too far from the starting location, the ship will end up positioned at a distance from the user that makes interaction difficult.
By editing one of the example scripts Vuforia provides, I was able to trigger the spawning of a new ship on interaction with the virtual button, and to limit the maximum number of loaded airship models to one to prevent overcrowding.
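The spawn logic can be sketched in the same style: destroy any existing ship before instantiating a new one, so at most one is ever loaded. This is an illustrative reconstruction, not the edited Vuforia sample itself; `airshipPrefab` and the 3 m spawn distance are assumptions:

```csharp
using UnityEngine;
using Vuforia;

// Sketch: a virtual button that respawns the airship in front of the
// device, enforcing a maximum of one loaded ship.
public class SpawnShipButton : MonoBehaviour, IVirtualButtonEventHandler
{
    public GameObject airshipPrefab;  // assumed field name
    private GameObject currentShip;

    void Start()
    {
        GetComponentInChildren<VirtualButtonBehaviour>().RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonBehaviour vb)
    {
        // Enforce the limit of one airship to prevent overcrowding
        if (currentShip != null)
            Destroy(currentShip);

        // Place the new ship a few metres in front of the camera
        Transform cam = Camera.main.transform;
        currentShip = Instantiate(airshipPrefab,
                                  cam.position + cam.forward * 3f,
                                  Quaternion.identity);
    }

    public void OnButtonReleased(VirtualButtonBehaviour vb) { }
}
```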


When testing these buttons, I found one to be unresponsive. After confirming that the attached code was correct, I turned to the Vuforia documentation and forums to identify an issue with the button itself. I learnt that virtual buttons can be assigned interaction layers, and that this button's layer was out of range of real-world input. Adjusting the layer quickly rectified this.
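For reference, the same fix can also be applied from code by moving the button's GameObject onto a layer the camera actually interacts with. This is only a sketch; the layer name depends on the project setup, and in my case the default layer sufficed:

```csharp
// Sketch only (inside a MonoBehaviour's Start()): ensure the virtual
// button sits on a layer within range of real-world input.
// "Default" stands in for whatever layer the project uses.
var button = GetComponentInChildren<Vuforia.VirtualButtonBehaviour>();
button.gameObject.layer = LayerMask.NameToLayer("Default");
```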

I also noticed that the reliability of the buttons decreases greatly as distance from the device camera increases. From further reading and forum advice, there is no solution or workaround for this; it is a limitation of the software.
I will need to remain aware of this when designing further controller elements.


Due to the necessarily large size of the airship relative to the player, one significant issue that repeatedly arose in testing was that users would tilt the device upwards to look at the airship. This prevents the device from tracking the ground plane, causing the model to move relative to the device rather than relative to the real world.
This issue occurred too frequently to ignore. To address the lost ground plane tracking, I will refine my script that uses the device's GPS position to move the model. This approach will also allow other Android devices to better experience the application, as the more reliable ground plane detection is limited to Samsung Galaxy models, S6 and newer.
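A rough shape for that GPS fallback, using Unity's location service: treat the ship's spawn point as a geographic anchor and convert the device's latitude/longitude drift into a metre offset. This is a sketch under assumptions, not the refined script itself; the field names are mine, and the equirectangular conversion (about 111,320 m per degree of latitude) is only valid over short distances:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: counter the device's GPS movement so the ship appears to
// stay fixed in the world when ground plane tracking is unavailable.
public class GpsAnchor : MonoBehaviour
{
    public Transform ship;            // assumed reference to the airship
    private float startLat, startLon; // device position at spawn time

    IEnumerator Start()
    {
        Input.location.Start();
        // Wait for the location service to come up
        while (Input.location.status == LocationServiceStatus.Initializing)
            yield return new WaitForSeconds(1f);
        startLat = Input.location.lastData.latitude;
        startLon = Input.location.lastData.longitude;
    }

    void Update()
    {
        if (Input.location.status != LocationServiceStatus.Running) return;

        float lat = Input.location.lastData.latitude;
        float lon = Input.location.lastData.longitude;

        // Equirectangular approximation: degrees of drift to metres
        float dz = (lat - startLat) * 111320f;                    // north
        float dx = (lon - startLon) * 111320f
                   * Mathf.Cos(startLat * Mathf.Deg2Rad);         // east

        // Treat the spawn point as the origin and offset the ship
        // opposite to the device's movement.
        ship.position = new Vector3(-dx, ship.position.y, -dz);
    }
}
```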
