Week 6: 23 - 30 March [Plane Detection, GPS, User Movement]

After implementing a physically static scene, triggered upon recognition of a database image, I began the week by continuing to research how to generate a 3D model, positioned relative to world space, that the user can physically walk around and interact with, to increase the application's interactive potential.

This implementation would also allow multiple users to use the application independently and at the same time, without needing to crowd around a single database image or mass-produce copies of it.

While as a group we had played with the idea of adding image detection to leaflets, any images and virtual buttons used would need the maximum recognition rating while still fitting a layout suitable for the museum's needs; both constraints could interfere with the museum's existing plans.

Ground Plane Detection

Vuforia heavily advertises ground plane detection as one of its most powerful features. From the resources made available online, I learnt that the platform offers object permanence relative to real-world position.
The device recognises a horizontal surface, uses that surface and the surrounding features to define a location, and then, as the device's perspective moves relative to those features, adjusts the instantiated model by the same amount so it appears fixed in the world.
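
As a rough sketch of how this anchoring is wired up in Unity: Vuforia's Ground Plane API provides a PlaneFinderBehaviour that raises a hit-test event when the user taps a detected plane, and a ContentPositioningBehaviour that anchors content at the resulting pose. The class and handler names below are my own illustration, not the project's actual script.

using UnityEngine;
using Vuforia;

// Illustrative sketch: anchor a model on the detected ground plane.
// Assumes a PlaneFinderBehaviour in the scene whose "On Interactive
// Hit Test" event is wired to OnHitTest, and a ContentPositioningBehaviour
// configured with the airship model as its anchor stage content.
public class PlaceOnGroundPlane : MonoBehaviour
{
    public ContentPositioningBehaviour contentPositioning;

    // Called by the PlaneFinderBehaviour when the user taps a detected plane.
    public void OnHitTest(HitTestResult result)
    {
        // Places an anchor at the hit pose and parents the configured
        // content to it, so the model stays fixed relative to the world.
        contentPositioning.PositionContentAtPlaneAnchor(result);
    }
}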

After attempting to recreate the guided example multiple times, I referred to the Vuforia documentation. Unfortunately, this feature is currently only supported on the Samsung Galaxy S7 or newer and the iPhone 6 or newer. This meant having to borrow a device to deploy test builds to, limiting my access while developing the method.

I tested the scene on S7 and S8 devices, with varying degrees of success. The ground plane must remain in view of the device camera at all times for accurate tracking; if line of sight is lost, the object drifts relative to the device view instead of staying anchored to the world.

An anomaly I noticed while testing was how the phone's battery level affected performance. As soon as either device entered critical battery, plane detection would fail and remain lost until the device was recharged, regardless of other factors.

Until this point, all tests had been carried out with a small model to confirm functionality. I moved on to testing a life-size model but found that testers would immediately lose plane tracking.

Scaling the airship so that the user can walk through the gondola places the main compartment a significant distance off the ground. This made every playtester look up, losing the plane. If the user does not keep the ground plane in view of the device camera while moving, the position of the airship remains relative to the device rather than the world.
Using this method for a model of this size requires a large space (without walls) so the user can approach and interact with it while keeping the ground in view. Because we have not yet seen the event location, and because requiring a recent Samsung Galaxy is prohibitive, I will revert to my earlier implementation of positioning the model using the device's GPS position, giving all attendees the same opportunity to experience the application.



[Figure: early implementation of GPS data to move the airship; an erratic spike in the distance value loses the ship.]

GPS-Based Model Position

Building on my earlier implementation of GPS positioning, I attempted to remove the erratic readings which often interfered with moving the airship smoothly. I implemented a basic algorithm that takes the average of the incoming data values, then overwrites the existing value only when the result is within a suitable tolerance.
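
What follows is a minimal sketch of that averaging approach, assuming samples are fed in from Unity's LocationService (Input.location.lastData); the window size, tolerance, and flat-earth distance approximation are my own illustrative choices, not the project's actual tuning.

using System.Collections.Generic;
using UnityEngine;

// Sketch: average a window of recent GPS samples and only accept the
// result when it falls within a tolerance of the value currently in use,
// discarding erratic spikes in the incoming data.
public class GpsSmoother : MonoBehaviour
{
    const int WindowSize = 5;          // samples per average (assumed)
    const float ToleranceMeters = 10f; // max accepted jump (assumed)

    readonly Queue<Vector2> samples = new Queue<Vector2>();
    Vector2 smoothed;                  // latitude/longitude pair in use

    public Vector2 Smoothed { get { return smoothed; } }

    public void AddSample(float latitude, float longitude)
    {
        samples.Enqueue(new Vector2(latitude, longitude));
        if (samples.Count > WindowSize)
            samples.Dequeue();

        // Average the current window.
        Vector2 sum = Vector2.zero;
        foreach (var s in samples)
            sum += s;
        Vector2 average = sum / samples.Count;

        // Overwrite the working value only while the window is still
        // filling, or when the change is within tolerance.
        if (samples.Count < WindowSize ||
            ApproxDistanceMeters(smoothed, average) < ToleranceMeters)
        {
            smoothed = average;
        }
    }

    // Rough equirectangular distance; adequate over the short ranges here.
    static float ApproxDistanceMeters(Vector2 a, Vector2 b)
    {
        float dLat = (b.x - a.x) * 111320f;
        float dLon = (b.y - a.y) * 111320f * Mathf.Cos(a.x * Mathf.Deg2Rad);
        return Mathf.Sqrt(dLat * dLat + dLon * dLon);
    }
}

Each frame, the latest latitude and longitude would be passed into AddSample, and the smoothed pair used to position the airship relative to the user.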

This has improved the reliability of the method, though significant spikes in the data still persist.

Limiting Instantiated Objects

Occasionally the instantiated model would be lost: either the tester would walk too far in a particular direction and not want to return to the starting location, or an erratic GPS reading would push the airship out of view.
There were also occurrences where a player would instantiate many airship models in very close proximity, making the scene practically unusable.
To overcome these issues, I edited one of the supplied Vuforia scripts, which spawns a model on user tap, so that it first checks whether another model is already instantiated and, if so, deletes it before creating the new one.
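
The change amounts to a single-instance check before spawning. A hedged sketch of the idea follows; the class and field names are mine, not those of the supplied Vuforia script.

using UnityEngine;

// Sketch: keep at most one airship in the scene, assuming the spawn is
// driven by a tap handler elsewhere that supplies the placement pose.
public class SingleAirshipSpawner : MonoBehaviour
{
    public GameObject airshipPrefab;
    GameObject currentAirship;

    public void SpawnAt(Vector3 position, Quaternion rotation)
    {
        // Destroy any existing model first so only one airship ever
        // exists, preventing overlapping duplicates and letting a
        // "lost" model be recovered with a fresh tap.
        if (currentAirship != null)
            Destroy(currentAirship);

        currentAirship = Instantiate(airshipPrefab, position, rotation);
    }
}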

Testers responded well to this addition and seemed to find the control intuitive, as no explanation was requested.




References:
  • Library.vuforia.com. (2018). Ground Plane User Guide. [online] Available at: https://library.vuforia.com/articles/Training/ground-plane-guide.html [Accessed 27 Apr. 2018].
  • Library.vuforia.com. (2018). Introduction to Ground Plane in Unity. [online] Available at: https://library.vuforia.com/articles/Solution/ground-plane-guide.html [Accessed 27 Mar. 2018].
  • Technologies, U. (2018). Unity - Scripting API: LocationService.Start. [online] Docs.unity3d.com. Available at: https://docs.unity3d.com/ScriptReference/LocationService.Start.html [Accessed 28 Mar. 2018].
  • Vaughan-Nichols, S. (2018). How Google--and everyone else--gets Wi-Fi location data | ZDNet. [online] ZDNet. Available at: https://www.zdnet.com/article/how-google-and-everyone-else-gets-wi-fi-location-data/ [Accessed 28 Mar. 2018].


