Week 7, Part 2: 30 March - 6 April [Leap Motion, Prototype]
Thinking of possible uses of the Leap Motion in the
application, I ruled out a pop-up menu as this could only be used to trigger
scene changes – not the level of engagement the project requires.
Despite this, I was still able to draw inspiration from the ‘Hover-UI-Kit’ menu I experimented with in Unity in the previous blog entry: using a gesture to activate a ‘mode’, then more intuitive movements within that mode for further functionality.
The idea I decided to develop was using the hand to select parts of the airship and display related information about them. This could be done with relative ease by raycasting out from the hand prefab to detect colliders on the ship model.
I used the last bone in the index finger as the origin of the ray, casting the ray forward in the direction of the bone's Z-axis. Once this functionality was implemented it needed to be made toggleable, triggered by a user gesture.
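To illustrate the raycast itself, here is a minimal sketch, assuming the transform of the index finger's distal bone from the Leap hand prefab is assigned in the inspector (the class and field names are my own, not Leap Motion API):

using UnityEngine;

// Minimal sketch of the fingertip raycast. indexDistalBone is assumed to
// be the last bone of the index finger on the Leap hand prefab, assigned
// in the inspector.
public class FingerRaySelector : MonoBehaviour
{
    public Transform indexDistalBone;
    public float maxDistance = 10f;
    public bool rayEnabled; // toggled by a hand gesture

    void Update()
    {
        if (!rayEnabled || indexDistalBone == null) return;

        // Cast forward along the bone's local Z-axis.
        Ray ray = new Ray(indexDistalBone.position, indexDistalBone.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            // A collider on the ship model was hit; display its related information.
            Debug.Log("Selected ship part: " + hit.collider.name);
        }
    }
}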
Leap Motion provides an example script to detect predefined hand positions. The script works by tracking each digit's tip relative to the palm and wrist positions to detect whether each finger is extended. Each finger's position is tracked continuously and used to update a per-finger enum (extended, not extended, or either). The fingers' states are then used to detect gestures. The enum fields are public, so the target state for each finger can be selected from a drop-down menu in the inspector.
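A rough sketch of how such a detector can be structured, modelled on the example script described above (the Matches helper and its bool[] input are my own simplification; the actual extension states would come from the Leap tracking data each frame, e.g. Finger.IsExtended in the Leap C# API):

using UnityEngine;

// Public enum fields appear as drop-downs in the Unity inspector.
public enum FingerState { Extended, NotExtended, Either }

public class GestureDetector : MonoBehaviour
{
    public FingerState thumb  = FingerState.Either;
    public FingerState index  = FingerState.Extended;
    public FingerState middle = FingerState.NotExtended;
    public FingerState ring   = FingerState.NotExtended;
    public FingerState pinky  = FingerState.NotExtended;

    // Returns true when the tracked extension states match the target
    // states configured in the inspector.
    public bool Matches(bool[] isExtended)
    {
        FingerState[] targets = { thumb, index, middle, ring, pinky };
        for (int i = 0; i < targets.Length; i++)
        {
            if (targets[i] == FingerState.Either) continue;
            if (isExtended[i] != (targets[i] == FingerState.Extended)) return false;
        }
        return true;
    }
}

With the defaults above, the detector matches the point gesture I eventually settled on.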
I initially toggled the ray by making a fist, but through testing this was later improved to use the point gesture itself (triggered when the middle, ring and little fingers are not extended), removing the need for a separate toggle gesture.
I had begun entertaining the possibility of using the Leap Motion when I saw that, contrary to my expectations, it has no explicit requirement of a connection to a computer. Using a USB-C/micro-USB to female USB adapter, the Leap Motion can be connected to the Android device itself.
However, after securing the device to my phone in this way, I was forced to reconsider using the Leap Motion while physically mobile at the exhibition. Every other user would also need to attach the device to their phone or phone case with single-use adhesive. This isn't ideal: it may damage their property, it is relatively expensive for something that cannot be reused, and if attached improperly it may result in damage to the device itself. These factors make the Leap Motion an impractical choice as an application peripheral.
As an alternative to physically moving around the scene, I considered using the hand prefab's position to translate the camera.
Initially, I assumed it would be necessary to compute a vector between parallel joints to calculate the roll, pitch and yaw of the hand prefab.
Luckily, I consulted the Leap Motion documentation before implementing this approach, as it showed that the hand's rotation and pitch can be accessed directly from the instance of the hand prefab. It was then a matter of comparing these values against defined thresholds to trigger movement of the camera.
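A minimal sketch of that comparison, assuming a HandModel from the Leap Motion Core Assets is assigned in the inspector (the threshold and speed values are illustrative, not tuned):

using UnityEngine;
using Leap;
using Leap.Unity;

// Sketch of camera steering from hand orientation. Pitch comes from the
// hand's direction vector and roll from the palm normal, per the Leap
// C# API; the thresholds form the deadzone discussed below.
public class HandSteering : MonoBehaviour
{
    public HandModel handModel;
    public Transform cameraRig;
    public float pitchThreshold = 0.35f; // radians
    public float rollThreshold = 0.35f;
    public float moveSpeed = 2f;
    public float turnSpeed = 45f;

    void Update()
    {
        Hand hand = handModel.GetLeapHand();
        if (hand == null) return;

        float pitch = hand.Direction.Pitch;
        float roll = hand.PalmNormal.Roll;

        // Only respond once the hand tilts beyond the threshold.
        if (Mathf.Abs(pitch) > pitchThreshold)
            cameraRig.Translate(Vector3.forward * Mathf.Sign(pitch) * moveSpeed * Time.deltaTime);
        if (Mathf.Abs(roll) > rollThreshold)
            cameraRig.Rotate(Vector3.up, Mathf.Sign(roll) * turnSpeed * Time.deltaTime);
    }
}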
After implementing this, I found the camera would often not respond immediately to input, or would move along or rotate about unintended axes. After writing the hand's position values to the console to locate the issue, I found that even when the user holds their hand as still as possible, erratic outlier values are returned.
I altered the thresholds to increase the size of the deadzone. This separated the ranges of motion to a suitable degree, preventing small movements of the hand from triggering an unintended command.
Using Leap Motion to steer and select
If I were to continue development with the Leap Motion, I could improve the reliability of its data by averaging recent samples to eliminate noise. Averaging adds latency, however, so testing would be needed to balance accuracy against responsiveness.
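A sketch of the averaging idea (the window size is illustrative; a larger window cuts more noise at the cost of more latency):

using System.Collections.Generic;

// Keeps the last N samples of a tracked value and returns their mean.
public class SampleAverager
{
    private readonly Queue<float> window = new Queue<float>();
    private readonly int size;

    public SampleAverager(int size) { this.size = size; }

    public float Add(float sample)
    {
        window.Enqueue(sample);
        if (window.Count > size) window.Dequeue();

        float sum = 0f;
        foreach (float s in window) sum += s;
        return sum / window.Count;
    }
}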
An alternative implementation would be to apply smoothing to the data: a weighted average function where the further a new sample is from the previous value, the less weight it is given.
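A sketch of that weighting scheme (the falloff constant is illustrative and would need tuning):

using UnityEngine;

// Blends each new sample into the current estimate with a weight that
// shrinks as the sample moves further from the previous value, so
// erratic outliers are largely ignored.
public static class HandSmoothing
{
    public static float Smooth(float previous, float sample, float falloff = 0.5f)
    {
        float distance = Mathf.Abs(sample - previous);
        float weight = 1f / (1f + falloff * distance); // near samples ~1, outliers -> 0
        return Mathf.Lerp(previous, sample, weight);
    }
}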
While the Leap Motion proved to be an effective method of obtaining user input, previously unconsidered issues regarding the practicality of its use make it incompatible with the project I am building.
Using the device would also restrict the number of
potential simultaneous users to the number of Leap Motions brought to the
exhibition.
Raycasting out to detect interactable objects was a far better option than the slow transition of steering the ship to trigger UI, and it could be made to work with my recent Vuforia build, which allows the player to physically move about the scene.
I will attempt to create similar functionality using Vuforia next week.