Week 7, Part 1: 30 March - 6 April [Leap Motion, Research, Testing]
To keep building on the focus of the project, making the application as interactive as possible, I was keen to include Leap Motion as an input method.
When I began using the Kinect, I found that dedicating some time to understanding how the device functions allowed me to approach the problem knowing specifically what the device needed, and to identify potential issues before beginning development.
Similarly with Leap Motion: while I understand what the peripheral does, I am not fully aware of how the device reads and processes data.
Leap Motion key components
From reading, I found that the Leap Motion applies algorithms to the raw sensor data to map hand movements, rather than generating a depth map as the Kinect does.
The device allows users to interact with an application through gestures with a high level of accuracy.
The LEDs project infrared over an 'interaction zone' of roughly 8 cubic feet above the device (about 2 feet above the controller and 2 feet out to either side).
The interaction zone is limited to this size by how infrared light propagates through space: beyond this distance it becomes much harder to determine the hand's position in 3D.
With greater power the effective range could be increased; however, the intensity of the LEDs is limited by the current that can be drawn over a USB connection.
In 2016 Leap Motion released new software, called
‘Orion’, built specifically for VR. This also increased the effective
interaction space above the device by 33%.
As a team, we plan for VR to be central to our application's design later in the project's lifetime, so knowing that Leap Motion is being actively developed with VR in mind is encouraging.
-
After doing this introductory reading to learn how the equipment functions, I moved on to installing the device ready for use.
Leap Motion have made their software easily available and promote its use with Unity. An asset pack is available from the developer portal on their website, which can be imported directly into Unity with ease.
It immediately provides several example scenes, scripts and prefabs which showcase basic functionality.
By default, hands are tracked and displayed in the scene. The displayed hands have a Rigidbody component, so they can interact with other scene objects using basic physics.
Having access to and being able to analyse these scripts was
a huge help in understanding how to implement the functionality I wanted.
I was then able to implement some basic functionality myself.
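As an illustration, the sketch below is representative of my first experiments: it polls the controller for the latest frame each update and logs the palm position of every tracked hand (the class name is my own, and the calls are based on my reading of the v2 C# SDK documentation):

```csharp
using UnityEngine;
using Leap;

// Minimal test script: poll the Leap controller and log palm positions.
public class LeapHandLogger : MonoBehaviour
{
    private Controller controller;

    void Start()
    {
        // The Controller object manages the connection to the Leap service.
        controller = new Controller();
    }

    void Update()
    {
        // Frame() with no arguments returns the most recent tracking frame.
        Frame frame = controller.Frame();
        foreach (Hand hand in frame.Hands)
        {
            // PalmPosition is in millimetres, relative to the device origin.
            Debug.Log((hand.IsLeft ? "Left" : "Right") + " palm at " + hand.PalmPosition);
        }
    }
}
```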
Building and testing these experiments highlighted some potential issues, as
well as solutions for future development.
The first issue I encountered was the orientation and position of the device itself. Initially, hands were tracked accurately, but along the wrong axis. I attempted to correct this by adjusting the prefab rotation in Unity, both in the inspector and in code, though the problem persisted. After searching forums, I realised that the device orientation can be specified (horizontal/vertical) within Unity, depending on whether the device is being used standalone (horizontal) or attached to a headset (vertical).
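For reference, my understanding from the v2 C# SDK documentation is that the head-mounted (vertical) mode can also be requested in code via a policy flag; a minimal sketch:

```csharp
using Leap;

public static class LeapOrientation
{
    // Request optimisation for head-mounted (vertical) use. Without this
    // policy, tracking assumes the controller is lying flat on a desk.
    public static void UseHeadMountedMode(Controller controller)
    {
        controller.SetPolicy(Controller.PolicyFlag.POLICY_OPTIMIZE_HMD);
    }
}
```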
I then found that the Leap can calibrate its Z-axis based on the initial input in a session (so if it lies flat on a desk, it does not matter which way faces forward).
Through testing, later confirmed by reading the Leap Motion documentation, I found that the device cannot be placed facing the user, as the algorithm also tracks forearms to inform where hands are expected to appear. If the device faces the player in a vertical orientation, hand tracking is very unreliable.
Smudges on the device can also hinder the accuracy of detection and tracking, although the device provides notifications warning the user if a smudge is detected. These notifications have so far been reliable (there has always been a smudge when warned of one) and seem to reach the user very quickly once the smudge occurs.
Leap Motion produces a greyscale image as it tracks in near-infrared:
Leap Motion's greyscale image (skeletal hand representations are overlaid)
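From the documentation, these raw sensor images can be requested through an image policy; a minimal sketch of grabbing one greyscale frame (again, the class name is my own):

```csharp
using UnityEngine;
using Leap;

// Retrieve the raw near-infrared camera images from the device.
public class LeapImageGrabber : MonoBehaviour
{
    private Controller controller;

    void Start()
    {
        controller = new Controller();
        // Image data is only delivered once the images policy is enabled.
        controller.SetPolicy(Controller.PolicyFlag.POLICY_IMAGES);
    }

    void Update()
    {
        ImageList images = controller.Frame().Images;
        if (images.Count > 0)
        {
            Image image = images[0];
            // Data is a flat array of 8-bit brightness values (greyscale).
            byte[] pixels = image.Data;
            Debug.Log("Captured " + image.Width + "x" + image.Height + " IR image, " + pixels.Length + " bytes");
        }
    }
}
```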
I found that objects which reflect infrared light can interfere with the detection and tracking of hands and fingers, causing one or both hands to become untracked until the reflection onto the device is removed.
Other infrared sources (halogens and sunlight, to name two) can also negatively affect the accuracy of readings. For the best results, only infrared emitted by the Leap Motion should illuminate the hands being tracked. If additional infrared sources are strong enough, 'ghost hands' can appear and take priority over the user's hands, replacing them in the scene.
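One possible mitigation I noted while reading is the SDK's per-hand confidence rating, which could be used to ignore poorly tracked (and possibly 'ghost') hands; a sketch, with a threshold value that is my own arbitrary choice:

```csharp
using System.Collections.Generic;
using Leap;

public static class HandFilter
{
    // Keep only hands the SDK is reasonably confident about.
    // Confidence ranges from 0.0 (poor data) to 1.0 (well tracked);
    // the threshold passed in is an assumption, not a documented value.
    public static List<Hand> ConfidentHands(Frame frame, float threshold)
    {
        List<Hand> result = new List<Hand>();
        foreach (Hand hand in frame.Hands)
        {
            if (hand.Confidence >= threshold)
            {
                result.Add(hand);
            }
        }
        return result;
    }
}
```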
After these experiments, I looked into what other developers have been able to create using the device. The open-source menu system Hover-UI-Kit (GitHub, 2018) shows how fluid and accurate the device is.
As shown in the GIF above, using gestures to open an interactable menu, tracking individual joints and bones independently, and then allowing natural hand movements to interact with the menu contents seems to be the most intuitive implementation of Leap Motion I have seen. As the application will need to be accessible to a range of users, I will try to apply a similar approach in my own design, inspired by the Hover-UI-Kit.
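As a first step towards a gesture-opened menu in that spirit, the sketch below toggles a menu when the user pinches, using the SDK's pinch strength value (the menu object and the 0.8 threshold are placeholders of my own, not settled design decisions):

```csharp
using UnityEngine;
using Leap;

// Rough sketch: show or hide a menu when the user performs a pinch.
public class PinchMenuToggle : MonoBehaviour
{
    public GameObject menu;          // placeholder: the menu root to show/hide
    private Controller controller;
    private bool wasPinching = false;

    void Start()
    {
        controller = new Controller();
    }

    void Update()
    {
        Frame frame = controller.Frame();
        if (frame.Hands.Count == 0) return;

        // PinchStrength runs from 0 (open hand) to 1 (full pinch).
        bool isPinching = frame.Hands[0].PinchStrength > 0.8f;

        // Toggle only on the frame the pinch begins, not on every frame.
        if (isPinching && !wasPinching)
        {
            menu.SetActive(!menu.activeSelf);
        }
        wasPinching = isPinching;
    }
}
```

Edge-detecting the pinch, rather than toggling while it is held, avoids the menu flickering open and closed on consecutive frames.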
References
- Colgan, A. (2018). How Does the Leap Motion Controller Work?. [online] Leap Motion Blog. Available at: http://blog.leapmotion.com/hardware-to-software-how-does-the-leap-motion-controller-work/ [Accessed 26 Mar. 2018].
- Developer.leapmotion.com. (2018). C# SDK Documentation — Leap Motion C# SDK v2.3 documentation. [online] Available at: https://developer.leapmotion.com/documentation/v2/csharp/index.html [Accessed 27 Mar. 2018].
- Developer.leapmotion.com. (2018). Leap Motion Unity Assets and Plugin — Leap Motion Unity SDK v2.3 documentation. [online] Available at: https://developer.leapmotion.com/documentation/v2/unity/index.html [Accessed 27 Mar. 2018].
- En.wikipedia.org. (2018). Leap Motion. [online] Available at: https://en.wikipedia.org/wiki/Leap_Motion [Accessed 26 Mar. 2018].
- GitHub. (2018). aestheticinteractive/Hover-UI-Kit. [online] Available at: https://github.com/aestheticinteractive/Hover-UI-Kit/wiki [Accessed 25 Mar. 2018].
- Medium. (2018). How Does the Leap Motion Controller Work? – LeapMotion – Medium. [online] Available at: https://medium.com/@LeapMotion/how-does-the-leap-motion-controller-work-9503124bfa04 [Accessed 26 Mar. 2018].