Posts

End of development, 29 April [Email to Basil, Project Evaluation]

Looking back at the project, I am very pleased with how much I was able to learn about technology I was previously unaware of. I believe my own resourcefulness has improved as a result of undertaking the project. With almost every issue I was forced to find the answer myself, as no one else had experience with the technology. Gathering player feedback throughout development was key to improving every element, and highlighted both issues and solutions I would otherwise have been unaware of. Had Basil’s arrangements for the exhibition event been more finalised, I might have been able to concentrate on a specific implementation, but having to change my focus at times throughout the project has given me a wider understanding, and I feel I could now undertake a project using any of the software/devices I have used during this project. If I were to undertake the project again, getting to the final stage would take only a fraction of the time. Having gathered the knowledge, I would be more confident…

Week 10: 20 - 27 April [Google Cardboard]

Using Cardboard
I began by incorporating Google Cardboard into the existing build. Google Cardboard is an inexpensive way to provide a more immersive mixed-reality experience to event visitors. Limited by the mobile device placed within it, the extent of a Cardboard VR experience will be greatly reduced in comparison to an Oculus Rift; however, a small number of headsets could be used by many people over the course of the exhibition. The current GVR (Google VR) SDK provides an emulator prefab within Unity, so the application does not have to be built to the device for testing purposes. This reduced development time hugely. My main focus for the inclusion of the Cardboard headset is user experience. With this in mind, when purchasing the headset I also selected some inexpensive additional items which will assist with the overall experience: cushioned pads, to aid with extended sessions, as the Cardboard corners can irritate the nose; face pads; and a head strap…
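As a rough illustration of the kind of hands-free selection a Cardboard build can rely on, the sketch below casts a ray from the centre of the camera each frame and notifies whatever it is looking at. This is plain Unity physics rather than anything GVR-specific, and the GazeTarget component is a hypothetical placeholder, not part of the actual build.

using UnityEngine;

// Minimal gaze-selection sketch for a Cardboard-style camera (illustrative only).
// Casts a ray from the centre of the main camera each frame and notifies the
// hypothetical GazeTarget component it is currently pointing at.
public class GazeSelector : MonoBehaviour
{
    public float maxDistance = 50f;   // how far the gaze ray reaches
    private GazeTarget current;       // target currently under the gaze

    void Update()
    {
        Ray gazeRay = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        GazeTarget hitTarget = null;

        RaycastHit hit;
        if (Physics.Raycast(gazeRay, out hit, maxDistance))
            hitTarget = hit.collider.GetComponent<GazeTarget>();

        if (hitTarget != current)
        {
            if (current != null) current.OnGazeExit();
            if (hitTarget != null) hitTarget.OnGazeEnter();
            current = hitTarget;
        }
    }
}

// Hypothetical target component: reacts when the user looks at or away from it.
public class GazeTarget : MonoBehaviour
{
    public void OnGazeEnter() { /* e.g. highlight or start a dwell timer */ }
    public void OnGazeExit()  { /* e.g. remove the highlight */ }
}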

Week 9: 13 - 20 April [Raycast Movement, Project Polish]

Venue Change
Basil emailed to advise that the original venue is no longer able to host the event. He has already begun arranging a replacement venue, which will be a large, open area, though it will be entirely indoors. We had been developing with the understanding that we had all of the vertical space and floor area within the hangar, as well as the exterior, since the hangar doors would be open for the exhibition. The new venue has significantly less overhead space, and the interior will contain all visitors at all times, which may make my current app design (navigating the scene through physical movement) impractical and interfere with other displays and people at the event. This necessitates a design change. I will keep the fundamental design of the app and experiment with alternative methods of moving within the scene, replacing physical movement with a user action to select predefined viewpoints within the scene.

Responding to Change: Movement
Of the possible solutions,…
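As a rough sketch of the viewpoint-selection idea described above, the component below moves a camera rig to a predefined viewpoint when the user confirms a selection. The rig reference, the "Viewpoint" tag and the tap/click trigger are illustrative assumptions, not the final design.

using UnityEngine;

// Sketch of viewpoint-based movement: instead of walking, the user selects a
// predefined viewpoint and the camera rig is moved to it.
public class ViewpointMover : MonoBehaviour
{
    public Transform cameraRig;            // parent object carrying the camera (assumed setup)
    public float maxSelectDistance = 100f; // how far away a viewpoint can be selected

    void Update()
    {
        // Placeholder trigger: a tap/click stands in for whatever confirm action is used.
        if (!Input.GetMouseButtonDown(0)) return;

        Ray ray = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxSelectDistance) && hit.collider.CompareTag("Viewpoint"))
        {
            // Jump the rig to the selected viewpoint, keeping its current orientation.
            cameraRig.position = hit.collider.transform.position;
        }
    }
}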

Week 8, Part 2: 6 - 13 April [Vuforia Controller continued, 3D Printing]

From further reading, I learnt that Vuforia can recognise scanned 3D models/objects in a similar way to how it identifies database images. 3D printing a controller for use in the application would improve user engagement and understanding. As there is no way to make the user's hand appear in front of the rendered models with only a single device camera, having a 3D controller with defined button areas the user can feel would help with any confusion experienced if models overlap their hands. These defined areas would then be used to secure physical images to the model, to maintain the functionality of virtual buttons. Not having 3D printed before, I sought the advice of the university's 3D technologist regarding printing a 3D model for recognition. This gave me a good foundation to conduct my own research. I learnt that using software to hollow the model, then create an internal lattice within it, will “reduce the mass and weight of the object which means less…
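For reference, a minimal sketch of reacting when Vuforia recognises the scanned/printed controller is shown below. It assumes the older ITrackableEventHandler pattern from the Vuforia Unity extension and would be attached to the Object Target representing the printed model; the response is only a placeholder.

using UnityEngine;
using Vuforia;

// Sketch: respond when the 3D-printed controller's Object Target is found or lost.
// Assumes the classic ITrackableEventHandler interface from the Vuforia Unity extension.
public class ControllerFoundHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        // Placeholder: e.g. show or hide the virtual overlay attached to the physical controller.
        Debug.Log(found ? "Controller recognised" : "Controller lost");
    }
}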

Week 8, Part 1: 6 - 13 April [Vuforia Controller]

As discovered in the last blog entry, use of the Leap Motion with the application design is impractical, though I intend to preserve as much of the interaction mechanic developed as possible. Creating a controller which could use virtual buttons, as tested previously, would give the same result as raycasting from the Leap Motion hand prefab. I trialled several different prototype designs, ranging from a simple A5 piece of patterned paper, through a paper cube, to a tube-shaped roll of paper. The latter was the most promising design, giving the user a handle to hold the controller and leaving the top half of the tube available for placing virtual buttons, which could then trigger a variety of functions. For the exhibition, I was keen to make the controller easy to replicate and cheap to produce, paper being an ideal option. The first controller attempt went unrecognised by Vuforia. Building on my earlier research into issues with recognition, I believed this was due to the rolled paper surface…
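For context, a minimal sketch of wiring up virtual buttons on the paper controller is shown below. It assumes the classic IVirtualButtonEventHandler interface from the Vuforia Unity extension; the button names and triggered actions are placeholders rather than the controller's actual functions.

using UnityEngine;
using Vuforia;

// Sketch: register for, and respond to, virtual button events on the paper controller target.
public class PaperControllerButtons : MonoBehaviour, IVirtualButtonEventHandler
{
    void Start()
    {
        // Register this script with every virtual button defined on the target.
        foreach (VirtualButtonBehaviour vb in GetComponentsInChildren<VirtualButtonBehaviour>())
            vb.RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonBehaviour vb)
    {
        // Placeholder button names standing in for the real exhibition functions.
        switch (vb.VirtualButtonName)
        {
            case "Info":   Debug.Log("Show airship information"); break;
            case "Rotate": Debug.Log("Start rotating the model");  break;
        }
    }

    public void OnButtonReleased(VirtualButtonBehaviour vb)
    {
        // e.g. stop the action started on press.
    }
}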

Week 7, Part 2: 30 March - 6 April [Leap Motion, Prototype]

Thinking of possible uses for the Leap Motion in the application, I ruled out a pop-up menu, as this could only be used to trigger scene changes, which is not the level of engagement the project requires. Despite this, I was still able to draw inspiration from the ‘Hover-UI-Kit’ menu I experimented with in Unity in the previous blog entry: using a gesture to activate a ‘mode’, then more intuitive movements within that mode for further functionality. The idea I decided to develop was using the hand to select parts of the airship and then display related information. This could be done with relative ease by raycasting out from the hand prefab to detect colliders on a ship model. I used the last bone in the index finger as the origin of the ray, casting the ray forward in the direction of the bone's Z-axis. Once this functionality was implemented, it needed to be made toggleable, triggered by a user gesture. Leap Motion provides an example script to detect predefined hand positions…
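A minimal sketch of the finger raycast described above is included below. Rather than rely on specific Leap Motion API calls, it assumes the distal index-finger bone's Transform from the hand prefab has been assigned in the inspector, and that a hypothetical ShipPartInfo component sits on each selectable collider; the toggle flag stands in for the gesture-detection script.

using UnityEngine;

// Sketch: cast a ray from the last index-finger bone along its Z-axis to pick ship parts.
public class FingerPointer : MonoBehaviour
{
    public Transform indexDistalBone;   // last bone of the index finger (assigned from the hand prefab)
    public float maxDistance = 2f;      // reach of the pointing ray, in metres
    public bool pointingModeActive;     // toggled externally by the gesture-detection script

    void Update()
    {
        if (!pointingModeActive || indexDistalBone == null) return;

        Ray ray = new Ray(indexDistalBone.position, indexDistalBone.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            ShipPartInfo part = hit.collider.GetComponent<ShipPartInfo>();
            if (part != null)
                Debug.Log("Selected part: " + part.partName); // placeholder for showing related information
        }
    }
}

// Placeholder data component attached to each selectable collider on the ship model.
public class ShipPartInfo : MonoBehaviour
{
    public string partName;
}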

Week 7, Part 1: 30 March - 6 April [Leap Motion, Research, Testing]

To keep building on the focus of the project, making the application as interactive as possible, I was keen to include the Leap Motion as a method of input. When I began using the Kinect, I found that dedicating some time to understanding how the device functions allowed me to approach the problem knowing specifically what the device itself needed, and to identify potential issues before beginning development. Similarly with the Leap Motion: while I understand what the peripheral does, I am not fully aware of how the device reads and processes data.

Leap Motion key components
The Leap Motion uses three infrared LEDs to project infrared light in a hemisphere above the device. It then uses two wide-angle cameras to detect when objects are hit by the infrared. This is the same approach the Kinect uses to map its surroundings. From reading, I found that the Leap Motion then applies algorithms to the sensors' raw data to map hand movements, rather than generating a depth map as the Kinect does.