Week 1: 16 - 23 Feb. [Meeting with Museum Director]
While I was intrigued by both the energy-efficient car and the museum's interactive experience, the prospect of working with new technology directly related to the content of the degree, where people outside of the university would be the end users, made the museum's project far more appealing.
No brief or requirements were specified in the museum's request, so after learning of the project I contacted the museum manager, Basil Abbott, to arrange a meeting to clarify what they hoped to include in their exhibition and what resources were available.
From initial email exchanges it was clear that the intention was to involve the community through hands-on experiences rather than a static exhibition, the implementation of which the museum has left to our discretion.
To prepare for the meeting it was necessary to familiarise ourselves with the history of the airship: its origin, its evolution, the project's goals, its achievements and its end. In addition to information found online and in books, we arranged to speak with Sheila at Pennoyers Centre (Pulham), who provided further details on the operating procedures of airships of the time as well as more specific accounts of the airship's voyage. We took the following ideas to the meeting:
- Creating a VR experience (VR headset, with the option of Leap Motion hand tracking) where both milestones from the Atlantic crossing and more specific stories from the crew could be showcased. This could provide interaction between the user and items in the environment, movement through the environment, or expandable information panels, all through simple gestures.
- A multi-screen display using IR head tracking could produce a similar experience that a primary user could drive while sharing it with onlookers at the same time.
- Printing 3D models which could then be painted by local school children to involve the community. There is also the possibility of adding RFID Phidgets to the models to give them functionality within the experience.
- Using a projector or multiple screens in combination with an Xbox Kinect would allow limb tracking, so many users could interact with the experience simultaneously (though designing a suitable experience, enjoyable for all, with this implementation may prove difficult).
- Mobile devices with cameras could be used to provide an AR experience by loading assets onto the device screen when the camera is pointed at, and recognises, defined patterns (a rough sketch of this idea follows the list).
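As a rough illustration of that last option, the sketch below detects a printed marker in a camera feed and overlays an asset on it. This is only a feasibility sketch, not a design decision for the exhibition: it assumes Python with opencv-contrib-python (pre-4.7 ArUco function API), and the asset file name and camera index are placeholders.

```python
import cv2
import numpy as np

# Minimal marker-based AR sketch: find a predefined pattern (an ArUco marker
# here) in the camera feed and warp an image asset onto it. The overlay file
# and camera index are illustrative placeholders, not project assets.
ASSET = cv2.imread("r34_panel.png")
if ASSET is None:
    raise SystemExit("placeholder overlay image not found")

DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
cap = cv2.VideoCapture(0)  # device camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)
    if ids is not None:
        h, w = ASSET.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        for quad in corners:
            # Warp the asset onto the detected marker's quadrilateral
            matrix = cv2.getPerspectiveTransform(src, quad[0].astype(np.float32))
            overlay = cv2.warpPerspective(ASSET, matrix,
                                          (frame.shape[1], frame.shape[0]))
            mask = overlay.sum(axis=2) > 0
            frame[mask] = overlay[mask]
    cv2.imshow("AR preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

A deployed version would more likely run on the visitor's own device through a mobile AR framework, but the detect-then-overlay loop is the same idea.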
All of the above appealed to Basil, though particular interest was shown in applications which more directly involve the primary user and present more detailed information.
Basil kindly provided further reading resources and a wealth of material related to the R34 as fuel
for our project.