Digital Universe Fly-through
Digital Universe Fly-through was a gesture-based installation that let visitors fly through visualizations of data from the Digital Universe Database using their body movements, captured via Kinect.
Eojin Chae (Lead Digital Artist)
Barry (Experience Researcher)
According to a recent study of the Hall of the Universe at AMNH, the main screen in the hall drew the least visitor interest of any exhibit, as measured by average dwell time and the number of pictures visitors took. As a result, the Science Visualization Group Emerging Media Lab decided to experiment with ways to make it more playful and interactive.
After the project was initiated, there was much discussion about guidelines for design and development. Here are the basic guidelines we established:
1. It should focus on being educational
2. It should fit in with the rest of the exhibits in the Hall of the Universe
3. It should be engaging and take advantage of gesture interaction
4. It should be as accessible as possible to the Museum's main audiences, including kids, seniors, speakers of different languages, and people with disabilities
(Use body pose to match different constellations)
Analyzing the first version
When I first joined this project, it had just gone through its first user test with the public. As someone deeply interested in the UX aspects of emerging technologies, I was lucky to have video documentation of what people were trying to do with the visualized universe.
By analyzing the video documentation, I realized that this gesture might not be ideal for zooming:
Zooming is a continuous and directional process. However, the first version of the gesture required detecting a change between two states, from arms open to arms closed.
To match the continuous process of "zooming", users had to repeat this gesture over and over. And because the transition between the two states is easily reversed, user testing showed that the gesture could trap users oscillating between the two states.
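The two-state problem can be made concrete with a small sketch. This is not the installation's actual code; it is a hypothetical detector, assuming hand distance as the input signal and made-up thresholds, that shows why the gesture is inherently discrete: one zoom step fires only on the open-to-closed transition, so continuous zooming forces the user to re-open and re-close their arms for every step.

```python
def detect_zoom_step(hand_distance, state,
                     open_threshold=0.8, closed_threshold=0.3):
    """Sketch of a two-state 'arms open -> arms closed' zoom detector.

    hand_distance: distance between the user's hands in meters (assumed input).
    state: "open" or "closed".
    Returns (new_state, zoomed), where zoomed is True only on the
    open -> closed transition -- a single discrete zoom step.
    """
    zoomed = False
    if state == "open" and hand_distance < closed_threshold:
        # Arms just closed: fire exactly one zoom step.
        state = "closed"
        zoomed = True
    elif state == "closed" and hand_distance > open_threshold:
        # Arms re-opened: re-arm the detector, but no zoom happens.
        state = "open"
    return state, zoomed
```

Feeding in a stream of hand distances makes the limitation visible: a user who closes their arms, holds them closed, then reopens and closes again gets exactly two zoom steps, no matter how long they hold each pose.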
Brainstorming for iteration
Brainstorming new gestures was both fun and painful.
I felt like I had taken an intense yoga class, and found myself waving in the air, bending at weird angles, or doing tree pose unconsciously while waiting for a forever-delayed train in NYC.
But most of the gestures for the second iteration were inspired by observing what users tried to do in the first user test. For example, we eventually chose a simple "lean" gesture with an "Iron Man flying" metaphor, because we noticed that many users tried to lean forward to zoom in and lean back to zoom out.
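Unlike the two-state gesture, a lean maps naturally onto continuous zooming. The sketch below is a minimal illustration of that idea, not the installation's code: it assumes two Kinect-style skeleton joints (spine base and spine shoulder) as (x, y, z) points in camera space with +z pointing away from the sensor, plus made-up dead-zone and speed parameters, and turns the torso's lean angle into a signed zoom velocity.

```python
import math

def lean_zoom_velocity(spine_base, spine_shoulder,
                       dead_zone_deg=5.0, max_deg=25.0, max_speed=1.0):
    """Map forward/backward torso lean to a signed zoom velocity.

    spine_base, spine_shoulder: (x, y, z) joint positions, +z away
    from the sensor. Returns a value in [-max_speed, max_speed];
    positive means zoom in (lean forward), negative means zoom out.
    """
    dy = spine_shoulder[1] - spine_base[1]
    dz = spine_shoulder[2] - spine_base[2]
    # Lean angle in the sagittal plane: 0 when standing upright;
    # shoulders closer to the sensor than hips (dz < 0) = leaning forward.
    lean_deg = math.degrees(math.atan2(-dz, dy))
    if abs(lean_deg) < dead_zone_deg:
        return 0.0  # upright enough: hold position, no drift
    # Scale zoom speed with how far past the dead zone the user leans.
    sign = 1.0 if lean_deg > 0 else -1.0
    mag = min(abs(lean_deg), max_deg)
    return sign * max_speed * (mag - dead_zone_deg) / (max_deg - dead_zone_deg)
```

Because the output is a velocity sampled every frame, holding a lean keeps the camera flying continuously, which is exactly the property the open/close gesture lacked; the dead zone keeps normal standing posture from causing the universe to drift.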