Instrument: One Antarctic Night
INSTRUMENT: One Antarctic Night (IOAN) is a performative, multi-participant, immersive virtual reality artwork. In IOAN, participants explore a large-scale immersive field of visualized and sonified data representing 817,373 astronomical objects.
The IOAN Installation Space
IOAN's 20-foot by 20-foot installation space is outfitted with a 7-channel spatialized audio system. Multiple users can enter the installation space and interact with the system through tracked head-mounted displays and controllers. Users share not only the same physical space but also the same virtual environment: their physical locations and movements are matched in the virtual world. This hybrid physical/virtual space facilitates their collaborative interactions with the star field. Three pairs of thirty-two-inch monitors placed around the installation space provide spectators with a window into the virtual space.
The 817,373 astronomical objects represented in IOAN include stars, novae, and galaxies residing within the Large Magellanic Cloud. They were observed by the AST3 (Antarctic Survey Telescope 3) robotic telescope at Dome A over the course of its first Antarctic night of operation (approximately four months long), in 4,183 images captured by its 10K x 10K CCD. In IOAN, each object is analyzed for qualities such as average magnitude and periodicity and, when possible, fused with information from the Gaia Archive and the SIMBAD Astronomical Database, both large-scale open-access astronomical repositories.
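The per-object light-curve analysis described above (average magnitude, periodicity) can be sketched in miniature. This is an illustrative reconstruction under stated assumptions, not the project's actual pipeline: `summarize_light_curve`, the phase-dispersion scoring, the trial-period grid, and the synthetic light curve are all hypothetical.

```python
import numpy as np

def summarize_light_curve(times, mags, trial_periods):
    """Hypothetical per-object summary: mean magnitude plus a crude
    period estimate via phase-dispersion minimization."""
    mean_mag = float(np.mean(mags))
    best_period, best_score = None, np.inf
    for p in trial_periods:
        phase = (times % p) / p              # fold observations onto [0, 1)
        folded = mags[np.argsort(phase)]
        # a well-folded periodic curve has low point-to-point scatter
        score = float(np.sum(np.diff(folded) ** 2))
        if score < best_score:
            best_score, best_period = score, p
    return mean_mag, best_period

# synthetic variable star: 2.5-day period sampled over ~120 nights
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 120.0, 400))
m = 15.0 + 0.3 * np.sin(2 * np.pi * t / 2.5) + rng.normal(0.0, 0.02, t.size)
mean_mag, period = summarize_light_curve(t, m, np.arange(0.5, 10.0, 0.01))
```

In practice, per-object summaries like these would be precomputed offline so the installation only reads finished catalog attributes at runtime.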
The data drives the objects' representations in virtual reality, including their 3D form, color, and animation, as well as the objects' ambisonic sonification. In an aesthetic gesture that inverts the relationship of the human body to astronomical scales, the astronomical data is rendered as a field of objects, a literal and poetic star field, surrounding the users at hip-to-waist height instead of above them in the sky.
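One way such a data-to-representation mapping could look is sketched below. The specific mapping (`render_params`, the magnitude range, the brightness and pulse-rate formulas) is an assumption for illustration; IOAN's actual mappings are not specified here.

```python
def render_params(mean_mag, period, mag_range=(12.0, 21.0)):
    """Hypothetical mapping from catalog attributes to render parameters:
    brighter (numerically lower) magnitudes yield brighter forms, and
    periodic objects pulse at a rate tied to their measured period."""
    lo, hi = mag_range
    # normalize magnitude to [0, 1] brightness, clamped at the range edges
    brightness = max(0.0, min(1.0, (hi - mean_mag) / (hi - lo)))
    # aperiodic objects (period <= 0) stay static instead of animating
    pulse_hz = 0.0 if period <= 0.0 else 1.0 / period
    return brightness, pulse_hz

b, p = render_params(12.0, 2.5)  # full brightness, pulsing at 0.4 cycles/unit
```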
For the participant, IOAN is a non-goal-driven exploration of an otherworldly environment. A user entering the space is enveloped in the ambiance of a soft, granulated soundfield, which invites them to explore, via movement and the relation of their body to the data, the star field represented in the virtual environment. Users can freely walk throughout the star field, select objects for examination, or manipulate individual objects through gestures to trigger analytical operations.
As participants explore the space, their physical gestures and manipulations are translated into database operations for immersive analytics. For example, participants can tap objects, which plays a short sound and displays a data preview. Picking an object up triggers a more intensive data query and adds the object's sonification to the installation's soundscape. Users can also strike the surface of the data field, creating a large wave that performs a filter operation on the entire dataset of 817,373 objects.
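The wave gesture's whole-dataset filter amounts to a single vectorized pass over the catalog. A minimal sketch, assuming a magnitude-threshold filter (the gesture's actual filter criteria are not detailed above, and the mini-catalog here is invented for illustration):

```python
import numpy as np

# hypothetical mini-catalog; the full installation holds 817,373 records
catalog = np.array(
    [(14.2, 2.5), (17.8, 0.0), (15.1, 1.2), (19.3, 0.0)],
    dtype=[("mean_mag", "f4"), ("period", "f4")],
)

def wave_filter(catalog, mag_limit):
    """Sketch of the 'strike' gesture: one boolean mask computed over every
    object, e.g. hiding those fainter than a gesture-derived threshold."""
    return catalog["mean_mag"] <= mag_limit

mask = wave_filter(catalog, 16.0)  # keeps the two brighter objects
```

Because the operation is a single array-wide mask rather than a per-object loop, it scales to the full dataset at interactive rates.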
IOAN's virtual environment was developed using the Unity game engine. Unity's built-in rendering and physics engines are optimized for traditional game development and have inherent limitations when working with large multidimensional datasets like IOAN's. IOAN works around these limitations using GPU shaders to render the data, providing real-time interactive access to all 817,373 astronomical objects. In testing, IOAN supported up to 7 million objects with little impact on interactive frame rates. Max/MSP handles the multichannel ambisonic sonification, and OSC is used for communication between Unity and Max/MSP.
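The Unity-to-Max/MSP link runs over OSC, whose wire format is simple enough to assemble by hand. The sketch below encodes a float-only OSC 1.0 message using only the standard library; the address pattern `/ioan/object/select` and its arguments are hypothetical, not IOAN's actual message scheme.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Minimal OSC 1.0 encoder for messages carrying only float32 args."""
    msg = osc_pad(address.encode())                      # address pattern
    msg += osc_pad(("," + "f" * len(floats)).encode())   # type-tag string
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32 per the OSC spec
    return msg

# hypothetical message: selected object's mean magnitude and period
packet = osc_message("/ioan/object/select", 15.2, 2.5)
```

Sending the packet is then a single UDP `sendto` to Max/MSP's listening port.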
Current and Future Work
IOAN was shown in the SIGGRAPH 2018 Art Exhibition, where user feedback was solicited and recorded. This past winter, we conducted a usability study of the system. Based on user feedback, the results of the usability study, and our own analysis of the project's successes and shortcomings, we are improving the work to allow more intuitive and expressive interactions. We are also searching for venues and raising funds to show the improved work, and we plan a second usability study next year, which we hope to publish at SIGCHI or a similar venue.
IOAN is the result of a collaboration. Primary collaborators include Ruth West, Violet Johnson, I-Chen Yeh, Zach Thomas, Lars Berg, and myself (Eitan Mendelowitz). My contributions include co-developing the concept, developing the software architecture, designing the user experience, overseeing graduate student contributors, and programming some of the visual elements.
Publications and Exhibitions
- West R., Johnson V., Yeh I.C., Thomas Z., Tarlton M., Mendelowitz E. "Experiencing a Slice of the Sky: Immersive Rendering and Sonification of Antarctic Astronomy Data." Proceedings of Electronic Imaging: The Engineering Reality of Virtual Reality, 2018.
- West R., Johnson V., Yeh I.C., Thomas Z., Mendelowitz E., Berg L. "INSTRUMENT | One Antarctic Night: Interactive Installation, Virtual Reality, Multi-channel Spatialized Audio, Multi-display." Leonardo Journal, August 2018.
- West R., Johnson V., Yeh I.C., Thomas Z., Mendelowitz E., Berg L. SIGGRAPH Art Exhibition, Vancouver, Canada, 2018.