Respire brings together three components: a virtual environment (presented via a head-mounted display, the HTC Vive), embodied interaction (via a respiration sensor), and an intelligent musical agent that listens to breathing patterns and generates sound with affective properties. Respire guides its users to reconnect with an embodied experience that is often lost in our interaction with new and emerging technologies. Built upon mindfulness principles, the piece makes breathing the object of the user's attention through the impermanence of its virtual landscapes and audio environment. The changes in Respire are generated directly from changes in breathing patterns, as the user becomes aware of their breath and of the agency they have in the environment.
The musical agent listens to the user's breathing and takes the user on a journey through abstracted worlds with different affective qualities, creating a user-dependent narrative. From calm and stormy oceans to ambiguous architectures that immerse the user and elicit curiosity, this is a journey within one's own breathing. The environment allows each audience member to create their own alternate reality based on the interaction between two dynamic systems: the user and the piece. This makes each journey unique and unrepeatable.
-> Tatar K., Prpa M., & Pasquier P. (2019, upcoming). Respire: A Virtual Reality Art Piece with a Musical Agent Guided by Respiratory Interaction. Leonardo Music Journal.
-> Prpa M., Tatar K., Françoise J., Riecke B., Schiphorst T., & Pasquier P. (2018). Attending to Breath: Exploring How the Cues in a Virtual Environment Guide the Attention to Breath and Shape the Quality of Experience to Support Mindfulness. In Proceedings of the 2018 Designing Interactive Systems Conference (pp. 71–84). ACM Press. https://doi.org/10.1145/3196709.3196765
-> Prpa M., Tatar K., Schiphorst T., & Pasquier P. (2018). Respire: A Breath Away from the Experience in Virtual Environment. In Extended Abstracts of CHI '18, April 21–26, 2018, Montreal, QC, Canada. ACM.
This work has been supported by the Natural Sciences and Engineering Research Council of Canada and the Social Sciences and Humanities Research Council of Canada.