January 2018 - August 2018
(22 weeks)


Shin Liu
Will Wang
Minjun Chen
Brian Nguyen

My Role

Interaction Design


Adobe Premiere Pro


Encountering mental blocks is common for artists: constant creative output demands focus, concentration, and emotional management, and sustaining all of this can leave an artist feeling burnt out, or simply stuck. Sponsored by HTC, we asked how VR, combined with other emerging technologies, could address this problem.

Design Challenge

How might we use biometric data and affective computing to support artists in overcoming mental blocks?


Stream reads the user's brainwaves with the Muse headband and uses the EEG data to drive the virtual Affective Environment

Getting Data From the Muse Headband

The Muse headband takes brainwave readings from the user, which are then streamed into the virtual environment to create the affective room. The headband reads EEG data from four channels on the head; the data from the four channels is averaged and mapped to the intensity of the visualization.
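The channel-averaging step can be sketched in a few lines. This is an illustrative reconstruction, not the project's actual Unity code; the function name and the clamped 0-1 intensity range are assumptions.

```python
def intensity_from_eeg(channels, lo=0.0, hi=1.0):
    """Average the four Muse EEG channel readings and map the result
    to a visualization intensity, clamped to [lo, hi].
    (Illustrative sketch; names and ranges are assumed.)"""
    assert len(channels) == 4, "the Muse headband streams four channels"
    avg = sum(channels) / len(channels)
    # Clamp so a noisy spike on one channel cannot overdrive the visuals.
    return max(lo, min(hi, avg))
```

Clamping matters in practice because raw EEG readings are noisy; without it, a single artifact (e.g. a blink) would make the room flash.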

Affective Room

The affective room is a virtual space whose environment reacts to the EEG activity of the brain: higher stimulation and activity produces more volatile visualizations, while lower stimulation and activity produces calmer ones. Shown above are the two themes the user could select from. In the white theme, orbs grow larger and redder as brain activity increases; in the black theme, particles form more connections as brain activity increases.

The affective room serves as ambient feedback on the user's current mental state while they create in the affective environment. This is significant because a large part of artistic creation is externalizing the artist's mental state; representing that internal state through affective visualizations can support artists during their creative process.


With the click of a controller button, the user can open up the timeline feature. The timeline gives the user the ability to go back and review a recreation of their project and how they were feeling during those moments of creation.


During the creation process, the user can press the mark button on the controller panel to mark a moment on the timeline. Once the timeline is invoked, the mark appears on it, indicating the parts of the creation process the user wanted to take note of. Seeing the state of the visualization from while you were creating gives greater context for how you were feeling during your creative process.

Reflection Sphere

There is a sphere in the corner of the user’s viewport. The more filled the sphere becomes, the closer the user is to matching their previous mental state. By allowing users to revisit their mental states, Stream helps creatives to be more mindful, therefore giving them more command over their emotional output.
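The sphere's fill could be driven by a simple distance-to-target mapping. The original mapping is not specified in the write-up, so the linear falloff and the `tolerance` parameter below are hypothetical.

```python
def sphere_fill(current, target, tolerance=1.0):
    """Return a 0-1 fill fraction for the reflection sphere: 1.0 when
    the current EEG reading matches the recorded mental state, falling
    off linearly with distance. (Hypothetical formula; the project's
    actual mapping is not documented.)"""
    distance = abs(current - target)
    return max(0.0, 1.0 - distance / tolerance)
```

A linear falloff is the simplest choice; a smoother curve (e.g. an exponential) would make the sphere less jittery near the target state.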


Within our design, the creative activity we present is 3D painting, similar to Tilt Brush, but painting itself is not the focus of our design. We chose it to keep the idea focused; this kind of feedback is possible and valuable for any creative activity, including writing, musical performance, and more.

Setting the Problem

We first began by examining the state of VR and explored what could be improved or added upon in the VR landscape. Unlike mobile/web, VR is in a state of infancy, and many fundamental interactions are not well defined. One exciting thing about explorations into VR is the potential to expand this set of fundamental interactions.

Rosalind Picard, a professor at MIT, proposed the concept of affective computing. Affective computing is the idea that technology can incorporate cognition and emotions in order to help make people’s lives better.

"Computers are beginning to acquire the ability to express and recognize affect, and may soon be given the ability to 'have emotions.' The essential role of emotion in both human cognition and perception, as demonstrated by recent neurological studies, indicates that affective computers should not only provide better performance in assisting humans, but also might enhance computers' abilities to make decisions."

We saw an opportunity to combine affective computing with VR by taking biometric measurements of mental state and emotion and having them directly influence the virtual environment.

Primary Research

Our research took a twofold approach:

The first area was interviews with VR experts: researchers, industry professionals, and entrepreneurs.

The second area was research into the work processes of creative artists. We conducted interviews and contextual inquiry with artists from several areas, including visual art, design, music, and film.

Subject Matter Experts

For our expert interviews, we aimed to capture opinions and insights from experts in various disciplines related to VR to discover opportunities in our problem space. We interviewed four experts who specialize in VR in the following areas: technology and development, use in healthcare, mindfulness, and affective computing. The data and insights from these interviews helped greatly in shaping and solidifying our problem space, and learning about current trends and research gave us insight into how professionals in the field are approaching the medium.

Guillermo Bernal

MIT Media Lab, Fluid Interfaces Group

Adam Haar Horowitz

MIT Media Lab

Trond Nielsen

CEO, Founder, and Researcher,
Virtual Therapeutics

Eric Whitmier

University of Washington, Ubiquitous Computing Lab

Participant Interviews

Our participant interviews consisted of a contextual inquiry into each artist's creative process, asking them to think aloud about how they were feeling and about their level of concentration as they worked. Afterwards, we conducted semi-structured interviews with each participant about their overall development as a creative in their field and about the role emotion and mental states play in their work. We also wanted to discover how each individual overcame their own mental blocks in the creative process.


After completing our interviews and contextual inquiry, we then distilled research insights that answered our question of how emotion and mental states contribute to an artist's work output. These research insights informed and guided us on our way towards the final design of Stream.

1. Overcoming mental blocks and changing one's mental state is about introducing change into one's environment.

2. Emotional experience is fluid not discrete. Titles such as "mad" or "sad" are ways of simplifying a much more complex and multifaceted feeling.

3. Like physical strength, one’s mental muscle can be trained, thus giving an artist more control over their emotional output.  

4. Manipulating sensory experience in an environment directly affects one's related emotional experience.

Generating Ideas


Selecting the Idea through Lo-fi Prototypes


Mapping the System

Examining the User Journey

Externalizing the user journey of art creation was critical for envisioning how to design Stream to benefit our users. We mapped how emotional experience affects one's artwork and saw Stream as a stimulus to prompt emotions within the user.

Interaction Flow

The interaction flow walks through each of the main functions of Stream: creating a project, entering a current project, and exploring others' projects.

System Architecture

Given the multi-modal interaction of streaming brainwave data into virtual reality, we created a system architecture flow to clearly illustrate how data passes through the system.

Paper Prototyping for VR

After developing the structure of the system, we moved on to a paper prototype of Stream to test with our users. In order to maintain the immersive component of virtual reality at this stage of the design, we created physical UI that the user would interact with. We had them put on the Muse headband as well as use the Vive controllers to give the best sense of what actually using Stream would be like. In addition to the physical UI, we dimmed the lights in the room and showed a visualization to create a mental model of what the affective room would be like.

Final Prototyping

EEG Data Into Virtual Reality

I prototyped bringing EEG data into VR from the Muse headband using the Open Sound Control (OSC) protocol. MuseLab is a supporting tool for the headband that, once configured, streams the data over a selected port. A wide range of values can be sent through MuseLab, including delta, theta, alpha, beta, and gamma band power as well as raw EEG. The alpha and beta waves were of particular interest, as these states are most associated with mindfulness and wakefulness; ultimately we used the alpha and beta values, as they were the most consistent of the data.
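Receiving this stream amounts to handling OSC messages, each carrying an address pattern and a list of per-channel values. The sketch below shows the handler logic only (in Python, for illustration; the project itself ran in Unity). The address patterns and value shapes are assumptions, since MuseLab's output depends on how it is configured; a library such as python-osc could supply the actual UDP server.

```python
# Keep the most recent per-channel average for each band of interest.
# Address patterns like "/muse/elements/alpha_absolute" are assumed;
# the real paths depend on the MuseLab configuration.
latest = {}

def on_osc_message(address, values):
    """Handle one OSC message from MuseLab: average the per-channel
    values for the alpha and beta bands and store the result."""
    band = address.rsplit("/", 1)[-1]          # e.g. "alpha_absolute"
    if band in ("alpha_absolute", "beta_absolute"):
        latest[band] = sum(values) / len(values)

# Example message: four channel readings for the alpha band.
on_osc_message("/muse/elements/alpha_absolute", [0.2, 0.4, 0.6, 0.8])
```

Messages for bands outside the whitelist are simply ignored, which keeps the listener cheap even when MuseLab is configured to stream everything.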

Mapping to the Visualization

A script reads the port over which the data is sent and passes the values into Unity3D. The visualizations are particle systems within Unity, which expose adjustable variables. Mapping the EEG values to those variables creates a direct correlation: the higher the EEG reading, the higher the value of the variable in the particle system. For our "light" visualization, this means an increase in size and redness; for our "dark" visualization, increased connections between the dots in the space.
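The mapping itself is a standard linear remap from the EEG reading's range into each particle-system parameter's range. A minimal sketch (in Python for illustration; the actual script targeted Unity, and the specific ranges below are assumptions):

```python
def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap an EEG reading from [in_lo, in_hi] into a
    particle-system parameter range [out_lo, out_hi], clamping
    readings that fall outside the input range."""
    t = (x - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

# Illustrative: band power in [0, 1] drives particle size in [0.1, 2.0].
size = map_range(0.75, 0.0, 1.0, 0.1, 2.0)
```

The same function can drive each mapped variable (size, redness, connection count) with its own output range, so one EEG stream fans out to several visual parameters.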

View the full UI Spec here