Updated 12/09/18

Implementing Breath Detection in VR

In this article I'll share what I learned while building the VR Breath Tech experience (which is free to download).

I believe that projecting the user's breath into VR increases immersion. I would like to see HMD manufacturers add breath sensors as standard in a future generation of headsets and see more developers expand on the ideas presented here.

Read about other uses of breath detection in Why We Want to Breathe in VR.

Using the microphone as a breath sensor

Let's start with the drawbacks.

A microphone isn't an ideal breath sensor, but one is already built into most headsets, which makes it the easiest way to demonstrate the ideas and technology behind breath detection. Using it has the following drawbacks:

Audible sound is required rather than a measurement of breath velocity, so a calibration exercise is needed to provide a more accurate representation.

Background sounds can be mistaken for exhaling.

If the microphone is calibrated to be too sensitive, the sound of inhaling can be treated as an exhale.

Users may have their microphone input volume set to different levels.

What we really want is a standard sensor built into the headset, specifically for breath detection.

Sampling the microphone data

I use the amplitude of the signal at the microphone to determine the velocity of the breath. Clipping (air hitting the mic) can return high amplitudes even when the user is talking quietly. Don't use the maximum amplitude for a frame - a more reliable method is to calculate the RMS (root mean square) of the samples. This reduces the effect of any anomalies.
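A minimal sketch of that per-frame RMS calculation, in plain C++ with no engine API assumed (16-bit PCM samples, normalised to roughly -1..1):

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Root mean square of one frame's worth of 16-bit microphone samples.
float ComputeRms(const std::vector<int16_t>& Samples)
{
    if (Samples.empty())
    {
        return 0.0f;
    }

    double SumOfSquares = 0.0;
    for (const int16_t Sample : Samples)
    {
        const double Normalised = Sample / 32768.0; // map to roughly [-1, 1]
        SumOfSquares += Normalised * Normalised;
    }
    return static_cast<float>(std::sqrt(SumOfSquares / Samples.size()));
}
```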

As no one will ever hear the sound, you can set the sample rate to the lowest level available, reducing the performance overhead.

Microphone data is buffered at lower levels, so even with constant sound you can expect to get blocks of data, then nothing, then more data. To smooth this out I process the samples that would fit in the current frame's delta time and buffer the rest to be processed in the next frame. Be aware that a lack of samples in one frame doesn't mean the user has finished making a sound.
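Here's a rough sketch of that smoothing, again in plain C++ (the class and member names are my own; ComputeRms is the helper from the sketch above):

```cpp
#include <algorithm>
#include <cstdint>
#include <deque>
#include <vector>

float ComputeRms(const std::vector<int16_t>& Samples); // from the sketch above

class BreathSampler
{
public:
    explicit BreathSampler(int InSampleRate) : SampleRate(InSampleRate) {}

    // Called whenever a new block of microphone data arrives.
    void PushSamples(const std::vector<int16_t>& NewSamples)
    {
        PendingSamples.insert(PendingSamples.end(), NewSamples.begin(), NewSamples.end());
    }

    // Called once per rendered frame; returns the breath amplitude for this frame.
    float Tick(float DeltaSeconds)
    {
        const size_t SamplesThisFrame = std::min(
            PendingSamples.size(),
            static_cast<size_t>(DeltaSeconds * SampleRate));

        if (SamplesThisFrame == 0)
        {
            // No data this frame doesn't mean the user stopped blowing;
            // the next buffered block may simply not have arrived yet.
            return LastRms;
        }

        const std::vector<int16_t> Frame(PendingSamples.begin(),
                                         PendingSamples.begin() + SamplesThisFrame);
        PendingSamples.erase(PendingSamples.begin(),
                             PendingSamples.begin() + SamplesThisFrame);

        LastRms = ComputeRms(Frame);
        return LastRms;
    }

private:
    std::deque<int16_t> PendingSamples; // samples waiting to be processed
    int SampleRate = 8000;              // lowest rate the device offers
    float LastRms = 0.0f;
};
```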

Microphones in different headsets can return very different data. My Vive Pre was particularly prone to frequent clipping: blowing was returning higher values than talking, so I had to make one of the puzzles dependent on the results of the microphone calibration.
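That calibration can be as simple as recording the loudest RMS during a few seconds of silence and again while the user blows as hard as they can, then mapping later readings into a 0-1 breath strength. A minimal sketch in plain C++ (the struct and field names are mine, not an engine API):

```cpp
#include <algorithm>

struct BreathCalibration
{
    float QuietRms = 0.0f;   // loudest RMS measured while the user stayed quiet
    float MaxBlowRms = 1.0f; // loudest RMS measured while the user blew hard

    // Convert a raw RMS reading into a normalised 0..1 breath strength.
    float Normalise(float Rms) const
    {
        if (MaxBlowRms <= QuietRms)
        {
            return 0.0f; // calibration failed; treat everything as silence
        }
        const float T = (Rms - QuietRms) / (MaxBlowRms - QuietRms);
        return std::clamp(T, 0.0f, 1.0f);
    }
};
```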

Seeing your breath

Particle systems can be used to visualise your breath in cold air or as bubbles under water. Use the breath amplitude to control the particle velocity, and use short bursts from particle emitters so that changes in direction and velocity are quickly represented in VR.
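A rough Unreal-flavoured sketch of feeding the breath into an emitter: the SetFloatParameter / SetVectorParameter calls are the engine's standard particle-parameter functions, while the parameter names and the idle threshold are assumptions about how the emitter is set up.

```cpp
#include "Particles/ParticleSystemComponent.h"

// Push the current breath reading into a particle system whose initial
// velocity is driven by "BreathStrength" and "BreathDirection" parameters.
void UpdateBreathParticles(UParticleSystemComponent* Breath,
                           float BreathAmplitude, const FVector& BreathDirection)
{
    if (!Breath)
    {
        return;
    }

    Breath->SetFloatParameter(TEXT("BreathStrength"), BreathAmplitude);
    Breath->SetVectorParameter(TEXT("BreathDirection"), BreathDirection);

    if (BreathAmplitude > 0.05f) // small threshold so idle mic noise doesn't emit
    {
        Breath->ActivateSystem(); // fire a short burst and let it auto-deactivate
    }
}
```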

Using flat sprites as smoke or steam can look bad in VR, but if you keep the transparency high and have multiple sprites moving through each other it can give an appearance of depth and volume. I use an orbit module to add extra movement. The overdraw from overlapping transparent sprites can be a problem for performance, so I set the lifetime as low as I can get away with.

You also see your breath when you mist up a window or mirror. I demonstrate this by allowing the user to breathe on ice. This involves performing a line trace to the object, calculating the UV offset for the hit point on the object and updating a render target using those UV coordinates. The render target is used as a mask in the object's material to strengthen a mist texture. An immersive game would let us breathe on windows and write messages in the misted-up area :)
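A rough Unreal-flavoured sketch of that trace-and-paint step. The brush drawing itself is left as a hypothetical DrawMistBrushAtUV helper, the 100 cm breath range is an arbitrary value, and FindCollisionUV needs "Support UV From Hit Results" enabled in the project settings.

```cpp
#include "Engine/World.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/GameplayStatics.h"

// Hypothetical helper: paints a soft circle into the render target at a UV;
// the object's material reads this mask to strengthen the mist texture.
void DrawMistBrushAtUV(UTextureRenderTarget2D* Mask, const FVector2D& UV, float Strength);

void ApplyMistFromBreath(UWorld* World, UTextureRenderTarget2D* MistMask,
                         const FVector& MouthLocation, const FVector& BreathDirection,
                         float BreathAmplitude)
{
    FCollisionQueryParams Params;
    Params.bTraceComplex = true;
    Params.bReturnFaceIndex = true; // FindCollisionUV needs the face index

    FHitResult Hit;
    const FVector TraceEnd = MouthLocation + BreathDirection * 100.0f; // ~1 m breath range
    if (!World->LineTraceSingleByChannel(Hit, MouthLocation, TraceEnd, ECC_Visibility, Params))
    {
        return;
    }

    FVector2D HitUV;
    if (UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/0, HitUV))
    {
        DrawMistBrushAtUV(MistMask, HitUV, BreathAmplitude);
    }
}
```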

Dust

Dustmotes

I have dustmotes floating around that can be blown away. Dustmotes that are out of range are controlled by a single particle system and don't require physics. The small number of close-up dustmotes are individual actors that I can apply forces to as the user blows on them.
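A sketch of the close-up case: each nearby mote is a small physics-simulating mesh that gets a force when it falls inside the breath cone. The range, cone angle, and force scale are made-up tuning values.

```cpp
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"

// Push nearby dustmote components away from the player's mouth.
void BlowDustMotes(const TArray<UStaticMeshComponent*>& NearbyMotes,
                   const FVector& MouthLocation, const FVector& BreathDirection,
                   float BreathAmplitude)
{
    const float BlowRange = 100.0f;    // centimetres
    const float BlowStrength = 500.0f; // force scale, tuned by eye
    const float ConeCosine = 0.7f;     // roughly a 45 degree cone

    for (UStaticMeshComponent* Mote : NearbyMotes)
    {
        const FVector ToMote = Mote->GetComponentLocation() - MouthLocation;
        const float Distance = ToMote.Size();
        if (Distance > BlowRange)
        {
            continue;
        }
        if (FVector::DotProduct(ToMote.GetSafeNormal(), BreathDirection) < ConeCosine)
        {
            continue; // outside the breath cone
        }

        const float Falloff = 1.0f - Distance / BlowRange;
        Mote->AddForce(BreathDirection * BreathAmplitude * Falloff * BlowStrength);
    }
}
```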

Dusty surfaces

I use a material layer that I can combine with other materials to apply dust to any surface. At the start of the game a short initialisation pass uses line traces to see where dust would settle and adds noise to a render target at the appropriate UV coordinates for the given object. This is the mask for the dust layer. This approach requires some thought about how the mesh is UV unwrapped, ensuring that top surfaces that can gather dust have some separation from other surfaces so the mask doesn't accidentally affect inappropriate surfaces.
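A rough sketch of that initialisation pass, tracing straight down over a grid that covers the room and marking upward-facing hits. The spacing, the normal threshold, and the AddDustNoiseAtUV helper are assumptions.

```cpp
#include "Engine/World.h"
#include "Kismet/GameplayStatics.h"

// Hypothetical helper: adds noise to the hit object's dust mask at the given UV.
void AddDustNoiseAtUV(AActor* HitActor, const FVector2D& UV);

void InitialiseDust(UWorld* World, const FBox& RoomBounds)
{
    const float Spacing = 10.0f; // centimetres between traces

    for (float X = RoomBounds.Min.X; X < RoomBounds.Max.X; X += Spacing)
    {
        for (float Y = RoomBounds.Min.Y; Y < RoomBounds.Max.Y; Y += Spacing)
        {
            FCollisionQueryParams Params;
            Params.bTraceComplex = true;
            Params.bReturnFaceIndex = true; // needed for FindCollisionUV

            FHitResult Hit;
            const FVector Start(X, Y, RoomBounds.Max.Z);
            const FVector End(X, Y, RoomBounds.Min.Z);
            if (!World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
            {
                continue;
            }
            if (Hit.ImpactNormal.Z < 0.7f)
            {
                continue; // only roughly upward-facing surfaces gather dust
            }

            FVector2D HitUV;
            if (UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/0, HitUV))
            {
                AddDustNoiseAtUV(Hit.GetActor(), HitUV);
            }
        }
    }
}
```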

When the user blows on the dusty surface, particles are spawned to show the dust being blown away and I update the render target to reduce the dust layer. A two-dimensional array is also updated to roughly track how much dust is left in a given area. This array is used to decide whether further blowing should spawn dust particles or whether the area has already been cleared.
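A minimal sketch of that bookkeeping in plain C++: a small grid over the surface's UV space remembers how much dust is left in each cell (grid resolution and decay rate are made-up values).

```cpp
#include <algorithm>
#include <array>

class DustGrid
{
public:
    static constexpr int GridSize = 16;

    DustGrid()
    {
        for (auto& Row : Amount)
        {
            Row.fill(1.0f); // start fully dusty
        }
    }

    // Returns true if there was still dust at this UV, i.e. particles should
    // be spawned and the render-target mask reduced; false if already cleared.
    bool BlowAt(float U, float V, float Strength)
    {
        const int X = std::clamp(static_cast<int>(U * GridSize), 0, GridSize - 1);
        const int Y = std::clamp(static_cast<int>(V * GridSize), 0, GridSize - 1);

        if (Amount[Y][X] <= 0.0f)
        {
            return false;
        }
        Amount[Y][X] = std::max(0.0f, Amount[Y][X] - Strength);
        return true;
    }

private:
    std::array<std::array<float, GridSize>, GridSize> Amount;
};
```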

Interactive objects

Cloth

I use the wind actor built into the engine to affect cloth. This could be refined and improved but it's a good starting point.
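A sketch of that starting point: drive the wind source's strength and speed from the breath amplitude each frame (the setter calls are the engine's wind source functions; the scaling constants are arbitrary).

```cpp
#include "Components/WindDirectionalSourceComponent.h"

// The wind component is pointed along the breath direction elsewhere; here we
// just scale its strength and speed by how hard the user is blowing.
void UpdateClothWind(UWindDirectionalSourceComponent* Wind, float BreathAmplitude)
{
    if (!Wind)
    {
        return;
    }
    Wind->SetStrength(BreathAmplitude * 2.0f); // how hard the cloth is pushed
    Wind->SetSpeed(BreathAmplitude);           // how fast gusts travel across it
}
```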

Cobwebs are a good candidate for cloth simulation and breath detection.

Candles

Seeing a candle and not having the ability to blow it out has to be a constant reminder of being in a simulation. When the user blows gently on a flame, I update the flame's material to affect the world position offset of the top of the flame, moving it away from the player. Blowing harder extinguishes the flame and changes the lighting.
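A sketch of that logic: the material parameter names, the threshold, and the idea of passing the direction as a vector parameter are assumptions about how the flame material is set up.

```cpp
#include "Materials/MaterialInstanceDynamic.h"

// Gentle breath bends the flame via the material's world position offset;
// a hard breath extinguishes it (the caller then stops the flame particles
// and changes the lighting).
void ApplyBreathToFlame(UMaterialInstanceDynamic* FlameMaterial,
                        float BreathAmplitude, const FVector& BreathDirection,
                        bool& bIsLit)
{
    const float ExtinguishThreshold = 0.6f; // normalised breath strength

    if (!bIsLit || !FlameMaterial)
    {
        return;
    }

    if (BreathAmplitude > ExtinguishThreshold)
    {
        bIsLit = false;
        return;
    }

    FlameMaterial->SetScalarParameterValue(TEXT("BendStrength"), BreathAmplitude);
    FlameMaterial->SetVectorParameterValue(TEXT("BendDirection"),
        FLinearColor(BreathDirection.X, BreathDirection.Y, BreathDirection.Z));
}
```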

Making sounds

I allow the user to play the panpipes to demonstrate this idea. The amplitude of the breath and closeness to the pipes affect the volume of the sound. In the original DK2 demo I made sounds when the user blew over the top of a bottle.
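A sketch of that volume mapping (the play distance is a made-up tuning value; SetVolumeMultiplier is the standard audio component call):

```cpp
#include "Components/AudioComponent.h"

// Scale a pipe's note by breath strength and by how close the mouth is to it.
void UpdatePanpipeNote(UAudioComponent* Note, float BreathAmplitude,
                       const FVector& MouthLocation, const FVector& PipeLocation)
{
    const float MaxPlayDistance = 40.0f; // centimetres

    if (!Note)
    {
        return;
    }

    const float Distance = FVector::Dist(MouthLocation, PipeLocation);
    const float Closeness = FMath::Clamp(1.0f - Distance / MaxPlayDistance, 0.0f, 1.0f);
    Note->SetVolumeMultiplier(BreathAmplitude * Closeness);
}
```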

Other objects

I allow a small ball to be blown around a maze, but this idea should be expanded on to include anything that makes sense. Here are a few more examples:
Blow a pen and make it roll off a table
Blow the pages of a book to help separate them and turn the page
Blow a spider away from you
Blow on wind-chimes to hear them
Blow on embers as you try to get a fire going

Other microphone uses

While this article concentrates on breath detection, remember that the microphone can also be used to make the virtual world react to the sounds you make. Characters should turn around if you shout. Their gaze should move between your eyes and mouth when you're talking to them, and so on.

The next step - airflow

Simulating airflow can enable subtle enhancements that increase immersion. Breath is an obvious first step, but any movement also affects airflow. Moving your hand past a candle should make it flicker. Walking quickly by a desk may result in papers rustling. Opening a door or window may affect multiple objects as the air changes. Dustmotes should react to all these things too.

The more an environment reacts to our presence in expected ways, the more immersive it becomes. Subtle effects may go unnoticed while still adding to an overall feeling of presence.

In closing

I hope these notes are of use to someone. Use the comments section below to ask questions or share your own ideas. We're just at the beginning of VR's journey and one day all these small elements will be taken for granted. For now, it's up to developers to prototype everything we can think of and share our ideas to inspire further developments.

Connect with me for more news and insights
