


How virtual reality sound techniques could revolutionise medical equipment development

You might be able to operate a piece of complex medical equipment when you’re relaxed, sitting in a quiet room by yourself, with the instructions in front of you. But what happens if you’re in a crowded hospital, stressed, with lots of people talking to you all at once, and you have no time to look through the instructions?

Our Human Factors team finds out how someone might handle a new medical device in real-life situations. We place potential users, who could be surgeons, doctors, nurses or patients, in simulated everyday environments, from hospital wards to outpatients’ living rooms. Some of these simulated environments are more complex than others – operating rooms can include noisy machines, warning alarms, and a collection of busy people to interact with.

A realistic ‘soundscape’

We’re always looking for new ways to make the situations we’re simulating closer to real life. One aspect we’re currently looking at is sound, as it contributes significantly to the realism of a situation. In particular, we’ve started to explore how Virtual Reality (VR) and Augmented Reality (AR) can enhance the sense of realism by using 3D audio to create and manipulate soundscapes. A soundscape is the combination of sounds, both natural and man-made, that make up the immersive environment of the simulation and give the user a sense of place.

To create soundscapes, 3D audio uses binaural recordings. These are recordings taken from two different points, much like an ear on either side of the head. When a sound is made, it arrives at the closest point first, before reaching the second point farther away. So a sound made to the left of a person reaches the left ear before the right ear. Using this difference in arrival time between the two points, it’s possible to recreate our experience of sound when the sound source is moving, or when we’re moving around it.
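The arrival-time difference described above is known as the interaural time difference. As a rough illustration (not part of the original article, and not a description of any particular 3D-audio product), the sketch below estimates it with Woodworth's classic spherical-head approximation; the head radius and speed of sound are assumed typical values.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, dry air at roughly 20 degrees C
HEAD_RADIUS = 0.0875     # m, a commonly assumed average head radius

def interaural_time_difference(azimuth_deg: float) -> float:
    """Approximate interaural time difference (seconds) for a distant
    sound source at the given azimuth, where 0 degrees is straight
    ahead and 90 degrees is directly to one side, using Woodworth's
    spherical-head formula: ITD = (r / c) * (sin(theta) + theta)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(theta) + theta)

# A sound straight ahead arrives at both ears at the same time;
# a sound directly to one side arrives at the nearer ear roughly
# two-thirds of a millisecond earlier.
print(interaural_time_difference(0))
print(interaural_time_difference(90))
```

Delays of well under a millisecond like these are what 3D audio engines reproduce to convince the brain that a sound has a direction.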

A life-like experience

We think we’ll be able to use 3D audio to create genuinely life-like experiences. For example, we could make the experience of a simulated hospital ward more immersive by creating and manipulating a soundscape that includes the out-of-sight sounds of medical staff talking and using equipment behind privacy curtains. And we could simulate an operating theatre even more accurately.

This work will help us provide richer and more insightful data on how people may use a new device in real situations. We’re getting much nearer to simulating real life, ensuring the devices we develop are as safe and effective as possible.


Contact the author

  • Philip Lance


    Human Factors Specialist

    Applying Usability Engineering to product development, to help create new devices which can be used safely and effectively.


Contact the product design and engineering team

