View a 508-compliant PDF of this issue here: NICHD_Connection_2016_09.pdf

After years of hype and expensive, underperforming attempts, Virtual Reality (VR) has finally become reality. The newly formed Virtual and Augmented Reality Interest Group (VARIG) meets monthly at the NIH and aims to empower the scientific community to use this now readily available technology for research. But how can researchers use it, and what does the future hold? Let’s dive in and take a look:

A woman in a VR headset, holding a handheld controller

Using the nVisor at NHGRI's IVETA facility


The History of Virtual Reality

Stereo viewers—devices that create a three-dimensional (3D) image by presenting separate images to the left and right eyes—have existed for hundreds of years. Kinematoscopes and other stereo movie and photo viewers were a staple at nickelodeons and circuses at the turn of the 20th century. Perhaps more familiar is the child’s View-Master®, which has rattled around toy boxes for 75 years.

The images in those devices have always been static, unable to move with the viewer. When the peripheral vision is completely filled and the image moves fluidly with the viewer’s motion, a powerful sense of immersion in the environment is created. Because of this sense of immersion, the military has employed virtual reality since the Cold War, most extensively for training and simulations, but also for piloting: pilots of some military helicopters, such as the Blackhawk, rely on helmet-mounted displays that blend virtual and augmented imagery to navigate the craft. Industry has also employed VR in the design process, for example to test virtual prototypes of cars before investing time and materials in construction.

NIH scientists have used VR for over a decade. The reason for recent excitement is that high quality VR is now affordable. Consumer models work well and are available for less than $1,000. For biomedical research, the applications are many.

The Immersive Virtual Environment Testing Area

Dr. Susan Persky from NHGRI (perskys@mail.nih.gov) helped to launch “IVETA”—the Immersive Virtual Environment Testing Area—in 2007, and the facility has been actively recruiting and running study participants ever since. The lab’s primary purpose is to serve the Social and Behavioral Research Branch within NHGRI; while it doesn’t provide services to the larger NIH community, the group sometimes collaborates with others, including labs in NIDA, NIAAA, and NIMHD.

The facility features an eight-camera system that tracks position and motion within a walkable space. They currently use nVisor and Sony head-mounted displays (see below for specific VR hardware information) but will transition to the HTC Vive in the near future. The group also plans to test the Oculus Rift for traveling to remote sites, such as medical centers.

One interesting use of VR is simulating a clinical environment to study doctor/patient interactions or to teach and test medical students in a virtual clinic. Researchers have absolute control over every aspect of the visual and auditory stimuli. The scene can be reset exactly after each session, and researchers can change specific variables, such as a virtual physician’s skin color or a virtual patient’s word phrasing, to test their influence on the encounter.

Another feature of the technology is the ability to collect precise movement parameters like gaze tracking and interpersonal distance. One of the lab’s studies involves a virtual buffet to study food choice. Parents might be asked to select foods from the buffet to serve their children. How might decisions be influenced by surroundings? By working with this technology, the group can design research environments that isolate the variables they want to study, yet look and feel realistic to research participants.
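A measure like interpersonal distance reduces to simple geometry on the tracker’s coordinate stream. As a minimal sketch of the idea (the function names and the two-person frame samples below are illustrative assumptions, not IVETA’s actual software):

```python
import math

def interpersonal_distance(p1, p2):
    """Euclidean distance between two tracked head positions, in meters."""
    return math.dist(p1, p2)

def mean_distance(frames):
    """Average interpersonal distance over (p1, p2) samples, one per frame."""
    return sum(interpersonal_distance(a, b) for a, b in frames) / len(frames)

# Hypothetical example: a participant backs away from a virtual
# physician over three tracked frames (x, y, z coordinates in meters).
frames = [((0, 1.7, 0), (1.0, 1.7, 0)),
          ((0, 1.7, 0), (1.2, 1.7, 0)),
          ((0, 1.7, 0), (1.4, 1.7, 0))]
print(round(mean_distance(frames), 2))  # → 1.2
```

A real system would read these positions from the tracking cameras at 60+ Hz; the arithmetic, however, is exactly this simple.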

Two fellows wearing VR headsets, playing games using handheld controllers

Dr. Sunbin Song's colleagues test the HTC Vive at home and in the Clinical Center


From bench to bedside

The speed at which VR has developed can prove both challenging and exciting. New graphics cards, games, applications, and platforms allow increasingly realistic physics simulations.

Dr. Sunbin Song, a research fellow in the Human Cortical Physiology Section at NINDS, is familiar with the benefits and challenges of advanced VR. She is using VR to study motor learning and to investigate the potential for remote, home-based rehabilitation paradigms in a VR environment. Dr. Song began taking content development classes, led by Dr. John Ostuni, on the Unity game engine. Together, the pair developed a driving simulation task with the Oculus Rift DK2 and a steering wheel. When they realized the application caused motion sickness, they switched to a ball-sorting task using the HTC Vive HMD and controllers.

Drs. Song and Ostuni are currently exploring how VR technology can be used for both diagnostic and training purposes. The long-term advantage of using commercial technology is that patients could receive physical therapy outside a hospital setting, more often and for longer, because the process would be cheaper and would make fewer demands on limited resources such as space. Patients and medical professionals separated by long distances could interact in virtual space, opening up the possibility of global healthcare.

Looking to the future, Dr. Song is exploring social VR platforms, such as High Fidelity (highfidelity.io), that actively integrate information from sensors tracking facial expressions and other signals to animate avatars in real time. These tools allow not only more realistic person-to-person interaction, but could also enable feedback-controlled regulation of the VR environment in rehabilitation paradigms. Researchers can also measure other physiological characteristics, such as galvanic skin response, blood pressure, or heart rate; they need only build those functions into the application and environment.
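At its core, such feedback-controlled regulation is just a loop that maps a physiological reading onto an environment parameter. The sketch below is hypothetical: the heart-rate thresholds, step size, and the idea of scaling task speed are illustrative assumptions, not a published rehabilitation protocol.

```python
def adjust_task_speed(heart_rate_bpm, current_speed,
                      low=60, high=100, step=0.1):
    """Feedback rule: ease off when heart rate runs high, push when low.

    Returns a new task-speed multiplier clamped to [0.5, 2.0].
    Threshold and step values are illustrative placeholders.
    """
    if heart_rate_bpm > high:
        current_speed -= step   # participant stressed: slow the task down
    elif heart_rate_bpm < low:
        current_speed += step   # participant under-challenged: speed it up
    return max(0.5, min(2.0, current_speed))

# Simulated heart-rate stream during a session; the VR application
# would re-run this rule each time a new sensor sample arrives.
speed = 1.0
for hr in [72, 105, 110, 58]:
    speed = adjust_task_speed(hr, speed)
print(round(speed, 1))  # → 0.9
```

The same pattern extends to any sensor the researcher wires in: swap heart rate for galvanic skin response and task speed for, say, ambient noise in the virtual scene.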

A presenter shows how to create a virtual reality avatar

A recent VARIG meeting covered avatar creation via services such as www.makehumancommunity.org.


Complex 3D datasets

A unique and interesting use of Virtual Reality is exploring complex 3D datasets. Dr. Damian Dalle Nogare in the lab of Dr. Ajay Chitnis is collaborating with Dr. Harry Burgess to develop a VR environment for visualizing whole-brain scans of zebrafish embryos. They decided early on to build on the Google Cardboard platform for the most widespread distribution without the need for specialized and expensive hardware.

Brain scans are first converted to 3D objects and then imported into Unity3D, which the team is using to build the VR Brain Browser. Inside the VR environment, users can move around within the brain, examining how various structures relate spatially to one another, and can toggle individual brain regions on or off without leaving the VR space. The relative simplicity of the zebrafish brain allows an observer, for example, to visualize in 3D how the optic nerve connects to the optic tectum. The team hopes to include gene expression patterns and brain activity in future generations of the application.
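The conversion step—turning scan data into a 3D object a game engine can load—can be sketched in a few lines. The toy Python version below writes one OBJ quad per exposed face of a binary voxel volume; it is a stand-in illustration, not the Brain Browser’s actual pipeline, and a real brain scan would use a proper surface-extraction algorithm such as marching cubes.

```python
# Toy voxel-to-OBJ converter: emits one quad per exposed face of a
# binary voxel volume, producing a simple mesh Unity3D can import.

CUBE_FACES = {  # outward direction -> four corner offsets, wound CCW
    ( 1, 0, 0): [(1, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 1)],
    (-1, 0, 0): [(0, 0, 0), (0, 0, 1), (0, 1, 1), (0, 1, 0)],
    ( 0, 1, 0): [(0, 1, 0), (0, 1, 1), (1, 1, 1), (1, 1, 0)],
    ( 0,-1, 0): [(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)],
    ( 0, 0, 1): [(0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)],
    ( 0, 0,-1): [(0, 0, 0), (0, 1, 0), (1, 1, 0), (1, 0, 0)],
}

def voxels_to_obj(filled):
    """filled: set of (x, y, z) integer coordinates of occupied voxels."""
    verts, vert_index, faces = [], {}, []
    for vx in sorted(filled):
        for (dx, dy, dz), corners in CUBE_FACES.items():
            if (vx[0] + dx, vx[1] + dy, vx[2] + dz) in filled:
                continue  # neighbor occupied: face is interior, skip it
            face = []
            for c in corners:
                v = (vx[0] + c[0], vx[1] + c[1], vx[2] + c[2])
                if v not in vert_index:
                    vert_index[v] = len(verts) + 1  # OBJ indices are 1-based
                    verts.append(v)
                face.append(vert_index[v])
            faces.append(face)
    lines = ["v %d %d %d" % v for v in verts]
    lines += ["f %d %d %d %d" % tuple(f) for f in faces]
    return "\n".join(lines)

# A lone voxel becomes a cube: 8 vertices and 6 quad faces.
obj = voxels_to_obj({(0, 0, 0)})
print(sum(l.startswith("f ") for l in obj.splitlines()))  # → 6
```

Writing the result to a `.obj` file and dropping it into a Unity project’s Assets folder is enough to get the mesh into a scene.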

Virtual Reality challenges

The market for VR is changing quickly and faces several challenges, including:

  • Motion sickness: VR can make people feel ill, which limits how long they use it and dampens enthusiasm.
  • Space: systems that let people move around require a dedicated physical area.
  • Cost: while pricing has dropped, most systems are still expensive or require an expensive PC.
  • Content: titles are still limited, although the tools for creating custom content are powerful and expanding.

Dr. Song and Dr. Persky will be demonstrating VR technology at the IRP Intramural Research Festival in the NIH Library, and the Brain Browser will be available to try at the booth, located at the South Entrance of the Clinical Center on September 14–16 (Wednesday–Friday). Be sure to check out their demos and the Virtual and Augmented Reality Interest Group (https://list.nih.gov/cgi-bin/wa.exe?A0=varig).


Virtual Reality Hardware

Many companies create hardware for Virtual Reality. Here we focus on a sampling of the current leaders in the Virtual Reality space, at several levels of sophistication:

Google Cardboard

Google “Cardboard” is exactly that: a sheet of cardboard, folded according to a template and fitted with two magnifying lenses and a simple button, into which a smartphone is inserted. The platform launched in June 2014 as a low-cost way to create an immersive environment, using the display, processor, and motion sensors of a modern smartphone to render the scene and track head motion. Google also launched an accompanying mobile app, which serves as a main menu and allows navigation while wearing the device. While this is not the best VR experience, the low cost (retail $15) has enabled the distribution of 5 million units.

Samsung Gear VR

This (mostly plastic) device also offers mobile phone-based VR, but it is specifically designed to work with several models of Samsung Galaxy smartphones. It includes a touchpad and support for applications on the Oculus Store, for a vastly improved (but more expensive) experience compared to Google Cardboard.

Oculus Rift

This device is fully dedicated to Virtual Reality rather than repurposing a phone. It helped popularize consumer VR through demonstrations at the E3 conference in 2012, followed shortly by a Kickstarter campaign; Facebook purchased the company in 2014 for a cool $2 billion. Two developer versions (DK1 and DK2) preceded the consumer version (CV1), which finally launched in March 2016 for $599.

Featuring only one tracking camera, the Oculus Rift is designed for seated and standing experiences, with some movement possible within roughly a 5’ x 5’ space. Control is currently limited to a keyboard, a handheld Xbox One game controller, or the Leap Motion hand-tracking camera, but Oculus plans to release the motion-tracked “Oculus Touch” controllers, which ship with an additional tracking camera to expand the movement area and add tracking of the controllers. The Oculus Rift requires a PC that works with the Oculus Store and has a good video card (est. $1,200).

HTC Vive

The Vive launched about a week after the Oculus Rift CV1, with a similar experience and slightly higher price point ($799). It features expandability to two handheld controllers and room-scale tracking (through 70 sensors on the HMD and two base stations mounted either on a camera stand or attached to the wall) in a 15’ x 15’ space. Like the Oculus Rift, the Vive requires a PC that meets specifications. It works with Valve’s SteamVR platform to deliver games and VR content.

Sony Playstation VR

Set to launch in October 2016, Sony Playstation VR will likely become the most ubiquitous VR system ($399); it relies on the PS4 instead of a PC and uses the Playstation Network to deliver games.

Application Development

Several game development platforms exist for creating Virtual Reality content. The two leading game engines, Unity and Unreal Engine, are popular because they are easy to use, affordable, and able to target many platforms, from mobile phones to PCs.

NIH’s Dr. John Ostuni (ostunij@ninds.nih.gov) taught classes on the use of Unity over the summer of 2014. He now provides assistance and teaches the use of emerging technologies through the VARIG scientific interest group, which meets on the third Friday of each month. Join the VARIG listserv at https://list.nih.gov/cgi-bin/wa.exe?A0=varig.