By Jeremy Swan
"Plus ça change, plus c'est la même chose" (Jean-Baptiste Alphonse Karr, Les Guêpes, January 1849), commonly translated as “The more things change, the more things stay the same,” seems to hold true for virtual reality (VR) technologies. Strap on your headsets as we break down highlights from recent Virtual and Augmented Reality Interest Group (VARIG) meetings covering the history of VR, how NIH researchers are contributing to the development of privacy guidelines on the world stage, use of VR for addiction therapy, and visualization of data and 3D resources for communicating science in virtual and augmented reality. Hang on to your controllers; we’re going in!
History of Virtual Reality
While VR is taking the world by storm as the buzziest new mainstream technology, the most recent iterations are only the latest in a spectrum of immersive technologies intended to transport a viewer to another place.
At the October 2018 VARIG meeting, Dr. John Ostuni, a staff scientist in the National Institute of Neurological Disorders and Stroke (NINDS), dove into the history (and prehistory, and pre-prehistory) of VR, beginning 17,000 years ago in Lascaux, where prehistoric peoples created depictions of animals in a cave in what is now the French countryside. Panoramic paintings covering every surface in the visual field, accessible only by descending into the cave by lantern light, could be considered one of the first uses of “virtual reality.” Whether the cave was used for religious ceremonies or for sharing knowledge about hunting, we will never know, but it would certainly have served as an immersive experience, transporting visitors from the real world into a pictorial representation.
Moving forward in time, Sir Charles Wheatstone, an English scientist and inventor, described an effect reproducing stereovision in an 1838 paper about tricking the brain into seeing two-dimensional images in three dimensions by presenting each eye with a slightly different perspective. And many readers will remember the View-Master, which launched exactly one hundred years later, in 1938. With 1.5 billion reels of color photographs sold, it may be the most successful stereovision-based product ever.
After describing these first stereovision-based products, Dr. Ostuni highlighted early iterations of VR, beginning with the first head-mounted display, called “The Sword of Damocles,” which was so heavy that it had to be suspended from the ceiling but still allowed the wearer to walk around. The Air Force’s Visually Coupled Airborne Systems Simulator (VCASS), completed in 1982, was another innovative system, one that would put the Atari to shame.
Part one of the two-part “History of VR” presentation ended with the success of VR via Google Cardboard, Oculus Rift, HTC Vive, and Samsung Gear VR. We also learned that the Brendan Iribe Center for Computer Science and Innovation at the University of Maryland in College Park is opening soon (collaboration, anyone?), funded by—you guessed it—Brendan Iribe, the co-founder of Oculus, with a $31 million donation in 2014.
Part two of the “History of VR” session continues in January. Join the listserv at https://oir.nih.gov/sigs/virtual-augmented-reality-scientific-interest-group-varig for more information.
VR and Privacy Concerns
One of the biggest concerns in the virtual and augmented reality (VAR) development community is privacy, shared Dr. Susan Persky, an investigator with the National Human Genome Research Institute (NHGRI), who utilizes VR technology in her research. Dr. Persky presented to a full house at the November 2018 VARIG meeting held at the NIH Library.
Fresh from back-to-back VR meetings in Palo Alto, California, and at the World Economic Forum in Dubai, United Arab Emirates, she shared insights into some of the topics that leaders in the VR community are contemplating, in particular those discussed at the Palo Alto meeting.
VR systems currently track 18 “degrees of freedom” (position and orientation for the headset and each of two controllers), in addition to the room-mapping cameras embedded in the most popular headsets. Twenty minutes in VR produces some two million data points, which could potentially be analyzed by third parties, such as advertisers. It’s possible to “fingerprint” who is using VR, for instance, by analyzing distances between the controller and headset. Inferences could potentially be made about a person’s gender, cognitive abilities, sexuality, and physical fitness, to name a few possibilities. The popular Netflix sci-fi series “Black Mirror” has explored how such technologies could be used for nefarious purposes. With emerging capabilities such as eye and facial tracking, it may be possible to know more about people than they know about themselves, and then use that information in real time for persuasion.
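To make the fingerprinting idea concrete, here is a minimal sketch, purely illustrative and not drawn from any real VR SDK, of one such biometric feature: the average headset-to-controller distance, which tends to be stable per user because it reflects arm length and posture.

```python
import math

def mean_headset_controller_distance(samples):
    """Average Euclidean distance between headset and controller positions.

    samples: list of ((hx, hy, hz), (cx, cy, cz)) position pairs in meters,
    one per tracked frame. A stable per-user average like this is one
    example of a feature that could help identify who is wearing a headset.
    """
    total = 0.0
    for (hx, hy, hz), (cx, cy, cz) in samples:
        total += math.sqrt((hx - cx) ** 2 + (hy - cy) ** 2 + (hz - cz) ** 2)
    return total / len(samples)

# Two hypothetical users: longer vs. shorter reach yields distinct averages.
user_a = [((0, 1.7, 0), (0.1, 1.2, 0.4)), ((0, 1.7, 0), (0.2, 1.1, 0.5))]
user_b = [((0, 1.5, 0), (0.1, 1.1, 0.3)), ((0, 1.5, 0), (0.1, 1.0, 0.3))]
```

A real fingerprinting pipeline would combine many such features over time; the point is simply that ordinary motion telemetry can double as an identifier.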
Leaders in the VR community were recently invited to the World Economic Forum to participate in discussions around the “Fourth Industrial Revolution,” which builds on digital technology and encompasses, but is not limited to, artificial intelligence, VAR, 3D printing, autonomous vehicles, and quantum computing. Dr. Jeremy Bailenson, Director of the Stanford Virtual Human Interaction Lab, together with Sandra Lopez, Senior Vice President of Immersive Experiences at Intel Corporation, co-chaired the Global Future Council on Virtual and Augmented Reality to explore ways in which privacy could be protected, including decentralized identity, a new construct modeled on the Institutional Review Boards (IRBs) used in clinical research, or something akin to the European Union’s General Data Protection Regulation.
Objectives of the group included outlining solutions for mitigating risks created by emerging VAR technologies. For example, creating a privacy agreement template for data-tracking or generating agreements for collected data to expire and not be stored indefinitely could go a long way in protecting privacy. Additionally, an application or plug-in could be developed to help consumers visualize and understand what data is being tracked (akin to the open source Mozilla Lightbeam plug-in for Firefox). Another approach would be to incentivize the protection of privacy, allowing monetization alongside controlled access to collected data.
VR for Therapy
Noah Robinson, a former postbac fellow at NIH, is now a clinical psychology graduate student with the Hollon Research Group at Vanderbilt University, where he is exploring how VR can be used to treat addiction. While presenting to the VARIG group in November, Noah explained that he has been testing the use of VR at an inpatient rehabilitation center.
VR therapies, which utilize techniques such as cognitive behavioral immersion, are affordable, accessible, and scalable. Noah recently received a National Science Foundation travel grant to conduct 100 interviews on the use of VR to treat addiction. It’s generally faster and easier for someone with an opioid use disorder to reach a therapist or support virtually than it is to visit a therapist in person. Entering VR also changes a person’s perception of their physical space, allowing them to “leave” an environment that contributes to negative feelings or triggers for using. Other therapeutic uses of VR include exposure to carefully controlled, negatively arousing stimuli, therapy for post-traumatic stress disorder (PTSD), and exposure and response prevention for drug use. Noah recently founded his own company, Very Real Help, LLC, to quickly develop applications (apps) for engaging in VR therapy.
VR at NICHD — Zebrafish Brain Browser 2.0
The Zebrafish Brain Browser (zbbrowser.com) app, developed in the Burgess lab, allows users to search for and visualize transgene and gene expression patterns in larval zebrafish brains. As part of the app, the Burgess lab has developed a VR environment for volumetric rendering of image stacks representing whole-brain scans of zebrafish embryos. The lab decided early on to build on the Google Cardboard platform for the widest possible distribution without the need for specialized, expensive hardware.
Chris Hurt, a computer science undergraduate at Virginia Tech, worked in the Burgess lab as a summer student to give the Zebrafish Brain Browser app a major facelift and add new features, including “spatial search,” which lets users select a region of interest and then sort expression lines by the level of expression data (volume occupancy) contained within it. The first iteration of the app relied on the user to convert brain scans to 3D objects before viewing them in VR, but Chris’s work allows imaging data to be manipulated directly within the browser window. Now users can turn various brain regions on or off, change colors, adjust brightness and contrast for each dataset, and even upload their own image stacks to compare against other lines in the browser; the number of expression lines available has grown to 300.
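As a rough sketch of how such a spatial search could work (the function names, data layout, and thresholding step here are hypothetical, not the app’s actual code), one can score each expression line by the fraction of region-of-interest voxels its pattern occupies, then sort the lines by that score:

```python
def volume_occupancy(stack, roi_mask, threshold=0.5):
    """Fraction of region-of-interest voxels where intensity exceeds threshold.

    stack and roi_mask are nested lists indexed (z, y, x) with equal shapes;
    roi_mask holds booleans, stack holds intensities normalized to [0, 1].
    """
    inside = hits = 0
    for z in range(len(stack)):
        for y in range(len(stack[z])):
            for x in range(len(stack[z][y])):
                if roi_mask[z][y][x]:
                    inside += 1
                    if stack[z][y][x] > threshold:
                        hits += 1
    return hits / inside if inside else 0.0

def spatial_search(stacks, roi_mask):
    """Rank expression lines (name -> stack) by descending occupancy in the ROI."""
    scored = [(name, volume_occupancy(s, roi_mask)) for name, s in stacks.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

A production version would use volumetric arrays (e.g., NumPy) rather than nested-list loops, but the ranking logic is the same: mask, count, sort.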
The browser is open source and completely free; other groups can use the code directly. Find it here on GitHub: https://github.com/BurgessLab/ZebrafishBrainBrowser. Learn more about Zebrafish Brain Browser on the Burgess lab website, plus check out an example publication highlighting the app!
Interested in VR?
If you’d like to learn more about virtual and augmented reality at the NIH, check out the following resources and articles: