


Virtual Human Interaction Lab (IHP)

July 20, 2016 @ 3:00 pm - 4:00 pm


Event Description

The mission of the Virtual Human Interaction Lab is to understand the dynamics and implications of interactions among people in immersive virtual reality (VR) simulations and other forms of human digital representation in media, communication systems, and games. Researchers in the lab are most concerned with understanding the social interaction that occurs within the confines of VR, and the majority of our work centers on using empirical, behavioral science methodologies to explore how people interact in these digital worlds. Oftentimes, however, it is necessary to develop new gesture-tracking systems, three-dimensional modeling techniques, or agent-behavior algorithms in order to answer these basic social questions. Consequently, we also engage in research geared toward developing new ways to produce these VR simulations.

Our research programs tend to fall under one of three larger questions:

  1. What new social issues arise from the use of immersive VR communication systems?
  2. How can VR be used as a basic research tool to study the nuances of face-to-face interaction?
  3. How can VR be applied to improve everyday life, such as legal practices and communication systems?

The VHIL VR Lab integrates technology that can stimulate three of the human senses at once: spatialized sound, virtual touch (haptics), and three-dimensional imagery. The lab features cutting-edge equipment for tracking motion, rendering graphics, and displaying visual, aural, and haptic information. Kornberg Associates Architects collaborated with Worldviz LLC, the world’s leading virtual reality creators, and Stanford University to renovate all aspects of the lab.

The lab includes a multisensory room that allows participants to explore a 20-by-20-foot space with spatialized sound, floor shakers, and a new head-mounted display (HMD). The HMD is tethered to a ceiling mount and allows an unrestricted range of motion in the multisensory room. In addition, the spatialized sound system makes sounds “move” around the lab space and allows users to hear natural, realistic aural information without relying on headphones. Surface motion is incorporated into experiments through the installation of floor shakers: controlled vibrations give users the sensation of movement, making the virtual world more compelling. At one moment, a user may feel immersed in an earthquake simulation, while the next moment the sensations may simulate crossing a suspension bridge.

A two-walled Cave Automatic Virtual Environment (CAVE) is another integral part of the lab. The CAVE consists of 70-inch-diagonal 3D displays, so colors and definition are crisper than on typical projection systems. In contrast with HMD-based virtual reality, the CAVE permits less range of motion but allows users to experience virtual environments without wearing any hardware.

All of these features are controlled from a single room that can simultaneously operate the activities in four separate virtual reality simulation spaces. This integrated design increases the efficiency of experiments and allows researchers greater control over the different settings in these virtual worlds.

Directions: The lab is at the end of Palm Drive on the Stanford campus, in the Main Quad. The address is 450 Serra Mall, Stanford, CA. We are in McClatchy Hall, Building 120. As you approach the quad from the top of the oval at the end of Palm Drive, McClatchy is the first building facing front, left of center. Enter the building and take the hallway to the left, then take the elevator to the fourth floor. The entrance to the lab is through the glass doors directly outside of the elevator, room 4

Details

Date:
July 20, 2016
Time:
3:00 pm - 4:00 pm

Venue

Virtual Human Interaction Lab
450 Serra Mall, McClatchy Hall, Building 120
Stanford, CA 94305 United States