An actor portrays a participant in a new study of the impact of augmented reality on social interactions. The area inside the dotted line is the field of view of the augmented reality goggles, which shows digital content such as avatars. Credit: Mark Miller/Stanford Human Interaction Lab

Neal Stephenson's influential 1992 sci-fi novel Snow Crash offered a fairly dystopian vision of a future virtual-reality-based Internet known as the "Metaverse" and is widely credited with bringing the term "avatar" into mainstream culture. Stephenson called people who remained publicly plugged in around the clock via wearable computer gear "gargoyles," and he derided the adverse impact of that level of immersion on social behavior. "Gargoyles are no fun to talk to," he wrote. "They never finish a sentence. They are adrift in a laser-drawn world."

We are now in the early 21st century, the era in which the novel is set, and we don't yet have a fully immersive VR Internet. But smartphones are ubiquitous, and augmented reality (AR) is already here, most notably in popular games like Pokémon Go and the Microsoft Hololens AR interactive crime drama Fragments. It seems Stephenson wasn't far off the mark. According to researchers at Stanford University, layering computer-generated content, like someone's avatar, onto a real-world environment will influence people's behavior as if that person were really present. The researchers described the results of three recent experiments on the impact of AR on social interactions in a new paper in PLOS ONE.

Quite a lot of research has studied the psychological impacts of both rudimentary virtual worlds like Second Life and fully immersive VR experiences—a good chunk of it conducted in co-author Jeremy Bailenson's Virtual Human Interaction Lab at Stanford. One of the first simulations Bailenson created involved a virtual, gaping pit in the middle of a simulated "room" with a board laid across it. Test subjects, outfitted in full VR gear, were instructed to walk on the board across the pit. Even though they knew consciously that the pit wasn't real (because they had seen the real-world version), they still reacted as if the pit were really there. Some teetered uncertainly, some fell down, some ran away, some screamed in fear—a testament to the power of digital illusions.

The Microsoft Hololens interactive crime drama Fragments scans your surroundings to create a detailed map of the space and impose AR elements onto it. Credit: YouTube/Microsoft Hololens

Bailenson's experiments studying the extent to which our avatars are an extension of ourselves showed that inhabiting a virtual world can impact our behavior offline, too. He found that watching your digital avatar running on a treadmill, for example, makes you more likely to exercise offline as well. This so-called "Proteus effect" is even stronger when you watch your avatar become thinner or heavier in response to behavioral choices, such as eating carrots versus candy, or exercising versus standing still.

The more we identify with our avatars, the more strongly we will respond. Spend enough time with an avatar that looks like us, and the lines between our real and virtual identities begin to blur. It only takes 20 minutes of exposure to produce changes in behavior. If something bad happens to your avatar in a social context, it seems to engage the same neural circuitry that is engaged when something happens to the actual you in a social context.

For this latest series of experiments, Bailenson switched his focus to AR. "I've been in the VR and AR space since 1999, and I always thought AR was a cool technology but never had that 'aha' moment," he said. Then two years ago, one of his graduate students put together an AR demo space that changed his mind. He discovered that the current crop of AR goggles can project highly realistic 3D versions of a real person, in real time, onto the physical surroundings of the person wearing the goggles.

"How does the world change when people are basically seeing ghosts all the time?"

"There's something about that shared common ground that made me realize this kind of demo could really make AR flourish," Bailenson said. "The potential benefits are astronomical. All of a sudden, I don't need to fly halfway across the world for an hour-long meeting, because [with AR] it feels like somebody is really here, with eye contact and posture and all these other [social] cues you don't really get over video conference. This is a big deal if we can get this right." AR games like Fragments are just scratching the surface of what might soon be possible in the realm of entertainment. "In a couple of years, whatever the new version of The Shining is, you're going to have those twins literally crawling up your bed," he said.

A thorough, brute-force survey of the academic literature revealed prior studies looking at the effects of smartphone use on social interactions. (Not surprisingly, those studies showed that people rated conversations in the absence of smartphones more satisfying and of higher quality than conversations where one person actively used their phone.) And there have been numerous papers investigating the ergonomics of AR and using AR as an educational tool. But nobody had yet looked into how AR might alter social interaction.

Bailenson saw a unique opportunity to get ahead of the rapid technological advances and study both the costs and benefits early on, particularly looking at social norms and behavior, before AR becomes as ubiquitous as smartphones. "How does the world change when people are basically seeing ghosts all the time?" he said.

Bailenson's group recruited 218 subjects for three studies. In the first two scenarios, a virtual avatar named Chris sat in one of two real chairs, and subjects were required to interact with him. One experiment focused on social inhibition, the phenomenon whereby many people struggle with more challenging tasks if someone is observing them. The same held true in an AR setting: subjects performed worse on a challenging task if Chris was in their AR field of vision.
