XR: dreaming with your eyes open

In the future, XR—a fusion of virtual, augmented and mixed reality—will blur the lines between the virtual and the real worlds. Our guest authors describe what this means both for individuals and society as a whole.

XR means much more than just a larger screen. In the words of Dr. Brennan Spiegel, XR “is like dreaming with your eyes open.” These technologies generate an intense experience known as “presence”: virtual scenes, objects, and characters appear lifelike, even magical. The technology takes you into an immersive experience that feels like a parallel reality. Over the next twenty years, XR will revolutionize entertainment, training, retail, healthcare, sports, and travel. An immersive experience should be one in which the user experiences the same sensations he or she would in a real-life environment and is unable to distinguish between what is real and what is synthesized.

"We must fool our most acute sense, our sight"

In order for such a sensory experience to be realistic, we must fool our most acute sense: our sight. Beyond XR glasses, I believe XR contact lenses may be the first XR technology to achieve the milestone of mass acceptance. Several start-ups are already working to develop XR contact lenses, and their prototypes show that displays and sensors can be embedded in contact lenses, making text and images visible to the wearer. These lenses still require an external processor, a role a mobile phone can fill. By 2041, we anticipate that the “invisibility” of contact lenses will finally win the market over, and that challenges such as cost, privacy, and regulation will be overcome.
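
To make this division of labor concrete, the following TypeScript sketch imagines the message flow between a lens and its paired phone: the lens streams raw sensor readings, the phone does the computation, and only simple draw commands travel back. The protocol, type names, and numbers are all invented assumptions, not any real product’s API.

    // Hypothetical sketch: an XR contact lens streams sensor data to a
    // paired phone, which computes and returns simple draw commands.

    interface LensSensorFrame {
      gazeDirection: [number, number];  // where the eye is pointing
      ambientLux: number;               // ambient brightness at the eye
      timestampMs: number;
    }

    interface LensDrawCommand {
      text: string;              // text overlay to render on the lens
      anchor: [number, number];  // position in the field of view
      brightness: number;        // display brightness, 0..1
    }

    // Runs on the phone: turns sensor data into something the lens's tiny
    // display controller can execute without computing anything itself.
    function processOnPhone(frame: LensSensorFrame): LensDrawCommand {
      return {
        text: "Meeting in 10 min",                    // e.g. a notification
        anchor: frame.gazeDirection,                  // follow the gaze
        brightness: Math.min(frame.ambientLux / 10000, 1),
      };
    }

    console.log(processOnPhone({ gazeDirection: [0.1, -0.2], ambientLux: 5000, timestampMs: 0 }));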

[Image: XR, 2023, Porsche AG]

If visual input is provided by glasses and contact lenses, audio input can be delivered through earsets, which improve with every year. By 2030, good earsets should be almost invisible, thanks to bone conduction, omni-binaural immersive sound, and other technologies, perhaps to the extent that they could be comfortably worn all day. This combination is likely to be sufficient to evolve into an “invisible smartstream” (or the smartphone of 2041).

Summon your smartstream

When you summon your smartstream, the visual display covers your field of view, perhaps semi-transparently. You could manipulate smartstream content and apps using gestures, like Tom Cruise’s character in the movie Minority Report. Smartstream sound will be heard through your “invisible earsets,” and the system will be operated by voice, gestures, and fingers typing “in the air.”

This ever-present XR smartstream can do more than a smartstream (or mobile phone) with a screen. It can remind you of the name of an acquaintance you run into, alert you when a nearby store has what you want to buy, translate for you when you travel abroad, and guide you to safety in a natural disaster. Beyond the usual “six senses,” our body can “feel” sensations such as wind and an embrace, as well as warmth, cold, vibration, and pain. Haptic gloves will allow you to virtually pick up objects and feel them, and somatosensory (sometimes also called haptic) suits can make you feel cold or hot, or even that you are being punched or caressed.
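
As a toy illustration of how such feedback might be driven in software, here is a TypeScript sketch that maps a contact event from a physics simulation onto vibration and temperature commands for a glove or suit. The types, body regions, and safety limits are invented assumptions, not a real haptics API.

    type BodyRegion = "palm" | "fingertips" | "chest" | "back";

    // A touch reported by the virtual world's physics simulation
    interface ContactEvent {
      region: BodyRegion;
      forceNewtons: number;        // how hard the virtual contact is
      surfaceTempCelsius: number;  // temperature of the virtual surface
    }

    // What the glove or suit hardware would actually execute
    interface ActuatorCommand {
      region: BodyRegion;
      vibrationIntensity: number;  // 0..1
      thermalTargetCelsius: number;
    }

    function toActuatorCommand(e: ContactEvent): ActuatorCommand {
      return {
        region: e.region,
        // Scale force into the actuator's 0..1 range, saturating at 50 N
        vibrationIntensity: Math.min(e.forceNewtons / 50, 1),
        // Clamp temperature to a range that is safe against the skin
        thermalTargetCelsius: Math.max(10, Math.min(e.surfaceTempCelsius, 40)),
      };
    }

    // A virtual handshake: moderate pressure, body-warm surface
    console.log(toActuatorCommand({ region: "palm", forceNewtons: 12, surfaceTempCelsius: 34 }));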

[Image: XR, 2023, Porsche AG]

The devices above stimulate our perception, but how do we provide input and control XR? Today, the typical input device for XR is a handheld controller, similar to an Xbox controller but usually one-handed. These are easy to learn and use, but they feel increasingly unnatural as the rest of the experience becomes immersive and lifelike. The ideal future input should be purely natural: eye tracking, movement tracking, gesture recognition, and speech understanding will be integrated to become the primary inputs.
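
To sketch what integrating those modalities might look like, the following TypeScript example fuses a gaze target with a gesture or a spoken phrase into a single command: the eyes select the object, the hand or voice supplies the verb. Every type, the verb mapping, and the 500 ms fusion window are hypothetical assumptions, not a real XR input API.

    interface GazeSample { targetId: string | null; timestampMs: number; }
    interface GestureEvent { kind: "pinch" | "swipe" | "open-palm"; timestampMs: number; }
    interface SpeechEvent { transcript: string; timestampMs: number; }

    interface Command { action: string; targetId: string; }

    // A gesture or utterance only counts if it follows the gaze closely
    const FUSION_WINDOW_MS = 500;

    function fuse(gaze: GazeSample, gesture?: GestureEvent, speech?: SpeechEvent): Command | null {
      if (!gaze.targetId) return null;  // the eyes picked nothing

      // Prefer an explicit spoken verb ("open", "delete", ...) if one
      // arrived within the fusion window
      if (speech && speech.timestampMs - gaze.timestampMs < FUSION_WINDOW_MS) {
        return { action: speech.transcript.split(" ")[0].toLowerCase(), targetId: gaze.targetId };
      }
      // Otherwise fall back to a gesture-to-verb mapping
      if (gesture && gesture.timestampMs - gaze.timestampMs < FUSION_WINDOW_MS) {
        const verbs = { pinch: "select", swipe: "dismiss", "open-palm": "menu" };
        return { action: verbs[gesture.kind], targetId: gaze.targetId };
      }
      return null;
    }

    // "Look at the photo and say 'Open'"
    console.log(fuse(
      { targetId: "photo-42", timestampMs: 1000 },
      undefined,
      { transcript: "Open this", timestampMs: 1200 },
    ));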

Content creation as one major obstacle

One major obstacle to achieving these experiences is content creation. Content creation in an XR environment is similar to creating a complex 3D game: it must encompass all permutations of user choices, model the physics of real and virtual objects, simulate the effects of light and weather, and deliver lifelike renderings. This level of complexity is far greater than what is needed to make today’s video games and apps.

If we wear devices like glasses or contact lenses all day long, then we are capturing the world every day. On the one hand, it is wonderful to have this “infinite memory repository.” If a customer wants to renege on a commitment, we will be able to search for and find the video of his or her promise.
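
As a toy illustration of that kind of retrieval, the TypeScript sketch below searches timestamped transcripts in a hypothetical lifelog for the moment a promise was made. A real system would index the audio and video itself; the data model and search function here are invented for illustration.

    // Hypothetical lifelog entry: one utterance captured by the glasses,
    // transcribed and linked back to the raw recording
    interface LifelogEntry {
      timestamp: Date;
      speaker: string;
      transcript: string;
      videoRef: string;  // pointer into the stored video
    }

    function findUtterances(log: LifelogEntry[], speaker: string, keyword: string): LifelogEntry[] {
      const needle = keyword.toLowerCase();
      return log.filter(
        (e) => e.speaker === speaker && e.transcript.toLowerCase().includes(needle),
      );
    }

    const log: LifelogEntry[] = [{
      timestamp: new Date("2041-03-02T14:05:00Z"),
      speaker: "customer",
      transcript: "I promise to confirm the order by Friday.",
      videoRef: "capture/2041-03-02/clip-0042",
    }];

    // Retrieve every clip in which the customer said "promise"
    console.log(findUtterances(log, "customer", "promise"));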

But do we really want every word we say to be stored? What if this data falls into the wrong hands? Or is used by an application we trust, but one that has unknown externalities?

The bottom line is that by 2041, much of our work and play will involve the use of virtual technologies. We should orient ourselves to this inevitability. There will be giant XR breakthroughs, probably starting in entertainment. All industries will eventually embrace XR, and struggle with how to use it, just as they do with AI today. If AI turns data into intelligence, XR will collect a greater quantity of data from humans at a higher quality—from our eyes, ears, limbs, and eventually our brains. Together, AI and XR will complete our dream to understand and amplify ourselves—and, in the process, expand the possibilities of the human experience.

Info

Text first published in the Porsche Engineering Magazine, issue 1/2023

Text: Kai-Fu Lee, Qiufan Chen

The text is based on excerpts from the book ‘AI 2041’ (Publisher: Campus). 

Copyright: All images, videos and audio files published in this article are subject to copyright. Reproduction in whole or in part is not permitted without the written consent of Dr. Ing. h.c. F. Porsche AG. Please contact newsroom@porsche.com for further information.
