Optimism view

Extended reality (XR) is a term referring to all real-and-virtual combined environments and human–machine interactions generated by computer technology and wearables, where the 'X' represents a variable for any current or future spatial computing technologies.[1] It includes representative forms such as augmented reality (AR), mixed reality (MR) and virtual reality (VR)[2] and the areas interpolated among them. The levels of virtuality range from partial sensory input to fully immersive virtuality, also called VR.

XR is a superset that includes the entire spectrum from "the complete real" to "the complete virtual" in the reality–virtuality continuum introduced by Paul Milgram. Still, its connotation lies in the extension of human experience, especially relating to the sense of existence (represented by VR) and the acquisition of cognition (represented by AR). With the continuous development of human–computer interaction, this connotation is still evolving.

XR is a rapidly growing field with applications in a wide range of areas, such as entertainment, marketing, real estate, training and remote work.[3]

A wearable computer, also known as a wearable or body-borne computer,[1][2] is a computing device worn on the body.[3]

The definition of 'wearable computer' may be narrow or broad, extending to smartphones or even ordinary wristwatches.[4][5]

Wearables may be for general use, in which case they are just a particularly small example of mobile computing. Alternatively, they may be for specialized purposes such as fitness tracking. They may incorporate special sensors such as accelerometers, thermometers and heart rate monitors, or novel user interfaces such as Google Glass, an optical head-mounted display controlled by gestures. It may be that specialized wearables will evolve into general all-in-one devices, as happened with the convergence of PDAs and mobile phones into smartphones.

Wearables are typically worn on the wrist (e.g. fitness trackers), hung from the neck (like a necklace), strapped to the arm or leg (smartphones when exercising), or worn on the head (as glasses or a helmet), though some have been located elsewhere (e.g. on a finger or in a shoe). Devices carried in a pocket or bag – such as smartphones and, before them, pocket calculators and PDAs – may or may not be regarded as 'worn'.

Wearable computers face various technical issues common to mobile computing more generally, such as battery life, heat dissipation, software architecture, wireless and personal area networks, and data management.[6] Many wearable computers are active all the time, e.g. continuously processing or recording data.
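Continuous, always-on recording under the tight memory and battery budgets mentioned above is commonly handled with a fixed-size ring buffer that retains only the most recent samples. A minimal sketch (the heart-rate sensor and capacity here are purely illustrative):

```python
from collections import deque

class SensorLog:
    """Keep only the most recent `capacity` samples (ring buffer)."""

    def __init__(self, capacity):
        # deque with maxlen silently discards the oldest entry when full
        self.samples = deque(maxlen=capacity)

    def record(self, reading):
        self.samples.append(reading)

    def recent(self, n):
        """Return up to the n most recent samples, oldest first."""
        return list(self.samples)[-n:]

# Simulate an always-on heart-rate monitor with room for 5 samples.
log = SensorLog(capacity=5)
for bpm in [62, 63, 61, 64, 65, 66, 70]:
    log.record(bpm)

print(log.recent(3))     # the three newest readings
print(len(log.samples))  # never exceeds capacity
```

This keeps memory use constant no matter how long the device runs; persisting or summarizing samples before they are evicted is left to the application.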

Computer-mediated reality refers to the ability to add information to, subtract information from, or otherwise manipulate one's perception of reality through the use of a wearable computer or hand-held device[1] such as a smartphone.

Typically, it is the user's visual perception of the environment that is mediated. This is done through the use of some kind of electronic device, such as an EyeTap device or smartphone, which can act as a visual filter between the real world and what the user perceives.

Computer-mediated reality has been used to enhance visual perception as an aid to the visually impaired.[2] This is achieved by capturing, as a video stream, the light that would normally have reached the user's eyes, and computationally altering it to filter it into a more useful form.
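The mediation loop is simple in outline: capture a frame, transform it, and display the transformed frame instead of the original. A toy sketch of one such transformation, a contrast stretch on a grayscale frame (the pixel data is illustrative; real systems operate on live camera streams):

```python
def contrast_stretch(frame):
    """Remap a grayscale frame so its darkest pixel becomes 0 and brightest 255."""
    lo = min(min(row) for row in frame)
    hi = max(max(row) for row in frame)
    if hi == lo:                       # flat frame: nothing to stretch
        return [row[:] for row in frame]
    scale = 255 / (hi - lo)
    return [[round((p - lo) * scale) for p in row] for row in frame]

# A low-contrast 2x3 "frame" (pixel values 0-255).
frame = [[100, 110, 120],
         [105, 115, 125]]
enhanced = contrast_stretch(frame)
print(enhanced)   # values now span the full 0-255 range
```

An assistive system would apply a transformation like this (or edge enhancement, magnification, etc.) to every frame before presenting it to the wearer.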

It has also been used for interactive computer interfaces.[3]


The use of computer-mediated reality to diminish perception, by the removal or masking of visual data, has been used for architectural applications, and is an area of ongoing research.[4]
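Diminishing perception can be sketched the same way: instead of enhancing pixels, a region of each frame is blanked out before display. A minimal illustration (the rectangle coordinates are arbitrary):

```python
def mask_region(frame, top, left, height, width, fill=0):
    """Return a copy of a grayscale frame with one rectangle blanked out."""
    out = [row[:] for row in frame]
    for r in range(top, min(top + height, len(frame))):
        for c in range(left, min(left + width, len(frame[0]))):
            out[r][c] = fill
    return out

frame = [[9, 9, 9, 9],
         [9, 9, 9, 9],
         [9, 9, 9, 9]]
# Blank a 2x2 patch, e.g. hiding an unwanted object from view.
print(mask_region(frame, top=0, left=1, height=2, width=2))
```

Practical diminished-reality systems go further, inpainting the masked region so the removal is seamless rather than leaving a visible blank.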


The long-term effects of altering perceived reality have not been thoroughly studied, and negative side effects of long-term exposure might be possible.[citation needed]


Mixed reality (MR) is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time. Mixed reality does not exclusively take place in either the physical or virtual world, but is a hybrid of reality and virtual reality, encompassing both augmented reality and augmented virtuality via immersive technology.[2]


The first immersive mixed reality system that provided enveloping sight, sound, and touch was the Virtual Fixtures platform, which was developed in 1992 at the Armstrong Laboratories of the United States Air Force. The project demonstrated that human performance could be significantly amplified by overlaying spatially registered virtual objects on top of a person's direct view of a real physical environment.[3]
