Glossary of Extended Reality

Daniel AlShriky
9 min read · Aug 19, 2023


Let's start with VR, AR, MR — XR

Virtual Reality (VR) - VR visually takes the user out of their real-world environment and immerses them into an entirely virtual environment, typically using a headset for viewing, coupled with hand-held controllers to navigate the virtual space.

Augmented Reality (AR) - AR overlays digital objects (information, graphics, sounds) onto the real world, allowing the user to experience the relationship between digital and physical worlds.

Mixed Reality (MR) - MR overlays digital objects onto the real world, and anchors the virtual and real objects to one another, allowing the user to interact with combined virtual/real objects.

Extended Reality (XR) - XR refers to the full range of immersive experiences that enable human interaction between the physical and digital (or virtual) worlds. This includes augmented reality, virtual reality and mixed reality, as well as broader techniques that use and enhance human senses such as haptics, holograms and beyond.



Anchors — user-defined points of interest upon which AR objects are placed. Anchors are created and updated relative to trackable geometry (planes, feature points, etc.).

Ambisonics — Ambisonics is the name given to a method of recording and reproducing sounds in 360°. This is done using a special array of at least 4 microphones to capture sounds from every direction.

A-Frame (virtual reality framework) — A-Frame is an open-source web framework for building virtual reality experiences. It is maintained by developers from Supermedium and Google. A-Frame is an entity-component-system framework for Three.js in which developers can create 3D and WebXR scenes using HTML.

ARCore — ARCore is Google’s augmented reality SDK, offering cross-platform APIs to build immersive experiences on Android, iOS, Unity, and the Web. It builds contextual understanding about people, places, and things, so apps can blend digital content with the user’s surroundings.

ARKit — ARKit is Apple’s Augmented Reality (AR) development platform for iOS devices. It provides motion tracking, scene understanding, and rendering support that help developers build AR experiences for iPhone and iPad.



CAVE (Cave Automatic Virtual Environment) — A cave automatic virtual environment or CAVE uses projections on the walls and ceiling of a room to create the illusion of a real environment. A viewer can move around anywhere inside the cave, giving them the illusion of immersion.



Degrees of Freedom (6DOF, 3DOF) — 3DoF means we can track rotational motion but not translational. For the headset, that means we can track whether the user has turned their head left or right, tilted it up or down or pivoted left and right.

6DoF means we can additionally track translational motion. That means we can track whether the user has moved forward, backward, laterally, or vertically.
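The difference can be sketched in code. This is an illustrative example, not any particular SDK’s API: a 3DoF pose carries only orientation, while a 6DoF pose also carries a position, so it can translate points as well as rotate them (the pose shapes and names here are hypothetical).

```javascript
// A 3DoF pose tracks only orientation (yaw/pitch/roll, in radians);
// a 6DoF pose additionally tracks a position in space.
const poseThreeDoF = { yaw: Math.PI / 2, pitch: 0, roll: 0 };
const poseSixDoF = { yaw: Math.PI / 2, pitch: 0, roll: 0, position: { x: 0, y: 1.6, z: -0.5 } };

// Rotate a point around the vertical (y) axis by the pose's yaw, then
// translate by the pose's position if one is present (i.e., the pose is 6DoF).
function applyPose(pose, point) {
  const c = Math.cos(pose.yaw), s = Math.sin(pose.yaw);
  const rotated = { x: c * point.x + s * point.z, y: point.y, z: -s * point.x + c * point.z };
  const t = pose.position ?? { x: 0, y: 0, z: 0 };
  return { x: rotated.x + t.x, y: rotated.y + t.y, z: rotated.z + t.z };
}
```

With the 3DoF pose, points only rotate around the viewer; with the 6DoF pose, the viewer’s own movement through space shifts them as well.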

Dollhouse view — A dollhouse view refers to a zoomed-out, usually top-down view of a given 3D space or structure from the outside. It enables designers to observe the entire area without moving around and view computer-modeled designs in their entirety before physical prototyping begins.

Digital Twin — A digital twin is a virtual asset that is designed to accurately replicate a real physical object in a virtual 3D space. Digital twins are often used in virtual training and testing complex simulations as a virtual environment allows for quick iteration, low-cost scaling, advanced real-time feedback, and accurate automatic data collection.


Eye tracking — Eye tracking is a process used in headsets to measure and keep track of the direction of the user’s gaze. Using this information, it is possible to reproduce the eyes’ natural process of bringing objects into/out of focus depending on what the user is concentrating on.


Feature Points — these are visually distinct features in your environment, like the edge of a chair, a light switch on a wall, the corner of a rug, or anything else that is likely to stay visible and consistently placed in your environment.

Framing — with regards to mobile AR design, this is the strategic placement of 3D objects in the environment to avoid breaking immersion.

FOV (Field of View) — The field of view is the total number of degrees visible at any given moment from a given point of view. Most people’s horizontal field of view is approximately 200°: about 120° of binocular vision, plus roughly 40° on either side that is covered by only one eye.
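For a flat display viewed head-on, the field of view it subtends follows from simple trigonometry. A minimal sketch (the function name is my own):

```javascript
// Horizontal field of view (in degrees) subtended by a flat screen of width
// `width`, viewed head-on from distance `distance` (both in the same units).
function screenFovDegrees(width, distance) {
  return (2 * Math.atan(width / (2 * distance)) * 180) / Math.PI;
}
// A 1 m wide screen viewed from 0.5 m away subtends 90 degrees.
```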

Frame Rate (30fps and 60fps) — Frame rate is the frequency at which an image/frame on a display is replaced by another. Each frame is a still image that replaces the previous one, creating the illusion of change/movement on screen.
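A practical consequence of the target frame rate is the per-frame time budget: all rendering and simulation work must finish within it. A quick sketch of the arithmetic:

```javascript
// Per-frame time budget in milliseconds for a target frame rate.
function frameBudgetMs(fps) {
  return 1000 / fps;
}
// 30 fps leaves ~33.3 ms per frame, 60 fps leaves ~16.7 ms, and
// 90 fps (common in VR headsets) leaves only ~11.1 ms.
```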


Gyroscope — measures orientation and angular velocity.

Gaze-based interactions — refer to interactions between the user and the VR content, where the content is directly impacted by the user’s gaze, i.e., the direction the user is looking in when wearing a VR headset.



Inside-Out Tracking — a tracking approach in which the device uses its own built-in cameras and sensors to detect motion and track its position.

IPD — Interpupillary Distance — Interpupillary distance (IPD) is the distance between the centers of a person’s eyes. The IPD of every person is slightly different and typically sits in the range between 54 and 72 millimeters with the majority of people having an IPD of around 61–65 mm.




Latency — Latency in virtual reality refers to a delay between user input (e.g., head, hand, or leg movements) and output (e.g., visual, haptic, positional, audio) caused by a mixture of technical problems that are likely to be eliminated as the technology advances.

Locomotion — Locomotion refers to the means by which the user is able to move around within a VR environment. Most systems use some combination of three different types of locomotion: teleportation, transportation, and perambulation.

LiDAR — LiDAR, or light detection and ranging, is a remote sensing method used for measuring the exact distance to an object on the earth’s surface. Even though it’s a fairly old technology (it was first used in the 1960s), it is seeing a resurgence today in AR applications.


Metaverse — A metaverse can be any 3D virtual space powered by technologies — including virtual reality (VR), augmented reality (AR), artificial intelligence (AI), the Internet of Things (IoT), and blockchain — that allows people to interact with each other (and in some cases, with non-human avatars).

Mirrorworlds — alternative dimensions of reality layered over the physical world. Rather than completely removing you from your environment, they run parallel to reality, transforming your surroundings into refracted versions of themselves.



Outside-in tracking — a tracking approach that uses external sensors (such as base stations) placed around the play area to track the movement of the headset and controllers.

Occlusion — Occlusion is an event that happens when one object blocks another one in a 3D space. In virtual reality, occlusion is commonly used to describe a positional tracking issue, for example, when a built-in camera tracking system can’t detect the position of one virtual controller because it’s blocked by another one.

OpenXR — OpenXR is a royalty-free and open standard API to help AR and VR developers build applications that operate across a wide range of devices. OpenXR was released by Khronos Group in 2019. The standard aims to bring together a disjointed field of VR and AR development devices and tools that either require too much additional development to ensure cross-platform functioning or don’t work together at all.


Parallax — Parallax describes the perceived movement of objects when the viewer moves, i.e., objects further away from the viewer seem to move more slowly in relation to the viewer’s position while objects closer to the viewer seem to move more quickly.
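The effect is easy to quantify: for the same sideways head movement, a nearby object sweeps through a larger visual angle than a distant one. A small illustrative sketch (function name is my own):

```javascript
// When a viewer translates sideways by `baseline` metres, a point that was
// straight ahead at `distance` metres appears to shift by this angle (radians).
function parallaxAngle(baseline, distance) {
  return Math.atan2(baseline, distance);
}

const near = parallaxAngle(0.1, 1);  // object 1 m away
const far = parallaxAngle(0.1, 10);  // object 10 m away
// near > far: closer objects appear to shift more for the same head movement,
// which is why parallax is such a strong depth cue.
```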

POV (point of view) — The point of view or POV is the reference point from which observations, calculations, and measurements take place; the location or position of the viewer/object in question.



Raycasting — projecting a ray to help estimate where the AR object should be placed in order to appear on the real-world surface in a believable way; used during hit testing.
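The simplest form of this is intersecting a ray with a detected plane, such as a floor. A self-contained sketch of the math (not tied to any particular AR SDK):

```javascript
// Minimal hit-test sketch: intersect a ray (origin + t * direction, t >= 0)
// with a horizontal plane y = planeY, e.g. a detected floor. Returns the hit
// point, or null if the ray is parallel to or pointing away from the plane.
function rayHitPlane(origin, direction, planeY) {
  if (Math.abs(direction.y) < 1e-9) return null; // parallel to the plane
  const t = (planeY - origin.y) / direction.y;
  if (t < 0) return null; // plane is behind the ray's origin
  return {
    x: origin.x + t * direction.x,
    y: planeY,
    z: origin.z + t * direction.z,
  };
}
// A ray cast from eye height (y = 1.6), angled 45 degrees downward and
// forward, hits the floor 1.6 m ahead of the viewer.
```

AR frameworks generalize this to arbitrary detected planes and feature points, but the principle is the same: the hit point tells you where a virtual object can be placed so it appears to rest on a real surface.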

Runtime — the period while your app is actively executing; in game engines, changes made “at runtime” take effect during active play mode rather than in the editor.

Reticle — The reticle refers to a visual marker representing the user’s gaze in a 3D environment. It helps the user keep track of their object of focus, but can break immersion when used in an unsubtle or unnecessary manner.


Spatial mapping — the ability to create a 3D map of the environment, which helps establish where virtual assets should be placed.

Stitching 360 and 180 video — Stitching is a digital process that combines multiple source videos from a VR camera’s lenses into equirectangular videos for playback and distribution in VR.
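The equirectangular format used for 360 video maps every view direction to a point in a flat frame, with horizontal position covering longitude and vertical position covering latitude. A sketch of one common convention (the choice of which axis is “forward” is an assumption here):

```javascript
// Map a 3D view direction to equirectangular texture coordinates (u, v) in
// [0, 1]. Convention assumed: -z is forward, +y is up; u spans longitude,
// v spans latitude (v = 0 at the top of the frame).
function directionToEquirectUV(dir) {
  const len = Math.hypot(dir.x, dir.y, dir.z);
  const u = Math.atan2(dir.x, -dir.z) / (2 * Math.PI) + 0.5;
  const v = 0.5 - Math.asin(dir.y / len) / Math.PI;
  return { u, v };
}
// Looking straight ahead (0, 0, -1) lands at the centre of the frame;
// looking straight up (0, 1, 0) lands on the top edge.
```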

SLAM — Simultaneous localization and mapping (SLAM) is the problem of mapping an unknown environment while simultaneously tracking a user’s position within it. SLAM is essential in robotics, virtual reality, augmented reality, autonomous transportation, and computer vision.


Three.js — is a cross-browser JavaScript library and application programming interface used to create and display animated 3D computer graphics in a web browser using WebGL. The source code is hosted in a repository on GitHub.

Tethered headset / Mobile Headset — There are currently two different types of headsets for VR: tethered and mobile. Tethered headsets are physically connected to a powerful computer by cables, allowing them to make use of its processing power to track positions, movements, etc. Tethered headsets give the user freedom of movement beyond turning their heads, plus additional interactivity. However, they require the user to stay within a certain distance of the computer, since the headset must remain physically connected to it, and they are relatively expensive compared to their mobile counterparts. Mobile headsets can be taken anywhere, since they do not require a physical connection to a processor, and are far cheaper than tethered headsets. While they don’t restrict where the user can go, their lack of additional computing power makes them less able to react to the user’s movements, often limiting the user to looking around the environment without interacting with it or moving through it.

Teleportation — Teleportation is a type of locomotion that allows the player to move around a VR environment with minimal discomfort. With teleportation, the player points the controller at a location they’d like to move to and then initiates the teleportation action; they are transitioned to that location via a rapid animation tuned to maintain player comfort.

Tunneling — Tunneling is a technique used with first-person locomotion (such as walking) where, during movement, the camera is cropped and a high-contrast stable background is displayed in the user’s peripheral vision. The cropping of the scene is the “tunnel,” and the use of a stable background is referred to as “grounding the user.” This is analogous to a user watching first-person locomotion on a television set, where the television and the room around it form the stable background.

Telepresence — Telepresence refers to a set of technologies that allow a person to feel as if they were present at a place other than their true location, or to give the appearance of being present there, for example via telerobotics.


Unity 3D Engine — Unity’s real-time 3D development engine lets artists, designers, and developers collaborate to create amazing immersive and interactive experiences. You can work on Windows, Mac, and Linux.

Unreal Engine — Unreal Engine is a series of 3D computer graphics game engines developed by Epic Games, first showcased in the 1998 first-person shooter video game Unreal.


Virtual reality sickness — Virtual reality sickness, or VR sickness, occurs when exposure to a virtual environment causes symptoms similar to motion sickness. The most common symptoms are general discomfort, eye strain, headache, stomach awareness, nausea, vomiting, pallor, sweating, fatigue, drowsiness, disorientation, and apathy. Other symptoms include postural instability and retching. Common causes are low frame rate, input lag, and the vergence-accommodation conflict.


WebGL — WebGL is a JavaScript-based API that enables browser-based 3D rendering without plugins in an HTML page, allowing web designers to, for example, render an entire 3D VR world all in their own browser.

WebVR (Virtual Reality for the Web) — WebVR is a JavaScript-based API supporting a variety of virtual reality devices, allowing more people to experience VR content through ordinary web browsers. Its great advantage is that no plug-ins or apps beyond the web browser are required to view and design VR content. WebVR has since been superseded by the WebXR Device API.






Daniel AlShriky

UX / UI Leader | Researcher | Extended Reality (XR) designer