
CORE IMMERSIVE TECHNOLOGY CONCEPTS

1. Virtual Reality (VR)
A fully immersive digital environment that replaces the physical world. Users experience VR through headsets that track head and body movement. VR is widely used for training, education, simulation, and entertainment.

2. Augmented Reality (AR)
Technology that overlays digital elements onto the real world. AR enhances physical environments rather than replacing them. Common devices include smartphones, tablets, and AR glasses.

3. Mixed Reality (MR)
A blend of physical and digital environments where virtual objects interact with real-world surfaces. MR allows digital content to respond to lighting, space, and user actions.

4. Extended Reality (XR)
An umbrella term that includes VR, AR, and MR. XR refers to all immersive technologies that extend human perception beyond physical reality.

5. Spatial Computing
Technology that enables computers to understand and interact with three-dimensional space. It allows digital objects to exist and behave naturally within physical environments.

HARDWARE & DEVICES

6. Head-Mounted Display (HMD)
A wearable device that places visual displays in front of the user’s eyes. HMDs are used for VR, AR, and MR experiences.

7. Optical See-Through Display
A transparent display that overlays digital content onto the real world. Users can still see their surroundings directly through the lenses.

8. Video See-Through Display
Uses cameras to capture the real world and display it digitally. Virtual elements are layered onto the video feed inside the headset.

9. Motion Controllers
Handheld devices that track movement and input. They allow users to interact with virtual objects and environments.

10. Haptics
Technology that provides tactile feedback such as vibration or force. Haptics improve realism by simulating physical sensations.

11. Force Feedback
A haptic technique that applies resistance or pressure. It is often used in training simulators to mimic real-world forces.

12. Tactile Feedback
Vibration-based feedback used to signal interaction or impact. Commonly found in controllers and wearable devices.

AUDIO & SENSORY SYSTEMS

13. Spatial Audio
Audio that appears to come from specific directions in 3D space. It enhances realism and situational awareness.
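The simplest directional cue behind spatial audio is amplitude panning between two channels. The sketch below is a deliberately minimal illustration of that one idea (the function name and angle convention are invented for this example); real spatial-audio engines also model distance, head shape, and room acoustics.

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo pan: -90 = hard left, +90 = hard right.

    A toy stand-in for full spatial audio: louder in the ear the
    source is closer to, with total power kept constant."""
    # Map azimuth (-90..+90 degrees) onto a 0..pi/2 pan angle.
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    left = math.cos(theta)
    right = math.sin(theta)
    return left, right

# A source straight ahead reaches both channels equally.
l, r = pan_gains(0.0)
```

Constant-power panning (cos/sin gains) is used rather than linear blending so perceived loudness stays level as a source moves across the field.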

14. Binaural Audio
Audio recorded or processed to simulate human hearing. It creates realistic depth and direction when listened to with headphones.

TRACKING & MOVEMENT

15. Inside-Out Tracking
Tracking that uses cameras built into the headset. It allows movement tracking without external sensors.

16. Outside-In Tracking
Tracking that relies on external cameras or base stations. It often provides higher precision but requires more setup.

17. SLAM
Simultaneous Localisation and Mapping. A technique that allows a device to build a map of its environment while simultaneously tracking its own position within it.

18. Field of View (FOV)
The visible area of the virtual environment. A wider FOV increases immersion and realism.

19. Refresh Rate
The number of times the display updates per second. Higher refresh rates reduce judder and help prevent motion sickness.

20. Latency
The delay between user action and system response. Low latency is critical for comfort in immersive experiences.

21. Motion-to-Photon Latency
The time between a physical movement and the corresponding visual update. Reducing it improves realism and comfort.
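The link between refresh rate and the rendering deadline is simple arithmetic; a minimal sketch (the helper name is illustrative):

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to produce one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# At 90 Hz each frame must be rendered in about 11.1 ms. Missing
# that budget drops frames, which users perceive as judder and
# which adds to motion-to-photon latency.
budget = frame_budget_ms(90.0)
```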

22. Eye Tracking
Technology that detects where the user is looking. It enables interaction, analytics, and rendering optimisation.

23. Foveated Rendering
Rendering technique that focuses detail where the user is looking. It improves performance without reducing perceived quality.

24. Hand Tracking
Allows interaction using natural hand movements without controllers. Cameras track finger and hand position.

25. Gesture Recognition
Interprets hand or body movements as commands. Gestures must be intuitive and consistent.

26. 6DoF
Six Degrees of Freedom. Allows movement and rotation in all directions within a 3D space.
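A 6DoF pose can be pictured as a small data structure: three positional values plus an orientation. The field names and quaternion layout below are illustrative, not any specific SDK's API:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A 6DoF pose: three positional and three rotational degrees of freedom.

    Rotation is stored as a quaternion (w, x, y, z), a common
    gimbal-lock-free representation. A 3DoF device would report
    only the rotation part.
    """
    x: float   # position (e.g. metres)
    y: float
    z: float
    qw: float  # orientation quaternion
    qx: float
    qy: float
    qz: float

# A pose at the origin with no rotation (identity quaternion).
origin = Pose(0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0)
```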

27. 3DoF
Three Degrees of Freedom. Allows rotation only, without positional movement.

28. Passthrough
A feature that shows the real world inside a headset using cameras. Used for safety and mixed reality.

29. Guardian System
A virtual boundary that prevents users from colliding with real-world objects. It improves safety in VR.

SOFTWARE & DEVELOPMENT

30. Game Engine
Software used to create interactive 3D experiences. It handles graphics, physics, and input.

31. Unity
A popular real-time engine for VR, AR, and mobile applications. Widely used in education and industry.

32. Unreal Engine
A high-fidelity engine known for realistic visuals. Used in games, film, and immersive experiences.

33. SDK
Software Development Kit containing tools for building applications. XR SDKs include tracking and input features.

34. API
Application Programming Interface that allows software systems to communicate. APIs enable integration and automation.

35. Plugin
An add-on that extends software functionality. Plugins add features without modifying core software.

36. Script
Code that controls logic and behaviour within immersive environments. Scripts manage interaction and events.

37. Shader
A program that controls how surfaces are rendered. Shaders define lighting, colour, and effects.

38. Material
Defines how a surface looks in a 3D environment. Materials combine shaders and textures.

3D CONTENT CREATION

39. Mesh
A 3D object made up of vertices and polygons. Mesh complexity affects performance.

40. Polygon
A flat shape that forms part of a 3D model. Models are typically built from triangles or quads.

41. Vertex
A point in 3D space that defines geometry. Vertices connect to form polygons.
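The three terms above fit together in one simple data layout: a list of shared vertices plus index triples that form triangles. A minimal sketch (the names are illustrative; real engines store the same idea in GPU buffers):

```python
# A unit quad as a minimal mesh: four shared vertices and two
# triangles referencing them by index.
vertices = [
    (0.0, 0.0, 0.0),  # vertex 0
    (1.0, 0.0, 0.0),  # vertex 1
    (1.0, 1.0, 0.0),  # vertex 2
    (0.0, 1.0, 0.0),  # vertex 3
]
triangles = [(0, 1, 2), (0, 2, 3)]  # the two triangles share an edge

# Sharing vertices keeps meshes compact: 4 vertices describe
# 2 polygons (6 corner references) instead of storing 6 points.
```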

42. UV Mapping
The process of applying 2D textures to 3D models. Proper UV mapping prevents distortion.

43. 3D Modelling
Creating digital objects using specialist software. Models are used in immersive environments.

44. Texturing
Adding colour and detail to 3D models. Textures improve realism and visual quality.

45. Rigging
Adding a digital skeleton to a 3D model. Rigging allows animation and movement.

46. Animation
The process of creating movement for characters or objects. Used to bring environments to life.

47. Inverse Kinematics
A technique that calculates realistic joint movement. Commonly used for hands and arms.
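A minimal worked example of the idea, assuming a planar two-bone limb (shoulder-elbow-wrist) and an analytic law-of-cosines solution — one common IK approach among several; function and argument names are invented for this sketch:

```python
import math

def two_bone_ik(l1: float, l2: float, tx: float, ty: float):
    """Analytic 2D two-bone inverse kinematics.

    Given bone lengths l1, l2 and a target (tx, ty), returns
    (shoulder, elbow) joint angles in radians that place the end
    of the second bone at the target, clamping unreachable targets."""
    d = math.hypot(tx, ty)
    # Clamp the target distance to what the two bones can reach.
    d = min(max(d, abs(l1 - l2) + 1e-9), l1 + l2 - 1e-9)
    # Law of cosines on the triangle formed by the two bones and
    # the line to the target.
    elbow = math.pi - math.acos((l1**2 + l2**2 - d**2) / (2 * l1 * l2))
    shoulder = math.atan2(ty, tx) - math.acos((l1**2 + d**2 - l2**2) / (2 * l1 * d))
    return shoulder, elbow
```

There are two mirror-image solutions ("elbow up" and "elbow down"); the sign chosen in the shoulder term picks one of them, and full-body IK systems add joint limits on top.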

48. Motion Capture
Recording real human movement for use in animation. Provides natural motion.

49. Volumetric Capture
Capturing real people or objects in 3D. Used in immersive storytelling and training.

50. 360° Video
Video recorded in all directions. Allows users to look around freely in VR.

UX, UI & INTERACTION DESIGN

51. UX Design
Focuses on usability, comfort, and flow. Good UX reduces fatigue and confusion.

52. UI Design
Design of visual interface elements. In XR, UI often exists in three-dimensional space.

53. Spatial UI
User interfaces placed within 3D environments. Enables natural interaction.

54. HUD
Heads-Up Display that shows information over the scene. Often used for guidance.

55. User Flow
The path users follow through an experience. Clear flows improve usability.

56. Accessibility
Designing experiences usable by everyone. Includes comfort, subtitles, and control options.

57. Simulator Sickness
Discomfort caused by mismatched motion cues. Reduced through good design.

58. Teleport Locomotion
Movement method that instantly moves the user. Helps reduce motion sickness.

59. Smooth Locomotion
Continuous movement similar to traditional games. Requires comfort options.

60. Snap Turning
Turning in fixed angles rather than smoothly. Reduces disorientation.

61. Vignette
Darkening the edges of vision during movement. Helps reduce motion sickness.

PRODUCTION, WORKFLOW & INDUSTRY

62. Prototype
An early version of a product used for testing ideas. Prototypes are often incomplete.

63. MVP
Minimum Viable Product with essential features. Used for early feedback.

64. Agile Development
An iterative approach to software development. Encourages frequent testing.

65. Sprint
A short development cycle within agile workflows. Typically lasts 1–2 weeks.

66. Build
A compiled version of an application. Builds are used for testing and release.

67. Deployment
The process of releasing an application to users. Includes testing and packaging.

68. Optimisation
Improving performance and efficiency. Critical for immersive applications.

69. Frame Rate
Number of frames displayed per second. Higher rates improve comfort.

70. Occlusion
The blocking of one object by another in a scene. Handling occlusion correctly improves realism.

71. Collision Detection
Detects when objects touch or overlap. Enables physical interaction.
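The cheapest and most common first-pass collision check is the axis-aligned bounding box: two boxes overlap only if they overlap on every axis. A minimal sketch (real engines layer finer-grained shape tests on top of this broad phase):

```python
def aabb_overlap(a_min, a_max, b_min, b_max) -> bool:
    """Axis-aligned bounding-box (AABB) overlap test.

    Each argument is an (x, y, z) corner tuple: a_min/a_max bound
    box A, b_min/b_max bound box B."""
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i]
               for i in range(3))

# Two unit cubes, one shifted half a unit along x: they intersect.
hit = aabb_overlap((0, 0, 0), (1, 1, 1), (0.5, 0, 0), (1.5, 1, 1))
```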

72. Physics Engine
Simulates real-world forces such as gravity. Adds realism to interactions.
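At its core a physics engine advances positions and velocities in small time steps. A minimal sketch of a single falling body using semi-implicit Euler integration (one common integrator choice; the tick rate and names are illustrative):

```python
GRAVITY = -9.81  # m/s^2 along the y axis

def step(y: float, vy: float, dt: float) -> tuple[float, float]:
    """One semi-implicit Euler step: update velocity first, then
    position — the kind of update a physics engine runs every tick."""
    vy += GRAVITY * dt
    y += vy * dt
    return y, vy

# Drop an object from 2 m and step until it reaches the floor.
y, vy = 2.0, 0.0
while y > 0.0:
    y, vy = step(y, vy, 1.0 / 90.0)  # 90 Hz physics tick
```

Semi-implicit Euler is popular in games because it is cheap and more stable than plain (explicit) Euler at the large, fixed time steps real-time engines use.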

73. Multiplayer Networking
Connects multiple users in a shared environment. Enables collaboration.

PRESENCE, CONTENT & ETHICS

74. Avatar
A digital representation of the user. Supports identity and presence.

75. Embodiment
The feeling that a virtual body belongs to the user. Increases immersion.

76. Presence
The sensation of being inside a virtual environment. Central to XR experiences.

77. Immersion
The depth of engagement a user feels. Influenced by visuals and interaction.

78. Diegetic UI
Interface elements that exist naturally in the environment. They enhance immersion.

79. Non-Diegetic UI
UI elements separate from the environment. Includes menus and overlays.

80. Spatial Anchors
Fixed points that lock digital objects to physical space. Used in AR.

81. Cloud Anchors
Shared spatial anchors stored online. Enable multi-user AR.

82. Digital Twin
A virtual replica of a physical object or system. Used for simulation.

83. Data Visualisation
Presenting data using immersive visuals. Helps reveal complex patterns.

84. AI Agents
Autonomous virtual characters or systems. Respond to user actions.

85. Procedural Generation
Algorithmic creation of content. Used to build large environments.
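A minimal sketch of the idea: scattering objects over a ground plane from a seeded random generator, so the "generated" world is reproducible across sessions (the function name and values are illustrative):

```python
import random

def scatter(seed: int, count: int, size: float):
    """Deterministically scatter object positions over a square
    ground plane of the given size.

    The same seed always yields the same layout — the property
    that lets procedural worlds be rebuilt identically on demand
    instead of being stored."""
    rng = random.Random(seed)
    return [(rng.uniform(0.0, size), rng.uniform(0.0, size))
            for _ in range(count)]

trees = scatter(seed=42, count=100, size=50.0)
```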

86. Storyboarding
Visual planning of scenes and interactions. Helps structure experiences.

87. Narrative Design
Crafting story and pacing in interactive media. Guides user engagement.

88. World-Building
Creating believable environments and rules. Enhances immersion.

89. Environmental Design
Designing lighting, atmosphere, and layout. Shapes user experience.

90. Level Design
Structuring spaces and challenges. Guides user movement.

91. Interaction Design
Defining how users interact with objects. Must feel natural and intuitive.

92. HCI
Human-Computer Interaction. Studies how people use technology.

93. Ethical Design
Designing technology responsibly. Considers user wellbeing and privacy.

94. Data Privacy
Protecting user data. Especially important in tracked environments.

95. Safety Boundaries
Virtual limits to protect users. Prevent physical collisions.

96. Learning Objectives
Clear goals for educational experiences. Guide content design.

97. Assessment
Measuring user understanding or performance. Used in training.

98. Content Pipeline
Workflow from concept to deployment. Improves efficiency.

99. Quality Assurance
Testing for bugs and usability issues. Ensures reliability.

100. Deployment Platform
The system used to distribute applications. Includes app stores and enterprise systems.
