Emerging Interfaces
The Reality-Virtuality Continuum
Milgram and Kishino's reality-virtuality continuum (1994) describes the spectrum from fully real to fully virtual environments.
Real              Augmented          Augmented           Virtual
Environment       Reality (AR)       Virtuality (AV)     Environment (VR)
|───────────────|──────────────────|───────────────────|
        ◄─────────── Mixed Reality (MR) ───────────►
Virtual Reality (VR) Interaction Design
VR immerses users in a fully synthetic 3D environment, replacing the physical world.
Interaction Challenges
| Challenge | Description | Mitigation |
|-----------|-------------|------------|
| Locomotion | Moving through large virtual spaces without physical space | Teleportation, arm-swing, redirected walking |
| Selection | Pointing at and selecting objects at various distances | Ray casting, hand grab, gaze + dwell |
| Manipulation | Rotating, scaling, moving virtual objects | Direct grab with 6DOF, widget handles |
| Text input | No physical keyboard visible | Virtual keyboard, voice input, gesture typing |
| Simulator sickness | Nausea from visual-vestibular mismatch | Fixed reference frame, teleport over smooth motion, 90+ FPS |
| Fatigue | Extended arm use, headset weight | Rest positions, short sessions, ergonomic controllers |
VR Design Principles
- Maintain frame rate: Minimum 72 FPS (90 preferred). Dropped frames cause sickness.
- Respect the vestibular system: Never move the camera without user initiation. Avoid acceleration.
- Comfortable interaction zone: Primary UI elements within arm's reach, 0.5-2m from user.
- Ground the user: Provide a visible floor/ground plane and body representation.
- Spatial audio: 3D audio cues reinforce direction and presence.
- Avoid small text: Minimum angular size of ~1.5 degrees. Current headset resolution limits readability.
- Provide safe boundaries: Guardian/chaperone systems for physical space awareness.
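The minimum-angular-size guideline above follows from the visual-angle relation h = 2d·tan(θ/2): the farther a panel sits, the physically larger its text must be. A minimal sketch (the function name is illustrative):

```python
import math

def min_text_height(distance_m: float, angular_size_deg: float = 1.5) -> float:
    """Physical height (in meters) a glyph needs to subtend a given
    visual angle at the given viewing distance: h = 2 * d * tan(theta / 2)."""
    theta = math.radians(angular_size_deg)
    return 2 * distance_m * math.tan(theta / 2)

# Text on a panel 2 m away needs to be roughly 5 cm tall to subtend 1.5 degrees.
print(f"{min_text_height(2.0) * 100:.1f} cm")
```

This is why small text that reads fine on a monitor becomes illegible on a UI panel pushed back in the scene.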
VR Locomotion Techniques
| Technique | Comfort | Immersion | Physical Space Needed |
|-----------|---------|-----------|-----------------------|
| Teleportation | High | Low | Minimal |
| Room-scale walking | High | High | Large |
| Arm swing | Medium | Medium | Minimal |
| Smooth locomotion (joystick) | Low | Medium | Minimal |
| Redirected walking | High | High | Medium |
| Treadmill | High | High | Minimal (specialized hardware) |
Augmented Reality (AR) Interfaces
AR overlays digital content onto the physical world, maintaining awareness of real surroundings.
AR Display Types
| Type | Examples | Field of View | Use Case |
|------|----------|---------------|----------|
| Optical see-through | HoloLens, Magic Leap | 30-70 degrees | Hands-free work, enterprise |
| Video pass-through | Quest 3, Apple Vision Pro | Wide | Consumer mixed reality |
| Handheld (phone/tablet) | ARKit/ARCore apps | Screen-limited | Consumer, marketing |
| Head-up display | Car windshield HUD | Narrow | Driving, aviation |
AR Design Principles
- Anchor to the real world: Digital content should maintain stable spatial relationships with physical objects.
- Respect occlusion: Virtual objects should be occluded by real objects in front of them.
- Consider lighting: Match virtual object lighting to the real environment.
- Minimize clutter: Real-world context already provides visual complexity. AR layers must be restrained.
- Design for variable environments: AR is used in diverse lighting, spaces, and contexts.
- Glanceable information: Users should not need to stare at AR overlays. Brief, peripheral-friendly.
AR Interaction Patterns
- Placement: Detect surfaces -> show placement indicator -> tap to place object
- Annotation: Anchor labels or info panels to physical objects or locations
- Navigation: Overlay directional arrows onto real-world walkways
- Visualization: Show hidden infrastructure (pipes, wiring) overlaid on walls
- Measurement: Point-to-point distance measurement in physical space
- Try-before-buy: Place virtual furniture in a real room
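The measurement pattern reduces to computing the straight-line distance between two anchor points the user has placed in world space. A minimal sketch (coordinates in meters; the function name is illustrative):

```python
import math

Point = tuple[float, float, float]

def point_to_point(a: Point, b: Point) -> float:
    """Euclidean distance between two anchors placed in world space."""
    return math.dist(a, b)

# Two taps on a detected tabletop, 0.8 m apart along the x axis:
edge = point_to_point((0.0, 0.9, -1.2), (0.8, 0.9, -1.2))
print(f"{edge:.2f} m")
```

Real AR measurement apps refine this by snapping anchors to detected planes and edges; the distance math itself stays this simple.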
Spatial Computing
Spatial computing treats 3D physical space as the interface canvas. Apple Vision Pro and similar devices exemplify this paradigm.
Spatial Interaction Model
Input:
- Eye tracking (look) + hand pinch (select) + hand gestures (manipulate)
- Voice commands for text and system control
- Head position for viewport

Output:
- Floating windows positioned in 3D space
- Volumetric objects and immersive environments
- Spatial audio anchored to virtual objects
Spatial UI Patterns
- Windows: 2D content panels floating in 3D space, repositionable
- Volumes: 3D content containers bounded by a defined space
- Immersive spaces: Full environment replacement (like VR within an AR device)
- Ornaments: Small UI elements attached to window edges (controls, status indicators)
Design Considerations
- Ergonomic placement: UI at comfortable viewing angles (slightly below eye level, within 1-4m)
- Z-depth hierarchy: Closer elements are more important/interactive
- Passthrough integration: Blend virtual content naturally with physical surroundings
- Shared experiences: Multiple users seeing and interacting with the same spatial content
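The ergonomic-placement guidance (slightly below eye level, within 1-4 m) can be expressed as a default spawn position for a new window. A minimal sketch, assuming the user stands at the origin facing -z in a right-handed coordinate system (all names and defaults are illustrative):

```python
import math

def default_window_position(eye_height: float = 1.6,
                            distance: float = 1.5,
                            pitch_down_deg: float = 10.0) -> tuple[float, float, float]:
    """Place a window straight ahead of the user, tilted slightly below
    eye level, at a comfortable viewing distance."""
    pitch = math.radians(pitch_down_deg)
    y = eye_height - distance * math.sin(pitch)   # drop below eye level
    z = -distance * math.cos(pitch)               # out in front of the user
    return (0.0, y, z)

x, y, z = default_window_position()
# Window centre sits a little below eye height, about 1.5 m in front.
```

Clamping `distance` to the 1-4 m range keeps windows out of the arm's-reach zone reserved for direct manipulation.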
Wearable Interfaces
Wearable devices are constrained by small screens, limited input, and intermittent attention.
Design Constraints
| Constraint | Implication |
|------------|-------------|
| Small display | Prioritize ruthlessly; one primary action per screen |
| Glanceable | Information consumed in 3-5 seconds |
| Limited input | Minimal text entry; prefer taps, swipes, voice, crown/bezel |
| Context-aware | Use sensors (location, activity, time) to surface relevant info |
| Battery life | Minimize active screen time and sensor polling |
| Social norms | Interactions should be brief and non-disruptive |
Smartwatch Interaction Patterns
- Notifications: Short, actionable. Quick replies, dismiss, or deep-link to phone.
- Complications: At-a-glance data widgets on the watch face.
- Short tasks: Set a timer, check weather, start workout.
- Continuous sensing: Heart rate, activity tracking, fall detection.
Tangible Interfaces (TUIs)
Tangible user interfaces use physical objects as representations of and controls for digital information.
Examples
| System | Physical Object | Digital Function |
|--------|-----------------|------------------|
| Reactable | Pucks on table surface | Music synthesis parameters |
| Sifteo cubes | Small interactive blocks | Gaming, puzzles via proximity |
| Lego Mindstorms | Physical bricks | Robot programming |
| Sand table | Sand surface | Topographic mapping, landscape simulation |
| Programmable tokens | Marked tokens on surface | Data visualization control |
TUI Design Principles (Ishii & Ullmer)
- Physical-digital coupling: Changes to physical objects are reflected digitally, and vice versa.
- Spatial interaction: Arrangement of physical objects in space controls digital parameters.
- Embodied interaction: Leverage existing physical manipulation skills.
- Peripheral awareness: Tangible displays can convey ambient information through shape, movement, or surface changes.
Ambient Displays
Ambient displays present information at the periphery of human attention, requiring minimal cognitive effort.
Design Principles
- Calm technology (Weiser & Brown): Information transitions smoothly between periphery and center of attention.
- Low attention cost: Glanceable. Users should not need to stop what they are doing.
- Aesthetic integration: Ambient displays should fit naturally into the environment.
- Appropriate abstraction: Represent data through color, movement, sound, or physical form rather than numbers and text.
Examples
| Display | Information | Modality |
|---------|-------------|----------|
| Ambient Orb | Stock market trend | Color (green=up, red=down) |
| Calmly clock | Calendar density | Analog clock with color-coded face |
| Weather curtain | Forecast | LED pattern behind fabric |
| Office door light | Availability | Green=free, red=busy |
| Water fountain speed | Energy usage | Flow rate varies with consumption |
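The common thread in these displays is a continuous mapping from a data value to a low-attention modality. A minimal sketch of the color-based version (the function name and the green-to-red scale are illustrative):

```python
def ambient_color(level: float) -> tuple[int, int, int]:
    """Map a normalized reading (0.0 = calm/green, 1.0 = alert/red) to RGB.

    A linear blend keeps changes smooth and gradual, so the display stays
    at the periphery of attention instead of demanding it."""
    level = min(1.0, max(0.0, level))          # clamp out-of-range readings
    red = round(255 * level)
    green = round(255 * (1.0 - level))
    return (red, green, 0)

print(ambient_color(0.0))  # (0, 255, 0) - all clear
print(ambient_color(1.0))  # (255, 0, 0) - high usage
```

The same structure works for any modality: swap the RGB output for a fan speed, a pulse rate for an LED, or a servo angle for a physical dial.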
Conversational AI
Chatbots
Text-based conversational interfaces for customer support, task completion, and information retrieval.
Design patterns:
- Decision trees: Structured conversation with predefined branches
- Slot filling: Collect required parameters through dialog (origin, destination, date)
- Open-domain: Free-form conversation powered by large language models
- Hybrid: Structured flows with fallback to free-form understanding
Conversational UX principles:
- Set expectations about capabilities upfront
- Provide suggested replies / quick actions to reduce typing
- Handle non-understanding gracefully (do not repeat the same reprompt)
- Allow human escalation when the bot cannot help
- Maintain conversation context across turns
- Confirm critical actions before executing ("Transfer $500 to checking? Yes/No")
Voice Assistants
Voice-first interfaces (Alexa, Google Assistant, Siri) for hands-free interaction.
Design elements:
- Invocation: Wake word ("Hey Siri") or button press
- Intents: What the user wants to do (play music, set alarm)
- Entities/Slots: Parameters for the intent (song name, time)
- Dialog turns: Multi-turn conversations to collect missing information
- Multimodal responses: Voice response + visual card on screen-equipped devices
Generative UI
Interfaces dynamically generated or adapted by AI based on user context, intent, or content.
Approaches
| Approach | Description | Example |
|----------|-------------|---------|
| Prompt-driven layout | LLM generates UI structure from natural language | "Show me a dashboard for sales data" |
| Adaptive interfaces | System rearranges UI based on usage patterns | Frequently used tools move to primary positions |
| Content-driven rendering | UI adapts its structure to fit content semantics | Structured data auto-renders as table, chart, or card |
| AI-assisted design | AI suggests or generates design variations | Copilot-style design assistance in Figma |
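Content-driven rendering is the easiest of these to make concrete: inspect the shape of the data and pick a component for it. A minimal sketch with illustrative heuristics (a real system would use richer schema information):

```python
def choose_renderer(data) -> str:
    """Pick a UI component from the shape of the data.

    Heuristics (illustrative): a list of uniform records -> table;
    a list of (label, number) pairs -> chart; one record -> card;
    anything else -> plain text."""
    if isinstance(data, dict):
        return "card"
    if isinstance(data, list) and data:
        if all(isinstance(row, dict) for row in data):
            keys = set(data[0])
            if all(set(row) == keys for row in data):
                return "table"
        if all(isinstance(p, tuple) and len(p) == 2
               and isinstance(p[1], (int, float)) for p in data):
            return "chart"
    return "text"

print(choose_renderer([{"region": "EU", "sales": 120},
                       {"region": "US", "sales": 340}]))  # table
```

Because the mapping is deterministic, this variant sidesteps much of the predictability problem discussed below: the same data shape always produces the same component.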
Design Challenges
- Predictability: Users expect consistent interfaces. Generated UI must maintain structural coherence.
- User control: Users should be able to override, pin, or reject AI-generated layouts.
- Transparency: Make it clear when UI is being adapted and why.
- Graceful failure: When generation fails, fall back to a sensible default layout.
- Accessibility: Generated interfaces must still meet WCAG standards programmatically.
- Testing: Non-deterministic outputs make traditional QA difficult. Require constraint-based validation.
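Constraint-based validation can be sketched as a gate between the generator and the renderer: a generated layout is accepted only if it satisfies hard rules, otherwise the app falls back to a known-good default. All rules, component names, and schemas here are illustrative:

```python
ALLOWED_COMPONENTS = {"header", "table", "chart", "card", "button"}

def validate_layout(layout: list[dict]) -> bool:
    """Reject generated layouts that break structural or accessibility rules."""
    if not layout or layout[0].get("type") != "header":
        return False  # every screen must start with a header
    for component in layout:
        if component.get("type") not in ALLOWED_COMPONENTS:
            return False  # unknown component types cannot be rendered safely
        if component.get("type") == "button" and not component.get("label"):
            return False  # unlabelled buttons fail accessible-name checks
    return True

DEFAULT_LAYOUT = [{"type": "header", "text": "Dashboard"}]

def render(generated: list[dict]) -> list[dict]:
    """Graceful failure: fall back to the default when validation rejects."""
    return generated if validate_layout(generated) else DEFAULT_LAYOUT
```

This gives QA something deterministic to test (the validator) even though the generator itself is non-deterministic.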
The Spectrum of AI-UI Integration
Static UI           Adaptive UI          Conversational UI     Generative UI
(fixed layout,      (layout adjusts      (dialog-driven,       (AI creates UI
human-designed)     to usage data)       natural language)     on the fly)
|──────────────────|────────────────────|─────────────────────|
Less AI involvement ──────────────────────► More AI involvement
More predictable ─────────────────────────► Less predictable
Design Ethics for Emerging Interfaces
Emerging interfaces raise new ethical considerations:
| Concern | Context | Principle |
|---------|---------|-----------|
| Privacy | Eye tracking, brain signals, spatial mapping | Minimize data collection; explicit consent |
| Addiction | Immersive VR, persuasive AR | Provide usage awareness and break reminders |
| Exclusion | Expensive hardware, physical requirements | Design for diverse abilities and economic access |
| Reality distortion | Deepfakes in AR, manipulated environments | Clearly distinguish real from synthetic |
| Autonomy | AI-driven adaptation, dark patterns | Ensure users can understand and override system decisions |
| Physical safety | VR disorientation, AR distraction while walking | Guardian systems, context-appropriate interaction modes |