Omni Design System for Scalable Products

Omni was built to scale across products and teams. The goal was not just consistency, but a shared foundation for speed without losing coherence.


Duration

2018–2025

Platform

XR (VR / MR), Web, Mobile, PC

Context

Between 2016 and 2018, VIVE expanded into a multi-surface ecosystem across VR, PC, Web, and Mobile.
Each surface evolved independently — without shared foundations, tools, or patterns.

This became the beginning of Omni.

Problem

Without a unified system, each platform evolved independently — fragmenting the experience across the ecosystem.

When VIVE launched its first VR device in 2016, the end-to-end experience extended across VR, PC, Web, and Mobile surfaces. Yet each surface was designed in isolation, using different tools and patterns, without a shared library or system connecting them.

This resulted in recurring challenges across teams:

  • Manual updates were required whenever UI changed

  • VR and 2D UI lived in separate tools with no shared assets

  • Icons and components were duplicated across products

  • Interaction patterns diverged rapidly

  • Designers and engineers repeatedly rebuilt similar UI

This fragmentation became the catalyst for Omni 1.0.

My Role

Omni began as a collaboration across design teams in the US and Taiwan. I joined early as a contributor, helping define foundational assets and align 2D and VR patterns. As the system and organization evolved, my role expanded from building components to system direction, and eventually to a stewardship role supporting Omni’s long-term evolution.

Contributor → System Lead → System Stewardship

  • Built foundations: color, typography, iconography, and early 2D/VR component systems.

  • Defined XR interaction architecture, multi-input patterns, and token-based system foundations.

  • Led system evolution across products — establishing governance, scaling adoption, and enabling cross-team collaboration.

  • Initiated Omni × AI exploration — connecting design systems with emerging AI workflows.

My role evolved from designing components to shaping systems — and directing how teams align, build, and scale coherent experiences.

Process

Omni 1.0: Foundations (2018)

Establish coherence across VR and 2D

  • Unified color system, typography, iconography

  • Shared Sketch UI library

  • Unity VR component set

  • Initial cross-team review workflow

Establishing shared UI primitives across VR and 2D surfaces

Omni 2.0: Visual Language & System Adaptation (2019–2020)

As VIVE evolved from PC VR to all-in-one (AIO) devices, the system needed to adapt to new constraints: more limited hardware performance, longer daily usage, and broader adoption in enterprise environments.

To support these shifts, we introduced a distance-based visual model that organizes UI layers around the user’s interaction space.

This reduced visual inconsistency across products and enabled teams to design for spatial contexts with a shared model.

Visual Language Evolution

Omni 2.0 introduced a lighter visual language using a 2.5D approach.

Rather than treating all interfaces as flat surfaces, the system organizes UI into three visual layers based on interaction distance:

  • Near zone: fully spatial 3D UI designed for direct interaction around the user

  • Mid zone: 2.5D overlays designed for spatial panels and system surfaces

  • Far zone: traditional 2D interfaces used across screens and dashboards

This layered approach preserves depth and hierarchy while reducing visual fatigue and maintaining clarity across different interaction distances.

Omni 2.0 organizes UI into distance-based layers, allowing the system to adapt visual depth and interaction patterns across XR environments

Theme & Color System Expansion

A key shift in Omni 2.0 was the introduction of a theme-ready color system.

Primary (accent) colors were redefined at a system level, enabling global replacement through shared color definitions. This allowed enterprise clients to adopt their own brand colors without fragmenting the design system or redesigning individual components.
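The mechanics of this swap can be sketched in a few lines. This is a minimal, hypothetical illustration (token names and hex values are invented, not Omni's actual definitions): neutral tokens stay fixed across all deployments, while the accent layer is overridden per brand.

```python
# Sketch of a theme-ready token structure (names and values are hypothetical).
# Neutral tokens are shared by every theme; only accent tokens vary per brand.

BASE_TOKENS = {
    "surface/background": "#1C1C1E",  # neutral: identical in all themes
    "text/primary": "#FFFFFF",        # neutral: identical in all themes
    "accent/primary": "#0068F0",      # default accent, replaceable per brand
    "accent/hover": "#338AF5",
}

def apply_brand_theme(base: dict, brand_accents: dict) -> dict:
    """Resolve a theme: copy the base, then override only the accent layer."""
    theme = dict(base)
    theme.update(brand_accents)
    return theme

# An enterprise client adopts its own brand color by swapping the accent layer:
acme_theme = apply_brand_theme(BASE_TOKENS, {
    "accent/primary": "#D02030",
    "accent/hover": "#E05560",
})
```

Because components reference token names rather than raw values, rebranding touches only the accent definitions, never the components themselves.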

System Impact

  • Enabled scalable brand customization for B2B use cases

  • Maintained consistency across VR, PC, Web, and Mobile

  • Prepared the foundation for future tokenization and system automation

Omni 2.0 marked the transition from visual consistency to system adaptability—ensuring the design system could evolve with both technology and business needs.

The color system separates neutral UI tokens from brand accent layers, enabling scalable theming across enterprise deployments

Transition — From Omni 2.0 to Omni 3.0

Omni 2.0 established a distance-based visual model and a theme-ready color system, enabling the design system to scale across devices, brands, and business contexts.

As XR hardware capabilities and product ambitions continued to evolve, new interaction challenges emerged. Experiences were no longer limited to single input methods or flat interfaces. Spatial environments required support for direct touch, eye tracking, controller input, and mixed reality scenarios.

These shifts introduced new design constraints that could not be addressed through visual language alone. A deeper interaction architecture was needed—one capable of defining consistent interaction patterns across devices, input modalities, and spatial contexts.

This led to the next evolution of Omni.

Omni 3.0: XR Interaction Architecture & Token Expansion (2021–2025)

Omni 3.0 expanded the system beyond visual language—introducing a unified interaction architecture, spatial ergonomics, and tokenized foundations designed for XR environments.

Interaction Model

Omni introduced a distance-based interaction model to unify interaction behaviors across XR devices.

  • Near zone (0–0.4 m): direct touch

  • Mid zone (0.4–1 m): touch or ray-based interaction, gaze & pinch

  • Far zone (> 1 m): raycast, gaze & pinch

Omni unified interaction behavior across XR devices using a distance-based interaction model
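The zone boundaries above reduce to a simple lookup. The sketch below uses the distances from the list; the modality names are illustrative, not Omni's internal identifiers.

```python
def interaction_modes(distance_m: float) -> list:
    """Map user-to-UI distance (meters) to allowed input modalities,
    following the distance-based zones (modality names illustrative)."""
    if distance_m <= 0.4:
        # Near zone: direct touch only
        return ["direct-touch"]
    if distance_m <= 1.0:
        # Mid zone: touch or ray-based interaction, gaze & pinch
        return ["direct-touch", "ray", "gaze-pinch"]
    # Far zone: raycast, gaze & pinch
    return ["ray", "gaze-pinch"]
```

Encoding the zones as a single function gives every device and input stack one source of truth for which interactions are valid at a given distance.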

Spatial Ergonomics

Interaction zones were supported by ergonomic guidelines to ensure comfort and readability in extended XR use.

  • Reach-based layout zones

  • Panel sizing based on distance & FOV

  • Minimum touch target 56 × 56 px at 1 m

  • Readability and comfort guidelines for extended use
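One way to read the 56 × 56 px at 1 m guideline is as a constant angular size: a target should subtend the same visual angle regardless of how far away the panel sits. Under that reading, and assuming a linear pixel-to-world mapping (the 1000 px/m factor below is a hypothetical example, not an Omni constant), the minimum target scales linearly with distance:

```python
import math

REF_TARGET_PX = 56     # minimum touch target at the reference distance
REF_DISTANCE_M = 1.0   # reference distance from the guideline

def min_target_px(distance_m: float) -> float:
    """Scale the minimum target linearly with distance so its
    angular size stays constant (assumes linear px-to-world mapping)."""
    return REF_TARGET_PX * distance_m / REF_DISTANCE_M

def angular_size_deg(size_px: float, distance_m: float,
                     px_per_m: float = 1000.0) -> float:
    """Visual angle subtended by a target, for a hypothetical
    px_per_m mapping between panel pixels and world meters."""
    size_m = size_px / px_per_m
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))
```

With these assumptions, a 56 px target at 1 m and a 112 px target at 2 m subtend the same visual angle, which is the property the ergonomic guideline is protecting.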

MR-Ready Visual System

  • Transparent and glass-style UI layers

  • Depth layering with blur

  • Real-world contrast validation

System Infrastructure

  • Tokenized foundations (color, radius, elevation, typography)

  • Multi-theme support

  • Shared semantics across VR and 2D kits

  • Dual-kit structure: VR spatial components, PC/Web/Mobile components
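The dual-kit idea with shared semantics can be sketched as one semantic token resolving to different values per kit. Token names and values here are hypothetical placeholders, not Omni's actual tokens; the point is the structure: one name, two platform resolutions.

```python
# Sketch: shared semantic tokens resolved per kit (values hypothetical).
# The same token name is used by both the VR kit and the 2D (PC/Web/Mobile)
# kit, but resolves to platform-appropriate values.
SEMANTIC_TOKENS = {
    "radius/panel": {"vr": 24, "2d": 8},        # spatial UI uses softer corners
    "elevation/overlay": {"vr": 0.05, "2d": 4}, # meters in VR, dp on 2D surfaces
}

def resolve(token: str, kit: str):
    """Look up a shared semantic token for a given kit ('vr' or '2d')."""
    return SEMANTIC_TOKENS[token][kit]
```

Because both kits consume the same token names, a design decision made once at the semantic level propagates to every surface without per-platform redefinition.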

System Continuity

Omni 3.0 evolved alongside XR products and was continuously validated through real product implementation. Its application at runtime and system-level execution is detailed in Case 2: VRS.

Outcome & Impact

System coherence

Established shared design foundations that enabled consistent decision-making across regions and products, reducing divergence in UI patterns as the ecosystem expanded.

Collaboration at scale

Reduced duplicated design efforts and alignment overhead across teams, enabling faster iteration and more consistent cross-product experiences.

Durable foundations

Design foundations continued to be adopted across VIVERSE and the broader VIVE ecosystem, supporting new product directions without requiring teams to redefine core UI patterns.

Strategic resilience

Contributed to a resilient system architecture that allowed the platform to evolve alongside shifting product directions, without fragmenting user experience or design consistency.

Reflection

A design system is not just a library — it is the structure that enables teams to think, build, and evolve together.

As Omni matured, my focus shifted from building components to leading its evolution across the organization.

By establishing governance practices, evolving its architecture, and exploring how tokens and workflows could integrate with AI, my work became less about delivering components and more about strengthening the infrastructure that prepared both the system — and the teams behind it — for what comes next.

If you’re building products where clarity and structure truly matter, I’d be glad to connect.

Based in Taiwan · Open to global opportunities

© 2026 Claire Lee
