AI Workflow & Design Infrastructure

AI transformed how design is created — introducing speed and complexity at scale. The challenge was no longer creation, but integrating AI into workflows without breaking system coherence.

Duration

2023-2025

Platform

XR, Web, Mobile

Context

As XR systems became more coherent and scalable, a new bottleneck emerged — the process of creating content.

While interaction models had been unified, workflows for building scenes, tutorials, and experiences remained fragmented, manual, and difficult to scale.

Traditional workflows struggled to keep up:

  • 3D scene creation required significant time and effort, with repeated manual work

  • Tutorial production involved long iteration cycles across UX, UI, and 3D teams

  • As platforms like VIVERSE emphasized content diversity and rapid iteration, existing design processes became bottlenecks 

AI presented new opportunities — but only if treated as part of the system, integrated into design workflows and infrastructure rather than applied as isolated tools.

Problem

Generative AI accelerated creation — but design workflows were not built to support it.

As AI became more accessible, designers could rapidly generate layouts, visuals, and variations. However, without a shared framework, adoption often happened in isolated experiments — disconnected from design systems, UX principles, and product constraints.

This introduced new challenges across teams:

  • AI-generated outputs lacked consistency with existing design systems 

  • Design quality became harder to control as iteration speed increased 

  • Workflows diverged across tools instead of converging into shared processes 

  • Designers spent more time refining and aligning outputs than expected 

  • The role of designers became unclear in a “prompt-to-design” environment 

The challenge was not adopting AI faster, but evolving workflows and roles — enabling AI to scale creativity without breaking systems or collaboration.

My Role

I led the exploration of AI-driven design workflows — shaping how AI could move from isolated tools into a scalable part of design infrastructure.

As the work evolved, my role focused on defining where AI adds meaningful value, structuring design system knowledge into AI-readable formats, and aligning emerging workflows with real product needs across VRS and VIVERSE.

  • Defined workflow goals and identified high-impact areas where AI could meaningfully reduce friction

  • Defined early frameworks for structuring design system knowledge into AI-readable formats

  • Led experimentation, tool evaluation, and iterative prototyping with cross-functional teams

  • Aligned AI workflows with real product scenarios — ensuring practical applicability beyond concepts

  • Facilitated discussions on evolving design roles and collaboration models in AI-driven workflows

While this work was highly collaborative, I focused on framing problems, shaping direction, and connecting system thinking with practical execution.

Process

AI as a Creative Partner

Phase 1 — Systematic Scene Design with AI

System-generated scene combining environment, dynamics, and spatial composition

From product-driven insight to a structured scene design system

As XR products evolved, scene creation became increasingly repetitive and costly.

I introduced a systematic approach by decomposing environments into modular layers:
• Outer — atmosphere and mood
• Middle — motion and environmental dynamics 
• Core — spatial composition, lighting, and sensory details 

This established a reusable scene-level system, reducing redundant production effort and enabling consistency across scenes.

AI was introduced within this structure — supporting ideation, exploration, and early asset generation without breaking system coherence.

This approach was later validated in product (VIVE Focus Vision).

Key Shift

Scene design evolved from repeated production into a scalable system.

Outer Layer — Environment
AI generates skybox variations to define mood, theme, and environmental tone

Middle Layer — Effects & Dynamics
AI enables dynamic behaviors — shaders, particles, and ambient motion

Core Layer — Spatial Composition
Materials, lighting, and spatial sound operate within a structured spatial system

AI as a Workflow Accelerator

Phase 2 — AI in Tutorial Creation Workflow

AI enables designers to validate motion and intent before 3D production

From system design to workflow transformation

As XR products expanded, tutorial content required frequent updates across devices, inputs, and interaction scenarios.

Previously, UX, UI, and 3D teams relied on iterative alignment—spending significant time refining animation before design review.

We introduced AI earlier in the workflow, enabling designers to validate motion and intent using generated animation before 3D production.

This shifted the workflow from production-first to validation-first.

From Iteration-Heavy to Validation-First

AI shifts validation upstream—transforming a production-heavy pipeline into a validation-first workflow

Key Shift

  • Validation shifts upstream—from post-production to pre-production 

  • Iteration shifts from manual refinement to AI-assisted loops 

  • Collaboration shifts from sequential handoffs to early alignment

Impact on 3D Production Workflow

Validation moves earlier in the process—reducing iteration cost and enabling faster team alignment

Workflow Steps

AI shifts validation earlier—allowing teams to align on motion before investing in 3D production

Step 1 - Capture intent
Record a reference video

Step 2 - Generate motion
AI creates an animation draft

Step 3 - Validate early
Review and align before 3D production

From Workflow Optimization to System-Level Design

Phase 3 — AI as Design Infrastructure

From workflow optimization to system-level scalability

AI is no longer just accelerating workflows—it is redefining how design is created, structured, and scaled.

Built on Omni’s tokenized foundation, AI generation becomes an extension of the design system—ensuring outputs remain consistent with system rules and visual language.

AI becomes a foundational layer—integrating design knowledge into the system itself

Bridging system definition and execution

With system-level intelligence in place, design creation becomes more structured, scalable, and less dependent on manual processes.

Design knowledge becomes executable—ensuring consistency at scale

As design creation evolves, collaboration fundamentally shifts

As AI becomes embedded in the system, it reshapes not only design creation—but also how designers collaborate.
The role of UX and UI designers shifted from sequential execution to shared, system-guided iteration.

AI transforms collaboration from sequential handoffs to shared, system-driven iteration

Key Shift

AI evolves from a workflow accelerator
into the system that defines how design is created.

Application Example — VIVERSE Avatar Memoji Generation

As a concrete application of this approach, the same system was applied to VIVERSE Avatar Memoji generation.

By structuring style constraints, visual rules, and reusable assets as inputs, AI-assisted generation enabled diverse avatar expressions while maintaining consistency with brand identity and platform aesthetics.

Rather than a standalone feature, Memoji generation validated this system in practice—demonstrating how AI-guided creation, grounded in design systems, can scale across use cases while balancing creative flexibility and product consistency.

To enable consistent and scalable generation, the AI layer needed to be redefined—not as a tool, but as a structured system.

Early experiments revealed limitations in controlling expression consistency, leading to a shift toward API-based approaches for scalable generation.

Rather than relying on a single tool, the focus shifted to structuring how inputs, constraints, and generation logic are defined and applied.

Design knowledge becomes a reusable input—enabling consistent and scalable Memoji generation

Outcome & Impact

This shift resulted in both operational and strategic impact:

Efficiency

Reduced manual effort and shortened iteration cycles across design creation.

Scalability

Established reusable, modular design structures across products and platforms.

Adoption

Transitioned AI workflows from experimentation into real product use.

Collaboration

Shifted design conversations from execution details toward system intent and quality.

Strategic value

Established AI as part of design infrastructure rather than isolated tools.

Reflection

AI clarified what should scale — and what should remain human.

Working with AI reshaped how I think about design — not as isolated outputs, but as systems that can be structured, extended, and integrated into workflows.

Rather than replacing design, AI revealed the importance of infrastructure — defining how knowledge, patterns, and decisions can scale while maintaining coherence.

This experience reinforced my belief that design is not replaced by AI, but redefined through systems.

If you’re building products where clarity and structure truly matter, I’d be glad to connect.

Based in Taiwan · Open to global opportunities

© 2026 Claire Lee
