UX Design Analysis of the AR Virtual Fitting Room
Client: Everlight Chemical Industrial Corp
Redefining Spatial Interfaces: Bridging 2D UI and 3D Human Interaction.
The AR Virtual Fitting Room transforms Everlight Chemical’s complex material science into an intuitive, interactive retail experience. I designed the system to bridge chemistry and consumer imagination, turning color pigments and textile coatings into tangible fashion experiences through augmented reality.
The installation invites visitors to stand before a full-height screen equipped with a depth-sensing camera. As the system detects a user, it projects a life-sized reflection in real time and overlays virtual clothing models—tops, jackets, and pants—rendered with Everlight’s proprietary pigments and finishes. Users can select gender, garment category, and style, view themselves wearing the products instantly, capture photos, and share them via QR code.
Challenge
The UX vision was to humanize AR fitting by making the interface spatially adaptive.
Instead of treating the screen as a flat layer, the new concept positioned all interactive components in a 3D coordinate system relative to the user’s body—essentially treating the visitor as the center of the interface.
Key principles:
- Embodied interaction: design that reacts to the body, not the mouse.
- Spatial ergonomics: comfort and reach matter as much as gesture recognition.
- Dynamic scaling: interface adjusts based on the user’s real-time proportions.
The primary design challenge was usability in spatial interfaces.
Early prototypes revealed:
- Frequent false activations when users raised or lowered their arms.
- UI elements placed at fixed screen coordinates, ignoring differences in user height and distance.
- A lack of ergonomic feedback, leading to interaction fatigue during repeated trials.
From a research standpoint, the team needed to translate 2D interface conventions into a 3D spatial experience while ensuring accessibility across a diverse user base (height 155–185 cm, distance 1.5–2 m).
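The dynamic-scaling idea can be sketched in code. The snippet below is a minimal illustration, not the production implementation: `scale_ui` is a hypothetical function that linearly interpolates a button-row anchor height and a button scale factor across the supported user ranges (155–185 cm tall, 1.5–2 m away); the specific constants are illustrative.

```python
def scale_ui(user_height_cm, user_distance_m,
             height_range=(155.0, 185.0), distance_range=(1.5, 2.0)):
    """Map the tracked user's height and distance to UI parameters.

    Returns the vertical anchor for the button row (as a fraction of
    the user's height) and a scale factor for button size, both
    interpolated linearly across the supported ranges.
    """
    def lerp01(value, lo, hi):
        # Clamp to the supported range, then normalize to 0..1.
        return min(max((value - lo) / (hi - lo), 0.0), 1.0)

    t_height = lerp01(user_height_cm, *height_range)
    t_dist = lerp01(user_distance_m, *distance_range)

    # Anchor buttons near shoulder height: ~0.78 of body height for
    # shorter users, ~0.82 for taller ones (illustrative constants).
    anchor_fraction = 0.78 + 0.04 * t_height
    # Enlarge buttons as the user steps back, so on-screen targets
    # stay comfortably selectable at 2 m.
    button_scale = 1.0 + 0.5 * t_dist
    return anchor_fraction, button_scale
```

Clamping at the range edges means users slightly outside the tested envelope still get a usable, if not optimal, layout.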
Execution
1. User Testing & Observation
- Conducted with 5 participants (4 experienced with AR games, 1 Kinect user).
- Mapped usability issues such as mis-triggers, unclear feedback loops, and awkward interaction distances.
2. Iterative Prototyping
- V1: 2D overlay UI (portrait orientation, 768×1024).
- V2: 3D UI framework using Kinect’s skeletal data for body-anchored positioning.
- Buttons scaled dynamically; “Home” was replaced by “Reload” for clarity; gender and clothing categories were added.
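Body-anchored positioning can be sketched as follows. This is an illustrative Python sketch, not the shipped code: `joints` stands in for the per-frame skeletal data a tracker like Kinect reports, and the joint names and 0.25 m offset are assumptions.

```python
def body_anchored_buttons(joints, offset=0.25):
    """Place menu buttons relative to the tracked skeleton rather than
    at fixed screen coordinates.

    `joints` maps joint name -> (x, y, z) in meters, as a skeletal
    tracker would report (hypothetical layout). Buttons are anchored
    just outside each shoulder, so they stay within reach regardless
    of the user's height or distance from the sensor.
    """
    lx, ly, lz = joints["shoulder_left"]
    rx, ry, rz = joints["shoulder_right"]
    return {
        # Left menu column sits `offset` meters outside the left
        # shoulder, at shoulder height and depth.
        "menu_left": (lx - offset, ly, lz),
        # Right column mirrors it on the other side.
        "menu_right": (rx + offset, ry, rz),
    }
```

Because the anchors are recomputed from the skeleton every frame, the menu follows the user as they move, which is what makes the visitor the center of the interface.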
3. Validation
- Testing confirmed smoother operation within the 1.5–2 m interaction range.
- Gesture success rate and visual clarity improved notably across trials.
1st Prototype: 2D overlay UI
2nd Prototype: 3D UI framework
Result / Impact
- Reduced mis-trigger rate through depth-based interaction mapping.
- Increased user satisfaction — participants described the experience as “natural,” “responsive,” and “fun.”
- Established a reusable UX framework for future AR retail and exhibition projects, emphasizing body-anchored UI and sensor-adaptive scaling.
- Transformed a technical pigment showcase into an immersive, human-centered brand experience.
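The depth-based interaction mapping behind the reduced mis-trigger rate can be sketched like this. It is a simplified illustration under assumed conventions (z measured as distance from the sensor, so a hand pushed toward the screen has a smaller z than the torso); `is_press` and its thresholds are hypothetical names and values.

```python
def is_press(hand, torso, button_xy, reach_threshold=0.35, radius=0.12):
    """Depth-based activation: a button fires only when the hand is
    pushed toward the screen, not merely raised.

    `hand` and `torso` are (x, y, z) positions in meters from the
    depth sensor; `button_xy` is the button center in the x/y plane.
    Requiring both conditions filters out the false activations seen
    when users simply raised or lowered their arms.
    """
    extended = (torso[2] - hand[2]) > reach_threshold  # hand pushed forward
    dx = hand[0] - button_xy[0]
    dy = hand[1] - button_xy[1]
    over_button = (dx * dx + dy * dy) ** 0.5 < radius
    return extended and over_button
```

The key design choice is that the z-axis carries intent: hovering over a button is feedback, but only a deliberate push commits the action.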
Key UX Insights
Spatial Awareness
- Insight: Z-axis interaction is critical.
- Application: Design UI in layers of depth.
Accessibility
- Insight: Body diversity affects usability.
- Application: Dynamic scaling by height and reach.
Feedback Loop
- Insight: Users need real-time confirmation.
- Application: Add motion-responsive UI cues.
Ergonomics
- Insight: Fatigue reduces engagement.
- Application: Design gesture ranges within the comfort zone.
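The ergonomics insight can be expressed as a simple comfort-zone test. This is a hedged sketch, not the production logic: `in_comfort_zone` is a hypothetical check that a target sits between hip and shoulder height and within a fraction of the user's arm length, with illustrative defaults.

```python
def in_comfort_zone(target, shoulder, hip, arm_length, reach_fraction=0.8):
    """Check that an interactive target sits inside the user's
    comfortable gesture range: between hip and shoulder height, and
    no farther than a fraction of arm's length from the shoulder.

    Positions are (x, y, z) in meters. Targets outside this zone were
    the ones that caused fatigue during repeated trials.
    """
    x, y, z = target
    sx, sy, sz = shoulder
    within_height = hip[1] <= y <= sy
    dist = ((x - sx) ** 2 + (y - sy) ** 2 + (z - sz) ** 2) ** 0.5
    within_reach = dist <= reach_fraction * arm_length
    return within_height and within_reach
```

A layout pass can run every candidate button position through a check like this and relocate any target that fails it, keeping repeated gestures inside the low-fatigue envelope.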