0-1 Product Design @ Virdio

Overview

Virdio is a startup providing engaging home workouts through screen-based AR.

The Virdio product uses machine vision to detect a user's space through a webcam, and places virtual objects on their screen, similar to how a Microsoft Kinect works.

This work is currently under NDA; you can contact me to learn more.

My Role

I joined the team as the Founding Product Designer to research, design, and prototype interactive AR activities, mobile and desktop apps, internal tools, and refine the Virdio design system.

Being a designer in a startup, I was also happily involved in extra tasks like marketing, web design, content design, and video editing.

Duration: 1 year

UI & Design Systems
Research
Prototyping
Gamification

Results

  • Led the design of Virdio's product from early-stage engineering prototypes to launched apps on Android, iOS, web, and desktop platforms

  • Designed and prototyped AR workouts

  • Designed the AR setup experience

  • Established the Virdio design system

  • Created internal tools for managing workout classes

  • Designed and edited content for marketing

Team

Hridae Walia (Me) - Product Designer

Johnatan Uribe - Product Designer and Advisor (part-time)

Executives, Engineers, Marketing

Deliverables

High Fidelity Wireframes

Design System Documentation

AR Prototypes

Research Reports

Tools

Figma

Adobe After Effects

The Challenge

Design and launch the *new* Virdio!

Virdio aimed to provide AR workout experiences through phones, laptops, and even TVs. When I joined, the Virdio product had been designed by their engineers and was B2B-focused. Virdio leadership wanted to redefine the product to be primarily B2C (more like Apple Fitness+), vastly expanding their potential user base.

Virdio wanted to tackle the home workout market during COVID-19, having identified a user need for more engaging workouts, and saw AR as the way to deliver them. The target users were people with intermediate-level fitness experience.

Key Constraints

Tech Constraint

Virdio only required the user to have a device with a webcam, which meant all designs needed to be built around what the machine vision system could recognize, while staying computationally light enough to run smoothly on all kinds of devices.

Designing for Home Setups

The population we were designing for was not able to set up and initiate the experience themselves, so it had to be designed to be operated and maintained by staff. The station was therefore designed with staff members in mind, and I designed and prototyped experiences that could be maintained independently.

The Approach

Initial Research & Exploration

Exploring design considerations for people with Dementia, evaluating possible technologies, conducting competitor analysis, creating a research database.

Prototyping & Experimentation

Taking learnings and design principles from initial research to create and explore low fidelity prototypes.

Internal Evaluation & Testing

Testing prototypes internally to evaluate their usability, effectiveness, and viability before they are ready for testing with residents.

Refine Prototypes

Solidifying prototypes so they can work with the station in a 'plug and play' capacity, and ensuring they can survive on-site testing.

Test & Implement

Observing residents as they try experiences using the newly developed prototypes, with facility staff operating the experiences alongside residents.

Digital Library Creation

Once the station had a set collection of content and regular bookings, we developed a digital content library to accompany it.

Process: Iterate, Test, Refine, Test, Hand-off

Initial Research & Exploration

Understanding the Stages of Dementia

The residents we were designing for were mostly at the mid-to-late stages of Dementia, so it was important to learn about their quality of life, behaviors, and accessibility considerations. I was directed to the Reisberg Scale to understand these stages.

On this scale, our target audience lies between stages 4 and 6. Key characteristics of this population are:

  • Memory deficiencies

  • Reduced sensitivity to stimuli

  • Difficulty concentrating

  • Difficulty socializing

  • Behaviors can be similar to those of children with Autism

Another key point to understand was that this project was meant for life enrichment, not an attempt to cure Dementia in any way (which is not currently possible).

Researching Multisensory Experiences

Building off our insights from the stages of Dementia, we wanted to research the efficacy of (multi) sensory experiences for people with Dementia, Autism, and in general.

Key sensory stimuli were:

  • Scent - Strongly connected to memory

  • Touch - Can facilitate immersion and a sense of presence

We also explored potential implementation techniques:

  • Diffusers - I researched the use of diffusers for controlling and emitting scent according to content displayed on screen

  • Ultraleap Haptics - I explored the capabilities of the Ultraleap Ultrasonic Haptic device by trying its demo content, and attempting to make my own demos. It was not a good fit because the tactile feedback was not strong enough, and the small surface area made it difficult to implement.

Slide from a tech review document I prepared
Playing around with the Ultraleap demos

Evaluating the Potential of AR/VR

The leadership of the SFCJL was also curious about AR/VR and wanted us to explore its use for the project.

As a long time AR/VR enthusiast, I used my expertise to research existing immersive technologies and analyzed their application with primary and secondary research.

Since we were dealing with a sensitive population, our research showed that Virtual/Augmented Reality solutions were not suitable.

People with mid to late stage Dementia have difficulty concentrating and struggle with memory, which includes memory of where they are and what they are doing. The inclusion of a virtual reality would further disconnect them from their reality and cause confusion or even harm.

I synthesized my findings into a slide deck comparing different immersive and interactive devices, which was presented to leadership.

Slide comparing active/passive interaction with level of immersion

Synthesizing Research & Defining Principles

I also used our learnings about the problem space and the facility to create an Ecosystem Diagram mapping out how different stakeholders would interact with our solution, and how it exists within the SFCJL facility.

Our research led us to form design principles that would serve as guides throughout the product development process.

Adaptive

Safe

Social

Rich

Screenshot of the research database I helped create & maintain
Ecosystem Diagram I created

Interval

Fast Forward

After the initial research was complete, I took a break from the project and worked at another role for a year while Maria Mortati and Scott Minneman built the actual station.

The first iteration of the station featured immersive and socially rich experiences, with multisensory output including scent, sound, haptics, and touch.

Iteration 1 of the MCES (2021)

Critique

I rejoined the project in 2023 when they needed the station to have more interactive and exciting experiences, while also building off some of the ideas that had been partially worked on before.

Another key consideration for the newer experiences was how they could be used independently: primarily experienced by residents while being operated and maintained by the SFCJL staff.

This challenge was one I was very excited to tackle.

Iteration 1 of the MCES (2021)

Output

Cat Interactive Experience

Challenge

Creating more interactive, immersive, and personal experiences for the residents that can be easily set up.

A resident using the cat experience, assisted by facility staff

My Input

A key insight uncovered during our research was that people with Autism and people with mid-to-late stage Dementia have similar needs when it comes to sensory stimulation.

I took inspiration from a sensory product meant for children with autism that we were experimenting with, and created an interactive cat petting experience!

The experience uses pressure sensors embedded in a stuffed cat toy to detect when it's being petted, and outputs an immersive cat petting video that includes the deep bass of cat purrs. The experience provides haptic feedback through LFEs (bass emitters).

Why cats? I used cats as the basis for this experience because the theme enabled the experience to combine tactile feedback (petting), haptics (purring), and immersion (cat video) in a natural way. Plus I like cats, and so do a lot of the residents.

How did I make it easy to set up? With this project I needed to think about final implementation throughout the process, so I used replicable components where applicable (bass emitters, stuffed cats, audio interface) and only made custom designs when needed (like the Arduino code and setup).
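
To make that concrete, here is a minimal sketch of the petting-trigger logic in Arduino-style C++. It is an illustration only, not the production code: the pin, threshold, and serial messages are placeholder assumptions.

```cpp
// Hedged sketch of the petting trigger, not the actual MCES Arduino code.
// Pin number, threshold, and serial protocol are illustrative assumptions.
const int PRESSURE_PIN = A0;    // analog pin wired to the pressure sensor in the cat
const int PET_THRESHOLD = 300;  // raw ADC value treated as "being petted" (assumed)
bool petting = false;

void setup() {
  Serial.begin(9600);           // the media player on the connected PC listens here
}

void loop() {
  int reading = analogRead(PRESSURE_PIN);
  bool pressed = reading > PET_THRESHOLD;

  // Only send a message when the state changes, so the player isn't flooded.
  if (pressed && !petting) {
    Serial.println("PET_START"); // PC side starts the cat video and purr track
    petting = true;
  } else if (!pressed && petting) {
    Serial.println("PET_STOP");  // PC side fades the purr back out
    petting = false;
  }
  delay(50);                     // simple debounce / polling interval
}
```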

Outcome

Key points of feedback we got for improvements were:

  • People wanted to pick up and hold the cats (how did we not think of that?)

  • The platform could be more comfortable

Residents that we tested this experience with were engaged throughout their session, as they enjoyed petting the cats. The experience quickly became a part of the regular library of experiences available to the residents.

Accessible Haptic Feedback

Challenge

A key finding was the effectiveness of haptic feedback and its ease of implementation. At the time, Scott Minneman had created a physical platform for providing haptic feedback for the residents. But many of the residents were wheelchair users, and the form of the platform made it impossible for them to experience it.

I needed to adapt the large platform into a form that could be easily accessible for wheelchair-using residents.

The station with the original haptic feedback platform under the chair
A resident using my version of the haptic feedback platform

My Input

The platform needed to be much smaller, so I explored using fewer haptic emitters, since the original platform used four. I experimented with one large LFE (Low Frequency Emitter, or bass shaker), which allowed for a stronger sensation and less surface area at the cost of more power and height.


I also experimented with taking the existing smaller LFEs and using only one of them on a footrest.


Using a footrest was a key decision, as it allowed the user to feel the haptics without having to climb onto anything. The tradeoff was that the haptic sensation would not spread as much, since it was focused on fewer points.

Outcome

Key points of feedback we got for improvements were:

  • Some wanted more powerful haptic feedback

  • Desire for a more adjustable platform

This prototype was so much fun to use that I kept one for myself and continue to use it for games and entertainment; it just works with anything.

MCES Digital Content Library

Challenge

As the MCES matured, the need for a library system arose. With more residents becoming regular users, we needed a system that could track their interests and provide a personalized library.

My Input

Principal designer Maria Mortati had brought in a group of students from CCA's MDes program to design the first iteration and lay the foundations for this part of the project. I got involved when they handed the design off to Maria, and we had to solidify it and hand it off to a developer to build a functioning site.

Solidifying

The student group did a great job laying the foundation of the web app, with coherent IA and page layouts, but the prototype needed an easy-to-implement design system and a more streamlined user flow.

We wanted to strip the web app down to its core components: the resident profiles (with their content preferences) and the experience library (personalized playlists for each resident).
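
As a rough illustration of those two core objects, here is a small, hypothetical data model; the names and fields are my own sketch, not the shipped schema.

```cpp
// Hypothetical sketch of the library's core objects; not the implemented schema.
#include <string>
#include <vector>

struct ResidentProfile {
    std::string name;
    std::vector<std::string> contentPreferences;  // e.g. "cats", "nature", "music"
};

struct Experience {
    std::string title;
    std::vector<std::string> tags;                // matched against resident preferences
};

struct Playlist {
    ResidentProfile resident;
    std::vector<Experience> experiences;          // the resident's personalized library
};
```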

Simplifying the design system

The first thing I did was change the type system from the custom font the students used to an easily available Google Font, so the developer could access it easily.

I also cut down on the number of colors used in the UI to simplify the functionality they convey.

Implementation & On-Site Testing

The Combined Experience

A lot of time was spent on solidifying the experiences we put together.

That meant each experience needed to connect with the rest of the station in a plug-and-play capacity, and needed to be able to withstand a liiitle bit of abuse.

A resident using the haptic footrest with a driving experience, in which the resident plays a racing game using a Logitech Racing Wheel.

An Unexpected Finding!

While putting together a racing experience, we just had the residents play a 'practice race' because we wanted to evaluate the sensation of driving with the haptic footrest. But we did not designate a goal for the experience.

The resident enjoyed the experience, and was energetic and active. But he refused to leave.

At first, I was happy that a resident enjoyed an experience so much that they did not want to leave. But the lack of a well-defined ending meant the resident did not know it had ended, and he was frustrated that we were taking it away from him. This relates to one of our key early findings:

People with mid to late stage Dementia can have behaviors similar to those of children with Autism

Next Steps

My involvement in the project is largely over; it is currently in the final stages of being handed off to the SFCJL for continued use, headed by Maria Mortati and Scott Minneman. I'm now working with Maria Mortati on turning the haptic footrest and cat experience into independent products. I also want to help write a paper on this project that establishes design principles and advocates for more design intervention in this space.

Reflection

Older adults are a severely underserved population, with or without Dementia. This project really fueled me to pay more attention to their needs and how we might address them.

This is the best project I have ever had the privilege to work on, and it changed the way I think about my future and how designers can create impact. I've always been a designer who loves to work hands-on: screens, toys, games, physical form, all of it. This was a unique chance to stretch all my skills and have the independence to create objects from scratch, as challenging as that is.

I was the founding product designer at Virdio, a startup aiming to create engaging at-home workout experiences delivered through Augmented Reality.

While I was there, I was responsible for designing the Virdio apps, AR activities, and design system, conducting user research, and collaborating with engineers to ensure the quality of shipped features. Being in a startup, I was also involved in many other smaller roles like marketing, content creation, and graphic design.

This case study is an overview of my role, output, and methodologies.

My Role

Defining the Virdio Design System

Before I joined Virdio, their product was designed by their engineers, as is the case with many startups.

While designing the wireframes for the Virdio apps, I also defined the design system. I wanted the system to represent Virdio's brand while also allowing the content (workouts) to shine in action. I did not create all the icons; my focus was on deciding how colors and symbols could be used to communicate functionality.

The most fun part for me was designing elements that would be used across the AR activities: I used common motifs like traffic cones and boxes as UI elements to communicate that their functionality in AR was the same as that of their real-life counterparts.

The design system included dynamic states for the streaming UI, 3D elements to represent AR artifacts, and a light visual design inspired by Google's Material 3.

The Challenge

My main challenge with this task was creating a system that easily adapts to different touchpoints, as that was a key goal for the product.

I addressed this challenge through thorough documentation of component behavior and intended usage, while also working closely with developers to help them implement the designs.

Designing the Mobile, Desktop, and Tablet Apps

A key goal for Virdio was to have their services available on all popular platforms, which meant I was in charge of designing for the mobile, desktop, and tablet touchpoints simultaneously.

The Virdio app design was split into two parts: the 'browsing UI' for filtering, exploring, and booking fitness classes and tracking progress, and the 'streaming UI' for viewing workout classes, which also included AR equipment that the user would interact with using their body.

The browsing UI was designed to be quick and easy to navigate; it included a filterable list of class sessions and provided a detailed description on the class detail page.

The streaming UI was designed to be non-intrusive so that the AR objects and the user's body would remain visible, while still clearly showing key elements like fitness metrics.

The Challenge

It was challenging to design across three platforms at once while maintaining visual and functional consistency. We frequently conducted design reviews and evaluated the quality of the apps against each other.

Another big challenge was maintaining the quality of the workout viewing/attending experience across devices: a non-intrusive UI is much easier to achieve on larger screens, where there is more space, and much harder on mobile screens. I created many explorations of how a mobile UI could work while researching how competing products deal with this issue. The result was that the larger touchpoints had a more expansive UI, while the mobile UI had several dynamic elements that would react to the content on screen.

Designing the AR workouts and setup experience

This role was my first experience designing for AR. I had designed for VR before, but Virdio's product implemented AR through the user's webcam using machine vision, so it was more like designing for a Microsoft Kinect-like experience than for an HMD.

All of the AR activities were made to mimic their real-life counterparts, with the AR elements serving as signifiers communicating the correct way to do an activity (like having correct squatting posture) and when an action is successful (like successfully completing a squat). I used color states to communicate success/failure and shape to communicate the affordances of an activity.
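
As an illustration of the color-state idea, here is a small hypothetical sketch; the state names and palette are placeholders, not Virdio's actual design tokens or code.

```cpp
// Illustrative sketch (not Virdio's implementation) of mapping activity states
// to colors so AR feedback reads at a glance.
#include <cstdint>

enum class ActivityState { Idle, InProgress, Success, Failure };

struct Color { std::uint8_t r, g, b; };

// Placeholder palette; the real design system defined its own color tokens.
Color colorFor(ActivityState state) {
    switch (state) {
        case ActivityState::InProgress: return {255, 200,   0};  // amber: keep going
        case ActivityState::Success:    return {  0, 200,  80};  // green: rep counted
        case ActivityState::Failure:    return {220,  60,  60};  // red: adjust posture
        default:                        return {255, 255, 255};  // neutral idle state
    }
}
```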

One constraint was the available compute. Virdio's app was meant to be lightweight and flexible, so while the AR elements were placed and tracked in 3D space, they were always rendered as 2D elements to stay as light as possible.
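
A minimal sketch of what "tracked in 3D, rendered in 2D" could look like, assuming a simple pinhole projection; this is my illustration of the idea, not Virdio's rendering code.

```cpp
// Hypothetical sketch: an AR element keeps a 3D anchor in the tracked space but
// is drawn as a flat 2D sprite at that anchor's screen projection, keeping
// rendering cheap on low-powered devices.
struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

// Simple pinhole projection from camera space to screen pixels.
// focalLength and the principal point (cx, cy) would come from calibration.
Vec2 projectToScreen(const Vec3& anchor, float focalLength, float cx, float cy) {
    return { cx + focalLength * anchor.x / anchor.z,
             cy - focalLength * anchor.y / anchor.z };
}

// The 2D sprite (e.g. a traffic-cone graphic) is drawn at that point, scaled by
// 1/z so nearer objects appear larger, with no 3D mesh involved.
float spriteScale(const Vec3& anchor, float baseScale) {
    return baseScale / anchor.z;
}
```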

The Challenge

The biggest challenge for the AR experience was onboarding. Virdio's AR is screen-based, so it's meant to be used on laptops, smartphones, and other devices with webcams. The experiences also relied on spatial data for where the user would be performing their workouts, so it was extremely important to have a reliable AR setup experience that let users experience the workouts as intended.

The developers worked to simplify the setup requirements, so the user just needed to provide three points of reference to set up their space: two for width and one for depth. We designed a step-by-step system where the user would touch cones placed in their space, which gave visual feedback as they set up, so they could see if something was wrong and correct it as they went.
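
To make the three-point setup concrete, here is a hypothetical sketch of how two width points and one depth point could define the workout area; this is my assumption of the geometry, not the engineers' implementation.

```cpp
// Hypothetical sketch of deriving the workout area from the three calibration
// cones the user touches (two for width, one for depth).
#include <cmath>

struct Point2D { float x, z; };           // floor-plane coordinates from the vision system

struct WorkoutArea {
    Point2D center;
    float width;                          // distance between the two width cones
    float depth;                          // distance from the width line to the depth cone
};

WorkoutArea calibrate(Point2D widthLeft, Point2D widthRight, Point2D depthPoint) {
    WorkoutArea area;
    area.width  = std::hypot(widthRight.x - widthLeft.x, widthRight.z - widthLeft.z);
    Point2D mid = { (widthLeft.x + widthRight.x) / 2.0f,
                    (widthLeft.z + widthRight.z) / 2.0f };
    area.depth  = std::hypot(depthPoint.x - mid.x, depthPoint.z - mid.z);
    area.center = { (mid.x + depthPoint.x) / 2.0f, (mid.z + depthPoint.z) / 2.0f };
    return area;                          // AR equipment is then placed relative to this area
}
```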

Designing internal tools for broadcasting and organizing classes

Virdio relies on real fitness instructors to run classes, like Peloton but with AR. So I was also in charge of designing the workflow management tools for planning and posting fitness classes that would be viewed and booked by the Virdio users.

A key objective with designing the workflow management tool was making the timelines for tasks clear, and giving instructors the ability to leave glanceable notes.

The Challenge

The main challenge with designing this tool was that it was built on the developers' original design, and due to time constraints the new system needed to exist within that older design.

This required a restrained approach in the redesign. I created the new tool to be as simple as possible, serving key functions while still working within the older system.

Reflection

The biggest challenge

The most important skill I learned, and also the thing I struggled with most, was stakeholder management. It was my first experience working closely with engineers and executives, and I learned how to make tradeoffs and how to advocate for better design practices with stakeholders.

Providing a consistent experience

Providing a consistent experience across all touchpoints was a big challenge that I still think about. Virdio is not a traditional cross-platform product, because the device significantly influences the user experience. A mobile workout experience was completely different from a desktop one, and if I did it again I would give more time to creating bespoke experiences for each touchpoint, which would also require advocating for better design and more time investment from stakeholders.

In the end…

Working at Virdio was a transformative experience: I grew my skills as a designer, prototyper, and collaborator while getting to work on an incredibly cool, novel product that blended machine vision with Augmented Reality.

I'm grateful to have had the chance to impact a product with a high level of influence.