People can't smell things through their screens (yet), so how might multi-sensory digital experiences extend to olfactory perception in helpful ways?
Research
Design
1 product designer 👋🏻
2021
In early March of 2020 I contracted COVID-19 and completely lost the ability to smell and taste. Almost a year and a half later, when working on this project, I could smell and taste at about 20% of my original capacity, but most things didn't smell or taste as they should. To give a few examples of what I mean, vegetable stock cooking on a stovetop would make the kitchen smell like a dumpster, and chocolate almost always tasted like blood.
Relatively speaking, very little is known about treating olfactory nerve damage effectively. Avenues for recovery are limited, don't work reliably, and take a very long time to have any effect at all. At times, the whole thing can feel a bit hopeless.
Despite the growing number of people struggling with long-term olfactory damage due to COVID, there are very few digital tools available to aid them in their rehabilitation and recovery. This lack of tooling inspired me to ideate on how a digital experience could help people with olfactory damage track changes in their sensory perception over time.
Challenges inherent to this problem include:
After speaking with other sufferers of long COVID, their loved ones, and a handful of long COVID researchers, I was able to craft two representative user personas for my app.
The feedback I received made it clear to me that this tool would not only benefit anosmia and parosmia sufferers, but also the medical staff, scientists, and loved ones striving to understand and empathize with their plight.
Usually, at this point in the design process I am preparing to make paper wireframes. In this case, however, I first had to resolve the challenge of gathering and visualizing the flavor data that would form the crux of the app’s functionality.
This concept has no direct competitors; the closest parallel I could think of was a cross between a wine or coffee tasting journal and a daily food-intake app. Research was necessary to discern how others have quantified complex flavor profiles into numeric, visually translatable data. To get a better idea, I looked specifically at visual aids used by sommeliers and gastronomists.
The pre-existing models I found presented a few problems:
The four-quadrant scatter plot employed frequently by whiskey sommeliers is very useful for things within a single family, but a bit too limiting if one wants to explore a broad variety of food types.
The flavor wheel, a mainstay in the third-wave coffee community, goes in the other direction and displays far too many flavor options to be useful in our case. Asking people with little to no sense of taste to discern between honey and molasses notes is a fool's errand, and a discouraging one at that.
After much thought and collaboration, I decided to move forward with displaying the data in two ways:
First, a radar chart will allow for a clear visual representation of multiple taste nodes coming together into one unique flavor profile shape. The app will limit entries to a handful of food options chosen for their common availability, variety, and reputation for difficulty in the anosmia community (yes, there are strong commonalities across the population).
For each food, a set number of taste nodes will be chosen for users to rank numerically between 1 and 10. While most nodes will relate to that specific food's flavor profile (sweet, fresh, etc.), others will relate to olfactory damage in general and be universally included (metallic, rancid). With a ranking for each node, corresponding points can be connected to form the flavor “shape”.
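As a rough sketch of how an entry like this could be modeled, the snippet below combines food-specific and universal taste nodes into a ranked profile and converts it into the closed polygon that a radar chart would draw. The node names and the example lemon entry are illustrative assumptions, not the app's final taxonomy.

```python
import math

# Universal nodes included with every food because they track olfactory
# damage in general; the food-specific nodes below are placeholders.
UNIVERSAL_NODES = ["metallic", "rancid"]

def flavor_profile(food_nodes, rankings):
    """Combine food-specific and universal taste nodes into an ordered
    profile, validating that every ranking sits on the 1-10 scale."""
    profile = {}
    for node in list(food_nodes) + UNIVERSAL_NODES:
        value = rankings[node]
        if not 1 <= value <= 10:
            raise ValueError(f"{node} ranking {value} is outside the 1-10 scale")
        profile[node] = value
    return profile

def radar_points(profile):
    """Map each node to an (angle, value) pair on a radar chart;
    repeating the first point closes the flavor 'shape'."""
    n = len(profile)
    angles = [2 * math.pi * i / n for i in range(n)]
    values = list(profile.values())
    return angles + angles[:1], values + values[:1]

# Hypothetical lemon entry: three food-specific nodes plus the universals.
lemon = flavor_profile(
    ["sweet", "sour", "fresh"],
    {"sweet": 7, "sour": 4, "fresh": 5, "metallic": 6, "rancid": 2},
)
angles, values = radar_points(lemon)
```

Plotting a user's shape and the control group's shape on the same axes is what makes the differences between the two immediately visible.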
When one looks at the data like this, differences between the user and control group are immediately discernible. It becomes clear that these two entities are not having the same sensory experience at all.
This data alone doesn't show everything we want, though. Maybe a user’s ability to taste the sweet notes of a food specifically is improving over time. I wanted a way to see nuanced areas of improvement on a single node of taste as well.
For this, a line graph with a single flavor’s numerical ranking on the Y axis and dates of entry on the X axis tracks whether a user is getting closer to or farther from the control data. In this example, on July 1st the user ranked the sweet notes of a lemon at 7 versus the control's ranking of 3, but by late August their perception of the sweet node dropped to 4, only 1 point away from the control. Their taste is evolving in a way that suggests healthy olfactory nerve regeneration.
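The single-node trend above boils down to plotting the gap between the user's ranking and the control's over time. A minimal sketch, reusing the lemon 'sweet' example (the exact August date is assumed for illustration):

```python
def distance_from_control(entries, control):
    """For a single taste node, return (date, gap) pairs where gap is the
    absolute difference between the user's ranking and the control
    group's; a shrinking gap suggests perception converging on control."""
    return [(date, abs(value - control)) for date, value in entries]

# User's 'sweet' rankings for lemon over time; the control ranks it 3.
sweet_entries = [("2021-07-01", 7), ("2021-08-28", 4)]
gaps = distance_from_control(sweet_entries, control=3)
# gaps → [("2021-07-01", 4), ("2021-08-28", 1)]
```

The gap series is exactly what the line graph visualizes: a downward slope toward zero reads as recovery on that node.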
I split the task of digital wireframing and prototyping into two parts: the native mobile app and web experience. I envisioned the native mobile app being specifically for account holders creating and keeping track of entries, while the web app would have both a login portal for registered users and opportunities for the public to explore visualization of aggregate user data.
For this project, I spoke with four respondents of varying ages and genders in a moderated usability test, asking them to walk me through creating a new taste entry in the native mobile app. User feedback focused on further adjusting navigation flows and editing capabilities:
The original visual concept came from a poster I made:
Otherfood is how I've come to describe foods that I can taste but which now taste completely different than they once did. Almost familiar, but not. I chose a simple, high contrast palette of warm neutrals and black to preserve the data as the focal point of the experience. Limiting the typeface choices to two grotesks ensured cohesion and readability. I added liquified graphical background overlays for a bit of spice and to reinforce the theme of distorted perception.
Check out my hi-fi prototypes here (mobile app), here (mobile web), and here (desktop web). These will move over to Figma and get expanded, eventually.
This is a tool that I'd like to use and based on that fact alone, I'd like to move forward with building it. When I find the time, next steps are as follows:
This work is currently on hold, but I will continue to update this case study as I move forward with the build process :)