Spotify — how do you think I feel today?

A case study on designing a mood filter feature for Spotify’s UI

Cornelia Herischek
9 min read · Jan 8, 2021

It’s one of those Mondays. I’m about to start a new week of UX/UI work, freshly assigned to a new project — but not feeling fresh at all. I love what I do, but, boy, do I need to kick myself in the** on this Monday morning. “I need to get myself in work mode. Somehow…”

Music! That’s the answer.

A quick shuffle through Spotify’s playlists, however, irritates me:
Focus music… sounds good at first. But “Classical Focus”? Way too complex for my way of getting into a flow state. I need something easygoing. How about “Happy & Relaxed”? Sounds perfect. But wait, what is this? Pop anthems?!

Frustrated, I close the app again.
This could be easier, couldn’t it?

This week’s project at Ironhack’s UX/UI Bootcamp covered adding a feature to an existing app’s UI. The scope of the project was to analyze the original design and deliver an interactive hi-fi prototype that stays coherent with the app’s UI, i.e. not disrupting its style.

This time, the task I was assigned came from a real user: Bettina.
Her wish was to have a Spotify feature that lets her mark music with faces of different emotions, so that the app learns from her and, in the future, suggests music based on her mood.

As much as I could relate to this, I needed to remind myself that I’m not the user. I wanted to analyze my user’s actual intentions behind the given statement to understand what she really needed. This is why I translated Bettina’s task into a Job-to-be-done that outlined her motivations more clearly:

Bettina wants to categorize music depending on what she feels like, so that she has quick access to personalized playlists that fit both her mood and her personal taste.

“But Spotify does have playlists sorted by mood already!”

Yes, Spotify offers mood playlists in its category menu. But those are generic, pre-curated playlists that aren’t individualized based on the user’s actual preferences and listening history.

“But there must be apps that do this!”

Of course, I did the competitor analysis and found some underdog apps trying to offer such a service. But, without publicly calling them out: none of them were particularly strong in terms of aesthetics or usability.

I don’t blame any of these apps, though. The topic of sorting music by mood is highly complex. First of all, a comprehensive database of both the music on offer and the users’ input would be required to make this work at all. Clearly, it’s the big players with a large user base that you’d expect to have access to such data.
Also, when it comes to moods, you don’t just have a simple trinity of happy / mellow / sad. The range of emotions is much more diverse than that. And how people associate their own emotions with certain music styles, songs, etc. is highly individual.
How could such a complex problem be solved?

💡 The initial idea

Obviously, I’m not a data scientist or analyst; frankly, not an expert on numbers at all… you can ask my former maths teachers. (If any of them reads this: I’m sorry.) And, yes, my role in this project was to focus on the UI side and the visual design.
However, I had this one initial idea that I couldn’t get out of my head:
I saw one of the competitors sorting mood data into an X-Y diagram of certain mood qualities. My limited knowledge of data told me, however, that the wide spectrum of mood qualities needed to extend beyond a 2D space.
Especially since the idea was to create a feature that uses the user’s data input not only to sort songs but also to predict and suggest similar songs. And, talking of business needs, it would certainly make more sense for Spotify if the data and insights already collected could be transferred to other uses as well. This is why I decided on a slightly more complex data model — a concept that would be crucial for the feature design.

I visualized a rough draft of how I imagined the data should be categorized and decided on three qualitative criteria for moods: energy, brightness, and intensity. How this looks in practice, you’ll see in a minute.

Sketch of how mood qualities could be placed in a 3D scatterplot
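
For the technically curious, here is a minimal, purely illustrative sketch of how songs could live in this 3D mood space and how similar songs could be suggested from it. The names (MoodPoint, suggest_similar) and the distance-based matching are my own assumptions, not anything Spotify actually does.

```python
from dataclasses import dataclass
from math import dist


@dataclass
class MoodPoint:
    """A song's position in the 3D mood space (all values normalized to 0.0–1.0)."""
    song_id: str
    energy: float      # calm (0.0) to energetic (1.0)
    brightness: float  # dark (0.0) to bright (1.0)
    intensity: float   # low (0.0) to high (1.0)


def suggest_similar(target: MoodPoint, library: list[MoodPoint], k: int = 5) -> list[MoodPoint]:
    """Suggest the k songs whose mood coordinates sit closest to the target song."""
    others = [p for p in library if p.song_id != target.song_id]
    return sorted(
        others,
        key=lambda p: dist(
            (p.energy, p.brightness, p.intensity),
            (target.energy, target.brightness, target.intensity),
        ),
    )[:k]
```

The point of the three axes is exactly this: once every rated song is a point in the same space, “songs that feel similar” becomes a simple question of distance.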

I was now able to move on to the visual design process. I analyzed Spotify’s UI and was especially curious to see what the users’ entry points to the app were when listening to music. I wanted to give the user quick and easy access to setting a mood while they’d be listening to a particular song.

🧐 What? Where?

I found out that there were two entry points to the app:
The first: you’d access the app via the regular menu icon and land on the home screen with a playlist overview and a minimized music navigation bar at the bottom. The second: quick access to the song via the miniature player (e.g. on the phone’s lock screen or the expandable top menu), which leads directly to the “current song playing” detail page.

The next question was what the feature should actually look like. Should it actually be “faces” (like my user stated in the beginning), i.e. big smileys placed prominently at the top of the page? Should the function be controlled with some other kind of emoji? Or maybe with abstract symbols?

I sketched some initial ideas — because putting things onto paper clears the head. I tried cute faces, stylized faces, shapes, weather symbols that could be associated with moods… But I soon noticed that the only way to stay consistent with the original Spotify UI was to create a functional icon that’s minimalistic and shaped from thin outlines only. When I analyzed the app’s design even further, I found that there was only one place where Spotify uses a human shape in its icons (“show artists” or “show song info”): a silhouette of a person from the belly upwards.
I decided to use and modify this for my design.

My next finding was that it would make the most sense to place the feature at several touchpoints within the app. The reason: in a hypothetical initial testing phase, data collection would be crucial. Only with comprehensive data could individualized playlists be generated, and the user needed to be motivated to provide that data quickly and without much effort. Because of this, I added the feature in three different places: within the music navigation bar on the home screen, on the song detail page, and in the extended song menu. By the way, this is already the standard for many of Spotify’s current features.

I then created the actual mood filter page, where you could specify a song’s mood in two steps: I) setting the main mood for the song, which would set the qualitative criteria on the X-Y axes, and II) refining the song’s mood with an intensity slider (which would define the location on the Z-axis) and/or additional associative tags.
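
To make those two steps a bit more tangible, here is a rough, purely hypothetical sketch of the record such an input could produce. The function and field names are my own illustration, not anything from Spotify’s actual codebase.

```python
def rate_song(song_id: str, energy: float, brightness: float,
              intensity: float = 0.5, tags: list[str] | None = None) -> dict:
    """Bundle the user's two-step mood input into one rating record.

    Step I sets the main mood (energy, brightness); step II optionally refines it
    with the intensity slider (pre-set to neutral) and associative tags.
    """
    return {
        "song_id": song_id,
        "energy": energy,          # X-axis: calm (0.0) to energetic (1.0)
        "brightness": brightness,  # Y-axis: dark (0.0) to bright (1.0)
        "intensity": intensity,    # Z-axis: slider, low (0.0) to high (1.0)
        "tags": tags or [],        # optional associative tags
    }


# Example: a calm but bright song, rated in step I only; the slider stays neutral.
rating = rate_song("some-track-id", energy=0.2, brightness=0.8)
```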

I built a first mid-fi prototype and handed it to some users who regularly use Spotify for testing. The task was: “You’re listening to music and want to add a ‘mood’ to the song currently playing.”

The user test’s main findings included:

  • Many users confused the (future) mood playlists with the actual feature because they were placed too prominently. Even though they sat exactly where playlist suggestions already appear (so effectively only the titles would be new), people tried to access those instead.
    Iteration: I left the original playlist menu as it was and moved the mood playlists lower in the hierarchy, into a horizontal scroll menu, as is the current standard.
  • It turned out, however, that within the song detail page, users were almost equally likely to look for the mood feature in the song controls at the bottom of the page and in the extended options in the top right corner.
    This confirmed that it would make sense to place the feature in several locations.
  • When users accessed the actual mood filter function, a completely newly designed page, they showed a lot of curiosity and spent some time on the page trying out the various options. However, this disregarded the page’s intended hierarchy of 1) first selecting a mood quality from the X-Y axes (energy or brightness) and then 2) refining it with further attributes and/or the intensity slider.
    Iteration: The hierarchy would be marked visually more clearly, with precise instructions and by adjusting the opacity of the inactive options.
User flow: happy paths, unhappy paths, and maybe-not-so-happy paths

The user flows I defined for the final high-fidelity design would take the user by the hand at all times. Experienced users would have super quick access to the new mood function. For everyone else, there would be two additional ways to access the feature, to avoid frustration and to encourage coincidental discovery.

Time to put everything into shape! I prepared some reusable styles for the high-fidelity prototype and rebuilt all the important original screens. Because it was so much fun learning about Spotify’s UI in detail, I also decided to recreate a majority of the icons from scratch. By doing so, I learned even more about Spotify’s visual design and was able to create my very own icon: the mood icon.

Styles & Atomic Design

Styles and Atomic Design elements — all custom rebuilt by me
Hi-Fi Prototype Screens with the placement of the new Mood Filter Icon, 360×640 px, tested on a Samsung Galaxy S7

Curious about the result?

Below you’ll find the full experience. You can see that the mood filter works by first setting the qualitative factor of a mood’s brightness (bright vs. dark) or energy (energetic vs. calm). After setting the main mood, it’s optional to specify the mood further by either sliding the intensity bar (low to high, pre-set to neutral) or selecting suggested associative mood tags.
Depending on how you modify the main mood or the slider, the associative mood tag suggestions change. This opens up the option to specify moods further on an individual level and to use these tags for personal ratings and searches, while not changing the song’s basic allocation within the data scatter plot (which is necessary to link it to other songs by similarity).
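
And to illustrate what I mean by the tag suggestions following the main mood and the slider while the song’s position stays put, one more hypothetical sketch; the regions and tag lists are invented purely for illustration.

```python
# Illustrative only: tag suggestions keyed on coarse regions of the mood space.
# Changing the suggestions never moves the song's point in the scatter plot;
# tags are an extra layer for individual ratings and searches.
TAG_SUGGESTIONS = {
    ("bright", "high"): ["euphoric", "party", "uplifting"],
    ("bright", "low"):  ["easygoing", "sunny", "cozy"],
    ("dark", "high"):   ["intense", "brooding", "dramatic"],
    ("dark", "low"):    ["mellow", "dreamy", "late night"],
}


def suggest_tags(brightness: float, intensity: float) -> list[str]:
    """Return tag suggestions for the current main mood and slider position."""
    shade = "bright" if brightness >= 0.5 else "dark"
    level = "high" if intensity >= 0.5 else "low"
    return TAG_SUGGESTIONS[(shade, level)]
```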

Try out the prototype yourself!
Just follow this direct link to Figma.

👁 Key Takeaways

While creating this design, I got to know first-hand which obstacles I’m still to encounter in the real-life UX/UI world.
The challenges of adding a feature to an existing product don’t necessarily lie in the conceptualization of the feature itself (because, frankly, the options are endless), but in staying true to existing design standards, not disregarding hierarchies, and making the actual use of the new feature suitable to both user and business needs.
However complex this task was, I really enjoyed the insights I gained from it and hope to prove my analytical thinking in the field soon.

— All photographs used are from unsplash.com, added with their amazing Figma plugin. No ad, just a big thank you for saving me time!

Project specifics: independent work · 4 days · Nov’ 20

