It seems that the decades-old debate within the design community of UI vs. UX is rearing up again, most recently with the metaphor of ketchup bottles. While I think this is a neat way to visualize the discussion, I’m convinced this is a growth marketing scheme by Heinz to sell more ketchup to designers. 😜
Remove the ‘vs.’ from the conversation. This is about parts of a whole
Before we dig into the specifics of what a UI (User Interface) is, we need to reel the conversation back to a higher level and talk about experience.
I’m talking about how humans experience things overall, not specifically UX (User eXperience). UX is the specific term given to the experience of one segment of humans: people who use digital products. Different industries, however, apply different terms to the same bits of psychology around people using things. And whatever you call them—users or customers—unless we’re talking about the UX of dog toys, they’re all humans with different needs and goals.
Human experience is about perception, which is created by how our mind interprets input from our senses.
Generally speaking, there are five human senses. Each of them contributes to a contextual equation that creates a person’s overall experience of something, and over time the senses add up to form a mental model of a particular person, place, thing, or experience:
- If you can see it, it’s part of the overall experience
- If you can hear it, it’s part of the overall experience
- If you can taste it, it’s part of the overall experience
- If you can smell it, it’s part of the overall experience
- If you can touch it, it’s part of the overall experience
Well, what’s an interface and what does it do?
An interface is the medium through which a human experiences something. In digital tech, that’s a UI (User Interface), more formally known as a GUI (Graphical User Interface). It’s also an interactive tool that helps someone shape the outcome of the experience by accomplishing goals.
Depending on the use case, that interface largely engages the sense of sight (and sometimes sound) when it comes to digital applications or websites. In other use cases (or industries), however, the UI engages different human senses.
Note: When we take disabilities and accessibility into consideration, the equation changes. We can see this in the digital world when we design for people who are blind. The experience changes from a largely visual one to an auditory one.
Examples using one of my favorite things: automobile design
Let’s talk about large chunks of metal and plastic that can move quickly. The automotive industry has been well established for over a century and vehicles are continuously evolving. Inside the cabin of a vehicle exists a UI that helps you accomplish your primary goal of going places, along with secondary goals like being comfortable.
Instead of User Experience (UX), we’ll talk about Driver Experience (DX). The words “User” and “Driver” just help set the context for the use-case at hand. The important part is the X (eXperience).
Above we can see the UI differences between a Tesla and a Toyota 4Runner. Generally, the UI in an automobile includes the steering wheel, dashboard indicators, shifter, and center console unit with the sound system, climate control, and more. Tesla has advanced this concept quite a bit with a more consolidated digital approach.
Which UI do you prefer? Well, that should depend on your goals. The Tesla UI is touchscreen-based and allows for great automated on-road driving, with control of nearly everything from one digital screen.
Conversely, the 4Runner still boasts large physical dials and the tried-and-true hierarchy of separate controls for the vehicle. This is especially geared towards off-roading and overlanding.
Each UI is tailored for a different DX. The UI of the Tesla utilizes sight as the primary method of interaction, while the Toyota utilizes sight and touch as methods of interaction on a more even scale.
Sure, you could argue the Tesla has a “touch screen,” but touch isn’t used to clearly differentiate the numerous mechanics of the vehicle—setting an Autopilot course is going to physically feel the same as changing the climate in the car. Whereas in the 4Runner, you could change the radio volume or the temperature with your eyes closed (but I wouldn’t recommend that).
Other vehicular parts that contribute to the experience
As I mentioned, these UI components are just part of the overall DX. With any vehicle, hearing is also part of the equation and works alongside the UI. Think about how your turn signal makes a clicking noise, or how your car lets you know when you’re not buckled up. Ding…ding…DING DING DING. Similarly, another driver’s horn is there to warn you when you make a mistake.
Your experience in layers—where secondary UIs come into play
Continuing with the vehicle metaphor, we can begin to talk about secondary UIs. In the use-case of driving, the secondary UI is the road. It’s the physical thing you travel on, littered with signs directing you how to act so your driving experience isn’t bad.
The primary UI (inside the vehicle) and the secondary UI of the road both add up to your overall experience. Here are a few examples of the equation:
- Great vehicle UI + good road = good experience
- Great vehicle UI + road falling apart = bad experience
Within the immediate situation the primary UI is mandatory, while the secondary UI is optional in most cases. You’re stuck with your car, but you can most likely choose a new road to travel down.
As you look at more factors that play into the overall DX you could come up with quite a few equations. Don’t even get me started on cruise control settings.
Bringing this perspective back to the digital world we work in
If we bring our perspective on driving experience and vehicles to the digital realm, there’s a parallel. Now that we’re back in digital design land, we’ll use UX again.
Let’s start with a use-case of digital banking. If you bank with any large bank there’s probably a digital app for it. In that app, there’s a UI that allows you to attempt to accomplish your financial goals—whatever they may be.
The secondary UI that contributes to your overall experience is your actual phone or tablet. This is interesting because while the app UI is strictly digital and experienced largely through sight (again, unless accessibility comes into play), the device itself engages sight, touch, and sound.
Let’s talk about how all these can contribute to the overall experience with a few examples:
- Great app UI + phone in good condition = good app experience
- Great app UI + a cracked screen = bad app experience (swap out your device for a better one and suddenly the secondary UI creates a better experience)
- Poor app UI + good phone = still a bad experience
You can start to see why there are layers of the UI that ultimately influence the overall experience. Each UI layer provides a level of interaction that can make or break the overall UX.
Another example to help illustrate how different UIs impact our senses and change the experience
Let’s talk about eating and food.
If we consider food as the UI that helps us achieve our goal of consuming calories for energy, we can expand that view quite a bit.
If you’re dining out at a restaurant, what makes a good experience?
- The primary UI is the food (do you eat it with your hands or utensils?)
- The secondary UI is the utensils, plates, bowls, etc. These can be optional in most cases, but the primary UI (food) is manipulated through the secondary UI (utensils), much like the app is manipulated through the phone.
The overall experience is much broader. Smell, lighting, the texture of the food and utensils, and even the other people in the restaurant or the music playing all contribute to the experience. In this case, the UI is a small part of the experience, but it’s the most necessary one.
Thanks for reading!