(Originally published here.)
This weekend we celebrated my grandfather’s 90th birthday. It was wonderful to sit down for a meal with extended family members I hadn’t seen in years — some of whom remembered me only as a young boy.
Looking around the room, a bittersweet realization crept over me: that may be the last time I would sit together with all of the great-aunts and uncles I’d known for so long — the ones whose grins I still remember from childhood.
As our lunch finished, we set a cake before my grandfather and lit one candle for each decade of his life. I took out my phone, opened the camera application and set it to video. As we sang happy birthday, I slowly panned across the room, capturing all the loving faces before focusing in on my grandfather. He took off his WWII veteran’s hat and concentrated as he blew out the candles.
A few years back, after my grandmother passed, I found a saved message from her in my voicemail. Now and again I would listen to it, taking comfort in her voice as she wished me luck at school. The video I’d taken of my grandfather would be an equally valuable keepsake.
Or so I thought.
When I looked back at my phone, I realized that the entire time I’d held it up, I hadn’t recorded a single thing! It would have been natural to blame myself, but I could see that part of the problem was the app’s design. And that only made it worse.
Last summer, usability experts Don Norman and Bruce Tognazzini strongly criticized Apple for its recent design practices, pointing out that several key usability principles were “largely or completely missing in iOS.” One of these missing principles was feedback: the cues that “allow a person to know what happened after an action was done”. Poor feedback, it turns out, is what undid my beautiful family recording.
One of the ten usability heuristics outlined by Jakob Nielsen is the match between the system and the real world. The idea is that people understand things more easily when they follow the same patterns or signals that appear in other contexts. In this case, Apple chose a red circle to signal that its camera is in video mode (but not filming). This invites confusion, because that same symbol often means that a recording is in progress.
In the iOS camera app, when a user starts recording video, the large red circle becomes a slightly smaller square.
This is another mismatch between the iOS recording symbols and the ones seen elsewhere. Normally a red light comes on to grab attention and notify users that a machine is recording. But the iOS camera does the opposite: its red symbol becomes slightly less visible when video is recording.
The larger problem here is that the “ready to record” and the “now recording” signals are easily confused when users aren’t focusing on them.
In a real situation, the user’s focus will bounce back and forth between the image on the screen and the real events happening in front of the camera. The controls will be out of focus, seen only peripherally.
Apple could save its customers a lot of headaches by changing the “ready to record” symbol to something that doesn’t usually mean “recording now”. It would be as easy as this:
This design would allow most users to see — even in their peripheral vision — whether a video was being recorded or not. The view would be something like this:
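To make the proposed fix concrete, the distinction can be thought of as a simple mapping from camera state to a visual cue. The sketch below is purely illustrative — the state names, symbols, and colors are my own assumptions, not Apple’s actual API — but it captures the core idea: the attention-grabbing red cue should appear only while a recording is actually in progress.

```typescript
// Illustrative sketch only: state names, symbols, and colors are
// hypothetical, not taken from iOS or any Apple API.

type CameraState = "readyToRecord" | "recording";

interface Indicator {
  symbol: string;
  color: string;
}

// Each state gets a visually distinct cue. A neutral outline means
// "nothing is being recorded yet"; solid red — the convention on most
// recording devices — means "recording in progress", so the two states
// remain distinguishable even in peripheral vision.
const INDICATOR: Record<CameraState, Indicator> = {
  readyToRecord: { symbol: "○", color: "white" },
  recording: { symbol: "●", color: "red" },
};

function indicatorFor(state: CameraState): Indicator {
  return INDICATOR[state];
}
```

Because the cues differ in both shape and color, a user glancing at — or merely peripherally aware of — the screen gets unambiguous feedback about whether the camera is filming.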
None of this will bring back the video I lost, but hopefully it can motivate people to pay more attention to context and usability when designing products.