So the iPhone X has been on the market for a little while - what do people think? The tech industry's first impressions clearly focused on the top of the device: the ‘Notch’. It seemed bewildering that Apple chose to have part of the screen effectively stuck behind the camera. Why not have an ever so slightly shorter screen, like Samsung or Google? The second focus was the removal of the home button.
From a usability perspective, this change is far more drastic. Other companies have removed the physical home button and now use an on-screen button (I like Samsung’s implementation of this). Apple hasn’t gone down this road; for Apple, it is all about gestures.
When Steve Jobs unveiled the iPhone way back in 2007 he started by saying “Here are four smartphones”, waving his hand up to the floor-to-ceiling screen behind him. It displayed the Motorola Q, BlackBerry, Palm Treo, and Nokia E62, products whose bottom halves were filled with directional buttons and physical keyboards. “What’s wrong with their user interfaces?” he asked. He then made it clear that the forty or so buttons on the bottom halves of the phones were the problem: the keyboards were fixed and took up half of each device.
When he revealed the iPhone there were audible gasps in the auditorium. The device looked like a sheet of glass with one obvious button at the bottom. Over the years this button has become synonymous with simplicity. Generally, no matter where you are in the phone’s OS, you can get back by pressing the home button. As time moved on, you could also press it twice to multi-task or hold it down for Siri. Despite these additions, the home button remained part of the design of the iPhone. Everyone knew how to use it, whether they were an Android or iOS user.
With the total removal of the home button, Apple has turned to swipe gestures. A thin white bar sits at the bottom of the screen in place of the home button. Instead of pressing a button, you swipe for the same functionality.
The gestures available are quite clever. You swipe up to unlock the phone or return to the home screen, and you swipe up and hold for the app switcher. There is a lot more thumb/finger movement required for the simpler tasks. On the plus side, you can now swipe left or right on the bottom bar to flick between apps (which is brilliant). Because of these new gestures, the flick up to close an app is replaced by an X in the corner of its tile (how very Mac OS).
A big problem with using gestures is that they are not necessarily as intuitive as the developer thinks they are. Nicole Nguyen, writing for BuzzFeed, had trouble with the new gestures right out of the box:
“I opened the box, peeled off the screen sticker slowly, because I'm dramatic (and also it's the best part of this job), and powered up the phone for the first time. Then I accidentally pressed the flashlight shortcut on the lock screen and swiped up, which took me to the setup page. Tried to swipe down from the top right corner, the new gesture for Control Center. Nada. Tried swiping down from anywhere. Nada again.
So I set up the phone with the flashlight on. And that was my first five minutes with the iPhone X.”
Another issue is when existing gestures change. You develop a sort of “muscle memory” for a particular gesture, so it can be jarring to have to relearn what to do. On the iPhone X, the control centre (home of the flashlight, among other shortcuts) is accessed by swiping down from the top right corner rather than up from the bottom.
We knew some of these changes were coming, but now that they’re here, some are rather finicky. One example is reachability, which has been around for a while: it pulls the screen down so that the top is easier to reach one-handed on a larger phone. On an iPhone Plus model, you lightly double-tap the home button. On an iPhone X, you invoke reachability by swiping downward on the gesture bar at the bottom of the screen. This sounds simple but is actually quite tricky. For some, the gesture caused confusion: one reviewer tried swiping upward from the bottom of the screen and then swiping the app window down to activate reachability. It didn’t work.
These issues were faced by technology bloggers and reviewers - people paid to review lots of different phones and devices all year round. If they have difficulty, some users won’t be able to find certain gestures at all; in some cases, they may simply never know they exist. Some people who use iPads to access the Internet are completely unaware of gesture controls.
It seems like Apple is trying very hard to keep the iPhone platform as unique as possible, without thinking about simplicity and the user experience. Why not just add a virtual home button that disappears when you don’t need it? That would be recognisable and user friendly. Did Apple decide not to “because Android did it first”?
Alexa and Google Assistant are becoming very popular in the home and office. Voice interaction with devices like the Amazon Echo improves on an almost weekly basis. With fewer physical buttons on our smart devices, we are heading towards a world of technology controlled by gestures and voice commands. This brings other benefits, too: a device with no buttons, wireless charging, and behind-the-glass speakers should be more resistant to water and dust. That will always be something customers want on a device costing £1000.
There will always be people who prefer the feel of physical buttons. The recently released Nokia 3310 showed the desire for "Retro tech". We are entering a new world of usability and interaction with our devices. I am excited to see where it could lead and curious as to what comes next. Let’s hope the user is not forgotten in the new crazy world of swipe, swipe, swipe.
David recently joined our team after spending 10 years in operations roles with a variety of blue chip companies. He's already successfully participated in user research and prototyping projects for Solverboard and The Picture House.