How would you remove the iPhone home button?

September 19, 2016, by Adam Babajee-Pycroft, Managing Director (UX), in UX Design

[Image: hand holding an iPhone]

2017 will mark the 10th anniversary of the iPhone. Most of the speculation on technology websites suggests that this will herald a major redesign involving the removal of the top and bottom bezels. According to the Wall Street Journal, Apple’s Chief Design Officer, Jony Ive, wants an iPhone “to appear like a single sheet of glass”. This would be simple if the iPhone home button were purely for unlocking the device, but the button currently gives users access to several features, including:

Before the phone is unlocked:

  • Waking the phone from sleep, allowing access to notifications and control of music/home automation features
  • Touch ID – To unlock the iPhone with the user’s fingerprint
  • Siri – Voice recognition
  • Apple Pay – Allowing the user to make contactless payments (authenticated by fingerprint)
  • Apple Wallet – Allowing the user access to location-specific credentials such as airline boarding passes.

After the phone is unlocked:

  • Single tap – Return to home screen
  • Double tap – Access multitasking

With the iPhone 7, the mechanical home button was replaced by a solid-state button that uses haptic feedback – small vibrations – to emulate the click of a real, moving button.

Removing the affordance entirely presents a difficult User Experience challenge. How would you provide quick access to unlock the phone, or to time-sensitive features like Apple Pay? We posed the question to Adi, Dave, Kenton and Adam.

Adi 

User Experience Consultant

I had a hard time coming up with ideas for this because, as an iPhone user myself, I absolutely hate the idea – it just seems really counter-intuitive!

I’d use Siri – the phone would be motion-activated, so Siri would know when to ‘listen’, and would use voice recognition so only the user could unlock it. You’d swipe left/right for Apple Pay, Wallet, etc., or again just use Siri and ask it to open whatever you need – the voice recognition would mean only you could access any sensitive information. First Direct is already using this type of technology – more specifically, voice biometrics – to make their customers’ transactions more secure, so it could work in this context.
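To make that a little more concrete, here is a rough sketch of how a voice-driven unlock flow might be wired together with Apple’s Speech framework (available since iOS 10). The `VoiceprintVerifier` protocol and the “unlock” wake phrase are our own assumptions – iOS has no public voice-biometrics API, and nothing like this has been announced by Apple.

```swift
import Speech
import AVFoundation

// Hypothetical stand-in: iOS has no public voice-biometrics API, so the
// speaker check is assumed to come from elsewhere (e.g. a server-side model
// of the kind First Direct uses for telephone banking).
protocol VoiceprintVerifier {
    func matchesEnrolledUser(_ audio: AVAudioPCMBuffer) -> Bool
}

final class VoiceUnlockController {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-GB"))
    private let audioEngine = AVAudioEngine()
    private let verifier: VoiceprintVerifier
    private var task: SFSpeechRecognitionTask?
    private var speakerVerified = false

    init(verifier: VoiceprintVerifier) {
        self.verifier = verifier
    }

    /// Called when the motion sensors report the phone has been picked up.
    /// (Speech and microphone permission requests are omitted for brevity.)
    func beginListening(onUnlock: @escaping () -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)

        // Stream microphone audio into both the recogniser and the (assumed) voiceprint check.
        inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
            request.append(buffer)
            if self?.verifier.matchesEnrolledUser(buffer) == true {
                self?.speakerVerified = true
            }
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Unlock only once the enrolled speaker has said the (assumed) wake phrase.
        task = recognizer?.recognitionTask(with: request) { [weak self] result, _ in
            guard let self = self,
                  let text = result?.bestTranscription.formattedString.lowercased(),
                  self.speakerVerified, text.contains("unlock") else { return }
            self.audioEngine.stop()
            onUnlock()
        }
    }
}
```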

Dave 

Technology Director & Partner

On my old Android phone, there was no home button. You tapped the bottom of the screen to bring up the Android buttons – back, home and menu. I could see Apple doing something similar, but I don’t think they would want to be seen to be copying Android – certainly not so obviously.

For unlocking, I think they’ll do something more original. They’ll remove fingerprint ID altogether. I did think they could use it on the rear of the phone, but I think it’s more likely to be handled by motion sensors: picking up and orienting the phone triggers facial recognition, and the phone unlocks that way. The fallback would be a swipe to unlock, with more than one finger required to prevent accidental unlocks and “problematic” messages. The rest of the functionality will be handled by gestures, I think, like OS X. I’m not sure they’d be the same as OS X, since a four-fingered pinch to get to the home screen is awkward if you’re using the phone with one hand, but I think gestures will be the way to go. For home, though, a force press on the screen seems the method of least effort, at least in my eyes.
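As a rough illustration of Dave’s idea – using standard UIKit gesture handling rather than anything Apple has announced – a two-finger swipe could stand in for “slide to unlock” and a hard press anywhere on screen for “home”. The `unlock()` and `goHome()` hooks are placeholders, since third-party code can’t actually lock or unlock the device.

```swift
import UIKit

/// Sketch only: a two-finger swipe stands in for "slide to unlock",
/// and a hard press anywhere on screen stands in for "home".
final class GestureShellView: UIView {

    var unlock: () -> Void = {}
    var goHome: () -> Void = {}
    private var homeTriggered = false

    override init(frame: CGRect) {
        super.init(frame: frame)
        // Two fingers required, so a brush in the pocket doesn't unlock the phone.
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleUnlockSwipe))
        swipe.direction = .up
        swipe.numberOfTouchesRequired = 2
        addGestureRecognizer(swipe)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handleUnlockSwipe() {
        unlock()
    }

    // "Force press for home": fire once a touch crosses ~80% of the hardware's
    // maximum force (3D Touch devices only), and only once per press.
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesMoved(touches, with: event)
        guard !homeTriggered,
              traitCollection.forceTouchCapability == .available,
              let touch = touches.first,
              touch.maximumPossibleForce > 0,
              touch.force / touch.maximumPossibleForce > 0.8 else { return }
        homeTriggered = true
        goHome()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        homeTriggered = false
    }
}
```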

Kenton 

User Experience Consultant

Simplest solution? Ditch the physical button for digital inputs at the bottom of the screen. The Apple Watch already uses tap-to-wake, illuminating the screen with a single touch. They could also try incorporating more multi-touch gestures – perhaps a three-finger tap returns you home, something like that; I believe current iPhones have yet to utilise more than two-finger inputs. The iPhone 7 already uses gesture-based shortcuts for much of its navigation – for instance, the notification centre is accessed with a downwards swipe from the top of the screen. However, the HTC One also uses a three-finger downwards swipe to disconnect the phone from a device, or an upwards one to choose where to share content such as music, photos and video. Moreover, a three-finger tap activates voice command mode.

I could see Apple expanding on their current offering of gestures to accommodate the loss of the home button.
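For what it’s worth, UIKit already supports gestures with more than two touches, so Kenton’s three-finger tap is easy to prototype. A minimal, purely hypothetical sketch – the `goHome` action is a stand-in, since apps can’t really send you to the home screen:

```swift
import UIKit

/// Sketch only: a three-finger tap standing in for "home".
final class ThreeFingerHomeViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(goHome))
        tap.numberOfTouchesRequired = 3   // UIKit supports up to five touches
        view.addGestureRecognizer(tap)
    }

    @objc private func goHome() {
        print("Three-finger tap: return to home screen")
    }
}
```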

Adam 

Managing Director & Head of UX

With the iPhone 8 marking the 10th anniversary of the iPhone, and over 1 billion units sold, many existing users will already have a strong mental model of how the home button works. In Andrew Hilton’s book “Establishing Context”, he argues that when we pick up objects which extend our abilities, we perceive them as an extension of ourselves. The author also explores the idea of embodied cognition: the theory that many features of human cognition are shaped by aspects of the body beyond the brain. Simply put, people can often act first and think later. For example, have you ever moved your underwear to a different drawer? How long did it take you to adapt? Now imagine you had visited that drawer 85 times per day for 10 years. How long would it take you to get used to the change?

Another popular theory involves extending the haptic feedback introduced with the iPhone 7 to the entire screen. However, this may prove challenging for users with motor difficulties, or those who simply have fat or clumsy fingers.

One possible approach is to keep the home button affordance in the same position, as an indented circle over the screen. The challenge then is determining whether the user intends to press the home button or the UI element under it – especially difficult on responsive websites rather than in native apps. This issue is compounded by the current location of the home button falling within the area that is most comfortable for users to reach.
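To illustrate that disambiguation problem, here is a hypothetical UIKit sketch of a floating home “circle” that only claims touches landing well inside it and lets everything else fall through to the content underneath. The distance-threshold rule is our own assumption, not Apple’s.

```swift
import UIKit

/// Sketch only: an overlay that claims touches near the virtual home circle
/// and passes everything else through to the content beneath it.
final class VirtualHomeButtonOverlay: UIView {

    /// Centre of the on-screen home affordance, in this view's coordinates.
    var homeCentre: CGPoint = .zero
    /// Touches within this radius are treated as home-button presses.
    var captureRadius: CGFloat = 22

    var onHomePress: () -> Void = {}

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Claiming only touches near the circle means everything else falls
        // through to the app or responsive website rendered beneath the overlay.
        return hypot(point.x - homeCentre.x, point.y - homeCentre.y) <= captureRadius
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)
        onHomePress()
    }
}
```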

I believe that the best solution is to overhaul the “lock” button, currently located on the side of the device. Depending on the number of taps or the force used, the button could provide access to a number of features, including toggling lock/unlock, Apple Pay and Apple Wallet. Siri could be activated by speech (“Hey Siri”), as on current iPhones. If technically possible, the entire screen could act as a Touch ID sensor. Whilst less exciting than completely removing all physical buttons, this would be less disruptive to the user experience and could pave the way for future iterative changes. After a user has unlocked the phone, there are already swipe gestures available to access multitasking and the home screen, but these could also be achieved through the lock button. For what it’s worth, the swipeable “cards” on the iOS 10 lock screen may provide clues about how Apple might expose current features such as Apple Pay after removing the home button.
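Purely as an illustration of how tap count and force might map to those features – iOS exposes no public API for the side button, so the event stream and the specific mappings below are assumptions – the logic could be as simple as:

```swift
/// Sketch only: hypothetical mapping from side-button input to actions.
enum SideButtonAction {
    case toggleLock, applePay, appleWallet, home, multitasking
}

func action(forTapCount taps: Int, forcePress: Bool, isLocked: Bool) -> SideButtonAction {
    switch (taps, forcePress, isLocked) {
    case (1, false, true):  return .toggleLock     // a single tap wakes and unlocks
    case (1, false, false): return .home           // ...or returns home once unlocked
    case (2, false, _):     return .applePay       // double tap brings up Apple Pay
    case (3, false, _):     return .appleWallet    // triple tap opens the Wallet
    case (_, true, false):  return .multitasking   // a force press opens the app switcher
    default:                return .toggleLock
    }
}
```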


Do you have a design challenge?

Is your business trying to change customer behaviour? Is there an element of your website or services you want to improve? Why not get in touch for a free, initial consultation?

Adam Babajee-Pycroft

Managing Director (UX)

Our founder Adam has over 13 years of experience in UX. He’s fuelled almost exclusively by coffee (using one of his seven coffee-making devices), curry and heavy metal. Before founding Natural Interaction in 2010, Adam managed UX for AXA Life’s UK business. Since then, he’s worked with a range of clients across the automotive, eCommerce and tech startup sectors, delivering impressive results for brands including BMW, Mini, The Consortium and National Trust.

Get in touch

To find out more about how our UX services can help your business, contact Alex now.
