
Why 3D Touch could be the most important addition to the iPhone in years

First came capacitive screens and multi-touch, two innovations that set the original iPhone apart from every other smartphone on the planet and which developers have used to create brilliant, beautiful and easy-to-use apps.

Eight years on, we have the next big step in iOS user interaction: 3D Touch.


Introduced, under the name Force Touch, on the 2015 line of MacBooks and then the Apple Watch, 3D Touch allows the screen – or, in the case of the MacBook, the trackpad – to detect the pressure with which a touch is applied. The ‘3D’ in the name comes from the idea that you push into the screen, adding a third dimension to the lateral swipes used to control multi-touch apps. Apple sensibly abandoned the ‘Force’ label, realising that implying force must be applied to operate a device is neither sensible nor good PR.

However, 3D Touch isn’t really about sensing pressure at all. As Apple’s software engineering boss Craig Federighi explained to Bloomberg: ‘It starts with the idea that, on a device this thin, you want to detect force. I mean, you think you want to detect force, but really what you’re trying to do is sense intent.’ Sense intent. That’s huge, and it gives us a clue to where Apple’s user interface designers are going with iOS, and by extension OS X. It’s not enough to respond to what a user does; the idea is to interpret what a user means, and respond to that. Federighi acknowledged the difficulty of doing that. ‘You’re trying to read minds. You have a user who might be using his thumb, his finger, might be emotional at the moment, might be walking, might be laying on the couch. These things don’t affect intent, but they do affect what a sensor [inside the phone] sees. So there are a huge number of technical hurdles,’ he said.

In short, while the Corning Gorilla Glass used for the iPhone 6s screen has been toughened to better withstand the extra pressure 3D Touch invites, what Apple really wants is to remove the glass altogether as a barrier between user and software. By combining 3D Touch with haptic feedback, such as a small vibration, iOS is moving beyond the screen, microphone and speaker as the primary means of interacting with apps.

How far can it go? Independent research by R. Kevin Nelson into the code in iOS 9, carried out after its launch, suggests there’s a great deal of potential. Currently iOS can detect three levels of pressure, and they are put to use in three ways: system-wide shortcuts, known as Quick Actions, and two in-app gestures, peek and pop. A peek requires a little pressure; a pop, slightly more.
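Quick Actions – the system-wide shortcuts – are exposed to developers through a small UIKit API. Below is a minimal sketch of how an app might register one dynamically at launch; the ‘New Message’ action and its identifier are purely illustrative, not anything Apple ships.

```swift
import UIKit

// Minimal sketch: registering a dynamic Home screen Quick Action.
// The "com.example.newMessage" identifier and title are illustrative only.
func registerQuickActions(for application: UIApplication) {
    let newMessage = UIApplicationShortcutItem(
        type: "com.example.newMessage",
        localizedTitle: "New Message",
        localizedSubtitle: nil,
        icon: UIApplicationShortcutIcon(type: .compose),
        userInfo: nil
    )
    application.shortcutItems = [newMessage]
}

// When the user presses the app icon and picks the action, the system calls
// the app delegate's application(_:performActionFor:completionHandler:).
```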

Currently, a pop does the same thing as a regular tap, but only after the user has already applied ‘peek’ pressure. That means that instead of peeking, releasing and then tapping, you can peek and simply press harder to pop.
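In UIKit, peek and pop map onto a pair of previewing delegate calls. The sketch below shows the general shape, with a hypothetical message list that previews a detail screen; both class names are assumptions made for the example.

```swift
import UIKit

// Sketch of the peek-and-pop previewing API. MessageListViewController and
// MessageDetailViewController are hypothetical names for this example.
class MessageListViewController: UIViewController, UIViewControllerPreviewingDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        // Opt this view into 3D Touch previews.
        _ = registerForPreviewing(with: self, sourceView: view)
    }

    // Light pressure ("peek"): return the view controller to preview.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        return MessageDetailViewController()
    }

    // Firmer pressure ("pop"): commit to showing the previewed screen fully.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}

// Hypothetical detail screen used by the preview above.
class MessageDetailViewController: UIViewController {}
```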

However, Nelson’s exploration of iOS 9’s code suggests that another three levels of pressure could be detected. More than that, said Nelson, ‘becomes a burden more than it helps.’ He’s probably right; even six pressure levels feel as though they would be difficult for most of us to apply reliably right now.

That may be why Apple has started with three, and uses the deepest level (pop) only once the first (peek) has already been applied. As users, we need to get used to the idea, and the mechanics, of applying force to signify intent on a user interface. But it’s easy to see how those six levels could be used in games, for example, to run or drive faster.
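As a rough illustration of how extra levels might work, here is a sketch that buckets the continuous force reading UIKit already exposes into a handful of discrete ‘throttle’ steps, the way a driving game might. The view class, the number of levels and the callback are all assumptions, not part of any Apple API.

```swift
import UIKit

// Sketch: quantising the continuous force reading into discrete levels,
// as a driving game might for throttle control. ThrottleView is hypothetical.
class ThrottleView: UIView {
    var onThrottleChange: ((Int) -> Void)?
    private let levels = 3

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.maximumPossibleForce > 0 else { return }
        // Normalise force to 0...1, then bucket it into 1...levels.
        let normalised = touch.force / touch.maximumPossibleForce
        let level = min(levels, Int(normalised * CGFloat(levels)) + 1)
        onThrottleChange?(level)
    }
}
```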

There are other reasons for limiting 3D Touch for now. It’s only available on the iPhone 6s and 6s Plus at the moment, so every feature that uses it must also offer an alternative means of control for those with older iPhones, as well as iPads. That applies both to iOS 9 itself and to apps.
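In practice that means checking for the hardware before relying on it. Below is a minimal sketch of the usual fallback pattern, assuming a hypothetical list screen that falls back to a long press where 3D Touch isn’t available.

```swift
import UIKit

// Sketch of the fallback pattern: use 3D Touch previews where the hardware
// supports them, and a long-press gesture everywhere else. The class and
// selector names are hypothetical.
class ContactListViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        if traitCollection.forceTouchCapability == .available {
            // Register for peek and pop here, as in the earlier sketch.
        } else {
            let longPress = UILongPressGestureRecognizer(target: self,
                                                         action: #selector(showPreview(_:)))
            view.addGestureRecognizer(longPress)
        }
    }

    // Hypothetical alternative path for older iPhones and for iPads.
    @objc func showPreview(_ recognizer: UILongPressGestureRecognizer) {
        // Present the same preview content without 3D Touch.
    }
}
```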

It’ll take a couple of years at least before enough of the iOS user base has a 3D Touch-capable device for developers to start implementing features that rely on 3D Touch alone as a means of interaction. At that point, we’ll see much wider use of it in apps and games. Imagine, for example, applying a photo filter with 3D Touch and using pressure to dictate the strength of the effect.
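To make that concrete, here is one way such a pressure-driven filter might be wired up, using Core Image’s sepia filter as a stand-in. The view class, and the choice to re-render on every touch movement, are assumptions made for the sake of a short example rather than how any shipping app necessarily does it.

```swift
import UIKit
import CoreImage

// Sketch: driving a Core Image filter's strength from touch pressure.
// FilterPreviewView is hypothetical; CISepiaTone and its inputIntensity
// parameter are real, but the wiring here is just one possible design.
class FilterPreviewView: UIImageView {
    var originalImage: CIImage?
    private let context = CIContext()

    override init(frame: CGRect) {
        super.init(frame: frame)
        isUserInteractionEnabled = true  // UIImageView ignores touches by default
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        isUserInteractionEnabled = true
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first,
              touch.maximumPossibleForce > 0,
              let input = originalImage else { return }

        // Harder press = stronger effect, normalised to 0.0 ... 1.0.
        let strength = touch.force / touch.maximumPossibleForce
        let filter = CIFilter(name: "CISepiaTone", parameters: [
            kCIInputImageKey: input,
            kCIInputIntensityKey: strength
        ])

        // Rendering on every move is fine for a sketch; a real app would throttle this.
        if let output = filter?.outputImage,
           let rendered = context.createCGImage(output, from: output.extent) {
            image = UIImage(cgImage: rendered)
        }
    }
}
```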

There’s already speculation that 3D Touch means the end for the iPhone’s Home button and that pressing firmly on the screen will take its place. However, there is one major problem with that – Touch ID. Without a Home button, the Touch ID reader would have to be relocated. Where would you put it? On the side? Hardly, given how thin the iPhone is and how much of your fingerprint Touch ID currently needs to make a successful identification. On the back? That would present a problem if, like many iPhone users, you keep your device in a case.

The best solution would be to build the sensor into the screen itself, and that’s just what a company called Sonavation is working on. In July it announced it could place biometric sensors under a Gorilla Glass display. Apple patents uncovered recently also point to this as a potential solution.

There’s no doubt that Apple would like to remove the Home button; it would provide significantly more screen space. And some of the OS features enabled by 3D Touch – calling Siri and multi-tasking, for example – replicate those of the Home button. That’s no coincidence.

It may not happen in 2016, but it seems clear that at some point a combination of 3D Touch and under-screen biometric sensors will mark the end of the Home button on both the iPhone and iPad.

Whether it’s enabling new ways of using apps or a radical redesign of the hardware, 3D Touch is huge, and it promises to permanently change the way we use not just the iPhone but the iPad and Mac too.