
Sunday, July 21, 2019

Is iOS intuitive anymore?

Cross-posting from my LinkedIn

The iPhone OS was game-changing back in 2007, when Apple effectively created the modern smartphone. Built truly from the ground up, it was easy and intuitive to operate even for someone who wasn't very tech savvy. But that was more than a decade ago, and I am not sure the 'intuitive' title should hold any longer. Having been on Android since the start (minus a fling with an iPhone in 2014), I recently got the chance to use iOS at length because my wife moved from a Google Pixel to an iPhone. Early impressions have been disappointing, to say the least - across both software and hardware. But OS- and navigation-wise, here are three significant misses that have bugged me:

1. The back key is physically in the worst spot possible. In every single app on the iPhone, if you want to go back one screen, the back key is in the top left of the screen (see image). On large screens like the iPhone X or XS (not to mention the Max/Plus models), that means it's impossible for a right-handed user to reach the back key without significantly changing their grip. The idea of a dead spot has been known since at least 2014 (see the heat map image below), so I was shocked to note that iOS still keeps its main navigation button bang in the dead spot. Super annoying on a daily basis, compared to Android, where the back key is so user friendly at the bottom, right in the thumb's arc. I checked how my iPhone-using friends are managing: one of them, with an 8 Plus, has become a left-handed phone user without realizing it!


Thumb Zone Heat Maps - from https://www.scotthurff.com/posts/how-to-design-for-thumbs-in-the-era-of-huge-screens/

2. No clarity on how to reject a call on the lock screen. I literally had to Google 'how to reject a call on the iPhone' because I was looking to reject a spam call and all I could see on the screen was a 'slide to accept' message (image below)! No reject button anywhere. Turns out, you apparently have to double-click the power key. Whaaat? So users are expected to pore over the user manual or turn to Google for one of the most basic functions of the phone? Yes, I get it, Apple: you want a slider on the lock screen so people don't accidentally decline calls while the phone is in a pocket. But there is a simple enough solution to that (which most Android phones use): slide left to reject, slide right to accept. Or swipe up/down in the case of the Pixel. We shouldn't need training from the NYT (here) to learn how to reject calls!


3. Annoying lock screen camera shortcut. Three weeks in, I still couldn't figure out how to reliably use the camera shortcut on the lock screen (image below). Nice round button, but how does it deploy? Is it a double tap? A long press? A swipe starting from that roundel, going up or down or perhaps sideways? Whatever I was doing, sometimes it would work and other times it wouldn't. An initial Google search told me there is a better (hidden) shortcut to the camera: ignore the round button entirely and swipe the whole lock screen to the left, which is what I have been doing. It's only for this article that I finally figured out how to make the round button work: it's a 'hard press' (3D Touch in Apple lingo), not a long press or a double tap. Sigh, another hidden navigation approach that requires me to leaf through the user manual to figure out. Meanwhile, the Pixel's camera shortcut is also hidden (double-click the power key), but at least it gives you a massive upside once you learn it: the shortcut works anywhere. Lock screen or not, home screen or not, inside any app on the phone...just double-click the power key and boom, your camera is on.
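For anyone curious what a 'hard press' actually means under the hood, here is a rough Swift sketch - my own illustration, not Apple's lock-screen code - of how an app could tell a firm 3D Touch press apart from an ordinary tap or long press, using the per-touch pressure values iOS reports. The HardPressView name and the 0.75 threshold are arbitrary choices for the example.

import UIKit

// Rough sketch only: a view that fires once a press crosses a pressure
// threshold, which is roughly what separates a 3D Touch 'hard press'
// from an ordinary tap or long press.
final class HardPressView: UIView {
    var onHardPress: (() -> Void)?   // called once per qualifying press
    private var hasFired = false

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        hasFired = false
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard !hasFired,
              traitCollection.forceTouchCapability == .available,
              let touch = touches.first else { return }
        // UITouch reports force relative to maximumPossibleForce;
        // 0.75 is an arbitrary cutoff chosen for illustration.
        if touch.force / touch.maximumPossibleForce >= 0.75 {
            hasFired = true
            onHardPress?()
        }
    }
}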



These complaints aren't a knee-jerk reaction to gesture control. Designed well, gestures can be magical for power users without getting in the way of the experience for average users (think Microsoft Office keyboard shortcuts). But here, Apple is messing up the everyday experience by forcing a mish-mash of four different approaches - on-screen instructions, the power and volume keys, 3D Touch, and gestures - into a single navigation language.

Make no mistake, these are all small irritations in the grand scheme of things. None of them is permanent - some Googling or reading through the manual will equip users with answers - but isn't the whole point of intuitive navigation that you shouldn't have to? Meanwhile, there remain many ways in which iOS is still a great OS (e.g. the buttery-smooth transitions, or the lovely thin white bar at the bottom that anchors in your brain as a home key replacement - I haven't missed the physical home key at all!). But the UI/UX disappointments are still many relative to where I expect iOS to be more than a decade after launch - especially in comparison to how polished Android has become on Google's flagship Pixel.

And Google has been taking UX to the next level on the Pixel: not just placing menus and options where you expect to find them, but using AI (or plain common sense) to assist you with common tasks. See the example of contextual SMS options below - without opening the Messages app, I can mark a text as read or copy the one-time passcode. If it's a personal text, Google auto-suggests responses, which are suitable some of the time, if not always. There are similar neat touches across the board that make you feel like you are truly interacting with a 'smart' phone.

And that is the crux of the matter: perhaps Apple - by focusing largely on aesthetics - will miss out on next-gen OS design powered by AI, just as Siri has become an also-ran in the war between Google Assistant and Amazon's Alexa.
