Brief usability notes from a recent experience: coming back to iOS.
I left iOS a bit after iOS 7 was announced. I moved to Android and experienced the evolution of KitKat and Lollipop on my Nexus 5.
Out of curiosity, and now that the OS has greatly improved, I decided to get into iOS again and bought an iPhone 6. I've been using it as my main phone, and there are two areas where I find myself struggling a lot: navigation and keyboards. This post is about the latter.
I often jump between three keyboard languages: Spanish, English, and Japanese. Before today I hadn't really paid attention to the way I typed on Android, but it's clear to me that it was easier than typing on iOS.
The app I used for these comparisons is Evernote.
Right from the start we can see that Android's keyboard is cleaner. But if we take a moment to analyze what's going on here, we'll notice something more important: Android's keyboard has less visual weight and offers more features.
Android:

- Direct access to comma and period.
- Quick access to numbers (hold and swipe).
- No button metaphor.
- Accent color for the Shift and Return keys.
- Extra option to change language (keyboard icon).

iOS:

- Extra step to access comma, period, and numbers.
- Heavier visual weight.
- The Space bar feels a bit cramped.
- Direct access to speech recognition (and it's pretty good; it works great with Spanish and Japanese).
There's a big difference in the way the two OSs show the list of previously activated keyboards. Both are good solutions with different features, so there's no easy way to decide which did it best. It's more about which one works better for you.
Android:

- List inside a modal window.
- Languages appear in English (i.e., "Japanese").
- Extra line of information about each keyboard.
- Extra option to add/choose keyboards.

iOS:

- List in a contextual flyout, right where your finger is holding the globe icon.
- Languages appear in their own language (i.e., 日本語).
- Extra option to turn Predictive text on/off.
Android has a great feature for predictive text: you can access a big list of suggestions just by tapping the 3-dot icon below the central prediction.
I typed "Incr", and the first prediction is "Once". That's good, but I was going to write something quite different.
Without adding more characters, tapping the 3-dot icon shows me that the full list includes the one word I was going to type: "Incrementally".
On iOS, two things can happen: either it predicts the word I'm about to finish writing (my iPhone didn't suggest "Incrementally" until I had typed "Incrementa"), or I ignore the feature completely and just finish typing the word myself.
The second option, ignoring predictive text, might not seem like a big deal for iPhone users who only use one language/keyboard. But that's exactly the point: features like Android's 3-dot icon matter to users who appreciate small details that add big value to the way they work and write while jumping between keyboards.
We take keyboards for granted. When I came back to iOS I noticed my experience with the device was one of small struggles, but I couldn't pinpoint the reason. It wasn't the iOS version of Evernote, Twitter, WhatsApp, Instagram, etc. It was deeper than that: iOS' dated navigation/keyboard synergy.
On the other hand, iOS just works, is intuitive, and is most people's choice. So what's wrong?
I think the key is this: being open to using Android and going through its evolution from KitKat to Lollipop gave me perspective on what mobile experiences can be. That might sound vague, but now that I've come back to iOS I'm living it first hand. I use iOS and constantly find interactions that could be improved.
Questions like "Does this mean Android is better?" open trivial discussions that lead nowhere. So, regardless of OS, I'm more interested in questions like:
- How is the overall experience of my app affected by keyboard interactions?
- Which areas of my app rely heavily on user input? Can I change those interactions to make them simpler?
- Is my team addressing the product's internationalization correctly? What are we taking for granted?
- How should I approach usability testing for a market that is predominantly bilingual?
Much of this post might not click with people who are married to one OS, or who never need more than one language. But as I said in the "Typing Japanese" section: you don't necessarily need to speak different languages to make observations and take notes toward a wider understanding of sophisticated design solutions, which is how I would describe Android's Japanese keyboard, predictive text, and main layout.