One Blind Guy’s Experience with Android. How accessible is it really?

Revision History

17 June 2016

  • Expanded on the use of OK Google
  • Refined discussion of Focus Speech Audio

 

Introduction

If you read this blog on a regular basis, you’ll already know that I enjoy playing with new technology. Since one of my roles here at Mosen Consulting is to train people in the use of technology, I can even justify playing with new gadgets and calling it work.

Three years ago, I bought a Nexus 7 tablet. I became broadly familiar with the Android operating system in conjunction with TalkBack, the platform’s official screen reader. I then left it neglected, because I didn’t consider it an environment that was either pleasurable to use or that could meet my productivity requirements.

A lot of the apps that shipped on the device were inaccessible, and I had a lot of trouble getting gestures to register. I got tired pretty quickly of performing a double-thump, just to perform an action.

Several factors led me to pick up a new device and take a fresh look at the Android experience.

First, three years is a long time in technology. The operating system has matured and TalkBack developers have responded to feedback.

Second, using some of Google’s apps on my iDevices has given me huge respect for the user experience they are creating. Google Now, Voice Search, YouTube and Maps are great tools. It made me curious to learn more about what the native Android experience might be like, unconstrained by Apple’s sandbox approach.

Third, while it appears that there may be a way with the iPhone 7 series to charge the phone while using wired headphones, the headphone jack controversy got me thinking about the compromises I have made in exchange for the very good accessibility experience iOS offers. The many books, podcasts, email messages and blog posts I’ve put together since I bought my first iDevice are a testimony to the fact that iOS and VoiceOver have changed my life beyond measure. Yet there are still things I can’t do with my iDevices that I was doing with my old Nokia phones running Symbian all the way back in 2009.

Apple’s sandbox approach is designed to keep our phones safe and protect our privacy. All in all, it succeeds very well with this objective. But experienced users pay the price in terms of flexibility. Even though I consider myself a proficient iTunes user, and except for when Apple breaks accessibility I can do what I need to do with it, I still resent that I can’t cable my own phone to my own computer, see it as a drive, copy music to a music folder, and play that music in any number of apps. If needed, I want to be able to delete music from my phone in the same way.

I want to do simple things like assign a piece of music as a ring tone, without having to jump through a bunch of hoops to give it a certain file extension and import it in just the right way.

I miss the ability to run certain kinds of apps like call recorders, in those situations where being able to record a quick podcast interview when I’m mobile would be useful.

In short, there are a bunch of ways in which Apple limits my ability to do what I want with the device I paid for. I’m sure they’d say that it’s for my own good, but as an experienced computer user, I think I’m the best judge of what’s good for me.

Admittedly, through the use of various application programming interfaces, they’ve backed off a little bit over the last few years as Android has become the dominant player. But the operating system itself is fundamentally locked down unless you jailbreak. Jailbreaking is harder to do as Apple closes the exploits that make it possible, and it also eliminates the possibility of participating in Apple’s beta programme.

So philosophically, as someone who likes to tinker and customise, my heart is with the ethos of Android. But as the old saying goes, philosophy bakes no bread. As blind people, we’re constantly battling ignorance and discrimination. We owe it to ourselves to be as productive and efficient as we can be with our technology. So I’ll go with the option that allows me to get as much work done when I’m away from my computer as possible.

With that background in mind, this is my account of what it was like to spend some quality time with a current Android device. This blog post is no more than one blind guy’s experiences, and it’s written from the perspective of an end user, designed to be digestible and helpful to end users. Any review like this is going to be subjective. It’s influenced by the things I do most with my phone. At least some of them will be different from the things you do with yours. The post has a lot to say about TalkBack as a screen reader, but accessibility is a means to an end. The end is being able to use the device and applications you choose. So I’ll also spend some time talking about the user experience I’ve had getting some tasks done.

This post has been reviewed by expert Android users, because while it contains my personal opinions, I want as much as possible to avoid errors of fact. There may be some though, so I encourage you to read any comments left in response.

Even though I’ve tried not to let it colour my review too much, my perspective is of course shaped by the fact that I’ve been using iDevices for years.

While some blind people, annoyed with those who point out Android’s shortcomings, would claim that it isn’t fair or appropriate to compare TalkBack and VoiceOver, I couldn’t disagree more. When a sighted person looks at purchasing a smartphone, they’ll compare the two platforms and the way things are done on each. I agree that TalkBack doesn’t have to do things in the way that VoiceOver does, but the two screen readers should be compared on how efficiently a blind person can get the job done and the number of apps that work well with each. So I’m not going to shy away from drawing comparisons in this post.

Choosing a Device

Choosing an Android device can be both liberating and confusing. Not only do you have to weigh up how much storage you might need and the size of the screen, as you do when purchasing an iPhone; Android also gives you a vast array of devices from many manufacturers to choose from.

In the end, I decided to purchase a Huawei Nexus 6P. It’s a phablet, about the same size as my iPhone 6s Plus. I bought a Nexus device because they are produced by original equipment manufacturers to Google’s specifications. They run Android as Google intended it to be, without added apps or modifications.

The price that many Android users have to pay for a wide range of devices and user experiences is fragmentation. Operating system updates can either take months to appear on some devices, or they may never appear at all. Five months after the release of Android 6.0 Marshmallow, it was only running on 2.3% of devices.

When you buy a Nexus device, you can be assured that you’ll be one of the first to receive updates to the operating system. As someone who likes to be on the cutting edge and try new things, that appealed to me, particularly given that accessibility seems to be improving steadily with every release.

Indeed, I briefly upgraded my Nexus 6P to the Android N preview, until I realised that beta testing an operating system and screen reader with which I was not intimately familiar was a bit too much to take on at once. Upgrading and downgrading was a snap. Simply enrol in the beta, and the update is pushed to your device. Unenroll, and you’re given a software downgrade and have to start over. But let’s go back to the beginning.

Set-up

Once I got the Nexus 6P home from the store, I was unable to complete the initial set-up process without sighted assistance. I’ve set up a number of Android-based devices without issue in the past, the most recent of which was the Kindle Fire I bought a couple of months ago, so I’m very familiar with the process. You should be able to power up the unit and, when at the initial set-up screen, hold two fingers down on the screen. Despite having sighted assistance on hand to confirm that the initial set-up screen was indeed being displayed, the gesture to start an accessible set-up did absolutely nothing. When I unboxed the phone, the only thing I could do was get sighted assistance to complete the set-up and start TalkBack from the device’s Accessibility Settings. Not the best of starts.

I have since learned from some other Nexus 6P users that they have had a similar issue, and that it may have been corrected in newer versions of the software. That does indeed seem to be the case. After applying software updates, I erased my phone and started over. This time the accessibility gesture worked. After I held two fingers down on the screen for a couple of seconds, a highly intelligible text-to-speech engine prompted me to keep them held down until I heard a beep. TalkBack then allowed me to complete the rest of the phone’s set-up independently.

If you are able to complete the setup of your Android device independently using TalkBack, be prepared to have a set of headphones on hand. TalkBack with the default Google keyboard will not speak passwords if you’re using the phone’s built-in speaker. I appreciate the intentions behind this feature: a blind person may not know who is around them when they’re typing in sensitive password data. But the restriction seems a little arbitrary, in a way that doesn’t sit well with Android’s philosophy of greater user flexibility and responsibility. Forcing the user to have a pair of wired headphones on hand, in an era where some may not own any, could cause real issues getting the device configured. It seems to me a rather pointless measure anyway, since the device is not smart enough to know what’s at the end of that 3.5 mm cable connected to the headphone jack. Sure, most people will connect headphones, but they could just as easily plug in a big wired speaker and blast the password to the neighbours. In the end, we as blind people know when we’re in an appropriate environment to set up a device.

Another work-around, if you don’t have headphones handy, is to skip the Wi-Fi setup, enable the speaking of passwords in accessibility settings, then connect to a wireless network. I recommend enabling this setting anyway, because when a password field is populated and the speaking of passwords has not been enabled, TalkBack doesn’t tell you the number of characters that the field contains.

Yet another option is to install one of the many accessible third-party keyboards that don’t feature this password restriction, but of course you have to set up the device before you can do that.

TalkBack guides a new user through a familiarisation process by way of a tutorial, in which you’re introduced to the gestures and given a chance to practise them. A form of this tutorial has been available for some years, but its structure and language have become increasingly comprehensive and friendly. It’s a nice touch. For me, its benefit at start-up was somewhat lessened by the fact that an older version of TalkBack shipped with my device, and the gesture set of the update has changed in some key respects. This is difficult to avoid if a product is evolving, and it’s better to have this issue than be stuck with a product that is stagnant. I was impressed that after eventually applying the update from the Google Play Store, I received a notification telling me that gestures had changed. And you can go back and run the tutorial at any time should you need to practise.

Many other core Android apps, such as Contacts, Gmail and even the default keyboard are individual apps in themselves and can receive updates through the Play Store. This is a different approach from the one Apple takes, where many of its core apps are built into the OS. It’s a great strategy, particularly in an environment like Android where there’s so much OS fragmentation. Even if your device manufacturer takes an age to update you to a newer version of Android, at least you can grab the latest screen reading technology and new core apps that are available. It’s worth keeping in mind though that some accessibility improvements may be dependent on changes at the operating system level.

Part of the setup process involved registering my fingerprint with the sensor located on the back of the phone. This was totally accessible, giving me feedback all the way through and clearly showing me the various security options available.

Initial Impressions

As I immersed myself in Android, I took some notes about things that stood out for me. Some of these issues have accessibility ramifications, while others are the observations of an experienced iOS user having a play, and are not blindness-specific.

Enormous Improvement with On-board apps

There was a time, not so long ago, when you couldn’t enjoy a truly accessible experience on Android until you replaced or enhanced many of the default applications and features with more accessible alternatives. That’s fun for the geeks among us, but for those who just want to get on with using their new phone and may not be too tech-savvy, it was a steep hill to climb. Compared with the last time I took a serious look at Android, it’s like night and day.

I haven’t played with every feature of every app, but I’ve opened most of the stock apps and explored a little.

Google Maps works well, although I miss the ability to explore the screen with my finger and get a feel for the layout of streets and intersections as I can in iOS. Nevertheless, it’s a snap to get directions and other information. Transit information isn’t available through Apple Maps in New Zealand, so the accurate information I get from an app so intrinsic to the operating system is a welcome change for me.

I found it easy to navigate the calendar and add appointments.

The Gmail app is useable, but in my view not terribly efficient. I appreciate though that efficiency can be subjective, and some people may consider it adequate. I was given the tip to install and use a third-party app called Aqua Mail. Since I have so many email accounts to manage, I had to pay for the premium version, but it’s one of the best mobile email clients I’ve used. It even supports IMAP push, which means I’m able to be even more responsive than with my iPhone. It’s a beautiful thing.

Google’s default keyboard is now accessible. Some people will appreciate the haptic feedback, which gives you the impression that you’re getting some traction from the virtual keyboard as you type. Typing is similar to touch typing on the iPhone, in that you slide your finger around the screen, lifting your finger when you find the character you wish to enter. If you prefer what iOS calls “standard typing”, where you must double-tap or split-tap a key, then you’ll need another keyboard. There are plenty of these in the Play Store. It seems to me that there are more accessible keyboards on Android than there are in iOS, possibly because Android has had third-party keyboards for much longer.

My Nexus 6P comes with Google’s Messenger app. It feels to me a lot like the iOS Messages app. It’s accessible and a pleasure to use.

I got up and running with Play Music from Google without any trouble, taking advantage of the free trial offered to all Nexus purchasers.

Surfing the web with the latest build of Chrome is now truly useable, and the granularity features in TalkBack make it easy to navigate by the element types you need on a web page. I found myself missing Safari’s Reader mode though, which helps a blind person get past the clutter and onto the important content.

In short, long gone are the days when you’ll get an Android device out of the box and throw up your hands in horror. I understand that Google has accessibility champions for all of their major product lines now, demonstrating a real recognition of their need to step up to the plate and ensure that accessibility is a part of their DNA.

Consistency

The Nexus 6P has no physical home button. I’ve not found this a problem at all, since the Home icon remains visible in the same place on the screen at all times. But if you have issues with locating the Home icon, there is a TalkBack gesture, which I actually find more cumbersome than locating the button. More on that when I get to discussing TalkBack.

To the right of the Home icon is the Overview Button. This is a little like the App Switcher in iOS, and shows you apps and other items you’ve used recently. From here, you can check how much RAM and energy an app is using and how much storage it consumes. You can close the app, and uninstall it if you like. If you know your way around a computer, you’ll appreciate all of this information being so close to hand.

To the left of the Home icon is the Back button, a feature I like very much in Android. Some iOS apps have a back button, and some do not. Some implement a back button inconsistently, so that it’s available in only some parts of the app. In Android, the Back button is always available, without exception. It’s a function of the operating system that works rather like the Back button in a web browser. Pressing it repeatedly retraces your steps until you’re at the Home screen, which is the top level.

As someone trying to come up to speed, I found this consistency very helpful.
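For the technically curious, the Back button behaves much like a stack of screens: each screen you visit is pushed on top, and pressing Back pops one off until only the Home screen remains. Here’s a toy sketch of that idea in Python; this is my own illustration, not Android’s actual implementation.

```python
# Toy model of Android-style Back navigation as a stack.
# Purely illustrative; real Android activity stacks are more involved.

class BackStack:
    def __init__(self):
        self.screens = ["Home"]  # the Home screen is always the bottom of the stack

    def open(self, screen):
        """Visiting a new screen pushes it onto the stack."""
        self.screens.append(screen)

    def back(self):
        """Pressing Back retraces one step; at Home, there's nowhere left to go."""
        if len(self.screens) > 1:
            self.screens.pop()
        return self.screens[-1]

nav = BackStack()
nav.open("Gmail")
nav.open("Compose")
print(nav.back())  # → Gmail
print(nav.back())  # → Home
print(nav.back())  # → Home (pressing Back at the top level stays put)
```

That last line is the key point: unlike a browser, you can never press Back one time too many and end up somewhere unexpected.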

A Smart Home Screen

If you’re a user of newer versions of Windows, you’ll be familiar with the concept of active tiles, which may display news, weather, and other information that changes. Widgets, Android’s equivalent, are tiny applets that display useful information. Widgets exist in iOS too, but they’re tucked away in the Today view, leaving your Home Screen a static grid of app icons where the only thing that changes is the badge count.

Android allows you to jazz up your home screen with a combination of apps and widgets. If you like the weather visible right on page one of your Home Screen, it’s doable. If, like me, you work a lot with currencies other than your local one, you can put exchange rates right there. There are thousands of these things, so one needs to be a bit selective or you’ll get information overload. Widgets certainly make you feel like you have plenty of information literally right at your fingertips.

Your home screen can be even more useful thanks to the ability to add a contact right to it. Third-party apps make this possible on iOS with more effort, but in Android, it’s a feature that’s part of the OS. This could be particularly useful for less tech-savvy users who may need a small set of contacts they can reach in emergency situations.

Feels like Home

Product pricing is always a tricky business for companies serving many markets. Some of us do the numbers on the cost of these devices, and feel that when the current exchange rate is being taken into account, we’re not getting a fair deal. This has been a common complaint with the pricing of iPhone in New Zealand, but what makes it all the more irksome is that a number of flagship features aren’t available here. So we’re paying more for less.

Apple has not seen fit to make its News app available in New Zealand, but Google’s is and it seems to work very well. It takes advantage of the well-established web-based Google News service, which mines articles from the web and can be customised to your preferences. Being Google, it gets better over time at understanding what you like to read.

Similarly, Google’s weather information in New Zealand is more accurate, because they’ve taken the time to integrate with New Zealand’s Met Service, the official provider of weather information here.

As already mentioned, transit directions work here whereas they don’t in Apple Maps.

Google’s Voice Assistant knows much more about local things than does the present implementation of Siri, including rugby and cricket information, something I’m sure my readers in Australia, the UK and a number of other countries will appreciate.

Apple Pay isn’t available in New Zealand, and nor is Android Pay at the moment. However, the fact that the Near Field Communications chip on Android phones isn’t locked down to the hardware or OS manufacturer means that alternative payment solutions can be used.

In short, my Android device seems more aware and more capable of serving me in my location.

While I’m on the subject of NFC, the openness of the technology on Android lends itself to some awesome applications. We own a couple of UE Megaboom Bluetooth speakers, which are NFC-aware. All I had to do to get the speakers paired with this Android device was to touch the two devices together in the right place. Very impressive.

Our bus system here is also NFC-enabled. With a supported SIM, you’ve paid for your bus trip just by boarding the bus with your Android phone.

The Play Store

I’ve enjoyed using the Play Store, Google’s App Store equivalent, very much. There are two areas where it has really stood out for me.

First, while the Store experience is fully accessible on the device itself, it’s also fully accessible via any browser. It’s a pleasure to use Firefox with JAWS to explore the Store, search for specific apps, and then nominate a specific device to which I want to send the app. If the device is switched off, all of the requests will be queued for when the device is next on and connected to the Internet.

Yes, a similar function is available through iTunes, but I find a browser-based experience more speedy and pleasant.

The second thing I like about the Play Store is so beneficial to anyone with accessibility needs that it gets a “hey wow” award. If you buy an app and find it to be inaccessible, you can press a button within two hours of making the purchase and receive a full refund, no questions asked. This has been extended from an initial limit of 15 minutes. Two hours is ample to determine whether the app is fully accessible, completely unusable or somewhere in between, and if there are issues, whether you’d like to chance your luck on trying to get the developer to make some changes.

Text-To-Speech

I’ve been a very happy camper while conducting this evaluation, because I’ve had Eloquence on my phone. You can install a range of voices onto your device, and any app can hook into those voices. It’s elegant, and it works. You can also purchase the full range of Nuance iVocalizer voices.

The voices I purchased from Code Factory all offer a feature many of us have been asking for in iOS for a long time – a pronunciation dictionary. The pronunciation of unusual words can vary widely between text-to-speech engines, so there are advantages in having the ability to change pronunciation on an engine-by-engine basis. It would still be useful at times to have a pronunciation dictionary in the screen reader itself, when you want to make global changes.

Presently, it’s not possible to change pronunciation when using Android’s default text-to-speech.
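Conceptually, a pronunciation dictionary is just a substitution table applied to text before it reaches the synthesiser. Here’s a toy sketch in Python to show the idea; the respellings are my own made-up examples, and real engines support far more sophisticated per-language rules.

```python
import re

# Toy pronunciation dictionary: map words an engine mispronounces
# to respellings it says correctly. Illustrative only.
PRONUNCIATIONS = {
    "Mosen": "Moh sen",
    "Huawei": "Wah way",
}

def apply_dictionary(text, rules):
    """Replace whole words (case-insensitively) before sending text to TTS."""
    for word, respelling in rules.items():
        text = re.sub(rf"\b{re.escape(word)}\b", respelling, text,
                      flags=re.IGNORECASE)
    return text

print(apply_dictionary("Welcome to Mosen Consulting.", PRONUNCIATIONS))
# → Welcome to Moh sen Consulting.
```

Because the substitution happens per engine, each voice can have its own respellings, which is exactly why a second, global dictionary at the screen reader level would still be a useful complement.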

It’s a little thing, but I do appreciate that when I’ve set my language to a non-American version of English, the text-to-speech will use words like “full stop” instead of “period”.

Voice Commands in Google Now

When Google was established, it was all about search. So it’s not surprising that Google does a mind-blowing job of responding, often with remarkable precision, to specific questions. Ask Siri a question, and while it will sometimes give you a specific answer, it will more frequently tell you, “I’ve found something on the web, take a look”.

Google Now also integrates with third-party applications, vastly extending the feature’s capabilities. If I tell Google Now to take a note, I get prompted for the name of the app I want to use.

I can tell Google Now to play a specific clip on YouTube, or an artist in Google Play Music.

If I say “Send a WhatsApp Message to Bonnie”, that’s all it takes.

Just as with Siri, you can post on Twitter or Facebook by voice.

I found Google Now to be snappy in issuing its responses, and highly accurate with dictation. The latter is hard to quantify and it may just be wishful thinking on my part, but it’s backed up by some studies which suggest that Google has a higher accuracy rate.

I have, however, come away from this process with a new appreciation of Siri. First, I can issue the “Hey Siri” command, or hold down the Home button, from anywhere in my iPhone. The process of configuring “Hey Siri” to respond to my voice is completely accessible.

For some but not all users, Google Now can be launched system-wide, or even when the phone is locked if you choose to configure it that way. There are two problems.

First, the process is not easy for a blind person to set up in currently shipping versions of Android, although it is vastly improved in Android N, currently in beta. To get “OK Google” to work system-wide, you currently have to quit or suspend TalkBack before invoking the screen where the configuration choices are located, then start or resume it again. Compared to the simple, fully accessible “Hey Siri” process, it’s not a good experience at all and will put many people off configuring “OK Google” for global use.

Second, while I can use “Hey Siri” with my language set to New Zealand English, the OK Google feature is disabled for me altogether in Android because of my language choice. This is unusual, since I have found overall that Google generally supports New Zealand well. Global OK Google is a rare exception. 

Google Now’s ability to control system functions is lacklustre compared with Siri’s. With Google Now, I can toggle off Wi-Fi and Bluetooth, but not cellular data. I can add an appointment or reminder, but I can’t change or delete one. I can adjust the brightness, but can’t turn on or off do not disturb.

Most important of all, I can’t turn TalkBack on or off, despite Google clearly knowing what I want. If I say “OK Google, turn TalkBack on”, I get taken to accessibility settings. That’s not very helpful, since I’m blind and have no idea where to perform the two taps that would enable TalkBack for me.

Chromecast

I took another foray into the world of Google with the purchase of a Chromecast Audio device. It’s a low-cost dongle, configurable via an Android or iOS app, that connects to your Wi-Fi. It has a range of audio output jacks so you can connect it to wired speakers, a rack system, a Sonos CONNECT etc. Once set up, you can beam audio from your device to the Chromecast Audio.

There’s also a full audio and video version that plugs into an HDMI port.

In iOS, functionality is limited because specific apps have to support the technology. But in Android, Chromecast support is baked into the operating system. It’s a great experience, and it is superior to Apple’s AirPlay technology in one important respect. In iOS, unless you’re using an iTunes in the Cloud service, your iDevice is responsible for streaming the entire TV show, movie or podcast to the device receiving the AirPlay signal. Your enjoyment of the content can be affected by Wi-Fi glitches, crashes, or incoming calls. It can also be a battery drain.

Chromecast works differently. Once you start “casting” something, your mobile device is taken out of the mix and Chromecast takes over the streaming. You can make calls, play other content on your phone, even shut it down altogether if you want to, and the content just keeps on playing on the device you’re casting to.
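The difference between the two models is easy to picture: with AirPlay (outside iTunes in the Cloud), the phone must relay every chunk of media for the whole session, whereas with casting the phone simply hands the receiver a pointer to the content and steps aside. A toy sketch, which is my own gross simplification of both protocols:

```python
# Toy contrast of the two streaming models. Purely illustrative;
# real AirPlay and Google Cast are far more involved than this.

def airplay_session(media_chunks):
    """The phone relays every chunk itself for the whole session."""
    receiver = []
    for chunk in media_chunks:  # phone stays busy (and awake) throughout
        receiver.append(chunk)
    return receiver

def cast_session(media_url, receiver_fetch):
    """The phone hands over a URL; the receiver fetches the stream itself."""
    return receiver_fetch(media_url)  # the phone can now sleep, take calls, etc.

movie = ["chunk 1", "chunk 2", "chunk 3"]
airplay_session(movie)                 # the phone did all the work
cast_session("http://example.com/movie",
             lambda url: list(movie))  # the receiver did the work
```

Either way the receiver ends up with the same content; the difference is which device carries the load, which is why casting is kinder to the phone’s battery.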

Chromecast is also fully multi-room aware.

I came to have a real appreciation for how well the technology functions on Android. It just works.

Third-party apps

I’ve briefly tested over 200 apps in the time I’ve been working with my Nexus 6P. Some of them were recommended by experienced Android users, and some were Android versions of apps I have and like on my iPhone.

Almost all of the blindness apps I tried operated similarly to their iOS counterparts. I had excellent results with KNFB Reader, finding it to be accurate and fast. I appreciated being able to take a few pics without having to pay for the app again, having plonked down the cash for the iOS version already.

TapTapSee also didn’t disappoint.

I worked with a couple of currency identification apps specific to Android that performed their function well.

Sadly, BlindSquare isn’t available for Android and I missed having it around.

Voice Dream Reader exists on Android, although it is doing such clever things with gestures for VoiceOver users in iOS that I noticed how much less efficient it was to use on Android. That’s not at all a criticism of the developer, but more a reflection of the constraints under which Android developers work. I suppose if the app were completely self-voicing and took TalkBack out of the loop altogether, a similar experience might be possible.

Dice World, whose developers have shown outstanding commitment to accessibility, works very well on Android, although there isn’t as much customisation of what the app speaks as there is in iOS. I have heard from some Samsung Galaxy users that some Samsung Games features need to be disabled to get Dice World to work correctly.

In terms of general apps, a very small number worked better for me on Android than on iOS. The one that stood out was the Healthmate app for my Withings Smart Body Analyser which I’ve blogged about previously. There were far fewer unlabelled buttons than on iOS, and the app was much more pleasurable to use.

Some apps were similar in accessibility between the operating systems. Apps like The Guardian, NPR News, and BBC News were very good.

Android seems well-served by the major social media and messenger apps. It seemed to me that I did a little more swiping around in Facebook for Android than I do in iOS, but it was accessible and I was able to use it without issue.

The official Twitter app works well, although just like its Mac equivalent, I found no way of stopping it from verbalising both the full name and the Twitter username in every tweet. I would prefer to hear just the full name.

I installed the Android version of my old friend Tweetings. By default, the user experience for a TalkBack user isn’t optimal, requiring multiple flicks for every tweet. But I got some great advice on how to reconfigure the app, after which it was a more than satisfactory experience.

I had no difficulty getting up and running with Skype, making and receiving calls easily with a layout that is similar to the iOS app.

I found the majority of apps less pleasant to use on Android than on iOS. Some of these issues relate to the user interface of TalkBack and what I perceive to be its deficiencies: no actions rotor and less intuitive gestures make for more long presses and scrolling around. But others were far less subjective. Unlabelled buttons were more common. In my banking app, which is 100% accessible on iOS and a joy to use, I couldn’t even log in once I’d set it up, because the keypad for entering the security code was inaccessible. Eventually, I worked out that I could get around that problem if I had a Bluetooth keyboard handy. Once I’d logged in, I could pay bills to external sources, but not transfer money between accounts.

The DSAudio app for my Synology NAS was useable, but not terribly efficient. I have a very large music collection, and in iOS, the table index means that I can scroll to artists beginning with a specific letter of the alphabet with precision. There was no such table index in the Android app; perhaps this is a kind of control that just isn’t present in Android. TalkBack has some excellent scrolling capabilities which I was able to use, but they’re not as precise as choosing a specific letter of the alphabet, and getting where I wanted took much longer.

An app I use a lot in iOS, the Sonos controller, was useable I think only because I knew what I was looking for thanks to the fully accessible iOS app. The Android version is verbose and erroneously identifies the kinds of controls in use.

What was impressive though is that when using Google Play Music, I was able to send music directly to any of my Sonos devices, rather like the way AirPlay works in iOS.

In fairness to Android, there’s probably a chicken and egg element about the high number of inaccessible apps in at least some of these cases. In many English-speaking countries at least, I don’t think there’s any doubt that the majority of blind smartphone users are using iOS. App developers are therefore likely to receive more requests from iOS users to address accessibility deficiencies in their apps. So I do intend to take up some of these accessibility issues, especially with my bank, to see how easily they might be addressed.

It won’t solve all the problems, but if the inaccessibility of an app relates to little or no text labels on buttons, you can label them either through experimentation to see what the buttons do, or with sighted assistance.
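On the developer side, an unlabelled button is often nothing more than a missing accessibility label. As a hedged illustration, with an invented button name and icon rather than anything from a real app, the Android fix can be as small as one attribute in the layout XML:

```xml
<!-- Hypothetical layout fragment. Without android:contentDescription,
     TalkBack announces an icon-only button simply as an unlabelled button. -->
<ImageButton
    android:id="@+id/transferButton"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_transfer"
    android:contentDescription="Transfer between accounts" />
```

In production the label would normally come from a string resource so it can be localised, but the size of the fix is tiny compared with the difference it makes to a TalkBack user.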

Some apps I use regularly aren’t available for Android at all, but usually, I was able to find accessible substitutes. For example, in the absence of Downcast or Overcast, I used Pocket Casts as my podcast client. It’s available on both iOS and Android, which meant that I was able to keep my podcasts in sync.

Using TalkBack

A screen reader is at the very heart of a blind person’s user experience of a computer or smartphone. So how capable is TalkBack and how does it measure up?

Adjusting the Volume

After getting TalkBack started, one thing that flummoxed me right away was that the volume controls on the phone were not controlling TalkBack volume. The tutorial, which was a bit of a strain to hear because of the low volume, made no mention of the fact that to adjust it, I needed to place a finger anywhere on the screen and then use the volume controls on the side of the device. This is a serious omission, and it would have been very helpful had the tutorial covered it. Having to place a finger on the screen to adjust TalkBack’s volume took a lot of getting used to. I seldom want to adjust the ringer volume, which is the default behaviour for these controls, but for various reasons, I want to adjust screen reader volume regularly. I often carry my phone around in my pocket, cabled directly to my hearing aids, so being able to adjust the volume one-handed is important to me.

A couple of days in, I stumbled upon a little free utility, Rocker Locker, which loads at start-up and locks the volume controls into adjusting media volume. This suited me fine until I got more ambitious with the device. You can use the volume controls on your device to move the cursor around an edit field, so you can review, modify and delete text. But a utility that locks the volume controls into media adjustment prevents this function from working. You can perform cursor movement via the menu system, but it’s nowhere near as simple. So ultimately, I surrendered to the force and accepted that placing a finger on the screen before adjusting the volume is just how it is.

TalkBack Tutorial

As mentioned in the section on set-up, TalkBack comes with a tutorial which teaches users how the screen reader works. It runs automatically the first time you run TalkBack, and it is available at any time from within TalkBack’s Settings. If you’re getting to know a new device and user interface, there’s a lot to take in. So users may get the basics down first, then revisit the tutorial several times as they add to their knowledge. The tutorial is well-structured for this, being divided into five lessons. Lesson one introduces the user to navigation basics, starting with exploring by touch. If you’re coming from another device that uses touch, these concepts will be rudimentary, but it’s a great introduction for those who haven’t used touch before. It covers locating icons on the screen, then double-tapping to activate the last icon that was spoken. It then moves on to swiping left or right to navigate between icons.

Lesson two introduces us to scrolling with TalkBack. This is the point at which anyone switching from iPhone will start to notice some key differences. More on scrolling shortly.

Lesson three covers the global and local TalkBack menus. Some of the functions on these menus are now available through more convenient gestures. Others are still handy, such as spelling or repeating the last thing TalkBack said, and getting to TalkBack and TTS settings, among other things.

Lesson four covers text navigation, and how one can move by common units such as character, line, word, sentence, paragraph and page.

Finally, lesson five covers text editing.

As previously mentioned, I’d like to have seen the tutorial cover the important first step of setting the volume to a level that’s comfortable for the end user. I would also like a lesson of the tutorial devoted to the earcons. These are sounds made by TalkBack that are designed to provide contextual information to the user without the need to have the information spoken. But nowhere within TalkBack itself is there an explanation of what all these bings and bongs, which make TalkBack sound a bit like an old video game, actually mean.

While the tutorial is a wonderful feature and could be improved with an extra lesson or two, I think there is a separate use case for a good old-fashioned practice mode, something available in most screen readers. Shortly, I’m going to have a lot to say about the angular gestures used extensively within TalkBack. They can be hard to master, and I’d like a mode where you can perform a gesture and hear what it does. This should apply to gestures specific to TalkBack, as well as those belonging to the operating system.

Hints

Just as in VoiceOver, TalkBack by default speaks hints, providing users with information about how to interact with a control. One feature I came to appreciate was that when customised actions were available for the current item from the local context menu, TalkBack would tell you what they were without the need to invoke the menu.

Working with lists

The need to scroll through lists within TalkBack has in the past made navigating larger lists convoluted, because swiping left and right didn’t auto-scroll the view of what was on-screen as is common in other touch environments. TalkBack therefore includes a range of commands to scroll through both vertical and horizontal lists.

TalkBack now offers an auto-scroll feature, which is on by default in its settings. This has made a big difference, but it’s still not as robust or seamless as iOS. One difference between iOS and Android is that when you swipe using TalkBack, the screen wraps. For example, if you are in TalkBack’s settings and reach the bottom, swiping once more will take you to the first item within the app. At least, that’s how it should work. I find that if I swipe from the start of a list of items to the end, auto-scrolling usually works pretty well, offering a seamless experience similar to VoiceOver in iOS. But once you wrap back around to the top, auto-scrolling may not work as predictably, and swiping left, right and then left again may not produce predictable results. Indeed, if you do that too quickly, you’ll activate one of TalkBack’s multi-layered gestures.

TalkBack Gestures

If you’re coming from iOS, one thing you’ll need to get used to is the implementation of what I’m calling angular gestures. Google may have an official term for them, but I’ve not seen it documented anywhere. The closest thing an iOS user has to this approach is VoiceOver’s two-finger scrub to activate the Back button. The two-finger scrub gesture is fairly tolerant in that you can perform it in all directions, whereas TalkBack makes use of a number of such gestures, so they must be performed precisely. For example, you may need to swipe left then right, up then right, down then left, down then up, or right then down, all in a fluid motion.

Even though I’ve been dabbling in Android since 2013, and have been immersing myself in it as I get to know my Nexus 6P, I still find these gestures difficult. I think the issue is that you have to be holding your device fairly straight before you perform them; if you don’t, the vertical part of a gesture may be interpreted by the software as a diagonal one.

I also believe that these gestures are better left to lesser used functions, with simpler gestures being used for functions users are likely to perform regularly. For example, TalkBack makes no use of triple taps. It makes no use of two, three or four-finger taps at all. I don’t know whether there is a limitation in the operating system that prevents TalkBack from using such gestures, but having a wider range of taps available would improve the user experience a great deal.

When I’ve raised this with Android users, one suggestion that came back was that some older devices aren’t capable of detecting multi-finger gestures. It’s hard for me to imagine that there might be many such devices in 2016, but if it is in fact the case, then it’s another example of how fragmentation can constrain the accessibility experience.

While I appreciate that Android is not iOS, there is a set of gestures that has become common to both iOS and Windows. Some of these, such as swiping right and left, are supported by TalkBack. Others, such as the rotor and a two-finger swipe down to perform a read all function, are not. At present, this gesture is part of the set used for scrolling.

While I understand that we can’t expect all user interfaces to be identical, a convention has, I believe, been established in terms of a gesture set for accessible touch screen navigation. Using that convention is in the company’s best interests, since it reduces the learning curve, and that, most importantly, is in the end user’s best interests.

One of the core functions of any screen reader is the ability to read continuously from where you are to the bottom of the screen or end of a document. When you want to read a newspaper article on the web or in a news app, or when you have a long email you want to hear, you’ll perform a say all function. TalkBack offers the ability to perform a say all in a number of ways, none of which I believe are particularly appropriate given the importance of this feature and the frequency with which it is used. You can shake the phone, and you’re offered some control over how hard you need to shake it to invoke a say all. Set the threshold too low, and you’ll be reading the screen continuously just by walking around with the phone in your pocket. Set it too high, and you’ll get fatigued pretty quickly from shaking your phone every time you want to read something continuously. Try shaking it on the bus too often, and you’ll be approached by bemused passengers who want to know if there’s something wrong with the blind guy’s phone that they might be able to help with. Or maybe you’ll elbow the passenger next to you due to your enthusiastic shaking.

You can assign it to an angular gesture. Again, the issue I have here is that I find them difficult to perform at the best of times, and I’ve given up trying to perform them when the phone is in my pocket.

If your device supports it, you can also single-tap or double-tap the side of the device. This can also lend itself to accidental activation, and can equally be difficult to perform when you want to use the feature.

What I would give for a simple two-finger swipe down to get this essential function, just like in iOS and Windows.

Even when you manage to perform the read all function, in many apps I have found that it doesn’t work at all. In others, there are significant pauses or a little repetition, while in a few it works very well.

Similarly, you can get to the top or bottom of a screen with layered vertical gestures, while the much simpler four-finger tap at the top or bottom of the screen goes unused.

A gesture has not been implemented to stop TalkBack’s speech. The only way to do so at present is to wave a hand in front of your device’s proximity sensor, if it has one. I’ve found this problematic at times, because when I’m using the phone, one of my hands naturally seems to rest around that location, stopping continuous reading on the occasions I’ve been lucky enough to get it working.

The TalkBack team has made some welcome changes to gestures, dispensing by default with the circular menus that many of us found frustrating. Now the default is to show these menus in list form. Perhaps we’ll see a gesture set that is more familiar to users of other touch screen products in future.

Your Commands Your Way

TalkBack gives you the ability to reassign gestures and keyboard commands to suit your preferences. With my trainer’s hat on, I can see that this may cause some issues for trainers if they don’t realise that a user has changed the gestures and keyboard commands around. With my end user’s hat on, I appreciate this flexibility.

The configurability is somewhat limited, in that you first choose from the available list of gestures, then you choose the function you’d like to assign. Obvious candidates such as multiple taps of two or more fingers are not available.

The same kind of functionality is available for Bluetooth keyboards. By default, a set of TalkBack keys are assigned using Shift and Alt as the modifier keys. I find them intuitive and quite effective. But if you’re a creature of habit and you want to use a key combination that makes the keyboard support feel more like VoiceOver, you can do that.

I have not found a way to navigate through an edit field or select text by character and word from the Bluetooth keyboard, but you can type into edit fields, hear what you’re typing, and backspace over text to delete it. Pressing Control+A to select all text, then Delete to erase it, also works. Shortcut keys, an area where Apple and some third-party developers have been making significant progress of late in iOS, are scarce in Android apps. Occasionally, I find that I can press Control+N to start something new in an official Google app such as Gmail or Calendar. A few other keys work, but not many.

One only needs to look at the efficient experience offered by Twitterrific in iOS, with its suite of keyboard commands, to know what a productivity boost well-implemented keyboard shortcuts can be.

Granularity

Granularity refers to the unit by which you navigate, such as character, word, line or paragraph. It is great to see TalkBack supporting paragraph granularity, which VoiceOver on iOS does not. When you’re in a web environment, you’ll also want to navigate by elements such as links, headings and form controls.

TalkBack now has easy granularity control. One simply swipes up or down to choose the granularity, then swipes left or right to navigate by the unit you’ve chosen.

It’s elegant in its simplicity, although some may find the simplicity to be limiting. In Windows and iOS, a rotor feature allows you to choose what will happen when you swipe up and down. The rotor need not be limited to text navigation. You might, for example, want to adjust your speech rate, or toggle a certain function on and off. In iOS, apps that are programmed appropriately can include an actions rotor, making it easy to select commonly used functions based on context. I find that the actions rotor gives me a significant productivity boost.

By hard-wiring the swipe up and down to only apply to text navigation, TalkBack has limited the functionality of one of the simplest gestures one can perform.

For example, in certain controls, swiping up and down with one finger might have been able to change values. Instead, one interacts with certain controls a little as one does in OS X. You double-tap a control, then double-tap to select a value, which exits you from the control. You then evaluate the impact of the change you’ve made, and repeat the process as many times as needed to find the value you want.

The lack of a rotor means that at present, there is no simple way to adjust speech rate on the fly. You’ll need to go all the way into the text-to-speech settings to do that. And there isn’t a way to adjust punctuation verbosity, an important feature when you’re proofreading.

On the flip side, of all the iOS and Windows gestures, I’ve found that people seem to find the rotor gesture the most difficult to master.

Playing Nice with other media

It’s important that any smartphone screen reader co-exists effectively with the many audio and video apps available. TalkBack attempts to do so, but in my experience there are often significant issues.

TalkBack’s volume is controlled using the multimedia volume, the same volume control that governs audio and video applications. So it’s not possible to set TalkBack’s volume independent of the playback volume, other than determining that it should be a certain arbitrary percentage of the playback volume. VoiceOver, on the other hand, has a rotor item that provides for separate control of its volume, independent of media apps.

A number of screen readers now offer a feature that VoiceOver calls audio ducking. This is where the volume of any media you’re listening to is slightly lowered when the text-to-speech engine says something, and raised again when speaking has stopped. In TalkBack, this feature is called focus speech audio. At least on the hardware I have, it is far too buggy to be left enabled. It works as intended in some apps, such as Google Play Music, but it makes a number of other media apps unusable. When I’m listening to a radio station via TuneIn Radio or C-SPAN Radio, focus speech audio causes audio to stop playing altogether whenever the screen reader speaks, resuming exactly from where it stopped when speaking has finished. This makes it impossible to have a radio station on in the background while doing other things, something I do regularly. I’m advised that going into TuneIn’s settings and enabling a toggle telling it not to pause audio in the background will improve the experience with that app. But there are still many other apps where keeping focus speech audio on breaks things.

I’ve also not found a way to play and pause audio from anywhere, as one can with the handy magic tap in iOS. There’s an argument to be made that the magic tap is trying to perform too many functions and sometimes gets confused, such as failing to pause audio that’s playing when you try to use the gesture to answer the phone. I have some sympathy for that argument, but there still in my view needs to be a way to toggle playback from anywhere with TalkBack.

Dimming the Screen

iOS has a feature known as screen curtain, which protects the privacy of blind users by darkening the screen when VoiceOver is on. It gives you peace of mind to know no one’s looking over your shoulder while you’re working with sensitive data.

TalkBack’s equivalent is called Dim Screen. You can enable a shortcut to toggle it on and off, and/or enable it in TalkBack settings. Unfortunately, I’ve had to stop using the feature, because I’m too often prompted by messages within other apps that an overlay is obscuring the screen. This feature is not fit for purpose, and it would be better to remove it than frustrate users with a very poor user experience.

BrailleBack

Regular readers of this blog will know how passionate I am about ensuring that mainstream manufacturers pay appropriate attention to Braille. While I have expressed reservations on several occasions about the quality of Apple’s Braille support, it is considerably more advanced than BrailleBack on Android. I truly wish I had more positive things to say.

BrailleBack is a separate, free application, available from the Play Store. It’s not part of TalkBack, but it is dependent on it. If you shut down TalkBack because you want to use Braille only, you’re out of luck. You’ll get a little Braille feedback, and can still control the Android device from your Braille display, but you won’t have sufficient information to use the device successfully.

The most critical flaw in the integration of TalkBack and BrailleBack is that you can’t turn speech off. Even if everything else about the Android experience had exceeded my wildest expectations, this one thing would be the deal breaker. I often work on all of my devices with speech off. Whether it be in Windows, iOS or OS X, I can always toggle speech off with a quick command. You just can’t do it with TalkBack. What this means is that when I’m in a meeting, I’d have to have something connected to the headphone jack, or turn the volume all the way down, just to use my phone as a Braille-only device. If I wanted to read something while I enjoy some music, I can’t do that, because every time BrailleBack scrolls to a new chunk of text, TalkBack speaks it.

Once you’ve installed BrailleBack, you’ll find that the command set is unorthodox. I’ve been using Braille devices since the VersaBraille, 33 years ago. All Braille software I’ve used has adhered to a command set that has become a standard. As a product manager in this field for a couple of companies, I was always careful to adhere to it. It’s not as if BrailleBack offers anything that is logical or better; it’s just deficient and confusing.

Normally, you would expect dots 1-2-3-chord to take you to the top of a document or screen, and dots 4-5-6-chord to take you to the bottom. In BrailleBack, dots 1-2-3-chord opens keyboard help, while dots 4-5-6-chord does nothing. The standard set of commands includes navigation and speaking of lines, words and characters. Few of them are observed.

There seems to be a lack of a structured mode in BrailleBack, meaning that several icons are placed one after the other on the screen. On the Home Screen, icons are separated by a colon. You can press a cursor routing key to activate an icon. I found no reference to how you might activate an icon if you don’t have cursor routing keys, but this may be because the software is sensitive to the display that is connected.

Being able to hold down a cursor routing key to simulate a long press is a nice feature.

Unified English Braille is now supported for contracted and uncontracted output, but not for input. Contracted input is not supported at all. Admittedly, contracted input is a difficult thing to get right; Apple still hasn’t managed it.

Before you can even enter any text from your Braille display, you need to go into the device’s settings and enable the Braille hardware keyboard, setting it to be the default.

While I had excellent results as a speech user with Kindle, I could not find a way to read material beyond what is on the current screen in Braille. Contrast this with the seamless way VoiceOver scrolls through pages in iBooks and Kindle with a Braille display.

The combination of all these factors makes a mainstream Android device a non-starter for the serious Braille user. It makes it not viable in education where Braille is important. And most significantly, it is a non-starter for DeafBlind users, who are information-deprived, and deserve far better than this. BrailleBack desperately needs some love, and I believe Google needs to take it seriously. It should be part of the core accessibility suite, not an optional download, so it’s easy to activate from early on in the device set-up process.

Conclusion

I’m predisposed to liking the way Android does things, and as I’ve outlined in this lengthy post, there is a lot to like. In terms of ensuring that Google’s own apps are accessible, the progress made has been enormous. You can buy an Android device from the store, hopefully get TalkBack up and running yourself (although that’s not guaranteed), and be assured of a pretty decent out-of-box experience. That’s great news for people who are on a budget, or who for whatever reason just want to have a choice. That freedom is available to sighted people, and I think it’s fair to say that it’s now available to us.

Can you get things done and truly use an Android phone effectively? In my view, the answer is that depending on your requirements, an Android device may work for you. If you’re a heavy Braille user, the inability to mute TalkBack and the limited Braille command set is probably going to be a deal breaker for you, as it is for me.

If you don’t use Braille, I think Android is very much viable for users in two categories. If you’re highly geeky, you have a profound philosophical objection to the way Apple prevents you from using your device as you see fit, and you like customising the heck out of your device, there’s really no contest. I enjoyed being able to work with file managers, cable my phone to my PC to copy data both ways, and just generally bask in the lack of constraints.

At the other end of the spectrum, if you’re on a budget and an iPhone’s out of your price range, and you tend to stay on the beaten path in terms of the tasks you want to perform, I think Android now offers a reasonable experience.

I wouldn’t call it an optimal experience at this point. There are probably some ongoing underlying issues to be addressed at the operating system level, and work will assuredly continue with Android N, but the major reservation I continue to have with Android is the user interface of TalkBack. It’s come a long way, with complex circular menus no longer the default, and granularity far simpler now, but it still feels convoluted and geeky to me. In saying that, I readily concede that if you use it day in, day out for a long time, it may become second nature.

As I mentioned about 8,000 words ago, I bought the Nexus 6P so I could be assured of receiving Android updates promptly. I think that was a sound decision for someone like me to make. Nevertheless, I do have a bit of buyer’s remorse, and wonder if I would have been happier with a Samsung device running their screen reader, which has a gesture set that is more orthodox from the perspective of a Windows or iOS user. You can perform a say all with the familiar two-finger swipe down. A swipe up or down will adjust granularity just as it does in TalkBack, but there is also a gesture, far easier for many to perform than the iOS rotor, which allows you to swipe up and down to adjust speech rate and screen reader volume, toggle screen dimming and more. The magic tap works to play and pause media anywhere. While I haven’t minded not having a physical Home button on my Nexus 6P, Samsung’s devices tend to offer one, allowing you to perform a familiar triple-click home to toggle their screen reader on and off. So if it weren’t for how long it sometimes takes Samsung to push the latest version of Android, I would be replacing my Nexus 6P with a Galaxy S7, specifically because I believe TalkBack is letting down what could now be quite a good user experience for speech users. Since Android fragmentation is an issue that affects everyone and is of concern to Google, I’m hopeful we’ll see some improvements here, which might remove my primary reason for not going with the more intuitive Samsung solution.

I also remain hopeful that Samsung will do for Braille on Android what they’ve done for speech.

That said, I have only examined an S7 briefly, so there may be issues with the product that I’ve not had the chance to experience. For example, some people have expressed concern about the amount of software included on Samsung devices that can’t be removed. It also has a speaker that sounds mediocre given its price.

If Samsung offered their screen reader for sale to owners of other Android devices, I’d buy it. After all, I’d have two obligation-free hours to give it a try and could get my money refunded if I found it wasn’t much better.

I’m excited about keeping my Android device around, because unlike iOS, it’s undoubtedly possible for someone else to come along and produce an alternative screen reader, as Samsung have demonstrated. Would sufficient people pay for one if it offered a better experience? I’m not sure they would. But perhaps a group of open-source developers who see the efficiency flaws in TalkBack and believe in an open platform may do something special. Then again, TalkBack may continue to evolve. There are some very capable people involved with its development for whom I have immense respect. It seems to me that the fundamental problem TalkBack now has is a legacy user interface that is familiar to those who have used it for a while, but may put off potential adopters.

While I’m on the subject of the capable people, it has been gratifying to see members of the TalkBack development team engaging with end users on email lists. We have a long-standing culture of this kind of exchange of ideas in the blind community, and it’s good to see Google open to this kind of communication.

While I’m encouraged by all the progress that has been made, and optimistic about the progress that will continue to be made, for now this exercise has given me a renewed appreciation of the VoiceOver and iOS experience. I’m no fanboy, and have offered what I hope is constructive criticism over the years when I think it’s warranted, but Apple got the basics right back in 2009 with the core gesture set. From there, year on year, they’ve added things that have enhanced efficiency. Bluetooth keyboard support, Braille display support with all its idiosyncrasies, handwriting, Braille screen input, the actions rotor, the item chooser and in-app keyboard shortcuts all help a blind person to get and enter information efficiently and reliably.

While surprising liberation continues to occur, I do wish iOS would be less of a control freak. But in the end, what matters to me most is how efficiently I can manage my busy life. I love to tinker, but I also have clients and commitments.

And of course, most of the practical benefits in terms of the services Google offers are available on iOS, with most of them now being highly accessible.

So unless you’re a major geeky hacker, or an iPhone just isn’t in your budget, I do believe that for now, as I write in June 2016, iOS is a more polished, reliable, efficient experience from an accessibility perspective. That said, those who say Android is unusable by a blind person are I think selling the platform short. If you have the opportunity to use an Android device for yourself, I highly encourage you to take that opportunity. If you’re coming from iOS, give yourself some time. Some things are different, but that doesn’t automatically make them inferior. Some stores and carriers offer a 14 or even 30 day right of return policy, and if you like to try new things, having a look at an Android device will I think be worth your while.

I can’t wait to see what happens next.

Have you switched from one platform to another? What prompted you to make the switch, and how well has it worked out for you? Are you an Android user with something to share on this post? I’d welcome your constructive comments.

45 Comments on “One Blind Guy’s Experience with Android. How accessible is it really?”

  1. Thank you for this, it was very detailed and insightful. I’ll try to be brief, as I agree with most of what you say here, gripes included. (Particularly in regards to brailleback, voice actions, dim screen and side taps.) As always, standard disclaimer applies: my thoughts and opinions are my own, and do not represent the views of anything or anyone else. Yadda yadda, etc.

    Your experience re: initial setup is quite unfortunate. There is another gesture for starting accessibility, which is the same gesture but performed on the power widget. However it has to be activated first, which makes it almost useless in my opinion. You are most likely to need that gesture before you are able to access that menu, in a situation where the Google startup wizard gesture is for whatever reason broken/unavailable. Relying on it strikes me as a mistake.

    Re: NFC, I love its implementation on Android and feel that iOS has some serious catching up to do in this arena. (That and widgets are my two biggest complaints with the platform, apart from the standard, philosophical ones.) One of my favorite, little-known uses of NFC is an app called Farebot, which allows me to view the balance and trip history on my public transit card.

    You reflect on this later in the post, but you can use the up and down swipe gestures to quickly change granularity, so navigating in edit fields is pretty simple even if you can’t use your volume controls as cursor keys. Text selection, on the other hand, is something else entirely.

    The tutorial experience has been overhauled in TB4.5, the latest public version as of this writing. It might be worth updating the post to clarify which you used. I agree though that a section for identifying what all the earcons do would be extremely helpful.

    If you want your two-fingered swipe down to trigger continuous read, that’s do-able. You’ll just have to rebind scroll up, as that’s what that does by default. I wouldn’t recommend it though, since as you say it does not work well a lot of the time. For this reason, I rarely use continuous read, and I think you will find scrolling to be a better use for that gesture.

    As for the right-angled gestures TB has become infamous for, I rarely use them. In fact, the only one I use with any regularity at all is the notifications gesture, and I perform it correctly about 90% of the time. However, I don’t actually need to use that one, since it is also possible to open notifications by a 3-fingered swipe down or a double-tap-then-drag-down motion from the status bar at the top of the screen. So, while I understand the frustration of trying to learn them (it is by far the largest pain point I witness when observing others go through the setup wizard for the first time) you can get by just fine without them in most cases.

    I use a bluetooth keyboard full time and have for years. In edit fields, you can use the left/right arrow keys to go by character, ctrl+left/right to move by word and up/down to move by line. Very familiar. When out of edit fields, you can use keyboard shortcuts to alter the granularity, then the TB shortcuts for navigation to move. In 4.5, these are now bound to keys by default. It’s a lot of keys to press, but it works. I should say though that I’ve never felt the need to actually do this, so in the real world it may be impractical.

    In regards to granularity, many of your issues seem to circle back to it not behaving as on iOS. I like the rotor, but I’m not convinced that copying it is the right approach for Android. I do think, though, that there is room to make the features accessed by the rotor easier to access in Android; seek bars in particular are not handled very well at present.

    I’ve not heard about Focus Speech Audio not working as intended, though I have not used the feature in some time. Likely this is app-specific. Similarly, many apps that play music allow you to pause, stop or switch tracks from the lock screen; I myself solved this by purchasing a headset with buttons. Android also supports media keyboards, so if you have one with playback controls (which are fairly ubiquitous at this point), they should work as well.

    I have not yet tried Galaxy TalkBack; however, I have some serious reservations about considering it a TalkBack replacement. First, it is closed source. While TalkBack’s source code is updated in, let’s say, a lackadaisical fashion, the effort is nonetheless made. I don’t feel that this is very collaborative, especially since much of it is based on TalkBack source to start with. Also, while the current experience may be more to your liking, I do not have confidence that Samsung’s commitment to accessibility will turn out to be stronger than Google’s in the long term. It seems to mostly be copying Apple, which makes sense in a way, since Samsung seem to be trying to emulate Apple as much as possible. (The most egregious example being a high-profile lawsuit regarding, if memory serves, the S3. They lost.) GTB first appeared on the S6, and I do not know how frequently it is updated. So, obviously, time will tell on this; however, I hope you stick with the 6P at least long enough to get the N update and the next version of TalkBack, as there are some changes which I think you will like.

    • Oh! One other thing, I almost forgot: you can suspend TalkBack by long-pressing both volume keys, then resume it by bringing up the lock screen. Did you try this with Brailleback? I assume this wouldn’t solve your issue, but thought I’d suggest it in case you hadn’t tried.

      • Thanks for this Drew. I’d be interested to hear more about binding the two-finger swipe down, but as you say and as I mentioned in the post, Read all is not a very reliable feature at present.

  2. Thank you. I have been an Android user for five(?) years now, and it has certainly improved vastly over that time, though it still has a long way to go. I am glad finally to see an in-depth, impartial article on this. Too many are written by Apple adherents (dare I say fanboys?) who don’t even bother giving Android a go, but rather just spout nonsense from that time they tried Android 2.2.
    I have to say that as a partially sighted person who uses both magnification and a screen reader on a daily basis, I find Android to be far more usable. The integration between TalkBack and “magnification gestures” is not as tight as it is on iOS (for example, you cannot use TalkBack scrolling while the screen is magnified!), and you can’t magnify to the same extent as you can on iOS (a problem for apps with minuscule text), but I find that iOS magnification results in many false taps while I’m trying to pan around. Also on that note, many applications seem not to work if they are launched without VoiceOver and then have VoiceOver enabled while the app is open, a problem I have never experienced on Android. I find that quite annoying as someone for whom using user interfaces is usually easier sans screen reader, but for whom reading large blocks of text visually is inefficient.
    Re your note regarding the peculiarity of TalkBack gestures: I think it comes from the fact that TalkBack used to (I don’t know if it still does) just introduce an abstraction layer which sort of subtracted a finger from the gesture and then handed it to the OS. That’s why single-finger gestures don’t affect apps – the OS sees them as 0-finger gestures while TalkBack sees them as 1-fingered; scrolling works by TalkBack passing one finger’s action (I think the average of the two fingers’) to the OS. Programmatically, this is a simpler approach, but it is far less extensible. Trying to introduce something like the rotor would trigger all sorts of weirdness under this model. Once again, I have no idea if this is still the way gestures are handled, but I think it might be.
    As for apps, the full suite of Google apps are very good in terms of accessibility. Some features I do like from iOS are in the camera, such as Voiceover describing the lighting, focus and faces in the photo – this is something for which the infrastructure exists, but which hasn’t been implemented. I do like that custom actions can be easily triggered in iOS with a swipe and double-tap, but these are still doable in Android (though I am partially sighted, so that does make a world of difference).
    I would have to say, though, that my biggest gripe with Android at the moment is Google’s tendency to introduce a different user interface for TalkBack users – I feel that this is extremely bad practice, and also just really confusing. For example, the clock and dialler apps both (by default) have a sort of bauble that you swipe to the action you want. This is pretty intuitive and minimises mistaps. However, with TalkBack running, in the clock app the bauble is still there but you cannot swipe it; you activate the actions normally. In the dialler, the bauble doesn’t even show; you just see buttons. This bewilders me, as these apps were fully accessible with the bauble thingy, and then they changed their operation in a strange and (in my opinion) unintuitive way.
    If you do want “OK Google” detection system wide, it will work with TalkBack, you just can’t activate it with TalkBack enabled – get a sighted person to help you enable it with TalkBack off (not suspended) and then re-enable TalkBack and it (should) work. “Focus Speech Audio” is a good feature, but as you say, it is really buggy. It doesn’t work properly with some apps, and, most annoying in my opinion, it renders dictation useless.
    On the other hand, I find VoiceOver on my iPad (admittedly a couple of generations out of date) to be slow, janky and very rigid. Also – and this is what keeps me away from iOS as a primary mobile OS – I just find the software experience unintuitive on iOS, and I am willing to deal with a few gripes to enjoy what is, in my opinion, a far superior operating system in Android. I do understand that Android is not for everyone, and I’m not saying iOS isn’t capable, but it doesn’t quite work for me, even when I consider the issues Android has.

  3. Hello,

    Thank you for writing this blog post. It’s nice for me, an Android user of over 3.5 years, to see a pretty fair representation of Android. I have also used Apple for the last year and a half, and I teach both Apple and Android. So many people believe that Android is not accessible. In my teaching of both platforms, I have noticed a few things.

    I have never understood why so many people who are Apple users by default despise the Global and Local context menus. I find them easy to use, and to me they make much more sense than Apple’s rotor. The way I see it, it’s all about what you like. That’s why Android’s great, in my opinion: I love the amount of choice we have. If you want a phone which functions much more like your iPhone, go for a Galaxy of some variant and use Voice Assistant rather than TalkBack. If you want one which doesn’t, go with a Nexus, a Sony (which would be my personal pick), an LG, or whatever.

    I still find that dang rotor frustrating to use, personally. I tend to swipe too high after I’ve made my selection, and it takes me out of the message, or whatever I was trying to read by character. I have come to the realization that I never will like the rotor, and that’s OK. I find the Global and Local context menus much simpler. I’ve observed that all the people I have taught who are Android users by default do not struggle with the use of the context menus; it’s only people who are Apple people by default and are trying to make the switch to Android who struggle with those gestures.

    Also, another complaint I hear time and time again from Apple people is the fact that there are no two-fingered double-taps, three-fingered triple-taps, etc. Again, in that respect I find Android much, much simpler to use, and Apple needlessly complex and frustrating. I very much dislike having to remember what gesture does what on Apple, how many fingers to use for what, and how many times to tap. Again, I know I’m in the minority in thinking that Android is simpler in that way, but I know of a few people who are Android users by default who agree with me. From what I have read, Android is a one- and two-fingered system only, so you cannot use more than two fingers on the system at a time. This being the case, Samsung must have done something incredible to copy almost all of Apple’s gestures.

    I’m not here at all to bash Apple. I’m just coming at it from the other angle of being a long-time Android user and loving how Android does things. I do agree, though, that BrailleBack needs some significant help and, in my opinion, much more love than TalkBack.

    Again, thank you for your post, and have a good day.

  4. Thanks for this post. It’s good to hear from someone who hasn’t used Android for a few years. Actually, you may have a better view of what has changed over time than we Android users, who only see the small increments and forget about where we came from long ago. Just as with the previous poster, I have no issue with the LCM and GCM menus, and the sensitivity of the OS to how you perform them is decreasing, so they are getting easier to master. The strange thing about these gestures is that my sighted children never had any issue with them. In the days when TalkBack had to be turned off by using an option from the GCM, they had a 100% hit rate, whereas mine was lower. Maybe it’s because they draw the shapes as they would actually draw a letter, whereas I made 90-degree turns.

    There is already another screen reader available called ShinePlus by ATLab, which can be downloaded from the Play Store. It combines speech and magnification (no braille) and has its own user interface that differs greatly from TalkBack. Some on the Android list like it a lot. It is updated frequently. Currently it doesn’t support the new Chrome a11y integration; however, I think this is planned soon.

  5. Hi Jonathan, a very well-written and clear blog post in my opinion. As a firm iOS user, it’s good to see the differences. I got an Android Lenovo tablet from a friend and I haven’t really played with it yet, but would like to. It’s already been set up; I just need to explore the gestures with TalkBack more. Are there any useful Android email lists out there to join? Again, very well done on the writing.

  6. I am coming from pretty much the same position as you, Jonathan. I have had two Nexus 7s, had a really difficult time with the system, and pretty much gave up on it except for reading Kindle books with my TTS of choice, which is not available on iOS. Last year, I purchased a Nexus 6, and my experience with that device largely mirrors your excellently written comments, with the exception that I did not have the initial setup difficulty that you did. I do not particularly care for the menus, as I find the need to do two gestures where one should do to be inefficient, especially when I don’t get the angle gestures right in many instances. That has gotten better, but it is still an efficiency handicap in my view. One of the things I like about iOS is that more actions are available at the first level of interaction; that is, you do a single gesture, and something happens. I wouldn’t mind having the menus if some of the functions, such as continuous reading, were available with a single gesture. But since you can’t have multiple-finger gestures in Android, it’s too limited. I surely do hope this can change, and since Samsung has done it, it is clearly doable. Leave the menus for those who like that method of access, but add more gestures so that the rest of us can have it our way, to quote an old TV commercial. After all, that is supposed to be the calling card of Android: it is flexible. It allows you to do things in more than one way, unless you’re a TalkBack user, and then you are limited. I think of this rather like what happens in word processors that have shortcut keys. You can use menus or ribbons to find things, or you can remember the shortcut keys for the functions that you use most often. Obviously, there are not as many touch gestures as there are shortcuts on a keyboard, but the principle is still the same. I appreciate the advances that have been made with TalkBack interacting with Android over the last few years.

    But until read all is rock solid and easily accessible with a single, no-fail gesture, and until I can go through large amounts of email quickly, reading and deleting as I go, as I can with iOS and can’t even with Aqua Mail, Android will never become my primary operating system, even though it does have some advantages, particularly in the area of file management and ease of transferring files to and from a computer, for instance. Also, the lack of the Reader functionality in Chrome is inexcusable. Why can’t we have a plug-in that enables that? Firefox has it, but they have introduced other problems into their browser. And there are many other browsers, but that’s the point: you should get a device with apps that are fully functional, up to date and accessible. Google has made great strides in this area, but it’s not there yet, in my opinion. I hope it succeeds, because there are some excellent devices running Android that cost a whole lot less than iPhones.

    • I’ve been an Android user for several years, and before that I used iOS for a year or so. The thing I liked least about TalkBack is the 90-degree corner gestures. I’ve noticed that they work better on some Android phones than others; I’m assuming it is at least partly an issue of screen resolution.

      I did something last year that made life *Much* better. I have the global context menu set to a single swipe up and the local to a single swipe down. Given what those menus let you access I found that I was able to do things much faster and more reliably.
      John made a couple of comments about the Gmail app; personally, I like it the least of all the Google apps. I’m using a Samsung Galaxy Core Prime, and I just disabled the Gmail app and set my Gmail up with the built-in email app, which I find to be a much better experience.
      I agree that VoiceOver has a bit easier learning curve compared to TalkBack, and that it would be really nice if more gestures were available for TalkBack functions.
      I am a cheerful Android user, as I personally hate iTunes with a passion, and I agree with John that having your phone just come up as a drive when you plug it into the computer is much better than being locked out of the file system. I don’t know if people will agree with me, but the true deal-killer for me against Apple is that you can’t expand an iPhone’s memory. I paid $99 for my Galaxy, which came with 8GB of storage. I bought a 64GB SD card for $25 at the same time, so I had plenty of space. With an Apple device, if I want more storage space, I have to buy a new phone. I now have a second 64GB SD card, which I can load data onto just by plugging the card directly into the computer. It takes about 5 seconds to switch cards, and I have effectively doubled my storage without having to give Apple more money!

  7. I think the only thing I saw you might have got wrong is the part about Google Now. I can say “OK Google” even when my phone is in my pocket. The iPhone 6s Plus is the only iPhone that allows you to do “Hey Siri” without pushing a button, while many Nexus devices allow you to call up Google Now without touching the phone. There are tricks for the older iPhones, like pushing the button once so it will listen for a while. My old Motorola G allows me to talk to it without touching it. So out of all of this, that is one thing you might want to check again. I will warn you that, despite the fixes Samsung has made, the one big problem is that they do not keep up with the builds even to this day. So when accessibility updates come out, with Samsung it sometimes means waiting for a full phone update, whereas with Nexus devices you get instant updates, like iOS. Oh, there are also other screen readers out for Android you might want to look around for.

  8. Jonathan, as I am also a new Android user, it was nice to read this objective review. At the very beginning of your piece you mentioned past difficulties with some gestures not always being accepted. I was hoping that you would touch on that a bit more when chronicling your new experiences, if you’ll pardon the pun. As I continue to use Android, one of my primary frustrations is that a simple double-tap is often not registered and almost seems to be interpreted as an attempt to use Explore by Touch, which not only doesn’t act upon my command but also means I invariably lose my place, as focus is changed to a region of the screen away from where I tried the double-tap gesture. This is one of the most frustrating and infuriating aspects of Android which I regularly encounter, both on a Nexus 9 as well as a Nexus 6. With iOS, my double-tap gesture works nearly 100% of the time, regardless of how I double-tap. On the Nexus devices, it seems as though you must perform the double-tap rather quickly and with a lighter touch than on iOS devices. To me, this is a bit like using a keyboard where the enter key is only accepted when it is pressed at a very specific angle. Just as I would want to press keys on a keyboard without considering how much pressure I am applying, I should be able to perform a simple and absolutely necessary double-tap gesture in any way that I choose, which is what I’m accustomed to with iOS.
    The angle or L-shaped gestures are also awkward. I tend to be able to use them easier with my right hand. The problem is that, with my iPhone, I find that I perform many gestures with my left hand while I hold the device in my right hand. With Android, this won’t work as I can’t seem to perform those angle-shaped gestures with my left hand most of the time. I hope that Google develops a better way of performing these commands.

    • Hi David. Double-tapping is much better than it was when I was using my Nexus 7, but yes, I can confirm that on my Nexus 6P, it’s still not as consistent as iOS.
      I’m not sure whether there’s some sort of trick to it, but my experiences with double-tapping not registering the way I intend mirror yours.
      I have this issue with my Fire tablet as well, so it seems to be some sort of Android issue.

      • Hello. If you use a lighter touch, things will work better. I am just pointing this out for all new users of Android, and for those who may transition to Android to see what it offers, so there will not be frustrations. Also, there will be Android N, which will improve things for all. One more difference between the two platforms: if there is a TalkBack update, you don’t have to upgrade the entire operating system to get the new version; you get it independently of the operating system. There have lately been frequent updates to TalkBack, as Eyes-Free, the development team in charge of TalkBack, is working on improving and refining it to be the best it can be for all to use. There are also plans to work on BrailleBack. The development team is aware of the issues and is working on fixes and refinements in due time.

  9. Thank you for a well-constructed post. I am glad we are largely to the point where individual use cases can dictate whether people can “get things done and truly use an Android phone effectively.” For instance, the deal-breaker for me against Android is TalkBack and the media volume. TalkBack could be perfect, but if I can’t read a novel and quietly listen to radio in the background, I have no use for it. Likewise, I don’t use Braille, so that feature doesn’t personally bother me. And someone else may struggle with the rotor on iOS and not care about reading and streaming radio at all. It’s getting more and more subjective.

    That being said, I’m glad you acknowledged that Apple got the basics right in 2009 and has continued to improve since then. Fundamentally, that’s worth celebrating. TalkBack, in contrast, still feels like the tag-along little sibling who is trying to catch up. Apple has really baked accessibility into the system and made refinements since then. And even if I prefer some things about Android and Google services, I’m incredibly grateful for and respectful of Apple’s commitment. For instance, there are three accessibility features that Apple Maps has that Google Maps does not: the ability to explore the screen with your finger, the ability to flick and hear your current location spoken at any time while getting directions, and the ability to turn on tracking with headings and hear streets, intersections and points of interest, which arguably can replace the cost of BlindSquare for many people. I’ll admit that Google Maps has better transit data and arguably better traffic rerouting, so I use it often. But the above features of Apple Maps are just one example of how the company really goes the extra mile and bakes accessibility into the system for a wonderful user experience. I’m hoping we’ll see more of that with Google.

  10. A concise and interesting read. You and I completely agree on the braille requirement. I got started with Android to do some beta testing and possibly some development. The development environment has a fair way to go before it can compete with Xcode. The biggest difference is that Android uses Java as its development language, versus Objective-C or Swift for Apple devices. It’s good to know which newer device you chose, and to know that we chose a similar device the first time around. Again, a great and insightful post.

    • Swift may be coming to Android in future versions, as Google and Oracle are at wits’ end over Java. That way there could be great apps, whether for iOS or Android, usable on both platforms. I don’t know for certain, but it’s coming, as we Android users have heard about this lately.

  11. I recently switched from using an iPhone to using Android, with a Samsung Galaxy S7 Edge. There were several things I did not like about the iPhone, such as the fact that my Bluetooth devices didn’t always stay connected, and every time I downloaded an update, something new was broken. With Android, my Bluetooth devices continuously stay connected, although I do have to use a separate device to connect my hearing aids to my phone. But that device does work well for controlling both speech and media volume, as well as controlling my hearing aids. I’ve had a pretty good experience overall with Voice Assistant, although I’m not an advanced techie, and it would be nice for someone who is to do an evaluation and review of it. I’m also a BrailleBack user, and I agreed with the observations made in the article. I do manage to get it to work with my phone, and about the only thing I haven’t figured out is how to input the at sign. But I had to do a lot of just playing around to figure out how to do symbols and make it work. I very much appreciated all the information in this article and found it very helpful. Thank you.

  12. A truly insightful, honest & objective in-depth review, Jonathan. You seem to be open-minded & even-handed with your arguments against both the big A & the big G.
    As a user of iOS (iPhone 6s), a Nexus 5p & a Samsung Galaxy S6 Edge, I’d like you to ponder the various a11y possibilities that might be on offer on Android with the introduction of differently shaped devices now in the mainstream from Samsung and LG.
    Samsung does have a good set of a11y implementations, but it’s a transitory approach: the change from Galaxy TalkBack to Voice Assistant typifies the conflict and confusion that some of its implementations have caused, often at the expense of clashing with common Android gestures. Hoping it will improve.
    As far as alternative screen readers go, you now have some choice out there. Firstly, try Shine Plus. It’s only a 1MB download & it’s free! Again, if you can get past the awful guide & lack of Braille, then it’s a start.
    Amazon has also introduced Voice View, which I’m sure you can obtain from trusted sources.
    Also on the subject, you did not touch on launchers like Nova or Apex, which might make the screen more to your liking. Some are even designed for seniors, and many are free. This is a notable iOS omission.
    Then there are ROMs. In particular, I urge you to look at CyanogenMod, which is free, or available as the default OS on the OnePlus brand of phones. As a starter, this should give you that finer focus audio control that stock Android lacks, among other a11y tweaks.
    My biggest gripe with Android is, as you rightly observed, the inability to navigate by element type, e.g. headings, links, buttons, tables etc.! This is my deal-breaker for not using Android full time. Such a lack of granular controls makes it difficult for anyone with multiple conditions like dyslexia, ADD, or painful eyes with partial sight, or a similar combination. What surprises me is that Google’s very own Chrome OS already has many of these, and so many more a11y capabilities, that it is my go-to device for a lot of tasks. These are the same features that Android users have been wanting for a very long time, e.g. a simple way to go to the previous or next control, changing settings on the fly, automatic screen reader improvements etc., to name some more.
    Finally, observing your 2nd reason for trying out Android again, I’d really love to see Google Talkback migrated over to iOS as an alternative screen reader.
    Thanks for reading & have a good day ahead. Keep it up! 🙂

  13. An excellent blog post, Jonathan. I wondered if you have tried using the new “Inbox by Gmail” app instead of the regular Gmail app. I find the Inbox app to be much more accessible, and it also has some very smart features.

    • Thanks Kiran, I’ve not tried that yet, and if I can set up all my IMAP and Exchange accounts with it, then I’ll definitely look it over.

      • Inbox only works with Gmail accounts at the moment, so it would not be beneficial to Jonathan for setting up other third-party email accounts. I have tried this with Inbox, and that’s why I also use Aqua Mail like Jonathan.

  14. I think the BrailleNote Touch will be a game-changer for productivity and efficiency in the Android space. It’s the first time I’ve had a reason to want a tablet. It’s unfortunate that they won’t be providing initial access to widgets but I’d consider that a minor inconvenience considering what it can do.

  15. Here’s why I switched to iOS, and what would make me go back to Android if Apple doesn’t catch up sooner rather than later.

    I switched to iOS in 2014 because I found that updates to the apps I was using broke TalkBack accessibility, and because playing with the latest CyanogenMod at the time made my device slow and unstable.

    As a former Android user from 2010 to 2014, here are my observations:

    Setup

    I’ve owned three devices: the first was the Huawei IDEOS U850 running Android 2.2 Froyo (or Frozen Yogurt), then a Sony Xperia Neo with 2.3 Gingerbread, and an Xperia U with 4.0 ICS (or Ice Cream Sandwich). I also had a Nexus 4 for a week in 2013, but didn’t like the audio output, especially after recording Audioboos.

    With all the devices except the Nexus 4 (and the Sonys on their initial ROMs), I had to ask for sighted assistance to get TalkBack running, because the gesture to turn it on didn’t exist, and TalkBack wasn’t installed as part of the OS.

    So I had to flash (install) a custom ROM, CyanogenMod, plus GAPPS (Google Apps) to get things how I wanted. It involved lots of very late nights of reading, trials and tribulations, since I had to download the packages separately because Google didn’t allow GAPPS to be packaged with the ROM, and I had to ensure I got a package with TalkBack included, because some GAPPS packages didn’t include it. To set up the phone, I had to draw a square from the left edge clockwise until a short musical tone played; then TalkBack came up speaking.

    Vanilla Android is the stock, pure Android as it is on a Nexus. Android has many different ROMs (or flavours) that do different things and feature different customizations, some of which weren’t accessible at all and caused me headaches when getting them installed. CyanogenMod (CM) was the easiest and had hardly any issues, except that when I tried out a later version, Jelly Bean was sluggish and froze when I attempted to set it up.

    The Nexus 4 setup, with 4.4 KitKat, worked as it should. I had no problems holding two fingers in the middle near the top of the screen and waiting for the speech to prompt me when to let go. Everything else in the setup process was flawless.

    Navigating around

    Explore by Touch was implemented in 4.0, so when I was using 2.2 and 2.3, I had to resort to the Eyes-Free Keyboard and Eyes-Free Shell. My first phone had a round button in the middle called a D-pad (or directional pad), which enabled me to move around the system, but my other ones didn’t. Upgrading to CM on my Sonys allowed me to navigate by touch as I did on the Nexus.

    Getting apps

    Before the change to Google Play, apps came from the Android Market, and moving around was hit and miss. Unlabeled buttons everywhere made it frustrating to navigate, and I had to use the D-pad to move element by element, one at a time, to find what I wanted, such as reading a description or finding the Install button. Apps like Facebook and Twitter were unusable, so I had to resort to an app called FriendCaster for Facebook, and TweetCaster and Tweetings for Twitter.

    Using Google apps like Gmail was also hit and miss on the lower Android versions. I could partially use Maps for directions, but I didn’t find the experience all that great, so I used an app called WalkyTalky for navigation, because it announced the street numbers as I passed them, and I would hear a loud tone before the announcement of directions, which was my favourite feature.

    Another favourite feature I miss is being able to explore the streets before going out. I had an app called Intersection Explorer, which gave me my first experience of explore by touch, even on the lower Android devices. Placing my finger in the middle of the screen, TalkBack would say “Currently at Wakefield Street”. Moving my finger around would give me directions such as “Moving east along Mount Street, walked 300 metres to Wakefield Street and Mount Street”. I could also explore other streets and get verbal directions, e.g. “Moving south along Queen Street. Now 200 metres from corner of Queen and Fort Street”.

    On 4.0 and 4.4 the experience had improved slightly with transit options, in particular how far away my bus was and, once I boarded, the distance to my destination. Unfortunately I had to keep the Maps navigation screen open, and had to use TalkBack to get the distance announcements. That feature was removed in later updates to Maps.

    Taking calls and texts

    This somewhat basic function on 2.2 and 2.3 required an app called Talking Caller ID, since TalkBack didn’t announce the caller, and answering the phone was a chore since I had to know where the button was. Versions 4.0 to 4.4 brought in caller ID announcements and a way for me to find the answer button. Texting was relatively easy with Messages: just go in, type the name of the person to send the text to in the first field, move down with the D-pad to the body field, type the message, then move right to an unlabeled Send button, which was labeled in later versions.

    Getting updates and why I got out when I did

    Phones came with the version of Android released in that cycle, e.g. 2.2, 2.3, 4.0, etc., and would not be updated by my carrier or manufacturer. The Nexus 4 was an exception, since Google updated it over the air.

    As apps got bigger and more powerful, I found my phone becoming sluggish, accessibility breaking even with updates to TalkBack, and new phones still being released without the latest version of Android and software updates. I felt I no longer wanted to put my personal data at risk with an immature, unstable ecosystem. My Nexus 4 experience put me off getting the later models, which were way out of reach in terms of price anyway, and the Google store wasn’t available in New Zealand at the time.

    So when I got a $330 payment from the AECT (Auckland Energy Consumer Trust), I wanted to see if I could get an iPhone on a plan, which I didn’t really want to do since plans can cost more than the phone itself.

    The application was approved, and I got the 5s on interest-free installments, which will be paid off in September. However, after checking my credit rating I found a few defaults, which means I’m unable to get the 7 or Plus model, but at least I’m happy with the experience. I got the phone home, got VoiceOver up and running, updated from iOS 7 to 8.0, and am now on the latest, 9.3.2. If I was still on Android I’d never see the latest version unless I had a Nexus.

    Would I go back?

    From what I read in this post, Android has certainly come a long way. But I feel sorry for the people who are purchasing phones with serious security vulnerabilities, with carriers getting in the way of manufacturers. I saw an article about one phone that only got 5.1.1 Lollipop this year, even though 6.0 Marshmallow has been out for over a year, and one of the Samsung phones is only now getting Marshmallow.

    Personally, I’d like to see Google exert some control over manufacturers and carriers before I’d consider Android mature enough to go back to, if I didn’t go down the Nexus route:

    New phones manufactured after I/O (excluding June) up to October 31st 2017 absolutely MUST ship with the very latest Android and software updates, with a cessation from November 1 until February 1st the following year, and three years of Android updates.
    Phones manufactured from February 1 up to May 31st must be on the latest stable Android release and software updates, with a cessation in June, and the same update policy.
    Phones manufactured between March 1st 2014 and May 31st 2017 MUST be upgraded to the latest released version they can handle, if still supported by Google. Phones manufactured before March 1st 2014 will be end-of-lifed.

    The only thing that would make me go back is being able to do more with Google Now. I’ve read a few posts about Apple finally opening up Siri to apps, but in my view it doesn’t go far enough. I’d like to be able to say “Book a table for five at Al Volo Pizza for 7:30 Thursday evening”, and pay with the card I have in iTunes, avoiding duplicate copies of my credit card info being stored. I could also use my bank’s app that holds my payment info, sending a token to confirm the transaction is being made by me.

    With all the leaks of stolen usernames, passwords and credit card information over the past few weeks, I’m losing trust in having my personal information stored on their servers, and I no longer have my debit card stored with my power retailer. What would be nice is having all my personal data, such as address, phone number, etc., stored on the phone. Then when it comes time to pay, if I’m using the app, it’s just one touch to make the payment using my banking app. If I’m on my computer, a QR code displays on screen to scan, and one touch authorises the transaction.

    For Uber, which also stores my info, once I’ve completed my journey the app would send a push notification informing me it’s time to pay. Authorising payment would take one touch on the fingerprint sensor, keeping my data off their servers.
    The supermarket where I shop has online shopping, and again my debit card and address are stored. All I would like to do is place my order, move through the checkout process, scan the QR code, wait for the authorisation dialog to come up on my phone, and bang! Payment info, including delivery address, is sent to the supermarket, ready for delivery. If I chose pickup, the store nearest my location would be informed, and I’d get a push notification once my order is ready.

    Having this solution ensures I’m in control of my data, and when I move house, all I need to do is update the address in my contact card, and everyone I deal with gets my updated info when I log in. It also means there’s no payment info to exploit, since merchants only get encrypted transaction tokens.
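The token idea described above can be sketched in a few lines. This is purely a toy illustration, not any bank's actual system: the key, merchant name and token format are all made up, and a real payment network would use asymmetric keys and a proper tokenisation standard. The point is that the phone signs a single-use payload, and the merchant only ever passes that opaque token to the bank, never the card details themselves.

```python
import hashlib
import hmac
import secrets
import time


def make_payment_token(secret_key: bytes, amount_cents: int, merchant: str) -> str:
    """Build a single-use payment token on the phone.

    The merchant can forward this to the bank for verification
    without ever seeing the underlying card number.
    """
    nonce = secrets.token_hex(8)  # random per-payment value, prevents replay
    payload = f"{merchant}|{amount_cents}|{nonce}|{int(time.time())}"
    sig = hmac.new(secret_key, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"


def verify_payment_token(secret_key: bytes, token: str) -> bool:
    """Bank-side check: recompute the signature over the payload."""
    payload, _, sig = token.rpartition("|")
    expected = hmac.new(secret_key, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)


# Hypothetical shared key between the phone's secure element and the bank.
key = b"shared-between-phone-and-bank"
token = make_payment_token(key, 4250, "supermarket-nz")
assert verify_payment_token(key, token)
```

Because each token embeds a fresh nonce and timestamp, a stolen token can't simply be replayed, which is what makes a breach of the merchant's database so much less damaging than a leak of stored card numbers.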

    Conclusion

    I didn’t talk much about modding to make my Android experience better, since it’s too technical and newbies should NOT attempt it. But the experience I had was something I was forced into through no fault of my own, but rather Google’s and the manufacturers’, and it was a big challenge. At least with Android anyone can tinker with the OS, but those who feel uncomfortable doing so shouldn’t be stuck on versions that are more than three years old, putting their data and safety at risk.

    Perhaps one day, when my iPhone 5s can’t handle it anymore, I’ll revisit the platform I held so dear, either when I get a Nexus, or when maturity sets in and I know I can get a decent phone with the latest version of Android. Until then, it’s iOS for me.

    • Thanks for the thorough and interesting reply, David.
      I think Android has come a very long way, even since the last time you looked at it. It may not yet meet your needs but when the time comes, it’s worth another look I think.

      • Sometimes I find it quicker to pull out my phone instead of my EFTPOS, TM, Countdown, Subway and AT HOP cards, and it would be awesome if there was an integrated rewards system which lists all the reward schemes, with sign-up only one tap (or one click, if on a computer) away. I find FlyBuys OK, but I don’t do business with most of the places involved, and the ones I do don’t yet support it. I have bills to pay and a mouth to feed, and I’m getting only a point here and a few cents there from the reward schemes offered here. I’ve had a OneCard since back in the Foodtown days, and only got my first voucher three years ago, which was only $15.00, given I’ve spent hundreds over the years. With all the specials I’ve got over this time, why aren’t the savings added onto my OneCard after a certain threshold? For example, if the savings were over $200, I could use them for this week’s shop.

        There are some apps in the App Store that I’ve found for checking out restaurants, e.g. Zomato, OpenTable and others I can’t recall at this moment, yet I’m unable to take full advantage of their features in Auckland. Japan has had phone payments for over a decade, so how come they weren’t catered for during RWC 2011?

        I have payWave with Westpac, but I find the banking app’s interface unintuitive, and it has no Touch ID support, which Apple should mandate for all apps that handle payments. EFTPLUS has a reward scheme that is trying to eliminate the extra reward card by using only your EFTPOS card, but you have to tell it part of your card number. I’m a regular at some of the cafes and bars in central Auckland, and my phone should be able to tell me the specials since I’m a regular customer, and after waving the phone to pay, confirm with a touch on the sensor. PT should be no different: with an app called MoveIt, I’d just tap on the side of the door to confirm I’m on the right bus, then find the tag terminal so I can tag on, knowing that my TM concession is applied and the balance is shown on screen, which VoiceOver reads when I tag off at my destination.

        Because our main banks are Aussie-owned, there’s some resistance to Apple Pay, since they want to implement their own mobile payment systems. Apple Pay is available in Aussie under ANZ, but not on this side of the ditch. You can get a special phone and SIM that allows you to pay and use PT already, but it’s limited to 2degrees and the phones are only sold in Wellington. I hope part of the Auckland Transport upgrade sorts out those issues, especially for those who come up and want to use PT in the big smoke but can’t because of the different systems.

        I’ll probably look at it again after Android N, since there’s a feature I like that puts system updates into a separate partition, so when you reboot the phone the new update is applied, with no need to worry about the non-spoken update progress of an OTA update on iOS and current Android versions. But from what I’ve read, you’ll need a new phone to use it.

        With technology moving so fast I’ve got no need to pay with cash anymore, and I find going cash-free makes the wallet lighter. But the idea of paying by mobile is relatively new in NZ, and needs some widespread testing with lots of people before retailers adopt it. In 1995 I was told off by Homai staff for going behind their backs and using a new system called EFTPOS, but I said, “You wait, this cash will be gone from my wallet in less than 20 years, and it’ll be the standard.” Again, blind people using systems way, way before the sighted.

  16. Hi Jonathan. Thanks for such a detailed review. It’s great to know that Android has improved significantly in the last couple of years. I have recently been considering making my next phone an Android unit, particularly as the absence of a headphone socket on the next generation of iPhone looks increasingly likely. At this stage though, I think the poor Braille support may still be a deal breaker for me as I do use Braille extensively when on the road.

  17. A nice post, though I’m a staunch iOS fan. But I will give it another try.

  18. Just to let you know, I also use the free Shine Plus screen reader on Android. When TalkBack doesn’t work in certain apps, I turn on Shine, then go back and turn off TalkBack. I can switch back and forth between Shine and TalkBack without sighted help.

  19. Hello,
    I have been a long-time user of Apple, and I just want to point out that I have tried Android three times.
    The first time was on a Samsung Galaxy Nexus, running Android 4.1.
    The second time was on a Google Nexus tablet, which worked very well.
    The third time was on a Galaxy S7, running Android Marshmallow I think. On both phones there were numerous problems, but mostly with the S7.
    I was unable to send messages to people, because I was getting an error telling me I needed admin privileges. The only way I found to resolve this problem was to go into safe mode, which requires sighted assistance to navigate.
    The second problem I found was that there was no way to dictate text into any sort of edit field (no gesture to activate it).
    If I could get a good phone and some insights, I’ll go to T-Mobile and try it again. I’m not saying I would switch to Android full time, but it has evolved from the time I first used it, and that’s saying something. But if you can’t even send texts, that’s a problem. Also, the Uber app is not accessible on Android, so that is also a problem.
    Any insights would be greatly appreciated.

  20. Hi there,

    To fix that TuneIn audio issue, find the TuneIn Settings section within the app itself, then make sure that the item called “Pause Audio Instead Of Ducking” is unchecked.

    In addition, here’s a list of the Galaxy S6 (and later) Voice Assistant/Galaxy TalkBack shortcuts for its version of TalkBack, courtesy of Blind Bargains:
    http://www.blind.bargains/b/12580
    Plus, a podcast on using the Samsung Galaxy S6 Edge with Voice Assistant/Galaxy TalkBack, thanks to Cool Blind Tech:

    https://www.coolblindtech.com/samsung-s6-edge-is-reinventing-accessibility-for-android-users/

    • Thanks Trenton, it’s good to know there’s a workaround for TuneIn, but the feature within TalkBack itself is still faulty. It breaks other apps, including Audible and C-SPAN Radio.

  21. Hi, Jonathan. Great post, except, well, for some reason, Huawei or whatever you thought the Nexus was. It’s a Google product in all its purity. I believe Samsung partnered with Google to make it. Also, here’s a bit of fun to ponder about my Android history. Now, I hated Motorola, but getting a Samsung Galaxy J7 is truly liberating. It was really liberating to be able to bring the phone home with me, and Trenton and I didn’t even have sighted help with anything but the SIM cards. We just had the store fix up the phone numbers, since he and I wanted to be specific, but the phone setup itself was a breeze. All I had to do was hit the Next button lots of times, fill in all the fields, allow this, allow that, and bang! The phone wasn’t hard to set up. It was pretty good right out of the box, no sighted assistance. Since Trenton and I do not have a sighted room- or housemate, it is vital to us to have phones that actually work out of the box and don’t cost so much money! Apple products are usually expensive and meant for the high-end market, but we managed to score Amazon Fire tablets for less than eighty bucks. I don’t know how much fifty U.S. dollars is over in New Zealand or England, but fifty U.S. DOLLARS! That’s a score! Usually, tablets cost more. Also, setting up the Fire was a breeze despite the temperature. lol
    With my J7, though there is no fingerprint sensor, it still has a great bunch of apps I use, and Marshmallow particularly has some more access solutions. When I finished setting up my Samsung phone, I was so happy. Seriously, I was thrilled to be able to bring home something and set the thing up. But Motorola does not have good service, and I had been stuck with G and E phones with too few gigs. Beth likes to put lots of stuff in there.
    One more thing to add: the KNFB Reader will work on the J7, and I have shared files in HTML to Google Drive. So Jon, if you’re a KNFB fan, I’d only ask that the camera be 10 megapixels or more. Mine is about 13 megapixels, and it works. I’ve been able to read an old notice on pest control I’d forgotten about for a while. I also really like being able to direct my selfie shots the right way, and Trenton took pics of me as well. Too bad we don’t see those pics unless they’re Facebook images, and those images have alternative text on them, which is cool stuff!
    From my perspective, Samsung has come a great long way since the flippity phones of yesteryear. Since the patent-infringement lawsuits with Apple, Samsung has really taken a gigantic leap, and the J7 is evidence of that. It all began with the S6s though, and we were really happy we had done our homework. Note to future Android users: as much as I’ve thought Trenton doing so much research and listening to countless phone reviews drove me nuts, it paid off. Trenton’s homework and comparison shopping were essential in our purchase of Android phones from Samsung. We talked about it a lot, and we didn’t feel the Nexus 6P was the best phone for us, because we would have waited a while, and it would cost too much for SSI checks to handle. However, I refuse to do business with Motorola because of their constant use of offshore, non-English-speaking call centers that could not tell you everything or know what you need as a blind person. Samsung does not know their products all the way either, but there’s always a suggestion box on their website, or something maybe I missed.
    When we got home with our phones and eventually started using them, Trenton and I were texting, making calls, installing apps, the whole shebang we did on our previous phones. But the 13 MP camera has liberated me big time. Now Trent can capture moments in his life he wants to capture, like, for instance, that crazy pose I’m striking at one time or another, or that video clip of us feeding each other Hershey’s Kisses. Secondly, I am able to jump downstairs and check the mail on the double if I must, and I can still read papers now! And best of all, Android Pay is the best thing in the world and a step up from Motorola’s lack of NFC. NFC is important so Trenton and I can also identify objects and barcoded boxes and packages. We refuse to spend countless thousands of dollars on the barcode reader or PenFriend or whatever is expensive enough to break our bank accounts. Trenton and I actually did have a fun videographed moment, and those Android phones have enhanced our abilities so much that I’d go back and do it again.
    Of course, I wish more wearables were accessible with the completeness that Apple’s Watch is. I want to do a full-on Android thing, but I won’t go so far as to betray Android City and buy an iPhone, because of file management. But accessing the phones is not that hard. I have used the L-shaped menu gestures, and someone said they were hard, and I agree, but improved gestures are always a welcome option.
    As the proud owner of an Android, I want to say that I’ve seen nothing but flexibility and ease of use with my latest score, the Samsung J7. I have customized tones, and I just heard my email one go off, so if you, Jonathan, were to text me, I could pick a sound or a movie line you like and I’d know it was you without having to look. You could do the same if, say, a work person texted you, and customizable text tones and ringtones have always been a real obsession. Trenton’s ring and text have to do with Star Trek; all that customizing is helpful. Now all I want to do is play with wallpaper. I do much more at a low price with Android, and still have peace of mind on the market with no contract. And our carrier, T-Mobile, gave us free pizza and a great buy-one-get-one deal. More of these with Android cost less.

    • I am also with T-Mobile, and I’m going to test a Galaxy S6. I had horrible issues with the S7. I’m willing to see if I can actually use it on a daily basis, but having used an iPhone, I don’t think I could ever let go of it. Luckily with T-Mobile, if I don’t like it after a month or so, I can always upgrade back. What phone do you have? And what mail client should I get, as I have two Exchange mail accounts?
      Also, how is Uber with Android?

  22. This is a very good blog post. I’ve been using Android for about 2 years now. I have a Google Nexus 7 from 2013 and it has been interesting to see Android and TalkBack evolve. I remember running 4.4 when webpages were unusable with TB. Every time you would access web content, it would say web view and cause the entire device to freeze. Fortunately, these issues are fixed with a combination of Android 6 and TalkBack 4.5.

    How do you reassign the two finger flick up and down gestures? I wasn’t aware those gestures could be changed. I’ve checked the manage gestures section of TalkBack settings and it doesn’t have those gestures listed.

  23. Great post Jonathan, and lots of wonderful comments as well. My experience with Android almost mirrors yours, Jonathan.

    I bought a Nexus 7 in June of 2013; the 2012, first-edition model. I spent countless hours playing with Android and TalkBack. I learned enough to be able to navigate around, download apps from the Google Play Store, listen to podcasts, read books, etc. My Nexus came with Android 4.1, Jelly Bean. Mostly I found the Android experience frustrating and exhausting. I would spend hours trying to do things, and end up with a headache from the adventure. Navigating webpages was a nightmare with no heading navigation.

    Things got a little better when I updated to KitKat, but I still found Android way too much effort to use on a daily basis. I finally, like you, put my Nexus in a drawer, where it stayed for quite some time. I’d take it out once in a while and play with it, but would soon shove it back in the drawer. Then, when Android 5.0, Lollipop, came out, I installed the update thinking it might make things better. What a mistake. My Nexus, which was already a bit sluggish, became painfully so. I wished I hadn’t installed the update. Once again it went back in the drawer. I left it there until Lollipop 5.1.1 came out. I installed the update, and things were somewhat better. My system was still quite sluggish, but I was amazed to find I could now navigate websites by headings.

    A couple of months ago I had my younger brother help me wipe my system and do a clean install of Lollipop. Actually, we first installed an unsupported version of Marshmallow, because my Nexus 7 2012 is not getting the Marshmallow update. Anyway, we got Marshmallow installed, and then tried to install a GAPPS package so that I would have TalkBack. We couldn’t get the GAPPS package to install, so we just did a clean install of Lollipop. This has made my Nexus run not quite so sluggishly. I’ve been using it quite a lot over the past two months, and really am beginning to like it! It’s still a bit sluggish, and freezes at times, but I really like the new method of changing navigation by swiping one finger up or down. I also like the swipe up then down quickly to jump to the top of a list, or down then up quickly to jump to the bottom. When I go into Settings, a quick down-then-up swipe usually lands me right on the Accessibility setting. I like the Overview button, and use it to close apps that are still running, to help prevent some of the sluggishness. I use both Aqua Mail and the Gmail app, and find they both work quite well. I’d love to have a swipe gesture that would bring up a menu with a choice to delete a message, like iOS has… maybe some day. And for sure, a two-finger swipe down to read all would be so nice. I find that Google Now is better at answering my questions than Siri a lot of the time. I like the Hangouts Dialer app, which allows me to call people on their landlines or cell phones from my Nexus 7… cool. The Bookshare app Go Read works really well. I watch Netflix on my Nexus 7. Today I tried the Google Keep app for taking notes, and it works wonderfully.

    I just purchased a new iPhone 6S last November, so I won’t be switching to an Android phone any time soon, but I can definitely say that I might consider it for my next phone, and I never thought I’d ever consider switching away from my iPhone. I am looking for a cheap Nexus 7 2013 model so I can install Marshmallow and see what that is like. I still love my iPhone, and am much more productive on it than I am on my Android, but there are things the Android does that are quite nice. I’ve been using Google Docs on my Nexus 7, and it’s great for reading Google Docs. I’m able to create documents as well, but for some reason I don’t get any feedback from TalkBack when typing in a Google Doc. If anyone has a solution for this, I’d be forever grateful. I use an Apple wireless Bluetooth keyboard, so maybe this is the issue? I also had a problem with characters repeating as if a key were stuck down. I found an app on the Google Play Store called External Keyboard Helper that has fixed that problem for the most part.

    I did get my Brailliant BI 40 to pair with my Nexus. I could read with it, but was not able to input with it. I’m not sure why.

    As I said earlier, I was able to pair my Apple Bluetooth keyboard with my Nexus 7, and it connects quite quickly when I turn the keyboard on. I’ve also connected my Big Jambox Bluetooth speaker to the Nexus 7. Bluetooth connectivity seems to work very well on Android.

    Thanks Jonathan for sharing your experiences with Android. Thanks also to everyone who commented. This was an interesting read.

  24. Just from what you’ve presented here, I’m thinking that alternative screen readers may be the silver bullet for Android. I had no idea Samsung had its own screen reader. Thank you for this article.

  25. How do you use voice dictation in Marshmallow? There is no shortcut gesture.

  26. For the reading mode, you can try Lightning Browser, which has it.
    https://play.google.com/store/apps/details?id=acr.browser.barebones
    For braille… I’m no expert, but you can try BRLTTY:
    http://mielke.cc/brltty/download.html
    One certain advantage over BrailleBack is that it doesn’t need TalkBack to work.
    I don’t know why it is not present on the Play Store and has to be installed manually. If you know that much about braille, though, I think you will have no problem trusting the BRLTTY developers.

    Two reasons I prefer Android over iOS are the presence of Eloquence, alongside the open text-to-speech system which allows it, and the fact that I can write apps for Android from a Windows PC at no additional cost.

  27. The main reason I bought an Android Moto E was my budget. It has Android 6 and is very usable, but costs six times less than an iPhone. Of course, if it were required for my day job, I would buy an iPhone, but that’s not the case.

  28. I forgot to say that since Android 6, an SD card can be used as internal memory. Since I added an SD card and merged it with the internal memory, I now have 30 GB of free storage for installing apps; my original internal storage was 8 GB. So I recommend buying an Android device with no less than Android 6.
    My sister (she has no vision problems) did the same operation with her Moto G 3rd Gen.

  29. Great and informative article, Jonathan! I ask your permission to translate it into Italian and publish it in some free blindness-related reviews and blogs.

    • Thanks Luca for the kind comments and also for seeking permission to translate.
      If you think it would be of interest to readers I’d be delighted for you to. Thanks.