Android, giving Pie a try

Those of you who have followed me for quite some time know that every so often, I decide to give Android a try, to see if I can use it as my primary mobile operating system.  I confess, the geek in me loves the openness of Android, the idea that I can customize just about everything on the platform and make it my own.  Alas, my particular use case is such that I depend on my mobile device quite heavily in a professional setting, and so efficiency matters a great deal to me.  In the past, my experience with Android has been that while I could accomplish *most* of the tasks I need to accomplish, I was unable to do so with enough efficiency to make a switch possible.  Still, I keep being drawn back to Android for many reasons, one of the main ones being the multitude of device and price options available.

 

I want to stop right here for a moment and acknowledge that everyone’s use case is different.  It’s important to recognize that there are many folks who have been using Android, with little to no complaint, for years and can’t imagine using anything else.  When it comes to Android versus iOS, I think the “better” operating system is the one that works best for each individual.  Sure, one could compare the number of accessible apps, the level of standards-based accessibility support, or any one of a number of factors, but the real measure, in my opinion, is: does it do for me what I need it to do?

 

For this time around, I chose to go with the Essential Phone because it runs the latest version of Android, Pie, doesn’t contain a bunch of apps and other stuff I don’t care about, is available at a decent price point for the specs, and, probably most important, was available on Amazon Prime Now, which meant I could have the device in-hand in under an hour; yeah, I’m not the most patient person, especially when it comes to tech.  I decided that I wouldn’t immediately blog about my experience as I wanted to see first if this really would be a viable option for me.  Now over a month in, I’m able to report that I’m extremely impressed with the accessibility changes that have come to Android and its apps.

 

Initial struggles and frustrations

 

I think it’s fair to say that whenever switching to a new operating system or hardware device, there are bound to be some initial user frustrations.  In this case, I switched both at once and found that I needed to remind myself of this quite a bit, especially during the first week.

 

Initial setup wizard accessibility

Right out of the box, I encountered some accessibility challenges with the Android getting-started wizard, whereby TalkBack, the Android screen reader, wouldn’t let me activate certain options.  These issues have since been fixed; however, my phone did not come with the latest updates installed.  I needed to explore by touch until I found the correct options, disable TalkBack, touch where I thought I had found the option on the screen, re-enable TalkBack, and hope that I had done everything correctly.  Eventually, I was able to successfully get through initial setup and was then able to install the latest updates, ensuring that this problem won’t recur should I ever need to reset my phone in the future.

 

Speech options

Personal preference alert here, but I am not a huge fan of Google’s text-to-speech engine, which is the only option available during initial setup.  Additional voices can later be purchased from the Google Play store, Google’s marketplace for apps, music, books and other things, but new users might not be aware of this.  There are actually quite a number of voices available, including Eloquence and eSpeak, which are likely familiar to Windows screen reader users.  Purchasing additional voices via the Play store makes perfect sense, but because this is very different from what I’ve gotten used to with iOS, it was an initial frustration for me.

 

No native braille screen input

Lack of native braille screen input is definitely my largest frustration to date.  When this feature was first introduced to iOS, I wasn’t sure if I would ever get used to it; in time, however, the ability to use my screen to enter braille characters enabled me to type with incredible efficiency.  This functionality is missing from Android, and I dearly miss it.  Third-party options are available, but I have yet to find one that works as fluidly as the solution on iOS.  For one thing, TalkBack must be disabled in order to use any of the third-party solutions, and while the solutions are mainly self-voicing, this is definitely a frustrating step.  I’ve found one Android-based braille screen input solution that works extremely well, Soft Braille Keyboard.  Unfortunately, while Soft Braille Keyboard can still be installed, it cannot be obtained from the Google Play store.  I also have no idea if updates for Soft Braille Keyboard are forthcoming, which is a real shame.  Braille screen input has the potential to make a real difference in how a blind person enters text on a mobile device, and I sincerely hope we see additional innovation in this area on Android.

 

The same apps are not necessarily the same.

One of the first things I did on Android was to search for and install the apps I’ve been using on iOS.  I was pleased to find that in most cases so far, the Android counterparts to my iOS apps have been very accessible.  That said, the design and layout of these apps is often very different, leading to some initial confusion for me.  On iOS, for example, my banking app has tabs across the bottom that allow for quick navigation between sections of the app.  On Android, however, that same banking app has a hamburger-style menu that contains similar options; similar, but just different enough to make for some initial confusion, at least for me.  These differences are certainly to be expected, but if you’re switching from iOS, they may be a source of initial frustration.

 

Helpful resources and the awesome community

As I’ve tried to get up to speed, there are a few resources that have proven invaluable.  First, the really awesome Eyes-Free community is full of people who have been very patient with me, and with others new to Android.  I’ve gotten tons of fantastic resources through this mailing-list-based community and am extremely grateful to all those who have been willing to share their knowledge and tolerate my frequent questions.  Inclusive Android is another fantastic community resource with a wealth of information.  In time, I would like to create a page dedicated to Android resources that would be helpful to new users, or to developers wishing to build more accessible applications.  If you know of a resource that should be included, let me know.

 

Conclusion

I haven’t actually sold my iOS device yet, but I’m very impressed with just how far Android has come.  It’s certainly not free of frustrations, but what operating system is?  Android has gone from an operating system that was challenging for me to use in my daily life to one that I can use almost as effectively as iOS.  And I say “almost as effectively” in part because I’m still getting up to speed, and the natural learning curve of any new operating system is bound to cause a temporary drag on productivity.  I’m really excited by what I’ve seen thus far, though, and hope you will continue to join me as I blog about this new adventure.

 

Gabby’s Gifts 2018, helping kids at the children’s hospital who can’t come home over the holidays.


Since she was little, my daughter, Gabby, has had a medical condition that has required frequent visits to the children’s hospital.  One of the more positive things to come out of this is that she has developed a deep sensitivity for children who don’t get to leave the hospital to spend the holidays at home.  At age nine, Gabby decided to start a program to get gifts for children who must remain at the hospital.  These gifts are purchased from a list provided by the hospital based on things the children have asked for which do not pose any medical or other risks.  Gabby delivers these gifts to the hospital where they are sterilized, wrapped and delivered to the children by hospital staff.

In 2016, I took Gabby’s initiative online in the form of a GoFundMe campaign and the response was overwhelming.  In a somewhat ironic twist, Gabby herself wound up in the hospital last year becoming one of the very kids she has tried so hard to help.  After months of recovery, I am happy to say that Gabby is doing extremely well.  Gabby remains very committed to this cause, and is again raising money for the 2018 holiday season.

 

If you are able to take a few minutes to check out her campaign, I would appreciate it.  I think what she’s doing is really incredible, even more so after our experience last year.  Gabby’s Gifts 2018 can be found here.

 

Thanks in advance for taking the time to check out her campaign and for sharing.

 

 

Blogging with WordPress again, is this thing still on?

It’s been a while since I’ve posted a blog entry on WordPress, and so I have no idea if this thing is even still working or if there are updates somewhere that I still need to install.  For the past while, I’ve been experimenting with a service called Micro.blog, and while I love Micro.blog, it has the current drawback that posts can’t be categorized.  Anyway, I won’t write much here as for all I know I’m blogging to myself, but if things are working, I’ll start blogging something more useful soon. 🙂

 

 

How Microsoft’s accessible OneNote helps me to manage a medical crisis

On April 25, our daughter, Gabrielle, was rushed to the hospital by ambulance after having a breathing episode.  Gabrielle (Gabby) has a condition which unfortunately causes her to have many such episodes, however, this time was different as she had a seizure and lost consciousness, twice.  To say that the weeks since have been a nightmare would be a huge understatement.

 

In addition to all the emotional stuff, the sheer volume of incoming information soon became overwhelming.  Multiple doctors conducting multiple tests, prescribing multiple medications, making multiple changes to her diet, proposing multiple theories as to what might be going on with her.  Our focus needed to be on Gabby and on the situation, and yet we also needed to do our best to stay on top of the ever-growing pile of information if we were to have any hope of making informed decisions about her care.  How to manage it all?

 

Information was coming in all sorts of formats.  “Call me with any questions,” said many of the doctors while handing me their printed cards.  “Here’s a bunch of articles I’ve printed out for you to read,” said others.  My own frantic research attempts were turning up links and information at a staggering rate.  And of course there were the actual meetings with her medical team that required me to write stuff down very quickly and without much time to prepare.  I have a plethora of scanning and note-taking apps, but I really needed everything centralized in one place.  Not only that, but I needed to make sure my wife and I could share information back and forth without giving any thought to the actual logistics of making that happen.

 

I’ve been a huge OneNote fan ever since learning of the Microsoft Accessibility team’s efforts to make it accessible.  I use OneNote primarily for work, but also use it to keep track of various things going on in my personal life.  Still, I’ve always had the luxury of knowing that if OneNote failed me, I could use a backup option, and while it might be less convenient, I could certainly make do.  Within hours, I no longer felt like I had that luxury: I needed a system that would work for more than just me.  I needed a system that would be dependable.  I needed a system that would allow me to use my phone, my computer, or anything my wife, Jenn, might have access to from the hospital.  OneNote met all those requirements, but its accessibility is relatively new; should I really trust it for something like this?

 

Dealing with all the print.

 

Microsoft makes a product called Office Lens which allows a photo to be taken of a printed page.  The text in that photo can then be recognized using optical character recognition and the results read aloud.  One of the really awesome things about Office Lens, at least on iOS, is that I get spoken feedback when positioning the camera.  I can also send the original image, along with the recognized text version, to OneNote.  Whenever given something in print, whether a sheet of paper or a business card, I tried to immediately capture it using Office Lens.  Being wired on caffeine and adrenaline, I’m amazed I was able to hold my phone’s camera steady enough to capture anything, but Office Lens talked me through the positioning and, for the most part, it worked great.  Certainly I didn’t get 100% accuracy, but I got names and numbers and enough text to get the gist.  Microsoft also makes a version of Office Lens for Windows 10, which I was very excited about until I realized it wouldn’t let me use my flatbed scanner; apparently, like the mobile versions, it’s really designed to use a camera.  I found a work-around by scanning pages using an alternative app and importing the images into Office Lens, but maybe someone out there knows of a better way?  During this past CSUN, Microsoft demonstrated the ability to scan a document using their Surface Pro; I may need to add this thing to my Christmas list if it really works.

 

Quickly writing stuff down.

 

I don’t know how many times I’ve heard the saying “there’s never a pen around when you need one,” but it’s true.  No matter how prepared I think I am to write something down, it almost never fails that someone has information for me when I’m in the most inconvenient place to receive it.  One great aspect of OneNote is that there are numerous ways to quickly take down information.  On iOS, there’s a OneNote widget that allows me quick access from any screen.  I can pull down the notification center, swipe right, and my OneNote widget is the first widget on my list.  I simply select the type of note I wish to write (text, photo, or list) and get a blank page for writing.  I have the option of titling my page, although if I’m in a hurry, I’ve found it easier to just write whatever it is down and title the page later.  If I’m not in a position to type, or if there’s simply too much information, OneNote gives me the option to attach a voice recording to a note.

 

If I’m at my computer, I have a really great option for taking a quick note: the OneNote desktop app, which is bundled as part of Office, has a feature called Quick Note.  From anywhere, I simply press Windows+N and I’m placed in the title of a new blank page.  I can write a title or add one later; most important, I’m at a place where I can just start writing.  When I close the window, my note is saved and I’m returned to wherever I was when I pressed Windows+N.  This makes it possible for me to take down a note literally at a moment’s notice; I don’t even have to cycle through open windows, which is great since I generally have a ton of those open at any given time.  My only gripe is that OneNote stores these quick notes in their own notebook and I have to move them to the correct place later.  I’m hopeful there’s a setting somewhere which will allow me to configure this behavior, but if not, I consider it a very small price to pay for an ultra-convenient way to take a quick note.

 

Managing Gabby’s home care.

 

While Gabby still has a long medical journey ahead, she is stable and is able to be home with medication, monitors and other supports in place.  Coordinating which medications she needs to take and when, in addition to tracking other aspects of her condition, is again something we’re managing to accomplish with OneNote.  First, we created a to-do list of all her medications to use as a sort of template.  We then copied this list, renaming each copy to a corresponding date.  In this way, we can keep track, day-to-day, of which medications have been taken and which remain; no back-and-forth between Jenn and me around whether Gabby has taken a specific medication or not.  There are a few drawbacks to this system, most notably that if any of her medications change, we’ll need to delete and re-create all the future pages in her journal section.  There are certainly other to-do apps that could help us more easily manage recurring to-dos like this, but by using OneNote, we’re able to keep all her information centralized and synchronized.  In addition, using OneNote makes it easy for us to track events such as breathing episodes and other real-time observations which we could not properly capture in a to-do app.  As we continue to work toward figuring out the best next step for Gabby, we have a central place to compile research.  Also, as medical bills and insurance claim determinations start arriving by mail (amazing how fast that happens), we have a way to organize that as well.
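
For the more technically inclined: we copy the template by hand in the app, but this kind of daily page generation could, in principle, be scripted against the Microsoft Graph OneNote API instead.  Here’s a rough sketch; the access token, section ID and medication list are placeholders, and OneNote renders paragraphs tagged data-tag="to-do" as checkable items:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.time.LocalDate;

    public class MedicationPages {
        // Placeholders: a valid Microsoft Graph access token and the id of the
        // OneNote section that holds the daily journal pages.
        static final String TOKEN = "<access-token>";
        static final String SECTION_ID = "<section-id>";

        public static void main(String[] args) throws Exception {
            String[] meds = { "Medication A, 8am", "Medication B, noon", "Medication C, 8pm" };
            HttpClient client = HttpClient.newHttpClient();

            // Create one page per day for the coming week, each titled with its date.
            for (int day = 0; day < 7; day++) {
                LocalDate date = LocalDate.now().plusDays(day);
                StringBuilder html = new StringBuilder();
                html.append("<!DOCTYPE html><html><head><title>")
                    .append(date).append("</title></head><body>");
                for (String med : meds) {
                    // data-tag="to-do" becomes a OneNote checkbox on the page.
                    html.append("<p data-tag=\"to-do\">").append(med).append("</p>");
                }
                html.append("</body></html>");

                HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://graph.microsoft.com/v1.0/me/onenote/sections/"
                            + SECTION_ID + "/pages"))
                    .header("Authorization", "Bearer " + TOKEN)
                    .header("Content-Type", "application/xhtml+xml")
                    .POST(HttpRequest.BodyPublishers.ofString(html.toString()))
                    .build();
                client.send(request, HttpResponse.BodyHandlers.ofString());
            }
        }
    }

If a medication ever changes, re-running something like this against the future dates would beat deleting and re-creating all those pages by hand.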

 

Problems and challenges.

 

I don’t regret my decision to use OneNote to help me manage these past few weeks, not even a little.  That said, I have encountered some challenges and feel they’re worth mentioning.  To be fair, I see that OneNote for iOS actually has an update today, so some of these may no longer exist.

 

On the iOS app, when using a Bluetooth keyboard, editing text doesn’t seem to work as expected.  Specifically, when I arrow around, sometimes I find myself on a different line, sometimes on a different word, and commands to move word by word don’t seem to work as I think they should.  My stopgap solution has been to simply not edit on iOS; I hit the asterisk ‘*’ key a few times to mark that there’s a problem, hit enter and just keep on typing.  While editing would be great on iOS, and maybe it’s just me who’s doing something wrong, my primary interest is in capturing the information, knowing that I can always clean it up and edit it later on my PC.  When using Braille Screen Input, my preferred method of typing on iOS, I sometimes need to double tap the text area even though the keyboard is visible.  I’m not sure why this is the case, but it’s an easy fix to a strange problem.

 

On the PC side, working with the Windows 10 OneNote application is far easier than working with the OneNote desktop application provided as part of Office.  That said, the Quick Note functionality is only available in the Office version, not the Windows 10 app version.  For the most part this doesn’t cause any problems; it’s just a little confusing: if you want to use Quick Notes, you have to make sure the Office version of OneNote is installed, even if, like me, you don’t use it for anything else.

My other frustration with the Quick Notes functionality of the Office app, as mentioned above, is that I can’t seem to change where it actually puts my quick notes.  I want them in the cloud, within a specific notebook, and Office wants them on my local machine, in a different notebook.  Fortunately, it’s very easy to move notes from one place to another; it’s just one more thing I need to remember to do, and if I forget, those notes won’t be synchronized to my phone and to Jenn.

Currently, in the Windows 10 OneNote app, I cannot figure out how to check items off the to-do lists.  I can read the lists just fine, but can’t tell what’s checked and what isn’t.  My solution for this is to simply use iOS for now when checking off Gabby’s medication.

 

Office Lens has got to be one of the coolest apps ever, especially on iOS where it provides fantastic guidance for positioning the camera.  On Windows, Office Lens seems very accessible, although I haven’t figured out how to make it work with my flatbed scanner.  I don’t know if there’s a way to fix this, or if I need to find another way to import scanned images into the Windows 10 OneNote app such that text within the image is recognized.

 

Summary

 

Throughout my life I’ve done many things to prepare for all sorts of emergencies, starting as far back as fire drills in elementary school, but I’ve never given a great deal of thought to what, for now, I’ll call informational preparedness.  The following are a few questions you may wish to consider, since having the answers now, when they’re not needed, is much better than not having them later, when they might be.

  • If I were in a situation where I needed to write something down, right now, how would I do it?
  • Am I dependent on one device?  Put another way, if I drop my phone or laptop and it smashes, what does that mean for the information that’s important to me?
  • Do I have the contact numbers for friends, family, doctors, transportation services and any others I might need, and can I access them quickly?  Do I have these on more than one device and do I know how to access them wherever they are?
  • Do I have a way to share information with someone else in a way that makes sense to me and them? Who might that someone else be and have we discussed this specifically?
  • How do I handle materials that are in a format inaccessible to me in an urgent situation? It might be fine for my neighbor to help me read my mail, but they may not be available to me all day, every day.
  • Does my doctor/pharmacy/healthcare provider have a way to send me information in a more accessible format? Many places are using online systems similar to MyChart, but getting that set up when it’s actually needed is not fun; it’s really not.

I’m sure there are many other questions that should be asked, but the above list should be a good starting point.  Let’s keep the conversation going: if you have others, put them in the comments and I can add them to the list.

 

Finally, I want to thank the OneNote team and countless others who have been working to make technology accessible.  Technology is truly an equalizer in ways that, even as a member of the accessibility field, continue to amaze me and I couldn’t be more appreciative.

 

My Frustrations with Android Notifications

Notifications:  They tell me when I’ve missed a call, gotten an Email, received a text message and so much more.  Notifications have become a critical part of how I work and play, and without them, I sometimes wonder if I’d know where to begin.

 

The Lock Screen

On iOS, the first place I’m likely to encounter notifications is my lock screen.  Quite simply, when I wake my phone, notifications show on my lock screen in the order received, oldest to newest.  So, when I wake up in the morning, or come out of a meeting and grab my phone, I can quickly skim through whatever notifications I’ve missed.  On Android, the experience is very different.  My lock screen shows notifications; however, they do not seem to be ordered in any particular way.  For example, looking at my lock screen right now, I see a Facebook notification that came in an hour ago followed by a Skype notification telling me about a message I received three minutes ago.  Next to both of these notifications, I have an “expand” button which, if activated, will show me additional notifications from that application.  Put another way, the notifications seem to be grouped, even if the groups themselves don’t seem ordered in any particular way.  On the one hand, this grouping thing is kind of neat as I can quickly see the apps that have sent me notifications and, if I’m interested in the particulars of any, I can expand them.  The problem is that this too doesn’t seem standardized between applications: some applications group notifications as just described, others don’t.  In addition, some applications have a specific button that says “expand” to which I can swipe, and others require me to tap on the notification itself and go on faith that it will expand to show additional content.  Others say “dismissable,” although I haven’t figured out how to actually dismiss them.  Much as I like the concept of grouped notifications, the inconsistencies I’ve observed so far make it more confusing than anything else.

One cool thing that Android seems to have on the lock screen, though, is this thing I’m calling the notification summary bar.  If I explore by touch, moving upward from the bottom of the lock screen, I encounter a line that, when touched, reads a number followed by a detailed listing of all my notifications.  I’m not sure what this looks like visually as there’s just no way all the content that gets read aloud would fit on the lock screen, let alone a single line.  Still, it’s a good way to quickly get an overview of all notifications.

 

Notification Center and the Notification Shade

Both iOS and Android have a way to display notifications once the device is unlocked; iOS calls this the Notification Center and Android (at least TalkBack) calls this the Notification Shade.  On iOS, the Notification Center is opened by using a three-finger swipe down gesture from the top status bar.  On Android, there are two ways to access the Notification Shade: either a TalkBack-specific swipe right then down gesture, or a two-finger swipe down from the top.  I’m improving; in the beginning, however, it was a bit challenging for me to perform either of these gestures reliably.  When the Notification Shade is activated, I first encounter the time, followed by my Wi-Fi status and a control to disable Wi-Fi, then my cellular signal status, then my battery status, then my Bluetooth status, then my screen orientation, and then my notifications.  While this is quite a bit to have to go through, having a sort of quick control center easily available is neat.  As with the lock screen, notifications are grouped, or at least they attempt to be, and as on the lock screen, the grouping doesn’t seem consistent.  On the shade, I have a Gmail notification that says “nine more notifications inside.”  Other notifications, though, don’t tell me how much additional content they may or may not include, and I only know they are expandable because they are followed by a button that says “expand.”  This button isn’t programmatically associated with the notification, though, so unless I swipe through the shade, I’m not sure which notifications are associated with buttons to expand additional content.  The Notification Shade also contains a few details that don’t appear on my lock screen: one is my local weather and another is an Android notification advising me that I can enable the ability to unlock my phone with my voice.  While it doesn’t really bother me, the weather appearing here is a bit incongruous with the other types of notifications present.

At the very end of the Notification Shade is an unlabeled button which I’ve discovered is a clear-all-notifications button of some sort.  I know it’s possible to clear all notifications on iOS if using an iDevice with 3D Touch; however, this seemingly simple and logical feature has existed on Android for a long time now, and it could almost be fantastic.  I say almost because, when I activate this button, my phone starts going crazy and counting down messages while playing a notification tone, “82 messages 81 messages 80 messages 79 messages 78 messages …” and a tone for each one.  I’ve discovered that if I lock my screen at this point, the countdown proceeds much faster, probably because TalkBack isn’t trying to read the number of messages.  I really have no idea why this is happening, but while the clear-all-notifications feature is a good one, I definitely hesitate before using it.
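
Stepping back to those grouping inconsistencies for a moment: as best I can tell, at least part of this comes down to the fact that notification grouping is something each app’s developer has to opt into.  For any developers reading along, here’s a minimal sketch of what that opt-in looks like; the group key, channel name and class are my own hypothetical examples, and this assumes the AndroidX compatibility APIs:

    import android.app.Notification;
    import android.app.NotificationManager;
    import android.content.Context;
    import androidx.core.app.NotificationCompat;

    public class GroupedNotifications {
        // Hypothetical group key; notifications that share it get bundled together.
        // Apps that never call setGroup() post standalone notifications, which is
        // likely why some apps on the shade expand and others don't.
        private static final String GROUP_KEY = "com.example.MESSAGES";

        public static void post(Context context, int id, String text) {
            NotificationManager manager =
                    (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);

            // An individual notification, tagged as a member of the group.
            // "messages" is a hypothetical notification channel, assumed created elsewhere.
            Notification message = new NotificationCompat.Builder(context, "messages")
                    .setSmallIcon(android.R.drawable.ic_dialog_email)
                    .setContentTitle("New message")
                    .setContentText(text)
                    .setGroup(GROUP_KEY)
                    .build();

            // A single summary notification represents the collapsed group on the
            // lock screen and shade; without it, older Android versions don't group.
            Notification summary = new NotificationCompat.Builder(context, "messages")
                    .setSmallIcon(android.R.drawable.ic_dialog_email)
                    .setGroup(GROUP_KEY)
                    .setGroupSummary(true)
                    .build();

            manager.notify(id, message);
            manager.notify(0, summary); // fixed id, so the summary is reused each time
        }
    }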

 

Sounds, vibrations and other observations

One of the more baffling things I’ve noticed about notification sounds on Android is that, at least on the devices I’ve tried, they always play through both the headphones (assuming headphones are plugged in) and the phone’s speaker.  So, let’s say I’m in a meeting and I decide to have a text conversation with someone (strictly a hypothetical situation, in case my boss happens to be reading this blog post 🙂).  I plug headphones in and send a text.  When I receive an answer, though, the notification sound is played through both the headphones and the phone’s speaker.  I can set my notification alerts to vibrate only and solve this problem, but it still strikes me as odd that I can’t make notification sounds play strictly through headphones.  Conversely, if I’m on a call, phone/Skype/WebEx/other, I don’t hear any notification sounds at all.  Presumably the thinking here is that I wouldn’t want my call interrupted with additional sounds being played; however, I find those notification sounds very helpful for determining which notification I just received.  If I get a notification while on a call, indicated by a vibration, the only thing I can do is open the Notification Shade and hope that the most recent notification is on top, or at least not grouped with other notifications.  In reality, this has proven extremely problematic for me, almost to the point of being a complete deal breaker.  Part of the reason this doesn’t work as smoothly as it possibly could is that TalkBack forces me to make a very difficult choice: whether notifications should be read aloud when the phone’s screen is locked.  If I enable this feature, all my notifications get read aloud when the screen is locked, including sensitive content such as text messages, Hangouts conversations and so forth.  If I disable this feature, TalkBack stays quiet when notifications appear on the lock screen; however, as the screen automatically locks after a short period of time when on a call, this means nothing gets read, which isn’t helpful since I don’t get the sounds in the call scenario either.

But let’s push that entire mess to the side for just a moment and talk a little about notification sounds themselves.  One of the really cool things about Android is that many apps allow their notification sound to be customized.  This means that unlike on iOS, where many applications use the default tri-tone notification sound, Android applications can allow the user to pick from whatever text/notification sounds exist on the device.  This is one feature I absolutely love, or at least I would, if Android would stop resetting certain sounds to the default.  For example, I configured my device to play one notification sound when receiving work Email and another sound when receiving personal Email.  That worked fantastically for three days or so, but now I’m getting the default notification sound regardless of whether Email is received on my work or personal account.  Other apps which have unique notification sounds on iOS don’t seem to have any unique sounds on Android; either that, or they do have the same unique sounds, but the default notification sound is being played for reasons I can’t explain.  For example, there’s an accessible dice game called Dice World which has a notification sound of dice rolling when an opponent has played their turn.  Initially, this sound would play just fine on my device, but now I just get the standard notification sound and don’t seem able to change it.
Quick side note: yes, I do have the “play notification sound” setting enabled in Dice World.  It’s the same situation with Tweetings, a very powerful Twitter client that has built-in notification sounds that initially played, but which now no longer do.  The point here is that the ability to customize notification sounds is extremely powerful, but I’m not sure how stable it is.  In addition, not all apps allow notification sounds to be customized in the first place.
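
For context, the sound that plays for a given notification is, on the app side, whatever the developer attaches to it; newer Android versions move this onto notification channels, where the user can override it per category in system settings.  Here’s a rough sketch of the older pattern; the class, resource name and wording are all hypothetical:

    import android.app.Notification;
    import android.app.NotificationManager;
    import android.content.Context;
    import android.net.Uri;
    import androidx.core.app.NotificationCompat;

    public class TurnAlert {
        // Hypothetical: a dice-roll sound shipped with the app as res/raw/dice_roll.
        public static void notifyTurn(Context context, String opponent) {
            Uri sound = Uri.parse("android.resource://" + context.getPackageName()
                    + "/raw/dice_roll");

            Notification notification = new NotificationCompat.Builder(context)
                    .setSmallIcon(android.R.drawable.ic_dialog_info)
                    .setContentTitle("Your turn")
                    .setContentText(opponent + " has played")
                    .setSound(sound) // app-supplied sound; without this, the system default plays
                    .build();

            ((NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE))
                    .notify(1, notification);
        }
    }

If an app ever posts a notification without that setSound() call, or recreates its sound configuration, the system default plays again, which may be part of what I’m seeing.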

 

As I wrap up this blog post, I’m left with the feeling that I’m barely scratching the surface of Android notifications.  I say this because I’ve gotten feedback on Twitter and elsewhere that others are not having the same experiences as me.  For example, some people claim to have an edit button on their Notification Shade which allows them to specify how notifications get sorted, while others do not.  I’m also not sure if anyone else is experiencing the same inconsistencies as me with regard to notification sound preferences resetting themselves to default.  In the end though, I remain confident that I can find workable solutions to these challenges; how difficult those solutions may be to implement remains to be seen.

 

 

While all men may have been created equal, all Android devices are not.

I wanted to write about something which Android folks probably take for granted, but which might be a bit perplexing to users coming from iOS.  It’s one of those things that, as I write it, seems incredibly silly, yet it really isn’t.  In fact, it’s one of those twists of irony that make Android an attractive option in the first place.  First, some background.

 

As I’ve written previously, I’ve been learning on a Motorola G4 Play, which I picked up on Amazon Prime for around $149.  I really love this particular device: it’s small, it’s lightweight, texture-wise it’s easy to grip, and you just can’t beat the cost.  Still, one thing that’s been frustrating me is that occasionally, when I double tap something, the phone doesn’t register a double tap.  In addition, I find that sometimes I’ll swipe my finger from left to right and yet the phone will act as if I hadn’t swiped at all.  I was complaining about this to a friend of mine, David, one night.

 

“Gosh,” I complained, “how can anyone take Android seriously when it can’t even reliably recognize a swipe gesture?”

 

After a bit of a pause, David cautiously replied, “Do you think it could possibly be your hardware?”

 

Honestly, in my frustration, I hadn’t even considered the hardware angle and how that might have a real functional impact on things like gesture recognition.  But it stands to reason that the hardware differences between my $149 Moto G4 Play and David’s $649 Pixel just might be a factor somehow.

 

Going down the rabbit hole

 

I wanted to get an idea of just how much of a difference different hardware configurations might make, especially in terms of device accessibility.  Obviously devices with faster processors and more RAM are going to perform at greater speeds, but what about touch screen sensitivity, ROM customizations and anything else that might impact accessibility?

 

I started out by purchasing an Asus ZenFone 3 Laser because not only is its metal construction extremely solid, but, well, it just has a really awesome name.  OK, all that is true, but I really bought it because it has a larger display, more RAM and a slightly faster processor than my Moto G4 Play.  After getting through the setup process, I was introduced to what Asus calls ZenUI 3.0.  ZenUI is basically the Asus customization of Android, including a number of applications and widgets, a redesigned home screen, a customized dialer, custom sound effects for things like locking and unlocking the screen and notifications, and other tweaks to make their phone unique.  Coming from iOS, the idea that the entire home screen, notifications and even the phone dialer can be customized is very unsettling.  After all, if I talk to another iPhone user, I can walk them through how to place a call because I do it exactly the same way on my own device.  The Asus customizations, however, were so significant that I was unable to figure out how to access my menu of applications.  I want to be clear here: I’m not saying that finding the application menu is necessarily inaccessible, it just wasn’t at all intuitive and definitely wasn’t the same experience as on my Moto G4 Play.  What I soon learned, though, is that Android allows for the installation of what are known as Launchers.  My understanding thus far is that Launchers basically define things like the home screen layout.  After installing the Google Now Launcher, which is apparently installed by default on my G4 Play, my application menu appeared where I was expecting it and some of the other random dialogs that had started popping up simply went away.  In the end, I experienced frustrations similar to those I had been facing with the G4 Play, with the additional frustration of figuring out how to get my home screen and other aspects into a state where I could use them.  As awesome as its name is, the ZenFone soon found its way back to the store.

 

Next up, I purchased a Blu 5R, which is also a solidly built phone; yeah, I tend to gravitate toward phones that are solid, heavy and feel like they won’t fall apart at the drop of a hat.  As with the Asus model that I returned, the Blu phone has a larger screen and slightly better specs than my G4 Play.  While the Blu had its share of customizations, such as rather cute startup and shutdown sounds and a number of pre-installed applications, my experience was a very positive one.  Although not perfect, I experienced fewer issues with gesture recognition, I loved the fingerprint sensor (the G4 Play doesn’t have one), and the speaker, once I realized it was initially covered over by a sticker, is really fantastic.  If anyone reading this is looking for a budget entry-level phone, the Blu 5R should definitely be considered.  I wound up returning mine, but only because I couldn’t justify keeping it given that I already own the G4 Play.

 

And so it was with great anticipation that I awaited the arrival of my latest phone from somewhere in China, the OnePlus 3T.  I’d never heard of the company, OnePlus, but they are a startup specializing in high-end, high-performance devices at mid-level prices.  The specifications of the OnePlus 3T rival those of the Nexus at just over half the price, and the reviews are fantastic.  If I decide to seriously make the switch to Android, the 3T, with its super-fast battery charging capability, 6 GB of RAM, convenient slider to quickly enable do-not-disturb and amazing form factor, is a device I could see myself using day-to-day.  More importantly, though, gestures are definitely recognized, accurately and consistently.

 

What have I learned?

 

I’ve actually learned a lot over these past few weeks, beyond the fact that my local electronics shop has a really great return policy.  First, when purchasing a new Android device or when seeking assistance, it’s important to remember that Android devices can be different, sometimes vastly so.  If you’re coming from iOS, this is extremely important because, for the most part, iOS devices and how they operate are pretty similar across the board.  Another thing I learned is that when an Android user says, “hmm, I’m not experiencing that issue,” it could really mean that, given their specific hardware and software, which may be different from yours, they’re really not experiencing the same issue you may be experiencing.  It’s been my experience that sometimes, when an iOS user says they’re not experiencing an issue, it’s meant as a mild rebuke: something along the lines of, “I’ve got the same hardware as you, I’ve got the same software as you, it’s working fine on my end, clearly it’s a problem on yours.”  Looking over this paragraph, I realize I’m overusing the word “experience,” but in a way, that’s exactly what we’re talking about here.  One of the very things that makes Android such an attractive option is the flexibility to customize just about every part of the experience.  This comes at a cost, though, the cost being fragmentation between what I might experience and what you might experience.

 

Getting started with my Moto G4: pretty straightforward, until it wasn’t.

One of the things that I’m most excited about with Android is that now, it’s possible to activate TalkBack when the device is first booted.  While folks today may take this for granted, this ability to be off and running right out of the box wasn’t always the case.  Back in the day, with my first Android device, I had to get sighted help to walk me through the initial setup and then had to have them help me get TalkBack from the Google store since it didn’t even come pre-installed.  So yes, I’m very excited that I can order an Android device, have it arrive and, without assistance, get the thing talking.

Hardware description

The Moto G4 Play is a very small device which feels like it’s made entirely out of plastic.  The device is very thin, thinner than my iPhone.  While the device appears to be all plastic, it also feels rugged.  On the right side of the device are two controls, the topmost being the sleep/power button, which contains tactile markings.  Below that is another, somewhat long button which serves as the volume control.  The bottom edge of the device contains a micro-USB port for charging.  There are no controls at all on the left side of the device.  The top of the device contains a headphone jack.  Quick side note here: it’s actually taken me a while to get used to the headphone jack being on top of the device because for many years, Apple has positioned theirs on the bottom.  Anyway, this brings us to the front of the device, which has an earpiece on top followed by the touch screen.  At the bottom of the touch screen is a tiny little hole.  I actually initially thought the screen was chipped or something; apparently, this hole is for the microphone.  Flipping the device over, the back is pretty nondescript except for the camera, which does protrude a little.  The entire back of the device can actually be peeled away, revealing a removable battery, a SIM slot and a micro-SD card slot.  To clarify, when I say that the back can be “peeled away,” I mean it quite literally.  Using a fingernail or something extremely thin, the back can literally be pried off the device.  The plastic actually bends when doing this, and I confess that when I did this for the first time, I was totally sure the device would not go back together again.  Having done this multiple times since, I continue to be amazed that the back does snap back into place with no ill effects other than my own slightly elevated heartbeat.

Getting this thing up and talking

One of the really neat things about many Android devices is that when they’re booted, the user gets a small vibration to indicate that something is happening.  Personally, I love this added bit of confirmation as I don’t have enough light perception to tell if the screen is on or not.  Anyway, after waiting a little while, I placed and held two fingers on the screen; this is the shortcut I was told would activate TalkBack.  And sure enough, after a few seconds, it did!  TalkBack helpfully launched a tutorial to help me learn its gestures and become more familiar with how I could navigate different types of objects on the screen such as edit boxes, lists, multi-page scrolling and so on.  The tutorial is broken up into small lessons, each of which contains exercises that can be performed to make sure the user understands what’s going on; it is in one of these lessons that I encountered my first real problem.

A bit about TalkBack gestures

One of the things the initial TalkBack tutorial does is acquaint the user with the basic TalkBack gestures.  Coming from the perspective of an iOS user, some of these gestures seem weirdly complicated, like the swipe up then right to open the global TalkBack menu, or the swipe right then down to open the notification shade.  Still, my goal here is to learn, and so I followed the directions given in the tutorial.  Eventually, I got to an exercise which focused on changing the reading level, the reading level meaning whether TalkBack should read line by line, word by word, character by character … you get the idea.  “Swipe up then swipe down,” said the tutorial, but this didn’t accomplish anything.  Well, that’s not entirely true; what it did accomplish was getting TalkBack to read random things on the screen.  I tried this multiple times and, try though I might, I could not get the reading level to change from its default.  Was I doing something wrong?  Maybe I wasn’t swiping up and then down straight enough?  Maybe a swipe means something slightly different in the Android world than it does in the iOS world?  The other thing I found myself doing at this point was slightly tilting the device.  This had the effect of causing the screen to change orientation between portrait and landscape modes.  Not a problem, except in this instance half of my up/down swipes were probably being interpreted as left/right swipes because of the change in screen orientation.  Eventually, I got frustrated enough to press the “next” button and continue on with the tutorial.  The next lesson tried to teach me about cursor movement, but guess what?  Yep, the cursor can also be moved by characters/words/lines/… and yep, I couldn’t make that work either.

Surviving the tutorial and beyond

Eventually, I “nexted” my way through the rest of the tutorial and reached the “finish” button.  Setup continued at this point, and I was able to log into my Google account, answer questions about syncing and location sharing, and eventually got a screen that told me I was ready to go.  Ready to go, but where?  I’m going to cover this next topic in more detail in a separate post, but the thing I discovered is that the concept of a home screen as it exists in iOS is not as cut-and-dried in Android.  I’ve learned that many “Launchers” exist for Android and that, depending on which came bundled with the device, or which the user may have downloaded, one device’s home screen may not look at all like another’s.  I’ve had my device for a few weeks now and still cannot figure out any logic to its home screen.  What I have figured out, though, is that along the bottom edge of the touch screen, there are a few virtual buttons.  From left to right these buttons are “back,” “home,” and, to the far right, “overview.”  Just above the “home” button is an “applications” button which takes me into an alphabetized grid view of the applications installed on my device.  Maybe there’s a more efficient way to access apps, especially those I frequently use, but for now, I rely on this alphabetized grid to find just about everything.  I figure: if it’s not in the grid, I probably don’t need it, right?

Quick conclusions

  • I really love that I can enable TalkBack right out of the box.  This means that I can impulsively order a $149 phone from Amazon Prime Now, have it delivered and start using it right out of the box.
  • TalkBack has a really neat tutorial that helps me learn in a very structured way.  Each lesson provides detail on specific TalkBack-related tasks I may wish to accomplish and then provides me with exercises so that I can practice accomplishing them.
  • I have no idea at this point how to change my reading level so that I can have TalkBack read word by word or line by line, I’m just stuck with the default for now.  The big problem I have with this is, well, the tutorial was pretty specific on how I should change from level to level and it just didn’t work.  Quick note: I’ve since solved this problem with a number of updates, I’ll get to that in a future post.
  • I have a home screen that doesn’t make sense to me.  I’ve shown it to sighted folks who always respond the same way: “huh, that doesn’t look like mine.”  Research has told me that I can change this by installing something called a Launcher, but I don’t know where to even begin with that yet.  In the meantime, I have this very nicely alphabetized grid which, while not super efficient, does help me find everything I need to find.

More to come soon, so please stay tuned. 🙂

 

Rediscovering Android, my journey begins

While I’ve been an iOS user for many years, I’ve always been very curious about Android.  Indeed, the openness of the platform and all that that entails speaks to my inner geek.  I first experimented with Android back in 2011, when Android 2.1 was all the rage, because way back then it was still possible to get phones like the T-Mobile G2 with physical keyboards, and, as a serious texting and social media junkie, this appealed to me.  Eventually I drifted back to iOS until a few years later, when I traded in my iPhone for a Samsung Galaxy S III.  The Galaxy served me well until Apple released Siri, which I just had to have, and so back to iOS I came.

 

Over the years, there’s been quite a bit of innovation happening in the Android accessibility space, and while I do my best to keep up with it all, it’s hard to really understand it if I’m not using it.  That said, iOS is working just fine for me, so it’s hard to justify the cost of a high-end Android device like Google’s Pixel, even though it admittedly does look pretty darn cool.  And so it came to pass that one day, while researching Motorola cable modems on Amazon Prime Now, I came across the Moto G4 Play, which retails for around $149.  Um… a quick note to my more juvenile readers: this model is the letter G followed by the number 4 followed by the word Play, not the “foreplay” you immediately thought of when your speech synthesizer read this to you. 🙂  Anyway, I realize that at this price point, this phone won’t be as capable as higher-end models, but for someone like me who is generally curious, wanting to see what’s out there, I thought this model might be a good place to start.

 

What I thought I’d do is write a series of articles chronicling my rediscovery of Android, so that others who might be thinking of giving it a try will have an idea of the types of things they might encounter.  I’m not looking to convince anyone that Android’s better than iOS or vice versa.  Also, as I write, keep in mind that I too am learning and discovering, so if you find I’m doing something wrong, or if I should be doing something a different way, please don’t hesitate to comment.

 

I’m excited to be rediscovering Android and hope you enjoy the journey with me.

 

Gabby’s Gifts 2016, helping kids at the children’s hospital who can’t come home over the holidays.

Since she was little, my daughter, Gabby, has had a medical condition that has required occasional testing at the children’s hospital.  One of the more positive things to come out of this is that she has developed a deep sensitivity for children who, unlike herself, don’t get to leave the hospital and spend the holidays at home.  At age nine, Gabby decided to start a program to get gifts for children who must remain at the hospital.  These gifts are purchased from a list provided by the hospital based on things the children have asked for which do not pose any medical or other risks.  Gabby delivers these gifts to the hospital where they are sterilized, wrapped and delivered to the children by hospital staff.

 

This year, I’m helping Gabby take her Gabby’s Gifts initiative digital.  I’ve set up a GoFundMe campaign for those who may wish to donate to her online.  Below, find Gabby’s own description of her initiative.  If you are willing to donate but encounter challenges with GoFundMe, please reach out and we can figure out another way to make it happen such as PayPal, Square Cash, or good old-fashioned paper check.

 

In advance, thanks so much for reading, considering and sharing.

 

Dear Friends,

Christmas time is here. This is the season of caring, sharing, family, and fun. Unfortunately for many children this year Christmas will be another day in which they have to be in the hospital away from their friends and family. In 2012, I realized that I was very blessed to have so much in my life and I wanted to give something back. I decided to try and make Christmas a little brighter for those children who, by no choice of their own, can’t be with their family and friends at Christmas. There are a lot of children that have to spend the holiday season at the hospital which is very sad, hence Gabby’s Gifts was born. Every year I collect money or gifts to donate to the University of Minnesota Children’s hospital. All of the money collected is used to purchase gifts for the children such as toys, slippers, pajamas, and other things that the kids have asked for.

If you would like to donate to Gabby’s Gifts please do so before December 12. I will be bringing the donations to the Hospital on the 14th of December this year. I have also set up an online GoFundMe page for those who may wish to make an online donation: https://www.gofundme.com/gabbys-gifts-2016 .

I want to thank you in advance for considering donating and helping to make a child’s Christmas just a little brighter.

Sincerely,

Gabrielle Sawczyn

 

 

A blind person renting a car? Apparently, that idea isn’t as shocking as I once thought 

Recently, I traveled to New York where the plan was for me to connect with one of my colleagues and then travel to visit a client. Since we were arriving from different airports and since we would be needing a car, it made the most sense for us to meet up at the car rental counter. I was amused thinking of the reactions I would likely get from people as I, a blind guy, asked for directions to car rental. The reaction I got from one guy, though, really made me stop and think; he said, “Oh, you must be going to rent one of those new autonomous cars, that’s got to be so neat.” To him, the idea that a blind person might be renting a vehicle wasn’t very far-fetched at all. I casually mentioned my destination to a few other people just to see what kind of reaction I might get. Strangely enough, the only somewhat negative reaction came from a woman who was concerned that I could get hurt crossing the street that I needed to cross in order to get to car rental. My take-away from the day? There remain people skeptical that blind people can independently cross streets, but the idea that blind people could be renting cars is no longer the unbelievable concept it might once have been.