Dear kindly woman who suggested I try a lemon bar with today’s lunch, I want you to know that I consider having taken your advice to be one of the better life choices I’ve made. :)
Couldn’t be more excited about this new feature of Microsoft To-Do which finally allows lists to be shared. @ToDoHelp. While it doesn’t have all the advanced features of some of the other to-do apps out there, this one is my app of choice due to its simplicity and, of course, accessibility. I should also note that the Windows 10 app version of To-Do is also very accessible.
So I’m catching up on #TheArchers and I’m finally in 2014. It seems there’s a new actor playing Tony. Anyone know or remember why?
I love blogging, I genuinely do; however, it’s one of those things that tends to consume more time than I have. For years, I’ve been using WordPress, and I couldn’t be more grateful for all the hard work people have done to make it an incredibly accessible platform. Still, here’s how blogging tends to go for me.
- I come up with a really great idea, something that I want to tell the world about.
- I log into my WordPress blog, all fired up and ready to write.
- As soon as I log in, I see that there’s a WordPress update that, among other things, addresses some security concern or other. This seems important, so I go ahead and install it. After all, this just takes a sec, right?
- WordPress uses a number of plugins to add additional functionality and my dashboard shows that a few of these need updating as well. I figure what the heck, I’m already updating other things, I might as well do the plugins too.
- And now the theme – the look and feel of my site – has an update as well; might as well grab that while I’m here.
- Finally, everything’s updated, but wait – one of those plugins has a new feature and it wants to tell me all about it. I can have it tell me later, but maybe it’s a cool feature; I might as well read about it now.
- Wow, that really was a neat feature, can’t wait to try it out… but I came here for something… this great idea… I just don’t remember what it is anymore. Maybe it’ll come back to me tomorrow.
Simplicity is key for me because I’m easily distracted. Writing requires a lot of focus for me, so when it comes to blogging, I need something that allows me to log in, write, post, and just be done. And so I’ve started looking into other blogging platforms in the hopes that I can find one that works the way I work versus one that needs me to change the way I work best.
Microblogging is hardly a new concept; in fact, it’s the one on which Twitter is based. There’s a rather fascinating Wikipedia article on the subject, but in short, microblogging traditionally is a stream of quick status updates – Facebook and Twitter feeds are good examples. The problem comes in when I want to write something more than a quick update (e.g. something that exceeds Twitter’s 280-character limit): the answer has often been to use a second platform, one for the shorter microblog updates and the other for lengthier blog posts. The ability to cross-post may make it seem like only one platform is being used, but in the background, there are usually two, possibly even more, as there are other platforms that focus on specific media types such as images or audio.
Microblogging with micro.blog
I should take a moment to thank Josh, @Lioncourt, for turning me on to micro.blog. Josh has been talking about it for a while now, but it’s only recently that I really sat up and took notice, because it has the potential to solve many of my frustrations.
First, simplicity: Micro.blog couldn’t be easier to use – simply choose the “new post” option and start writing. If the post is less than 280 characters (Twitter’s limit), I don’t even need to title my post. This is actually quite handy as oftentimes, it’s thinking about a proper title that gets me stuck for a while. If I want to write something longer, I then get an option to add a title. Posts can be cross-posted to Twitter, Facebook, and elsewhere, and they get cross-posted in a way that makes sense for those particular social networks. Best part for me: I don’t need to install plugins to make this happen – it’s just built in and it works.
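For anyone who likes seeing a rule written out, the title behavior as I understand it boils down to a single length check. This is just a toy sketch, not Micro.blog’s actual API or code – the function name and example text are mine:

```python
# Toy sketch of the "title only for longer posts" rule as I understand it.
# Posts at or under Twitter's 280-character limit can go out title-free;
# anything longer prompts for a title. Purely illustrative, not Micro.blog's API.
TWITTER_LIMIT = 280

def needs_title(post_text: str) -> bool:
    """Return True when a post is long enough to require a title."""
    return len(post_text) > TWITTER_LIMIT

print(needs_title("Hello, Micro.blog!"))  # a short status update – no title needed
```

The nice part, of course, is that I never have to think about this myself; the editor just does the right thing as I type.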
Second, hosting and updating: I have options, including the option to not deal with any of the hosting and updating at all. For what I would consider a nominal fee, $5/month as I write this, Micro.blog will host my site for me. This gives me the freedom not to worry about hosting costs, server updates, or maintenance – it’s just all done for me. For an additional $5/month, I can add the option for Micro.blog to host audio for what they’re calling “microcasts”, or short podcasts. Given the storage requirements and other complexities, this extra $5 seems totally worth it for those who may want the feature. What if I decide to keep my content on my own blogging platform(s), but want to take advantage of the cross-posting and community aspects of Micro.blog? Micro.blog has a plan for that as well. Paying the monthly fee, though, gives me more than just the ability to get out of the frustration of self-hosting; it also helps ensure that my content remains mine. My timeline – I’ll talk about timelines in a minute – has no advertising. My content isn’t being sold, and as a customer, my opinion actually matters. You can read more about Micro.blog plans.
Third, social media and community. Micro.blog has a concept of timelines, very similar to Twitter’s: posts from people I follow show up in order. If I’m interested, I can reply to a post, which comes across to them in the form of a mention. They can then reply to me, and others can chime in, resulting in conversation. I can read the timelines of microbloggers I follow to discover others who may be of interest – think Twitter in its earlier days. I’m still exploring, but I’m impressed with the sense of community I’ve already found.
What about the accessibility of Micro.blog? So far, everything seems very accessible, with one exception – and it’s unfortunately a big one – the editor for actually writing posts. Using NVDA, I’m finding that I’m not able to track my cursor when editing. While I really hope this will eventually be fixed, I’m finding that I’m really not using Micro.blog’s editor at all. Micro.blog uses the increasingly popular Markdown standard, meaning that I can write my post in almost any editor and just copy/paste it over. Many popular apps, such as MarsEdit and Drafts 5, even have support for posting to Micro.blog’s platform. Options abound and even more are being encouraged.
I’m still looking at solutions, but so far, Micro.blog has me very impressed. Micro.blog was created by Manton Reece. You can read more about why Micro.blog was created; the article really resonates with me. @Lioncourt, thanks again for convincing me to give this a try, and @Manton, thank you for making such a cool concept a reality.
Just a second quick audio test to see if I have this correctly configured.
Oh my gosh, so excited – just received the following from @IAAPOrg:
Congratulations! We are pleased to inform you that you have passed IAAP – Certified Professional in Accessibility Core Competency (CPACC) examination.
On April 25, our daughter, Gabrielle, was rushed to the hospital by ambulance after having a breathing episode. Gabrielle (Gabby) has a condition which unfortunately causes her to have many such episodes, however, this time was different as she had a seizure and lost consciousness, twice. To say that the weeks since have been a nightmare would be a huge understatement.
In addition to all the emotional stuff, the sheer volume of incoming information soon became overwhelming. Multiple doctors conducting multiple tests, prescribing multiple medications, making multiple changes to her diet, proposing multiple theories as to what might be going on with her. Our focus needed to be on Gabby and on the situation, and yet we also needed to do our best to stay on top of the ever-growing pile of information if we were to have any hope of making informed decisions in reference to her care. How to manage it all?
Information was coming in all sorts of formats. “Call me with any questions,” said many of the doctors while handing me their printed cards. “Here’s a bunch of articles I’ve printed out for you to read,” said others. My own frantic research attempts were turning up links and information at a staggering rate. And of course there were the actual meetings with her medical team that required me to write stuff down very quickly and without much time to prepare. I have a plethora of scanning and note-taking apps, but I really needed everything centralized in one place. Not only that, but I needed to make sure my wife and I could share information back and forth without giving any thought to the actual logistics of making that happen.
I’ve been a huge OneNote fan ever since learning of the Microsoft Accessibility team’s efforts to make it accessible. I use OneNote primarily for work, but also use it to keep track of various things going on in my personal life. Still, I’ve always had the luxury of knowing that if OneNote failed me, I could use a backup option, and while it might be less convenient, I could certainly make do. Within hours, I no longer felt like I had that luxury: I needed a system that would work for more than just me. I needed a system that would be dependable. I needed a system that would allow me to use my phone, my computer, or anything my wife, Jenn, might have access to from the hospital. OneNote met all those requirements, but the accessibility of OneNote is relatively new – should I really trust it for something like this?
Dealing with all the print.
Microsoft makes a product called Office Lens which allows a photo to be taken of a printed page. The text in that photo can then be recognized using optical character recognition and the results read aloud. One of the really awesome things about Office Lens, at least on iOS, is that I get spoken feedback when positioning the camera. I can also send the original image, along with the recognized text version, to OneNote. Whenever given something in print, whether a sheet of paper or a business card, I tried to immediately capture it using Office Lens. Being wired on caffeine and adrenaline, I’m amazed I was able to hold my phone’s camera steady enough to capture anything, but Office Lens talked me through the positioning and, for the most part, it worked great. Certainly I didn’t get 100% accuracy, but I got names and numbers and enough text to get the gist. Microsoft also makes a version of Office Lens for Windows 10, which I was very excited about until I realized it wouldn’t let me use my flatbed scanner; apparently, like the mobile versions, it’s really designed to use a camera. I found a workaround by scanning pages using an alternative app and importing the images into Office Lens, but maybe someone out there knows of a better way? During this past CSUN, Microsoft demonstrated the ability to scan a document using their Surface Pro; I may need to add this thing to my Christmas list if it really works.
Quickly writing stuff down.
I don’t know how many times I’ve heard the saying “there’s never a pen around when you need one,” but it’s true. No matter how prepared I think I am to write something down, it almost never fails that someone has information for me when I’m in the most inconvenient place to receive it. One great aspect of OneNote is that there are numerous ways to quickly take down information. On iOS, there’s a OneNote widget that allows me quick access from any screen. I can pull down the notification center, swipe right, and my OneNote widget is the first widget on my list. I simply select the type of note I wish to write – text, photo, or list – and get a blank page for writing. I have the option of titling my page, although if I’m in a hurry, I’ve found it easier to just write whatever it is down and title the page later. If I’m not in a position to type, or if there’s simply too much information, OneNote gives me the option to attach a voice recording to a note.
If I’m at my computer, I have a really great option for taking a quick note: the OneNote desktop app, which is bundled as part of Office, has a feature called Quick Note. From anywhere, I simply press Windows+N and I’m placed in the title of a new blank page. I can write a title or title it later; most importantly, I’m at a place where I can just start writing. When I close the window, my note is saved and I’m returned to wherever I was when I hit Windows+N. This makes it possible for me to take down a note literally at a moment’s notice; I don’t even have to cycle through open windows, which is great since I generally have a ton of those open at any given time. My only gripe is that OneNote stores these quick notes in their own notebook and I have to move them to the correct place later. I’m hopeful there’s a setting somewhere which will allow me to configure this behavior, but if not, I consider it a very small price to pay for an ultra-convenient way to take a quick note.
Managing Gabby's home care.
While Gabby still has a long medical journey ahead, she is stable and is able to be home with medication, monitors, and other supports in place. Coordinating which medications she needs to take and when, in addition to tracking other aspects of her condition, is again something we’re managing to accomplish with OneNote. First, we created a to-do list of all her medications to use as a sort of template. We then copied this list, renaming each copy to a corresponding date. In this way, we can keep track, day-to-day, of which medications have been taken and which remain; no back-and-forth between Jenn and me around whether Gabby has taken a specific medication or not. There are a few drawbacks to this system, most notably that if any of her medications change, we’ll need to delete and re-create all the future pages in her journal section. There are certainly other to-do apps that could help us more easily manage recurring to-dos like this, but by using OneNote, we’re able to keep all her information centralized and synchronized. In addition, using OneNote makes it easy for us to track events such as breathing episodes and other real-time observations which we could not properly capture in a to-do app. As we continue to work toward figuring out the best next step for Gabby, we have a central place to compile research. Also, as medical bills and insurance claim determinations start arriving by mail (amazing how fast that happens), we have a way to organize those as well.
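For anyone curious, the copy-the-template-per-date idea above can be sketched in a few lines of code. This is plain Python with made-up medication names, not OneNote or its API – just the logic of stamping one checklist per day so each day's checkmarks stay independent:

```python
# A minimal sketch of the "template page per day" approach described above.
# The medication names and dates are made up for illustration only.
from datetime import date, timedelta

# The "template" list, analogous to our OneNote template page.
MEDICATION_TEMPLATE = [
    "Medication A - morning",
    "Medication B - noon",
    "Medication C - evening",
]

def make_daily_pages(start: date, days: int) -> dict:
    """Create one checklist per day; every item starts unchecked (False)."""
    pages = {}
    for offset in range(days):
        day = start + timedelta(days=offset)
        pages[day.isoformat()] = {item: False for item in MEDICATION_TEMPLATE}
    return pages

pages = make_daily_pages(date(2018, 5, 1), 7)
# Checking off a dose is just flipping that day's flag; other days are unaffected.
pages["2018-05-01"]["Medication A - morning"] = True
```

The drawback mentioned above shows up here too: if the template list changes, every future day's page has to be regenerated, since each one is an independent copy.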
Problems and challenges.
I don’t regret my decision to use OneNote to help me manage these past few weeks, not even a little. That said, I have encountered some challenges and feel they’re worth mentioning. To be fair, I see that OneNote for iOS actually has an update today, so some of these may no longer exist.
On the iOS app, when using a Bluetooth keyboard, editing text doesn’t seem to work as expected. Specifically, when I arrow around, sometimes I find myself on a different line, sometimes on a different word, and commands to move word by word don’t seem to work as I think they should. My stopgap solution has been to simply not edit on iOS; I hit the asterisk ‘*’ key a few times to mark that there’s a problem, hit enter, and just keep on typing. While editing would be great on iOS, and maybe it’s just me who’s doing something wrong, my primary interest is in capturing the information, knowing that I can always clean it up and edit it later on my PC. When using Braille Screen Input, my preferred method of typing on iOS, I sometimes need to double tap the text area even though the keyboard is visible. I’m not sure why this is the case, but it’s an easy fix to a strange problem.
On the PC side, working with the Windows 10 OneNote application is far easier than working with the OneNote desktop application provided as part of Office. That said, the Quick Note functionality is only available in the Office version, not the Windows 10 app version. For the most part this doesn’t cause any problems; it’s just a little confusing in that, if you want to use Quick Notes, you have to make sure the Office version of OneNote is installed even if, like me, you don’t use it for anything else.
My other frustration with the Quick Notes functionality of the Office app, as mentioned above, is that I can’t seem to change where it wants to actually put my quick notes. I want them in the cloud, within a specific notebook, and Office wants them on my local machine, in a different notebook. Fortunately, it’s very easy to move notes from one place to another; it’s just one more thing I need to remember to do, and if I forget, those notes won’t be synchronized to my phone and to Jenn.
Currently, in the Windows 10 OneNote app, I cannot figure out how to check items off the to-do lists. I can read the lists just fine, but can’t tell what’s checked and what isn’t. My solution for this is to simply use iOS for now when checking off Gabby’s medication.
Office Lens has got to be one of the coolest apps ever, especially on iOS where it provides fantastic guidance for positioning the camera. On Windows, Office Lens seems very accessible, although I haven’t figured out how to make it work with my flatbed scanner. I don’t know if there’s a way to fix this, or if I need to find another way to import scanned images into the Windows 10 OneNote app such that text within the image is recognized.
Throughout my life I’ve done many things to prepare for all sorts of emergencies, starting as far back as fire drills in elementary school, but I’ve never given a great deal of thought to what, for now, I’ll call informational preparedness. The following are a few questions you may wish to consider, as having the answers now, when they’re not needed, is much better than not having them later, when they might be.
- If I were in a situation where I needed to write something down, right now, how would I do it?
- Am I dependent on one device? Put another way, if I drop my phone or laptop and it smashes, what does that mean for the information that's important to me?
- Do I have the contact numbers for family, friends, doctors, transportation services, and any others I might need, and can I access them quickly? Do I have these on more than one device, and do I know how to access them wherever they are?
- Do I have a way to share information with someone else in a way that makes sense to me and them? Who might that someone else be and have we discussed this specifically?
- How do I handle materials that are in a format inaccessible to me in an urgent situation? It might be fine for my neighbor to help me read my mail, but they may not be available to me all day, every day.
- Does my doctor/pharmacy/healthcare provider have a way to send me information in a more accessible format? Many places are using online systems similar to MyChart, but getting that set up when it's actually needed is not fun -- it's really not.
I’m sure there are many other questions that should be asked, but the above list should be a good starting point. Certainly let’s keep the conversation going and if there are others, put them in the comments and I can add them to the list.
Finally, I want to thank the OneNote team and countless others who have been working to make technology accessible. Technology is truly an equalizer in ways that, even as a member of the accessibility field, continue to amaze me and I couldn’t be more appreciative.
Notifications: They tell me when I’ve missed a call, gotten an email, received a text message, and so much more. Notifications have become a critical part of how I work and play, and without them, I sometimes wonder if I’d know where to begin.
The Lock Screen
On iOS, the first way I likely encounter notifications is on my lock screen. Quite simply, when I wake my phone, notifications show on my lock screen in the order received, oldest to newest. So, when I wake up in the morning, or come out of a meeting and grab my phone, I can quickly skim through whatever notifications I’ve missed overnight. On Android, the experience is very different. First, my lock screen shows notifications; however, they do not seem ordered in any particular way. For example, looking at my lock screen right now, I see a Facebook notification that came in an hour ago followed by a Skype notification telling me about a message I received three minutes ago. Next to both of these notifications, I have an “expand” button which, if activated, will show me additional notifications from that application. Put another way, the notifications seem to be grouped, even if the groups themselves don’t seem ordered in any particular way. On the one hand, this grouping thing is kind of neat, as I can quickly see the apps that have sent me notifications and, if I’m interested in the particulars of any, I can expand them. The problem is that this too doesn’t seem standardized between applications: some applications group notifications as just described, others don’t. In addition, some applications have a specific button that says “expand” to which I can swipe, and others require me to tap on the notification itself and go on faith that it will expand to show additional content. Others say “dismissable,” although I haven’t figured out how to actually dismiss them. Much as I like the concept of grouped notifications, the inconsistencies I’ve observed so far make it more confusing than anything else. One cool thing that Android seems to have on the lock screen, though, is this thing I’m calling the notification summary bar.
If I explore by touch, moving upward from the bottom of the lock screen, I encounter a line that, when touched, reads a number followed by a detailed listing of all my notifications. I’m not sure what this looks like visually as there’s just no way all the content that gets read aloud would fit on the lock screen, let alone a single line. Still, it’s a good way to quickly get an overview of all notifications.
Notification Center and the Notification Shade
Both iOS and Android have a way to display notifications once the device is unlocked; iOS calls this the Notification Center and Android (at least TalkBack) calls this the Notification Shade. On iOS, the Notification Center is opened by using a three-finger swipe down gesture from the top status bar. On Android, there are two ways to access the Notification Shade: either a TalkBack-specific swipe right then down gesture, or a two-finger swipe down from the top. I’m improving; however, in the beginning, it was a bit challenging for me to perform either of these gestures reliably. When the Notification Shade is activated, I first encounter the time, followed by my Wi-Fi status and a control to disable Wi-Fi, then my cellular signal status, then my battery status, then my Bluetooth status, then my screen orientation, and then my notifications. While this is quite a bit to have to go through, having a sort of quick control center easily available is neat. As with the lock screen, notifications are grouped, or at least they attempt to be, and like the lock screen, the grouping doesn’t seem consistent. On the shade, I have a Gmail notification that says “nine more notifications inside.” Other notifications, though, don’t tell me how much additional content they may or may not include, and I only know they are expandable as they are followed by a button that says “expand.” This button isn’t programmatically associated with the notification, though, so unless I swipe through the shade, I’m not sure which notifications are associated with buttons to expand additional content. The Notification Shade also contains a few details that don’t appear on my lock screen: one is my local weather, and another is an Android notification advising me that I can enable the ability to unlock my phone with my voice. While it doesn’t really bother me, the weather appearing here is a bit incongruous with the other types of notifications present.
At the very end of the Notification Shade is an unlabeled button which I’ve discovered is a clear all notifications button of some sort. I know it’s possible to clear all notifications on iOS if using an iDevice with 3D touch, however, this seemingly simple and logical feature has existed on Android for a long time now and it could almost be fantastic. I say almost because, when I activate this button, my phone starts going crazy and counting down messages while playing a notification tone, “82 messages 81 messages 80 messages 79 messages 78 messages …” and a tone for each one. I’ve discovered that if I lock my screen at this point, the countdown seems to proceed much faster, probably because TalkBack isn’t trying to read the number of messages. I really have no idea why this is happening, but while the clear all notifications feature is a good one, I definitely hesitate before using it.
Sounds, vibrations and other observations
One of the more baffling things I’ve noticed about notification sounds on Android is that, at least on the devices I’ve tried, they always play through both the headphones (assuming headphones are plugged in) and the phone’s speaker. So, let’s say I’m in a meeting and I decide to have a text conversation with someone – strictly a hypothetical situation in case my boss happens to be reading this blog post. :) I plug headphones in and send a text. When I receive an answer, though, the notification sound is played through both the headphones and the phone’s speaker. I can set my notification alerts to vibrate only and solve this problem, but it still strikes me as odd that I can’t make notification sounds play strictly through headphones. Conversely, if I’m on a call – phone/Skype/WebEx/other – I don’t hear any notification sounds at all. Presumably the thinking here is that I wouldn’t want my call interrupted with additional sounds being played; however, I find those notification sounds very helpful for determining which notification I just received. If I get a notification while on a call, indicated by a vibration, the only thing I can do is open the Notification Shade and hope that the most recent notification is on top, or at least not grouped with other notifications. In reality, this has proven extremely problematic for me, almost to the point of being a complete deal breaker. Part of the reason this doesn’t work as smoothly as it possibly could is because TalkBack forces me to make a very difficult choice: whether notifications should be read aloud when the phone’s screen is locked. If I enable this feature, all my notifications get read aloud when the screen is locked, including sensitive content such as text messages, Hangouts conversations and so forth.
If I disable this feature, TalkBack stays quiet when notifications appear on the lock screen; however, as the screen automatically locks after a short period of time when on a call, this means nothing gets read, which isn’t helpful since I don’t get the sounds in the call scenario either. But let’s push that entire mess to the side for just a moment and talk a little about notification sounds themselves. One of the really cool things about Android is that many apps allow their notification sound to be customized. This means that unlike in iOS, where many applications use the default tri-tone notification sound, Android applications allow the user to pick from whatever text/notification sounds exist on the device. This is one feature I absolutely love – at least I would if Android would stop resetting certain sounds to the default. For example, I configured my device to play one notification sound when receiving work email and another sound when receiving personal email. That worked fantastically for three days or so, but now I’m getting the default notification sound regardless of whether email is received on my work or personal accounts. Other apps which have unique notification sounds on iOS don’t seem to have any unique sounds on Android; either that, or they do have the same unique sounds, but the default notification sound is being played for reasons I can’t explain. For example, there’s an accessible dice game called Dice World which has a notification sound of dice rolling when an opponent has played their turn. Initially, this sound would play just fine on my device, but now I just get the standard notification sound and don’t seem able to change it. Quick side note: yes, I do have the “play notification sound” setting enabled in Dice World. Same situation with Tweetings, a very powerful Twitter client that has built-in notification sounds that initially played, but which now no longer do.
The point here is that the ability to customize notification sounds is extremely powerful, but I’m not sure how stable it is. In addition, not all apps allow notification sounds to be customized in the first place.
As I wrap up this blog post, I’m left with the feeling that I’m barely scratching the surface of Android notifications. I say this because I’ve gotten feedback on Twitter and elsewhere that others are not having the same experiences as me. For example, some people claim to have an edit button on their Notification Shade which allows them to specify how notifications get sorted while others do not. I’m also not sure if anyone else is experiencing the same inconsistencies as me with regard to notification sound preferences resetting themselves to default. In the end though, I remain confident that I can find workable solutions to these challenges, how difficult those solutions may be to implement remains to be seen.
I wanted to write about something which Android folks probably take incredibly for granted, but which might be a bit perplexing to users coming from iOS. And it’s one of these things that, as I write it, seems incredibly silly yet it’s really not. In fact, it’s one of those twist of irony things that make Android an attractive option in the first place. First some background.
As written previously, I’ve been learning on a Motorola G4 Play which I picked up on Amazon Prime for around $149. I really love this particular device: it’s small, it’s lightweight, texture-wise it’s easy to grip, and you just can’t beat the cost. Still, one thing that’s been frustrating me is that occasionally, when I double tap something, the phone doesn’t register a double tap. In addition, I find that sometimes I’ll swipe my finger from left to right and yet the phone will act as if I hadn’t swiped at all. I was complaining about this to a friend of mine, David, one night.
“Gosh,” I complained, “how can anyone take Android seriously when it can’t even reliably recognize a swipe gesture?”
After a bit of a pause, David cautiously replied, “Do you think it could possibly be your hardware?”
Honestly, in my frustration, I hadn’t even considered the hardware angle and how that might have a real functional impact on things like gesture recognition. But it stands to reason that the hardware differences between my $149 Moto G4 Play and David’s $649 Pixel just might be a factor somehow.
Going down the rabbit hole
I wanted to get an idea of just how much of a difference different hardware configurations might make, especially in terms of device accessibility. Obviously devices with faster processors and more RAM are going to perform at greater speeds, but what about touch screen sensitivity, ROM customizations and anything else that might impact accessibility?
I started out by purchasing an Asus ZenFone 3 Laser because not only is its metal construction extremely solid, but, well, it just has a really awesome name. OK, all of that is true, but I really bought it because it has a larger display, more RAM, and a slightly faster processor than my Moto G4 Play. After getting through the setup process, I was introduced to what Asus calls ZenUI 3.0. ZenUI is basically the Asus customization of Android, including a number of applications and widgets, a redesigned home screen, a customized dialer, custom sound effects for things like locking and unlocking the screen and notifications, and other tweaks to make their phone unique. Coming from iOS, the idea that the entire home screen, notifications, and even the phone dialer can be customized is very unsettling. After all, if I talk to another iPhone user, I can walk them through how to place a call because I do it exactly the same way on my own device. The Asus customizations, however, were so significant that I was unable to figure out how to access my menu of applications. I want to be clear here: I’m not necessarily saying that finding the application menu is inaccessible, it just wasn’t at all intuitive and definitely wasn’t the same experience as on my Moto G4 Play. What I soon learned, though, is that Android allows for the installation of what are known as launchers. My understanding thus far is that launchers basically define things like the home screen layout. After installing the Google Now Launcher, which is apparently installed by default on my G4 Play, my application menu appeared where I was expecting it and some of the other random dialogs that had started popping up simply went away. In the end, I experienced similar frustrations to those I had been facing with the G4 Play, with the additional frustration of figuring out how to get my home screen and other aspects into a state where I could use them. As awesome as its name is, the ZenFone soon found its way back to the store.
Next up, I purchased a Blu 5R, which is also a solidly built phone – yeah, I tend to gravitate toward phones that are solid, heavy and which feel like they won’t fall apart at the drop of a hat. As with the Asus model that I returned, the Blu phone has a larger screen and slightly better specs than my G4 Play. While the Blu had its share of customizations, such as rather cute startup and shutdown sounds and a number of pre-installed applications, my experience was a very positive one. Although not perfect, I experienced fewer issues with gesture recognition, I loved the fingerprint sensor (the G4 Play doesn’t have one) and the speaker, once I realized it was initially covered by a sticker, is really fantastic. If anyone reading this is looking for a budget entry-level phone, the Blu 5R should definitely be considered. I wound up returning mine, but only because I couldn’t justify keeping it given that I already own the G4 Play.
And so it was with great anticipation that I awaited the arrival of my latest phone from somewhere in China, the OnePlus 3T. I’d never heard of the company, OnePlus, but they are a startup specializing in high-end, high-performance devices at mid-level prices. The specifications of the OnePlus 3T rival those of the Nexus at just over half the price, and the reviews are fantastic. If I decide to seriously make the switch to Android, the 3T, with its super-fast battery charging, 6 GB of RAM, convenient slider to quickly enable do-not-disturb and amazing form factor, is a device I could see myself using day-to-day. More importantly though, gestures are definitely recognized, accurately and consistently.
What have I learned?
I’ve actually learned a lot over these past few weeks, beyond the fact that my local electronics shop has a really great return policy. First, when purchasing a new Android device or when seeking assistance, it’s important to remember that Android devices can be different, sometimes vastly so. If you’re coming from iOS, this is extremely important because for the most part, iOS devices and how they operate are pretty similar across the board. Another thing I learned is that when an Android user says, “hmm, I’m not experiencing that issue,” it could really mean that, given their specific hardware and software, which may be different from yours, they’re really not experiencing the same issue you are. It’s been my experience that sometimes, when an iOS user says that they’re not experiencing an issue, it’s meant as a mild rebuke: something along the lines of, “I’ve got the same hardware as you, I’ve got the same software as you, it’s working fine on my end, clearly it’s a problem on yours.” Looking over this paragraph, I realize I’m over-using the word experience, but in a way, that’s exactly what we’re talking about here. One of the very things that makes Android such an attractive option is the flexibility to customize just about every part of the experience. This comes at a cost though, the cost being fragmentation between what I might experience and what you might experience.
One of the things that I’m most excited about with Android is that now, it’s possible to activate TalkBack when the device is first booted. While folks today may take this for granted, this ability to be off and running right out of the box wasn’t always the case. Back in the day, with my first Android device, I had to get sighted help to walk me through the initial setup and then had to have them help me get TalkBack from the Google store since it didn’t even come pre-installed. So yes, I’m very excited that I can order an Android device, have it arrive and without assistance, can get the thing talking.
The Moto G4 Play is a very small device which feels like it’s made entirely out of plastic. The device is very thin, thinner than my iPhone. While the device appears to be all plastic, it also feels rugged. On the right side of the device are two controls, the topmost being the sleep/power button, which contains tactile markings. Below that is another somewhat long button which serves as the volume control. The bottom edge of the device contains a micro-USB port for charging. There are no controls at all on the left side of the device. The top of the device contains a headphone jack. Quick side note here: it’s actually taken me a while to get used to the headphone jack being on top of the device because for many years, Apple has positioned theirs on the bottom. Anyway, this brings us to the front of the device, which has an ear piece on top followed by the touch screen. At the bottom of the touch screen is a tiny little hole. I initially thought the screen was chipped or something; apparently, this hole is for the microphone. Flipping the device over, the back is pretty nondescript except for the camera, which does protrude a little. The entire back of the device can actually be peeled away, revealing a removable battery, a SIM slot and a micro-SD card slot. To clarify, when I say that the back can be “peeled away”, I mean it quite literally. Using a fingernail or something extremely thin, the back can literally be pried off the device. The plastic actually bends when doing this and I confess that when I did this for the first time, I was totally sure the device would not go back together again. Having done this multiple times since, I continue to be amazed that the back does snap back into place with no ill effects other than my own slightly elevated heartbeat.
Getting this thing up and talking
One of the really neat things about many Android devices is that when they’re booted, the user gets a small vibration to indicate that something is happening. Personally, I love this added bit of confirmation as I don’t have enough light perception to tell if the screen is on or not. Anyway, after waiting a little while, I placed and held two fingers on the screen; this is the shortcut I was told would activate TalkBack. And sure enough, after a few seconds, it did! TalkBack helpfully launched a tutorial to help me learn its gestures and become more familiar with how I could navigate different types of objects on the screen such as edit boxes, lists, multi-page scrolling and so on. The tutorial is broken up into small lessons, each of which contains exercises that can be performed to make sure the user understands what’s going on; it is in one of these lessons that I encountered my first real problem.
A bit about TalkBack gestures
One of the things the initial TalkBack tutorial does is acquaint the user with the basic TalkBack gestures. Coming from the perspective of an iOS user, some of these gestures seem weirdly complicated, like the swipe up then right to open the global TalkBack menu, or the swipe right then down to open the notification shade. Still, my goal here is to learn and so I followed the directions given in the tutorial. Eventually, I got to an exercise which focused on changing the reading level, the reading level meaning whether TalkBack should read line by line, word by word, character by character … you get the idea. “Swipe up then swipe down,” said the tutorial, but this didn’t accomplish anything. Well that’s not entirely true, what it did accomplish was getting TalkBack to read random things on the screen. I tried this multiple times and try though I might, I could not get the reading level to change from its default. Was I doing something wrong? Maybe I’m not swiping up and then down straight enough? Maybe a swipe means something slightly different in the Android world than it does in the iOS world? The other thing I found myself doing at this point was slightly tilting the device. This had the effect of causing the screen to change orientation between portrait and landscape modes. Not a problem except in this instance where half of my up/down swipes were probably being interpreted as left/right swipes because of the change in screen orientation. Eventually, I got frustrated enough to press the “next” button and continue on with the tutorial. The next tutorial lesson tried to teach me about cursor movement, but guess what? Yep, the cursor can also be moved by characters/words/lines/… and yep, I couldn’t make that work either.
Surviving the tutorial and beyond
Eventually, I “nexted” my way through the rest of the tutorial and reached the “finish” button. Setup continued at this point and I was able to log into my Google account, answer questions about syncing and location sharing, and eventually got a screen that told me I was ready to go. Ready to go, but where? I’m going to write a separate post detailing this next topic, but the thing I discovered is that the concept of a home screen as it exists in iOS is not as cut and dried in Android. I’ve learned that many “Launchers” exist for Android and that depending on which came bundled with the device, or which the user may have downloaded, one device’s home screen may not look at all like another’s. I’ve had my device for a few weeks now and still cannot figure out any logic to its home screen. What I have figured out though is that along the bottom edge of the touch screen, there are a few virtual buttons. From left to right these buttons are: “back”, “home” and, to the far right, “overview”. Just above the “home” button is an “applications” button which takes me into an alphabetized grid view of the applications installed on my device. Maybe there’s a more efficient way to access apps, especially those I frequently use, but for now, I rely on this alphabetized grid to find just about everything. I figure: if it’s not in the grid, I probably don’t need it, right?
- I really love that I can enable TalkBack right out of the box. This means that I can impulsively order a $149 phone from Amazon Prime Now, have it delivered and start using it immediately, without any sighted assistance.
- TalkBack has a really neat tutorial that helps me learn in a very structured way. Each lesson provides detail on specific TalkBack-related tasks I may wish to accomplish and then provides me with exercises so that I can practice accomplishing them.
- I have no idea at this point how to change my reading level so that I can have TalkBack read word by word or line by line; I'm just stuck with the default for now. The big problem I have with this is, well, the tutorial was pretty specific on how I should change from level to level and it just didn't work. Quick note: I've since solved this problem after a number of updates; I'll get to that in a future post.
- I have a home screen that doesn't make sense to me. I've shown it to sighted folks who always respond the same way: "huh, that doesn't look like mine." Research has told me that I can change this by installing something called a Launcher, but I don't know where to even begin with that yet. In the meantime, I have this very nicely alphabetized grid which, while not super efficient, does help me find everything I need to find.
More to come soon, so please stay tuned. :)
While I’ve been an iOS user for many years, I’ve always been very curious about Android. Indeed, the openness of the platform and all that that entails speaks to my inner geek. I first experimented with Android back in 2011 when Android 2.1 was all the rage because way back then, it was still possible to get phones like the T-Mobile G2 with physical keyboards and, as a serious texting and social media junkie, this appealed to me. Eventually I drifted back to iOS until a few years later when I traded in my iPhone for a Samsung Galaxy S III. The Galaxy served me well until Apple released Siri which I just had to have and so back to iOS I came.
Over the years, there’s been quite a bit of innovation happening in the Android accessibility space and while I do my best to keep up with it all, it’s hard to really understand it if I’m not using it. That said, iOS is working just fine for me, so it’s hard for me to justify the cost of a high-end Android device like Google’s Pixel even though it admittedly does look pretty darn cool. And so it came to pass that one day, while researching Motorola cable modems on Amazon Prime Now, I came across the Moto G4 Play, which retails for around $149. Um… a quick note to my more juvenile readers: this model is the letter G followed by the number 4 followed by the word Play, not the “foreplay” you immediately thought of when your speech synthesizer read this to you :). Anyway, I realize that at this price point, this phone won’t be as capable as higher-end models, but for someone like me who is generally curious – wanting to see what’s out there – I thought this model might be a good place to start.
What I thought I’d do is write a series of articles chronicling my rediscovery of Android, so that others who might be thinking of giving it a try will have an idea of the types of things they might encounter. I’m not looking to convince anyone that Android’s better than iOS or vice versa. Also, as I write, keep in mind that I too am learning and discovering, so if you find I’m doing something wrong, or if I should be doing something a different way, please don’t hesitate to comment.
I’m excited to be rediscovering Android and hope you enjoy the journey with me.
Since she was little, my daughter, Gabby, has had a medical condition that has required occasional testing at the children’s hospital. One of the more positive things to come out of this is that she has developed a deep sensitivity for children who, unlike herself, don’t get to leave the hospital and spend the holidays at home. At age nine, Gabby decided to start a program to get gifts for children who must remain at the hospital. These gifts are purchased from a list provided by the hospital based on things the children have asked for which do not pose any medical or other risks. Gabby delivers these gifts to the hospital where they are sterilized, wrapped and delivered to the children by hospital staff.
This year, I’m helping Gabby take her Gabby’s Gifts initiative digital. I’ve set up a GoFundMe campaign for those who may wish to donate to her online. Below, find Gabby’s own description of her initiative. If you are willing to donate but encounter challenges with GoFundMe, please reach out and we can figure out another way to make it happen such as PayPal, Square Cash, or good old-fashioned paper check.
In advance, thanks so much for reading, considering and sharing.
Dear Friends, Christmas time is here. This is the season of caring, sharing, family, and fun. Unfortunately for many children this year Christmas will be another day in which they have to be in the hospital away from their friends and family. In 2012, I realized that I was very blessed to have so much in my life and I wanted to give something back. I decided to try and make Christmas a little brighter for those children who, by no choice of their own, can’t be with their family and friends at Christmas. There are a lot of children that have to spend the holiday season at the hospital which is very sad, hence Gabby’s Gifts was born. Every year I collect money or gifts to donate to the University of Minnesota Children’s hospital. All of the money collected is used to purchase gifts for the children such as toys, slippers, pajamas, and other things that the kids have asked for. If you would like to donate to Gabby’s Gifts please do so before December 12. I will be bringing the donations to the Hospital on the 14th of December this year. I have also set up an online GoFundMe page for those who may wish to make an online donation: https://www.gofundme.com/gabbys-gifts-2016 . I want to thank you in advance for considering donating and helping to make a child’s Christmas just a little brighter. Sincerely, Gabrielle Sawczyn
Recently, I traveled to New York where the plan was for me to connect with one of my colleagues and then travel to visit a client. Since we were arriving from different airports and since we would be needing a car, it made the most sense for us to meet up at the car rental counter. I was amused thinking of the reactions I would likely get from people as I, a blind guy, asked for directions to car rental. The reaction I got from one guy, though, really made me stop and think. He said, “oh, you must be going to rent one of those new autonomous cars, that’s got to be so neat.” To him, the idea that a blind person might be renting a vehicle wasn’t very far-fetched at all. I casually mentioned my destination to a few other people just to see what kind of reaction I might get. Strangely enough, the only somewhat negative reaction came from a woman who was concerned that I could get hurt crossing the street on the way to car rental. My take-away from the day? There remain people skeptical that blind people can independently cross streets, but the idea that blind people could possibly be renting cars is no longer the unbelievable concept it might once have been.
On July 26, I received yet another support Email saying in part,
Dear Steve, Thank you for contacting Weight Watchers. My name is [Name redacted] and I will be more than happy to assist you with troubleshooting your application. I do apologize for this inconvenience. Your email has been escalated to me. In order for us to be sure we offer you the best support for Weight Watchers Mobile, please answer the following questions for us: * Are you using a mobile device or a computer? * What is your device model and Operating System? * If you are using an iPhone, iPad or iPod, please confirm whether you are using the Weight Watchers Mobile app for iPhone App or accessing our mobile site [a.weightwatchers.com](http://a.weightwatchers.com/) ? * If you are using a computer, what internet browser are you using. * If you have not already done so in your initial Email to us, please let us know what error you are receiving. * If your issue is technical in nature and you have not already done so in your initial Email to us, please describe as best you can what is occurring and what steps you took prior to running into the problem. Also please provide any error messages you may have received. As soon as we receive your response we will investigate on your behalf.
OK, clearly, they’re still confused. That said, this issue is obviously on someone’s radar as their most recent app update has fixed the SmartPoint values reading on foods. The daily and weekly totals still don’t read correctly, but at least I’m no longer deluded into thinking that chocolate cake has a 0 point value. :)
While the title of this post may seem a bit dramatic, I assure you it isn’t, at least not to me. In a nutshell, the situation is this: I pay for an app or service, use the app or service and then, with one update, it suddenly becomes impossible to use any longer. This may not seem like that big a deal to those who are able to see, but for those of us who depend on VoiceOver or other assistive technologies, it’s a situation that is very real.
As many of my social media followers know, I’ve been a member of Weight Watchers for quite a few months. After all, I can definitely stand to lose a few pounds and I’ve seen the program be successful for many people who have benefited greatly from it. I was also very encouraged to learn that Weight Watchers has a page dedicated to accessibility which says in part:
In our ongoing commitment to help as many people as possible to lose weight, including those with disabilities, Weight Watchers is dedicated to improving accessibility for people with visual impairments in the following ways.
The page then goes on to describe how to use the Weight Watchers online service with the JAWS screen reader, with VoiceOver and Safari, how to request information in alternative formats, how to optimize the Tracker for accessibility and much more. I felt their commitment to accessibility to be genuine and, in all fairness, their web site and iOS app worked extremely well. That is, until the latest version.
For those unfamiliar with Weight Watchers, the program is essentially a points-based system whereby individuals are allocated a number of points to be used throughout the day and foods are also given a point value, healthier foods receiving lower values than less healthy foods. A person can eat whatever they wish, the goal being to stay within their allocated number of points. In short, it’s totally fine to have a big slab of chocolate cake, but because that slab of cake has a high point value, a smarter decision might be to opt for different, healthier foods. Using their iOS app, it’s possible to look up a food’s point value and to track it against the daily total. Not only is this an efficient system, but the app can be instrumental in making healthy food choices by allowing the user to look up point values before deciding what to eat.
Like many of their customers, I update the Weight Watchers app regularly. I certainly didn’t anticipate any problems when installing the latest version, described as:
What's New in Version 4.9.1 Fixed an issue with the barcode scanner. We're always working to improve the app and maximize your experience — thanks for sharing your thoughts so we can make it even better. More exciting improvements to come!
Imagine my surprise when, after installing this harmless-looking update, all the point values suddenly started reading as ‘0’.
After getting over my initial euphoria over chocolate cake suddenly having a ‘0’ point value, I realized that the problem was in fact an accessibility one. For whatever reason, VoiceOver is no longer able to read point values accurately. What this means is that in search results, when adding foods, when reviewing meals and anywhere else a point value might present itself, it is simply read as ‘0’. Given the critical part the point values play in the program, this is a real problem. How can I utilize a system based on points when I can’t read the actual points?
So, what to do? My first step was to utilize the live chat functionality which is built directly into the Weight Watchers app. This chat system is pleasantly accessible and since it’s available around the clock, I thought it would be a quick way to describe the issue and see if it had already been reported. After explaining the situation to the chat representative, my chat was “transferred”; I never knew a chat could be transferred. Anyway, I got a new representative to whom I again explained the situation, only to have my chat disconnected. By this point my hands hurt from all the typing, in addition to my already-mounting frustration, so I figured the next best thing to do was to contact them via the web site. I did this, being sure to mention that I’m blind and that this is an accessibility issue, followed by a detailed description of the problem. Over a day later, I received this response:
Dear Steve, Thank you for contacting Weight Watchers. My name is [name redacted] and I'm sorry about the challenges that you have encountered in accessing your account through the WW Mobile App. Rest assured, that I will help you with your concern. I appreciate your subscription with our Online Plus plan. We want to take this opportunity to thank you for trying our site and for making us a part of your weight loss journey. Please try the following troubleshooting steps: 1. Please log out from the App and log back in. 2. If that does not work, force close the App if you have an Android device. Then relaunch the App. For iOS, close the App by double-clicking on the home button, swipe up on app snapshot, and click home button. Then relaunch the App. 3. If steps 1 and 2 do not work, delete the App and reinstall. Please note that recently scanned items are stored locally on the device and will be lost when you uninstall. If you would like to keep a recently scanned item, please save it as a favorite. The Mobile App requires iOS 8.0 or later. It is compatible with iPhone, iPad, and iPod touch. For Android users, it requires Android 4.0.3 and up. While it might also work on an Android tablet, it is not yet fully supported and may not be compatible. Let us know how things go! If the troubleshooting steps do not help, please reply here with details about what you are experiencing. We’ll investigate further and reach out should we need to gather additional details.
Clearly the rep misunderstood what’s meant here by “accessibility”, despite my having mentioned that I’m blind, mentioned VoiceOver, and referenced their own accessibility page in my request. No matter; I decided to be a trooper and try all the steps which, as expected, didn’t accomplish anything at all. I’ve sent an even more descriptive reply and, as of this writing, have heard absolutely nothing.
So why the dramatic post title? It’d be one thing if this were a situation pertaining to one specific company or app, but this is a situation that occurs again and again. Right now on my phone, I have an entire folder of apps that fall into this category: apps that I either want to use or that I’ve come to depend on which have become partially or completely useless to me. Some of these apps are health-related, some are social and, more disturbingly, some are productivity apps that help me maintain employment. The company may change, the app or web site may change, but what it all amounts to is that I spend a lot of time feeling frustrated and navigating the realm of tech support when, like everyone else, I just want to live my life. It’s especially sad in this case though, given Weight Watchers’
"ongoing commitment to help as many people as possible to lose weight, including those with disabilities."
Before I’m dismissed as just another hater, let me say that like many, I couldn’t wait for the Apple Watch. I thought the idea was cool, the tech was cool, the possible use cases were cool. To that end, I waited up half the night to pre-order the watch just as soon as Apple listed it on its site. I counted down the days (and eventually the hours) until its arrival. I took half the day off, so that I would be sure not to miss the UPS delivery driver and I spent the weekend after receiving it excitedly installing and setting up apps. Since then, I’ve attempted to use the Apple Watch daily, I’ve listened to numerous podcasts (both disability-related and non) on tips and tricks to make use of the watch and after all that, it remains a struggle at times not to just throw the damned thing across the room.
So why this post? I still think the watch represents really cool tech and despite my desire to throw it, I think it’s even been worth all I’ve gone through. I’m happy to be an Apple Watch owner. My hope is that in describing the challenges I’m experiencing, others will identify as having similar experiences or, even better, others will have solutions, solutions that can move this tech from being cool to actually being useful for me. So, let’s get to it.
- Sluggishness: The watch seems incredibly sluggish. Whether it's waking it up to simply check the time, finding an app, launching an app or doing something within an app, it seems to take forever. Sometimes I have to try not to tap the screen in a "hurry up already" gesture.
- Loading, loading, loading, loading...: Sometimes, when I launch an app, I get the app as expected, but often, I get this "loading" graphic. This seems to happen somewhat randomly, but when it does, there seems to be no hope of getting anything done. I've tried forcing the app to quit and relaunching it, but this has yet to ever fix anything. Eventually, my only recourse is to perform the task on my phone, which I could have done initially.
- Hearing me is not the same as listening to me: I press and hold the digital crown, speak a command, let go of the digital crown, get the little vibration that seems to mean "got it," and ... nothing at all. So I think OK, maybe I didn't speak clearly or maybe there was background noise, so I go through the process again and again, nothing. So I think maybe it's just not able to contact whatever it needs to contact on the network, however, I find that it's connected and my phone is connected and Siri works just fine on my phone. By this point, I've gone from trying to do something on the watch to trying to troubleshoot potential connection issues with Apple. Sometimes restarting the watch fixes this, sometimes it doesn't help at all.
- Placing a call doesn't always place a call: This is somewhat related to the above point in that I'll ask the watch to call someone, it will say calling so-and-so, but nothing ever happens. Eventually, I tap the screen to see if anything has happened only to be greeted by the watch face. It's almost like the watch is saying, "huh, was I supposed to do something?"
- Excepting calendar and activity, every notification has the same tone and vibration pattern: On the phone, I can often tell which app is notifying me because apps are not forced to use Apple’s default notification tone. Indeed, some apps even allow me to set a specific notification tone within the app. Not so on the watch, where every notification uses the same tone/vibration. Put another way, when I hear the notification tone, I don’t know if it’s something important like a breaking weather alert, or something that can wait like Facebook wondering if I know someone or other. The net result is that I often ignore notifications and then have a pile of them to go through later, or just miss things entirely. To try and address this, I’ve stopped many notifications from going to my watch, but isn’t that part of the reason for having it in the first place?
- Sometimes, I just want to check the time: OK, to be fair, this might be made easier if I were to use a different watch face or fewer complications or something, but I'm itemizing it here because it drives me crazy and may be doing the same to others. Essentially, there are times when I just want to, well, check the time. So I tap my watch screen and after waiting for it to do its wake-up thing, it reads me the current temperature, or my next appointment, or my battery status, everything but the current time. So I try and flick through the watch face, but that just tells me I have unread notifications. I eventually give up and figure that time is just an illusion anyway.
- Where'd that app go anyway?: I've tried multiple ways to organize my watch apps to make them efficient and easy to find. I've dragged them here, I've dragged them there, I've uninstalled them and tried reinstalling in the order I want to see them and yet it seems that no matter what I try, the watch eventually mocks me by deciding to just do its own thing with my app organization. It's very probable that I don't have a good understanding of the Apple Watch app layout, so if someone has a good description of this, I'd be happy to check it out. In the end though, I need to quickly be able to open an app and not spend a minute looking for it, or tell the watch to open it and hope it's not one of those times where the watch is "out to lunch" somewhere.
- I could spend half my life deleting things: So this is only in part a criticism of the watch, but only in part. If I receive an iMessage, it goes to all my Apple devices. Now, I can easily delete it from my iPhone and iPad. On the watch though, I have to open messages, long tap on the thread, choose delete, and confirm that I really do want to do this delete thing. Since there's sluggishness throughout this entire process, every step takes quite a bit of time. Way more complicated and far less efficient than on other devices where I can delete a thread with just two gestures. This probably applies to other apps as well, but Messages is the app I notice this happening in the most.
- Apps that seem to do nothing: OK, I can't blame this on the watch, but there are a few apps that seem to serve no purpose whatsoever. For example, if a messaging app lets you view messages but not reply or otherwise interact with them, what's the point? In such situations, is it best to leave the app installed in the hopes it'll eventually do something, or is it better to uninstall it and just mirror notifications?
Again, the above frustrations represent those that I face on a daily basis. This doesn’t mean that the watch is something to stay away from, and it definitely doesn’t detract from the “coolness factor”. I would love to know, though: am I alone? What frustrations, if any, are others facing? And, most importantly, does anyone reading this have ideas of things I might try?
Too often, we are quick to criticize developers for not doing enough to make their apps accessible. Today, I’d like to extend my thanks to one who has consistently embraced accessibility.
Threema is a messaging application that offers end-to-end encryption. In plain English, this means that your chats via Threema can only be read by their intended recipients. As Threema puts it on their web site, they offer “seriously secure messaging.” What makes Threema stand out to me personally, though, is their dedication to accessibility. Not only do they constantly seem to improve the experience for VoiceOver users, but they are very transparent about it, going so far as to call it out in their release notes. And why shouldn’t they? Making stuff accessible requires hard work, and having done it, this is something they should absolutely be bragging about. So thank you, Threema, for being awesome.
Well, it’s that time of year already, that magical, mystical time that we call CSUN. CSUN is the 31st Annual International Technology and Persons with Disabilities Conference. It’s a place where learning is shared, ideas are conceived, people throughout the field of accessibility network and – one of my favorite things – where new gadgets and gizmos are often unveiled. I’m extremely fortunate to work for an employer who has been willing to send me to this conference of awesome and I’m very excited about the opportunity to attend and to present. That said, the expectation is that I come away from CSUN full of new knowledge that I can utilize to better help my clients reach their goals. So, while you and I may be at the same conference, meeting up might be a challenge since my primary reason for going is to attend sessions and learn. To that end though, I thought I’d post the sessions I’m currently planning to attend, so that if our paths cross, we can at least say hi.
Wednesday, March 23, 2016
- Using Visual ARIA to Physically See and Learn How ARIA Works
Visual ARIA allows for ARIA to be visually observable to aid in the learning process and to convey when ARIA is incomplete or being misused.
- Create Your Accessible Taste: McDonald's Accessible Kiosk Initiative
A case study on McDonald's effort to make their Create Your Taste kiosks accessible.
- Dueling Mobile Note: I'm a presenter. :)
As accessibility efforts intensify across mobile platforms, more and more users with disabilities are questioning which mobile solution is the “right choice” for them.
- Google Apps Accessibility
Learn more about the latest accessibility improvements to Google Apps.
- The Mind's Eye: Perception Through the Mind of 5 Visually Impaired Personas Note: I am a presenter. :)
This presentation will demonstrate/discuss the perception of visual content by 5 Personas: Dyslexia, Low-Vision, Partial Lifetime Blindness, Accidental/Recent Blindness, and Lifetime Blindness.
- Touchscreen Accessibility in Self-Service Terminals
Touchscreens on self-service terminals can cause accessibility challenges. We present the results of a study developing an accessible input method for a touchscreen ATM.
- The Possibilities are I-infinite: Tactile Overlays for the iPad
This presentation will provide participants with the skills and tools they need to begin making tactile overlays. Various examples of activities and applications will be given.
Thursday, March 24, 2016
- Interactive Maps, from Google to Bing, How Do You Make Them Accessible?
Making interactive maps accessible is much more than just providing text locations of different points. Gian Wild explains exactly what needs to be done.
- Accessibility Overview of Amazon’s Devices, Starting with Our $49.99 Tablet
Demonstration of the latest in Amazon device accessibility, including Alexa and Echo, Kindle E-Readers, Fire TV, and our Fire Tablets with VoiceView starting at $49.99.
- An Appliance Display Reader for People with Visual Impairments
We describe ongoing research at Smith-Kettlewell on the Display Reader project to enable blind and visually impaired people to read appliance displays.
- Manufacturers' Device Showcase
Visit the Device Showcase to see the latest phones, tablets and more! The Showcase is open from 11 am to 1 pm daily. This is your opportunity to meet with device manufacturers and sample their products.
- Is it A Link Or A Button? The Ultimate Showdown Note: I think I'm an unlisted presenter. If not, I'll be the guy in the audience with extremely strong opinions on the subject. :)
We will bring together 5 experts, 10 scenarios and have them privately decide whether the link or button role is more appropriate for each.
- Chrome & Chrome OS Accessibility
Learn about the built-in accessibility features within Chrome & Chrome OS. We'll also demo braille support in Chrome, our screen reader ChromeVox, & Chrome on Android.
- How Wells Fargo is Improving Access for People with Disabilities
Learn about Wells Fargo Bank’s overall strategy for people with disabilities and improved access to banking services, including launching JAWS in over 6,000 bank locations.
- Design Thinking at Google - Methodologies & Mindsets for A11y Innovation
Join us to learn about the Design Thinking Framework and how it helps to drive user centered design thinking in the Accessibility context.
Friday, March 25, 2016
- Grommet: An Accessible Open-Source User Experience Framework
Be amongst the first to learn about Grommet, a modern UX framework created by Hewlett Packard that allows for rapid development of accessible web applications.
- Mobile Testing: Through the Eyes of a Screen Reader User and A11Y Expert Note: I'm a presenter and in fact, due to last-minute scheduling conflicts, I'll likely be the only presenter. Someone please bring me some black coffee? :)
A screen reader user and A11y Expert will demonstrate performing an A11Y assessment on mobile devices. They will demonstrate the Pod Methodology, techniques and tools.
- Accessibility Support Baselines: Balancing User Needs Against Test Effort
Approaches for creating an enterprise support baseline and test strategy in light of changes in the desktop assistive technology market and mobile device fragmentation.
- Digital Accessibility at Small Businesses
Highlighting differences about accessibility at small businesses compared to enterprises, this presentation will focus on increasing awareness and scaling accessibility at this important, forgotten sector.
- OpenAIR Challenge: Mentoring the Mentors Note: I'm not an official presenter, but as I was a mentor, Joseph Karr O'Connor asked me if I might attend. I have a tremendous amount of respect for Joseph and this entire initiative, so it will be my honor to do so.
The Open Accessible Internet Rally (OpenAIR) has a mentorship program. This presentation will focus on the experience of the mentors.
- Strategies for Implementing Accessible Online Media
This presentation will cover web accessibility laws and guidelines as well as how to apply these standards when creating accessible online media at your institution.
Receptions and evening things
CSUN boasts a number of receptions and other evening events, and I'm not sure yet which of those I'll be attending. Traditionally, Deque holds an evening reception, and I'll definitely be attending that just as soon as I figure out where and when it is. Whether at a presentation, a reception, lunch or coffee somewhere in the midst of all that, CSUN is a great opportunity to connect and I'm looking forward to meeting as many people as possible. If you want to connect, please reach out to me on Twitter, or comment here and I'll gladly exchange contact info. Looking forward to a great CSUN16.
My mouth dropped open in disbelief when a friend, Grace, told me about an app designed to help the blind stop rocking back and forth, something that many blind people do. There are lots of reasons for the rocking that I won’t go into here, but suffice it to say it’s one of those habits that parents, educators and other adults try to curb in children in an effort to help them be more “socially acceptable.” Well, move over parents, educators and other adults, because as Apple would say, “there’s an app for that.”
Brought to us by the New Mexico Commission for the Blind:<blockquote>iFidget is an app designed to help people with a range of habits from rocking back and forth to restless leg syndrome or even just constant fidgeting. It has an incredibly simple design, but it has a very big future.
iFidget is designed to be used while you’re sitting. It can be set to vibrate or play a sound when it detects that you aren’t sitting still. iFidget attempts to tell the difference between somebody who is rocking, fidgeting or moving constantly vs somebody who is just shifting their weight at a table.</blockquote>
The description goes on from there describing how the app can be a “therapeutic tool” that can help people who subconsciously engage in this behavior and wish to stop. So how does it work? Basically, the app runs on an iOS device and when motion is detected, it vibrates to provide the user with a subtle reminder, presumably to be still. The app can also play a sound effect if vibration isn’t an option or isn’t desired. In addition, the app’s sensitivity can be adjusted to ensure that a greater or lesser amount of motion is needed to trigger the alert. But wait, that’s not all. iFidget also gives the user – or someone working with the user – the ability to see a graph showing just how much rockin’ is happenin’.
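For the technically curious, the behavior described above amounts to simple threshold-based motion detection: sample the accelerometer, measure how far each reading deviates from rest, and alert when the average deviation crosses a sensitivity threshold. Here's a rough sketch of that idea in Python; to be clear, the function names, numbers, and sample data are all my own illustration of the general technique, not anything taken from iFidget itself.

```python
import math

# Accelerometer magnitude at rest, in g (gravity alone).
GRAVITY = 1.0

def motion_amount(sample):
    """Deviation of one (x, y, z) accelerometer reading from rest, in g."""
    x, y, z = sample
    return abs(math.sqrt(x * x + y * y + z * z) - GRAVITY)

def should_alert(samples, sensitivity=0.05):
    """Alert when the average deviation over a window of samples exceeds
    the sensitivity threshold. Raising the threshold means more motion is
    needed before the user gets nudged -- the "sensitivity" adjustment."""
    avg = sum(motion_amount(s) for s in samples) / len(samples)
    return avg > sensitivity

# Sitting still: tiny jitter around rest.
still = [(0.0, 0.0, 1.01), (0.0, 0.01, 0.99), (0.01, 0.0, 1.0)]
# Rocking: repeated larger swings in magnitude.
rocking = [(0.0, 0.3, 1.2), (0.0, -0.3, 0.8), (0.0, 0.35, 1.25)]

print(should_alert(still))    # False: stays quiet
print(should_alert(rocking))  # True: vibrate or play a sound
```

This also hints at why reaching for a coffee cup or walking upstairs trips the alert: a simple magnitude threshold can't tell rocking from any other motion of comparable size.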
As a long-time, hard-core rocker myself, I had mixed feelings when I heard about iFidget, the first being absolute horror that kids could potentially be forced to use this app in school settings “for their own good.” Would a child see this as a gentle reminder or a means of negative reinforcement? And what about the potential humiliation of needing to share the graph with an educator or therapist of some kind? Second, the app just doesn’t seem very practical to me. I’ve been using it throughout the day and initially found that the app alerted me to any motion, including when I’d engage in such socially unacceptable tasks as reaching for my coffee cup. Adjusting the sensitivity helped with this; however, the app would still alert me to major motion, such as standing up to walk into another room. In fact, I got quite the massage walking from my basement office to my upstairs kitchen. The app also doesn’t run in the background and can’t be configured to run when the iOS device starts up. Oh yes, and if the device’s screen locks, the app stops working as well. One other discovery I made is that if I put the device in my pants pocket, I could rock with my upper body all I wanted – how long before kids figure that one out?
I posted a link to the app on Twitter and the response was swift and immediate.
The tweets go on and on and on … the above is just a small sampling … clearly this is an emotionally charged issue. While I’m certainly not opposed to apps that help people self-improve, I remain concerned about the potential long-term effects this could have on blind kids if they’re forced to use this app. Oh, and one more thing: while the description may claim that this app “has a very big future,” the app itself hasn’t been updated since November 20, 2014. So, positive or negative, what do you think?
Now that Markus has turned 16, he’s wanting to find a job; after all, what 16-year-old can’t use some extra money? :) Back in my day, this was an exciting time, and there was never a shortage of things a 16-year-old could do: papers needed delivering, restaurants needed servers, stores needed folks to stock shelves, and fast food places needed people for just about everything. While many of these jobs still exist, actually applying for them is nowhere near as easy as it used to be. “Go online,” they say, “fill out our online application.” While the going-online part is easy for kids of just about any age, figuring out how to complete the forms just so they’ll be accepted by the online job systems can be a real challenge. For example, the applications are designed to capture education, experience, previous jobs and so forth, but if you’re just starting out, what do you actually fill in? Unfortunately, leaving many of these fields blank is not always an option, meaning the form can’t actually be submitted without filling something in.
I certainly don’t envy today’s kids just trying to get a start. While the challenge used to be getting up the nerve to approach a potential employer and ask for a job, the challenge today’s kids face is figuring out how to successfully navigate complex and impersonal online job applications, which are not at all geared toward helping someone get a fresh start, in the hopes that their information will be routed to a human somehow, somewhere. I definitely wish Markus and all of today’s kids the best of luck.