I’d like to congratulate Derrick and the Debut team for creating a website that’s accessible with my screen reader, JAWS (Job Access With Speech), on my computer, and with VoiceOver on iPhone. I used TalkBack from 2009 to 2014, but the experience differs from device to device, and the iPhone is more of a hit in the blind community.
Since we now have a roadmap, we need to ensure accessibility is built into the principles at the design stage, and that actual users are involved in the testing process.
I’ll be using VO for VoiceOver and TB for TalkBack to refer to the screen readers on iOS and Android.
Do the buttons have clear labels and VoiceOver/TalkBack hints? Note: hints are not displayed on screen, but tell the user what a control does. For example, a hint can explain that the Home tab is where all your balances are, or that double-tapping your name opens your Debut account management settings.
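The hint idea above can be sketched in SwiftUI. This is a minimal illustration, not Debut’s actual code; the view and handler are hypothetical, but `accessibilityHint` is the real SwiftUI modifier for this:

```swift
import SwiftUI

struct AccountHeader: View {
    let userName: String

    var body: some View {
        Button(userName) {
            // Hypothetical action: open account management settings.
        }
        // The label says what the control is; the hint says what
        // activating it does. VoiceOver speaks the hint after a short
        // pause, and it is never drawn on screen.
        .accessibilityHint("Opens your Debut account management settings")
    }
}
```

TalkBack has the equivalent concept on Android via a view’s content description and state/usage hints.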
Is the app efficient to navigate? You can build in great accessibility, but it won’t help if navigation is inefficient for the VoiceOver/TalkBack user. Two examples follow:
In iOS versions below 16.0, when I went to the Updates tab to check for app updates, I’d get the app name, version number, and release notes straight away. After upgrading to iOS 16, I now have to make three swipes, past the Open button, then swipe again to get to the release notes, which slows me down.
Similarly, when adding a pizza in the Domino’s NZ app, I find a pizza, then swipe right to find the Add button. There must be a design flaw, because VoiceOver skips over the Add button and sends me to the next pizza. Another swipe right eventually lands me on an Add button, but that one belongs to the first pizza, and VO reads both as just “Add”, causing confusion. In other words, there is no clear indication of which Add button belongs to which pizza, and they’re grouped together after subsequent swipes.
Each item’s “Add” button must be associated with that item. To resolve this, tell the assistive technology which item the button belongs to, and put each item into its own container. Apple and Google both cover this in their developer documentation.
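A minimal SwiftUI sketch of that fix, assuming a hypothetical pizza row; the two real modifiers here are `accessibilityLabel`, which names the button after its own item, and `accessibilityElement(children: .contain)`, which keeps each row’s controls in their own container:

```swift
import SwiftUI

struct PizzaRow: View {
    let pizza: String   // e.g. "Pepperoni"

    var body: some View {
        HStack {
            Text(pizza)
            Button("Add") {
                // Hypothetical action: add this pizza to the cart.
            }
            // Name the button after its item, so VoiceOver reads
            // "Add Pepperoni" instead of a bare, ambiguous "Add".
            .accessibilityLabel("Add \(pizza)")
        }
        // Keep this row's controls grouped in their own container,
        // so swiping never jumps from one pizza's Add button into
        // the next pizza's row.
        .accessibilityElement(children: .contain)
    }
}
```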
The Actions rotor lets you put a limited number of actions into the VoiceOver/TalkBack rotor, which enables the user to do things such as add a quantity of a product, say apples, to a cart in three actions: swipe down with one finger until you hear “Add 1 to cart”, then double-tap to add the apples to the cart. This is more efficient and takes fewer steps. VO rotor actions are documented on Apple Developer.
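In SwiftUI, custom rotor-style actions are attached with `accessibilityAction(named:)`; this sketch uses a hypothetical apples row and quantity state:

```swift
import SwiftUI

struct ProductRow: View {
    @State private var quantity = 0

    var body: some View {
        Text("Apples, quantity \(quantity)")
            // Custom actions are offered when the VoiceOver user
            // swipes up or down with one finger on this element;
            // double-tapping performs the chosen action.
            .accessibilityAction(named: "Add 1 to cart") {
                quantity += 1
            }
            .accessibilityAction(named: "Remove 1 from cart") {
                quantity = max(0, quantity - 1)
            }
    }
}
```

The same pattern covers TalkBack on Android, where custom accessibility actions appear in the local context menu.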
Does VO/TB speak too little or too much? For example, when creating a button, do NOT put the word “Button” in the label, as modern APIs automatically tell VO/TB what kind of control it is. On the other hand, every button must have an associated text label that tells VO/TB what it does: “Save” would be used when saving account information, and “Settings” would open the Settings page.
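A quick SwiftUI illustration of the labelling rule above (the save handler is hypothetical):

```swift
import SwiftUI

struct SaveControls: View {
    var body: some View {
        // Right: VoiceOver announces "Save, button" on its own,
        // because the button trait comes from the control itself.
        Button("Save") {
            // Hypothetical action: save account information.
        }

        // Wrong: a label of "Save button" would be read as
        // "Save button, button" - the trait is duplicated.
    }
}
```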
Can the interface be enlarged or shrunk? Low-vision users need to enlarge the text to see it clearly.
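On iOS this mostly means supporting Dynamic Type. A small sketch with a hypothetical balance label: built-in text styles scale automatically with the user’s text-size setting, and `@ScaledMetric` scales custom dimensions alongside them:

```swift
import SwiftUI

struct BalanceLabel: View {
    // Scales a custom dimension in step with the user's chosen
    // text size, so the icon grows with the text.
    @ScaledMetric(relativeTo: .body) var iconSize: CGFloat = 24

    var body: some View {
        HStack {
            Image(systemName: "creditcard")
                .frame(width: iconSize, height: iconSize)
            // Text styles such as .body follow Dynamic Type for free;
            // hard-coded point sizes like .font(.system(size: 14)) do not.
            Text("Balance: $120.00")
                .font(.body)
        }
    }
}
```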
Are the colours too dark or too bright? Autistic people can have sensory issues with bright colours and lights, so you need to ensure the user can tone them down or up as they please.
I hope this is useful for ensuring accessibility is included in Debut’s DNA, as everyone benefits when it’s implemented during the design of the product, rather than rushing a release and trying to fix issues after.
As an endnote: I like how I can press Control+Enter to post, which is what I’m going to do now.