An interview with Kevin Chao

Kevin Chao seems to be everywhere: tweeting, commenting and contributing to lists about products and companies ranging from Apple, NVDA and Adobe to Google and Android. He’s a visually impaired student living in the USA who, as he himself admits, loves to ‘get under the hood’ of technologies to see how they work and how they can be improved.

Like Jamie Knight, who I interviewed a couple of years ago, Kevin has an incredibly in-depth knowledge of his field and gives a lot back to the web as a whole. I was lucky enough to grab some of his precious time to ask him a few questions about the state of accessibility across desktop and mobile, as well as screen readers, WAI-ARIA and HTML5.

Kevin, you’re a busy man! Tell us a little about what you do?

Eight years ago, as a freshman in high school, I first had to use assistive technology personally. This was a shocking and life-changing experience, especially when it came to access to information and independence. The range of assistive technology I had to embrace and master included a long white cane, screen reader, Braille writer, Braille, and note takers. I have always been self-motivated, a self-learner and inquisitive; I have challenged the status quo, pushed things to their limits, and always asked “what if?” and “why not?” These characteristics continued throughout high school and remain part of who I am today.

Flash forward to the present day, where I’m a college student majoring in computer science and a part-time Quality Assurance tester for the Student E-Rent Pilot Project. I have always been intrigued by, understood, and promoted the use of technology as a viable option, solution, and answer for many challenges, issues, and inefficiencies. This largely explains my college major and my work in revolutionizing access to textbooks for students in higher education.

One of the themes I have observed, both personally and professionally, is that historically there has been a lack of readily accessible information and a lack of choice in accessible technologies. I took this on as a challenge and have made a huge effort in working with mainstream and assistive technology companies to increase awareness of, promote the need for, and educate users about more accessible technologies, to ensure they work out of the box, and to give users an informed choice.

I agree it’s important that a person shouldn’t be limited in their choices by software, hardware or content not meeting their needs. Tell us a little about your journey with assistive technologies?

The first screen reader I was shown, taught, and used was the status quo: JAWS. Since then, I have used a variety of desktop and mobile screen readers, including Talks for Symbian, Mobile Speak for Windows Mobile, VoiceOver for iOS, NVDA for Windows, VoiceOver for Mac OS X, System Access for Windows, Window-Eyes for Windows, TalkBack for Android, Spiel for Android, and Mobile Accessibility for Android.

Would you consider yourself a typical user?

No, I would not consider myself a typical user. Most users have not tried nearly every platform that has been created, used nearly every permutation and combination of screen readers, enjoyed getting under the hood and figuring out how all of this works, or advocated, promoted, and wished for more and more. I would consider myself an enthusiastic and curious user who advocates for greater access and shares experiences in order to grow the knowledge base of users, allowing for informed choice.

What do you think about the current state of screen readers and web content?

The web is a constantly changing, dynamic, and exciting environment in which the majority of us spend our time, whether professionally or personally. The web used to be static, mainly text, and very simple: web documents. For the most part, this meant that as long as the operating system(s) and web browser(s) were accessible with screen readers, the experience was quite good.

Now, let’s jump forward to where we have nearly half a dozen operating systems, nearly half a dozen web browsers, and nearly a dozen screen readers. In addition, the web has become very rich, interactive, and full of multimedia content: web applications. Fortunately, there exists the constantly developing Accessible Rich Internet Applications (WAI-ARIA) specification, which web browser and screen reader developers can and should follow in order to make the current and future web universally accessible, including to screen reader users. I think that overall there exist some good examples of accessible web applications, but the majority are not there yet, and it will require work from the entire industry and for all of us to work as a community.

You mentioned WAI-ARIA earlier; can you give us some good examples that you’ve seen?

Some excellent examples of WAI-ARIA, other than the various test pages that exist, come from Yahoo! Yahoo! Mail provides a very rich, interactive, and accessible webmail experience, extremely close to the experience that screen reader users are used to, have come to embrace, and look for in a desktop email client. Yahoo! Search also highlights the power of Search Direct with screen readers.

The key factors that allowed Yahoo!’s ARIA and accessibility initiatives to work extremely well were that the visionary, leader, and manager was a very experienced and skilled screen reader user who understood and embraced the need to work with the community. Yahoo! worked very closely with NV Access (developers of NVDA) and Mozilla (developers of Firefox), ensuring that everyone was in it together: testing, iterating, and delivering great ARIA.

I believe WAI-ARIA should be used only when content can’t be made accessible using HTML and CSS. This is partly because it’s not fully supported across screen readers and browsers, but also because it can be very hard to get right. Have you come across any examples where WAI-ARIA is used but to little benefit for the user?

I am in full agreement. In addition to the excellent points that you mentioned, Henny, ARIA is a working draft; ARIA 2.0 is the direction we are heading for what we need in the next generation of the web, but we are not there yet. An example of ARIA that is used but poorly implemented is one that is an opt-in experience, doesn’t work with all screen readers, and doesn’t provide braille access, among many other things.
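To illustrate the point above: a native HTML control is accessible out of the box, while an ARIA retrofit of a generic element has to recreate the role, focusability, and keyboard behaviour by hand. A minimal sketch (the `save()` handler is hypothetical):

```html
<!-- Preferred: a native button is focusable, keyboard-operable,
     and announced as a button by screen readers with no extra work. -->
<button type="button" onclick="save()">Save</button>

<!-- ARIA retrofit: the div needs an explicit role, a tabindex to be
     focusable, and scripted Enter/Space handling to act like a button.
     Easy to get wrong; use it only when native HTML can't express the widget. -->
<div role="button" tabindex="0" onclick="save()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') save()">
  Save
</div>
```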

With ChromeVox, NVDA, System Access and others we’ve seen a rise in free and open source screen readers contending with commercial, more ‘establishment’ screen readers such as JAWS and Window-Eyes. What are the differences between the two?

The commercial, expensive screen readers are driven by, and focused on, government funding, business purchases, and retrofitting the screen reader to work with an inaccessible or unusable product. In contrast, the free, open source, or integrated screen readers are proactive, aggressive, and bold: they keep up very well with the technology industry, build to standards, and involve users of the technology in the development of the screen readers.

Henny: Competition can only be a good thing as far as we end users are concerned!

There has been a lot of debate about the state of accessibility in HTML5. What are your thoughts on this?

From all I have gathered from research, discussions, etc., the overall landscape for HTML5 and accessibility is that ARIA is now integrated into HTML5, whereas before, with HTML4, ARIA was an add-on. There are a lot more nuances, details, and specifics involved in HTML5 and accessibility; nothing is yet finalized and the particulars are still being worked out, but I do think that integrated, improved, and embraced accessibility by way of ARIA in HTML5 is excellent.
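As a rough sketch of that integration (not from the interview, and the exact element-to-role mappings are still being specified): HTML5’s sectioning elements carry ARIA landmark semantics natively, where HTML4-era markup had to bolt the same roles onto generic divs:

```html
<!-- HTML5: landmark semantics come with the elements themselves. -->
<header>…</header>   <!-- exposed as the "banner" landmark -->
<nav>…</nav>         <!-- exposed as the "navigation" landmark -->
<footer>…</footer>   <!-- exposed as the "contentinfo" landmark -->

<!-- HTML4-era equivalent: ARIA as an add-on to generic divs. -->
<div role="banner">…</div>
<div role="navigation">…</div>
<div role="contentinfo">…</div>
```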

What tips would you recommend to the day to day developer wanting to test with screen readers?

Before a developer becomes concerned with testing with a screen reader, I would recommend that they get familiar with the various accessibility standards and language, such as WCAG, W3C, ARIA, Section 508/504, etc.

Once the developer’s product has complied with the relevant standard, I would suggest the developer use a screen reader and unplug or disable the visual display and mouse. Then, I would advise the developer to navigate, read, and interact with his/her product, taking careful note of all the information the screen reader is providing and evaluating whether it is intuitive, clear, and usable. Of course, the developer will need to iterate several times before it’s accessible and usable, but a very important part of the entire experience is to get real-world, day-to-day users of screen readers to test and provide feedback, and to iterate on that.

What’s the first thing you do when you open a web page?

This will depend if it’s on desktop or mobile, but I will describe and outline both.

On Windows, using Firefox and NVDA, I use ‘D’ to navigate by ARIA landmarks, ‘H’ to navigate by headings (if there are no landmarks present or I want a next-level overview), NVDA+F7 to get a full list of elements, TAB to move among elements, and UP/DOWN ARROW to navigate by lines.

On iOS, using Safari and VoiceOver, I use the Web Rotor to locate ARIA landmarks and SWIPE DOWN/UP to navigate to the next/previous landmark; set the rotor to headings (if no ARIA landmarks are present or if I want a next-level overview) and SWIPE DOWN/UP; TOUCH EXPLORE to “see” the layout and where elements are; and SWIPE RIGHT/LEFT to go to the NEXT/PREVIOUS OBJECT.
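For those keystrokes and rotor settings to have something to land on, a page needs landmarks and a heading hierarchy. A hypothetical minimal layout:

```html
<div role="banner">
  <h1>Site name</h1>              <!-- heading navigation jumps here -->
</div>
<div role="navigation">           <!-- landmark navigation stops here -->
  <ul>
    <li><a href="/news">News</a></li>
    <li><a href="/about">About</a></li>
  </ul>
</div>
<div role="main">
  <h2>Today's headlines</h2>      <!-- next-level overview via headings -->
  <p>…</p>
</div>
<div role="contentinfo">…</div>   <!-- typically the page footer -->
```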

Mobile content seems to be about as accessible as web content on desktop 10 years ago. What kind of hindrances are you seeing there?

Similar to the way the desktop environment worked for a very long time, people primarily used applications that they got from various media, installed, and ran. The Windows accessibility APIs were not great early on, but have got better as time has passed, and the remaining inaccessibility lies in custom controls, widgets, and user interfaces.

Mobile applications, especially on iOS, have become quite accessible and usable, thanks to well-established accessibility APIs, documentation, a developer community, and user advocacy. The issues that do exist with mobile web apps or native apps lie in the operating system, the screen reader, and developers unaware of or uneducated about the need for and value of accessibility.

Apple have done so much for accessibility in mobile but what other alternatives are there for blind and partially sighted users and what devices/platforms do you think organisations should be testing mobile content with?

Android has been dubbed the 800-pound gorilla that is unavoidable. This includes accessibility, especially with 4.0/Ice Cream Sandwich: the built-in web browser is accessible, TalkBack provides access to the touch screen, and it is available on tablets and smartphones. This is the platform right behind iOS in terms of popularity and accessibility, and the operating system of the future that will power mobile devices and much more.

The challenge with finding Android test devices is that organizations and companies need to get smartphones/tablets running pure/vanilla Android, such as the Samsung Galaxy Nexus, and ensure they do not include any custom UI, skins, or overlays such as MotoBlur, HTC Sense UI, or Samsung TouchWiz. This ensures one is testing with pure Android and that there is no manufacturer interference, which can include breaking or removing accessibility.

Before iOS, Nokia and Talks seemed to be the phone of choice for people with visual impairments. Are these still worth testing with? And what about the new Nokia Screen Reader?

Most companies, organizations, and institutions have very limited resources. This is most true when it comes to mobile platform support, accessibility, and the ability to have thorough testing, support, and accessibility across a wide range of platforms. If the knowledge, person(s), time, money, and availability of devices exist, Talks and Mobile Speak on Nokia Symbian should be tested and developed for, to ensure the mobile web experience is accessible there as well. The challenge is that the user base for Symbian as a platform, including Talks or Mobile Speak, is very small and has decreased dramatically over the last few years. In addition, Nokia has dropped Symbian as a platform in favour of Windows Phone, which is inaccessible and unusable with a screen reader.

What are the key things to making web content accessible on mobile?

It is important to be aware of the different ways that one can access the mobile web using a screen reader: iOS Safari/VoiceOver, Android Ice Cream Sandwich (natively, with AndroidVox and Chrome), or pre-Android 4.0 with the Mobile Accessibility web browser and a D-pad, trackball, or ARROW KEYS.

Knowing which environments, browsers, and screen readers can be used is very important and leads directly to what needs to be done on the part of the developer, which includes: ensuring all controls, including images, have meaningful, clear, and concise labels; and ensuring all controls, text, and other information can be read, navigated, and interacted with. The most important aspect is that all this be done during the research and design phase, rather than towards the end of the development cycle or after a web application or document has shipped. Throughout the entire process, it is important that the developer and real-world, day-to-day users of the screen readers test and provide feedback, and that the developer iterates.
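A sketch of the labelling advice above (all file names and attribute values are illustrative): every control and informative image should expose a clear, concise accessible name:

```html
<!-- Informative image: concise alt text becomes its accessible name. -->
<img src="chart.png" alt="Monthly sales, January to June">

<!-- Purely decorative image: empty alt so screen readers skip it. -->
<img src="divider.png" alt="">

<!-- Icon-only control: aria-label supplies the missing text. -->
<button type="button" aria-label="Search"><img src="magnifier.png" alt=""></button>

<!-- Form field: an explicit label association the screen reader can announce. -->
<label for="email">Email address</label>
<input type="email" id="email" name="email">
```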

Finally, if you were stranded on a desert island and could choose one piece of technology, what would it be and why?

The Apple iPhone 4S, due to the well over half a million iOS apps (allowing one to do just about anything and everything imaginable), the great accessibility of these iOS apps, and Siri and VoiceOver working extremely well together, with great integration providing a stellar user experience.

Henny: Maybe I’ll come back to you in a couple of years and see if the answer is the same! Kevin, thank you very much for your time and sharing your experience and ideas.

