On October 7th, 2017, I had the privilege of attending the 8th annual Boston Accessibility Conference. It was held at the IBM Watson Health Global Headquarters in Cambridge, MA and was hosted by the Boston Accessibility Group, a team of individuals dedicated to promoting “universal design of and accessibility to IT.”
Here at Wakefly, our focus is mainly on the Web aspects of Accessibility. The Boston Accessibility Conference (better known as “a11ybos2017”) touched on a much broader range of accessibility topics, namely:
- Artificial Intelligence (AI)
- Virtual Reality (VR)
- Affordability of Assistive Technologies
- Mobile Apps and Websites
- Wearable Tech
- Cognitive Web Accessibility
Unfortunately, I wasn’t able to attend every session due to overlapping scheduling. Below is a breakdown of all the sessions, along with my attendance:
- Cognitive Assistant for the Blind (Keynote) by Chieko Asakawa from Japan (✔attended)
- Impact of Artificial Intelligence (AI) on Accessibility Solutions (✔attended)
- Inclusionary Technology: A Solution for Managing Accommodations of a Mobile Workforce (✔attended)
- Innovations in Affordable Assistive Technology (✖ did not attend)
- Empowering and Employing the Blind and Visually Impaired (✖ did not attend)
- Navigating an Inaccessible World: Inclusive Design for Mobile Apps and Websites (✔attended)
- SNaSI: Wearable Tech to Help the Blind and Visually Disabled in Face-to-face Interactions (✖ did not attend)
- Ableism, AI, Filtering, and Assistive Technology: Take Off the Digital Blindfold (✖ did not attend)
- Cognitive Web Accessibility: Eye Tracking, Machine Learning, and App Development (✔attended)
Below is a summary of a11ybos2017, based on the sessions I attended. Unfortunately, slides and other resources for the presentations were not made available to attendees.
After the opening remarks, the Keynote speech was delivered by Chieko Asakawa, an IBM Fellow (IBM’s most prestigious technical honor). Through advances in cognitive computing, computers are reaching the point where they can help people who are blind and visually impaired sense, recognize, and understand the real world. Ms. Asakawa discussed her research on Information Accessibility at IBM Research, then the concept of Cognitive Assistance for the Blind, and lastly played short videos and demos (one demo starts at approximately 5 minutes and 40 seconds into the video below):
Ms. Asakawa’s keynote speech was incredibly informative in showing the advances that are being made with the use of AI and how this technology can improve quality of life.
Next was the panel on the Impact of AI on Accessibility Solutions. This discussion covered innovations in AI and machine learning, along with the new possibilities and solutions that VR offers people with disabilities. Topics covered:
- Aging population:
  - Using VR to bring people together in virtual spaces (e.g., virtual family reunions hosted in another country)
  - Empathy training with VR (e.g., experiencing various forms of disability)
- Facial and emotion recognition via AI:
  - Pros: small devices are able to reach a greater range of disabilities
  - Cons: decrease in human interaction (e.g., devices in place of a caregiver)
- Factors that may hinder AI:
  - Reliability: we are far from 100% accurate AI interpretation and recognition. In one example, a bottle of Diet Coke was identified as a bottle of beer.
  - Government ethics and regulation (e.g., automated cars)
  - Sharing of AI data (currently, there is a reluctance to share)
- Downsides of AI:
  - Becoming too dependent on AI too quickly
  - Errors in identifying data can be dangerous
  - Potential to change our behavior (e.g., no more road rage with automated cars)
- The future of AI:
  - Will it bring down costs for assistive technologies?
    - Unemployment is high amongst disabled individuals.
    - Costs of devices are also high.
  - Will the government decide to fund it? How about organizations?
  - AI is very powerful. Who will be responsible for it? Who will govern it?
The AI panel discussion was followed by lunch and then three break-out sessions, each running roughly 45 minutes. Multiple sessions ran concurrently during this portion of the schedule.
Navigating an Inaccessible World: Inclusive Design for Mobile Apps and Websites was the first break-out session I attended. I chose it because it directly aligns with what we do here at Wakefly. The website portion covered accessibility best practices to consider when planning a new site, including color-contrast considerations, alternate text usage, form element labeling, and meaningful link text. It also covered things to look out for during wireframing, such as pages without headings, keyboard traps (e.g., Facebook and Twitter feeds), on-change page content (is it announced to the user?), and alternate text for calendar widgets, social media icons, and star ratings.
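One of those best practices, color contrast, can actually be checked programmatically. As an illustration (not something shown in the session), here is a minimal Python sketch of the WCAG 2.x contrast-ratio formula; the function names are my own, and WCAG AA requires at least 4.5:1 for normal-size text:

```python
def _srgb_to_linear(channel: float) -> float:
    """Linearize one sRGB channel (0-1) per the WCAG relative-luminance definition."""
    if channel <= 0.03928:
        return channel / 12.92
    return ((channel + 0.055) / 1.055) ** 2.4


def relative_luminance(hex_color: str) -> float:
    """Relative luminance of a color given as a hex string like '#336699'."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
    return 0.2126 * _srgb_to_linear(r) + 0.7152 * _srgb_to_linear(g) + 0.0722 * _srgb_to_linear(b)


def contrast_ratio(color_a: str, color_b: str) -> float:
    """WCAG contrast ratio, from 1:1 (identical colors) up to 21:1 (black on white)."""
    lighter, darker = sorted((relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

For example, `contrast_ratio("#767676", "#ffffff")` comes out to roughly 4.54, which is why #767676 is often cited as the lightest gray that passes AA on a white background.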
The Inclusive Design for Mobile Apps portion covered a few apps used by blind individuals. Both BlindSquare and Seeing Eye GPS are navigation apps that can describe the surrounding environment, announce points of interest, and detail street intersections. BlindWaze is a crowdsourcing app that helps individuals find the exact location of a particular bus stop. It was developed with support from Google and the MBTA.
Inclusionary Technology: A Solution for Managing Accommodations of a Mobile Workforce was the second session I attended. The accommodation process is the foundation for developing an inclusive and productive workforce, and the mobility of today’s workforce increases the complexity of managing this process effectively. To make matters worse, approximately 20% of those with a disability are unemployed.
The Mobile Accommodation Solution (MAS) is an app built to increase employment opportunities for disabled individuals as well as streamline the disability accommodation process throughout every phase of the employment cycle. The Job Accommodation Network (JAN) helps people with disabilities enhance their employability and shows employers how they can capitalize on the talent that people with disabilities add to their workplace. MAS leverages the free services of JAN with the ultimate goal of creating a more inclusive workplace for disabled employees.
The final break-out session I attended was a student panel called Cognitive Web Accessibility: Eye Tracking, Machine Learning, and App Development. In this panel, three students spoke individually about their experiences testing the accessibility of Microsoft Word and PowerPoint, measuring cognitive load via eye tracking, and simplifying text to make sentences easier to read (e.g., transforming text from a 12th-grade reading level down to a 3rd-grade level without changing its original meaning).
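To give a sense of how a “reading grade level” can be quantified at all, here is an illustrative Python sketch using the standard Flesch–Kincaid grade-level formula with a crude syllable heuristic. The panelists did not share their exact method, so this is my own example, not theirs:

```python
import re


def _count_syllables(word: str) -> int:
    # Crude heuristic: count vowel groups, discounting a trailing silent 'e'.
    # Real tools use pronunciation dictionaries for better accuracy.
    groups = re.findall(r"[aeiouy]+", word.lower())
    count = len(groups)
    if word.lower().endswith("e") and count > 1:
        count -= 1
    return max(count, 1)


def fk_grade_level(text: str) -> float:
    """Flesch-Kincaid grade: 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(_count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59
```

A simplifier like the one the students described could rewrite a sentence, then verify that a score like this one dropped without the meaning changing.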
A11ybos2017 was an incredible event. I now have a much greater awareness of accessibility, far broader than my previous “Web Accessibility only” perspective. The exposure we all received from the speakers, sessions, and topics was invaluable, and I cannot wait to attend a11ybos2018 next year.