This is paid content. It was written and produced by Studio 61 of Red Ventures' marketing unit in collaboration with the sponsor.
According to the World Health Organization, 1 billion people around the world have some form of disability. That's around 15% of the population. It's not surprising, then, that Microsoft launched the AI for Accessibility program to put powerful technology in the hands of those building assistive tools that support independence and productivity. This effort is part of Microsoft's broader AI for Good initiative, which also includes programs dedicated to environmental sustainability, humanitarian action, and preserving cultural heritage.
Applications that assist
AI for Accessibility is a $25 million commitment from Microsoft aimed at harnessing the power of AI to accelerate the development of new assistive solutions. The support includes use of Microsoft's Azure cloud and AI tools. All AI for Accessibility projects focus on at least one of three challenges:
- Employment: Helping people develop more advanced skills for the workplace and evolve existing efforts that support inclusive hiring, ultimately lowering the disability unemployment rate.
- Daily life: Building solutions that offer heightened independence to perform daily tasks and opportunities to personalize tools for unique needs.
- Communication and connection: Offering equal access to information by creating more inclusive avenues for listening, speaking, and writing.
AI can serve as the "brains" behind tools that enhance independence and productivity for people who have disabilities. Here are a few examples of how organizations have used Microsoft's AI to unlock solutions to challenges faced by people with disabilities.
Zyrobotics, among the first grantees of the AI for Accessibility program, developed an app called "ReadAble Storiez." This STEM-based reading fluency program is designed for students with diverse learning needs, and it helps fill in the gaps for students who may not have access to speech-language or occupational therapists. By creating custom speech models with Microsoft Cognitive Services and Azure Machine Learning, ReadAble Storiez identifies when a student needs feedback, much as a therapist or teacher would.
InnerVoice, a mobile app developed by iTherapy, uses Azure AI computer vision to teach language and literacy skills. Using a tablet or phone camera and onscreen 3D avatars, InnerVoice leverages Microsoft's facial recognition tools to create an interactive platform that helps connect the dots between language and facial expressions. For example, teachers and learners can use InnerVoice to label everyday items and promote communication skills.
The Frist Center for Autism and Innovation at the Vanderbilt University School of Engineering is working on a new service intended to support individuals with autism as they prepare for job interviews. The "Career Interview Readiness in Virtual Reality" (CIRVR) platform uses a variety of Azure AI tools to evaluate several key components of interaction during interviews and provide feedback organically. Once complete, the system will be able to remind the user of things like maintaining eye contact or providing more context-rich answers to questions.
ObjectiveEd is developing the "Braille AI Tutor" app, which incorporates Microsoft AI-based speech recognition to help students practice reading Braille with personalized, gamified learning plans. The app sends a word or a sentence to a refreshable Braille display for students to feel and read aloud, and then provides immediate feedback to offer an interactive learning experience.
iMerciv is developing a navigation app called "MapinHood," aimed at helping pedestrians who are visually impaired walk to their intended destinations more efficiently. Using crowdsourced data and Azure machine learning, the app alerts users audibly to potential hazards on their route and calls out useful elements in their surroundings such as benches, ramps, and water fountains.
Beyond the AI for Accessibility program, Microsoft also offers several apps and services of its own that empower people with disabilities.
Microsoft Translator is an AI-powered communication technology that uses an advanced form of automatic speech recognition to convert raw spoken language – ums, stutters, and all – into fluent, punctuated text. The Rochester Institute of Technology has piloted the Presentation Translator tool within PowerPoint in its classrooms to offer real-time lecture captions that help bridge communication barriers for students who are deaf or hard of hearing.
Seeing AI is a free app designed for the blind and low-vision community that helps enrich the user's awareness of the visual world around them. The app uses AI to describe people, text, and objects to unlock many "first time" experiences, like distinguishing currency notes and reading handwritten cards.
The diversity of these grantees and their AI applications is impressive in its own right, and these are just the early entrants in this continuing program.
Technology has an important role to play in the lives of those with disabilities, and we are just beginning to see the impact AI will have in driving innovations that empower everyone.