Because of concerns about privacy, many people are wary of digital personal assistants like Siri, Alexa or Cortana that are always listening. But for older adults living alone, some of the latest advances in interactive technology could be game-changers. For example, Alexa's recently introduced Show and Tell skill on the Amazon Echo Show helps adults with poor vision live more independently by identifying household items for them.
Love to cook but not sure whether what you're holding is cayenne pepper or cumin? Just ask Alexa, "What am I holding?" Simple, disaster averted. Technology like the Show and Tell feature can help empower people with vision impairment to feel more independent in their own homes and less reliant on others. The new Alexa skill was developed with visually impaired individuals to make sure their needs were being met. The Echo Show's camera scans the item, and the device makes a sound to indicate that the label is positioned correctly.
Other accommodating features of the Echo Show include an adjustable speech rate, captioning for hearing-impaired users and the ability to transcribe incoming voice messages on Echo devices with a screen. Users need only say, "Alexa, speak slower" (or faster) to adjust the speaking rate. Google and Microsoft are also getting on board with tools and apps that help users with speech impairments, deafness or limited mobility.
According to the World Health Organization, an estimated 15 percent of the world's population lives with some type of disability, which adds up to more than 1 billion people globally. And with a rapidly aging population, that number is expected to rise over the next 20 to 30 years.
Read more about how people with disabilities are using artificial intelligence to improve their lives in a recent Public Broadcasting Service Nova Tech+Engineering report.