A new "Show and Tell" capability for the Alexa digital assistant on Echo Show devices in the US lets users get audible replies to the question "What am I holding?"
Echo Show devices equipped with cameras use computer vision and machine learning to recognize what people are holding, according to the Seattle-based technology titan.
"It's a tremendous help and a huge time saver because the Echo Show just sits on my counter, and I don't have to go and find another tool or person to help me identify something," mechanical engineer Stacie Grijalva said in a blog post.
"I can do it on my own by just asking Alexa."
Grijalva, who lost her sight as an adult, is a technology manager at a center for the blind and visually impaired in the California coastal city of Santa Cruz that worked with the Amazon team.
"The whole idea for Show and Tell came about from feedback from blind and low vision customers," said Sarah Caplener, head of Amazon's Alexa for Everyone team.
"Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment."
Major technology firms including Apple, Google and Microsoft invest in making their innovations more accessible and helpful to people with disabilities, which is seen as good for business as well as socially beneficial.
Being able to interact with smart speakers or other devices by voice can be a boon for the visually impaired, while features such as automatic captioning of online videos can aid those who can't hear.