Precise numbers on deafblindness are difficult to calculate, so published figures vary widely. For the sake of writing an intro to this story, we’re going to cite this study from the World Federation of the DeafBlind, which puts the number of severe cases at 0.2% of the global population and 0.8% of the U.S. population.
Whatever the actual figure, it’s safe to say that people living with a combination of hearing and sight loss form a profoundly underserved community. That community is the focus of the work being done by the small robotics firm Tatum (Tactile ASL Translational User Mechanism). I met with the team at MassRobotics during a trip to Boston last week.
The company’s 3D-printed robotic hand sat in the middle of the conference room table as we spoke about Tatum’s origins. The whole thing started life in summer 2020 as part of founder Samantha Johnson’s master’s thesis for Northeastern University. The 3D-printed prototype can spell out words with American Sign Language, offering people with deafblindness a window to the outside world.
From the user’s end, it operates much like tactile fingerspelling. The user places their hand over the back of the robot’s, feeling its movements to read as it spells. When no one is around who can sign, there can be a tremendous sense of isolation for people with deafblindness, as they’re able neither to watch nor listen to the news and are otherwise cut off from remote communication. In this age of teleconferencing, it’s easy to lose track of precisely how difficult that loss of connection can be.
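To make the interaction concrete, here is a minimal sketch of what a text-to-fingerspelling loop might look like. This is purely illustrative: the `HANDSHAPES` table, the `set_pose` callback, and the flexion values are assumptions for the sake of the example, not Tatum’s actual hardware API, and only a few letters are mapped.

```python
# Hypothetical sketch of a text-to-fingerspelling loop. All names and
# values here are illustrative assumptions, not Tatum's real interface.
import time

# Illustrative mapping from letters to finger flexion values
# (0.0 = fully extended, 1.0 = fully curled), one value per finger:
# thumb, index, middle, ring, pinky.
HANDSHAPES = {
    "a": (0.2, 1.0, 1.0, 1.0, 1.0),  # fist with thumb alongside
    "b": (1.0, 0.0, 0.0, 0.0, 0.0),  # flat hand, thumb tucked
    "c": (0.5, 0.5, 0.5, 0.5, 0.5),  # curved "C" shape
}

def fingerspell(text, set_pose, hold_s=0.6):
    """Drive the hand through one handshape per letter, holding each
    long enough for a reader resting a hand on the device to feel it."""
    executed = []
    for ch in text.lower():
        pose = HANDSHAPES.get(ch)
        if pose is None:
            continue  # skip characters with no mapped handshape
        set_pose(pose)       # send the pose to the (hypothetical) hand
        executed.append(pose)
        time.sleep(hold_s)   # hold the shape so it can be read by touch
    return executed
```

A real system would of course cover the full manual alphabet and handle transitions smoothly, but the core readout loop — map each character to a handshape, hold, move on — is the basic idea.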
“Over the past two years, we began developing initial prototypes and conducted preliminary validations with DB users,” the company notes on its site. “During this time, the COVID pandemic forced social distancing, causing increased isolation and lack of access to important news updates due to intensified shortage of crucial interpreting services. Due to the overwhelming encouragement from DB individuals, advocates, and paraprofessionals, in 2021, Tatum Robotics was founded to develop an assistive technology to aid the DB community.”
Tatum continues to iterate on its project through testing with the deafblind community. The goal is to build something akin to an Alexa for people with the condition, using the hand to read a book or get plugged into the news in a way that might otherwise have been completely inaccessible.
In addition to working with organizations like the Perkins School for the Blind, Tatum is simultaneously working on a pair of hardware projects. Per the company:
The team is currently working on two projects. The first is a low-cost robotic anthropomorphic hand that will fingerspell tactile sign language. We hope to validate this device in real-time settings with DB individuals soon to confirm the design changes and evaluate ease-of-use. Simultaneously, progress is ongoing to develop a safe, compliant robotic arm so that the system can sign more complex words and phrases. The systems will work together to create a humanoid device that can sign tactile sign languages.
Linguistics: In an effort to sign accurately and repeatably, the team is looking to logically parse through tactile American Sign Language (ASL), Pidgin Signed English (PSE) and Signed Exact English (SEE). Although research has been conducted in this field, we aim to be the first to develop an algorithm to understand the complexities and fluidity of t-ASL without the need for user confirmation of translations or pre-programmed responses.
Support has been growing among organizations for the deafblind. It’s a community that has long been underserved by these sorts of hardware projects. There are currently an estimated 150 million people with the condition globally. It’s not exactly the sort of total addressable market that gets return-focused investors excited — but for those living with the condition, this manner of technology could be life changing.
Source: TechCrunch