Local governments in Japan are turning to artificial intelligence to improve communication with people who are deaf or hard of hearing at their public counters.
A system jointly developed by the University of Electro-Communications in Tokyo and SoftBank Corp. converts sign language gestures into written text.
While the system currently requires equipment at the counters, the municipalities hope that it will eventually be usable with a simple smartphone.
The photo provided shows the display of a system that recognizes sign language movements using artificial intelligence. (Photo courtesy of SoftBank Corp.) (Kyodo)
At the Narashino City Office in Chiba Prefecture, near Tokyo, a hearing-impaired woman asked for directions to the toilet using sign language while standing in front of a camera.
A text translation appeared on a staff member’s computer screen after about three seconds. The spoken response then appeared as text on the screen in front of the woman, allowing for smooth interaction.
“Sure Talk,” as the AI system that translates sign language into Japanese text is called, uses image recognition technology that analyzes the skeletal movements of multiple areas of the body, such as the fingers and arms, to convert signs into Japanese. Sign language footage from hundreds of people was collected to develop the system.
Although staff can communicate with people who are deaf or hard of hearing in writing, conversation through the AI system is “much smoother, as the translation takes place in real time,” a Narashino city official said.
Mito in Ibaraki Prefecture and Chofu in Tokyo have also installed the system at their city offices to assist visitors.
“Sure Talk” still has considerable room for improvement. At present, it can accurately translate signs for only about 1,500 Japanese words. “A huge amount of sign language data is needed to create an accurate model for translating signs into Japanese text,” said a SoftBank engineer involved in the system’s development.
To improve the system’s accuracy, the mobile communications and internet services company has launched a website and a smartphone application seeking the public’s cooperation, calling on as many people as possible to submit videos of themselves using sign language.
In a related development, Hokkaido University and Nippon Telegraph and Telephone East Corp. are jointly developing an AI-based automated sign language translation system.
They aim to install cameras in hospitals, pharmacies, tourist sites and other locations so that people who are deaf or hard of hearing can obtain information even when no staff member capable of using sign language is present.
The Nippon Foundation and Google LLC have developed “Sign Town,” a game designed to make learning sign language enjoyable. Players advance when they correctly answer questions by signing toward their computer’s camera. The game is available for free on the public benefit foundation’s website.
But for AI to become an effective tool for people who are deaf or hard of hearing, far more nuance is needed, argues the Japanese Federation of the Deaf, and that will be difficult to achieve in the short term, if at all.
“As sign language has unique regional dialects and expressions, current AI-based sign language translation is not sufficient. If refined further, it will become an effective way to hold simple conversations and respond to inquiries at public offices and other places,” a federation official said.