City Halls Use AI to Interpret Sign Language in Japan


Local governments in Japan are turning to artificial intelligence to improve communication with people who are deaf or hard of hearing at their public counters.

A system jointly developed by the University of Electro-Communications in Tokyo and SoftBank Corp. converts sign language gestures into written text.

While the system currently requires equipment at the counters, the municipalities hope that it will eventually be usable with a simple smartphone.

At the Narashino City Office in Chiba Prefecture, a hearing-impaired woman asked for directions to the toilet using sign language as she stood in front of a camera.

A text translation appeared on a staff member’s computer screen after about three seconds. The spoken response then appeared as text on the screen in front of the woman, allowing for smooth interaction.

“Sure Talk,” as the AI system that converts sign language into Japanese text is called, uses image recognition technology that analyzes skeletal movements in multiple parts of the body, such as the fingers and arms, to produce a response in Japanese. Sign language footage from hundreds of people was used to develop the system.
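The article does not disclose Sure Talk's internals, but the skeletal-movement approach it describes can be illustrated with a toy sketch: extract a sequence of body keypoints per video frame, then match the trajectory against templates for a known vocabulary. The data, vocabulary, and nearest-neighbor matching below are invented stand-ins, not the actual system.

```python
import numpy as np

def make_sign(seed, frames=20, keypoints=21):
    """Stand-in for a recorded sign: a smooth random trajectory of
    (frames x keypoints x 2) skeletal coordinates."""
    r = np.random.default_rng(seed)
    return np.cumsum(r.normal(size=(frames, keypoints, 2)), axis=0)

# Toy three-word vocabulary; a real system would be trained on
# footage from hundreds of signers.
vocabulary = {
    "toilet": make_sign(1),
    "hello": make_sign(2),
    "thanks": make_sign(3),
}

def classify(sequence, vocab):
    """Nearest-neighbor match on keypoint trajectories: return the
    vocabulary word whose template is closest in Euclidean distance."""
    best_word, best_dist = None, float("inf")
    for word, template in vocab.items():
        dist = np.linalg.norm(sequence - template)
        if dist < best_dist:
            best_word, best_dist = word, dist
    return best_word

# A slightly noisy observation of the "toilet" sign should still match.
rng = np.random.default_rng(0)
observed = vocabulary["toilet"] + rng.normal(
    scale=0.05, size=vocabulary["toilet"].shape)
print(classify(observed, vocabulary))
```

This also makes the data-hunger mentioned later in the article concrete: template matching only covers the words, signers, and signing styles you have recordings of, which is why collecting large, varied video corpora matters.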

Although conversations with people who are deaf or hard of hearing can be conducted in writing, the AI system is “much smoother, as the translation takes place in real time,” said an official from the city of Narashino.

Mito in Ibaraki Prefecture and Chofu in Tokyo have also installed the system at counters in their city offices.

“Sure Talk” still has a lot of room for improvement, however. Currently, it can accurately translate signs for only about 1,500 Japanese words. “A huge amount of sign language data is needed to create an accurate model for translating signs into Japanese text,” said a SoftBank engineer involved in the development of the system.

To improve the system’s accuracy, the mobile communications and internet service company has launched a website and smartphone application soliciting the public’s cooperation, calling on as many people as possible to submit videos of themselves signing.

In a related development, Hokkaido University and Nippon Telegraph and Telephone East Corp. are jointly developing an automated AI-based sign language interpretation system.

They aim to install cameras in hospitals, pharmacies, tourist sites and other locations so that people who are deaf or hard of hearing can obtain information even when no staff member capable of using sign language is present.

The Nippon Foundation and Google LLC have developed “Sign Town,” a game to help people enjoy learning sign language. Players advance by correctly answering questions, signing their responses at their computer’s camera. The game is available for free on the public benefit foundation’s website.

But for AI to be an effective tool for people who are deaf or hard of hearing, far more nuance is needed, argues the Japanese Federation of the Deaf, something that will be difficult to achieve in the short term, if at all.

“As sign language has unique regional dialects and expressions, the current AI-based sign language translation is not sufficient. If it is developed further, it will become an effective way to hold simple conversations and respond to inquiries at public offices and other places,” said a federation official.





