Sign Language Over Mobile Phones

Here’s an interesting article on the “MobileASL” project at UW.

The MobileASL project at UW has been working to optimize compressed video for sign language. By concentrating image quality around the signer’s face and hands, researchers have brought the data rate down to about 30 kilobits per second while still delivering intelligible sign language. MobileASL also uses motion detection to recognize whether a person is currently signing, so the phone can cut back on video processing and extend its battery life during calls.
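The signing-detection idea can be illustrated with simple frame differencing: compare consecutive video frames and treat the pair as “signing” when enough pixels have changed. This is only a rough sketch of the general technique, not MobileASL’s actual algorithm; the threshold values here are illustrative assumptions.

```python
import numpy as np

def is_signing(prev_frame, curr_frame, area_threshold=0.02):
    """Classify a frame pair as 'signing' when enough pixels change.

    Frames are 2-D grayscale arrays scaled to [0, 1]. Both thresholds
    are illustrative guesses, not MobileASL's tuned parameters.
    """
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    changed = np.mean(diff > 0.1)  # fraction of pixels that moved noticeably
    return bool(changed > area_threshold)

# A phone could drop its frame rate (saving battery and bandwidth)
# whenever is_signing() stays False across a run of frames.
still = np.zeros((48, 64))
moving = still.copy()
moving[10:30, 20:40] = 1.0  # simulate a hand sweeping through the frame

print(is_signing(still, still))    # identical frames: no motion
print(is_signing(still, moving))   # large changed region: signing
```

In practice a real detector would also smooth the decision over time (e.g., require several consecutive “idle” frames before throttling) so that brief pauses mid-sentence don’t interrupt the video.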

Click here to read the full article at GizMag.com

Check out the video below, which explains the research and shows the phones in action.
