ABSTRACT
Sign languages have been proven to be natural languages, as capable of expressing human thoughts and emotions as spoken languages are. The distinct visual and spatial nature of sign languages may seem an insurmountable barrier to developing a sign language "word processor". However, we argue that, with advances in computer graphics technology and graphical implementations of linguistic results from the study of sign languages, "writing" in a sign language need not be difficult. We have pursued exploratory work in constructing virtual gestures, applying hand constraints to facilitate the creation of natural gestures, and combining these gestures into meaningful American Sign Language (ASL) parts that follow the ASL Movement-Hold model. The results, although preliminary, are encouraging. We believe that effective sign language composition is possible with the implementation of easy-to-use graphical user interfaces and the development of specialized data management methods.
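The two ideas named in the abstract can be illustrated with a minimal sketch: anatomical joint limits are enforced when posing a virtual hand, and a sign is represented as an alternating sequence of Hold and Movement segments in the spirit of the Movement-Hold model. All names, joint-limit values, and the example segment sequence below are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass

# Illustrative flexion limits (degrees) for a finger's joints; the
# specific values here are assumptions, not measured anatomical data.
JOINT_LIMITS = {"MCP": (0.0, 90.0), "PIP": (0.0, 100.0), "DIP": (0.0, 80.0)}

def clamp_joint(joint: str, angle: float) -> float:
    """Clamp a requested joint angle to its allowed range so that
    generated hand shapes stay within natural limits."""
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, angle))

@dataclass
class Hold:
    handshape: str      # e.g. "B" (flat hand)
    location: str       # e.g. "chin"
    duration_ms: int

@dataclass
class Movement:
    path: str           # e.g. "arc-forward"
    duration_ms: int

# In the Movement-Hold model a sign is segmented into H and M units;
# this hypothetical sequence roughly gestures at ASL THANK-YOU.
sign_example = [
    Hold("B", "chin", 120),
    Movement("arc-forward", 250),
    Hold("B", "neutral-space", 120),
]
```

A composition tool could then validate every posed joint with `clamp_joint` before rendering, and store signs as such segment lists for later editing.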