Exploratorium_8.1.1
Input - Using Voice or Sign

Annotated Bibliography

Alisha Pradhan, Kanika Mehta, and Leah Findlater. 2018. "Accessibility Came by Accident": Use of Voice-Controlled Intelligent Personal Assistants by People with Disabilities. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). Association for Computing Machinery, New York, NY, USA, Paper 459, 1–13. https://doi.org/10.1145/3173574.3174033

  • This study analyzes Amazon Echo reviews written by people with disabilities, particularly visual impairments, to understand how they use current personal assistant technologies and how those technologies could be used in the future. Findings show that, although some accessibility challenges exist, users with a range of disabilities are using the Amazon Echo, including for unexpected cases such as speech therapy and support for caregivers.

Bragg, D., Koller, O., Bellard, M., Berke, L., Boudreault, P., Braffort, A., Caselli, N., Huenerfauth, M., Kacorri, H., Verhoef, T., Vogler, C., & Ringel Morris, M. (2019). Sign language recognition, generation, and translation. The 21st International ACM SIGACCESS Conference on Computers and Accessibility. https://doi.org/10.1145/3308561.3353774 

  • This paper presents the results of an interdisciplinary two-day workshop, providing background on aspects of sign language that computer scientists often overlook, a review of the state of the art, a set of pressing challenges, and a call to action for the research community.

Makin, J. G., Moses, D. A., & Chang, E. F. (2020). Machine translation of cortical activity to text with an encoder–decoder framework. Nature Neuroscience. https://www.nature.com/articles/s41593-020-0608-8

  • The authors train a recurrent neural network to encode each sentence-length sequence of neural activity into an abstract representation, and then to decode that representation, word by word, into an English sentence.
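
A minimal sketch of this encoder–decoder idea is shown below, assuming PyTorch, GRU cells, greedy teacher-forced training, and illustrative dimensions (channel count, hidden size, vocabulary) that are not taken from the paper; the authors' actual architecture differs in its details.

```python
# Minimal sketch of a sequence-to-sequence encoder-decoder that maps a sequence
# of neural-activity vectors to a word sequence. Dimensions, GRU cells, and the
# toy usage are illustrative assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn

class NeuralToTextSeq2Seq(nn.Module):
    def __init__(self, n_channels=256, hidden=512, vocab_size=2000, emb=128):
        super().__init__()
        # Encoder: compresses the neural time series into a single hidden state.
        self.encoder = nn.GRU(n_channels, hidden, batch_first=True)
        # Decoder: emits one word at a time, conditioned on the encoder state.
        self.embed = nn.Embedding(vocab_size, emb)
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, neural_seq, target_words):
        # neural_seq: (batch, time, channels); target_words: (batch, words)
        _, state = self.encoder(neural_seq)   # abstract sentence representation
        dec_in = self.embed(target_words)     # teacher forcing during training
        dec_out, _ = self.decoder(dec_in, state)
        return self.out(dec_out)              # per-step logits over the vocabulary

# Toy usage: one "sentence" of 100 time steps over 256 channels, 8 target words.
model = NeuralToTextSeq2Seq()
logits = model(torch.randn(1, 100, 256), torch.randint(0, 2000, (1, 8)))
print(logits.shape)  # torch.Size([1, 8, 2000])
```

The two stages mirror the annotation: the encoder collapses the whole neural recording into one state, and the decoder generates the English sentence one word at a time from that state.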

Peck, F., Leong, A., Zekelman, L., & Hoeft, F. (2018, April). Compensatory Skills and Dyslexia: What Does the Science Say? International Dyslexia Association. https://dyslexiaida.org/compensatory-skills-and-dyslexia-what-does-the-science-say/

  • This article examines in depth the neurological mechanisms by which compensatory skills such as subvocalization help people with dyslexia.

Subvocalization. Psychology Wiki, Fandom. https://psychology.fandom.com/wiki/Subvocalization

  • An overview of subvocalization from the Psychology Wiki.

Will AI Make Interpreters and Sign Language Obsolete? Interesting Engineering. https://interestingengineering.com/innovation/will-ai-make-interpreters-and-sign-language-obsolete

  • The article discusses the rise of automatic speech recognition (ASR) and natural language processing (NLP) as ways to make existing technologies more easily accessible.

M. M. Islam, S. Siddiqua and J. Afnan, "Real time Hand Gesture Recognition using different algorithms based on American Sign Language," 2017 IEEE International Conference on Imaging, Vision & Pattern Recognition (icIVPR), 2017, pp. 1-6, doi: 10.1109/ICIVPR.2017.7890854. Available from: https://ieeexplore.ieee.org/document/7890854

  • This paper presents a real-time hand gesture recognition (HGR) system based on American Sign Language (ASL) recognition with greater accuracy. The system acquires ASL gesture images with a black background from a mobile video camera for feature extraction. In the processing phase, it extracts five features: fingertip finder, eccentricity, elongatedness, pixel segmentation, and rotation.
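
As a rough illustration of how two of the listed shape features (eccentricity and elongatedness) could be computed from a segmented hand silhouette, the sketch below uses OpenCV; the threshold value, the synthetic test image, and the exact formulas are illustrative assumptions rather than the authors' implementation.

```python
# Rough sketch: eccentricity and elongatedness of a hand silhouette on a black
# background. The synthetic input, threshold, and formulas are assumptions,
# not the paper's pipeline.
import cv2
import numpy as np

def shape_features(gray):
    # Segment the bright hand region from the black background.
    _, mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cnt = max(contours, key=cv2.contourArea)

    # Eccentricity from the eigenvalues of the contour's central moments.
    m = cv2.moments(cnt)
    mu20, mu02, mu11 = m["mu20"], m["mu02"], m["mu11"]
    common = np.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    lam1 = (mu20 + mu02 + common) / 2   # major-axis variance
    lam2 = (mu20 + mu02 - common) / 2   # minor-axis variance
    eccentricity = np.sqrt(1 - lam2 / lam1)

    # Elongatedness as the aspect ratio of the minimum-area bounding box.
    (_, _), (w, h), _ = cv2.minAreaRect(cnt)
    elongatedness = max(w, h) / max(min(w, h), 1e-6)
    return eccentricity, elongatedness

# Toy usage: a white ellipse standing in for a hand on a black frame.
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.ellipse(frame, (160, 120), (90, 40), 30, 0, 360, 255, -1)
print(shape_features(frame))
```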

Papastratis I, Chatzikonstantinou C, Konstantinidis D, Dimitropoulos K, Daras P. Artificial Intelligence Technologies for Sign Language. Sensors (Basel). 2021 Aug 30;21(17):5843. doi: 10.3390/s21175843. PMID: 34502733; PMCID: PMC8434597. Available from: https://www.mdpi.com/1424-8220/21/17/5843

  • This survey provides a comprehensive review of state-of-the-art methods in sign language capturing, recognition, translation, and representation, pinpointing their advantages and limitations. In addition, it presents a number of applications and discusses the main challenges in the field of sign language technologies.
