Exploratorium_8.1.1
Output
- to Tactile or Pain sense

Annotated Bibliography


Azenkot, S., Ladner, R. E., & Wobbrock, J. O. (2011, October). Smartphone haptic feedback for nonvisual wayfinding. In The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility. https://doi.org/10.1145/2049536.2049607

  • The researchers tested three different haptic feedback methods for navigation. One uses the phone as a compass, one uses touch at the corners of the phone to indicate direction, and the last uses vibration patterns. Participants preferred the last two, especially the vibrations, and the researchers plan to integrate this feedback into a fuller navigation system.

City University of Hong Kong. (2022, October 20). Research co-led by CityU develops a high-resolution, wearable electrotactile rendering device that virtualizes the sense of touch. Cityu.edu.hk. https://www.cityu.edu.hk/research/stories/2022/10/20/research-co-led-cityu-develops-high-resolution-wearable-electrotactile-rendering-device-virtualizes-sense-touch

  • A collaborative research team co-led by City University of Hong Kong (CityU) has developed a wearable electrotactile rendering system that can mimic the sensation of touch with high spatial resolution and a rapid response rate. Although phones allow us to hear and see our loved ones over a distance, the researchers say this device could supply the missing sense of touch during isolating times such as a pandemic.

MIT News. (2022, October 20). Reprogrammable materials selectively self-assemble. https://news.mit.edu/2022/reprogrammable-materials-selectively-self-assemble-1020

  • Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) introduce magnetically reprogrammable materials that can be applied to the faces of component parts so that the parts selectively self-assemble. Using this method, the parts of a product such as a chair or table, once programmed magnetically, self-assemble when a random disturbance causes them to collide.

Headley, P. C., Hribar, V. E., & Pawluk, D. T. (2011, October). Displaying braille and graphics on a mouse-like tactile display. In The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility (pp. 235-236).

  • The paper examines effective methods to display braille on a moving tactile display (a haptic mouse).

Hribar, V. E., & Pawluk, D. T. (2011, October). A tactile-thermal display for haptic exploration of virtual paintings. In The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility (pp. 221-222).

  • The paper discusses the portrayal of paintings through refreshable haptic displays driven by their digital representations. The haptic display consists of: (1) a pin matrix display under the fingers that relays tactile texture information about brushstrokes, (2) a thermal display onto which the warm-cold spectrum of colors is mapped, and (3) sensing of the user’s location within the painting, used to vary the tactile and thermal feedback and create contrasts within the painting.

Manshad, M. S., Pontelli, E., & Manshad, S. J. (2011, October). MICOO (multimodal interactive cubes for object orientation): A tangible user interface for the blind and visually impaired. In The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility (pp. 261-262). https://doi.org/10.1145/2049536.2049597

  • Students with visual impairments use various resources to depict traditionally visual concepts such as pie charts, but these have their limitations. The authors propose teaching these concepts with tangible cubes on an interactive surface, in an interface that is both multimodal and multitouch.

Pereira, T., Fonseca, B., Paredes, H., & Cabo, M. (2011, October). Exploring iconographic interface in emergency for deaf. In The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility (pp. 245-246).

  • This paper presents a mobile phone application that allows communication between deaf people and emergency medical services through an iconographic touch interface. The application is useful primarily for the deaf, but also for people without disabilities who face sudden situations in which speech is hard to articulate.

Sánchez, J., & Espinoza, M. (2011, October). Audio haptic videogaming for navigation skills in learners who are blind. In The proceedings of the 13th international ACM SIGACCESS conference on Computers and accessibility (pp. 227-228).

  • The purpose of this study was to determine whether the use of an audio- and haptic-based videogame has an impact on the development of Orientation and Mobility (O&M) skills in school-age blind learners. The videogame Audio Haptic Maze (AHM) was designed and developed, and its usability and cognitive impact were evaluated to determine its effect on the development of O&M skills. The results show that the interfaces used in the videogame are usable and appropriately designed, and that the haptic interface is as effective as the audio interface for O&M purposes.

Tamilarasan, N., Thirumalini, S., Nirmal, K., Ganapathy, K., Murali, K., & Srinath, H. (2016). Design and simulation of ferrofluid tactile screen for Braille interface. 2016 International Conference on Robotics and Automation for Humanitarian Applications (RAHA). https://doi.org/10.1109/raha.2016.7931890

  • Touch screen surfaces are a barrier for visually impaired people, and while screen readers are a great tool, the researchers argue that a tactile screen would help immensely. Current tactile braille displays are expensive and difficult to maintain. This project investigates ferrofluid, a magnetically responsive liquid, to explore the possibilities of magnetically controlled display layers.