October 13, 6pm, online
ABSTRACT
For individuals who are blind, non-verbal cues such as facial expressions are largely inaccessible during social interactions. We explored whether Paul Ekman's facial action units, the building blocks of facial expressions, can be mapped to a vibrotactile display. We present the design of a visual-to-tactile sensory substitution device and its software, along with experimental findings showing the potential to convey emotions and facial movements through touch. We also survey the state of the art in social assistive aids and discuss the open challenges these technologies pose for future work.
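For attendees curious what such a mapping might look like in software, here is a minimal, hypothetical sketch, not the speaker's actual system: it assumes a facial tracker that reports FACS action-unit intensities and a display with one vibration motor per action unit. The MotorCommand type, the AU_TO_MOTOR table, and the activation threshold are all illustrative; only the AU numbers themselves are standard FACS codes.

```python
# Hypothetical sketch: translating detected FACS action units (AUs)
# into vibrotactile motor commands. Motor layout, intensities, and
# thresholds are illustrative assumptions, not the presented design.

from dataclasses import dataclass

@dataclass
class MotorCommand:
    motor_id: int      # index of a vibration motor on the tactile display
    intensity: float   # normalized drive strength, 0.0 to 1.0
    duration_ms: int   # pulse length in milliseconds

# Illustrative table: each action unit drives one assumed motor.
# The AU numbers are standard FACS codes.
AU_TO_MOTOR = {
    1: 0,   # AU1  inner brow raiser      -> motor 0
    4: 1,   # AU4  brow lowerer           -> motor 1
    6: 2,   # AU6  cheek raiser           -> motor 2
    12: 3,  # AU12 lip corner puller      -> motor 3
    15: 4,  # AU15 lip corner depressor   -> motor 4
}

def encode_expression(au_intensities: dict[int, float]) -> list[MotorCommand]:
    """Map AU intensities (0-1, e.g. from a facial tracker) to motor
    commands, one pulse per sufficiently active action unit."""
    commands = []
    for au, level in au_intensities.items():
        motor = AU_TO_MOTOR.get(au)
        if motor is not None and level > 0.2:  # ignore weak activations
            commands.append(MotorCommand(motor, level, duration_ms=300))
    return commands

# Example: a Duchenne smile is commonly coded as AU6 + AU12.
print(encode_expression({6: 0.8, 12: 0.9}))
```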
BIO
Troy McDaniel is an Assistant Professor in Engineering at Arizona State University. He is the Director of the Center for Cognitive Ubiquitous Computing (CUbiC) and the Principal Investigator of ASU's NSF Research Traineeship (NRT) on Citizen-Centered Smart Cities and Smart Living. McDaniel's research interests include haptic interfaces, robotics, human-computer interaction, and machine learning, especially as applied to haptics. He is particularly interested in tactile vision sensory substitution and haptic human augmentation. His current research investigates how information traditionally presented visually and/or aurally may be presented haptically through novel touch-based interfaces. His application areas include assistive technologies for individuals with sensory impairments, rehabilitative technologies for individuals with physical impairments, and technologies supporting health, wellness, and smart living.
SCHEDULE
- 6-6:30pm: Networking via virtual whiteboard and Zoom
- 6:30pm: Presentation
- 7:30pm: Q&A
HOW THIS WORKS
We will be using:
- A Zoom meeting for the event
- A Miro virtual whiteboard during the pre-talk networking
Links for both the Zoom meeting and the Miro whiteboard will be sent to registered attendees.
SPONSOR
Thank you to our generous sponsor!