Plenary Talks

The 2014 Haptics Symposium will include three plenary talks by leading researchers from the wide-ranging discipline of haptics. Details about these presentations are given below.
 

Keynote 1

Keynote speaker David Eagleman is a neuroscientist and a New York Times bestselling author. He directs the Laboratory for Perception and Action at the Baylor College of Medicine, where he also directs the Initiative on Neuroscience and Law. He is best known for his work on time perception, synesthesia, and neurolaw. At night he writes. His work of fiction, SUM, is an international bestseller published in 27 languages. His book on the internet and civilization, Why the Net Matters, is available as an app for the iPad and as an eBook. Wednesday Is Indigo Blue explores the neurological condition of synesthesia, in which the senses are blended. His latest book, the New York Times bestseller Incognito: The Secret Lives of the Brain, explores the neuroscience "under the hood" of the conscious mind, that is, all the aspects of neural function to which we have no awareness or access. Eagleman is a Guggenheim Fellow, a Next Generation Texas Fellow, a council member of the World Economic Forum, a research fellow at the Institute for Ethics and Emerging Technologies, and a board member of The Long Now Foundation. He is an academic editor for several scientific journals and has been named one of 2012's Brightest Idea Guys by Italy's Style magazine. He is the scientific advisor for the television drama Perception and has been profiled on The Colbert Report, NOVA Science Now, and CNN's Next List, in the New Yorker, and in many other venues. He appears regularly on radio and television to discuss literature and science.

A Vibrotactile Sensory Substitution Device for the Deaf and Profoundly Hearing Impaired

There are 2 million functionally deaf individuals in the United States and an estimated 53 million worldwide. The cochlear implant (CI) is an effective solution for regaining hearing capabilities for certain populations within this group, but not for all. First, CIs are expensive, ranging from $40,000 to $90,000. Second, CIs require invasive surgery. Third, late implantation has low efficacy in adults with early-onset deafness. Given these considerations, millions of deaf individuals would benefit from a hearing replacement that is low cost, does not involve an invasive procedure, and may be more effective for early-onset deaf adults. To this end, we are developing a low-cost, non-invasive, plasticity-based solution for delivering auditory information to the brain. Specifically, we are developing a "vibratory vest" by which auditory information is captured, digitally processed, and delivered to the skin of the torso using small vibratory motors. This technique, known as sensory substitution, has previously proven successful in allowing those who are blind to have visual experiences through the tongue or skin. In this talk, we will present our fully functional, real-time, Bluetooth-operated prototype and demonstrate our results from several speech perception experiments.
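To make the capture-process-deliver pipeline concrete, here is a minimal sketch in Python of one plausible audio-to-vibration mapping: each incoming audio frame is decomposed into frequency bands, and each band's energy sets the drive level of one motor. The motor count, band edges, and frame size below are illustrative assumptions, not details of the actual prototype, whose signal processing the abstract does not specify.

    # Minimal sketch of a filter-bank audio-to-vibration mapping (assumed
    # design; the abstract does not specify the actual signal processing).
    import numpy as np

    N_MOTORS = 24          # hypothetical number of motors on the torso
    SAMPLE_RATE = 16000    # Hz, a typical rate for speech capture
    FRAME_SIZE = 1024      # samples per processing frame (~64 ms)

    # Log-spaced band edges spanning the speech range (assumed choice).
    BAND_EDGES = np.logspace(np.log10(100), np.log10(8000), N_MOTORS + 1)

    def frame_to_motor_intensities(frame):
        """Map one audio frame to per-motor drive levels in [0, 1]."""
        spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
        levels = np.empty(N_MOTORS)
        for i in range(N_MOTORS):
            in_band = (freqs >= BAND_EDGES[i]) & (freqs < BAND_EDGES[i + 1])
            levels[i] = spectrum[in_band].sum()
        levels = np.log1p(levels)          # compress dynamic range
        peak = levels.max()
        return levels / peak if peak > 0 else levels

    if __name__ == "__main__":
        # A 440 Hz tone should drive only the low-frequency motors.
        t = np.arange(FRAME_SIZE) / SAMPLE_RATE
        print(np.round(frame_to_motor_intensities(np.sin(2 * np.pi * 440 * t)), 2))

In a real device, each frame's levels would then be streamed (in the prototype's case, over Bluetooth) to the vest's motor controllers.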
 
This research is supported by a training fellowship from the Keck Center of Interdisciplinary Bioscience Training of the Gulf Coast Consortia (NIBIB Grant No. 5T32EB006350-05) and by the Renz Foundation.
 
 

Keynote 2

Keynote speaker Sile O'Modhrain is a professor in Performing Arts Technology at the School of Music, Theatre and Dance at the University of Michigan. Her research focuses on human-computer interaction, especially interfaces incorporating haptic and auditory feedback. She earned her master's degree in music technology from the University of York and her PhD from Stanford University's Center for Computer Research in Music and Acoustics (CCRMA). She has also worked as a sound engineer and producer for BBC Network Radio. In 1994, she received a Fulbright scholarship and went to Stanford to develop a prototype haptic interface for augmenting graphical user interfaces for blind computer users. Before taking up her position at the University of Michigan, she taught at the Sonic Arts Research Centre at Queen's University Belfast and, from 2001 to 2005, directed the Palpable Machines group at Media Lab Europe, where her work focused on new interfaces for hand-held devices that tightly couple gestural input with touch or haptic display.

Rekindling the Search for the Holy Braille

In 1987, Alan Kay pointed out that the development of interfaces to computers is going backwards. Whereas children begin building their understanding of the world through embodied exploration, then “advance” to learning through images, and finally reach the “sophisticated” state of symbolic understanding, our interaction with computers began by being symbolic, progressed through the GUI, and only now is reaching the point where we can explore computational environments through movement and touch.
 
While this has certainly been true for mainstream computing, interaction for people with visual impairments got stuck at the symbolic phase. We still use command lines and keyboard shortcuts to navigate through our interfaces, and they still communicate with us through words (either spoken or, very occasionally, through the symbolic code of Braille). In many ways, therefore, blind computer users have not been able to take advantage of the revolution in information presentation, facilitated by computation, that has changed the lives of their sighted peers. And this is not for lack of trying to find alternative ways for blind individuals to access non-textual information. There is a long history of attempts to build refreshable tactile displays and, more recently, of using haptic feedback to present graphs, maps, and 3D models to blind computer users. But in most cases the main stumbling block has been the cost and availability of suitable hardware. The search is on, therefore, for a low-cost, full-page refreshable tactile display that can convey both text and graphics. Such a display would allow for the presentation of spatial information such as curves and lines in graphs, the shapes of objects in pictures and maps, and spatially dependent Braille codes for mathematics and music.
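The symbolic-translation step such a display would perform for text can be sketched in a few lines. In the Python sketch below, the 6-dot patterns are standard uncontracted Braille for the letters a-j; the idea of rendering onto a matrix of raised and lowered pins is an illustrative assumption, since the abstract does not describe a particular hardware interface.

    # Sketch of symbolic translation for a hypothetical full-page pin
    # display: map text onto a matrix of raised (1) / lowered (0) pins.
    # Dot patterns are standard uncontracted Braille for a-j; the pin-grid
    # layout is an illustrative assumption.
    #
    # Dot numbering within a cell:  1 4
    #                               2 5
    #                               3 6
    BRAILLE_DOTS = {
        "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
        "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
        "j": {2, 4, 5}, " ": set(),
    }

    def render_line(text):
        """Return a 3-row pin matrix (1 = raised) for one line of text."""
        matrix = [[0] * (2 * len(text)) for _ in range(3)]
        for cell, ch in enumerate(text):
            for dot in BRAILLE_DOTS.get(ch.lower(), set()):
                row = (dot - 1) % 3      # dots 1-3 fill the left column,
                col = (dot - 1) // 3     # dots 4-6 the right column
                matrix[row][2 * cell + col] = 1
        return matrix

    for row in render_line("bad"):
        print("".join("o" if pin else "." for pin in row))

Graphics, by contrast, have no such ready-made code, which is where the perceptual questions discussed next come in.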
 
A second part of the process of designing effective tactile and/or haptic information displays is understanding how to convey spatial and relational information through touch. Here, too, there is a substantial body of work on haptic spatial perception, haptic exploration, and the perception of tactile diagrams and images upon which we as designers can draw.
 
In this talk, I will introduce our own bid to find the Holy Braille. I will first review relevant work in both hardware design and haptic perception. I will then discuss our design process for an interactive product that takes full account of the many perceptual and cognitive layers inherent in presenting information in tangible form to users who have little or no vision. In particular, I will discuss the knotty issue of understanding the differences between sensory substitution, symbolic translation, and semantic interpretation, and the potential pitfalls of misunderstanding the relationship between these three categories of information presentation.
 

TCH Early Career Awardee Talk

The Technical Committee on Haptics (TCH) gives an Early Career Award in odd-numbered years to recognize outstanding contributions to haptics by members of our community who are in the earlier stages of their careers. The 2013 awardee was Sliman J. Bensmaia, and we are happy to host him as a semi-plenary speaker at the 2014 Haptics Symposium.

Semi-plenary speaker Sliman J. Bensmaia received a B.A. in Cognitive Science from the University of Virginia in 1995 and a PhD in Cognitive Psychology from the University of North Carolina at Chapel Hill in 2003, under the tutelage of Dr. Mark Hollins. He then joined the lab of Dr. Kenneth Johnson at the Johns Hopkins University Krieger Mind/Brain Institute as a postdoctoral fellow until 2006, at which time he was promoted to Associate Research Scientist. In 2009, Dr. Bensmaia joined the faculty as an Assistant Professor in the Department of Organismal Biology and Anatomy at the University of Chicago, where he is also a member of the Committees on Neurobiology and on Computational Neuroscience. The main objective of Bensmaia’s lab is to discover the neural basis of somatosensory perception using psychophysics, neurophysiology, and computational modeling. Bensmaia also seeks to apply insights from basic science to develop approaches for conveying sensory feedback in upper-limb neuroprostheses.

Spatial and Temporal Codes Mediate the Tactile Perception of Natural Textures

Our exquisite tactile sensitivity to surface texture allows us to distinguish silk from satin, or even good silk from cheap silk. We show that the tactile perception of natural textures relies on two neural mechanisms. Coarse textural features, for example the individual elements of Braille, are represented in spatial patterns of activation across one population of mechanoreceptive afferents that densely innervate the fingertip skin. In contrast, our ability to discern fine textural features is mediated by the transduction and processing of vibrations produced in the skin during scanning. Indeed, two other populations of vibration-sensitive afferents produce temporally patterned responses to these vibrations, and the spiking patterns in these afferent populations convey texture information and shape the way textures are perceived.
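As a toy illustration of the two codes (an assumed simple model, not the speaker's analysis), the Python sketch below samples a coarse texture across the fingertip as a spatial pattern, and shows how scanning turns a fine texture into a vibration whose temporal frequency equals scanning speed divided by the texture's spatial period. The afferent spacing, scanning speed, and texture profiles are illustrative placeholders.

    # Toy model of spatial vs. temporal texture codes (assumptions noted
    # above; values are placeholders, not experimental parameters).
    import numpy as np

    SCAN_SPEED = 0.08           # m/s, a typical exploratory scanning speed
    AFFERENT_SPACING = 0.001    # m, ~1 mm afferent spacing (assumed)

    def spatial_code(profile, dx):
        """Surface height sampled at afferent locations (coarse features)."""
        return profile[::int(AFFERENT_SPACING / dx)]

    def temporal_code(profile, dx, duration=0.5, fs=2000):
        """Height under one skin site over time while scanning (fine features)."""
        t = np.arange(0, duration, 1 / fs)
        idx = (SCAN_SPEED * t / dx).astype(int) % len(profile)
        return profile[idx]

    if __name__ == "__main__":
        dx = 1e-5                                  # 10 um surface sampling
        x = np.arange(0, 0.05, dx)                 # 5 cm of texture
        coarse = (x // 0.002) % 2                  # 2 mm bars (Braille-scale)
        fine = np.sin(2 * np.pi * x / 0.0002)      # 0.2 mm ridges
        print("spatial pattern across afferents:", spatial_code(coarse, dx)[:12])
        v = temporal_code(fine, dx)
        spec = np.abs(np.fft.rfft(v))
        freqs = np.fft.rfftfreq(len(v), 1 / 2000)
        # Expect a peak near 0.08 / 0.0002 = 400 Hz.
        print(f"dominant vibration frequency: {freqs[spec[1:].argmax() + 1]:.0f} Hz")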