Gerald Loeb received his medical degree from Johns Hopkins University and has led research in fundamental neurophysiology and applied neural prosthetics at the US NIH, Queen’s University (Canada), and now the University of Southern California. He is a Fellow of the US National Academy of Inventors and founder of SynTouch Inc., a manufacturer of tactile sensors and a Technology Pioneer of the World Economic Forum.
SPEECH TITLE: SELF-ORGANIZING MIDDLEWARE FOR HAPTICALLY ENABLED ROBOTS
The actuators of most robots that operate under human control receive voluntary commands that originate in the cerebral cortex and are mediated by manipulanda or by direct brain-machine interfaces that record neuro-electrical signals. The sensors of those robots provide conscious perceptual feedback to the cerebral cortex via sensory displays (visual, auditory, tactile) or via direct brain-machine interfaces that stimulate neuro-electrical signals. Such systems thus bypass the sophisticated circuits of the spinal cord that normally integrate cortical commands with ongoing somatosensory feedback to produce the graceful and robust behaviors required for haptic exploration and dexterous manipulation of unpredictable objects. The design and function of these spinal circuits in humans appear to reflect the mechanical dynamics of human limbs, so they cannot be ported directly to robotic limbs, whose mechatronics tend to have different dynamics. I will describe a model-based system that develops spinal-like middleware for arbitrary dynamical systems by recapitulating the spontaneous movements and Hebbian learning that occur during human fetal development.
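The learning scheme the abstract alludes to, spontaneous motor babbling paired with Hebbian correlation of motor commands and sensory feedback, can be caricatured in a few lines. The sketch below is purely illustrative and is not the speaker's model: the one-joint linear plant, the function names, and the activity-normalized Hebbian estimate are all assumptions introduced here for clarity.

```python
import random

def plant(command):
    # Hypothetical one-joint "limb": motor command -> proprioceptive
    # feedback through a fixed, unknown gain. A stand-in for the
    # arbitrary dynamical system the middleware must adapt to.
    return 2.0 * command

def babble_and_learn(steps=1000, seed=0):
    # Emit random "fetal" movements and correlate each command with
    # the feedback it evokes (Hebbian co-activity), normalized by
    # total motor activity so the estimate converges.
    rng = random.Random(seed)
    co_activity = 0.0   # running sum of command * feedback
    activity = 0.0      # running sum of command * command
    for _ in range(steps):
        cmd = rng.uniform(-1.0, 1.0)   # spontaneous movement
        sense = plant(cmd)             # somatosensory feedback
        co_activity += cmd * sense
        activity += cmd * cmd
    # Learned sensorimotor weight approximates the plant gain.
    return co_activity / activity
```

With this noiseless linear plant the learned weight recovers the plant gain of 2.0; a richer plant (nonlinear, multi-joint, noisy) would require a correspondingly richer network, but the babble-then-correlate structure is the point being illustrated.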