ENG/EFRI FY 2011 Awards Announcement

Mind, Machines, and Motor Control (M3C) Awards

The Office of Emerging Frontiers in Research and Innovation (EFRI) awarded 14 grants in FY 2011, including the following six on the topic of Mind, Machines, and Motor Control (M3C):

Setting prosthetic arms free
The project “A Hybrid Control Systems Approach to Brain–Machine Interfaces for Exoskeleton Control” (1137267) will be led by Jose M. Carmena, in collaboration with Masayoshi Tomizuka and Claire J. Tomlin, all from the University of California at Berkeley.

This research team will investigate how the brain controls the arm to interact with the world.  They will create a novel brain–machine interface, develop and test new hybrid control theories, and create an exoskeleton for the arm.  Focusing these innovations on arm movements that involve multiple degrees of freedom, the team will explore how the nervous system may augment sensory feedback with “motor programs” — patterns of simple movements that help anticipate and control more complex movements.  The researchers aim to uncover fundamental principles at the intersection of the brain, biomechanics, and behavior, which could enable a new generation of prosthetics to restore motor function in neurologically impaired patients.
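
The hybrid-control idea can be pictured, very loosely, as pairing a stored movement plan with ongoing sensory correction.  The short Python sketch below only illustrates that pairing under simple assumptions (a one-dimensional reach, with a minimum-jerk profile standing in for a “motor program”); it is not the team’s actual control architecture.

    # Toy sketch: a feedforward "motor program" plus feedback correction.
    # Illustrative only; the project's hybrid controllers are far richer.

    def motor_program(t, duration=1.5, target=1.0):
        # Minimum-jerk-style reach profile standing in for a stored motor program.
        s = min(max(t / duration, 0.0), 1.0)
        return target * (10 * s**3 - 15 * s**4 + 6 * s**5)

    dt, position = 0.01, 0.0
    for step in range(200):
        t = step * dt
        planned_now = motor_program(t)
        planned_next = motor_program(t + dt)
        feedforward = (planned_next - planned_now) / dt   # anticipatory command
        feedback = 4.0 * (planned_now - position)         # correction from sensed error
        position += (feedforward + feedback) * dt         # simple first-order "arm"
    print(f"final position = {position:.3f} (target 1.0)")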

Simulating the brain to improve motor control
The project “Distributed Brain Dynamics in Human Motor Control” (1137279) will be led by Gert Cauwenberghs, with colleagues Kenneth Kreutz-Delgado, Scott Makeig, Howard Poizner, and Terrence Sejnowski, all from the University of California at San Diego. 

The researchers aim to create an innovative, non-invasive approach for rehabilitation of Parkinson’s disease patients.  In studies of both healthy individuals and those with the disease, the team will use new wireless sensors and a novel imaging method to monitor and record body and brain activity during real-world tasks.  With the help of newly developed brain-like hardware, these data will be used to develop detailed, large-scale models of activity in the brain’s basal ganglia-cortical networks, where Parkinson’s disease takes its toll.  Building on recent advances in control theory, the team will take into account both the perceptual and cognitive factors involved in complex, realistic movements.  Ultimately, they will create a system that offers realistic sensory feedback to stimulate beneficial neurological changes.
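
As a rough flavor of network-level modeling, the toy Python sketch below simulates a few coupled firing-rate populations arranged in a schematic cortex, basal ganglia, and thalamus loop.  The populations, weights, and drives here are arbitrary placeholders; the project’s models will be far more detailed and will run on specialized brain-like hardware.

    import numpy as np

    # Schematic firing-rate loop: cortex -> striatum -> GPi -> thalamus -> cortex.
    labels = ["cortex", "striatum", "GPi", "thalamus"]
    W = np.array([
        #  ctx   str   gpi   thal   (columns = source, rows = target)
        [ 0.0,  0.0,  0.0,  0.8],   # cortex excited by thalamus
        [ 0.9,  0.0,  0.0,  0.0],   # striatum excited by cortex
        [ 0.0, -1.0,  0.0,  0.0],   # GPi inhibited by striatum
        [ 0.0,  0.0, -0.6,  0.0],   # thalamus inhibited by GPi
    ])
    baseline = np.array([0.3, 0.0, 0.8, 0.6])   # arbitrary tonic drives

    rates = np.zeros(4)
    dt, tau = 1.0, 10.0                          # milliseconds
    for _ in range(500):
        drive = baseline + W @ rates
        rates += dt / tau * (-rates + np.maximum(drive, 0.0))  # leaky, rectified
    for name, r in zip(labels, rates):
        print(f"{name:9s} steady-state rate = {r:.3f}")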

Elegant, adaptive control of external devices
The project “Development of New Algorithmic Models and Tools to Enhance Neural Adaptation in Brain Computer Interface Systems” (1137211) will be led by Daniel W. Moran, in collaboration with Eric C. Leuthardt and Kilian Q. Weinberger, all from Washington University in St. Louis.

The goal of this project is to create brain–machine interface technology that allows direct control of external devices as if they were a natural extension of the body.  The team will develop novel algorithms and models to control the brain’s force and torque inputs to devices such as prosthetic arms.  Because the brain changes in response to force feedback, the researchers will incorporate machine learning algorithms so that the brain–machine interface adapts along with it.  Finally, to accommodate situations where more than one external device is in use, the interface will be designed to distinguish which device a given brain signal is intended to control.  Achieving a force-based, adaptive brain–machine interface will ultimately enable external devices like prosthetics to interact dynamically with the environment.
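
One simple way to picture an adaptive decoder is as a mapping from neural features to a force command whose weights are nudged whenever force feedback reveals an error.  The Python sketch below illustrates that idea with a least-mean-squares-style update on simulated data; the feature dimensions, learning rule, and simulated signals are placeholder assumptions rather than the team’s algorithms, and the device-differentiation stage is omitted.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: 16 neural features linearly encode a 2-D force command.
    true_map = rng.normal(size=(2, 16))     # the user's (unknown) intent mapping
    weights = np.zeros((2, 16))             # decoder weights, learned online
    learning_rate = 0.05

    for trial in range(2000):
        features = rng.normal(size=16)               # simulated neural features
        intended = true_map @ features               # force the user meant to exert
        decoded = weights @ features                 # force the decoder outputs
        error = intended - decoded                   # feedback (e.g., from force sensors)
        weights += learning_rate * np.outer(error, features)   # LMS-style update

    print("force error on final trial:", round(float(np.linalg.norm(error)), 4))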

Decoding data to move naturally
The project “Robust Decoder-Compensator Architecture for Interactive Control of High-Speed and Loaded Movements” (1137237) will be led by Sridevi V. Sarma of Johns Hopkins University (JHU), in collaboration with Munther A. Dahleh of the Massachusetts Institute of Technology, John T. Gale of the Cleveland Clinic, and Nitish Thakor of JHU.

Achieving high-speed and natural movements of artificial, damaged, or paralyzed limbs through brain–machine interactive controls is a major challenge.  Current designs for the interface between the brain and (artificial) limb, known as a “decoder,” work best when the brain can rely on visual data from the cerebral cortex to guide the limb in a familiar, measured movement.  This research team aims to create a decoder that enables quick, smooth movements, even in changing or uncertain circumstances.  To do so, the researchers will design the decoder to compensate for lost proprioceptive data from the cerebellum, which signals the body’s position and movement in space and often becomes unreliable after a limb injury.  Using both visual and proprioceptive feedback in the interactive control model will enable more natural movement of the limb or prosthesis.
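
A rough way to picture the compensation problem: when proprioceptive signals are unreliable, a decoder can maintain its own running estimate of where the limb is, predicted from the commands it issues and corrected by occasional visual measurements.  The Python sketch below illustrates that predict-and-correct loop in one dimension; the dynamics, noise levels, and correction gain are arbitrary assumptions, not the proposed decoder-compensator design.

    import numpy as np

    rng = np.random.default_rng(1)

    dt = 0.01
    position, velocity = 0.0, 0.0     # true (hidden) limb state
    est_pos, est_vel = 0.0, 0.0       # internal estimate standing in for proprioception

    for step in range(300):
        command = 1.0 if step < 150 else -1.0       # commanded acceleration
        # True limb dynamics, slightly noisy and unknown to the decoder.
        velocity += command * dt + rng.normal(scale=0.01)
        position += velocity * dt
        # Predict the limb state from the issued command (a forward model).
        est_vel += command * dt
        est_pos += est_vel * dt
        # Every 10 steps, a noisy visual measurement corrects the prediction.
        if step % 10 == 0:
            visual = position + rng.normal(scale=0.02)
            est_pos += 0.5 * (visual - est_pos)

    print(f"true position {position:.3f}, estimated {est_pos:.3f}")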

Partnering with robots
The project “Partnered Rehabilitative Movement: Cooperative Human-robot Interactions for Motor Assistance, Learning, and Communication” (1137229) will be led by Lena H. Ting of Emory University, with colleagues Madeleine E. Hackney of Emory, and Charles C. Kemp and C. Karen Liu of the Georgia Institute of Technology.

How humans physically interact with the world is not well understood.  To investigate cooperative physical interactions, the researchers will use the paradigm of rehabilitative partner dance.  Partner dance requires improvisational, collaborative physical interaction between two individuals toward particular movement goals.  Partners use force cues to physically communicate movement goals and to carry them out cooperatively.  Partner dance also involves long-term motor skill acquisition and short-term motor adaptation.  This project will use human and robot dancers to experimentally verify a hierarchical theory of human sensory-motor control and learning, and to develop predictive models of whole-body human movement for cooperative physical interactions with machines.
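
A common building block for this kind of cooperative physical interaction is admittance control, in which the force a robot senses from its partner is converted into motion, so a gentle push “leads” the robot in that direction.  The Python sketch below shows the generic idea with a virtual mass-damper; the parameters are arbitrary, and the project’s actual interaction controllers and learning models go well beyond this.

    def admittance_step(sensed_force, velocity, dt=0.01, mass=2.0, damping=4.0):
        # Virtual mass-damper: sensed force in, updated velocity command out.
        acceleration = (sensed_force - damping * velocity) / mass
        return velocity + acceleration * dt

    velocity, position = 0.0, 0.0
    for step in range(400):
        force = 5.0 if step < 200 else 0.0   # partner pushes for 2 s, then releases
        velocity = admittance_step(force, velocity)
        position += velocity * 0.01
    print(f"robot moved {position:.2f} m in the direction of the partner's push")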

Replacing vision with other senses
The project “Mobility Skill Acquisition and Learning through Alternative and Multimodal Perception for Visually Impaired People” (1137172) will be led by Zhigang Zhu of the City College of New York (CCNY), in collaboration with Kok-Meng Lee and Boris Prilutsky of Georgia Tech, and with Tony Ro and Ying Li Tian of CCNY.

The goal of this project is to establish design criteria that will improve assistive technologies for visually impaired and blind people.  The researchers will study what sensory information visually impaired people need to find their way around or to reach objects.  Visual information will be replaced and/or augmented with information perceived through other senses, such as touch, temperature, and sound.  To understand how people use these alternate modes of perception for motor control and acquiring motor skills, the researchers will analyze neural activity associated with particular movements.  This understanding will help the researchers create a sensorimotor model applicable to both human and machine function, which can improve sensor designs, displays, and robotics.
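
A simple example of this kind of substitution is mapping a quantity normally judged by sight, such as the distance to an obstacle, onto an auditory or vibrotactile cue.  The Python sketch below shows one such mapping; the ranges, frequencies, and function names are illustrative assumptions rather than the project’s designs.

    def distance_to_cues(distance_m, max_range_m=4.0):
        # Map obstacle distance to (tone frequency in Hz, vibration level in 0..1).
        clipped = min(max(distance_m, 0.0), max_range_m)
        proximity = 1.0 - clipped / max_range_m      # 0 = far, 1 = touching
        tone_hz = 220.0 + 660.0 * proximity          # 220 Hz (far) up to 880 Hz (near)
        vibration = proximity ** 2                   # emphasize nearby obstacles
        return tone_hz, vibration

    for d in (4.0, 2.0, 1.0, 0.25):
        hz, vib = distance_to_cues(d)
        print(f"{d:4.2f} m -> {hz:5.0f} Hz, vibration {vib:.2f}")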

- Cecile J. Gonzalez, NSF, cjgonzal@nsf.gov -