Prediction of human voluntary movement before it occurs

Ou Bai, Varun Rathi, Peter Lin, Dandan Huang, Harsha Battapady, Ding Yu Fei, Logan Schneider, Elise Houdayer, Xuedong Chen, Mark Hallett

Research output: Contribution to journal › Article › peer-review


Objective: Human voluntary movement is associated with two changes in electroencephalography (EEG) that can be observed as early as 1.5 s prior to movement: slow DC potentials and frequency power shifts in the alpha and beta bands. Our goal was to determine whether and when we can reliably predict human natural movement before it occurs from EEG signals, online and in real time. Methods: We developed a computational algorithm to support online prediction. Seven healthy volunteers participated in this study and performed wrist extensions at their own pace. Results: The average online prediction time was 0.62 ± 0.25 s before actual movement, as monitored by EMG signals. There were also predictions that occurred without subsequent actual movement; in these cases, subjects often reported that they had been thinking about making a movement. Conclusion: Human voluntary movement can be predicted before it occurs. Significance: Successful prediction of human movement intention will provide further insight into how the brain prepares for movement, as well as the potential for direct cortical control of a device, which may be faster than normal physical control.
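The kind of prediction described in the abstract, detecting the pre-movement drop in alpha/beta band power (event-related desynchronization) against a resting baseline, can be sketched roughly as follows. This is a minimal illustrative example, not the authors' actual algorithm: the sampling rate, band limits, threshold ratio, and the `band_power`/`predict_movement` helpers are all assumptions introduced here.

```python
# Hypothetical sketch of online movement-intention detection from EEG,
# using a simple threshold on alpha-band power drop (ERD).
# All parameter values are illustrative assumptions.
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def band_power(window, fs, lo, hi):
    """Mean spectral power of `window` within the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def predict_movement(window, baseline_power, fs=FS, drop_ratio=0.5):
    """Flag an intended movement when alpha-band (8-12 Hz) power falls
    below `drop_ratio` times the resting baseline, i.e. event-related
    desynchronization."""
    p = band_power(window, fs, 8.0, 12.0)
    return p < drop_ratio * baseline_power

# Synthetic demo: resting EEG carries a strong 10 Hz rhythm that
# attenuates shortly before movement.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
rest = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
pre_move = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)

baseline = band_power(rest, FS, 8.0, 12.0)
print(predict_movement(rest, baseline))      # no desynchronization
print(predict_movement(pre_move, baseline))  # alpha power has dropped
```

In an online setting this check would run on a sliding window of the EEG stream, with the baseline estimated from rest periods; the paper's reported lead time of roughly 0.6 s before EMG onset reflects how early such pre-movement changes become detectable.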

Original language: English
Pages (from-to): 364-372
Number of pages: 9
Journal: Clinical Neurophysiology
Issue number: 2
Publication status: Published - Feb 2011


Keywords

  • Brain-computer interface (BCI)
  • Consciousness
  • Electroencephalography (EEG)
  • Event-related desynchronization (ERD)
  • Human intention
  • Movement-related cortical potentials (MRCP)
  • Prediction
  • Voluntary movement

ASJC Scopus subject areas

  • Clinical Neurology
  • Neurology
  • Physiology (medical)
  • Sensory Systems

