Speech and Attention

Research in the Cognitive NeuroSystems Lab includes basic research on speech and attention. Much of this work has been performed in close collaboration with the Human Neuroscience Lab, directed by UCI Professor Ramesh Srinivasan, and was supported during the period 2008-2014 by generous funding from the Army Research Office through MURI and DURIP awards. Our work has focused on the use of non-invasive brain imaging methods such as electroencephalography (EEG) to discern imagined speech and intended direction.


Overview

In 1967, Dewan published a paper in Nature that first described a method for communicating linguistic information through brain waves measured using EEG. He trained himself and several others to modulate their brains' alpha rhythms: to turn these rhythms on and off at will. Alpha rhythms reflect synchronous neuronal activity at a frequency of about 10 Hz and depend not only on whether the eyes are open but also on one's state of attention. They are normally strongest in a state of mental relaxation; mental activity and attention abolish them. With eyes closed, Dewan was able to signal letters of the alphabet in Morse code by voluntarily turning these rhythms on and off. Signalling letters one by one spells out the words and phrases that the communicator has in mind.
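For illustration only, the Python sketch below shows the kind of signal processing that Dewan's scheme implies: band-pass the EEG around 10 Hz, threshold the resulting envelope into alpha-on and alpha-off runs, and read those runs as Morse code. The sampling rate, thresholds, timing rules and helper names (alpha_power, decode_morse_from_alpha) are assumptions made for this example, not details taken from Dewan's paper.

# Hypothetical sketch: decode Morse-coded letters from voluntary alpha on/off.
# Assumes a single-channel EEG signal sampled at fs Hz; all thresholds and
# timing conventions here are illustrative, not taken from Dewan (1967).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

MORSE = {'.-': 'A', '-...': 'B', '-.-.': 'C', '-..': 'D', '.': 'E',
         '...': 'S', '---': 'O'}  # abbreviated table for the example

def alpha_power(eeg, fs, band=(8.0, 12.0)):
    """Instantaneous alpha-band power via band-pass filter + Hilbert envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype='band')
    analytic = hilbert(filtfilt(b, a, eeg))
    return np.abs(analytic) ** 2

def decode_morse_from_alpha(eeg, fs, dot_sec=1.0, threshold=None):
    """Threshold alpha power into on/off runs and read them as Morse code.
    'Alpha on' (relaxed) runs of about one dot length count as dots, about
    three dot lengths as dashes; long 'alpha off' runs separate letters."""
    power = alpha_power(eeg, fs)
    if threshold is None:
        threshold = np.median(power)          # illustrative choice
    on = power > threshold

    # Collapse the boolean series into (state, duration-in-dots) runs.
    runs, start = [], 0
    for i in range(1, len(on) + 1):
        if i == len(on) or on[i] != on[start]:
            runs.append((on[start], (i - start) / (fs * dot_sec)))
            start = i

    letters, symbols = [], ''
    for state, dots in runs:
        if state:                              # alpha present: dot or dash
            symbols += '-' if dots >= 2 else '.'
        elif dots >= 2.5 and symbols:          # long gap: letter boundary
            letters.append(MORSE.get(symbols, '?'))
            symbols = ''
    if symbols:
        letters.append(MORSE.get(symbols, '?'))
    return ''.join(letters)

Under these assumptions, calling decode_morse_from_alpha(eeg, fs=256) on a single posterior EEG channel returns the decoded string of letters.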

In 1988, Farwell and Donchin described in Electroencephalography and Clinical Neurophysiology a second method for transmitting linguistic information. This method is based on the P300 response, again measured using EEG. The P300 is evoked when a person is presented with a stimulus that matches what he or she is looking for: a target. Farwell and Donchin displayed the letters of the alphabet to the thinker, one by one; when the letter that the thinker had in mind appeared, it evoked a P300, so signalling the desire to communicate that letter. Again, thinkers can communicate words by signalling their letters one by one. Much work has been devoted to such brain-computer interfaces in recent years.
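As a rough illustration of the selection step, the sketch below averages the EEG epochs recorded after each candidate letter's presentations and picks the letter whose average response is most positive in a typical P300 latency window. The epoch format, window limits and synthetic test data are assumptions made for this example and do not reproduce Farwell and Donchin's implementation.

# Hypothetical sketch of P300-based letter selection: average the EEG epochs
# recorded after each candidate letter's presentations and pick the letter
# whose average shows the largest positivity in the P300 window.
# Epoch extraction, window, and channel choice are illustrative assumptions.
import numpy as np

def select_letter(epochs_by_letter, fs, window=(0.25, 0.50)):
    """epochs_by_letter: dict mapping letter -> array (n_presentations, n_samples)
    of single-channel EEG epochs time-locked to that letter's presentations.
    Returns the letter with the largest mean amplitude in the P300 window."""
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    scores = {
        letter: epochs.mean(axis=0)[lo:hi].mean()   # average ERP, then window mean
        for letter, epochs in epochs_by_letter.items()
    }
    return max(scores, key=scores.get), scores

# Illustrative use with synthetic data: the target letter 'B' gets a bump
# added in the P300 latency range, the others do not.
if __name__ == '__main__':
    rng = np.random.default_rng(0)
    fs, n_samples = 250, 200                       # 0.8 s epochs at 250 Hz
    epochs = {c: rng.normal(0, 1, (15, n_samples)) for c in 'ABC'}
    t = np.arange(n_samples) / fs
    epochs['B'] += 2.0 * np.exp(-((t - 0.35) ** 2) / 0.005)   # fake P300
    best, _ = select_letter(epochs, fs)
    print(best)                                    # expected: 'B'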

Our basic scientific research is geared towards determining whether brain waves more directly linked to speech production can be used to communicate linguistic information. Speech is the natural method for communicating such information. Were one able to use EEG to measure directly the activity of the brain's speech networks, one could potentially develop an easier and faster method for communicating linguistic information using EEG. Our work on imagined speech production pursues this idea.

The research also aims to determine, from brain waves, where the linguistic information should be sent: in a particular direction, to a particular person, and so on. The question here is not so much how the message should be sent but where or to whom. Work on the relationship between alpha rhythms and attention has, since Dewan's time, revealed that the pattern of alpha rhythm activity in the two hemispheres of the brain provides information about where a person is focusing attention. For example, paying attention to an area in the left half of the visual field causes alpha rhythm activity in the right hemisphere to desynchronize (and so diminish in intensity), and vice versa. These shifts in brain activity are thought to help direct additional sensory and cognitive resources to the attended area.
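As a hedged sketch of how such lateralization can be read out, the code below compares alpha power over posterior channels of the left and right hemispheres and forms a normalized index whose sign indicates the attended side. The channel groupings, frequency band and sign convention are illustrative assumptions, not the lab's published analysis pipeline.

# Hypothetical sketch of reading out the attended side from the hemispheric
# balance of alpha power. Channel groupings, band limits, and the sign
# convention are illustrative assumptions.
import numpy as np
from scipy.signal import welch

def alpha_lateralization(eeg, fs, left_idx, right_idx, band=(8.0, 12.0)):
    """eeg: array (n_channels, n_samples). left_idx / right_idx: indices of
    posterior channels over the left and right hemispheres.
    Returns an index in [-1, 1]; with attention to the left visual field,
    right-hemisphere alpha desynchronizes, pushing the index negative
    (and vice versa under this sign convention)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(eeg.shape[1], 2 * int(fs)))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    alpha = psd[:, in_band].mean(axis=1)          # mean alpha power per channel
    left, right = alpha[left_idx].mean(), alpha[right_idx].mean()
    return (right - left) / (right + left)

def decode_attended_side(eeg, fs, left_idx, right_idx):
    """Crude rule: weaker right-hemisphere alpha -> attention to the left."""
    return 'left' if alpha_lateralization(eeg, fs, left_idx, right_idx) < 0 else 'right'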



In The News


11.2013

Futurescape, hosted by James Woods, Science Channel, Episode "I know what you're thinking"

06.2013

Through the Wormhole, hosted by Morgan Freeman, Science Channel, Season 4, Episode "How Do Aliens Think?" (online video excerpt)

04.2011

Discover Magazine: Silent Warrior



Publications


Chi, X., Hagedorn, J.B., Schoonover, D. & D'Zmura, M. (2011). EEG-based discrimination of imagined speech phonemes. International Journal of Bioelectromagnetism 13(4), 201-206. [PDF]

Deng, S., Srinivasan, R., Lappas, T. & D'Zmura, M. (2010). EEG classification of imagined syllable rhythm using Hilbert spectrum methods. Journal of Neural Engineering 7, 1-13. [PDF]

Deng, S., Srinivasan, R. & D'Zmura, M. (working paper, 2013). Cortical signatures of heard and imagined speech envelopes. [PDF]

D'Zmura, M., Deng, S., Lappas, T., Thorpe, S. & Srinivasan, R. (2009). Toward EEG sensing of imagined speech. In Jacko, J.A. (Ed.), Human-Computer Interaction, Part I, HCII 2009, LNCS 5610 (Berlin: Springer), 40-48. [PDF]

Horton, C., D'Zmura, M. & Srinivasan, R. (2011). EEG reveals divergent paths for speech envelopes during selective attention. International Journal of Bioelectromagnetism 13(4), 217-222. [PDF]

Horton, C., D'Zmura, M. & Srinivasan, R. (2013). Suppression of competing speech through entrainment of cortical oscillations. Journal of Neurophysiology 109, 3082-3093. [PDF]

Horton, C., Srinivasan, R. & D'Zmura, M. (2014). Envelope responses in single-trial EEG indicate attended speaker in a "cocktail party". Journal of Neural Engineering 11, 046015, 1-22. [PDF]

Srinivasan, R., Thorpe, S., Deng, S., Lappas, T. & D'Zmura, M. (2009). Decoding attentional orientation from EEG spectra. In Jacko, J.A. (Ed.), Human-Computer Interaction, Part I, HCII 2009, LNCS 5610 (Berlin: Springer), 176-183. [PDF]

Thorpe, S., D'Zmura, M. & Srinivasan, R. (2011). Lateralization of frequency-specific networks for covert spatial attention to auditory stimuli. Brain Topography, doi:10.1007/s10548-011-0186, 1-16. [PDF]


This work was supported by the Army Research Office through MURI (54228-LS-MUR) and DURIP (W911NF-10-1-0163) awards.