<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">McCane, Lynn M</style></author><author><style face="normal" font="default" size="100%">Heckman, Susan M</style></author><author><style face="normal" font="default" size="100%">McFarland, Dennis J</style></author><author><style face="normal" font="default" size="100%">Townsend, George</style></author><author><style face="normal" font="default" size="100%">Mak, Joseph N</style></author><author><style face="normal" font="default" size="100%">Sellers, Eric W</style></author><author><style face="normal" font="default" size="100%">Zeitlin, Debra</style></author><author><style face="normal" font="default" size="100%">Tenteromano, Laura M</style></author><author><style face="normal" font="default" size="100%">Wolpaw, Jonathan</style></author><author><style face="normal" font="default" size="100%">Vaughan, Theresa M</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">P300-based brain-computer interface (BCI) event-related potentials (ERPs): People with amyotrophic lateral sclerosis (ALS) vs. 
age-matched controls.</style></title><secondary-title><style face="normal" font="default" size="100%">Clin Neurophysiol</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Clin Neurophysiol</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">alternative and augmentative communication (AAC)</style></keyword><keyword><style  face="normal" font="default" size="100%">amyotrophic lateral sclerosis (ALS)</style></keyword><keyword><style  face="normal" font="default" size="100%">Brain-computer interface (BCI)</style></keyword><keyword><style  face="normal" font="default" size="100%">brain-machine interface (BMI)</style></keyword><keyword><style  face="normal" font="default" size="100%">electroencephalography (EEG)</style></keyword><keyword><style  face="normal" font="default" size="100%">event-related potentials (ERP)</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2015</style></year><pub-dates><date><style  face="normal" font="default" size="100%">02/2015</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.ncbi.nlm.nih.gov/pubmed/25703940</style></url></web-urls></urls><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">&lt;p&gt;&lt;b&gt;OBJECTIVE: &lt;/b&gt;Brain-computer interfaces (BCIs) aimed at restoring communication to people with severe neuromuscular disabilities often use event-related potentials (ERPs) in scalp-recorded EEG activity. Up to the present, most research and development in this area has been done in the laboratory with young healthy control subjects. 
In order to facilitate the development of BCIs most useful to people with disabilities, the present study set out to: (1) determine whether people with amyotrophic lateral sclerosis (ALS) and healthy, age-matched volunteers (HVs) differ in the speed and accuracy of their ERP-based BCI use; (2) compare the ERP characteristics of these two groups; and (3) identify ERP-related factors that might enable improvement in BCI performance for people with disabilities.&lt;/p&gt;&lt;p&gt;&lt;b&gt;METHODS: &lt;/b&gt;Sixteen EEG channels were recorded while people with ALS or healthy age-matched volunteers (HVs) used a P300-based BCI. The subjects with ALS had little or no remaining useful motor control (mean ALS Functional Rating Scale-Revised score 9.4 ± 9.5 SD; range 0-25). Each subject attended to a target item as the items in a 6×6 visual matrix flashed. The BCI used stepwise linear discriminant analysis (SWLDA) to determine the item the user wished to select (i.e., the target item). Offline analyses assessed the latencies, amplitudes, and locations of ERPs to the target and non-target items for people with ALS and age-matched control subjects.&lt;/p&gt;&lt;p&gt;&lt;b&gt;RESULTS: &lt;/b&gt;BCI accuracy and communication rate did not differ significantly between ALS users and HVs. Although ERP morphology was similar for the two groups, their target ERPs differed significantly in the location and amplitude of the late positivity (P300), the amplitude of the early negativity (N200), and the latency of the late negativity (LN).&lt;/p&gt;&lt;p&gt;&lt;b&gt;CONCLUSIONS: &lt;/b&gt;The differences in target ERP components between people with ALS and age-matched HVs are consistent with the growing recognition that ALS may affect cortical function. The development of BCIs for use by this population may begin with studies in HVs but also needs to include studies in people with ALS. 
Their differences in ERP components may affect the selection of electrode montages, and might also affect the choice of presentation parameters (e.g., matrix design, stimulation rate).&lt;/p&gt;&lt;p&gt;&lt;b&gt;SIGNIFICANCE: &lt;/b&gt;P300-based BCI performance in people severely disabled by ALS is similar to that of age-matched control subjects. At the same time, their ERP components differ to some degree from those of controls. Attention to these differences could contribute to the development of BCIs useful to those with ALS and possibly to others with severe neuromuscular disabilities.&lt;/p&gt;</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Hill, Jeremy</style></author><author><style face="normal" font="default" size="100%">Ricci, Erin</style></author><author><style face="normal" font="default" size="100%">Haider, Sameah</style></author><author><style face="normal" font="default" size="100%">McCane, Lynn M</style></author><author><style face="normal" font="default" size="100%">Heckman, Susan M</style></author><author><style face="normal" font="default" size="100%">Wolpaw, Jonathan</style></author><author><style face="normal" font="default" size="100%">Vaughan, Theresa M</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">A practical, intuitive brain-computer interface for communicating 'yes' or 'no' by listening.</style></title><secondary-title><style face="normal" font="default" size="100%">J Neural Eng</style></secondary-title><alt-title><style face="normal" font="default" size="100%">J Neural Eng</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Adult</style></keyword><keyword><style  face="normal" font="default" size="100%">Aged</style></keyword><keyword><style  face="normal" font="default" 
size="100%">Algorithms</style></keyword><keyword><style  face="normal" font="default" size="100%">Auditory Perception</style></keyword><keyword><style  face="normal" font="default" size="100%">brain-computer interfaces</style></keyword><keyword><style  face="normal" font="default" size="100%">Communication Aids for Disabled</style></keyword><keyword><style  face="normal" font="default" size="100%">Electroencephalography</style></keyword><keyword><style  face="normal" font="default" size="100%">Equipment Design</style></keyword><keyword><style  face="normal" font="default" size="100%">Equipment Failure Analysis</style></keyword><keyword><style  face="normal" font="default" size="100%">Female</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Male</style></keyword><keyword><style  face="normal" font="default" size="100%">Man-Machine Systems</style></keyword><keyword><style  face="normal" font="default" size="100%">Middle Aged</style></keyword><keyword><style  face="normal" font="default" size="100%">Quadriplegia</style></keyword><keyword><style  face="normal" font="default" size="100%">Treatment Outcome</style></keyword><keyword><style  face="normal" font="default" size="100%">User-Computer Interface</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2014</style></year><pub-dates><date><style  face="normal" font="default" size="100%">06/2014</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.ncbi.nlm.nih.gov/pubmed/24838278</style></url></web-urls></urls><volume><style face="normal" font="default" size="100%">11</style></volume><pages><style face="normal" font="default" size="100%">035003</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">OBJECTIVE:
Previous work has shown that it is possible to build an EEG-based binary brain-computer interface (BCI) system driven purely by shifts of attention to auditory stimuli. However, these studies used abrupt, abstract stimuli that are often perceived as harsh and unpleasant, and whose lack of inherent meaning may make the interface unintuitive and difficult for beginners. We aimed to establish whether we could transition to a system based on more natural, intuitive stimuli (the spoken words 'yes' and 'no') without loss of performance, and whether the system could be used by people in the locked-in state.
APPROACH:
We performed a counterbalanced, interleaved within-subject comparison between an auditory streaming BCI that used beep stimuli and one that used word stimuli. Fourteen healthy volunteers performed two sessions each, on separate days. We also collected preliminary data from two subjects with advanced amyotrophic lateral sclerosis (ALS), who used the word-based system to answer a set of simple yes-no questions.
MAIN RESULTS:
The N1, N2 and P3 event-related potentials elicited by words varied more between subjects than those elicited by beeps. However, the difference between responses to attended and unattended stimuli was more consistent with words than beeps. Healthy subjects' performance with word stimuli (mean 77% ± 3.3 s.e.) was slightly but not significantly better than their performance with beep stimuli (mean 73% ± 2.8 s.e.). The two subjects with ALS used the word-based BCI to answer questions with a level of accuracy similar to that of the healthy subjects.
SIGNIFICANCE:
Since performance using word stimuli was at least as good as performance using beeps, we recommend that auditory streaming BCI systems be built with word stimuli to make the system more pleasant and intuitive. Our preliminary data show that word-based streaming BCI is a promising tool for communication by people who are locked in.</style></abstract><issue><style face="normal" font="default" size="100%">3</style></issue></record></records></xml>