<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>10</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Miller, John W</style></author><author><style face="normal" font="default" size="100%">Hermes, Dora</style></author><author><style face="normal" font="default" size="100%">Schalk, Gerwin</style></author><author><style face="normal" font="default" size="100%">Ramsey, Nick F</style></author><author><style face="normal" font="default" size="100%">Jagadeesh, Bharathi</style></author><author><style face="normal" font="default" size="100%">den Nijs, Marcel</style></author><author><style face="normal" font="default" size="100%">Ojemann, J G</style></author><author><style face="normal" font="default" size="100%">Rao, Rajesh P N</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Detection of spontaneous class-specific visual stimuli with high temporal accuracy in human electrocorticography.</style></title><secondary-title><style face="normal" font="default" size="100%">Conf Proc IEEE Eng Med Biol Soc</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Conf Proc IEEE Eng Med Biol Soc</style></alt-title></titles><keywords><keyword><style face="normal" font="default" size="100%">Algorithms</style></keyword><keyword><style face="normal" font="default" size="100%">Electrocardiography</style></keyword><keyword><style face="normal" font="default" size="100%">Evoked Potentials, Visual</style></keyword><keyword><style face="normal" font="default" size="100%">Humans</style></keyword><keyword><style face="normal" font="default" size="100%">Male</style></keyword><keyword><style face="normal" font="default" size="100%">Pattern Recognition, Automated</style></keyword><keyword><style face="normal" font="default" size="100%">Pattern Recognition, Visual</style></keyword><keyword><style face="normal" font="default" size="100%">Photic Stimulation</style></keyword><keyword><style face="normal" font="default" size="100%">Reproducibility of Results</style></keyword><keyword><style face="normal" font="default" size="100%">Sensitivity and Specificity</style></keyword><keyword><style face="normal" font="default" size="100%">User-Computer Interface</style></keyword><keyword><style face="normal" font="default" size="100%">Visual Cortex</style></keyword></keywords><dates><year><style face="normal" font="default" size="100%">2009</style></year><pub-dates><date><style face="normal" font="default" size="100%">2009</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">2009</style></volume><pages><style face="normal" font="default" size="100%">6465-8</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">Most brain-computer interface classification experiments from electrical potential recordings have been focused on the identification of classes of stimuli or behavior where the timing of experimental parameters is known or pre-designated. Real world experience, however, is spontaneous, and to this end we describe an experiment predicting the occurrence, timing, and types of visual stimuli perceived by a human subject from electrocorticographic recordings. All 300 of 300 presented stimuli were correctly detected, with a temporal precision of order 20 ms. The type of stimulus (face/house) was correctly identified in 95% of these cases.
There were approximately 20 false alarm events, corresponding to a late 2nd neuronal response to a previously identified event.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Schalk, Gerwin</style></author><author><style face="normal" font="default" size="100%">Leuthardt, E C</style></author><author><style face="normal" font="default" size="100%">Brunner, Peter</style></author><author><style face="normal" font="default" size="100%">Ojemann, J G</style></author><author><style face="normal" font="default" size="100%">Gerhardt, Lester A</style></author><author><style face="normal" font="default" size="100%">Wolpaw, Jonathan</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Real-time detection of event-related brain activity.</style></title><secondary-title><style face="normal" font="default" size="100%">Neuroimage</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Neuroimage</style></alt-title></titles><keywords><keyword><style face="normal" font="default" size="100%">Adult</style></keyword><keyword><style face="normal" font="default" size="100%">Algorithms</style></keyword><keyword><style face="normal" font="default" size="100%">Brain Mapping</style></keyword><keyword><style face="normal" font="default" size="100%">Computer Systems</style></keyword><keyword><style face="normal" font="default" size="100%">Diagnosis, Computer-Assisted</style></keyword><keyword><style face="normal" font="default" size="100%">Electroencephalography</style></keyword><keyword><style face="normal" font="default" size="100%">Epilepsy</style></keyword><keyword><style face="normal" font="default" size="100%">Evoked Potentials</style></keyword><keyword><style face="normal" font="default" size="100%">Female</style></keyword><keyword><style
face="normal" font="default" size="100%">Humans</style></keyword><keyword><style face="normal" font="default" size="100%">Male</style></keyword><keyword><style face="normal" font="default" size="100%">Pattern Recognition, Automated</style></keyword><keyword><style face="normal" font="default" size="100%">Reproducibility of Results</style></keyword><keyword><style face="normal" font="default" size="100%">Sensitivity and Specificity</style></keyword></keywords><dates><year><style face="normal" font="default" size="100%">2008</style></year><pub-dates><date><style face="normal" font="default" size="100%">11/2008</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.ncbi.nlm.nih.gov/pubmed/18718544</style></url></web-urls></urls><volume><style face="normal" font="default" size="100%">43</style></volume><pages><style face="normal" font="default" size="100%">245-9</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">The complexity and inter-individual variation of brain signals impedes real-time detection of events in raw signals. To convert these complex signals into results that can be readily understood, current approaches usually apply statistical methods to data from known conditions after all data have been collected. The capability to provide meaningful visualization of complex brain signals without the requirement to initially collect data from all conditions would provide a new tool, essentially a new imaging technique, that would open up new avenues for the study of brain function. Here we show that a new analysis approach, called SIGFRIED, can overcome this serious limitation of current methods. SIGFRIED can visualize brain signal changes without requiring prior data collection from all conditions. This capacity is particularly well suited to applications in which comprehensive prior data collection is impossible or impractical, such as intraoperative localization of cortical function or detection of epileptic seizures.</style></abstract><issue><style face="normal" font="default" size="100%">2</style></issue></record></records></xml>