<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Sam V. Norman-Haignere</style></author><author><style face="normal" font="default" size="100%">Jenelle Feather</style></author><author><style face="normal" font="default" size="100%">Dana Boebinger</style></author><author><style face="normal" font="default" size="100%">Peter Brunner</style></author><author><style face="normal" font="default" size="100%">Anthony Ritaccio</style></author><author><style face="normal" font="default" size="100%">Josh H. McDermott</style></author><author><style face="normal" font="default" size="100%">Gerwin Schalk</style></author><author><style face="normal" font="default" size="100%">Nancy Kanwisher</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">A neural population selective for song in human auditory cortex</style></title><secondary-title><style face="normal" font="default" size="100%">Current Biology</style></secondary-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Auditory Cortex</style></keyword><keyword><style  face="normal" font="default" size="100%">component</style></keyword><keyword><style  face="normal" font="default" size="100%">ECoG</style></keyword><keyword><style  face="normal" font="default" size="100%">Electrocorticography</style></keyword><keyword><style  face="normal" font="default" size="100%">fMRI</style></keyword><keyword><style  face="normal" font="default" size="100%">music</style></keyword><keyword><style  face="normal" font="default" size="100%">natural sounds</style></keyword><keyword><style  face="normal" font="default" size="100%">song</style></keyword><keyword><style  face="normal" font="default" size="100%">Speech</style></keyword><keyword><style  face="normal" font="default" 
size="100%">voice</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2022</style></year></dates><urls><web-urls><url><style face="normal" font="default" size="100%">https://www.sciencedirect.com/science/article/pii/S0960982222001312</style></url></web-urls></urls><volume><style face="normal" font="default" size="100%">32</style></volume><pages><style face="normal" font="default" size="100%">1470-1484.e12</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">How is music represented in the brain? While neuroimaging has revealed some spatial segregation between responses to music versus other sounds, little is known about the neural code for music itself. To address this question, we developed a method to infer canonical response components of human auditory cortex using intracranial responses to natural sounds, and further used the superior coverage of fMRI to map their spatial distribution. The inferred components replicated many prior findings, including distinct neural selectivity for speech and music, but also revealed a novel component that responded nearly exclusively to music with singing. Song selectivity was not explainable by standard acoustic features, was located near speech- and music-selective responses, and was also evident in individual electrodes. 
These results suggest that representations of music are fractionated into subpopulations selective for different types of music, one of which is specialized for the analysis of song.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>17</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Ladan Moheimanian</style></author><author><style face="normal" font="default" size="100%">Sivylla E. Paraskevopoulou</style></author><author><style face="normal" font="default" size="100%">Markus Adamek</style></author><author><style face="normal" font="default" size="100%">Gerwin Schalk</style></author><author><style face="normal" font="default" size="100%">Peter Brunner</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Modulation in cortical excitability disrupts information transfer in perceptual-level stimulus processing</style></title><secondary-title><style face="normal" font="default" size="100%">Neuroimage</style></secondary-title><alt-title><style face="normal" font="default" size="100%">Neuroimage</style></alt-title></titles><keywords><keyword><style  face="normal" font="default" size="100%">Acoustic Stimulation</style></keyword><keyword><style  face="normal" font="default" size="100%">Adult</style></keyword><keyword><style  face="normal" font="default" size="100%">Aged</style></keyword><keyword><style  face="normal" font="default" size="100%">Alpha Rhythm</style></keyword><keyword><style  face="normal" font="default" size="100%">Auditory Cortex</style></keyword><keyword><style  face="normal" font="default" size="100%">Brain Mapping</style></keyword><keyword><style  face="normal" font="default" size="100%">Cortical Excitability</style></keyword><keyword><style  face="normal" font="default" size="100%">Electrocorticography</style></keyword><keyword><style  face="normal" font="default" 
size="100%">Female</style></keyword><keyword><style  face="normal" font="default" size="100%">Humans</style></keyword><keyword><style  face="normal" font="default" size="100%">Male</style></keyword><keyword><style  face="normal" font="default" size="100%">Middle Aged</style></keyword></keywords><dates><year><style  face="normal" font="default" size="100%">2021</style></year><pub-dates><date><style  face="normal" font="default" size="100%">11/2021</style></date></pub-dates></dates><volume><style face="normal" font="default" size="100%">243</style></volume><pages><style face="normal" font="default" size="100%">118498</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">Despite significant interest in the neural underpinnings of behavioral variability, little light has been shed on the cortical mechanism underlying the failure to respond to perceptual-level stimuli. We hypothesized that cortical activity resulting from perceptual-level stimuli is sensitive to the moment-to-moment fluctuations in cortical excitability, and thus may not suffice to produce a behavioral response. We tested this hypothesis using electrocorticographic recordings to follow the propagation of cortical activity in six human subjects who responded to perceptual-level auditory stimuli. Here we show that for presentations that did not result in a behavioral response, the likelihood of cortical activity decreased from auditory cortex to motor cortex, and was related to reduced local cortical excitability. Cortical excitability was quantified using instantaneous voltage during a short window prior to cortical activity onset. 
Therefore, when humans are presented with an auditory stimulus close to perceptual-level threshold, moment-by-moment fluctuations in cortical excitability determine whether cortical responses to sensory stimulation successfully connect auditory input to a resultant behavioral response.</style></abstract></record></records></xml>