Out now: Commentary on infant entrainment study

Our* commentary on Köster et al. (2019) Psych Sci is now available here. Find a TL;DR below.

Entraining rhythms of the brain to study the neurodevelopment of perception and cognition is a fascinating idea. In principle, entrainment endows the experimenter with increased control over the timing of fundamental neural processes. Yet its appeal has also created a flurry of neuroscientific research that may apply this idea prematurely. Can we simply infer entrainment because we are using rhythmic stimulation? It is underappreciated that there is no clear answer to this question yet. This should make us more cautious not only in interpreting findings about the role of neural rhythms in perception and cognition, but also in testing rhythmic sensory stimulation as a clinical intervention for psychopathologies.

Can we know? Created with the memer package for R, adapted from a tweet by Matt Craddock.

Köster et al. (2019) studied the neurodevelopment of action expectations by presenting infants with rhythmically flickering stimuli while recording their EEG. The authors assumed that this rhythmic stimulation entrained endogenous brain oscillations and, in turn, interpreted a band-limited difference between conditions as a modulation of infant theta oscillations. In our commentary, we discuss this assumption critically and show how a similar pattern of results can arise in a simulated hypothetical scenario. We did not intend to simulate EEG data in their full complexity; instead, we present a simplified model that makes a minimal number of assumptions. Crucially, this model does not require the involvement of entrained endogenous brain oscillations. Our findings may serve as a word of caution against inferring entrainment from (conditional effects on) band-limited neural responses to rhythmic sensory stimulation, and against interpreting such findings in terms of endogenous neural oscillations, in neurodevelopment and beyond.
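To illustrate the general point, here is a minimal sketch in the same spirit (this is not the simulation from our commentary; the response shape, rate, and all parameters are made up): summing a transient evoked response time-locked to each rhythmic stimulus produces band-limited spectral power at the stimulation rate, without any endogenous oscillator in the model.

```python
import numpy as np

fs = 250            # sampling rate (Hz), arbitrary choice
rate_hz = 4.0       # stimulation rate (Hz), in the theta range
dur_s = 10.0
n = int(dur_s * fs)

# A single transient "evoked response": a damped bump lasting ~150 ms
k = np.arange(int(0.15 * fs)) / fs
erp = k * np.exp(-k / 0.03)

# Superimpose one such response per rhythmic stimulus -- no oscillator anywhere
sig = np.zeros(n)
for onset in np.arange(0, dur_s, 1.0 / rate_hz):
    i = int(onset * fs)
    sig[i:i + erp.size] += erp[:n - i]

spec = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(n, 1 / fs)
peak = freqs[np.argmax(spec[1:]) + 1]   # strongest non-DC component
# The spectrum is dominated by the stimulation rate (and its harmonics),
# even though the model contains no endogenous oscillation at all.
```

A band-limited peak at the stimulation frequency is therefore, by itself, ambiguous between entrained oscillations and a train of transient responses.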

Further reads:

Obleser J, Henry MJ, Lakatos P. What do we talk about when we talk about rhythm? PLoS Biology. 2017 Sep 19;15(9):e2002794.

Keitel C, Quigley C, Ruhnau P. Stimulus-driven brain oscillations in the alpha range: entrainment of intrinsic rhythms or frequency-following response? Journal of Neuroscience. 2014 Jul 30;34(31):10137-40.

* This blog post is based on unpublished article elements produced as part of the submission process. As such, they have been edited by the co-authors Molly Henry, Sarah Jessen, and Jonas Obleser and have been part of the journal review process.

Out now: The brain separates auditory and visual “meanings” of words

In our new article, we tried to find new answers to an old question: Are word meanings “the same” in the brain, whether we hear a spoken word or lip-read the same word by watching a speaker’s face? After more than 4 years of work (I tested the first participant in August 2016), this study now found a great home in the journal eLife.

We asked our participants to do the same task in two conditions: auditory and visual. In the auditory condition, they heard a speaker say a sentence. In the visual condition, they only saw the speaker say the sentence, without sound (lip reading). In both conditions, they then chose, from a list of four words, the one they had understood in the sentence.

In the auditory condition, the speech was embedded in noise so that participants would misunderstand words in some cases (on average, they understood the correct word in 70% of trials).

In the visual condition, performance was also 70% correct on average. But lip-reading skills vary widely in the population, and we saw this in our data too: individual performance in the lip-reading task covered the whole possible range (from chance level to almost 100% correct). Needless to say, our participants were all proficient verbal speakers (mostly college students). The idea has long been around that this variability in lip reading reflects something other than normal speech perceptual abilities. Is it therefore possible that the processing of auditory and visual words is completely different in the brain?

To answer this question, we recorded our participants’ brain activity while they did the comprehension task. We used the magnetoencephalogram (MEG), which detects changes in magnetic fields outside the head that are produced by neural activity.

To analyse brain activity during the perception of auditory and visual words, we used a classification approach: First, we tried to reconstruct which word participants had perceived from the waveform patterns in their brain activity (stimulus classification, or decoding). Second, we analysed which of the classification patterns we found predicted whether participants actually perceived the correct word.
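The first step of this logic can be sketched in a few lines. This is a toy illustration only, with synthetic data and a generic classifier (scikit-learn's LDA), not our actual MEG pipeline: word-specific patterns hidden in noisy trials can be recovered with cross-validated classification.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_features = 200, 50          # trials x (sensors * timepoints), made up
words = rng.integers(0, 4, n_trials)    # which of four words was presented

# Synthetic "MEG" patterns: a word-specific template buried in noise
templates = rng.normal(size=(4, n_features))
X = templates[words] + rng.normal(scale=2.0, size=(n_trials, n_features))

# Cross-validated decoding of word identity from the patterns
clf = LinearDiscriminantAnalysis()
acc = cross_val_score(clf, X, words, cv=5).mean()
# Accuracy above chance (0.25) indicates the patterns carry word identity
```

The second step, relating such decoding patterns to single-trial comprehension, follows the same trial-wise logic with behaviour as the outcome.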

In a nutshell, two main findings emerged:

  1. Areas that encode word identity very well (e.g. sensory areas) often do not predict comprehension. Looked at from the other angle, areas that encode the word sub-optimally (e.g. higher-order language areas) influence what we actually perceive. This is true for both auditory and visual speech.
Areas that encode the stimulus (left) and areas that encode the stimulus but also predict word comprehension (right).

As once pointed out by Hickok & Poeppel, we think that the task we perform is the key that determines the results we get – and which areas are most relevant for our behaviour. In our case, higher-order language areas are most important for comprehension. But if the task was to discriminate speech sounds or lip movements, early sensory areas would probably be more task-relevant.

  2. The representations of auditory and visual word identities are largely distinct. They converge only in a couple of areas, situated in the left frontal cortex and the right angular gyrus (green circles below). These areas might therefore hold some kind of amodal perceived meaning of a word.

Summary topography of areas that predict word comprehension in the auditory and visual (lip reading) conditions.

Previous studies have often looked at activation across the brain using fMRI (functional magnetic resonance imaging). Activation means that something is “happening” in a brain area (leading to increased oxygen demand there). These studies usually suggest that the processing of acoustic and visual speech overlaps to a large extent.

But the nature of these activations can be unclear. We think that the activation of a general language network could explain such findings, without necessarily representing specific word identities. Moreover, other studies often use categories (for example, buildings vs animals) instead of single word meanings, which could give a different picture.

Overall, our analysis of specific word identities (meanings?) shows that our brain does very different things when we listen to someone speak or when we try to lip read. This could explain why our ability to understand acoustic speech is usually not related to our ability to lip read.

Please note that this is an updated version of an earlier blog post on the preprint of the same study. Data for this study can be found here.

Call for Papers – Rhythms in cognition: re-visiting the evidence

Special issue at the European Journal of Neuroscience now open for submissions.

Guest Editors Christian Keitel (University of Stirling, UK), Manuela Ruzzoli (University of Glasgow, UK), Chris Benwell (University of Dundee, UK), Niko Busch (University of Muenster, Germany), and Laura Dugué (Paris Descartes University, France).

Call for Papers: November 2019
Extended Submission Deadline: December 21, 2020
Online Publication within One Week of Acceptance
Estimated Date of Publication of Final Issue: Spring 2021

Everyday experience may arise from a fundamentally discrete sampling of our sensory environment, just as a movie consists of still frames shown in rapid succession. Bishop (1932, Am J Physiol) already suggested such a sampling process. He further identified rhythmic brain activity as the phasic neural analogue of a shutter mechanism subserving discrete sampling. Although it contrasts starkly with our intuition of a continuous perceptual flow, the sampling idea has been reiterated over the past 90 years. Ever-more precise recordings of human brain activity and novel analysis techniques have driven an unprecedented surge of interest over the last decade. Consequently, we have come to see brain rhythms as the neural implementation of perceptual sampling and the basis of cognitive functions such as attention, memory and language. In an interesting twist, however, more recent negative findings on the role of pre-stimulus oscillatory phase in perception suggest that support for discrete sampling as a fundamental mechanism remains equivocal.

In our Special Issue, we call for methodologically principled studies, irrespective of their outcome, to provide the most detailed picture to date of the conditions under which perceptual sampling, and its consequences for cognition, can (or cannot) be observed. These can be original contributions, replication attempts, pre-registered studies or file-drawer experiments, provided they follow a rigorous methodology and thus allow clear interpretation even of negative findings. We also welcome dedicated reviews, opinion pieces and methodological advances. Studies can address perceptual sampling in vision, audition or other senses by testing its impact on neurophysiological or behavioural performance measures (psychophysics). We encourage authors to make their data openly accessible, along with their experimental and analysis code, to foster reproducibility and transparency. We invite human studies adopting neuroimaging (EEG, iEEG, MEG, fMRI) and neurostimulation techniques (tES, TMS, sensory entrainment). Animal studies will be a highly welcome supplement.

We are very much looking forward to your submissions to our EJN Special Issue “Rhythms in Cognition: Revisiting the Evidence”.


Alpha rhythms: Some slow down, some grow stronger

Researchers usually assume that alpha brain waves behave relatively consistently over time. In a new study, led by Chris Benwell and just accepted for publication in NeuroImage, we find that this is not necessarily true over the 1-2 hours that a typical EEG experiment lasts.

In a re-analysis of data from two previous EEG experiments, we saw that alpha shows two major trends: alpha power increases (an effect previously described) and alpha frequency decreases consistently over time. Both effects may be related to growing fatigue, the depletion of cognitive resources, or a transition from volitional to more automatic task performance.

We also found that different sources of alpha rhythms in the brain can show different trends. While some showed both power increases and frequency decreases, others showed only one of the two trends. Further analysis revealed that the two trends do not necessarily depend on each other.


Figure: We usually assume that alpha varies “spontaneously” around a mean and that higher alpha leads us to observe different behaviour. This relationship, however, could simply be a consequence of both alpha and behaviour changing “deterministically”, i.e. trending over time. In experiments, we likely observe a “mix” of both sources of variance (also see Benwell et al. 2017 or here).
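The decomposition of alpha into a deterministic trend and spontaneous fluctuations can be illustrated with a few lines of simulated data (hypothetical numbers and a simple linear fit, not the analysis code from the study):

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 400
trial = np.arange(n_trials)

# Simulated per-trial alpha measures: a deterministic linear trend
# plus trial-to-trial ("spontaneous") fluctuations
alpha_power = 1.0 + 0.002 * trial + rng.normal(scale=0.2, size=n_trials)
alpha_freq = 10.5 - 0.001 * trial + rng.normal(scale=0.1, size=n_trials)

power_fit = np.polyfit(trial, alpha_power, 1)
power_slope = power_fit[0]                         # > 0: power grows over time
freq_slope = np.polyfit(trial, alpha_freq, 1)[0]   # < 0: frequency slows over time

# Removing the fitted trend leaves the "spontaneous" component
residual = alpha_power - np.polyval(power_fit, trial)
```

Regressing behaviour on `residual` rather than on raw `alpha_power` is one simple way to avoid mistaking a shared time-on-task trend for a genuine brain-behaviour link.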

These findings need to be taken into account when testing for links between brain rhythms (brain state) and behaviour – a link might be reported accidentally just because both brain state and behaviour change over time – and when attempting to manipulate alpha through stimulation: stimulation frequencies may need to be adapted on the fly, and stimulation will need to target a specific alpha generator while leaving others unaffected.


Anne wins Biomag 2018 Young Investigator Award

[28 Aug 2018]

Anne was one of three Early Career Researchers awarded a BIOMAG Young Investigator Award at the 2018 International Conference on Biomagnetism, held in Philadelphia, USA. Nominated as one of 10 finalists, she won the award for a presentation of her recent work on the neural foundations of speech comprehension [1].


The award ceremony took place at the conference gala in the Philadelphia Museum of Art, known not only for its art collection, but also for its iconic set of stairs famously featured in the “Rocky” movies.

[1] Keitel A, Gross J, Kayser C (2018) Perceptually relevant speech tracking in auditory and motor regions reflects distinct linguistic features. PLoS Biology ~pdf

New preprint on speech tracking in auditory and motor cortices

The tracking of temporal information in speech is frequently used to study speech encoding in dynamic brain activity. Often, studies use traditional, generic frequency bands in their analysis (for example delta [1 – 4 Hz] or theta [4 – 8 Hz] bands). However, there are large inter-individual differences in speech rate. For example, audiobooks are typically narrated with 150 words per minute (2.5 Hz), while the world’s fastest speaker can talk at 637 words per minute (10.6 Hz). We therefore reasoned that speech tracking analyses should take into account the specific regularities (e.g. speech rate) of the stimuli. This is exactly what we did in this study: We extracted the time-scales for phrases, words, syllables and phonemes in our sentences and based our analyses on these stimulus-specific bands.
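The conversion behind these rates is simple arithmetic; as a quick worked example (the helper name is ours, just for illustration):

```python
def words_per_minute_to_hz(wpm):
    """Convert a speech rate in words per minute to a word rate in Hz."""
    return wpm / 60.0

audiobook_rate = words_per_minute_to_hz(150)    # typical narration: 2.5 words/s
fastest_rate = words_per_minute_to_hz(637)      # record-fast speech: ~10.6 words/s
```

The same division by 60 applies to phrase, syllable or phoneme counts, which is how stimulus-specific time-scales can be derived from annotated sentences.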

Previous studies have also mainly used continuous speech to analyse speech tracking. This is a fantastic, “real-world” paradigm, but it lacks the possibility to directly analyse comprehension. We therefore played single sentences to our participants and asked them, after each sentence, to indicate which of four words they had heard in the sentence. This way, we obtained a single-trial comprehension measure. We also recorded participants’ brain activity with magnetoencephalography (MEG) and ran our analyses on source projections of brain activity.

We show two different speech tracking effects that help participants comprehend speech, both acting concurrently at time-scales within the traditional delta band: First, the left middle temporal gyrus (MTG) tracks speech at the word time-scale, which is probably useful for word segmentation and for mapping sound to meaning. Second, the left premotor cortex (PM) tracks speech at the phrasal time-scale, likely indicating the use of temporal predictions during speech perception.


Previous studies have shown that the motor system is involved in predicting the timing of upcoming stimuli by using its beta rhythm. We therefore hypothesised that a cross-frequency coupling between beta-power and delta-phase at the phrasal time-scale could drive the effect in the motor system. This is indeed what we found and this was also directly relevant for comprehension.
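One common way to quantify such coupling between the phase of a slow rhythm and the power of a faster one is a mean-vector-length measure (in the style of Canolty et al.). The sketch below runs it on simulated signals; the band limits, filter settings and signals are illustrative assumptions, not our analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def pac_mvl(sig, fs, phase_band=(1, 3), amp_band=(15, 25)):
    """Mean-vector-length phase-amplitude coupling: how strongly the
    amplitude in amp_band depends on the phase in phase_band."""
    def bandpass(x, lo, hi):
        b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)
    phase = np.angle(hilbert(bandpass(sig, *phase_band)))
    amp = np.abs(hilbert(bandpass(sig, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

fs, dur = 200, 20
t = np.arange(int(fs * dur)) / fs
delta = np.sin(2 * np.pi * 2 * t)                 # 2 Hz "phrasal" rhythm
beta_coupled = (1 + delta) * np.sin(2 * np.pi * 20 * t)   # beta power locked to delta phase
coupled = delta + beta_coupled + 0.1 * np.random.default_rng(2).normal(size=t.size)
uncoupled = delta + np.sin(2 * np.pi * 20 * t) + 0.1 * np.random.default_rng(3).normal(size=t.size)
# pac_mvl should be clearly larger for the coupled signal
```

In the real analysis, the phase band would be the stimulus-specific phrasal time-scale rather than a fixed 1-3 Hz range.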

By using stimulus-specific frequency bands and single-trial comprehension, we show specific functional and perceptually relevant speech tracking processes along the auditory-motor pathway. In particular, we provide new insights regarding the function and relevance of the motor system for speech perception.

If you would like to read the full manuscript, you can find a preprint on bioRxiv here.

First WoRB – Workshop on Rhythms in the Brain – concluded

Last week, the first Workshop on Rhythms in the Brain – WoRB – took place here in Glasgow, at the Institute of Neuroscience & Psychology. The organisers thank our speakers Satu Palva, Ayelet Landau, Hartwig Siebner & Niko Busch for their inspiring talks and great discussions, and for joining our hike up Conic Hill.

Thanks also to everyone who attended the talks and found the time to meet with our guests, and to our fantastic Admin team, who made it all happen.


Accepted paper: Spontaneous and deterministic fluctuations of pre-stimulus alpha power govern biases in visual line bisection

Chris Benwell’s most recent study shows that an observer’s brain state, just before seeing a transected line, influences their judgment of its centre. Also, these critical brain states do not seem to fluctuate randomly from instant to instant, but trend systematically over time.

Brain states can be defined by rhythmic activity in characteristic frequency bands. We know that the alpha rhythm (roughly ten cycles per second) plays a role in how we take in the visual world around us. For example, strong alpha can shield us from visual impressions, whereas weak alpha allows for more sensory intake.

Until recently, this role of alpha had mostly been shown for briefly presented light flashes just bright enough to be on the verge of visibility. But Chris’ study suggests that alpha may be a gatekeeper for other aspects of visual processing in the brain, too. Here, observers judged whether a line was transected left or right of its centre. Crucially, alpha power before line presentation influenced this judgment.

Line stimuli – Observers had to judge whether lines were transected left or right of their actual centre.

More surprising, however, was that fluctuations in alpha power weren’t as spontaneous or random as previously thought. Instead, the time observers spent on the task seemed to play a role – at least to some extent. Alpha simply increased over the course of the experiment. And here’s the novel aspect: this deterministic trend predicted a gradual shift in line-centre judgments from trial to trial. On average, observers judged the centre to be further to the right than it actually was.

In brief, Chris’ experiment showed that alpha not only influences whether we see very faint stimuli but also how we judge the centre of a visual object. Also, alpha does not just fluctuate randomly but has a deterministic component: when a task is performed for an extended period of time, alpha increases gradually. This leads to a sustained and predictable change in our visual perception.


Schematic of spontaneous and deterministic influences of alpha (a neurometric) on reported line centre (a psychometric) – the colour gradient towards purple indicates that observers showed an increasing rightward bias over time, meaning that they increasingly judged the line centre to be further right than it actually was.

These findings raise an interesting question for future research: Can we find other determinants of alpha (i.e. brain state) fluctuations? Moreover, can all ‘spontaneous’ fluctuations ultimately be described by deterministic processes?

The paper has just been accepted for publication in the European Journal of Neuroscience, and can be found here.

Benwell CSY, Keitel C, Harvey M, Gross J, Thut G (accepted) Trial-by-trial co-variation of pre-stimulus EEG alpha power and visuospatial bias reflects a mixture of stochastic and deterministic effects. European Journal of Neuroscience

This paper is part of the EJN special issue “Neural Oscillations”.

WoRB – Workshop on Rhythms in the Brain – ready to go

We proudly announce the first Workshop on Rhythms in the Brain (WoRB), held in Glasgow on Monday, 11 Sep 2017. WoRB sets the stage for four leading experts – Satu Palva, Ayelet Landau, Hartwig Siebner & Niko Busch – to showcase and discuss their state-of-the-art research in an intense half-day programme.

WoRB aims to gather a broad audience interested in the significance of rhythmic brain activity for cognitive function.
Two key aspects will be in the spotlight:
1) What do brain rhythms code for and how do they give rise to the complexity and efficiency in human behaviour?
2) How can we drive brain rhythms and establish their causal role in cognition through brain stimulation?

The WoRB format delivers condensed talks with opportunities to discuss and to get together with the speakers during the breaks.
It is our hope that WoRB fosters scientific exchange and spawns future perspectives for research into the functional role of intrinsic brain rhythms.


  • 09:00 Registration
  • 09:20 Welcome
  • 09:30 Hartwig Siebner:
    Perspectives of state-informed non-invasive transcranial brain stimulation (NTBS): Creating a “state” or targeting a “state” with NTBS?
  • 10:15 Ayelet Landau:
    Attentional Sampling: a human exploration mechanism
  • 11:00 Coffee Break
  • 11:15 Niko Busch:
    Alpha Oscillations, Neuronal Excitability, and Perceptual Decisions
  • 12:00 Satu Palva:
    Entraining and modulating oscillations with TMS
  • 12:45 Lunch
  • 13:30 End of Session

More info here

Organised by Joachim Gross & Gregor Thut
Organising Committee: Christian Keitel (Chair), Domenica Veniero, Hyojin Park, Roberto Cecere & Christopher Benwell
Website: Marc Becirspahic
WoRB logo courtesy of Christoph Daube
Administrative support: Lindsay Wilson & Alice Lee

Accepted paper on audio-visual synchrony and spatial attention

In this project, spearheaded by first author Amra Covic, we investigated the interplay of synchronised audio-visual (AV) stimuli and paying attention to their location.

AV stimuli typically have a processing advantage over unisensory stimuli. Current accounts ascribe this advantage to a secondary process, an automatic attraction of attention. We were thus surprised to find that AV synchrony and spatial attention instead influenced stimulus processing independently and additively.

Our study made use of the frequency-tagging (FT) approach. FT allowed us to keep track of two simultaneously presented stimuli. Classically, stimuli are flickered by switching them on and off. Here, we implemented an additional stimulus rhythm by periodically changing the shape of our grating-like stimuli (Gabor patches).
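The principle behind frequency tagging can be sketched with simulated data. In this toy version, plain sinusoids stand in for the tagged neural responses (the rates and amplitudes are made up; the real stimuli were shape-modulated Gabor patches): each stimulus can be read out from a single recording at its own frequency.

```python
import numpy as np

fs, dur = 500, 8
t = np.arange(int(fs * dur)) / fs
rng = np.random.default_rng(4)

# Two stimuli "tagged" at different modulation rates; the recorded signal
# is their sum plus noise, yet each can be tracked at its own frequency
f1, f2 = 7.5, 12.0
responses = np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)
eeg = responses + rng.normal(scale=1.0, size=t.size)

spec = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
amp_f1 = spec[np.argmin(np.abs(freqs - f1))]
amp_f2 = spec[np.argmin(np.abs(freqs - f2))]
# Distinct spectral peaks at 7.5 and 12 Hz index each stimulus separately
```

Comparing such tagged amplitudes across conditions (e.g. attended vs unattended, synchronous vs asynchronous) is what makes FT useful for questions like ours.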

The paper has just been accepted for publication in NeuroImage.
Find the final version here: bioRxiv. ~PDF