blog reel

  • New preprint: The visual cortex extracts spectral fine details from silent speech

    A guest blog post by Nina Suess. In our new preprint, we tried to find out what speech-related information the human brain can extract from silent lip movements, and how this information is processed differently in healthy ageing. It has already been shown that the visual cortex is able to track the unheard… Read more

  • Out now: Commentary on infant entrainment study

    Our* commentary on Köster et al. (2019) Psych Sci is now available here. Find a TL;DR below. Entraining rhythms of the brain to study the neurodevelopment of perception and cognition is a fascinating idea. In principle, entrainment endows the experimenter with increased control over the timing of fundamental neural processes. Yet, its appeal has also… Read more

  • Out now: The brain separates auditory and visual “meanings” of words

    In our new article, we tried to find new answers to an old question: Are word meanings “the same” in the brain, whether we hear a spoken word or lip-read the same word by watching a speaker’s face? After more than 4 years of work (I tested the first participant in August 2016), this study… Read more

  • Call for Papers – Rhythms in cognition: re-visiting the evidence

    Special issue at the European Journal of Neuroscience now open for submissions. Guest Editors: Christian Keitel (University of Stirling, UK), Manuela Ruzzoli (University of Glasgow, UK), Chris Benwell (University of Dundee, UK), Niko Busch (University of Muenster, Germany), and Laura Dugué (Paris Descartes University, France). Call for Papers: November 2019. Extended Submission Deadline: December 21,… Read more

  • New preprint: The brain separates auditory and visual “meanings” of words

    In our new preprint, we tried to figure out whether word meanings are “the same” in the brain, regardless of whether we hear a spoken word or lip-read the same word by watching a speaker’s face. To answer this, participants did the same task in two conditions: auditory and visual. In the auditory condition, they heard… Read more

  • Flicker-driven brain waves and alpha rhythms – revisited

    In our recent study, we asked how waves that the brain produces by itself – alpha rhythms – relate to waves triggered by viewing a flickering screen. Both can be of similar frequency (~10 Hz, or ten cycles per second) and are easily recorded from the scalp with electrodes (EEG). To recap, we did not find… Read more
