Martijn Baart

Tilburg University
Cognitive Neuropsychology

Publications

Journal articles

López-Zunini, R. A., Baart, M., Samuel, A. G., & Armstrong, B. C. (in press). Lexical access versus lexical decision processes for auditory, visual, and audiovisual items: Insights from behavioral and neural measures. Neuropsychologia.

Bourguignon, M., Baart, M., Kapnoula, E. C., & Molinaro, N. (in press). Lip-reading enables the brain to synthesize auditory features of unknown silent speech. The Journal of Neuroscience.

Lindborg, A., Baart, M., Stekelenburg, J. J., Vroomen, J., & Andersen, T. S. (2019). Speech-specific audiovisual integration modulates induced theta-band oscillations. PLOS ONE, 14:e0219744. doi:10.1371/journal.pone.0219744

Modelska, M., Pourquié, M., & Baart, M. (2019). No ‘self’ advantage for audiovisual speech aftereffects. Frontiers in Psychology, 10:658. doi:10.3389/fpsyg.2019.00658

Barraza, P., Dumas, G., Liu, H., Blanco-Gomez, G., van den Heuvel, M. I., Baart, M., & Pérez, A. (2019). Implementing EEG hyperscanning setups. MethodsX, 6, 428-436. doi:10.1016/j.mex.2019.02.021

Baart, M., & Vroomen, J. (2018). Recalibration of vocal affect by a dynamic face. Experimental Brain Research, 236, 1911-1918. doi:10.1007/s00221-018-5270-y

Baart, M., Lindborg, A., & Andersen, T. S. (2017). Electrophysiological evidence for differences between fusion and combination illusions in audiovisual speech perception. European Journal of Neuroscience, 46, 2578-2583. doi:10.1111/ejn.13734

Baart, M., Armstrong, B. C., Martin, C. D., Frost, R., & Carreiras, M. (2017). Cross-modal noise compensation in audiovisual words. Scientific Reports, 7:42055. doi:10.1038/srep42055

Baart, M. (2016). Quantifying lip-read induced suppression and facilitation of the auditory N1 and P2 reveals peak enhancements and delays. Psychophysiology, 53, 1295-1306. doi:10.1111/psyp.12683

Baart, M., & Samuel, A. G. (2015). Turning a blind eye to the lexicon: ERPs show no cross-talk between lip-read and lexical context during speech sound processing. Journal of Memory and Language, 85, 42-59. doi:10.1016/j.jml.2015.06.008

Shaw, K., Baart, M., Depowski, N., & Bortfeld, H. (2015). Infants’ preference for native audiovisual speech dissociated from congruency preference. PLOS ONE, 10:e0126059. doi:10.1371/journal.pone.0126059

Baart, M., & Samuel, A. G. (2015). Early processing of auditory lexical predictions revealed by ERPs. Neuroscience Letters, 585, 98-102. doi:10.1016/j.neulet.2014.11.044

Baart, M., Bortfeld, H., & Vroomen, J. (2015). Phonetic matching of auditory and visual speech develops during childhood: Evidence from sine-wave speech. Journal of Experimental Child Psychology, 129, 157-164. doi:10.1016/j.jecp.2014.08.002

Baart, M., Stekelenburg, J. J., & Vroomen, J. (2014). Electrophysiological evidence for speech-specific audiovisual integration. Neuropsychologia, 53, 115-121. doi:10.1016/j.neuropsychologia.2013.11.011

Baart, M., Vroomen, J., Shaw, K., & Bortfeld, H. (2014). Degrading phonetic information affects matching of audiovisual speech in adults, but not in infants. Cognition, 130, 31-43. doi:10.1016/j.cognition.2013.09.006

Baart, M., de Boer-Schellekens, L., & Vroomen, J. (2012). Lipread induced recalibration in dyslexia. Acta Psychologica, 140, 91-95. doi:10.1016/j.actpsy.2012.03.003

Baart, M., & Vroomen, J. (2010). Phonetic recalibration does not depend on working memory. Experimental Brain Research, 203, 575-582. doi:10.1007/s00221-010-2264-9

Baart, M., & Vroomen, J. (2010). Do you see what you are hearing? Cross-modal effects of speech sounds on lipreading. Neuroscience Letters, 471, 100-103. doi:10.1016/j.neulet.2010.01.019

Vroomen, J., & Baart, M. (2009). Recalibration of phonetic categories by lipread speech: Measuring aftereffects after a twenty-four hours delay. Language and Speech, 52, 341-350. doi:10.1177/0023830909103178

Vroomen, J., & Baart, M. (2009). Phonetic recalibration only occurs in speech mode. Cognition, 110, 254-259. doi:10.1016/j.cognition.2008.10.015

Books | chapters

Baart, M. (2012). Phonetic recalibration in audiovisual speech (Doctoral dissertation). Tilburg University, the Netherlands. Printed by Ridderprint BV, Ridderkerk. ISBN: 978-90-5335-511-4.

Vroomen, J., & Baart, M. (2012). Phonetic recalibration in audiovisual speech. In M. M. Murray & M. T. Wallace (Eds.), The neural bases of multisensory processes (pp. 363-379). Boca Raton, FL: CRC Press.

Proceedings

Zouridakis, G., Baart, M., Stekelenburg, J. J., & Vroomen, J. (2013). Speech perception: Single trial analysis of the N1/P2 complex of unimodal and audiovisual evoked responses. Proceedings of the 13th IEEE International Conference on Bioinformatics and Bioengineering (paper ID 165), Chania, Greece. doi:10.1109/BIBE.2013.6701590

Baart, M., Vroomen, J., Shaw, K., & Bortfeld, H. (2013). Phonetic information in audiovisual speech is more important for adults than for infants: Preliminary findings. Proceedings of the 12th International Conference on Auditory-Visual Speech Processing (pp. 61-64), Annecy, France.

Vroomen, J., van Linden, S., & Baart, M. (2007). Lipread aftereffects in auditory speech perception: Measuring aftereffects after a twenty-four hours delay. Proceedings of the 7th International Conference on Auditory-Visual Speech Processing (paper P05), Hilvarenbeek, the Netherlands.