About me

I am a researcher at the French Institute for Research in Computer Science and Automation (INRIA) and a certified psychologist, working and living in Paris.

Over the last few years I have specialized in time-resolved physiological neuroimaging, i.e., magneto- and electroencephalography (M/EEG). My main interest lies in the functional architecture of brain dynamics in both healthy and clinical populations. At the same time, I am fascinated by collective behavior and social cognition at diverse scales, the glue that links human brain dynamics to societal fate.

A common unifying motif is the data-scientific underpinning of this work. Reproducible research, scalable numerical methods, and data sharing are major concerns in contemporary data-intensive science. As part of my work, I devote considerable energy to developing open source software that addresses these bottlenecks by facilitating automated data processing and fast access to data resources.

I hope that my work contributes to a 21st century psychology in an increasingly data-driven interdisciplinary future.

Events

Publications and Preprints

Note. Equal contributions are indicated by an asterisk.

Software


MNE Software for MEG and EEG

MNE-Python is the open source Python toolbox for processing and visualizing MEG and EEG data. It facilitates a wide range of data processing tasks, including artifact rejection, diagnostic visualizations, decoding, source localization, time-frequency analysis, and statistics.
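
To give a flavor of the API, a typical workflow boils down to something like the following minimal sketch (the file name and event settings are placeholders; the tutorials linked below cover this in depth):

    # Minimal MNE-Python workflow sketch; file name and event code are placeholders.
    import mne

    raw = mne.io.read_raw_fif('sample_raw.fif', preload=True)  # load raw M/EEG data
    raw.filter(1., 40.)                                         # band-pass filter

    events = mne.find_events(raw)                               # extract trigger events
    epochs = mne.Epochs(raw, events, event_id=1,
                        tmin=-0.2, tmax=0.5, baseline=(None, 0))

    evoked = epochs.average()                                   # average across trials
    evoked.plot()                                               # butterfly plot of the evoked response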

I have been a core contributor since 2012.

  • http://martinos.org/mne/dev/index.html
  • https://github.com/mne-tools/mne-python
  • My most recent major contribution to MNE is the subproject MNE-HCP for fast access to the Human Connectome Project MEG data.

Tools for large-scale analysis of MEG and EEG data

Over the last few years I have developed novel methods and tools that facilitate the analysis of MEG and EEG at diverse scales by adaptively automating common processing steps and simplifying access to data resources.

The problem of data cleaning

Removing artifacts from EEG and MEG signals is a common and necessary step in data analysis and, unfortunately, has historically claimed a significant investment of human attention. I developed and evaluated a novel algorithm, termed autoreject, for detecting and handling contaminated MEG and EEG data segments. Autoreject is described in Jas et al. 2016 and is readily usable in a "plug and play" manner in a wide array of situations. Notably, its successful use does not require a deep understanding of the method, as it relies on machine learning to handle artifact rejection in a data-driven manner, hence reducing human processing time. It will soon be disseminated through the MNE software. The code is accessible on GitHub. Consider our arXiv preprint for an in-depth validation, including a reanalysis of the Human Connectome Project MEG data.
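
In practice, "plug and play" means something along these lines, sketched with the package's scikit-learn-style interface (assuming an existing MNE-Python epochs object; class and method names may differ between versions):

    # Sketch of data-driven artifact rejection with autoreject;
    # `epochs` is assumed to be an existing mne.Epochs object.
    from autoreject import AutoReject

    ar = AutoReject()                        # rejection thresholds are learned via cross-validation
    epochs_clean = ar.fit_transform(epochs)  # repair or drop contaminated epochs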

The problem of covariance estimation for spatial whitening

Spatial whitening based on the between-sensors covariance is a common element of source localization techniques and affects the scaling and geometry of the estimated sources. Unfortunately, the practically unknown effective sampling density can call for data-dependent regularization to ensure well-posed covariance estimates. I developed and validated a novel algorithm that achieves scalable and data-driven regularization of the covariance (Engemann and Gramfort 2015, Engemann et al. 2015) by exploiting machine learning techniques and using the negative log-likelihood of the covariance as the objective. The code is disseminated through the MNE software and can be used through the high-level functions for covariance estimation. This functionality makes it possible to reallocate human attentional resources by automating a common problem of the most popular inverse solution techniques. For an overview, consider these slides.
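
As a rough illustration, the automated model selection is exposed through MNE's high-level covariance function roughly as follows (assuming an existing epochs object; consult the MNE documentation for the full set of options):

    # Sketch of data-driven covariance regularization in MNE-Python.
    import mne

    # method='auto' compares several estimators by cross-validated log-likelihood
    # and returns the best-performing one.
    noise_cov = mne.compute_covariance(epochs, tmax=0., method='auto')

    # The resulting covariance is then used for spatial whitening, e.g. before
    # source localization; plot_white gives a diagnostic view of the whitened data.
    evoked = epochs.average()
    evoked.plot_white(noise_cov)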

Automated extraction of diagnostics from clinical EEG

EEG is a technique commonly used in neurology to supplement the diagnosis of disorders of consciousness. At the Brain and Spine Institute (ICM), Paris, and in collaboration with the intensive care unit at the Pitié-Salpêtrière Hospital, I developed an automated, scalable pipeline for the extraction of EEG biomarkers and automated diagnostic assistance. This tool stack can be run on various computational architectures, including Amazon Web Services, and capitalizes on web technology. For example, it can be plugged into a web server with remote control via the browser, and it generates HTML reports using the MNE-Report technology. The solution is described in Engemann, Fraimondo et al. 2015. Unfortunately, the source code is currently not compatible with open source licensing. Parts of the code will be made available soon on GitHub.
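
The report generation mentioned above rests on mne.Report, roughly as in the following sketch (the figure is a stand-in for an actual EEG-marker plot, and method names vary somewhat across MNE versions):

    # Sketch of HTML report generation with mne.Report; the figure is a placeholder.
    import matplotlib.pyplot as plt
    import mne

    fig, ax = plt.subplots()
    ax.plot([0, 1, 2], [0, 1, 0])  # stand-in for an EEG biomarker figure

    report = mne.Report(title='EEG markers')
    report.add_figs_to_section(fig, captions='Example marker', section='Markers')
    report.save('report.html', overwrite=True)  # self-contained HTML report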

Fast access to the Human Connectome Project MEG data

The Human Connectome Project (HCP) is an important data resource providing a wealth of information: hemodynamic and electrophysiological signals from task and resting-state contexts, as well as psycho-medical and genetic data. By providing curated pipeline outputs at various processing stages, next to the raw data, this resource lends itself to validation work across many research questions. I am the founder of MNE-HCP, a novel library designed to make MEG analyses based on this data resource as convenient as possible. MNE-HCP abstracts away difficulties due to diverging coordinate systems, distributed information, and file format conventions. Providing simple and consistent access to the HCP MEG data facilitates the emergence of standardized data analysis practices and helps to drastically reduce human processing time when accessing these data from within Python.
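
Reading a run of HCP MEG data then looks roughly like this (the subject ID and local path are placeholders; see the MNE-HCP documentation for the exact reader functions and arguments):

    # Sketch of loading HCP resting-state MEG data with MNE-HCP.
    import hcp

    hcp_path = '/path/to/HCP'  # placeholder: local copy of the HCP MEG release
    raw = hcp.read_raw(subject='100307', data_type='rest',
                       run_index=0, hcp_path=hcp_path)
    raw.load_data()
    # From here on, `raw` behaves like any other MNE-Python raw object.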

Scikit-Learn

Scikit-Learn is a rich statistical learning library that makes available a wide array of statistical models through a coherent interface.

I have contributed several performance enhancements and documentation improvements to the decomposition module and to computational core functions.
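
For orientation, the decomposition module mentioned above is used along these lines (random data serves purely as a placeholder):

    # Small illustration of scikit-learn's decomposition module.
    import numpy as np
    from sklearn.decomposition import PCA

    X = np.random.RandomState(42).randn(100, 10)  # 100 samples, 10 features
    pca = PCA(n_components=3)
    X_reduced = pca.fit_transform(X)              # project onto the top 3 components
    print(pca.explained_variance_ratio_)          # variance captured by each component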

  • http://scikit-learn.org/dev/
  • https://github.com/scikit-learn/scikit-learn

Miscellaneous

Smaller projects and libraries that I am authoring

  • meeg-preprocessing: utilities for cleaning and reporting on M/EEG data
  • h5io: a convenient high-level interface to HDF5 files (see the sketch after this list)
  • pyeyeparse: MNE-style access to EyeLink eye-tracking data
  • PyMed: utilities for processing PubMed records
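
To illustrate the h5io item above, reading and writing reduces to a simple pair of functions (the file name is a placeholder):

    # Sketch of h5io usage: round-tripping nested Python objects through HDF5.
    from h5io import write_hdf5, read_hdf5

    data = dict(subject='S01', scores=[0.7, 0.8, 0.9])
    write_hdf5('results.h5', data, overwrite=True)  # write nested Python objects
    data_back = read_hdf5('results.h5')             # read them back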

Collaborators