I have specialized in non-invasive neurophysiology, i.e., magneto- and electroencephalography. My main interest lies in developing machine learning and statistical modeling approaches to characterize cognitive functions from brain dynamics in healthy and clinical populations. I am also interested in social cognition.
Reproducible research, scalable numerical methods and data sharing are major concerns in contemporary science. To help address these challenges, I devote considerable time and energy to developing novel statistical methods and open source software that enable automated and repeatable data processing.
Publications and Preprints
Note. Equal contributions are indicated by an asterisk ✱.
MNE Software for MEG and EEG
MNE-Python is the open source Python toolbox for processing and visualizing MEG and EEG data. It facilitates a wide range of data processing tasks, including artifact rejection, diagnostic visualization, decoding, source localization, time-frequency analysis, and statistics.
I have been a core contributor since 2012.
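As a generic illustration of one such task — band-pass filtering to isolate an oscillation — here is a minimal sketch using SciPy directly on synthetic data (MNE-Python provides higher-level wrappers for this; the sampling rate and frequencies below are arbitrary choices for the example):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                       # sampling rate in Hz (arbitrary choice)
t = np.arange(0, 2.0, 1.0 / fs)
# synthetic "EEG": a 10 Hz alpha rhythm contaminated by 50 Hz line noise
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)

# zero-phase band-pass filter (8-12 Hz) to isolate the alpha component
b, a = butter(4, [8.0, 12.0], btype="band", fs=fs)
x_alpha = filtfilt(b, a, x)
```

Using `filtfilt` (forward-backward filtering) avoids the phase shift a single causal pass would introduce, which matters when latencies of brain responses are of interest.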
Tools for large-scale analysis of MEG and EEG data
Over recent years I have developed novel methods and tools that facilitate the analysis of MEG and EEG data at diverse scales by adaptively automating common processing steps and simplifying access to data resources.
The problem of data cleaning
Removing artifacts from EEG and MEG signals is a common and necessary step in data analysis that has historically demanded a significant investment of human attention. I developed and evaluated a novel algorithm, termed autoreject, for detecting and handling contaminated MEG and EEG data segments. Autoreject is described in Jas et al. 2016 and is readily usable in a "plug and play" manner in a wide array of situations. Notably, successful usage does not require a deep understanding of the method, as it relies on machine learning to handle artifact rejection in a data-driven manner, thereby reducing human processing time. It will soon be disseminated through the MNE software. The code is accessible on GitHub. See our arXiv preprint for an in-depth validation, including a reanalysis of the Human Connectome Project MEG data.
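The core idea — selecting a peak-to-peak rejection threshold by cross-validation rather than by hand — can be sketched on synthetic data. This is a conceptual toy, not autoreject's actual implementation: the signals, the candidate grid, and the scoring rule (cleaned average on one half vs. artifact-robust median of the other half) are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n_epochs, n_times = 100, 50

# synthetic evoked response: the same sine "signal" in every epoch plus noise
signal = np.sin(np.linspace(0, 2 * np.pi, n_times))
epochs = signal + 0.5 * rng.standard_normal((n_epochs, n_times))

# contaminate 10 epochs with a large artifact on their first samples
bad = rng.choice(n_epochs, size=10, replace=False)
epochs[bad, :10] += 8.0

def mean_after_reject(data, thresh):
    """Average only the epochs whose peak-to-peak amplitude is below thresh."""
    ptp = data.max(axis=1) - data.min(axis=1)
    kept = data[ptp < thresh]
    return kept.mean(axis=0) if len(kept) else np.zeros(data.shape[1])

# score candidate thresholds: the cleaned average of one half of the data
# should be close to the artifact-robust median of the other half
train, test = epochs[::2], epochs[1::2]
candidates = np.linspace(1.0, 20.0, 40)
errors = [np.linalg.norm(mean_after_reject(train, th) - np.median(test, axis=0))
          for th in candidates]
best_thresh = candidates[int(np.argmin(errors))]
```

A threshold that is too low rejects clean epochs and yields a noisy average; one that is too high lets artifacts through; the cross-validated score penalizes both extremes, which is the intuition behind letting the data pick the threshold.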
The problem of covariance estimation for spatial whitening
Spatial whitening based on the between-sensor covariance is a common element of source localization techniques and affects the scaling and geometry of the estimated sources. Unfortunately, the effective sampling density is rarely known in practice, which calls for data-dependent regularization to ensure well-posed covariance estimates. I developed and validated a novel algorithm that achieves scalable, data-driven regularization of the covariance (Engemann and Gramfort 2015, Engemann et al. 2015) by exploiting machine learning techniques with the negative log-likelihood of the covariance as the objective. The code is disseminated through the MNE software and can be used through the high-level functions for covariance estimation. This functionality reallocates human attention by automating a problem common to the most popular inverse solution techniques. For an overview, consider these slides.
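The underlying model-selection idea can be sketched with scikit-learn's covariance estimators: candidate regularization (shrinkage) values are scored by the log-likelihood they assign to held-out data. This is an illustrative simplification on synthetic data, not the algorithm from the papers above:

```python
import numpy as np
from sklearn.covariance import ShrunkCovariance, empirical_covariance, log_likelihood

rng = np.random.default_rng(0)
n_train, n_test, n_sensors = 80, 80, 20
X_train = rng.standard_normal((n_train, n_sensors))
X_test = rng.standard_normal((n_test, n_sensors))

# score each candidate shrinkage value by the log-likelihood it assigns
# to held-out data, then keep the best one
emp_test = empirical_covariance(X_test)
shrinkages = np.linspace(0.0, 1.0, 21)
scores = [log_likelihood(emp_test,
                         ShrunkCovariance(shrinkage=s).fit(X_train).get_precision())
          for s in shrinkages]
best_shrinkage = shrinkages[int(np.argmax(scores))]
```

Because the held-out likelihood is a proper scoring rule for Gaussian models, this selection needs no hand-tuned regularization constant — the same principle that lets the covariance step run unattended in a pipeline.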
Automated extraction of diagnostics from clinical EEG
EEG is a technique commonly used in neurology to supplement the diagnosis of disorders of consciousness. At the Brain and Spine Institute (ICM), Paris, and in collaboration with the intensive care unit at the Pitié-Salpêtrière Hospital, I developed an automated, scalable pipeline for the extraction of EEG biomarkers and automated diagnostic assistance. This tool stack runs on various computational architectures, including Amazon Web Services, and capitalizes on web technology: for example, it can be plugged into a web server, controlled remotely through a web browser, and generates HTML reports using the MNE-Report technology. The solution is described in Engemann, Fraimondo et al 2015. Unfortunately, the source code is currently not compatible with open source licensing. Parts of the code will be made available soon on GitHub.
Fast access to the Human Connectome Project MEG data
The Human Connectome Project is an important data resource providing a wealth of information: hemodynamic and electrophysiological signals from task and resting-state contexts, as well as psycho-medical and genetic data. By providing curated pipeline outputs at various processing stages, next to the raw data, this resource lends itself to validation across diverse research questions. I am the founder of MNE-HCP, a novel library designed to make this data resource as convenient as possible to use for MEG analysis. MNE-HCP abstracts away difficulties arising from diverging coordinate systems, distributed information, and file format conventions. Providing simple and consistent access to HCP MEG data facilitates the emergence of standardized data analysis practices and drastically reduces human processing time when accessing these data from within Python.
Scikit-Learn is a rich statistical learning library that makes available a wide array of statistical models through a coherent interface.
I have contributed several performance enhancements and documentation improvements to the decomposition module and computational core functions.
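For readers unfamiliar with the decomposition module, a minimal example of its coherent fit/transform interface, run on synthetic low-rank data (the dimensions and noise level are arbitrary):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 200 observations that live (up to small noise) in a 2-D subspace of R^5
latent = rng.standard_normal((200, 2))
mixing = rng.standard_normal((2, 5))
X = latent @ mixing + 0.01 * rng.standard_normal((200, 5))

# two principal components should capture nearly all of the variance
pca = PCA(n_components=2).fit(X)
explained = pca.explained_variance_ratio_.sum()
```

The same `fit`/`transform` pattern applies across the module's estimators (ICA, NMF, dictionary learning, ...), which is what makes them easy to swap inside a pipeline.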
Smaller projects and libraries that I am authoring