Temporally rich deep learning models for Magnetoencephalography

Research output: Contribution to journal › Article › peer-review


Abstract

Deep learning has been used in a wide range of applications, but it has only very recently been applied to Magnetoencephalography (MEG). MEG is a neurophysiological technique used to investigate a variety of cognitive processes such as language and learning, and is an emerging technology in the quest to identify neural correlates of cognitive impairments such as those occurring in dementia. Recent work has shown that it is possible to apply deep learning to MEG to categorise induced responses to stimuli across subjects. While novel in the application of deep learning, such work has generally used relatively simple neural network (NN) models compared to those being used in domains such as computer vision and natural language processing. In these other domains, there is a long history in developing complex NN models that combine spatial and temporal information. We propose more complex NN models that focus on modelling temporal relationships in the data, and apply them to the challenges of MEG data. We apply these models to an extended range of MEG-based tasks, and find that they substantially outperform existing work on a range of tasks, particularly but not exclusively temporally-oriented ones. We also show that an autoencoder-based preprocessing component that focuses on the temporal aspect of the data can improve the performance of existing models. Our source code is available at https://github.com/tim-chard/DeepLearningForMEG.
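The record does not describe the architectures themselves; the actual implementation is in the linked repository. As a purely illustrative, hypothetical sketch (not the authors' code), the snippet below shows one way a temporally-oriented pipeline of this kind could look in PyTorch: a small autoencoder applied per time step as a preprocessing stage, followed by an LSTM classifier over the resulting latent time series. All names, layer sizes, and data shapes (e.g. 306 channels, 250 time samples) are assumptions for the example only.

# Hypothetical sketch, not the authors' architecture: a temporal autoencoder
# as a preprocessing stage plus an LSTM classifier over MEG epochs.
# Shapes and hyperparameters (306 channels, 250 time samples) are assumed.
import torch
import torch.nn as nn


class TemporalAutoencoder(nn.Module):
    """Compresses each time step's channel vector into a lower-dimensional latent."""

    def __init__(self, n_channels: int = 306, latent_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_channels, 128), nn.ReLU(),
                                     nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, n_channels))

    def forward(self, x):                      # x: (batch, time, channels)
        z = self.encoder(x)
        return self.decoder(z), z


class TemporalClassifier(nn.Module):
    """LSTM over the latent time series, followed by a linear read-out."""

    def __init__(self, latent_dim: int = 64, hidden: int = 128, n_classes: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, z):                      # z: (batch, time, latent_dim)
        out, _ = self.lstm(z)
        return self.head(out[:, -1])           # classify from the final time step


if __name__ == "__main__":
    # Toy batch: 8 epochs, 250 time samples, 306 MEG channels (assumed shapes).
    x = torch.randn(8, 250, 306)
    ae = TemporalAutoencoder()
    clf = TemporalClassifier()
    recon, z = ae(x)
    logits = clf(z)
    print(recon.shape, logits.shape)           # (8, 250, 306) and (8, 2)

In a sketch like this, the autoencoder would be trained on reconstruction loss and the classifier on the latent sequences; how the published models actually combine spatial and temporal components is documented in the repository at https://github.com/tim-chard/DeepLearningForMEG.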

Original language: English
Pages (from-to): 1-26
Number of pages: 26
Journal: Transactions on Machine Learning Research
Volume: 2024
Issue number: 1
Publication status: Published - 30 Jan 2024

Bibliographical note

Copyright 2024 Transactions on Machine Learning Research. Version archived for private and non-commercial use with the permission of the author/s and according to publisher conditions. For further rights please contact the publisher.
