Navigating mathematical basics: A primer for deep learning in science

Benoit Liquet*, Sarat Moka, Yoni Nazarathy

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

We present a gentle introduction to elementary mathematical notation with a focus on communicating deep learning principles. This is a “math crash course” aimed at quickly equipping scientists with an understanding of the building blocks used in many equations, formulas, and algorithms that describe deep learning. While this short presentation cannot replace the solid mathematical knowledge that takes multiple courses and years to solidify, our aim is to allow nonmathematical readers to overcome the hurdles of reading texts that use such mathematical notation. We describe a few basic deep learning models using mathematical notation before we unpack the meaning of the notation. In particular, this text includes an informal introduction to summations, sets, functions, vectors, matrices, gradients, and a few more objects that are often used to describe deep learning. While this is a mathematical crash course, our presentation is kept in the context of deep learning and machine learning models, including the sigmoid model, the softmax model, and fully connected feedforward deep neural networks. We also hint at basic mathematical objects appearing in neural networks for image and text data.
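As a small illustration of the kind of notation the chapter unpacks, the sigmoid and softmax functions named in the abstract can be sketched in a few lines of NumPy. This is our own illustrative sketch, not code from the chapter; the function names are ours.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid: maps any real number z to the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    """Softmax: maps a vector of real scores to a probability distribution."""
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

# sigmoid(0) = 0.5, and a softmax output always sums to 1
print(sigmoid(0.0))
print(softmax(np.array([1.0, 2.0, 3.0])))
```

The stability trick of subtracting the maximum score before exponentiating does not change the softmax result but avoids overflow for large inputs.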

Original language: English
Title of host publication: Computational neurosurgery
Editors: Antonio Di Ieva, Eric Suero Molina, Sidong Liu, Carlo Russo
Place of Publication: Cham
Publisher: Springer, Springer Nature
Chapter: 5
Pages: 71-96
Number of pages: 26
ISBN (Electronic): 9783031648922
ISBN (Print): 9783031648915
DOIs
Publication status: Published - 2024

Publication series

Name: Advances in Experimental Medicine and Biology
Volume: 1462
ISSN (Print): 0065-2598
ISSN (Electronic): 2214-8019

Keywords

  • Deep Learning
  • Machine learning
  • Mathematics for Data Science
  • Mathematics of Machine Learning
