Welcome …


Welcome to the Physics-based Deep Learning Book (v0.2) 👋

TL;DR: This document contains a practical and comprehensive introduction to everything related to deep learning in the context of physical simulations. As much as possible, all topics come with hands-on code examples in the form of Jupyter notebooks to quickly get started. Beyond standard supervised learning from data, we’ll look at physical loss constraints, more tightly coupled learning algorithms with differentiable simulations, training algorithms tailored to physics problems, as well as reinforcement learning and uncertainty modeling. We live in exciting times: these methods have a huge potential to fundamentally change what computer simulations can achieve.

Note

What’s new in v0.2? For readers familiar with v0.1 of this text, the extended section Integrating DP into NN Training and the brand-new chapter on improved learning methods for physics problems (starting with Scale-Invariance and Inversion) are highly recommended starting points.


Coming up

As a sneak preview, the next chapters will show:

  • How to train networks to infer a fluid flow around shapes like airfoils, and estimate the uncertainty of the prediction. This gives a surrogate model that replaces a traditional numerical simulation.

  • How to use model equations as residuals to train networks that represent solutions (a minimal sketch of such a residual loss follows this list), and how to improve upon these residual constraints by using differentiable simulations.

  • How to more tightly interact with a full simulator for inverse problems. E.g., we’ll demonstrate how to circumvent the convergence problems of standard reinforcement learning techniques by leveraging simulators in the training loop.

  • We’ll also discuss the importance of inversion for the update steps, and how higher-order information can be used to speed up convergence and obtain more accurate neural networks.
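To make the second bullet above a bit more concrete, here is a minimal, hypothetical sketch of a residual-based (PINN-style) training loop. It assumes PyTorch and uses the 1D Burgers equation u_t + u·u_x − ν·u_xx = 0 as a stand-in PDE: a small network u(x, t) is trained so that its automatic-differentiation derivatives satisfy the equation at randomly sampled collocation points, plus an initial-condition term. The network size, sampling, viscosity, and the omission of boundary terms are illustrative simplifications, not the setup used in the book’s notebooks.

# Minimal PINN-style sketch: residual of the 1D Burgers equation
# u_t + u*u_x - nu*u_xx = 0, enforced via automatic differentiation (PyTorch).
# All choices below (network size, sampling, coefficients) are illustrative.
import torch

nu = 0.01  # illustrative viscosity

# small fully connected network approximating u(x, t)
net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def grad(out, inp):
    # derivative of the network output w.r.t. an input tensor, keeping the graph
    return torch.autograd.grad(out, inp, torch.ones_like(out), create_graph=True)[0]

def pde_residual(x, t):
    # evaluate u_t + u*u_x - nu*u_xx at the given collocation points
    x, t = x.requires_grad_(True), t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_x, u_t = grad(u, x), grad(u, t)
    u_xx = grad(u_x, x)
    return u_t + u * u_x - nu * u_xx

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(1000):  # short training loop; boundary terms omitted for brevity
    x = torch.rand(256, 1) * 2.0 - 1.0   # collocation points with x in [-1, 1]
    t = torch.rand(256, 1)               # ... and t in [0, 1]
    x0 = torch.rand(256, 1) * 2.0 - 1.0  # points for the initial condition
    u0 = net(torch.cat([x0, torch.zeros_like(x0)], dim=1))
    loss = (pde_residual(x, t) ** 2).mean() \
         + ((u0 + torch.sin(torch.pi * x0)) ** 2).mean()  # enforce u(x,0) = -sin(pi x)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Later chapters show how differentiable simulators can improve on such purely residual-based constraints by providing gradients from an actual solver inside the training loop.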

Throughout this text, we will present different approaches for introducing physical models into deep learning, i.e., physics-based deep learning (PBDL) approaches. These algorithmic variants will be introduced in order of increasing tightness of the integration, and the pros and cons of the different approaches will be discussed. It’s important to know in which scenarios each of the different techniques is particularly useful.

Executable code, right here, right now

We focus on Jupyter notebooks, a key advantage of which is that all code examples can be executed on the spot, from your browser. You can modify things and immediately see what happens – give it a try by [running this teaser example in your browser].

Plus, Jupyter notebooks are great because they’re a form of literate programming.

Comments and suggestions

This book, where “book” stands for a collection of digital texts and code examples, is maintained by the Physics-based Simulation Group at TUM. Feel free to contact us if you have any comments, e.g., via old-fashioned email. If you find mistakes, please also let us know! We’re aware that this document is far from perfect, and we’re eager to improve it. Thanks in advance 😀! Btw., we also maintain a link collection with recent research papers.


Fig. 1 Some visual examples of numerically simulated time sequences. In this book, we explain how to realize algorithms that use neural networks alongside numerical solvers.

Thanks!

This project would not have been possible without the help of many people who contributed. Thanks to everyone 🙏 Here’s an alphabetical list:

Additional thanks go to Georg Kohl for the nice divider images (cf. [KUT20]), Li-Wei Chen for the airfoil data image, and to Chloe Paillard for proofreading parts of the document.

Citation

If you find this book useful, please cite it via:

@book{thuerey2021pbdl,
  title={Physics-based Deep Learning},
  author={Nils Thuerey and Philipp Holl and Maximilian Mueller and Patrick Schnell and Felix Trost and Kiwon Um},
  url={https://physicsbaseddeeplearning.org},
  year={2021},
  publisher={WWW}
}