Physics-informed machine learning
Machine Learning for Theoretical Physics

(seminars by Andriy Graboviy and Vadim Strijov)

Goals

The course consists of a series of group discussions devoted to various aspects of data modeling in continuous spaces. It aims to reduce the gap between the models of theoretical physics and noisy measurements performed under complex experimental circumstances. To show that the selected neural network is an adequate parametrization of the modeled phenomenon, we use a geometric axiomatic approach. We discuss the role of manifolds, tensors, and differential forms in neural network-based model selection.

The course builds on the book Geometric Deep Learning (Michael Bronstein et al., April 2021) and the paper "Physics-informed machine learning" (George Em Karniadakis et al., Nature, May 2021).
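To make the goal concrete, here is a minimal sketch (an illustration only, not course code): it fits noisy measurements of a quantity u(t) while penalizing the residual of an assumed physical law, the linear ODE du/dt + k*u = 0. The decay rate k, the polynomial basis, and the weight lam of the physics term are arbitrary choices made for this example; because the model is linear in its weights, both terms combine into one least-squares problem.

  # Physics-informed fit: data misfit plus the residual of an assumed ODE.
  import numpy as np

  rng = np.random.default_rng(0)
  k = 1.5                                    # assumed decay rate in du/dt + k*u = 0
  t_data = np.linspace(0.0, 2.0, 15)         # measurement times
  u_data = np.exp(-k * t_data) + 0.05 * rng.standard_normal(t_data.size)

  degree = 6                                 # polynomial basis t**0 .. t**degree
  t_col = np.linspace(0.0, 2.0, 50)          # collocation points for the physics term

  def design(t):
      # Basis matrix: columns are t**0, t**1, ..., t**degree.
      return np.vander(t, degree + 1, increasing=True)

  def d_design(t):
      # Derivative of each basis function: d/dt t**j = j * t**(j-1).
      V = design(t)
      D = np.zeros_like(V)
      D[:, 1:] = V[:, :-1] * np.arange(1, degree + 1)
      return D

  lam = 1.0                                  # weight of the physics term (a choice)
  A = np.vstack([design(t_data),                                          # data block
                 np.sqrt(lam) * (d_design(t_col) + k * design(t_col))])   # physics block
  b = np.concatenate([u_data, np.zeros(t_col.size)])
  w, *_ = np.linalg.lstsq(A, b, rcond=None)
  print("rms data error:", np.sqrt(np.mean((design(t_data) @ w - u_data) ** 2)))

Replacing the polynomial basis with a neural network gives the same two-term loss, minimized by gradient descent; that is the setting discussed in the paper by Karniadakis et al.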

Structure of the talk

Each talk is based on a two-page essay ([template]) and covers:

  1. Field and goals of a method or a model
  2. An overview of the method
  3. Notable authors and references
  4. Rigorous description, the theoretical part
  5. Algorithm and link to the code
  6. Application with plots

Grading

Each student presents two talks. Each talk lasts 25 minutes and concludes with a five-minute written test. A seminar presentation gives 1 point, a formatted seminar text gives 1 point, a test gives 1 point, and a reasonable test response gives 0.1 point. A bonus point is given for a great talk.

Test

To do: how to make the test creative rather than automated? The test format will be added here.

Themes

  1. Spherical harmonics for mechanical motion modeling (see the sketch after this list)
  2. Tensor representations of brain-computer interfaces
  3. Multi-view, kernels, and metric spaces for BCI and brain imaging
  4. Continuous-time representation and Legendre Memory Units for BCI
  5. Riemannian geometry on shapes and diffeomorphisms for fMRI
  6. The affine connection setting for transformation groups for fMRI
  7. Modeling strain, rotation, and stress tensors, with examples
  8. Differential forms and fibre bundles with examples
  9. Modeling gravity with machine learning approaches
  10. Geometric manifolds, the Levi-Civita connection, and curvature tensors
  11. Flows and topological spaces
  12. Applications of normalizing flow models (stress on spaces, not statistics)
  13. Alignment in higher dimensions with RNNs
  14. Navier-Stokes equations and viscous flow
  15. Newtonian and Non-Newtonian Fluids in Pipe Flows Using Neural Networks [1], [2]
  16. Applications of geometric algebra and the exterior product
  17. High-order splines
  18. Forward and backward Fourier transforms and iPhone lidar imaging analysis
  19. Fourier, cosine, and Laplace transforms in 2, 3, 4, and higher dimensions
  20. Spectral analysis on meshes
  21. Graph convolution and continuous Laplace operators
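For theme 1, the sketch below illustrates the kind of computation involved (an illustration under assumed choices: the toy signal, the truncation degree L_max, and the least-squares fit are not prescribed by the course). Noisy samples of a function on the sphere are expanded in a truncated real spherical-harmonic basis.

  # Fit noisy samples of a function on the sphere with real spherical harmonics.
  import numpy as np
  from scipy.special import sph_harm

  rng = np.random.default_rng(0)
  n_samples, L_max = 400, 4

  theta = rng.uniform(0.0, 2.0 * np.pi, n_samples)     # azimuthal angle
  phi = np.arccos(rng.uniform(-1.0, 1.0, n_samples))   # polar angle, uniform on the sphere

  # Toy orientation-dependent signal plus measurement noise.
  f = np.sin(phi) * np.cos(theta) + 0.3 * np.cos(phi) ** 2
  f = f + 0.05 * rng.standard_normal(n_samples)

  def real_sh(m, l, theta, phi):
      # Real-valued combination of the complex harmonics Y_l^m.
      y = sph_harm(abs(m), l, theta, phi)
      if m < 0:
          return np.sqrt(2.0) * y.imag
      if m > 0:
          return np.sqrt(2.0) * y.real
      return y.real

  # One basis column per (l, m); solve for the expansion coefficients by least squares.
  A = np.column_stack([real_sh(m, l, theta, phi)
                       for l in range(L_max + 1) for m in range(-l, l + 1)])
  coef, *_ = np.linalg.lstsq(A, f, rcond=None)
  print("rms residual:", np.sqrt(np.mean((A @ coef - f) ** 2)))

The coefficient vector gives a compact, resolution-independent representation of the sampled motion on which further modeling can act.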

Schedule

Thursdays at 12:30, via m1p.org/go_zoom

  • September 2, 9, 16, 23, 30
  • October 7, 14, 21, 28
  • November 4, 11, 18, 25
  • December 2, 9


Date           Theme                                Speaker           Links
September 2    Course introduction and motivation   Vadim Strijov     GDL paper, Physics-informed
September 9
September 16
September 23
September 30
October 7
October 14
October 21
October 28
November 4
November 11
November 18
November 25
December 2
December 9     Final discussion and grading         Andriy Graboviy


References

  • Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges by Michael M. Bronstein, Joan Bruna, Taco Cohen, and Petar Veličković, 2021
  • Functional Data Analysis
  • Mathematics for Physical Science and Engineering by Frank E. Harris, 2014