Mathematical forecasting

Main topics

  1. Autoregression and singular structure analysis (see the sketch after this list)
  2. Tensor decomposition and spatial-time models
  3. Signal decoding and multi-modeling
  4. Continuous-time forecasting and Neural ODEs
  5. Convergent cross mapping and dynamical systems
  6. Space alignment
  7. Riemannian geometry and metric learning
  8. Diffusion-graph PDEs
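
A minimal sketch of topic 1, assuming that "singular structure analysis" refers to the singular spectrum analysis (SSA) family of methods: embed the series into a Hankel trajectory matrix, keep the leading singular triples, and average the anti-diagonals to obtain a denoised reconstruction that an autoregression can then extrapolate. The function names, window length, rank, and toy data are illustrative choices, not course material.

# Minimal singular spectrum analysis (SSA) sketch: Hankel embedding,
# truncated SVD, and diagonal averaging to reconstruct the series.
import numpy as np

def ssa_reconstruct(x, window=24, rank=3):
    """Return a rank-truncated SSA reconstruction of the 1-D series x."""
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns.
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_r = (U[:, :rank] * s[:rank]) @ Vt[:rank]      # low-rank approximation
    # Diagonal (Hankel) averaging maps the matrix back to a series.
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += X_r[:, j]
        counts[j:j + window] += 1
    return rec / counts

# Toy usage: a noisy daily-seasonal series.
t = np.arange(200)
x = np.sin(2 * np.pi * t / 24) + 0.3 * np.random.randn(200)
trend = ssa_reconstruct(x, window=24, rank=2)

The window length bounds the longest period the embedding can resolve, and the rank selects how many trend and seasonal components survive the truncation.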

Lab works

Each lab work comprises a report, in Jupyter notebook (ipynb) or TeX format, and a talk with a discussion, structured as follows:

  1. Title and motivated abstract
  2. Problem statement
  3. Model, problem solution
  4. Code, analysis, and illustrative plots
  5. References

Note: the model (and sometimes the data) is a personal contribution. The rest, such as the infrastructure, error functions, and plots, may be created collectively and shared.

The topics of the lab works are:

  • Autoregressive forecasting (Singular Structure Analysis)
  • Spatial-time forecasting (Tensor Decomposition), sketched after this list
  • Signal decoding (Projection to Latent Space)
  • Continuous-time forecasting (Neural Differential Equations)
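
As a hedged sketch of the spatial-time lab topic referenced above, the snippet below folds an hourly multi-sensor record into a days x hours x sensors tensor and compresses it with a truncated higher-order SVD (HOSVD); the low-rank core and factor matrices are a common starting point for tensor-based forecasting. The tensor shape, the ranks, and the toy data are assumptions for illustration, not the prescribed lab setup.

# Illustrative sketch: truncated higher-order SVD (HOSVD) of a
# (days x hours x sensors) spatial-time tensor.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: one factor matrix per mode plus the core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = T
    for mode, U in enumerate(factors):
        # Project the current core onto the mode's leading left singular vectors.
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Toy data: 60 days x 24 hours x 5 sensors sharing a daily profile plus noise.
rng = np.random.default_rng(0)
T = (np.sin(2 * np.pi * np.arange(24) / 24)[None, :, None]
     + 0.1 * rng.standard_normal((60, 24, 5)))
core, factors = hosvd(T, ranks=(3, 3, 2))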

Discussion and collaboration

Exam and grading

The grade comprises four lab works submitted within the deadlines and an exam on the course topics with problems and a discussion. Each lab work gives 2 points and the exam gives 2 points, so the maximum is 4*2 + 2 = 10 points.

Abstract

This course delivers methods of model selection in machine learning and forecasting. The models include linear models, tensor models, deep neural networks, and neural differential equations. The data to model are video, audio, encephalograms, fMRI scans, and other measurements from the natural sciences. The practical examples include brain-computer interfaces, weather forecasting, and various spatial-time series forecasting problems. The lab works are organized as paper-with-code reports.

The course joins two parts of problem statements in machine learning. The first part comes from the structure of the measured data: the data originate in physics, chemistry, and biology and carry intrinsic algebraic structures that are part of the theory behind the measurement. The second part comes from measurement errors: the stochastic nature of the errors calls for statistical methods of analysis. The course therefore joins algebra and statistics and is devoted to the problem of predictive model selection.

Schedule

Date | N | Subject | Link
September 3 | 1 | Probabilistic models | Slides
September 10 | 2 | Models: regression, encoders, and neural networks
September 17 | 3 | Processes: Bayesian regression, generative and discriminative models
September 24 | 4 | Functional data analysis: decomposition of processes
October 1 | 5 | Spatiotemporal models
October 8 | 6 | Convolutional models
October 15 | 7 | Talks for the first part of the lab projects | The talk template
October 22 | 8 | Graph convolutions and spectrum (see the sketch after the schedule)
October 29 | 9 | Fourier transform and the phase retrieval problem
November 5 | 10 | Radon transform and tomography reconstruction
November 12 | 11 | Tensor decomposition and the decoding problem
November 19 | 12 | Statistics on Riemannian spaces
November 26 | 13 | Statistics on stratified spaces
December 3 | 14 | Talks for the second part of the lab projects | The talk template
December 4 | 15 | Exam: problems and discussion | List of problems
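
The sketch below accompanies lecture 8 ("Graph convolutions and spectrum"): a node signal is expanded in the eigenbasis of the normalized graph Laplacian, the graph Fourier basis, and smoothed by damping high graph frequencies with a heat kernel. The ring graph, the heat-kernel filter, and the parameter names are illustrative assumptions rather than lecture material.

# Spectral graph filtering sketch: smooth a node signal by attenuating
# high frequencies of the normalized graph Laplacian.
import numpy as np

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2} for a symmetric adjacency matrix A."""
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    return np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt

def spectral_lowpass(A, x, tau=1.0):
    """Filter the node signal x with the heat kernel exp(-tau * L)."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)               # graph Fourier basis
    return U @ (np.exp(-tau * lam) * (U.T @ x))

# Toy graph: a ring of 6 nodes carrying a noisy alternating signal.
A = np.zeros((6, 6))
for i in range(6):
    A[i, (i + 1) % 6] = A[(i + 1) % 6, i] = 1.0
x = np.array([1.0, -1.0, 1.0, -1.0, 1.0, -1.0]) + 0.1 * np.random.randn(6)
x_smooth = spectral_lowpass(A, x, tau=0.5)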


Topics

  • Forward and inverse problems, kernel regularisation (see the sketch after this list)
  • Karhunen–Loève theorem and functional PCA (FPCA)
  • Parametric and non-parametric models
  • Reproducing kernel Hilbert space
  • Integral operators and Mercer's theorem
  • Convolution theorem
  • Graph convolution
  • Manifolds and local models
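
To make the kernel regularisation bullet concrete, here is a minimal sketch of kernel-regularised regression in a reproducing kernel Hilbert space: by the representer theorem, the minimiser of the penalised squared error is a kernel expansion whose dual coefficients solve (K + lambda*I) alpha = y. The Gaussian kernel, its bandwidth, the regularisation strength, and the toy data are illustrative assumptions, not prescriptions from the course.

# Kernel ridge regression sketch: a regularised inverse problem in an RKHS.
import numpy as np

def rbf_kernel(A, B, gamma=10.0):
    """Gaussian (RBF) kernel matrix between 1-D sample arrays A and B."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-gamma * d2)

def kernel_ridge_fit(x, y, lam=1e-2, gamma=10.0):
    """Solve (K + lam * I) alpha = y for the dual coefficients."""
    K = rbf_kernel(x, x, gamma)
    return np.linalg.solve(K + lam * np.eye(len(x)), y)

def kernel_ridge_predict(x_new, x_train, alpha, gamma=10.0):
    """Evaluate f(x) = sum_i alpha_i k(x, x_i) at the new points."""
    return rbf_kernel(x_new, x_train, gamma) @ alpha

# Toy usage: recover a smooth function from noisy observations.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 50))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(50)
alpha = kernel_ridge_fit(x, y)
y_hat = kernel_ridge_predict(np.linspace(0, 1, 200), x, alpha)

The regularisation weight trades data fit against the RKHS norm of the estimate; as it grows, the solution of the inverse problem becomes smoother and less sensitive to measurement noise.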

L3 courses towards machine learning

  • Functional analysis
  • Differential geometry

References

  1. Functional Data Analysis by James Ramsay and Bernard Silverman, 2020
  2. Riemannian Geometric Statistics in Medical Image Analysis, edited by Xavier Pennec, Stefan Sommer, and Tom Fletcher, 2020
  3. Manifolds, Tensors, and Forms by Paul Renteln, 2014
  4. Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators by Tailen Hsing and Randall Eubank, 2013
  5. At the Interface of Algebra and Statistics by Tai-Danae Bradley, 2020