Course syllabus: Applied regression analysis

From Research management course
Latest revision as of 01:05, 13 February 2024

The lecture course is devoted to data modeling problems in regression analysis. The emphasis is on automatic model generation and optimal model structure selection. The course has been delivered since 2006.

The lecture course consists of a theoretical part (40 hours) and a practical part (40 hours). The theoretical part covers methods of regression analysis and their foundations. In the practical part, students implement a series of algorithms using the Matlab or Scilab software tools. Prerequisites: linear algebra, statistics, and programming skills; knowledge of optimization methods is a plus.

  1. Introduction to regression analysis
    • Terminology: approximation, interpolation, extrapolation, regression
    • Standard notation. Problem statement
    • What is a regression model?
    • The main problems of regression analysis
    • Linear regression and least squares
    • Introduction to Scilab/SAS
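The least-squares item above can be sketched in a few lines. The course practice uses Scilab or Matlab; this NumPy version with made-up data is only an illustration:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus small noise (illustrative, not from the course)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(50)

# Design matrix for the linear model y = w0 + w1 * x
A = np.column_stack([np.ones_like(x), x])

# Least-squares estimate of the parameters
w, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w)  # approximately [1.0, 2.0]
```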
  2. Computational linear methods
    • Singular value decomposition
    • The features of the SVD
    • Using the SVD: Fisher segmentation
    • Principal component analysis
    • Substitutions in the linear models
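A minimal sketch of how the SVD of a centered data matrix yields the principal components; the data matrix is made up for illustration, and NumPy stands in for the course's Scilab:

```python
import numpy as np

# A small data matrix, rows = observations (values are illustrative)
X = np.array([[2.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [3.0, 2.0]])

# Center the columns, then factor: Xc = U @ diag(s) @ Vt
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * s                    # coordinates in the principal axes
explained = s**2 / np.sum(s**2)   # variance fraction per component
print(explained)                  # singular values come sorted, largest first
```

The rows of `Vt` are the principal directions; projecting back, `scores @ Vt` reconstructs the centered data exactly.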
  3. Regularization in the linear methods
    • Spaces of the singular vectors
    • Matrix norms and the condition number
    • Regularization for the LS, SVD, PCA
    • The weighted regression
    • Scales and the Pareto-slicing
    • The integral indicators and the expert estimations
    • Expert estimation concordance and regularization: linear and quadratic
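Ridge regularization for least squares, one of the regularization schemes listed above, can be sketched as follows; the nearly collinear design and the value of `alpha` are made up for illustration:

```python
import numpy as np

# Ill-conditioned design: two nearly collinear columns (illustrative data)
x = np.linspace(0.0, 1.0, 30)
A = np.column_stack([x, x + 1e-6 * np.sin(x)])
y = A @ np.array([1.0, 1.0])

# Ridge regression: minimize ||A w - y||^2 + alpha ||w||^2,
# solved via the regularized normal equations
alpha = 1e-3
w_ridge = np.linalg.solve(A.T @ A + alpha * np.eye(2), A.T @ y)
print(w_ridge)  # both coefficients near 1 instead of exploding
```

The unregularized normal equations are nearly singular here; the `alpha` term shifts the small singular values away from zero and pulls the solution toward the minimum-norm one.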
  4. Group method of data handling and cross-validation
    • The GMDH principles
    • Cross-validation principles and overtraining
    • External and internal criteria
    • Criteria of regularity, minimal bias, and forecasting ability
    • Linear combinations of the criteria
    • Criterion space and Pareto-optimal front
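The external (cross-validation) criterion can be illustrated with a toy degree-selection experiment; the data, fold count, and degree range here are assumptions for the sketch:

```python
import numpy as np

# Choosing a polynomial degree by 5-fold cross-validation (illustrative data)
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 60)
y = x**2 + 0.05 * rng.standard_normal(60)
folds = np.array_split(rng.permutation(len(x)), 5)

def cv_error(degree):
    """Mean squared error on held-out folds (the external criterion)."""
    errors = []
    for fold in folds:
        train = np.setdiff1d(np.arange(len(x)), fold)
        coeffs = np.polyfit(x[train], y[train], degree)
        errors.append(np.mean((np.polyval(coeffs, x[fold]) - y[fold])**2))
    return float(np.mean(errors))

scores = {d: cv_error(d) for d in range(1, 8)}
best = min(scores, key=scores.get)
print(best, scores[best])
```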
  5. GMDH and model generation
    • The GMDH basic model and Kolmogorov-Gabor polynomial
    • Substitution in the basic model
    • The GMDH algorithm and its termination
    • The multilayer, combinatorial and genetic algorithms
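The Kolmogorov-Gabor polynomial underlying the GMDH basic model enumerates all monomials of the inputs up to a given degree; a small sketch of that enumeration (the function name is mine):

```python
from itertools import combinations_with_replacement

def kolmogorov_gabor_terms(names, degree):
    """List the monomial terms of the Kolmogorov-Gabor polynomial
    up to the given degree, constant term included."""
    terms = ["1"]
    for d in range(1, degree + 1):
        for combo in combinations_with_replacement(names, d):
            terms.append("*".join(combo))
    return terms

print(kolmogorov_gabor_terms(["x1", "x2"], 2))
# ['1', 'x1', 'x2', 'x1*x1', 'x1*x2', 'x2*x2']
```

Each GMDH generation step fits models built from subsets of such terms and keeps the ones that score best on the external criterion.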
  6. Non-linear parametric model generation
    • Problem statements and model representations
    • Four techniques of the model generation
    • Symbolic regression and problems of inductive generation
    • Substitutions and algebra of the trees
    • Expert way of initial model construction
    • Interpretable models
  7. Residual analysis
    • General statistics of the residuals
    • Dispersion analysis
    • Correlation of the residuals, Durbin-Watson criterion
    • Bootstrap of the samples
    • Error function in the data space and in the parameter space
    • Penalty for the parameter values on the linear models
    • Lipschitz constant and the data generation hypothesis
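The Durbin-Watson statistic from the list above is simple to compute; a sketch with synthetic residuals:

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: sum of squared successive differences
    divided by the residual sum of squares. Values near 2 suggest no
    first-order autocorrelation; values near 0, positive autocorrelation."""
    r = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(r)**2) / np.sum(r**2)

rng = np.random.default_rng(2)

# Uncorrelated residuals give a statistic near 2 (illustrative data)
white = rng.standard_normal(1000)
print(durbin_watson(white))

# Strongly positively correlated residuals (a random walk) push it toward 0
trend = np.cumsum(rng.standard_normal(1000))
print(durbin_watson(trend))
```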
  8. Data generation hypothesis
    • Random variable distribution
    • Joint distribution
    • The maximum likelihood principle
    • Univariate and multivariate normal distribution inference
    • The simplest method to estimate the distribution for a given hypothesis
    • Statistical properties of the estimates: consistency, efficiency, bias
    • Graphical analysis of the parameter estimates
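For a univariate normal sample, the maximum likelihood principle gives closed-form parameter estimates; a quick numerical check on synthetic data:

```python
import numpy as np

# Maximum-likelihood estimates for a normal sample:
# the sample mean and the (biased) mean squared deviation
rng = np.random.default_rng(3)
sample = rng.normal(loc=5.0, scale=2.0, size=10_000)

mu_hat = sample.mean()
sigma2_hat = np.mean((sample - mu_hat)**2)   # the MLE divides by n, not n - 1
print(mu_hat, sigma2_hat)  # close to 5 and 4
```

The division by n rather than n - 1 is exactly the bias of the MLE variance estimate mentioned in the list above.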
  9. Coherent Bayesian Inference
    • The first level of the inference
    • The parameter distribution
    • Example of the finite parametric model comparison
    • Model generation and model selection flow
    • The model evidence
    • The posterior distribution and Occam factor
    • Example of the model selection process
  10. Parameter space analysis
    • Optimal brain surgery and importance of model elements
    • Laplace approximation, one- and multidimensional
    • Integration in the parameter space
    • Estimation of the hyperparameters
    • Algorithms of the Hessian matrix approximation
  11. Minimum description length
    • The MDL principle using Bayesian inference
    • Kolmogorov complexity
    • Entropy and complexity
    • Akaike information criterion
    • Bayesian information criterion
    • Data complexity and model complexity
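Up to constants that cancel in model comparison, AIC and BIC for a Gaussian least-squares fit depend only on the residual sum of squares; a sketch comparing a line with a cubic on near-linear synthetic data:

```python
import numpy as np

def aic_bic(rss, n, k):
    """AIC and BIC for a Gaussian least-squares fit with n observations,
    k parameters, and residual sum of squares rss (constant terms dropped)."""
    loglik_term = n * np.log(rss / n)
    return loglik_term + 2 * k, loglik_term + k * np.log(n)

# Near-linear synthetic data: a cubic should not justify its extra parameters
rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 100)
y = 3.0 * x + 0.1 * rng.standard_normal(100)

results = {}
for k in (2, 4):  # a degree (k - 1) polynomial has k parameters
    resid = y - np.polyval(np.polyfit(x, y, k - 1), x)
    results[k] = aic_bic(np.sum(resid**2), len(y), k)
    print(k, results[k])
```

BIC penalizes each extra parameter by log(n) rather than 2, so it prefers the simpler model more strongly as the sample grows.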
  12. Non-parametric regression
    • Data smoothing
    • Exponential smoothing
    • Kernels and regression models
    • Spline approximation
    • Regression using radial basis functions
    • Regression using the support vector machines
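One instance of the kernel-regression item above is the Nadaraya-Watson estimator with a Gaussian kernel; the data and bandwidth here are illustrative:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Kernel regression: a Gaussian-weighted average of the training
    responses around each query point."""
    # Pairwise weights, shape (len(x_query), len(x_train))
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth)**2)
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + 0.1 * rng.standard_normal(200)

y_hat = nadaraya_watson(x, y, x, bandwidth=0.3)
# The estimate tracks sin(x); the one-sided windows at the boundaries
# dominate the maximum error
print(np.max(np.abs(y_hat - np.sin(x))))
```

The bandwidth trades bias against variance: small values follow the noise, large values oversmooth the curve.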
  13. Time series analysis
    • Examples of the time series
    • Stationarity and ergodicity; trend and fluctuations
    • Heteroscedasticity
    • Singular structure analysis
    • Vector auto-regression
    • Numerical experiment organization
    • Requirements and expectations in the application domain
    • Expert point-of-view and expert estimations
    • Data preprocessing organization
    • Choice of the model class and algorithms
    • Architecture of the software
    • Report preparation
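A scalar special case of the auto-regression item above: fitting an AR(2) model by least squares on a simulated series (the coefficients and noise level are made up):

```python
import numpy as np

# Simulate a stationary AR(2) series (illustrative parameters)
rng = np.random.default_rng(6)
n = 2000
true_a = np.array([0.6, 0.3])
series = np.zeros(n)
for t in range(2, n):
    series[t] = (true_a[0] * series[t - 1] + true_a[1] * series[t - 2]
                 + 0.1 * rng.standard_normal())

# Regress each value on its two predecessors
X = np.column_stack([series[1:-1], series[:-2]])
target = series[2:]
a_hat, *_ = np.linalg.lstsq(X, target, rcond=None)
print(a_hat)  # close to the true coefficients [0.6, 0.3]
```

Vector auto-regression generalizes this by stacking several series and replacing the scalar coefficients with matrices.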


Grading:
  • Exam: 80%
  • Coursework: 10%
  • Intermediate case test: 10%

