Step 3

We are moving towards the coursework. How do you put your project description into one pitch? We reformulate the deliverables from Steps 1 and 2 as an informal problem statement.

The seminar

  1. The warm-up 5-minute test (https://forms.gle/4SjjkGukAnBq61mX7)
  2. Singular value decomposition and tensor product, with a demo (https://timbaumann.info/svd-image-compression-demo/); a short code sketch follows this list
  3. The second slide and the third slide
  4. Homework discussion
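
A minimal illustration of item 2, not part of the course materials: a NumPy sketch of rank-k compression with the singular value decomposition, in the spirit of the linked demo. It also shows the connection to the tensor (outer) product, since the truncated SVD is a sum of k rank-one outer products. The random matrix standing in for a grayscale image is an assumption of this sketch.

    # Rank-k approximation via truncated SVD (illustrative sketch only).
    import numpy as np

    def rank_k_approx(A, k):
        """Best rank-k approximation of A in the Frobenius norm (Eckart-Young)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        # The truncated SVD is a sum of k rank-one tensor (outer) products:
        #   A_k = sum_i s[i] * outer(U[:, i], Vt[i, :])
        return sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(k))

    A = np.random.default_rng(0).random((256, 256))  # stand-in for a grayscale image
    A_50 = rank_k_approx(A, 50)
    print("relative error:", np.linalg.norm(A - A_50) / np.linalg.norm(A))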

Resources

Step 3 YouTube video

Homework

  1. For your selected project, write the motivational part using this template for Step 3.
  2. Submit the link to your homework here (https://forms.gle/MBkcGz6aK3LX8MFy7).
  3. Also, a reminder about the Step 2 homework (https://forms.gle/v6fEGLRWkXJ6UZXi9).
  4. Refresh your memory of Bayesian statistics (a short code sketch follows this list); see either:
    1. the Wikipedia articles on the Covariance matrix (https://en.wikipedia.org/wiki/Covariance_matrix), Multivariate normal distribution (https://en.wikipedia.org/wiki/Multivariate_normal_distribution), Generalized linear model (https://en.wikipedia.org/wiki/Generalized_linear_model), Ridge regression (https://en.wikipedia.org/wiki/Ridge_regression#Tikhonov_regularization), and Bayesian inference (https://en.wikipedia.org/wiki/Bayesian_inference), or
    2. Chapter 1 of the book by C.M. Bishop, Pattern Recognition and Machine Learning (https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf).
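
To tie together several of the topics in item 4, here is a minimal sketch, my own illustration with assumed values for the prior precision alpha and noise precision beta and with synthetic data: ridge regression is the MAP estimate of a Bayesian linear model with a multivariate normal prior N(0, alpha^{-1} I) on the weights and Gaussian observation noise.

    # Ridge regression as Bayesian MAP estimation (illustrative sketch only).
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 50, 3
    X = rng.normal(size=(n, d))
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + rng.normal(scale=0.1, size=n)

    alpha, beta = 1.0, 100.0  # assumed prior precision and noise precision

    # The posterior over the weights is multivariate normal N(m, S) with
    #   S^{-1} = alpha*I + beta*X^T X,   m = beta * S @ X^T y
    S_inv = alpha * np.eye(d) + beta * X.T @ X
    m = beta * np.linalg.solve(S_inv, X.T @ y)

    # The posterior mean coincides with the ridge solution for lambda = alpha/beta.
    lam = alpha / beta
    w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

    print("posterior mean:", np.round(m, 3))
    print("ridge solution:", np.round(w_ridge, 3))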

Schemes for the second slide

Scheme 1

  1. Problem to research
  2. Method
  3. Contribution

Scheme 2

  1. Problem to solve
  2. Problem statement
  3. Solution

Scheme 3

  1. Goal
  2. Alternatives
  3. Choice reasoning


Transcript of the video

Appears after the seminar.