Step 3
Research management course
We are moving towards the coursework. How do you condense your project description into a single pitch? We reformulate the deliverables from Step 1 and Step 2 into an informal problem statement.
The seminar
- A five-minute warm-up test
- Tensor product and tensor decomposition (see the sketch after this list)
- The first slide
- Homework discussion
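As an illustration of the tensor topic above, here is a minimal NumPy sketch; it is not part of the course materials, and the library choice and example vectors are assumptions. It builds a rank-1 third-order tensor as the outer (tensor) product of three vectors and then recovers the factors, up to scaling and sign, from an SVD of the mode-1 unfolding, which is the simplest case of a CP (canonical polyadic) decomposition.

    import numpy as np

    # The outer (tensor) product of three vectors gives a rank-1 third-order tensor:
    # T[i, j, k] = a[i] * b[j] * c[k].  (Example vectors chosen only for illustration.)
    a, b, c = np.array([1.0, 2.0]), np.array([3.0, 4.0, 5.0]), np.array([6.0, 7.0])
    T = np.einsum('i,j,k->ijk', a, b, c)             # shape (2, 3, 2)

    # The mode-1 unfolding of a rank-1 tensor is a rank-1 matrix.
    T1 = T.reshape(T.shape[0], -1)                   # 2 x 6
    print(np.linalg.matrix_rank(T1))                 # -> 1

    # A one-term CP decomposition can be read off an SVD of the unfolding:
    # the factors are recovered up to scaling and sign.
    u, s, vt = np.linalg.svd(T1)
    a_hat = s[0] * u[:, 0]                           # proportional to a
    bc_hat = vt[0].reshape(b.size, c.size)           # proportional to outer(b, c)
    print(np.allclose(np.einsum('i,jk->ijk', a_hat, bc_hat), T))   # -> True

For higher-rank tensors the CP factors are usually fitted iteratively (e.g. by alternating least squares); the SVD shortcut above works only because the example tensor has rank one.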
Resources
Step 3 YouTube video (expected with the online version)
Homework
- For your selected project, write the motivational part using this template for Step 3.
- The link to your homework is here: https://forms.gle/MBkcGz6aK3LX8MFy7
- Also, a reminder about the Step 2 homework: https://forms.gle/v6fEGLRWkXJ6UZXi9
- Refresh your memory of gradient-based optimization (a short NumPy sketch follows this list); see either:
  - the Wikipedia articles on Gradient descent, Conjugate gradient, Gradient, Directional derivative, Jacobian matrix, Hessian matrix, and Gradient field, or
  - the book
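For self-checking the optimization topics in the last item, here is a minimal NumPy sketch; the quadratic objective, step size, and iteration counts are illustrative assumptions, not course material. It minimises a small quadratic, whose gradient and Hessian are available in closed form, by plain gradient descent and by the linear conjugate gradient method, and compares both with the exact minimiser.

    import numpy as np

    # Quadratic test objective f(x) = 0.5 x^T A x - b^T x with gradient A x - b
    # and constant Hessian A (symmetric positive definite); the minimiser is A^{-1} b.
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([1.0, 1.0])

    def grad(x):
        return A @ x - b

    # Plain gradient descent: repeatedly step against the gradient.
    x = np.zeros(2)
    lr = 0.1                                   # fixed step size, chosen by hand
    for _ in range(200):
        x = x - lr * grad(x)
    print("gradient descent  :", x)

    # Linear conjugate gradient: successive search directions are A-conjugate,
    # so the exact minimiser is reached in at most dim(x) iterations.
    x_cg = np.zeros(2)
    r = b - A @ x_cg                           # residual = minus the gradient
    p = r.copy()
    for _ in range(b.size):
        alpha = (r @ r) / (p @ A @ p)
        x_cg = x_cg + alpha * p
        r_new = r - alpha * (A @ p)
        if np.linalg.norm(r_new) < 1e-12:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    print("conjugate gradient:", x_cg)
    print("exact minimiser   :", np.linalg.solve(A, b))

On a quadratic objective, conjugate gradient reaches the minimiser in at most as many iterations as there are variables, while plain gradient descent only approaches it at a rate governed by the conditioning of the Hessian.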
Transcript of the video
Appears after the seminar.