Brain-Computer Interfaces and Functional Data Analysis
This course is under construction. It covers fundamental mathematical concepts of brain signal analysis.
Each class combines the following parts:
- Comprehensive introduction
- Practical example with code and homework
- Algebraic part of modeling
- Statistical part of modeling
- Joining them in a Hilbert (or any other convenient) space
- A quiz for the next part (possibly placed at the beginning) that shows which theory to catch up on
Linear models
SSA, SVD, PCA
Accelerometer data
- Energy
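A minimal sketch of the SSA/SVD/PCA pipeline on a synthetic accelerometer-like signal, including the signal energy. The signal, window length, and variable names are illustrative assumptions, not course data:

import numpy as np

# Synthetic accelerometer-like signal (assumption: no course dataset is specified here)
t = np.linspace(0, 10, 1000)
signal = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)

# Signal energy: sum of squared samples
energy = np.sum(signal ** 2)

# SSA step 1: embed the series into a trajectory (Hankel) matrix
L = 100                                    # window length (illustrative choice)
K = signal.size - L + 1
X = np.column_stack([signal[i:i + L] for i in range(K)])   # L x K trajectory matrix

# SSA step 2: SVD of the trajectory matrix
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Share of energy captured by each component (squared singular values)
component_energy = s ** 2 / np.sum(s ** 2)

# PCA on the same data: eigendecomposition of the lag-covariance matrix X X^T / K;
# its eigenvectors coincide with the left singular vectors of X
cov = X @ X.T / K
eigvals, eigvecs = np.linalg.eigh(cov)

print("total energy:", energy)
print("energy share of the two leading SSA components:", component_energy[:2].sum())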
Tensor product and spectral decomposition
- vector, covector, dot product
- linear operator
- dot product as a bilinear form, in Euclidean and in Hilbert space (with a useful example)
- bilinear form
- factorization
- spectral decomposition
- SVD
- ??? SVD in Hilbert space
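A minimal numerical sketch of the algebra in the list above: the dot product as a bilinear form, spectral decomposition of a symmetric operator, SVD as a factorization, and the tensor (outer) product of two vectors. The matrices are random illustrations, not course material:

import numpy as np

rng = np.random.default_rng(0)

# Vector and covector: the dot product <x, y> is the bilinear form x^T G y with G = I in Euclidean space
x, y = rng.standard_normal(5), rng.standard_normal(5)
G = np.eye(5)                          # Gram matrix of the standard inner product
assert np.isclose(x @ G @ y, np.dot(x, y))

# A linear operator given by a symmetric matrix: spectral decomposition A = Q diag(w) Q^T
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                      # symmetrize to guarantee real eigenvalues
w, Q = np.linalg.eigh(A)
assert np.allclose(Q @ np.diag(w) @ Q.T, A)

# A general (rectangular) linear operator: SVD B = U diag(s) V^T, a factorization that always exists
B = rng.standard_normal((5, 3))
U, s, Vt = np.linalg.svd(B, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, B)

# Tensor (outer) product of two vectors: a rank-one operator
T = np.outer(x, y)
assert np.linalg.matrix_rank(T) == 1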
Why do we go from Euclidean to Hilbert space? Previously, a vector was a set of measurements. Now it is a finite number of samples. Then it becomes a distribution of samples. The distribution is a point in the Hilbert space. We can take an inner product and a tensor product of two or more distributions. Machine learning view: given samples, a multivariate distribution can be represented as a (direct?) sum of tensor products of its elements.
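One way to make the last paragraph concrete is the kernel mean embedding: each sample distribution is mapped to a point in a reproducing-kernel Hilbert space, and inner products between distributions are estimated directly from samples. The RBF kernel, its bandwidth, and the sample sizes below are assumptions chosen for illustration:

import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Gaussian (RBF) kernel matrix between sample sets a and b
    d2 = np.sum(a ** 2, axis=1)[:, None] + np.sum(b ** 2, axis=1)[None, :] - 2 * a @ b.T
    return np.exp(-gamma * d2)

def mean_embedding_inner(a, b, gamma=1.0):
    # Estimate of <mu_P, mu_Q> in the RKHS from samples a ~ P and b ~ Q
    return rbf_kernel(a, b, gamma).mean()

rng = np.random.default_rng(0)
P = rng.normal(0.0, 1.0, size=(500, 2))      # samples of distribution P
Q = rng.normal(0.5, 1.0, size=(500, 2))      # samples of distribution Q

# Inner product of the two distributions viewed as points in the Hilbert space
ip = mean_embedding_inner(P, Q)

# Squared distance between the embeddings (an MMD^2 estimate)
mmd2 = mean_embedding_inner(P, P) - 2 * ip + mean_embedding_inner(Q, Q)
print(ip, mmd2)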