Topics
Vectors and Spaces
- Vector, Norm, Dot Product, Span, Subspace, Basis, Dimension, Rank
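As a quick sketch of these definitions (the vectors here are arbitrary examples, not from any particular textbook exercise), NumPy can compute norms, dot products, and rank directly:

```python
import numpy as np

# Two example vectors in R^2
u = np.array([3.0, 4.0])
v = np.array([1.0, 2.0])

# Norm: ||u|| = sqrt(u . u)
print(np.linalg.norm(u))          # 5.0

# Dot product: u . v = 3*1 + 4*2
print(np.dot(u, v))               # 11.0

# Rank: number of linearly independent columns;
# u and v are independent, so the rank is 2
A = np.column_stack([u, v])
print(np.linalg.matrix_rank(A))   # 2
```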
Linear Transformations
- Matrix Multiplication, Linear Map, Matrix Representation
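A minimal sketch of the key fact connecting these topics: composing linear maps corresponds to multiplying their matrix representations. The rotation and scaling matrices below are illustrative choices, not taken from the references.

```python
import numpy as np

# T(x) = A @ x and S(x) = B @ x are linear maps;
# the composition (S ∘ T)(x) equals (B @ A) @ x.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])    # rotation by 90 degrees
B = np.array([[2.0,  0.0],
              [0.0,  2.0]])    # scaling by 2
x = np.array([1.0, 0.0])

lhs = B @ (A @ x)   # apply T, then S
rhs = (B @ A) @ x   # single matrix for the composition
print(lhs, rhs)     # both [0. 2.]
```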
Orthogonality and Projections
- Inner Product, Orthogonal Projection, Gram-Schmidt, Orthonormal Basis
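The Gram-Schmidt process can be sketched in a few lines: subtract from each vector its projections onto the basis built so far, then normalize. The input vectors below are assumed linearly independent (the function does not guard against dependence).

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        # Remove the components along vectors already in the basis
        w = v - sum(np.dot(v, q) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])
# Rows of Q are orthonormal: Q @ Q.T is the identity
print(np.round(Q @ Q.T, 8))
```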
Matrix Decompositions
- Eigenvalues, Eigenvectors, Diagonalization, Jordan Form, SVD
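As a sketch of diagonalization and the SVD side by side (the symmetric matrix is an arbitrary example): a symmetric matrix factors as A = PDP^T with orthonormal eigenvectors, while the SVD A = UΣV^T exists for any matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Diagonalization of a symmetric matrix: A = P D P^T
vals, vecs = np.linalg.eigh(A)
print(np.allclose(vecs @ np.diag(vals) @ vecs.T, A))  # True

# SVD: A = U Σ V^T, defined for any matrix, square or not
U, s, Vt = np.linalg.svd(A)
print(np.allclose(U @ np.diag(s) @ Vt, A))            # True
```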
Advanced Topics
- Pseudoinverse, Low Rank Approximation
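A small sketch of the pseudoinverse in action (the overdetermined system below is a made-up example): for a tall matrix with full column rank, A⁺b gives the unique least-squares solution of Ax ≈ b.

```python
import numpy as np

# Overdetermined system: 3 equations, 2 unknowns
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 2.0])

# Moore-Penrose pseudoinverse, computed internally via the SVD
A_pinv = np.linalg.pinv(A)
x = A_pinv @ b   # least-squares solution to Ax ≈ b

# Agrees with NumPy's dedicated least-squares solver
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```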
Study Resources and Scope
[1] Linear Algebra, 5th Edition - Friedberg, Insel, and Spence
- Vector Spaces
- 1.2 Vector Spaces
- 1.3 Subspaces
- 1.5 Linear Dependence
- 1.6 Bases and Dimension
- Linear Transformations and Matrices
- 2.1 Linear Transformations, Null Spaces, and Ranges
- 2.2 The Matrix Representation of a Linear Transformation
- 2.3 Composition of Linear Transformations and Matrix Multiplication
- Diagonalization
- 5.1 Eigenvalues and Eigenvectors
- Inner Product Spaces
- 6.1 Inner Products and Norms
- 6.2 The Gram-Schmidt Orthogonalization Process and Orthogonal Complements
- Canonical Forms
- 7.1 The Jordan Canonical Form I
[2] Mathematics for Machine Learning - Deisenroth, Faisal, and Ong
- Linear Algebra
- 2.5 Linear Independence
- 2.6 Basis and Rank
- 2.7 Linear Mappings
- Matrix Decompositions
[3] Video series covering vectors, linear combinations, span, basis vectors, linear transformations, matrices, matrix multiplication, eigenvalues, and eigenvectors.
[4] Python/NumPy Implementations
- Code for low-rank approximations.
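A minimal sketch of such an implementation (the matrix and helper name here are illustrative, not the actual code from the repository): by the Eckart-Young theorem, truncating the SVD to the top k singular values gives the best rank-k approximation in the Frobenius norm.

```python
import numpy as np

def low_rank_approx(A, k):
    """Best rank-k approximation of A (Eckart-Young),
    built from the truncated SVD A ≈ U_k Σ_k V_k^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

A = np.arange(12, dtype=float).reshape(3, 4)
A1 = low_rank_approx(A, 1)
print(np.linalg.matrix_rank(A1))  # 1
```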
This post is licensed under CC BY 4.0 by the author.