1.1 Scalars, Vectors, Matrices, and Tensors
- Scalars: a single number, written in italics with a lowercase variable name, such as $x$.
- Vectors: an array of numbers arranged in order, written with a bold lowercase variable name, such as $\bf x$.
- Matrices: a 2-D array of numbers with two indices, written with a bold uppercase variable name, such as $\bf A$.
- Tensors: an array of numbers arranged on a regular grid with a variable number of axes.
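A minimal NumPy sketch (NumPy is just one convenient choice for illustration, not something the notes above prescribe) showing how the number of axes grows from a scalar to a tensor:

```python
import numpy as np

x = np.float64(3.5)                # scalar: a single number, zero axes
v = np.array([1.0, 2.0, 3.0])      # vector: 1-D array, shape (3,)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])         # matrix: 2-D array, shape (3, 2)
T = np.zeros((2, 3, 4))            # tensor: 3-D array, shape (2, 3, 4)

print(v.shape, A.shape, T.shape)   # (3,) (3, 2) (2, 3, 4)
```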
1.2 Matrix Addition and Multiplication
- Transpose: an operation on matrices defined by $(A^{T})_{i,j} = A_{j,i}$. A column vector can be written as the transpose of a row, e.g. ${\bf x} = [x_{1}, x_{2}, x_{3}]^{T}$, and a scalar is its own transpose: $a = a^{T}$.
- Addition: we can add matrices to each other as long as they have the same shape; the sum is taken element-wise, $C_{i,j} = A_{i,j} + B_{i,j}$.
- Matrix product: to define the product $C = AB$, $A$ must have the same number of columns as $B$ has rows, and each entry of $C$ is given by
$$C_{i,j} = \sum_k A_{i,k}B_{k,j}$$
- The matrix product is not commutative ($AB \neq BA$ in general); see the NumPy sketch after this list.
- Element-wise product (Hadamard product): the product of the individual elements, denoted $A \odot B$.
- Dot product: for two vectors ${\bf x}$ and ${\bf y}$ of the same dimensionality, the dot product is the matrix product ${\bf x}^{T}{\bf y}$.
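A minimal NumPy sketch of the operations above (NumPy is an assumption for illustration; the notes are library-agnostic). It checks the summation formula for the matrix product against the built-in `@` operator and shows that $AB$ and $BA$ generally differ:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [2.0, 3.0]])
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# Transpose: (A^T)_{i,j} = A_{j,i}
print(A.T)

# Addition: element-wise, requires the same shape
print(A + B)

# Matrix product: C_{i,j} = sum_k A_{i,k} B_{k,j}
C = A @ B                              # same as np.matmul(A, B)
C_manual = np.zeros_like(C)
for i in range(A.shape[0]):
    for j in range(B.shape[1]):
        for k in range(A.shape[1]):
            C_manual[i, j] += A[i, k] * B[k, j]
assert np.allclose(C, C_manual)

# Not commutative: AB != BA in general
print(np.allclose(A @ B, B @ A))       # False

# Element-wise (Hadamard) product: A ⊙ B
print(A * B)

# Dot product of two vectors: x^T y
print(x @ y)                           # 32.0, same as np.dot(x, y)
```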
1.3 References
- Linear Algebra for AI, Edwith
- Lay et al., Linear Algebra and Its Applications, 5th edition
- Ian Goodfellow et al., Deep Learning
- Gilbert Strang's MIT Linear Algebra lectures
- [Summary of Gilbert Strang's Linear Algebra lectures](https://catonmat.net/mit-linear-algebra-part-one)