- Linear algebra refresher
- Dot product of 2 vectors measures their alignment — the projection of one onto the other, scaled by the other's length
- Product of matrix and vector, two views:
  - weighted sum of the columns of the matrix, with the input entries as weights
  - projection of x onto the rows: each element of the product measures the alignment of x with one row of the matrix
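A minimal NumPy sketch of the two views above (the matrix and vector values are made up for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([10.0, 20.0])

# View 1: weighted sum of the columns of A, weighted by the entries of x
y_cols = x[0] * A[:, 0] + x[1] * A[:, 1]

# View 2: each output element is the dot product of x with one row of A
y_rows = np.array([A[i] @ x for i in range(A.shape[0])])

# Both agree with the ordinary matrix-vector product
assert np.allclose(y_cols, A @ x)
assert np.allclose(y_rows, A @ x)
```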

- Sparsity: most entries of the matrix are zero
- Weight sharing: the same convolution kernel weights are repeated across the rows of the matrix multiplied with the input
- Toeplitz matrix: constant along each diagonal — the structure a 1-D convolution matrix has
- Fully connected layer – dense (full) matrix; convolution – sparse matrix with shared weights
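A NumPy sketch of a 1-D convolution written as a sparse, weight-sharing Toeplitz-style matrix (kernel and input values are made up; note that deep-learning "convolution" is cross-correlation, so `np.correlate` is the matching reference):

```python
import numpy as np

kernel = np.array([1.0, -2.0, 1.0])        # hypothetical 1-D kernel
x = np.array([4.0, 1.0, 2.0, 5.0, 3.0])    # hypothetical input signal

n, k = len(x), len(kernel)
# Each row repeats the same kernel (weight sharing), shifted one step
# to the right, with zeros elsewhere (sparsity).
T = np.zeros((n - k + 1, n))
for i in range(n - k + 1):
    T[i, i:i + k] = kernel

# Multiplying by T equals a "valid" sliding-window correlation with the kernel
assert np.allclose(T @ x, np.correlate(x, kernel, mode="valid"))
```

A fully connected layer would use a dense matrix of the same shape with all entries free; the convolution constrains it to these few shared weights.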
- Joan Bruna – mathematics of deep learning