Index of Math Notebooks
- Notebook 1 / 2017: January - May / Linear Algebra / Solving Simultaneous Equations
- Notebook 2 / 2017: May - June / Linear Algebra and Goodfellow Deep Learning p. 34 - 79
- Notebook 3 / 2017: June - August / Goodfellow Deep Learning p. 82 - p.201 / Khan Academy on gradients ∇f
- Notebook 4 / 2017: August - October / Lang Calculus of Several Variables p. 385-415; p. 49 - 96
- Notebook 5 / 2017: October - November / Lang Calculus of Several Variables p. 99 - 139; p. 434 Jacobian matrix, a container for multiple gradients (∇f) / PBHL Baxandall & Liebeck Vector Calculus starting October 24, 2017.
- Notebook 6 / 2017: November - December / PBHL Vector Calculus p. 35-48 / Robert Hogg et al. Introduction to Mathematical Statistics, Seventh Edition (2014) / PBHL Vector Calculus p. 48-64 / Rosebrock Deep Learning for Computer Vision with Python Chapter 10 p. 127
- Notebook 7 / December 2017 - January 2018 / PBHL p. 69 Chapter 2, Section 1: Differential Geometry / PBHL p. 81 Chapter 2, Section 9: Particle Dynamics / PBHL p. 97 Chapter 3, Section 1: Level Curves, Sets, Graphs / PBHL p. 101 Chapter 3, Section 2: Continuity and Limits for Rm → R1 / Abu-Mostafa, Magdon-Ismail, Lin (AML) Learning from Data (2012) p. 1-15
- Notebook 8 / 2018: January - April / PBHL p.123-126 Differentiability and linear approximation in R1 → Rn / PBHL p. 126 Differentiability and Tangent Planes
- Notebook 9 / 2018: February /
- Notebook 10 / 2018: April - August /
- Notebook 11 / 2018: August - October /
- Notebook 12 / 2018: September - October /
- Notebook 13 / October 2018 - February 2019 /
- Notebook 14 / 2019: November - December /
- Notebook 15 / 2019: February - July /
- Notebook 16 / 2019: July - October /
- Notebook 17 / November 2019 - March 2020 /
- Notebook 18 / 2020: March - September /
- Notebook 19 / September 2020 - August 2021 /
- Notebook 20 / August 2021 - December 2022 /
- Notebook 21 / December 2022 - March 2024 /
- Notebook 22 / April 2024 - May 2024 /
- Notebook 23 / May 2024 - /
Linear Algebra Done Right
- Review of T-invariant subspaces, eigenvalues, eigenvectors, and eigenspaces
- Review of upper-triangular matrices, diagonals, diagonal matrices
Next videos to watch
2021 Log
- 8/18/2021: Back to basics, doing problems in Singh p. 493 - 503, finding eigenvalues using the 3B1B quick technique for 2x2 matrices (equivalently, the characteristic equation det(A - λI) = 0), then finding the associated eigenvectors by substituting each eigenvalue into (A - λI)v⃗ = 0 and solving the resulting system of equations. See the sketch at the end of this log.
- 8/19: Reviewed the fastest ways to solve systems of equations, whether Gaussian elimination or other methods, to complete Singh Exercises 7.1.1 p. 503. See this link also for reduced row echelon form (RREF)
- 8/20: Review of Gaussian elimination problems
- 8/21: More Gauss elimination problems from K. Singh textbook
- 8/22: Final Gaussian elimination problems. Back to algebraic/numeric calculation of eigenvalues and eigenvectors before the more abstract proof versions.
- 8/23: Singh Exercises 7.1 p. 503 to calculate eigenvalues and eigenvectors
- 8/24: Learned how to solve 4-variable systems more efficiently, using Wolfram Alpha as an aid for managing the transition between mixed fractions and exact rational results (the sketch at the end of this log does the same with sympy).
- 9/01: Two eigenvalue/eigenvector problems from Singh 7.2, p. 503
- 9/02: Posted Day 2 of Gaussian elimination
- 9/03: Posted Day 3 of Gaussian elimination and prepped IG post for Saturday
- 9/04: Day 1 of Axler, Chapter 5 on IG
- 9/05: Day 2 of Axler. Building momentum in understanding how all pieces of chapter 5 fit together.
- 9/06: Day 3 of Axler on IG.
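- Sketch for the 8/18 - 8/24 entries above (my own code with made-up matrices, not Singh's exercises): eigenvalues of a 2x2 matrix via the characteristic equation det(A - λI) = 0, eigenvectors via (A - λI)v⃗ = 0, and a 4-variable system solved with exact rational arithmetic (a sympy analogue of the Wolfram Alpha workflow).

```python
from sympy import Matrix, linsolve, solve, symbols

lam = symbols("lam")

# Eigenvalues of a 2x2 matrix from the characteristic equation det(A - lam*I) = 0.
A = Matrix([[4, 1],
            [2, 3]])
char_eq = (A - lam * Matrix.eye(2)).det()
eigenvalues = solve(char_eq, lam)                 # roots are 2 and 5

# Eigenvectors: for each eigenvalue, solve (A - lam*I) v = 0 (i.e. take the nullspace).
for ev in eigenvalues:
    eigvecs = (A - ev * Matrix.eye(2)).nullspace()
    print(ev, [list(v) for v in eigvecs])

# A 4-variable system solved exactly: sympy keeps the answer as rational numbers
# rather than decimals or mixed fractions.
coeffs = Matrix([[2, 1, -1, 3],
                 [1, -2, 4, 1],
                 [3, 1, 2, -2],
                 [1, 1, 1, 1]])
rhs = Matrix([5, 3, 7, 2])
x1, x2, x3, x4 = symbols("x1 x2 x3 x4")
print(linsolve((coeffs, rhs), x1, x2, x3, x4))
```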
2022 Log
- 8/03/2022 Researched some new books on abstract algebra, group theory, etc.
- 8/23 Been working on Nathan Jacobson’s Basic Algebra I, 2nd Edition for the last week or two. Had a nice breakthrough on generalizing Cartesian products from 2 parent sets to n parent sets. For example, the coordinates/components of an n-dimensional vector in Rn come from the n-fold Cartesian product R × … × R.
- 8/25 More work on Jacobson / Jake
2023 Log
- 1/01/2023 Notes on Peano’s Axioms, used to formally develop the set of natural numbers and related operations, from Roberto Pelayo’s University of Hawaii site.
- 1/02 continued on Pelayo’s treatment of Peano’s Axioms
- 1/03 more on the successor function S()
- 1/04 Downloaded a history of science/math article by Michael Segre, an extensive paper: Peano’s Axioms in their Historical Context (1994)
- 1/06 downloaded 12 articles about Peano collected by Hubert Kennedy, found via the wiki article on Giuseppe Peano
- 1/07 More progress on Segre paper
- 1/08 From Pelayo, Axiom 9: If V is an inductive set, then N is a subset of V; in symbols, N ⊆ V for every inductive set V.
- 1/09 Per Segre, Peano first introduced the union and intersection notation (∪, ∩) commonly used today.
- 1/10 More Segre / Peano
- 1/16 Review of how to integrate all 9 axioms
- 1/23 Jacobson treatment of Peano post-Pelayo
- 1/24 Exercises on Jacobson p. 19
- 1/25 Proof of addition for N as developed by Pelayo
- 1/26 Additional examples of iterated addition per Pelayo (a small sketch of the recursion is at the end of this log)
- 5/25 Good PDF ‘Everything you always wanted to know about mathematics (2013)’ and HN thread
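- Sketch for the 1/25 - 1/26 entries above (my own illustration, not Pelayo's formal development): natural numbers modeled as repeated applications of a successor S to a base element, with addition defined by recursion on the second argument (a + 1 = S(a), a + S(b) = S(a + b)).

```python
ONE = ()                        # arbitrary token standing in for the first natural number

def S(n):
    """Successor function: wrap the numeral in one more layer."""
    return (n,)

def add(a, b):
    """Peano-style addition, defined by recursion on the second argument."""
    if b == ONE:
        return S(a)             # a + 1 = S(a)
    (b_pred,) = b               # otherwise b = S(b_pred)
    return S(add(a, b_pred))    # a + S(b') = S(a + b')

def to_int(n):
    """Convert a successor numeral back to an ordinary int, for display only."""
    count = 1
    while n != ONE:
        (n,) = n
        count += 1
    return count

TWO, THREE = S(ONE), S(S(ONE))
assert to_int(add(TWO, THREE)) == 5
```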
2024 Log
- 3/09/2024. Starting over with Math which I’ve not looked at for about a year, since January 2023. Continuing with Notebook 21 which I started back in Dec 2022. Using Axler 4th edition for this run.
- 3/25/2024. Working on visual proofs for Theorems 3.22 and 3.25.
- 4/01/2024. Began Section 3D on inverse functions, invertibility etc. Also started Notebook #22.
- 4/06/2024. Began Section 5A on eigenspaces. Interesting changes between 3rd and 4th editions wrt when operators are introduced. 3rd edition introduces operators in Section 3D; 4th Ed introduces them at beginning of 5A.
- 4/07 Reviewed the definitions of invariant maps and invariant subspaces from Wikipedia. The article also introduces the terminology of calling a subspace W ‘T-invariant’, because a subspace W is invariant only with respect to a specific map T.
- 4/08 Also looked at eigen calculations in Singh.
- 4/09 Reviewed G. Strang YouTube lecture MIT OpenCourseWare on eigenvalues and eigenvectors. Up to 14:50 out of 51:00
- 4/10 Worked out on paper characteristic equation etc based on Strang MIT lecture
- 4/11 Finished Strang lecture 21, started lecture 22.
- 4/12 Calculated eigenvectors and eigenvalues for matrices A and B from Strang lecture 21.
- 4/14 More on Strang lecture 21.
- Sometimes one can add eigenvalues of matrices in the Av⃗ = λv⃗ setting, but mostly one cannot: it works only when the two matrices happen to share the same eigenvectors, in which case (A + B)v⃗ = (λ + μ)v⃗ for a shared eigenvector v⃗. A quick numerical check is sketched below.
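- A quick numerical check of the point above (my own sketch, not from the lecture): B is built as a polynomial in A, so A and B share eigenvectors, and the eigenvalues of A + B are exactly the sums of the corresponding eigenvalues.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = A @ A + np.eye(2)            # same eigenvectors as A, eigenvalues lam**2 + 1

eigvals_A, _ = np.linalg.eig(A)
eigvals_B = eigvals_A**2 + 1
eigvals_sum, _ = np.linalg.eig(A + B)

# For a shared eigenvector v: (A + B) v = (lam_A + lam_B) v.
assert np.allclose(sorted(eigvals_sum), sorted(eigvals_A + eigvals_B))
print(sorted(eigvals_sum), sorted(eigvals_A + eigvals_B))
```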
- 4/15 What square matrices have no real eigenvalues? Rotation matrices (2-D rotations by angles other than 0 or π), which still have eigenvalues that are complex numbers.
- See this Quora answer: ‘So the answer is: it depends on which ring you’re working in. If you’re working in C, then every polynomial with coefficients in C has solutions in C, so every square matrix would have eigenvalues. If you’re working in some other ring R, then your matrix may not have eigenvalues whenever its characteristic polynomial happens not to have any solutions in R.’
- See also this answer, which says that all complex square matrices have eigenvalues; this follows from the fundamental theorem of algebra (the characteristic polynomial always has a complex root). Also extends the concept of eigenvalues to eigenpairs, even/odd dimensions, and rotations: in 4-D space you can have 2 independent rotations in 2 non-intersecting planes. A rotation example is sketched below.
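- A small check of the rotation example (my own sketch): a 2-D rotation by an angle strictly between 0 and π has no real eigenvalues; numerically its eigenvalues come out as the complex pair e^(±iθ).

```python
import numpy as np

theta = np.pi / 3                              # rotate by 60 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigenvalues, _ = np.linalg.eig(R)
# Complex conjugate pair cos(theta) +/- i*sin(theta), i.e. e^(+i*theta), e^(-i*theta).
print(eigenvalues)
print(np.exp(1j * theta), np.exp(-1j * theta))
```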
- 4/16 For Example 5.6 (p. 134), T : F3 → F3 where T(x,y,z) = (7x+3z, 3x+6y+9z, -6y). Used notes on how to calculate the determinant of a 3x3 square matrix from JH math notebook #5, 10/25/2017 - PB+HL Chapter 1, Section 2, Exercise 7. A sketch of the computation is below.
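- A sketch of that computation (my own code, using the matrix of T with respect to the standard basis rather than the hand calculation from the notebook):

```python
from sympy import Matrix, factor, symbols

lam = symbols("lam")

# Matrix of T(x, y, z) = (7x + 3z, 3x + 6y + 9z, -6y) in the standard basis:
# the columns are T(e1), T(e2), T(e3).
M = Matrix([[7, 0, 3],
            [3, 6, 9],
            [0, -6, 0]])

# Characteristic polynomial via the 3x3 determinant, then the eigenvalues.
char_poly = (M - lam * Matrix.eye(3)).det()
print(factor(char_poly))
print(M.eigenvals())        # eigenvalues with their multiplicities
```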
- 4/21 eigenvector matrix S, eigenvalue matrix Λ
- 4/23. Note. Master YouTube playlist for Strang’s Spring 2005 lecture MIT 18.06 Linear Algebra
- Lecture 21 - Intro to Eigenvalues, Eigenvectors
- Lecture 22 - Diagonalizability, Fibonacci Sequence
- 4/24 - Tufts PDF Theorems 5 and 6 about how an n x n matrix with n distinct eigenvalues is diagonalizable (distinct eigenvalues are sufficient, not necessary; diagonalizability is equivalent to having n linearly independent eigenvectors). This is from their intro course on pure and applied linear algebra, Tufts Math 70. Here is the 2020 syllabus
- 4/28 - Started Notebook 23 after finishing the Fibonacci sequence example using AS = SΛ (sketched below)
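- A minimal sketch of the AS = SΛ Fibonacci idea (my own code, in the spirit of Strang lecture 22): diagonalize the 2x2 step matrix so that A^k = S Λ^k S^(-1), then read the Fibonacci numbers off the powers.

```python
import numpy as np

# Step matrix: [F(k+1), F(k)] = A @ [F(k), F(k-1)].
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])

# Diagonalize: A S = S Lam, so A = S Lam S^(-1) and A**k = S Lam**k S^(-1).
eigvals, S = np.linalg.eig(A)
S_inv = np.linalg.inv(S)

def fib(k: int) -> int:
    """F(k), using A**(k-1) applied to the starting vector [F(2), F(1)] = [1, 1]."""
    A_pow = S @ np.diag(eigvals ** (k - 1)) @ S_inv
    u1 = np.array([1.0, 1.0])
    return int(round((A_pow @ u1)[1]))   # second component of A**(k-1) u1 is F(k)

print([fib(k) for k in range(1, 11)])    # 1, 1, 2, 3, 5, 8, 13, 21, 34, 55
```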
- 4/30 - Thinking through the proof of Theorem 5.11 (p.136) in Axler 4e: Linearly independent eigenvectors.
- Statement of the theorem: Let T be an operator on V. Then every list of eigenvectors of T corresponding to distinct eigenvalues of T is linearly independent.
- Proof
- Suppose the desired result is false. Then there must exist a smallest positive integer m such that there exists a linearly dependent list v⃗1, …, v⃗m of eigenvectors of T corresponding to distinct eigenvalues λ1, …, λm of T.
- Note: m must be greater than or equal to 2, because an eigenvector is by definition a non-zero vector, so a list consisting of a single eigenvector is linearly independent.
- Thus there must exist scalars a1, …, am belonging to F, none of which are 0 (because of the minimality of m), such that the linear combination a1v⃗1 + … + amv⃗m = 0.
- Apply T - λmI to both sides of the equation above, getting… (the step is sketched below).
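- A sketch of how that step plays out (my reconstruction of the standard argument, not a quote from Axler):

```latex
% Apply (T - \lambda_m I) to  a_1 v_1 + \dots + a_m v_m = 0  and use  T v_j = \lambda_j v_j:
% the last term vanishes and each remaining term is scaled by (\lambda_j - \lambda_m).
\[
  (T - \lambda_m I)(a_1 v_1 + \dots + a_m v_m)
    = a_1(\lambda_1 - \lambda_m) v_1 + \dots + a_{m-1}(\lambda_{m-1} - \lambda_m) v_{m-1}
    = 0 .
\]
% The eigenvalues are distinct and no a_j equals 0, so every coefficient
% a_j(\lambda_j - \lambda_m) with j < m is nonzero.  That makes v_1, \dots, v_{m-1} a
% shorter linearly dependent list of eigenvectors for distinct eigenvalues,
% contradicting the minimality of m.
```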
- 5/18 - Restarted thinking about how eigenvalues and eigenvectors have no meaning without a specific map T or matrix A associated with them, and how the most powerful (only?) usage is with operators, i.e., square matrices. Also: Theorem 5.11 is better expressed in Axler 3e than in Axler 4e.
- 5/19 - In general, only square matrices have eigenvectors and eigenvalues, although Singular Value Decomposition (SVD) provides an analogue (singular values and singular vectors) for rectangular, non-square matrices. A tiny illustration is below.
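- A tiny illustration of the SVD point (my own sketch, with a made-up matrix): numpy will not compute eigenvalues of a rectangular matrix, but its SVD gives singular values, and those are the square roots of the eigenvalues of the square matrix AᵀA.

```python
import numpy as np

# A 3x2 (rectangular) matrix: np.linalg.eig only accepts square matrices,
# but the SVD always exists.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

U, sigma, Vt = np.linalg.svd(A, full_matrices=False)

# The squared singular values equal the eigenvalues of the square matrix A^T A.
eigvals_AtA = np.linalg.eigvalsh(A.T @ A)        # ascending order
print(sigma**2)
print(np.sort(eigvals_AtA)[::-1])                # descending, to match sigma**2
```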
- 6/01 - In trying to understand the proof for Theorem 5.12 (p. 136) I was led back to Theorem 2.22 (p.35): “In a finite-dimensional vector space, the length of every linearly independent list of vectors is less than or equal to the length of every spanning list of vectors.”
- As a follow-up, interesting examples on the following page (p. 36): Example 2.23 (no list of length 4 is linearly independent in R3) and Example 2.24 (no list of length 3, i.e., a list of 3 vectors, can possibly span R4).
- 6/02 - Continued on to understanding polynomials applied to operators, e.g., the differentiation map D, and finally raising an operator to a power, e.g., Tm.
- 6/03 - Decided to review what I might have missed in Chapter 4, Polynomials. Interesting difference in the Fundamental Theorem of Algebra as applied to complex vs. real numbers.
- 6/04 - Completed Section 5A, including multiplication of polynomial functions of an operator. Properties: (a) the homomorphism/multiplicative property, (pq)(T) = p(T)q(T), and (b) the commuting property, p(T)q(T) = q(T)p(T). A quick numerical check is below.
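- A quick numerical check of those properties (my own sketch, with a made-up 2x2 matrix standing in for the operator T): evaluate p(T), q(T), and (pq)(T), then verify that (pq)(T) = p(T)q(T) and that p(T) and q(T) commute.

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])       # stand-in for an operator on F^2
I = np.eye(2)

def p_of(T):
    """p(z) = z**2 - 5z + 6, evaluated at the operator T."""
    return T @ T - 5 * T + 6 * I

def q_of(T):
    """q(z) = z + 1, evaluated at the operator T."""
    return T + I

def pq_of(T):
    """(pq)(z) = (z**2 - 5z + 6)(z + 1) = z**3 - 4z**2 + z + 6, evaluated at T."""
    return T @ T @ T - 4 * (T @ T) + T + 6 * I

assert np.allclose(pq_of(T), p_of(T) @ q_of(T))            # (pq)(T) = p(T) q(T)
assert np.allclose(p_of(T) @ q_of(T), q_of(T) @ p_of(T))   # the factors commute
```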
- 6/05 - Went back to review Chapter 4: Polynomials
- 6/10 - Completed all of Polynomials (including careful study of fundamentals of division operations for R1). Back to Chapter 5, Section 5B, minimal polynomial.
- 6/12 - Interesting: the treatment of monic polynomials and minimal polynomials moves from Chapter 8 (Axler 3e) to Chapter 5 (Axler 4e).
- 6/17 - In trying to understand the proof of 5.22, decided to watch this video on minimal polynomials. A small worked check is sketched below.
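- A small worked check of the minimal-polynomial idea (my own example, not from the video): for a diagonalizable matrix with a repeated eigenvalue, the minimal polynomial is a proper (monic) factor of the characteristic polynomial.

```python
from sympy import Matrix, expand, symbols

z = symbols("z")

# Diagonal matrix with eigenvalues 2, 2, 3: characteristic polynomial
# (z - 2)**2 * (z - 3), but minimal polynomial only (z - 2)(z - 3).
A = Matrix([[2, 0, 0],
            [0, 2, 0],
            [0, 0, 3]])
I = Matrix.eye(3)

print(expand(A.charpoly(z).as_expr()))      # z**3 - 7*z**2 + 16*z - 12

# (z - 2)(z - 3) already annihilates A, so it is the minimal polynomial;
# neither smaller monic candidate (z - 2) nor (z - 3) does.
print((A - 2 * I) * (A - 3 * I))            # zero matrix
print(A - 2 * I)                            # not the zero matrix
```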
- 6/18 - More YouTube links:
- 6/20