Linear Algebra: Theory, Intuition, Code

Wholesome mathy goodness for everyone.

About the Book

Linear algebra is perhaps the most important branch of mathematics for computational sciences, including machine learning, AI, data science, statistics, simulations, computer graphics, multivariate analyses, matrix decompositions, signal processing, and so on.

The way linear algebra is presented in traditional textbooks differs from how professionals use it on computers to solve real-world problems in machine learning, data science, statistics, and signal processing. For example, the "determinant" of a matrix is important in linear algebra theory, but should you actually use the determinant in practical applications? The answer may surprise you!
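As one illustration of the theory-versus-practice gap (a minimal sketch in Python with NumPy, not taken from the book's own code): the determinant scales as c^n when a matrix is multiplied by a scalar c, so even a perfectly well-behaved matrix can have a vanishingly small determinant, while measures like the condition number and rank report that nothing is wrong.

```python
import numpy as np

# det(c*A) = c**n * det(A) for an n-by-n matrix, so scaling a matrix
# by 0.1 shrinks its determinant by a factor of 10**n. A "nearly zero"
# determinant therefore does not imply a nearly singular matrix.
n = 100
A = 0.1 * np.eye(n)  # a scaled identity: as well-behaved as matrices get

print(np.linalg.det(A))          # ~1e-100: looks alarmingly close to zero
print(np.linalg.cond(A))         # 1.0: perfectly conditioned
print(np.linalg.matrix_rank(A))  # 100: full rank
```

This is one reason practitioners reach for rank or the condition number, rather than the determinant, to assess a matrix numerically.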

If you are interested in learning the mathematical concepts of linear algebra and matrix analysis, and also want to apply those concepts to data analysis on computers (e.g., statistics or signal processing), then this book is for you. You'll see all the math concepts implemented in MATLAB and in Python.

Unique aspects of this book:

- Clear and comprehensible explanations of concepts and theories in linear algebra.

- Several distinct explanations of the same ideas, which is a proven technique for learning.

- Visualization using graphs, which strengthens the geometric intuition of linear algebra.

- Implementations in MATLAB and Python. C'mon, in the real world, you never solve math problems by hand! You need to know how to implement math in software!

- Beginner to intermediate topics, including vectors, matrix multiplications, least-squares projections, eigendecomposition, and singular-value decomposition.

- Strong focus on modern, applications-oriented linear algebra and matrix analysis.

- Intuitive visual explanations of diagonalization, eigenvalues and eigenvectors, and singular value decomposition.

- Code (in MATLAB and Python) is provided to help you understand and apply linear algebra concepts on computers.

- A combination of hand-solved exercises and more advanced code challenges. Math is not a spectator sport!
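To give a flavor of the code-based approach described above, here is a minimal sketch (in Python with NumPy; not taken from the book's own code) that numerically verifies the defining eigenvalue equation Av = λv for a small symmetric matrix:

```python
import numpy as np

# A 2x2 symmetric matrix; its eigendecomposition should satisfy
# A @ v = lam * v for each eigenvalue/eigenvector pair.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

evals, evecs = np.linalg.eigh(A)  # eigh: eigendecomposition for symmetric matrices

# Check the eigenvalue equation for each pair (columns of evecs are eigenvectors).
for lam, v in zip(evals, evecs.T):
    assert np.allclose(A @ v, lam * v)

print(evals)  # the eigenvalues of [[2,1],[1,2]] are 1 and 3
```

Checking theoretical identities numerically like this is exactly the kind of exercise the code challenges build on.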

About the Author

Mike X Cohen

Mike is an associate professor of neuroscience at the Donders Institute (Radboud University Medical Centre) in the Netherlands. He has over 20 years of experience teaching scientific coding, data analysis, statistics, and related topics, and has authored several online courses and textbooks. He has a suspiciously dry sense of humor and apparently enjoys making pop-art in his spare time.

Table of Contents

  • 0.1 Front matter
  • 0.2 Dedication
  • 0.3 Foreword
  • 1 Introduction
    • 1.1 What is linear algebra and why learn it?
    • 1.2 About this book
    • 1.3 Prerequisites
    • 1.4 Exercises and code challenges
    • 1.5 Online and other resources
  • 2 Vectors
    • 2.1 Scalars
    • 2.2 Vectors: geometry and algebra
    • 2.3 Transpose operation
    • 2.4 Vector addition and subtraction
    • 2.5 Vector-scalar multiplication
    • 2.6 Exercises
    • 2.7 Answers
    • 2.8 Code challenges
    • 2.9 Code solutions
  • 3 Vector multiplication
    • 3.1 Vector dot product: Algebra
    • 3.2 Dot product properties
    • 3.3 Vector dot product: Geometry
    • 3.4 Algebra and geometry
    • 3.5 Linear weighted combination
    • 3.6 The outer product
    • 3.7 Hadamard multiplication
    • 3.8 Cross product
    • 3.9 Unit vectors
    • 3.10 Exercises
    • 3.11 Answers
    • 3.12 Code challenges
    • 3.13 Code solutions
  • 4 Vector spaces
    • 4.1 Dimensions and fields
    • 4.2 Vector spaces
    • 4.3 Subspaces and ambient spaces
    • 4.4 Subsets
    • 4.5 Span
    • 4.6 Linear independence
    • 4.7 Basis
    • 4.8 Exercises
    • 4.9 Answers
  • 5 Matrices
    • 5.1 Interpretations and uses of matrices
    • 5.2 Matrix terms and notation
    • 5.3 Matrix dimensionalities
    • 5.4 The transpose operation
    • 5.5 Matrix zoology
    • 5.6 Matrix addition and subtraction
    • 5.7 Scalar-matrix mult.
    • 5.8 "Shifting" a matrix
    • 5.9 Diagonal and trace
    • 5.10 Exercises
    • 5.11 Answers
    • 5.12 Code challenges
    • 5.13 Code solutions
  • 6 Matrix multiplication
    • 6.1 "Standard" multiplication
    • 6.2 Multiplication and eqns.
    • 6.3 Multiplication with diagonals
    • 6.4 LIVE EVIL
    • 6.5 Matrix-vector multiplication
    • 6.6 Creating symmetric matrices
    • 6.7 Multiply symmetric matrices
    • 6.8 Hadamard multiplication
    • 6.9 Frobenius dot product
    • 6.10 Matrix norms
    • 6.11 What about matrix division?
    • 6.12 Exercises
    • 6.13 Answers
    • 6.14 Code challenges
    • 6.15 Code solutions
  • 7 Rank
    • 7.1 Six things about matrix rank
    • 7.2 Interpretations of matrix rank
    • 7.3 Computing matrix rank
    • 7.4 Rank and scalar multiplication
    • 7.5 Rank of added matrices
    • 7.6 Rank of multiplied matrices
    • 7.7 Rank of A, A', A'A, and AA'
    • 7.8 Rank of random matrices
    • 7.9 Boosting rank by "shifting"
    • 7.10 Rank difficulties
    • 7.11 Rank and span
    • 7.12 Exercises
    • 7.13 Answers
    • 7.14 Code challenges
    • 7.15 Code solutions
  • 8 Matrix spaces
    • 8.1 Column space of a matrix
    • 8.2 Column space: A and AA'
    • 8.3 Determining whether v is in C(A)
    • 8.4 Row space of a matrix
    • 8.5 Row spaces of A'A and A
    • 8.6 Null space of a matrix
    • 8.7 Geometry of the null space
    • 8.8 Orthogonal subspaces
    • 8.9 Matrix space orthogonalities
    • 8.10 Dimensionalities of matrix spaces
    • 8.11 More on Ax=b and Ay=0
    • 8.12 Exercises
    • 8.13 Answers
    • 8.14 Code challenges
    • 8.15 Code solutions
  • 9 Complex numbers
    • 9.1 Complex numbers
    • 9.2 What are complex numbers?
    • 9.3 The complex conjugate
    • 9.4 Complex arithmetic
    • 9.5 Complex dot product
    • 9.6 Special complex matrices
    • 9.7 Exercises
    • 9.8 Answers
    • 9.9 Code challenges
    • 9.10 Code solutions
  • 10 Systems of equations
    • 10.1 Algebra and geometry of eqns.
    • 10.2 From systems to matrices
    • 10.3 Row reduction
    • 10.4 Gaussian elimination
    • 10.5 Row-reduced echelon form
    • 10.6 Gauss-Jordan elimination
    • 10.7 Possibilities for solutions
    • 10.8 Matrix spaces, row reduction
    • 10.9 Exercises
    • 10.10 Answers
    • 10.11 Coding challenges
    • 10.12 Code solutions
  • 11 Determinant
    • 11.1 Features of determinants
    • 11.2 Determinant of a 2x2 matrix
    • 11.3 The characteristic polynomial
    • 11.4 3x3 matrix determinant
    • 11.5 The full procedure
    • 11.6 Delta of triangles
    • 11.7 Determinant and row reduction
    • 11.8 Delta and scalar multiplication
    • 11.9 Theory vs practice
    • 11.10 Exercises
    • 11.11 Answers
    • 11.12 Code challenges
    • 11.13 Code solutions
  • 12 Matrix inverse
    • 12.1 Concepts and applications
    • 12.2 Inverse of a diagonal matrix
    • 12.3 Inverse of a 2x2 matrix
    • 12.4 The MCA algorithm
    • 12.5 Inverse via row reduction
    • 12.6 Left inverse
    • 12.7 Right inverse
    • 12.8 The pseudoinverse, part 1
    • 12.9 Exercises
    • 12.10 Answers
    • 12.11 Code challenges
    • 12.12 Code solutions
  • 13 Projections
    • 13.1 Projections in R2
    • 13.2 Projections in RN
    • 13.3 Orth and par vect comps
    • 13.4 Orthogonal matrices
    • 13.5 Orthogonalization via GS
    • 13.6 QR decomposition
    • 13.7 Inverse via QR
    • 13.8 Exercises
    • 13.9 Answers
    • 13.10 Code challenges
    • 13.11 Code solutions
  • 14 Least-squares
    • 14.1 Introduction
    • 14.2 5 steps of model-fitting
    • 14.3 Terminology
    • 14.4 Least-squares via left inverse
    • 14.5 Least-squares via projection
    • 14.6 Least-squares via row-reduction
    • 14.7 Predictions and residuals
    • 14.8 Least-squares example
    • 14.9 Code challenges
    • 14.10 Code solutions
  • 15 Eigendecomposition
    • 15.1 Eigenwhatnow?
    • 15.2 Finding eigenvalues
    • 15.3 Finding eigenvectors
    • 15.4 Diagonalization
    • 15.5 Conditions for diagonalization
    • 15.6 Distinct, repeated eigenvalues
    • 15.7 Complex solutions
    • 15.8 Symmetric matrices
    • 15.9 Eigenvalues of singular matrices
    • 15.10 Eigenlayers of a matrix
    • 15.11 Matrix powers and inverse
    • 15.12 Generalized eigendecomposition
    • 15.13 Exercises
    • 15.14 Answers
    • 15.15 Code challenges
    • 15.16 Code solutions
  • 16 The SVD
    • 16.1 Singular value decomposition
    • 16.2 Computing the SVD
    • 16.3 Singular values and eigenvalues
    • 16.4 SVD of a symmetric matrix
    • 16.5 SVD and the four subspaces
    • 16.6 SVD and matrix rank
    • 16.7 SVD spectral theory
    • 16.8 Low-rank approximations
    • 16.9 Normalizing singular values
    • 16.10 Condition number of a matrix
    • 16.11 SVD and the matrix inverse
    • 16.12 MP Pseudoinverse, part 2
    • 16.13 Code challenges
    • 16.14 Code solutions
  • 17 Quadratic form
    • 17.1 Algebraic perspective
    • 17.2 Geometric perspective
    • 17.3 The normalized quadratic form
    • 17.4 Evecs and the qf surface
    • 17.5 Matrix definiteness
    • 17.6 The definiteness of A'A
    • 17.7 Eigenvalues and definiteness
    • 17.8 Code challenges
    • 17.9 Code solutions
  • 18 Covariance matrices
    • 18.1 Correlation
    • 18.2 Variance and standard deviation
    • 18.3 Covariance
    • 18.4 Correlation coefficient
    • 18.5 Covariance matrices
    • 18.6 Correlation to covariance
    • 18.7 Code challenges
    • 18.8 Code solutions
  • 19 PCA
    • 19.1 PCA: interps and apps
    • 19.2 How to perform a PCA
    • 19.3 The algebra of PCA
    • 19.4 Regularization
    • 19.5 Is PCA always the best?
    • 19.6 Code challenges
    • 19.7 Code solutions
  • 20 The end.
    • 20.1 The end... of the beginning!
    • 20.2 Thanks!

The Leanpub 45-day 100% Happiness Guarantee

Within 45 days of purchase you can get a 100% refund on any Leanpub purchase, in two clicks.

See full terms

Do Well. Do Good.

Authors have earned $10,462,200 writing, publishing and selling on Leanpub, earning 80% royalties while saving up to 25 million pounds of CO2 and up to 46,000 trees.

Learn more about writing on Leanpub

Free Updates. DRM Free.

If you buy a Leanpub book, you get free updates for as long as the author updates the book! Many authors use Leanpub to publish their books in-progress, while they are writing them. All readers get free updates, regardless of when they bought the book or how much they paid (including free).

Most Leanpub books are available in PDF (for computers), EPUB (for phones and tablets) and MOBI (for Kindle). The formats that a book includes are shown at the top right corner of this page.

Finally, Leanpub books don't have any DRM copy-protection nonsense, so you can easily read them on any supported device.

Learn more about Leanpub's ebook formats and where to read them

Write and Publish on Leanpub

You can use Leanpub to easily write, publish and sell in-progress and completed ebooks and online courses!

Leanpub is a powerful platform for serious authors, combining a simple, elegant writing and publishing workflow with a store focused on selling in-progress ebooks.

Leanpub is a magical typewriter for authors: just write in plain text, and to publish your ebook, just click a button. (Or, if you are producing your ebook your own way, you can even upload your own PDF, EPUB and/or MOBI files and then publish with one click!) It really is that easy.

Learn more about writing on Leanpub