Libraries for Linear Algebra#
NumPy Linear Algebra#
The NumPy library provides many linear algebra functions that work directly on NumPy arrays, and its arrays support the @ operator for matrix multiplication. Let’s look at some of these.
import numpy as np
We can treat any two-dimensional array as a matrix.
A = np.array([[1, 2, 3],
              [5, 7, 11],
              [13, 17, 19]])
A
array([[ 1,  2,  3],
       [ 5,  7, 11],
       [13, 17, 19]])
Let’s multiply this matrix by a vector:
x = np.array([1, 2, 3])
b = A @ x
b
array([ 14, 52, 104])
We can solve the system \({\bf A}{\bf x} = {\bf b}\) easily too:
x = np.linalg.solve(A, b)
x
array([1., 2., 3.])
We can also find the inverse:
A_inv = np.linalg.inv(A)
A @ A_inv
array([[ 1.00000000e+00, -5.55111512e-17, -2.77555756e-17],
       [ 2.77555756e-17,  1.00000000e+00,  8.32667268e-17],
       [-2.41473508e-15,  3.88578059e-16,  1.00000000e+00]])
We see that we get back nearly the identity matrix when we compute \({\bf A}{\bf A}^{-1}\); the off-diagonal entries are just floating-point roundoff.
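We can confirm this by comparing the product against an identity matrix with np.allclose, which tolerates small roundoff:
# check that A @ A_inv matches the 3x3 identity to within floating-point tolerance
np.allclose(A @ A_inv, np.eye(3))
True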
In fact, we can easily create an identity matrix:
I = np.eye(5)
I
array([[1., 0., 0., 0., 0.],
       [0., 1., 0., 0., 0.],
       [0., 0., 1., 0., 0.],
       [0., 0., 0., 1., 0.],
       [0., 0., 0., 0., 1.]])
BLAS, LINPACK, and LAPACK#
There are several standard linear algebra libraries that were originally implemented in Fortran and C.
BLAS#
The Basic Linear Algebra Subprograms (BLAS) is a low-level library that provides routines for matrix-matrix and matrix-vector multiplication, norms, etc.
Most other linear algebra libraries are built from BLAS.
Three levels of functionality (a sketch using the SciPy BLAS wrappers follows the list):
Level 1: vector operations \((\alpha {\bf x} + {\bf y})\) (usually called an [SD]AXPY operation)
Level 2: matrix-vector operations \((\alpha {\bf A x} + \beta {\bf y})\)
Level 3: matrix-matrix operations \((\alpha {\bf A B} + \beta {\bf C})\)
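As a sketch of what these levels look like as actual routines, SciPy (an assumption here, since this notebook otherwise only uses NumPy) exposes the double-precision BLAS routines directly through scipy.linalg.blas:
# a minimal sketch of the three BLAS levels via SciPy's low-level wrappers;
# daxpy, dgemv, and dgemm are the double-precision Level 1, 2, and 3 routines
from scipy.linalg import blas

alpha, beta = 2.0, 3.0
x1 = np.array([1.0, 2.0, 3.0])
y1 = np.array([4.0, 5.0, 6.0])
M = A.astype(float)    # BLAS routines work on floating-point arrays
N = np.eye(3)
C = np.ones((3, 3))

print(blas.daxpy(x1, y1.copy(), a=alpha))          # Level 1: alpha*x + y (axpy may update y in place, so pass a copy)
print(blas.dgemv(alpha, M, x1, beta=beta, y=y1))   # Level 2: alpha*A@x + beta*y
print(blas.dgemm(alpha, M, N, beta=beta, c=C))     # Level 3: alpha*A@B + beta*C
In practice we rarely call these routines directly; NumPy and LAPACK dispatch to whatever BLAS implementation is installed.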
Many chip vendors (Intel, NVIDIA, etc.) provide custom BLAS libraries that are hand-optimized for their hardware.
LINPACK#
LINPACK is an older linear algebra library, written in Fortran in the 1970s and designed for the computer architectures that dominated at the time.
LAPACK#
LAPACK is a more modern linear algebra library, built on top of BLAS. It provides routines for solving linear systems, finding eigenvalues, … and has optimized versions of the methods for matrices with different symmetries and structures.
Even NumPy uses LAPACK under the hood: the numpy.linalg.solve documentation notes that it uses gesv, the LAPACK general matrix solver (which uses LU decomposition internally).
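To see this layer directly, SciPy (again an assumption, since it is not otherwise used in this notebook) also exposes the LAPACK routines themselves; the sketch below calls dgesv, the double-precision gesv, on the A and b from above:
# a minimal sketch of calling the LAPACK driver directly via SciPy;
# dgesv is the double-precision gesv routine mentioned above
from scipy.linalg import lapack

# gesv works in double precision and expects a 2-D (n, nrhs) right-hand side
lu, piv, xl, info = lapack.dgesv(A.astype(float), b.reshape(-1, 1).astype(float))
xl.ravel(), info   # the solution (matching np.linalg.solve) and a status flag (0 = success)
The returned lu and piv hold the LU factorization that gesv computed along the way.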
Some more modern C++ alternatives are listed here: https://scicomp.stackexchange.com/questions/26395/how-to-start-using-lapack-in-c
For example, the Armadillo library.