01.01.08 · foundations / linear-algebra

Eigenvalue, eigenvector, characteristic polynomial

Status: shipped · 3 tiers · Lean: partial

Anchor (Master): Apostol Calculus Vol. 2 Ch. 4; Axler — Linear Algebra Done Right Ch. 5; Hoffman-Kunze — Linear Algebra Ch. 6; Hörmander — Linear Operators (spectral framing)

Intuition [Beginner]

An eigenvector of a linear map is an arrow the map does not rotate. Apply the map; the arrow comes back pointing along its original line, possibly longer, possibly shorter, possibly flipped to the opposite direction. The number recording how much the arrow has been stretched is the eigenvalue of that eigenvector. A stretch by a factor of 2 is an eigenvalue of 2; a flip and a stretch by 3 is an eigenvalue of −3; a fold flat onto the origin is an eigenvalue of 0.

Most arrows are not eigenvectors. A generic arrow gets rotated by the map and lands pointing somewhere new. Eigenvectors are the special arrows along which the map acts by a single scalar — the rotation is invisible because the arrow has no perpendicular component to rotate into.

A linear map can have several eigenvalues, each one labelling its own family of eigenvectors. Together, the eigenvalues record the stretching directions of the map and the stretch factors along each. They are the simplest pieces of information needed to describe what a linear map does.

Visual [Beginner]

The left panel shows a generic vector v in the plane and the result T v after a linear map T is applied. The two arrows point in different directions: v is not an eigenvector. The right panel shows a special vector e along a specific line; the result T e lies on the same line, only longer. The ratio of lengths is the eigenvalue λ.

On the left, a generic vector v in the plane is rotated to a non-parallel vector T v by the linear map T; on the right, an eigenvector e is sent by T to T e equals lambda times e along the same line, with lambda equal to the stretch factor. Below each panel, the eigenvalue equation T v equals lambda v is shown, together with the characteristic polynomial chi(t) equals det(t I minus T) whose roots are the eigenvalues.

Two pieces of data are encoded here. The line along which the map acts as a pure stretch is the eigendirection. The number recording the stretch factor along that line is the eigenvalue. The whole map decomposes into these directions whenever there are enough of them.

Worked example [Beginner]

Take the two-by-two diagonal matrix

A = [[2, 0], [0, −1]].

Apply A to the vector (1, 0): the result is (2, 0), which is 2 times the input. So (1, 0) is an eigenvector with eigenvalue 2. Apply A to (0, 1): the result is (0, −1), which is −1 times the input. So (0, 1) is an eigenvector with eigenvalue −1.

Read the geometry. The map stretches everything along the horizontal axis by a factor of 2 and flips everything along the vertical axis (a stretch by −1). Every arrow drawn in the plane breaks into a horizontal piece and a vertical piece; the map acts on each piece by its own scalar. A vector (x, y) becomes (2x, −y).
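A quick numerical check of this kind of diagonal action, using diag(2, −1) as the assumed example matrix:

```python
import numpy as np

# Assumed example matrix: stretch by 2 horizontally, flip vertically.
A = np.array([[2.0, 0.0],
              [0.0, -1.0]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# Each axis vector is scaled by its own eigenvalue.
assert np.allclose(A @ e1, 2 * e1)    # eigenvalue 2
assert np.allclose(A @ e2, -1 * e2)   # eigenvalue -1

# A generic vector splits into the two pieces, scaled separately.
v = np.array([3.0, 4.0])
result = A @ v
```

The final line computes (2·3, −1·4) = (6, −4), each component handled by its own stretch factor.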

Now an off-diagonal example. Take

B = [[1, 1], [0, 1]].

Apply B to (1, 0): the result is (1, 0). So (1, 0) is an eigenvector with eigenvalue 1. Try to find a second eigenvector along (0, 1): B applied to (0, 1) is (1, 1), which is not a scalar multiple of (0, 1). The matrix has only one eigenvalue (1) and one eigendirection (the horizontal line), even though it is a two-by-two matrix.

What this tells us: a linear map may have a full set of eigendirections (one per dimension), in which case the map looks like pure stretching in those directions, or it may have fewer, in which case some shearing is involved and the map cannot be reduced to pure stretching.
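The same check for a shear, assuming the standard matrix [[1, 1], [0, 1]] as the off-diagonal example:

```python
import numpy as np

# Assumed shear matrix for this example.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The horizontal axis is fixed: (1, 0) is an eigenvector with eigenvalue 1.
assert np.allclose(S @ np.array([1.0, 0.0]), [1.0, 0.0])

# The vertical direction is sheared, not scaled: (0, 1) lands at (1, 1),
# which is not a scalar multiple of (0, 1).
image = S @ np.array([0.0, 1.0])

# The numerical solver finds the eigenvalue 1 twice, but there is only
# one eigendirection: the matrix is defective.
vals = np.linalg.eig(S)[0]
```

Both computed eigenvalues coincide at 1; no second independent eigendirection exists.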

Check your understanding [Beginner]

Formal definition [Intermediate+]

Let V be a vector space over a field F and let T : V → V be a linear operator on V. A scalar λ ∈ F is an eigenvalue of T if there exists a non-zero vector v ∈ V with

T v = λ v.

The non-zero vector v is then called an eigenvector of T for the eigenvalue λ. The condition v ≠ 0 is part of the definition: without it every scalar would qualify as an eigenvalue because the equation holds at v = 0 for any λ [quantum-well md/Mathematical foundations/Algebra/Linear Algebra and Matrix Theory/Eigenvalues and eigenvectors.md].

For each λ ∈ F the eigenspace of T at λ is the subspace

E_λ = ker(T − λ I) = { v ∈ V : T v = λ v }.

The eigenspace contains the zero vector and every eigenvector for λ. It is a subspace of V because it is the kernel of a linear map 01.01.05. The scalar λ is an eigenvalue of T exactly when E_λ ≠ {0}.

The characteristic polynomial of T, when V is finite-dimensional, is

χ_T(t) = det(t I − T),

where I is the identity operator on V and the determinant 01.01.07 is computed in the polynomial ring F[t]. Concretely, fix a basis of V to represent T by a matrix A; then χ_T(t) = det(t I − A). Choice of basis is harmless: under change of basis A ↦ P A P⁻¹, the multiplicativity of the determinant gives det(t I − P A P⁻¹) = det(P (t I − A) P⁻¹) = det(t I − A), recorded in Exercise 8 of 01.01.07. The polynomial is therefore an invariant of the similarity class of the operator. It is monic of degree n = dim V, with constant term (−1)ⁿ det A and next-to-leading coefficient −tr A.

Eigenvalues are roots of the characteristic polynomial. A scalar λ is an eigenvalue of T if and only if χ_T(λ) = 0. Indeed, λ is an eigenvalue iff ker(T − λ I) ≠ {0}, which by the invertibility corollary in 01.01.07 is equivalent to det(λ I − T) = 0, which is exactly χ_T(λ) = 0.
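The root characterisation can be verified numerically; the symmetric matrix below is an illustrative choice, and NumPy's np.poly returns the characteristic polynomial coefficients of a square array:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic polynomial of A: t^2 - 4t + 3, coefficients [1, -4, 3].
coeffs = np.poly(A)

# Its roots coincide with the eigenvalues computed directly.
roots = np.sort(np.roots(coeffs))
eigs = np.sort(np.linalg.eigvals(A))
assert np.allclose(roots, eigs)   # both give 1 and 3
```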

The algebraic multiplicity of an eigenvalue λ is its multiplicity as a root of χ_T. The geometric multiplicity of λ is dim E_λ = dim ker(T − λ I). Both are positive integers; geometric multiplicity is always at most algebraic multiplicity, with strict inequality possible.

Worked instances. For the diagonal matrix [[2, 0], [0, −1]], the characteristic polynomial is (t − 2)(t + 1) with two distinct simple roots; algebraic multiplicity equals geometric multiplicity (both 1) for each eigenvalue. For the Jordan block [[1, 1], [0, 1]], the characteristic polynomial is (t − 1)²; the single eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1 — the eigenspace is the kernel of [[0, 1], [0, 0]], the span of (1, 0) alone. For the planar rotation by angle θ, [[cos θ, −sin θ], [sin θ, cos θ]], the characteristic polynomial is t² − 2t cos θ + 1, with no real roots when θ ≠ 0, π; over ℂ the roots are e^{±iθ} [textbooks-extra Calculus Vol.2 - Multi-Variable Calculus and Linear Algebra with Applications (Tom Apostol).pdf].
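The multiplicity gap is easy to exhibit numerically for the assumed Jordan block [[1, 1], [0, 1]]:

```python
import numpy as np

J = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # assumed Jordan block at eigenvalue 1

# Algebraic multiplicity: the root 1 appears twice in (t-1)^2 = t^2 - 2t + 1.
coeffs = np.poly(J)
assert np.allclose(coeffs, [1.0, -2.0, 1.0])

# Geometric multiplicity: dim ker(J - I) = n - rank(J - I) = 2 - 1 = 1.
geom = 2 - np.linalg.matrix_rank(J - np.eye(2))
assert geom == 1   # strictly less than the algebraic multiplicity 2
```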

Diagonalisability. The operator T is diagonalisable if V admits a basis consisting of eigenvectors of T. Equivalently, V = E_{λ_1} ⊕ ⋯ ⊕ E_{λ_k} as a direct sum over the eigenvalues. In a basis of eigenvectors, the matrix of T is diagonal with the eigenvalues on the diagonal. Diagonalisability is the condition that the sum of geometric multiplicities equals dim V.

Counterexamples to common slips

  • The condition v ≠ 0 in the eigenvalue definition is load-bearing. Drop it and every scalar becomes an eigenvalue of every operator.
  • An operator on a real vector space may have no eigenvalues in the ground field: planar rotation by an angle θ other than 0 or π has χ(t) = t² − 2t cos θ + 1 with no real roots. The same operator over ℂ has eigenvalues e^{±iθ}.
  • Algebraic multiplicity is a property of the characteristic polynomial; geometric multiplicity is a property of the eigenspace. They can differ. The Jordan block [[1, 1], [0, 1]] has 1 as an eigenvalue of algebraic multiplicity 2 and geometric multiplicity 1.
  • Diagonalisability is not implied by having eigenvalues; it is implied by having enough independent eigenvectors. The Jordan block above has an eigenvalue but is not diagonalisable.
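The rotation slip can be checked directly; the angle 0.7 below is an arbitrary illustrative choice:

```python
import numpy as np

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

vals = np.linalg.eigvals(R)

# No real eigenvalues: both roots have non-zero imaginary part...
assert np.all(np.abs(vals.imag) > 1e-12)

# ...and they are exactly e^{±i theta}, of modulus 1.
expected = np.array([np.exp(1j * theta), np.exp(-1j * theta)])
assert np.allclose(np.sort_complex(vals), np.sort_complex(expected))
```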

Key theorem with proof [Intermediate+]

Theorem (linear independence of eigenvectors for distinct eigenvalues). Let T be a linear operator on a vector space V over F. Let λ_1, …, λ_k be distinct eigenvalues of T and let v_1, …, v_k be corresponding eigenvectors, with v_i ≠ 0 and T v_i = λ_i v_i for each i. Then {v_1, …, v_k} is linearly independent.

[textbooks-extra Calculus Vol.2 - Multi-Variable Calculus and Linear Algebra with Applications (Tom Apostol).pdf]

Proof. Induction on k.

Base case k = 1. A single non-zero vector v_1 with T v_1 = λ_1 v_1 is linearly independent: a relation c v_1 = 0 with v_1 ≠ 0 forces c = 0, since the field has no zero divisors.

Inductive step. Assume the result for any k − 1 distinct eigenvalues. Suppose

c_1 v_1 + ⋯ + c_k v_k = 0

for some scalars c_1, …, c_k. The plan is to apply the operator T − λ_k I to both sides, observe that the v_k term drops out, and use the inductive hypothesis on the resulting relation among v_1, …, v_{k−1}.

Apply T − λ_k I to both sides of the relation, using linearity:

c_1 (T − λ_k I) v_1 + ⋯ + c_k (T − λ_k I) v_k = 0.

For each i, (T − λ_k I) v_i = (λ_i − λ_k) v_i. Substituting,

c_1 (λ_1 − λ_k) v_1 + ⋯ + c_k (λ_k − λ_k) v_k = 0.

The v_k term is c_k (λ_k − λ_k) v_k = 0 and drops out, leaving

c_1 (λ_1 − λ_k) v_1 + ⋯ + c_{k−1} (λ_{k−1} − λ_k) v_{k−1} = 0.

By the inductive hypothesis, {v_1, …, v_{k−1}} is linearly independent, so each coefficient must vanish:

c_i (λ_i − λ_k) = 0 for i = 1, …, k − 1.

The eigenvalues are distinct, so λ_i − λ_k ≠ 0 for i < k; hence c_i = 0 for i = 1, …, k − 1. The original relation collapses to c_k v_k = 0, and v_k ≠ 0 forces c_k = 0 as well. Every coefficient vanishes, so {v_1, …, v_k} is linearly independent.

Corollary (sufficient condition for diagonalisability). Let T be a linear operator on a finite-dimensional vector space V over F with dim V = n. If T has n distinct eigenvalues in F, then T is diagonalisable.

Proof. Let λ_1, …, λ_n be the distinct eigenvalues and pick one eigenvector v_i for each λ_i. By the theorem, {v_1, …, v_n} is a linearly independent set of n vectors in V. Any linearly independent set of n vectors in an n-dimensional space is a basis 01.01.04, so {v_1, …, v_n} is a basis of V consisting of eigenvectors. The matrix of T in this basis is diagonal with λ_i in the i-th diagonal entry.
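A numerical illustration of the corollary, using an assumed upper-triangular matrix with three distinct eigenvalues:

```python
import numpy as np

# Upper triangular, so the eigenvalues 4, 2, 1 sit on the diagonal: all distinct.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 1.0]])

vals, P = np.linalg.eig(A)

# n distinct eigenvalues -> the n eigenvectors are independent -> P invertible.
assert abs(np.linalg.det(P)) > 1e-9

# Conjugating by the eigenvector matrix diagonalises A.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(vals), atol=1e-9)
```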

Bridge. The linear-independence-for-distinct-eigenvalues theorem is the load-bearing lemma from which most of finite-dimensional spectral theory descends.

First synthesis: combined with the fundamental theorem of algebra, it gives a generic diagonalisability statement — over ℂ, the characteristic polynomial of a generic matrix has distinct roots, so a generic operator on a complex finite-dimensional vector space is diagonalisable, with the non-diagonalisable locus a proper algebraic subvariety of the matrix space.

Second synthesis: combined with the determinant identity χ_T(t) = det(t I − T), it lets the entire spectral structure of T be read off from the characteristic polynomial — the eigenvalues are the roots, the algebraic multiplicities are the root multiplicities, and the geometric multiplicities are the dimensions of the kernels of T − λ I at the roots λ.

Third synthesis: the same independence pattern, applied to generalised eigenspaces instead of eigenspaces, yields the primary decomposition V = ⊕_λ ker((T − λ I)^{m_λ}), where m_λ is the algebraic multiplicity; this is the path to the Jordan canonical form, in which every operator over ℂ becomes a direct sum of Jordan blocks, each a scalar plus a single nilpotent shift.

Fourth synthesis: the spectral data carries the dynamics of iteration. The k-th iterate T^k has the same eigenvectors as T and eigenvalues λ^k, so the growth rate of orbits is governed by the largest eigenvalue modulus (the spectral radius), and the long-time behaviour of any linear dynamical system — stability, oscillation, decay — is determined entirely by the location of the eigenvalues in the complex plane.

Exercises [Intermediate+]

Lean formalization [Intermediate+]

Mathlib packages the operator-level eigenvalue machinery as Module.End.HasEigenvalue and Module.End.eigenspace, the characteristic polynomial of a square matrix as Matrix.charpoly, the linear independence of eigenvectors for distinct eigenvalues as Module.End.eigenvectors_linearIndependent, and the Cayley-Hamilton identity as Matrix.aeval_self_charpoly. The companion file Codex.Foundations.LinearAlgebra.Eigen records the statements used above and packages them in the Codex namespace.


This unit is marked lean_status: partial because Mathlib supplies every named ingredient — Module.End.HasEigenvalue, Module.End.eigenspace, Matrix.charpoly, Module.End.eigenvectors_linearIndependent, and Matrix.aeval_self_charpoly — but the unification of the operator-level eigenvalue framework with the Codex axiomatic determinant module, together with a single named Jordan-canonical-form result, is not packaged in Mathlib; the corresponding statements in the companion module are left as sorry-gated aliases.

Advanced results [Master]

Cayley-Hamilton theorem. Every operator satisfies its own characteristic polynomial. Concretely, for T on a finite-dimensional F-vector space with characteristic polynomial χ_T(t) = tⁿ + c_{n−1} t^{n−1} + ⋯ + c_1 t + c_0, the operator χ_T(T) is the zero operator. Equivalently, for a matrix A,

Aⁿ + c_{n−1} A^{n−1} + ⋯ + c_1 A + c_0 I = 0,

where χ_A(t) = det(t I − A). The standard proof routes through the adjugate identity from 01.01.07: the matrix t I − A over F[t] satisfies adj(t I − A) · (t I − A) = χ_A(t) I; expanding the adjugate in powers of t and matching coefficients gives a telescoping identity that collapses to χ_A(A) = 0.
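A direct numerical check of the identity, on an assumed 2-by-2 example:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Characteristic polynomial coefficients: t^2 - 5t - 2 -> [1, -5, -2].
c = np.poly(A)

# Evaluate the polynomial at the matrix itself (matrix powers, not entrywise).
n = A.shape[0]
p_of_A = sum(ci * np.linalg.matrix_power(A, n - i) for i, ci in enumerate(c))

assert np.allclose(p_of_A, np.zeros((n, n)), atol=1e-9)   # χ_A(A) = 0
```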

Minimal polynomial. The set of polynomials p ∈ F[t] with p(T) = 0 is an ideal of F[t]; the unique monic generator m_T is the minimal polynomial of T. By Cayley-Hamilton, m_T divides χ_T. The two polynomials have the same roots (the eigenvalues of T), but the multiplicities may differ: the multiplicity of λ in m_T is the size of the largest Jordan block of T at λ, while the multiplicity in χ_T is the algebraic multiplicity (the sum of the sizes of all Jordan blocks at λ). The operator is diagonalisable iff m_T has distinct roots — that is, iff m_T is square-free.
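A sketch of the divisibility gap on an assumed diagonal example: the characteristic polynomial of diag(2, 2, 3) has degree 3, but a square-free degree-2 polynomial already annihilates the matrix.

```python
import numpy as np

A = np.diag([2.0, 2.0, 3.0])
I = np.eye(3)

# Characteristic polynomial: (t-2)^2 (t-3), degree 3.
# Minimal polynomial: (t-2)(t-3), degree 2 — it already kills A.
m_of_A = (A - 2 * I) @ (A - 3 * I)
assert np.allclose(m_of_A, np.zeros((3, 3)))

# (t-2)(t-3) is square-free, matching the fact that A is diagonal.
```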

Primary decomposition. For an operator T on a finite-dimensional vector space over F, with χ_T(t) = ∏_i (t − λ_i)^{m_i} factoring over an algebraic closure, the generalised eigenspace at λ_i is

G_{λ_i} = ker((T − λ_i I)^{m_i}).

The space decomposes as V = G_{λ_1} ⊕ ⋯ ⊕ G_{λ_k}, with each G_{λ_i} invariant under T and dim G_{λ_i} = m_i. The restriction of T to G_{λ_i} has the single eigenvalue λ_i, and on G_{λ_i} the operator T − λ_i I is nilpotent of order at most m_i.

Jordan canonical form. Over an algebraically closed field K, every operator on a finite-dimensional K-vector space is similar to a direct sum of Jordan blocks

J_s(λ) = λ I_s + N_s, with N_s the s-by-s nilpotent shift,

with s ≥ 1 and λ ranging over the eigenvalues of T. The decomposition refines the primary decomposition: on each generalised eigenspace G_λ, the restriction of T − λ I is a nilpotent operator, and the Jordan block structure records its nilpotent type (the partition of dim G_λ by Jordan-block sizes). The multiset of block sizes at λ is the Segre characteristic of T at λ, and the Jordan-form classification is the conjugacy classification of matrices under GL_n(K) — two matrices in M_n(K) are similar iff they have the same Jordan form up to block reordering [Jordan, C. — Traité des substitutions et des équations algébriques].

Spectral theorem (finite-dimensional). Let V be a finite-dimensional inner-product space over ℂ (Hermitian) or ℝ (Euclidean). A linear operator T is self-adjoint if ⟨T u, v⟩ = ⟨u, T v⟩ for all u, v. The spectral theorem says that every self-adjoint operator has real eigenvalues and an orthonormal basis of V consisting of eigenvectors of T. Equivalently, the matrix of T in an orthonormal basis is diagonalisable by a unitary (resp. orthogonal) change of basis, with real diagonal entries. The companion real spectral theorem says that real symmetric matrices are diagonalisable over ℝ — a statement that, over the complex extension, follows from the Hermitian case.
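A numerical instance of the real spectral theorem, on an assumed symmetric matrix (np.linalg.eigh is NumPy's solver for symmetric/Hermitian input):

```python
import numpy as np

S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # real symmetric

vals, Q = np.linalg.eigh(S)

assert np.all(np.isreal(vals))                  # real eigenvalues
assert np.allclose(Q.T @ Q, np.eye(3))          # orthonormal eigenbasis
assert np.allclose(Q @ np.diag(vals) @ Q.T, S)  # S = Q Λ Qᵀ
```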

Banach- and Hilbert-space generalisation. For a bounded linear operator T on a Banach space X, the spectrum σ(T) is the set of λ ∈ ℂ for which T − λ I fails to be invertible. The spectrum partitions as

  • the point spectrum σ_p(T): eigenvalues in the classical sense (T − λ I is not injective);
  • the continuous spectrum σ_c(T): T − λ I is injective with dense but proper image;
  • the residual spectrum σ_r(T): T − λ I is injective with non-dense image.

The resolvent map λ ↦ (T − λ I)⁻¹ is an operator-valued analytic function on ℂ ∖ σ(T), and the spectrum is a non-empty compact subset of the closed disc of radius ‖T‖. For a self-adjoint operator on a Hilbert space, the spectrum is real, and the spectral theorem of Stone and von Neumann associates a projection-valued spectral measure on ℝ with T [von Neumann, J. — Allgemeine Eigenwerttheorie Hermitescher Funktionaloperatoren]; the finite-dimensional spectral decomposition over eigenprojections is the discrete-measure case.

Spectral radius and dynamics. The spectral radius of T is ρ(T) = max{|λ| : λ ∈ σ(T)}, the largest absolute value of a spectral point. Gelfand's formula gives ρ(T) = lim_{k→∞} ‖T^k‖^{1/k} in the Banach-space setting. The spectral radius controls the long-time behaviour of the discrete dynamical system x_{k+1} = T x_k: orbits decay to zero when ρ(T) < 1, are bounded when ρ(T) = 1 with no unit-circle eigenvalue carrying a Jordan block of size greater than 1, and grow exponentially when ρ(T) > 1. The same dichotomy in continuous time governs the stability of an equilibrium of x′ = A x via the location of eigenvalues relative to the imaginary axis.
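Gelfand's formula and the decay dichotomy can be sketched numerically; the matrix below is an assumed example with spectral radius 0.9:

```python
import numpy as np

A = np.array([[0.9, 0.5],
              [0.0, 0.8]])

# Spectral radius: largest eigenvalue modulus (here 0.9, off the diagonal shear
# does not change the eigenvalues of a triangular matrix).
rho = max(abs(np.linalg.eigvals(A)))

# Gelfand: ||A^k||^(1/k) approaches rho as k grows.
k = 200
gelfand = np.linalg.norm(np.linalg.matrix_power(A, k), 2) ** (1.0 / k)
assert abs(gelfand - rho) < 1e-2

# rho < 1: orbits x_{k+1} = A x_k decay to zero.
x = np.array([1.0, 1.0])
for _ in range(500):
    x = A @ x
assert np.linalg.norm(x) < 1e-10
```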

Frobenius eigenvalues in arithmetic. The eigenvalues of geometric Frobenius acting on ℓ-adic étale cohomology of a smooth projective variety over a finite field are algebraic integers whose absolute values, under every complex embedding, are integer powers of q^{1/2} (the Weil conjectures, proved by Deligne in 1974). This is the arithmetic incarnation of the eigenvalue concept: the spectral data of a single linear operator (Frobenius on cohomology) encodes the number of points of the variety over every finite extension of the base field, via the Lefschetz fixed-point formula. The same package — eigenvalues of a canonical operator on a cohomology space — recurs in representation theory (characters of Hecke operators), in number theory (Galois representations), and in mathematical physics (energy spectrum of a quantum Hamiltonian).

Synthesis. Several threads weave together.

First synthesis: eigenvalues as the principal invariants of a linear operator. The entire similarity class of T is recorded by its Jordan form, hence by the multiset of eigenvalues with their Segre characteristics; every coordinate-free polynomial in the entries of a matrix — the trace, the determinant, the characteristic polynomial coefficients — is a symmetric function of the eigenvalues.

Second synthesis: the characteristic polynomial as the bridge between linear algebra and algebra. χ_T is a degree-n polynomial whose roots carry the spectrum, but the polynomial itself lives in F[t] and is amenable to factorisation, divisibility, and minimal-polynomial bookkeeping — so the spectral structure of T is encoded entirely in the ideal {p ∈ F[t] : p(T) = 0} and the F[t]-module structure of V (the elementary divisors).

Third synthesis: self-adjointness as a guarantee of diagonalisability. A self-adjoint operator on an inner-product space carries enough orthogonality data that the spectral theorem produces an orthonormal eigenbasis; this is the linear-algebra origin of the unitary representation theory of compact groups, the spectral decomposition of compact self-adjoint operators on Hilbert space, and the eigenfunction expansions of the Laplacian on a compact Riemannian manifold.

Fourth synthesis: spectrum as dynamical signature. The location of the spectrum of an evolution operator in the complex plane determines the qualitative dynamics — exponential decay, oscillation, exponential growth — and is the linear-algebra core of stability theory for differential equations and discrete-time recurrences alike.

Full proof set [Master]

Cayley-Hamilton via the adjugate identity. Work over a commutative ring R and let A ∈ M_n(R). The adjugate identity from 01.01.07 gives

adj(t I − A) · (t I − A) = det(t I − A) I = χ_A(t) I

as a matrix identity in M_n(R[t]). The entries of adj(t I − A) are polynomials of degree at most n − 1 in t; write

adj(t I − A) = B_{n−1} t^{n−1} + ⋯ + B_1 t + B_0, with B_j ∈ M_n(R).

Write χ_A(t) = tⁿ + c_{n−1} t^{n−1} + ⋯ + c_1 t + c_0. Expanding the left side of the adjugate identity and matching coefficients of t^j for each j gives the system

B_{n−1} = I,   B_{j−1} − B_j A = c_j I (j = 1, …, n − 1),   −B_0 A = c_0 I.

Multiply the t^j identity on the right by A^j and sum from j = 0 to n (with the t^n identity B_{n−1} = I contributing B_{n−1} Aⁿ). The left side telescopes:

B_{n−1} Aⁿ + Σ_{j=1}^{n−1} (B_{j−1} − B_j A) A^j − B_0 A = 0,

since every product B_j A^{j+1} appears once with each sign. The right side is c_0 I + c_1 A + ⋯ + c_{n−1} A^{n−1} + Aⁿ = χ_A(A). Equating, χ_A(A) = 0.

Minimal polynomial divides characteristic. By Cayley-Hamilton, χ_T(T) = 0, so χ_T belongs to the ideal {p ∈ F[t] : p(T) = 0}. The minimal polynomial m_T is the monic generator of this ideal, hence divides χ_T. Equality holds iff the F[t]-module V (the structure-theorem decomposition) has a single invariant factor — iff V is a cyclic F[t]-module, iff T has a cyclic vector.

Same roots. Every root of m_T is a root of χ_T because m_T divides χ_T. Conversely, every eigenvalue λ of T satisfies T v = λ v for some non-zero v, so 0 = m_T(T) v = m_T(λ) v; since v ≠ 0, m_T(λ) = 0. The two polynomials have the same set of roots; the multiplicities differ as recorded.

Diagonalisability iff minimal polynomial has distinct roots. Suppose m_T(t) = (t − λ_1)⋯(t − λ_k) is square-free with distinct roots λ_1, …, λ_k. The polynomials p_i(t) = ∏_{j ≠ i} (t − λ_j)/(λ_i − λ_j) satisfy p_1 + ⋯ + p_k = 1 (Lagrange interpolation at the λ_i with constant value 1), and (t − λ_i) p_i(t) is divisible by m_T(t), so (T − λ_i I) p_i(T) = 0. Setting P_i = p_i(T) gives P_1 + ⋯ + P_k = I and (T − λ_i I) P_i = 0. Each P_i is a projection with image inside the eigenspace E_{λ_i}, and V = im P_1 + ⋯ + im P_k ⊆ E_{λ_1} + ⋯ + E_{λ_k}, so T is diagonalisable. Conversely, if T is diagonalisable with eigenvalues λ_1, …, λ_k, then q(t) = (t − λ_1)⋯(t − λ_k) satisfies q(T) = 0 on each eigenspace, hence on V, so m_T divides q; a divisor of a square-free polynomial is square-free, so m_T is square-free.

Primary decomposition. Let χ_T(t) = (t − λ_1)^{m_1} ⋯ (t − λ_k)^{m_k} over an algebraic closure of F (the eigenvalues λ_i distinct). The factors (t − λ_i)^{m_i} are pairwise coprime. By the structure theorem for finitely generated modules over a PID, applied to V as a module over the polynomial ring with t acting as T,

V = ker((T − λ_1 I)^{m_1}) ⊕ ⋯ ⊕ ker((T − λ_k I)^{m_k}) = G_{λ_1} ⊕ ⋯ ⊕ G_{λ_k},

with each G_{λ_i} invariant under T. Restricting to G_{λ_i}, the operator T − λ_i I is nilpotent because (T − λ_i I)^{m_i} annihilates G_{λ_i}. Hence on G_{λ_i}, T = λ_i I + N_i with N_i nilpotent.

Jordan canonical form. On a generalised eigenspace G_λ the operator N = T − λ I is nilpotent. The structure theorem for nilpotent operators on a finite-dimensional vector space gives a basis in which N takes the block-diagonal form diag(N_{s_1}, …, N_{s_r}) with N_s the standard nilpotent shift (1 on the superdiagonal, 0 elsewhere). Adding λ I to each block gives T on G_λ in the form diag(J_{s_1}(λ), …, J_{s_r}(λ)), where J_s(λ) = λ I_s + N_s is the standard Jordan block. Assembling over the eigenvalues gives the full Jordan canonical form. The multiset of block sizes at each λ — the Segre characteristic — is uniquely determined by T; two operators are similar iff they have the same Jordan form up to permutation of blocks. The construction of the basis of G_λ that exhibits N in block form is the Jordan chain construction: pick a vector v of maximal nilpotency order s, take the chain N^{s−1} v, …, N v, v, and iterate on a complementary invariant subspace.

Spectral theorem for self-adjoint operators (finite-dimensional). Let V be a finite-dimensional inner-product space over ℝ or ℂ and let T be self-adjoint, ⟨T u, v⟩ = ⟨u, T v⟩ for all u, v. The proof has three parts.

Eigenvalues are real. Let λ be an eigenvalue with eigenvector v ≠ 0. Compute λ ⟨v, v⟩ = ⟨T v, v⟩ = ⟨v, T v⟩ = λ̄ ⟨v, v⟩. Since ⟨v, v⟩ ≠ 0, λ = λ̄, so λ ∈ ℝ.

Eigenvectors for distinct eigenvalues are orthogonal. Let T u = μ u and T v = ν v with μ ≠ ν (both real by the first part). Compute μ ⟨u, v⟩ = ⟨T u, v⟩ = ⟨u, T v⟩ = ν ⟨u, v⟩. Hence (μ − ν) ⟨u, v⟩ = 0, and μ ≠ ν forces ⟨u, v⟩ = 0.

An orthonormal eigenbasis exists. Induct on n = dim V. The base case n = 1 is immediate. Inductive step: over ℂ, the fundamental theorem of algebra gives a root of χ_T, hence an eigenvalue λ with a unit-norm eigenvector e. Over ℝ, the complexification of T on V ⊗ ℂ is self-adjoint with respect to the Hermitian extension of the inner product, so it has a real eigenvalue by the complex case, and a real eigenvector by the conjugation symmetry of V ⊗ ℂ. In either case, set W = e^⊥, the orthogonal complement. The restriction is well-defined: for w ∈ W, ⟨T w, e⟩ = ⟨w, T e⟩ = λ ⟨w, e⟩ = 0, so T w ∈ W. The restriction T|_W is self-adjoint on W (the inner product on W is the restriction), and dim W = n − 1. The inductive hypothesis gives an orthonormal eigenbasis of W; together with e, this is an orthonormal eigenbasis of V.

Linear independence of eigenvectors, revisited functorially. The Intermediate-tier inductive proof has a coordinate-free packaging via the primary decomposition: distinct eigenvalues λ_i give pairwise coprime polynomials t − λ_i, so the corresponding eigenspaces sit inside pairwise distinct primary components of the F[t]-module V, hence their sum is direct. The two proofs reflect the same arithmetic of F[t] — pairwise coprimality of the t − λ_i — at different levels of abstraction.

Connections [Master]

  • Determinant: axiomatic + expansion + properties 01.01.07 — supplies the determinant identity that defines the characteristic polynomial of an operator, and the multiplicativity identity that proves the characteristic polynomial is a similarity invariant. The adjugate identity from 01.01.07 is the engine of the standard Cayley-Hamilton proof in this unit; without the determinant unit's machinery, neither the characteristic polynomial nor Cayley-Hamilton has a workable definition or proof.

  • Linear transformation: kernel, image, rank-nullity 01.01.05 — supplies the eigenspace as a kernel, E_λ = ker(T − λ I), and the dimension count that links algebraic and geometric multiplicity. The equivalence "λ is an eigenvalue iff T − λ I is not injective" is the rank-nullity equivalence "injective iff kernel is zero" combined with the determinant-based root characterisation of 01.01.07; the present unit is where the two prerequisite threads converge into a single spectral invariant.

  • Inner-product space: orthogonality, Gram-Schmidt, spectral theorem 01.01.09 pending — refines the diagonalisability content of this unit for self-adjoint operators on a finite-dimensional inner-product space, producing an orthonormal eigenbasis with real eigenvalues. The spectral theorem is the inner-product-space sharpening of the diagonalisability corollary of this unit, and the load-bearing case of the finite-dim spectral theory that generalises to the unbounded self-adjoint operators of functional analysis.

  • Unbounded self-adjoint operators on Hilbert space 02.11.03 — the infinite-dimensional generalisation in which the spectrum of a self-adjoint operator becomes a closed subset of ℝ, decomposed into point, continuous, and (in the non-self-adjoint generalisation) residual spectrum, and the spectral measure replaces the discrete eigenprojection sum. The Stone-von Neumann spectral theorem in 02.11.03 is the functional-analytic incarnation of the finite-dim spectral theorem.

  • Linear ODE systems with constant coefficients 02.06.04 pending — the solutions of x′ = A x for A ∈ M_n(ℂ) are governed by the eigenvalues and Jordan structure of A: distinct eigenvalues give exponential modes e^{λt}, and a Jordan block of size s at λ adds polynomial-in-t modes t^j e^{λt} for j = 1, …, s − 1. Stability of the equilibrium is governed by the location of the eigenvalues relative to the imaginary axis. The eigenvalue concept of this unit is the linear-algebra core of constant-coefficient ODE theory.

  • Discrete and continuous dynamical systems 05.01.01 — for a discrete dynamical system x_{k+1} = f(x_k) or a continuous system x′ = f(x) near an equilibrium, the eigenvalues of the linearisation classify the local dynamics: stable equilibria correspond to all eigenvalues having |λ| < 1 (discrete) or Re λ < 0 (continuous), unstable equilibria correspond to at least one eigenvalue outside that region, and bifurcations occur precisely when an eigenvalue crosses the boundary. The eigenvalue concept of this unit is the linear-algebra core of stability theory and bifurcation analysis.

Historical & philosophical context [Master]

The eigenvalue concept arose in mechanics. Joseph-Louis Lagrange, in Mécanique analytique (Paris, 1788), found that the small oscillations of a coupled mechanical system about an equilibrium decompose into independent harmonic modes governed by the roots of an auxiliary algebraic equation — what would now be the characteristic polynomial of the linearised motion. Pierre-Simon Laplace, in Mécanique céleste (Paris, 1799–1825, Vol. I, Book II), applied the same technique to the secular variations of planetary orbits, with the roots of the équation séculaire governing the long-term stability of the solar system. Augustin-Louis Cauchy's 1829 paper in his own Exercices de Mathématiques gave the first systematic treatment, proving in particular that the characteristic equation of a real symmetric matrix has only real roots — the prototype of the spectral theorem [Cauchy, A.-L. — Sur l'équation à l'aide de laquelle on détermine les inégalités séculaires des mouvements des planètes].

Arthur Cayley introduced matrix notation in A memoir on the theory of matrices (Philosophical Transactions of the Royal Society 148, 1858, 17–37), and stated what is now the Cayley-Hamilton theorem there, with a proof for the 2-by-2 case, a verification for the 3-by-3 case, and the remark that the general case would be left to the reader [Cayley, A. — A memoir on the theory of matrices]. Ferdinand Georg Frobenius proved the general case in Über lineare Substitutionen und bilineare Formen (J. reine angew. Math. 84, 1878, 1–63), and developed the systematic invariant theory of the characteristic polynomial together with the elementary divisor theorem [Frobenius, F. G. — Über lineare Substitutionen und bilineare Formen]. Camille Jordan's Traité des substitutions et des équations algébriques (Gauthier-Villars, Paris, 1870) gave the canonical-form classification that bears his name, in the algebraic-group-theory context of substitutions of variables in polynomial equations [Jordan, C. — Traité des substitutions et des équations algébriques].

The infinite-dimensional extension is the work of David Hilbert, who in Grundzüge einer allgemeinen Theorie der linearen Integralgleichungen (Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen, six papers from 1904 to 1910) developed the spectral theory of self-adjoint integral operators with continuous kernel and introduced the term Spektrum. The modern Hilbert-space formulation, with the spectral measure of an unbounded self-adjoint operator, is John von Neumann's Allgemeine Eigenwerttheorie Hermitescher Funktionaloperatoren (Mathematische Annalen 102, 1929, 49–131), written in support of the mathematical foundations of quantum mechanics: physical observables correspond to self-adjoint operators on a Hilbert space, and their possible measured values are the points of the spectrum [von Neumann, J. — Allgemeine Eigenwerttheorie Hermitescher Funktionaloperatoren]. The English vocabulary eigenvalue and eigenvector are direct loans from von Neumann's German Eigenwert and Eigenvektor, themselves shortenings of Eigenschaftswert — characteristic value.

Bibliography [Master]

  • Lagrange, J.-L., Mécanique analytique, Veuve Desaint, Paris, 1788.
  • Laplace, P.-S., Traité de Mécanique céleste, Crapelet (later Courcier), Paris, 1799–1825, Vol. I, Book II.
  • Cauchy, A.-L., "Sur l'équation à l'aide de laquelle on détermine les inégalités séculaires des mouvements des planètes", Exercices de Mathématiques 4 (1829), 140–160.
  • Jordan, C., Traité des substitutions et des équations algébriques, Gauthier-Villars, Paris, 1870.
  • Cayley, A., "A memoir on the theory of matrices", Philosophical Transactions of the Royal Society of London 148 (1858), 17–37.
  • Frobenius, F. G., "Über lineare Substitutionen und bilineare Formen", Journal für die reine und angewandte Mathematik 84 (1878), 1–63.
  • Weierstrass, K., "Zur Theorie der bilinearen und quadratischen Formen", Monatsberichte der Königlich Preussischen Akademie der Wissenschaften zu Berlin, 1868, 311–338.
  • Hilbert, D., Grundzüge einer allgemeinen Theorie der linearen Integralgleichungen, Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen, six papers, 1904–1910; collected as a single Teubner monograph, Leipzig, 1912.
  • von Neumann, J., "Allgemeine Eigenwerttheorie Hermitescher Funktionaloperatoren", Mathematische Annalen 102 (1929), 49–131.
  • von Neumann, J., Mathematische Grundlagen der Quantenmechanik, Springer, Berlin, 1932.
  • Apostol, T. M., Calculus, Vol. 2: Multi-Variable Calculus and Linear Algebra with Applications, 2nd ed., John Wiley & Sons, 1969, Ch. 4.
  • Hoffman, K. & Kunze, R., Linear Algebra, 2nd ed., Prentice-Hall, 1971, Ch. 6.
  • Axler, S., Linear Algebra Done Right, 3rd ed., Springer, 2015, Ch. 5.
  • Deligne, P., "La conjecture de Weil. I", Publications Mathématiques de l'IHÉS 43 (1974), 273–307.

Autonomous production unit. Successor to determinant; load-bearing for the inner-product spectral theorem, the Jordan canonical form, the Cayley-Hamilton identity, the linear-ODE-system theory of 02.06.04 pending, the unbounded self-adjoint operators of 02.11.03, and the stability theory of dynamical systems.