Polynomials and rational expressions
Anchor (Master): Lang Basic Mathematics Ch. 1.5–1.7; Lang Algebra Ch. IV; Gauss 1799 Demonstratio nova theorematis; Bourbaki Algèbre Ch. IV
Intuition [Beginner]
A polynomial is an expression built from a variable $x$ by taking whole-number powers of $x$, multiplying each by a number, and adding the results together. For example, $3x^2 + 2x - 5$ is a polynomial, and so is $x^7$. Each piece, such as $3x^2$, is called a term. The biggest power of $x$ that appears with a non-zero coefficient is the degree of the polynomial.
A rational expression is a ratio of two polynomials, $\frac{p(x)}{q(x)}$, where the bottom $q(x)$ is not the zero polynomial. So $\frac{x^2 + 1}{x - 3}$ is a rational expression. Polynomials are the analogue of integers in this picture, and rational expressions are the analogue of fractions: you build the second from the first by allowing division.
The pictures matter. Polynomial graphs are smooth curves that rise and fall a finite number of times. Rational-expression graphs can blow up at the points where the bottom vanishes, producing vertical asymptotes that split the graph into separate pieces.
Visual [Beginner]
Picture two side-by-side graphs. On the left, a smooth cubic curve that rises, dips, and rises again with no breaks. On the right, the rational expression $\frac{1}{x - 2}$, which has a vertical dashed line at $x = 2$ where the bottom vanishes. The curve approaches the dashed line on one side, jumps to the other side, and continues without ever crossing it.
The picture records the key contrast. Polynomials are continuous everywhere because adding and multiplying $x$ by itself never creates a hole. Rational expressions can have holes and asymptotes at the points where the denominator equals zero, and reading off those points is the first thing to do when graphing a rational expression.
Worked example [Beginner]
Take $(x + 1)(x^2 + 2x + 3)$ and multiply it out into a single polynomial.
Apply the distributive rule term by term. The product equals $x \cdot (x^2 + 2x + 3) + 1 \cdot (x^2 + 2x + 3)$. That gives $x^3 + 2x^2 + 3x + x^2 + 2x + 3$. The middle terms combine: $2x^2 + x^2 = 3x^2$ and $3x + 2x = 5x$. The final result is $x^3 + 3x^2 + 5x + 3$.
The degree is $3$, the leading coefficient is $1$, and the constant term is $3$. You can check the answer by plugging in $x = 1$: $(1 + 1)(1 + 2 + 3) = 2 \cdot 6 = 12$, and $1 + 3 + 5 + 3 = 12$. Both sides agree.
A second example: divide $x^3 - 8$ by $x - 2$. The numerator factors as a difference of cubes: $x^3 - 8 = (x - 2)(x^2 + 2x + 4)$. So $\frac{x^3 - 8}{x - 2} = x^2 + 2x + 4$ for $x \neq 2$. The cube $x^3 - 8$ vanishes at $x = 2$, and the factor $x - 2$ records exactly that root.
What this tells us: polynomial arithmetic is the same arithmetic you already do with numbers, with the rule that powers of $x$ are kept apart and matching powers combine.
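Multiplying out polynomials by distributing and collecting matching powers is mechanical enough to sketch in a few lines of code; the helper names below are illustrative, not from the source.

```python
# A minimal sketch of polynomial arithmetic with coefficient lists:
# a_0 + a_1*x + ... + a_n*x^n is stored as [a_0, a_1, ..., a_n].

def poly_mul(f, g):
    """Multiply two polynomials by distributing and collecting matching powers."""
    result = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            result[i + j] += a * b   # x^i * x^j contributes to the x^(i+j) term
    return result

def poly_eval(f, x):
    """Evaluate a polynomial via Horner's rule."""
    acc = 0
    for c in reversed(f):
        acc = acc * x + c
    return acc

# (x + 1)(x^2 + 2x + 3) = x^3 + 3x^2 + 5x + 3
product = poly_mul([1, 1], [3, 2, 1])
print(product)  # [3, 5, 3, 1]
# Spot-check at x = 1: both sides give 12
print(poly_eval(product, 1), poly_eval([1, 1], 1) * poly_eval([3, 2, 1], 1))
```

Storing coefficients in ascending order makes the index arithmetic transparent: the coefficient of $x^k$ in the product is the sum of all $a_i b_j$ with $i + j = k$.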
Check your understanding [Beginner]
Formal definition [Intermediate+]
Fix a field $K$ (think $K = \mathbb{Q}$ or $K = \mathbb{R}$ at this tier). A polynomial over $K$ in one variable $x$ is a finite formal sum
$$f(x) = a_n x^n + a_{n-1} x^{n-1} + \cdots + a_1 x + a_0$$
with coefficients $a_0, \dots, a_n \in K$ [Lang — Basic Mathematics Ch. 1.5–1.7]. If $a_n \neq 0$, the polynomial has degree $n$, written $\deg f = n$, and $a_n$ is the leading coefficient. The zero polynomial has no leading coefficient, and by convention $\deg 0 = -\infty$. The set of all polynomials over $K$ is denoted $K[x]$.
Polynomials add term by term: $\sum_i a_i x^i + \sum_i b_i x^i = \sum_i (a_i + b_i)\, x^i$. They multiply by distributing and collecting matching powers: $\bigl(\sum_i a_i x^i\bigr)\bigl(\sum_j b_j x^j\bigr) = \sum_k \bigl(\sum_{i+j=k} a_i b_j\bigr) x^k$. With these operations $K[x]$ becomes a commutative ring with unit (the constant polynomial $1$). For $K$ a field, $K[x]$ has no zero divisors and $\deg(fg) = \deg f + \deg g$ for nonzero $f, g$.
A root of $f \in K[x]$ is an element $r \in K$ with $f(r) = 0$. The factor theorem states that $r$ is a root of $f$ iff $x - r$ divides $f$ in $K[x]$. A nonzero degree-$n$ polynomial has at most $n$ roots in $K$, counted without multiplicity.
A rational expression over $K$ is a formal quotient $f/g$ with $f, g \in K[x]$ and $g \neq 0$, modulo the equivalence $f_1/g_1 \sim f_2/g_2 \iff f_1 g_2 = f_2 g_1$. The set of equivalence classes is the field of rational functions $K(x)$, the fraction field of $K[x]$.
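The cross-multiplication test for equality in $K(x)$ reduces to two polynomial products; a small sketch with coefficient lists (ascending powers; the helper name is illustrative):

```python
# f1/g1 and f2/g2 name the same rational expression iff f1*g2 = f2*g1
# as polynomials. Coefficients stored ascending: [a_0, a_1, ..., a_n].

def poly_mul(f, g):
    out = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b   # x^i * x^j lands in the x^(i+j) slot
    return out

# (x^2 - 1)/(x - 1) and (x + 1)/1 are equivalent: (x^2 - 1)*1 = (x + 1)(x - 1).
f1, g1 = [-1, 0, 1], [-1, 1]
f2, g2 = [1, 1], [1]
print(poly_mul(f1, g2) == poly_mul(f2, g1))  # True
```

Note that the two representatives differ as functions at $x = 1$ (the first is undefined there); the equivalence is an identity of formal quotients, not of pointwise values.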
Counterexamples to common slips
- The degree of a product of nonzero polynomials adds: $\deg(fg) = \deg f + \deg g$. The degree of a sum is at most the maximum of the two degrees, with strict inequality possible. Example: $f = x^2 + x$ and $g = -x^2 + 1$ have $\deg f = \deg g = 2$, but $\deg(f + g) = \deg(x + 1) = 1$ because the leading terms cancel.
- The factor theorem requires the coefficient field to contain the candidate root. The polynomial $x^2 + 1$ has no root in $\mathbb{R}$ but has the roots $\pm i$ in $\mathbb{C}$. Whether a root exists depends on the field.
- A polynomial of degree $n$ has at most $n$ roots counted without multiplicity, not exactly $n$. Over $\mathbb{R}$ the polynomial $x^2 + 1$ has zero roots; over $\mathbb{R}$ the polynomial $x^2 - 2x + 1 = (x - 1)^2$ has one root with multiplicity two. Counting with multiplicity over an algebraically closed field gives exactly $n$.
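The leading-term cancellation behind the degree-of-a-sum slip can be checked mechanically; a small sketch with coefficient lists (helper names illustrative):

```python
# The degree of a sum can drop when leading terms cancel.
# Coefficients stored ascending: [a_0, a_1, ..., a_n].

def degree(f):
    """Largest index with a nonzero coefficient; None for the zero polynomial."""
    for i in range(len(f) - 1, -1, -1):
        if f[i] != 0:
            return i
    return None

def poly_add(f, g):
    n = max(len(f), len(g))
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    return [a + b for a, b in zip(f, g)]

f = [0, 1, 1]    # x^2 + x
g = [1, 0, -1]   # -x^2 + 1
print(degree(f), degree(g))      # 2 2
print(degree(poly_add(f, g)))    # 1  (the x^2 terms cancel, leaving x + 1)
```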
Key theorem with proof [Intermediate+]
Theorem (polynomial division algorithm). Let $K$ be a field and let $f, g \in K[x]$ with $g \neq 0$. There exist unique polynomials $q, r \in K[x]$ such that
$$f = qg + r, \qquad r = 0 \ \text{ or } \ \deg r < \deg g.$$
Proof. Existence by induction on $\deg f$. If $f = 0$ or $\deg f < \deg g$, take $q = 0$ and $r = f$; then $f = 0 \cdot g + f$ with $\deg r < \deg g$ (or $r = 0$). This handles the base of the induction.
Now assume $\deg f \geq \deg g$, write $n = \deg f$ and $m = \deg g$, let $a$ be the leading coefficient of $f$ and $b$ the leading coefficient of $g$. Form the polynomial
$$f_1 = f - \frac{a}{b}\, x^{\,n-m} g.$$
The two leading terms cancel by construction: the term $a x^n$ in $f$ is matched by $\frac{a}{b} x^{\,n-m} \cdot b x^m = a x^n$ in $\frac{a}{b} x^{\,n-m} g$. So $\deg f_1 < \deg f$, and the inductive hypothesis applies to $f_1$ and $g$. It produces $q_1, r$ with $f_1 = q_1 g + r$ and $r = 0$ or $\deg r < \deg g$. Substituting,
$$f = \left(\frac{a}{b}\, x^{\,n-m} + q_1\right) g + r.$$
Set $q = \frac{a}{b} x^{\,n-m} + q_1$. Then $f = qg + r$ with the required degree bound on $r$.
Uniqueness. Suppose $f = q_1 g + r_1 = q_2 g + r_2$ with both remainders satisfying the degree bound. Subtracting, $(q_1 - q_2)\,g = r_2 - r_1$. If $q_1 \neq q_2$, then $\deg\bigl((q_1 - q_2)\,g\bigr) \geq \deg g$, while $\deg(r_2 - r_1) < \deg g$. The two sides cannot be equal, so $q_1 = q_2$, and consequently $r_1 = r_2$.
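The inductive step of the existence proof is an algorithm: repeatedly cancel the leading term of $f$ against the leading term of $g$. A minimal sketch over the rationals, with all helper names invented for the sketch:

```python
from fractions import Fraction

# Division with remainder for polynomials over Q, mirroring the proof:
# each pass subtracts (a/b) x^(n-m) g, cancelling the current leading term.
# Coefficients stored ascending: [a_0, a_1, ..., a_n].

def trim(f):
    """Drop trailing zero coefficients in place; [] is the zero polynomial."""
    while f and f[-1] == 0:
        f.pop()
    return f

def poly_divmod(f, g):
    """Return (q, r) with f = q*g + r and r = 0 or deg r < deg g."""
    f = [Fraction(c) for c in f]
    g = trim([Fraction(c) for c in g])
    if not g:
        raise ZeroDivisionError("division by the zero polynomial")
    q = [Fraction(0)] * max(len(f) - len(g) + 1, 1)
    while trim(f) and len(f) >= len(g):
        shift = len(f) - len(g)             # n - m in the proof
        q[shift] = f[-1] / g[-1]            # the (a/b) x^(n-m) term of the quotient
        for i, c in enumerate(g):
            f[i + shift] -= q[shift] * c    # subtract (a/b) x^(n-m) g; leading terms cancel
    return q, f

# Divide x^3 - 8 by x - 2: quotient x^2 + 2x + 4, remainder 0.
q, r = poly_divmod([-8, 0, 0, 1], [-2, 1])
print(q, r)
```

Working over $\mathbb{Q}$ (via `Fraction`) rather than floats keeps the arithmetic exact, which matters because the termination argument relies on leading terms cancelling exactly.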
Bridge. The polynomial division algorithm builds toward unique factorisation in $K[x]$, in direct analogy with the integer case in 00.01.01: the algorithm makes $K[x]$ a Euclidean domain, every Euclidean domain is a principal ideal domain, and every PID is a unique factorisation domain. So every nonzero polynomial in $K[x]$ factors uniquely into irreducible polynomials up to constants, exactly as every nonzero integer factors uniquely into primes up to signs. The same algorithm supplies the Euclidean algorithm for greatest common divisors in $K[x]$: iterated remainders eventually hit zero, and the last nonzero remainder is a gcd of the original pair. This is the foundation of partial-fraction decomposition, which sits at the centre of integrating rational functions in calculus. The algorithm is also the load-bearing step in the proof of the factor theorem: $r$ is a root of $f$ iff dividing by $x - r$ leaves remainder zero, which by the algorithm is equivalent to $f(r) = 0$.
Putting these together, the polynomial division algorithm is the structural ancestor of unique factorisation in $K[x]$, partial-fraction decomposition over $K(x)$, the gcd-and-Bezout machinery, and the factor theorem itself. The bridge from this elementary algorithm to the fundamental theorem of algebra is that the algorithm reduces every polynomial-arithmetic question to a question about roots, and the fundamental theorem then guarantees the roots exist over $\mathbb{C}$. The foundational insight is that $K[x]$ is to $K(x)$ what $\mathbb{Z}$ is to $\mathbb{Q}$: the ring whose fraction field is the rational-function field $K(x)$, and whose arithmetic is governed by an algorithm that runs in finite steps and produces unique output. This is the same content the rest of algebra reads off — Euclidean domain, PID, UFD, and Galois theory all stand on this base.
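As a concrete instance of the partial-fraction technique the bridge points to, a denominator that splits into distinct linear factors decomposes into constant-over-linear pieces; the following standard example (not from the source) shows the pattern and the integral it unlocks:

```latex
\[
\frac{1}{x^2 - 1}
  = \frac{1}{(x - 1)(x + 1)}
  = \frac{1/2}{x - 1} - \frac{1/2}{x + 1},
\qquad
\int \frac{\mathrm{d}x}{x^2 - 1}
  = \frac{1}{2} \ln\left|\frac{x - 1}{x + 1}\right| + C .
\]
```

The constants $1/2$ and $-1/2$ come from clearing denominators and evaluating at the roots $x = 1$ and $x = -1$ of the denominator, which is exactly where the factor theorem enters.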
Theorem (factor theorem). Let $K$ be a field, $f \in K[x]$, and $r \in K$. Then $r$ is a root of $f$ iff $x - r$ divides $f$ in $K[x]$.
Proof. Apply the division algorithm to $f$ and the linear polynomial $x - r$. It produces $q \in K[x]$ and a remainder $c$ with $f = q \cdot (x - r) + c$ and either $c = 0$ or $\deg c < 1$, that is, $c$ is a constant in $K$. Evaluate at $x = r$: $f(r) = q(r) \cdot 0 + c = c$. So $c = f(r)$. Then $r$ is a root iff $f(r) = 0$ iff $c = 0$ iff $(x - r) \mid f$.
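The remainder-equals-$f(r)$ step of the proof is exactly what synthetic division computes: Horner's rule produces the quotient coefficients and $f(r)$ in one pass. A short sketch (helper name illustrative):

```python
# Synthetic division of f by (x - r): Horner's rule yields the quotient
# coefficients and the remainder, which the factor theorem identifies as f(r).
# Coefficients stored descending here: [a_n, ..., a_1, a_0].

def synthetic_division(coeffs, r):
    """Divide f by (x - r); return (quotient coefficients, remainder f(r))."""
    running = [coeffs[0]]
    for c in coeffs[1:]:
        running.append(running[-1] * r + c)   # Horner accumulation
    return running[:-1], running[-1]

# f = x^3 - 8 at the candidate root r = 2: remainder 0, so (x - 2) divides f.
q, rem = synthetic_division([1, 0, 0, -8], 2)
print(q, rem)  # [1, 2, 4] 0  -> quotient x^2 + 2x + 4
```

At a non-root such as $r = 1$ the same routine returns remainder $f(1) = -7$, illustrating that the remainder is the evaluation.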
Exercises [Intermediate+]
Lean formalization [Intermediate+]
The four named statements compile against Mathlib's Polynomial type. The proofs are deferred — Mathlib supplies the Euclidean-domain instance on Polynomial K for K a field, the factor theorem via Polynomial.dvd_iff_isRoot, the degree bound via Polynomial.card_roots_le_degree, and the fundamental theorem of algebra via the algebraically-closed instance on ℂ. The human reviewer named in the frontmatter signs off on the coverage claim.
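A minimal Lean sketch of the cited statements, following the lemma names mentioned above; the exact signatures are assumptions against current Mathlib and should be checked before relying on them:

```lean
import Mathlib

open Polynomial

variable {K : Type*} [Field K]

-- Factor theorem: (x - r) divides f iff r is a root.
example (f : K[X]) (r : K) : (X - C r) ∣ f ↔ f.IsRoot r :=
  dvd_iff_isRoot

-- Root-count bound: at most deg f roots, counted with multiplicity.
example (f : K[X]) : f.roots.card ≤ f.natDegree :=
  f.card_roots'

-- Fundamental theorem of algebra: every non-constant f over ℂ has a root.
example (f : ℂ[X]) (hf : 0 < f.degree) : ∃ z : ℂ, f.IsRoot z :=
  Complex.exists_root hf
```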
Advanced results [Master]
The structure of $K[x]$ over a field $K$ sits inside a chain of progressively stronger ring-theoretic properties, and the place where the chain begins to break tells the rest of the story.
Euclidean domain, PID, UFD. A ring is a Euclidean domain if it carries a norm function supporting a division-with-remainder procedure. The polynomial division algorithm puts $K[x]$ in this class with the norm $f \mapsto \deg f$. Every Euclidean domain is a principal ideal domain: every ideal is generated by a single element, which is the gcd computed by the Euclidean algorithm. Every principal ideal domain is a unique factorisation domain: every nonzero non-unit factors into irreducibles uniquely up to associates [Lang — Algebra Ch. IV]. The integers $\mathbb{Z}$ and $K[x]$ for $K$ a field are the two canonical examples of this hierarchy, and the analogy between them is exact: primes in $\mathbb{Z}$ correspond to monic irreducible polynomials in $K[x]$, divisibility transfers, and the gcd-and-Bezout machinery works identically.
Where the chain breaks: polynomials in several variables. The ring $K[x_1, \dots, x_n]$ for $n \geq 2$ is still a UFD by Gauss's lemma — if $R$ is a UFD then $R[x]$ is a UFD, applied inductively — but it is not a PID. The ideal $(x, y) \subseteq K[x, y]$ requires two generators because the only common divisors of $x$ and $y$ in $K[x, y]$ are units. The Euclidean structure fails for the same reason: there is no canonical norm that supports division of $f$ by $g$ with smaller-degree remainder. Algebraic geometry takes over from here: ideals in $K[x_1, \dots, x_n]$ correspond to algebraic subsets of $K^n$ via the Nullstellensatz, and the failure of the PID property is the algebraic shadow of the geometric fact that affine $n$-space has subvarieties of every dimension from $0$ to $n$.
Field of rational functions. The field of fractions of $K[x]$ is the rational-function field $K(x)$. Elements are equivalence classes $f/g$ with $g \neq 0$, with arithmetic carried out by the cross-multiplication rules of ordinary fractions. The field $K(x)$ is the prototype of a transcendental extension of $K$ of transcendence degree $1$: the element $x$ satisfies no nonzero polynomial relation over $K$, and adjoining any single transcendental element gives the same field up to isomorphism. The function-field viewpoint reframes $K(x)$ as the field of rational functions on the affine line $\mathbb{A}^1$, and the same equivalence-class construction applied to $K[x_1, \dots, x_n]$ produces $K(x_1, \dots, x_n)$, the function field of $\mathbb{A}^n$.
Algebraic closure. A field $K$ is algebraically closed if every non-constant $f \in K[x]$ has a root in $K$. Every field $K$ embeds in an algebraically closed field $\overline{K}$, and $\overline{K}$ is unique up to (non-canonical) $K$-isomorphism. The construction is the Galois-theoretic analogue of the metric completion in 00.01.01: take the union of all finite algebraic extensions, or equivalently, freely adjoin roots of every monic polynomial. The algebraic closure of $\mathbb{Q}$ is the field $\overline{\mathbb{Q}}$ of algebraic numbers, a countable subfield of $\mathbb{C}$. The algebraic closure of $\mathbb{R}$ is $\mathbb{C}$, by the fundamental theorem of algebra. For finite fields $\mathbb{F}_p$, the algebraic closure is the union $\bigcup_{n \geq 1} \mathbb{F}_{p^n}$.
Fundamental theorem of algebra. Over $\mathbb{C}$, every non-constant polynomial has a root. Equivalently, every $f \in \mathbb{C}[x]$ of degree $n$ factors as $f(x) = c\,(x - r_1) \cdots (x - r_n)$ for unique complex roots counted with multiplicity. Gauss's 1799 doctoral thesis gave the first essentially complete proof of this theorem [Gauss — Demonstratio nova theorematis]; the modern complex-analytic proof uses Liouville's theorem: if $f$ had no roots, $1/f$ would be a bounded entire function on $\mathbb{C}$ and therefore constant, contradicting the assumption that $f$ is non-constant. For real coefficients, complex roots come in conjugate pairs $\alpha, \overline{\alpha}$, so $f$ factors over $\mathbb{R}$ into linear factors and quadratic factors $x^2 + bx + c$, with each quadratic factor irreducible over $\mathbb{R}$ and arising from a conjugate pair with $b = -2\operatorname{Re}\alpha$, $c = |\alpha|^2$.
Galois theory and the unsolvability of the quintic. For $n \leq 4$ the roots of a polynomial of degree $n$ are expressible in radicals — the quadratic, cubic (Cardano-Tartaglia), and quartic (Ferrari) formulas. For $n \geq 5$ no such formula exists: Ruffini 1799 and Abel 1824 proved the absence of a general radical formula for the quintic, and Galois 1832 organised the explanation around the symmetry group of the splitting field of $f$ over $\mathbb{Q}$. A polynomial is solvable in radicals iff its Galois group is solvable; the symmetric group $S_n$ is solvable iff $n \leq 4$, and the generic quintic has Galois group $S_5$. Galois's framework is the symmetry theory of polynomial roots, and the rest of modern algebraic number theory unfolds from it.
Synthesis. The bridge between the elementary polynomial division algorithm and the full structure theory of $K[x]$ is the recognition that the algorithm is the single load-bearing property — once division-with-remainder is in hand, the PID property, the UFD property, the gcd-and-Bezout pair, the factor theorem, and partial-fraction decomposition all follow as direct corollaries of the Euclidean machinery. The case-by-case proof of the division algorithm generalises into the abstract Euclidean-domain axiom, which generalises in turn into the ideal-theoretic definition of a PID, which generalises into the unique-factorisation axiom of a UFD. This is precisely the structural content the algebra strand reads off in 01.01.01: $K[x]$ is the founding example of a one-variable polynomial ring, the analogue of $\mathbb{Z}$ inside $K(x)$, and the foundation of the polynomial-ring theory that drives commutative algebra.
The same machinery extends to $K[x_1, \dots, x_n]$ at the cost of the PID property but with UFD intact, and that single break in the chain is the algebraic source of the geometry of affine $n$-space. The fundamental theorem of algebra and Galois theory then read off from this base: the first says $\mathbb{C}$ is the algebraic closure of $\mathbb{R}$, and the second says the symmetry of the roots controls whether a polynomial is solvable in radicals. Putting these together, the foundational insight is that $K[x]$ is the simplest substantive ring carrying both a division algorithm and a notion of root, and every theorem about polynomial roots — from the factor theorem to Galois solvability — is read off from this single piece of structure. This unit identifies $K[x]$ as the prototype Euclidean domain, identifies the division algorithm as the source of unique factorisation, identifies $\mathbb{C}$ as the algebraic closure of $\mathbb{R}$, and identifies Galois symmetry as the obstruction to radical solvability beyond degree four.
Full proof set [Master]
Proposition (Euclidean algorithm in $K[x]$). For any $f, g \in K[x]$ with $g \neq 0$, iterated division produces a sequence of remainders $r_0 = f,\ r_1 = g,\ r_2, \dots$ with $r_{k+1}$ the remainder of $r_{k-1}$ on division by $r_k$. The degrees of the $r_k$ strictly decrease until some $r_{N+1} = 0$. The last nonzero remainder $r_N$ is a greatest common divisor of $f$ and $g$, and there exist $u, v \in K[x]$ with $uf + vg = r_N$ (Bezout identity).
Proof. The strict decrease follows from the division algorithm. Since degrees are non-negative integers (or $-\infty$ for the zero polynomial), the sequence must reach zero in finitely many steps. The standard inductive argument identifies $r_N$ as a common divisor: from the relation $r_{k-1} = q_k r_k + r_{k+1}$, $r_N$ divides $r_{N-1}$ (the final relation has $r_{N+1} = 0$), and then $r_N$ divides every earlier $r_k$ by descending induction. Conversely, any common divisor of $f$ and $g$ divides every $r_k$, so $r_N$ is a gcd. The Bezout identity follows by back-substitution through the division steps.
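The iterated-remainder-and-back-substitution procedure can be sketched directly; a minimal illustration over the rationals, with all helper names invented for the sketch:

```python
from fractions import Fraction

# Extended Euclidean algorithm for polynomials over Q: track cofactors
# (u, v) alongside the remainders so that u*f + v*g = r holds at every step.
# Coefficients stored ascending: [a_0, a_1, ..., a_n].

def trim(f):
    while f and f[-1] == 0:
        f.pop()
    return f

def divmod_poly(f, g):
    f, g = [Fraction(c) for c in f], trim([Fraction(c) for c in g])
    q = [Fraction(0)] * max(len(f) - len(g) + 1, 1)
    while trim(f) and len(f) >= len(g):
        shift = len(f) - len(g)
        q[shift] = f[-1] / g[-1]
        for i, c in enumerate(g):
            f[i + shift] -= q[shift] * c
    return q, f

def add(f, g):
    n = max(len(f), len(g))
    return [(f[i] if i < len(f) else 0) + (g[i] if i < len(g) else 0)
            for i in range(n)]

def mul(f, g):
    out = [Fraction(0)] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b
    return out

def extended_gcd(f, g):
    """Return (d, u, v) with u*f + v*g = d, the last nonzero remainder."""
    r0, r1 = [Fraction(c) for c in f], [Fraction(c) for c in g]
    u0, u1 = [Fraction(1)], [Fraction(0)]
    v0, v1 = [Fraction(0)], [Fraction(1)]
    while trim(r1[:]):                      # stop once the remainder is zero
        q, r = divmod_poly(r0, r1)
        r0, r1 = r1, r
        u0, u1 = u1, add(u0, [-c for c in mul(q, u1)])
        v0, v1 = v1, add(v0, [-c for c in mul(q, v1)])
    return r0, u0, v0

# gcd(x^2 - 1, x^2 - 3x + 2): both share the factor x - 1.
d, u, v = extended_gcd([-1, 0, 1], [2, -3, 1])
print(d)  # a scalar multiple of x - 1
```

The returned gcd is determined only up to a nonzero constant, matching "a greatest common divisor" in the proposition; normalising to a monic polynomial fixes the choice.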
Proposition ($K[x]$ is a PID). Every ideal $I \subseteq K[x]$ is principal: there exists $d \in K[x]$ with $I = (d)$.
Proof. If $I = \{0\}$, take $d = 0$. Otherwise, let $d$ be a nonzero element of $I$ of minimal degree. For any $f \in I$, divide $f$ by $d$: $f = qd + r$ with $r = 0$ or $\deg r < \deg d$. The element $r = f - qd$ lies in $I$ (since $I$ is closed under sums and multiplication by ring elements), so $r$ is either zero or a nonzero element of $I$ of degree strictly less than that of $d$. The latter contradicts minimality of $\deg d$. So $r = 0$ and $f \in (d)$. The reverse inclusion is automatic from $d \in I$.
Proposition ($K[x]$ is a UFD). Every nonzero $f \in K[x]$ factors as $f = c\, p_1^{e_1} \cdots p_k^{e_k}$ with $c \in K$ a constant, $p_1, \dots, p_k$ distinct monic irreducible polynomials, and $e_1, \dots, e_k$ positive integers. The factorisation is unique up to the order of the factors.
Proof. This follows from PID $\Rightarrow$ UFD in general ring theory. The existence step uses that the degree provides a noetherian descent: any nonzero non-unit polynomial is either irreducible or has a strict divisor of strictly smaller degree, so iterated factoring terminates. The uniqueness step uses that irreducibles in a PID are prime: if $p \mid fg$ with $p$ irreducible, then $p \mid f$ or $p \mid g$, which in turn follows from $\gcd(p, f) = 1$ when $p \nmid f$, hence Bezout produces $up + vf = 1$, so $g = upg + vfg$ and $p \mid g$. Standard induction on the number of irreducible factors then yields the uniqueness statement.
Proposition (fundamental theorem of algebra, Liouville-style proof). Every non-constant polynomial $f \in \mathbb{C}[x]$ has a root in $\mathbb{C}$.
Proof sketch. Suppose for contradiction that $f$ has no root. Then $1/f$ is holomorphic on all of $\mathbb{C}$. As $|z| \to \infty$, $|f(z)| \to \infty$ because the leading term dominates, so $1/f(z) \to 0$. In particular $1/f$ is bounded on $\mathbb{C}$. Liouville's theorem states that every bounded entire function is constant, so $1/f$ is constant, hence $f$ is constant — contradiction. The argument relies on (i) the basic estimate $|f(z)| \geq \tfrac{1}{2}|a_n|\,|z|^n$ for sufficiently large $|z|$ and (ii) Liouville's theorem, which is a direct application of Cauchy's integral formula on a circle of growing radius.
Proposition (real factorisation). Every non-constant $f \in \mathbb{R}[x]$ factors uniquely (up to order) as
$$f(x) = c \prod_{i=1}^{s} (x - r_i) \prod_{j=1}^{t} \left(x^2 + b_j x + c_j\right)$$
with $c \in \mathbb{R}$, real roots $r_i$, and irreducible real quadratic factors with $b_j^2 - 4c_j < 0$.
Proof. Apply the fundamental theorem to $f$ viewed as an element of $\mathbb{C}[x]$, giving a complete factorisation into linear factors over $\mathbb{C}$. Since $f$ has real coefficients, $\overline{f(z)} = f(\overline{z})$, so the non-real roots come in conjugate pairs $\alpha, \overline{\alpha}$. The product $(x - \alpha)(x - \overline{\alpha}) = x^2 - 2\operatorname{Re}(\alpha)\,x + |\alpha|^2$ has real coefficients, with negative discriminant since $\alpha$ is not real. Grouping the conjugate pairs into real quadratic factors and the real roots into real linear factors gives the stated form. Uniqueness follows from the UFD property of $\mathbb{R}[x]$.
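The conjugate-pair grouping step can be spot-checked numerically; the root chosen below is illustrative:

```python
# Grouping a conjugate pair of roots yields a real quadratic factor:
# (x - a)(x - conj(a)) = x^2 - 2*Re(a)*x + |a|^2, with negative discriminant.

a = 1 + 2j                        # a non-real root, chosen for illustration
b = -2 * a.real                   # coefficient of x: -2 Re(a)
c = (a * a.conjugate()).real      # constant term: |a|^2, computed exactly here

quadratic = lambda x: x * x + b * x + c
print(b, c)                       # -2.0 5.0
print(quadratic(a))               # 0j  (a is a root of the real quadratic)
print(b * b - 4 * c < 0)          # True: irreducible over the reals
```

Computing $|a|^2$ as $a\,\overline{a}$ rather than `abs(a) ** 2` avoids the square root and keeps the arithmetic exact for Gaussian-integer-style inputs.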
Connections [Master]
The polynomial ring $K[x]$ is the founding example of a one-variable polynomial ring, and the algebra strand uses it throughout. The Euclidean-domain structure proved here is the foundation for the polynomial-ring theory in 01.05.01 [pending] (polynomial ring), and the unique-factorisation property is the foundation for the divisibility-theoretic results in commutative algebra. The contrast with $K[x_1, \dots, x_n]$ for $n \geq 2$ — still UFD by Gauss's lemma, but no longer a PID — is the algebraic shadow of affine-space geometry and the entry point to the Nullstellensatz in 03.07.01 [pending] (algebraic geometry foundations).
The factor theorem and the at-most-$n$-roots bound are the founding cases of the theory of roots of polynomials over arbitrary fields, and the fundamental theorem of algebra over $\mathbb{C}$ recurs in 06.01.04 (complex analysis: Liouville and the fundamental theorem of algebra) as a corollary of Liouville's theorem on bounded entire functions. The same proof spine — bound $|f|$ from below, invoke a global rigidity statement about holomorphic functions — drives both the elementary statement here and the deeper holomorphic-function machinery downstream.
The field of rational functions $K(x)$ is the fraction field of $K[x]$, and the partial-fraction decomposition over $\mathbb{R}$ is the load-bearing technique for integrating rational functions in calculus, recurring in 02.07.03 [pending] (techniques of integration). The same equivalence-class construction applied to a polynomial ring in several variables produces the function field of affine $n$-space, recurring in 03.07.05 (function fields of varieties), and the broader theory of fraction fields and localisation in 01.06.02 [pending] (commutative algebra: localisation).
The unsolvability of the general quintic and the connection between root symmetry and Galois groups recur in 03.05.01 (Galois theory: solvability and the symmetric group), where the symmetry-theoretic content of polynomial-root extraction is fully developed. The factor theorem and the polynomial division algorithm proved here are the elementary input to the splitting-field construction and the Galois-correspondence theorem.
Historical & philosophical context [Master]
The algebraic content of polynomial arithmetic predates the modern notation by more than a millennium. Diophantus of Alexandria (~250 CE) in his Arithmetica worked with polynomial equations in one and two unknowns and developed systematic methods for finding rational solutions, with notation that is the ancestor of modern algebraic symbolism. al-Khwārizmī in his ~825 CE Kitāb al-Jabr wa-l-Muqābala (literally "the book of balancing and restoring") gave systematic solutions of linear and quadratic equations, named the operations of moving terms across the equals sign, and supplied the root of the word algebra. The general cubic was solved by Scipione del Ferro (c. 1515, unpublished) and rediscovered by Niccolò Tartaglia (1535), with the formula published by Gerolamo Cardano in his 1545 Ars Magna; the same volume contains Lodovico Ferrari's reduction of the quartic to a resolvent cubic [Lang — Basic Mathematics Ch. 1.5–1.7].
The unsolvability of the general quintic occupied two centuries. Paolo Ruffini in his 1799 Teoria generale delle equazioni gave the first proof — initially incomplete, later strengthened — and Niels Henrik Abel in his 1824 Mémoire sur les équations algébriques gave the now-standard proof that no general formula in radicals exists for polynomials of degree five or higher. Évariste Galois, in manuscripts of 1830–1832 collected and published posthumously by Liouville in 1846, organised the entire question around the symmetry group of the roots of a polynomial, giving the modern criterion that a polynomial is solvable in radicals iff its Galois group is solvable. Carl Friedrich Gauss in his 1799 doctoral thesis Demonstratio nova theorematis omnem functionem algebraicam rationalem integram unius variabilis in factores reales primi vel secundi gradus resolvi posse gave the first essentially correct proof of the fundamental theorem of algebra, supplied two further proofs in 1815 and 1816, and a fourth in 1849 [Gauss — Demonstratio nova theorematis]. The modern complex-analytic proof using Joseph Liouville's 1844 theorem on bounded entire functions came later but is the version that appears in every contemporary textbook.