Quadratic Irrational - Square Root of Non-square Is Irrational


The definition of quadratic irrationals requires them to satisfy two conditions: they must satisfy a quadratic equation and they must be irrational. The solutions to the quadratic equation ax² + bx + c = 0 are

x = (−b ± √(b² − 4ac)) / (2a).

Thus quadratic irrationals are precisely those numbers in this form that are not rational. Since b and 2a are both integers, asking when the above quantity is irrational is the same as asking when the square root of an integer is irrational. The answer to this is that the square root of any natural number that is not a perfect square is irrational.
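This criterion is easy to check computationally. The following sketch (function name and choice of `math.isqrt` are mine, not from the source) decides whether √n is irrational by testing whether n is a perfect square:

```python
import math

def sqrt_is_irrational(n: int) -> bool:
    """Return True if sqrt(n) is irrational, i.e. if the natural
    number n is not a perfect square."""
    if n < 0:
        raise ValueError("n must be a natural number")
    r = math.isqrt(n)   # integer square root: floor(sqrt(n))
    return r * r != n

# sqrt of 2, 3, 5, 6, ... is irrational; sqrt of 0, 1, 4, 9, ... is not.
```

Using exact integer arithmetic (`math.isqrt`) rather than floating-point `math.sqrt` avoids rounding errors for large n.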

The square root of 2 was the first such number to be proved irrational. Theodorus of Cyrene proved the irrationality of the square roots of the non-square whole numbers up to 17, but stopped there, probably because the algebra he used could not be applied to the square roots of numbers greater than 17. Euclid's Elements Book 10 is dedicated to the classification of irrational magnitudes. The original proof of the irrationality of the square roots of non-square natural numbers depends on Euclid's lemma.

Many proofs of the irrationality of the square roots of non-square natural numbers implicitly assume the fundamental theorem of arithmetic, which was first proven by Carl Friedrich Gauss in his Disquisitiones Arithmeticae. This asserts that every integer has a unique factorization into primes. For any rational non-integer in lowest terms there must be a prime in the denominator which does not divide into the numerator. When the numerator is squared that prime will still not divide into it because of the unique factorization. Therefore the square of a rational non-integer is always a non-integer; by contrapositive, the square root of an integer is always either another integer, or irrational.
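The key step of this argument — that squaring a rational non-integer in lowest terms never produces an integer — can be checked directly with Python's exact rational arithmetic. This is a sketch of mine, not part of the source proof:

```python
from fractions import Fraction

def square_is_integer(p: int, q: int) -> bool:
    """Check whether (p/q)^2 is an integer. By unique factorization,
    a prime left in the reduced denominator of p/q survives squaring,
    so this is True only when p/q itself reduces to an integer."""
    r = Fraction(p, q)            # automatically reduced to lowest terms
    return (r * r).denominator == 1

# (3/2)^2 = 9/4 is not an integer; (6/3)^2 = 4 is.
```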

Euclid used a restricted version of the fundamental theorem and some careful argument to prove the theorem. His proof is in Euclid's Elements Book X Proposition 9.

The fundamental theorem of arithmetic is not actually required to prove the result, however. There are self-contained proofs by Richard Dedekind, among others. The following proof was adapted by Colin Richard Hughes from a proof of the irrationality of the square root of two found by Theodor Estermann in 1975.

Assume D is a non-square natural number; then there is a natural number n such that

n² < D < (n + 1)²,

so in particular

0 < √D − n < 1.
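These two inequalities are easy to verify numerically; in this sketch the choice D = 19 is arbitrary, and n is computed as the integer square root of D:

```python
import math

D = 19                        # any non-square natural number
n = math.isqrt(D)             # floor of sqrt(D), so n^2 < D < (n + 1)^2
assert n * n < D < (n + 1) ** 2

frac = math.sqrt(D) - n       # the fractional part of sqrt(D)
assert 0 < frac < 1
```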

Now assume √D is a rational number p/q, and take q to be the smallest positive integer for which this holds, hence the smallest positive integer for which q√D is an integer. Then:

(√D − n)q√D = qD − nq√D

is also an integer. But 0 < √D − n < 1, so (√D − n)q < q. Moreover (√D − n)q = q√D − nq is itself a positive integer, since q√D is an integer. Hence (√D − n)q is a positive integer smaller than q such that (√D − n)q√D is also an integer. This is a contradiction, since q was defined to be the smallest number with this property; hence √D cannot be rational.
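The contradiction is an infinite descent: multiplying a candidate q by the factor √D − n, which lies strictly between 0 and 1, would keep producing smaller and smaller "integers" with the same property, eventually dropping below 1 — impossible for positive integers. The following float sketch (D = 7 is an arbitrary choice of mine) illustrates that descent numerically:

```python
import math

D = 7                     # any non-square natural number
n = math.isqrt(D)         # floor of sqrt(D)
s = math.sqrt(D)
assert 0 < s - n < 1      # the descent factor lies strictly in (0, 1)

# If a smallest integer q with q*sqrt(D) integral existed, each step
# q -> q*(sqrt(D) - n) would yield a smaller one; iterating pushes
# the value below 1, which no positive integer can satisfy.
q = 1.0
for _ in range(10):
    q *= s - n
assert 0 < q < 1
```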
