Real Number - History

History

Simple fractions were used by the Egyptians around 1000 BC; the Vedic "Sulba Sutras" ("The rules of chords"), c. 600 BC, include what may be the first 'use' of irrational numbers. The concept of irrationality was implicitly accepted by early Indian mathematicians beginning with Manava (c. 750–690 BC), who were aware that the square roots of certain numbers, such as 2 and 61, could not be exactly determined. Around 500 BC, the Greek mathematicians led by Pythagoras realized the need for irrational numbers, in particular the irrationality of the square root of 2.

The Middle Ages brought the acceptance of zero, negative numbers, integers, and fractional numbers, first by Indian and Chinese mathematicians, and then by Arabic mathematicians, who were also the first to treat irrational numbers as algebraic objects, a step made possible by the development of algebra. Arabic mathematicians merged the concepts of "number" and "magnitude" into a more general idea of real numbers. The Egyptian mathematician Abū Kāmil Shujā ibn Aslam (c. 850–930) was the first to accept irrational numbers as solutions to quadratic equations, or as coefficients in an equation, often in the form of square roots, cube roots, and fourth roots.

In the 16th century, Simon Stevin created the basis for modern decimal notation, and insisted that there is no difference between rational and irrational numbers in this regard.

In the 17th century, Descartes introduced the term "real" to describe roots of a polynomial, distinguishing them from "imaginary" ones.

In the 18th and 19th centuries there was much work on irrational and transcendental numbers. Johann Heinrich Lambert (1761) gave the first, albeit flawed, proof that π cannot be rational; Adrien-Marie Legendre (1794) completed the proof and showed that π is not the square root of a rational number. Paolo Ruffini (1799) and Niels Henrik Abel (1824) both constructed proofs of the Abel–Ruffini theorem: that general polynomial equations of degree five or higher cannot be solved by a formula involving only arithmetical operations and roots.

Évariste Galois (1832) developed techniques for determining whether a given equation could be solved by radicals, which gave rise to the field of Galois theory. Joseph Liouville (1840) showed that neither e nor e² can be a root of an integer quadratic equation, and then established the existence of transcendental numbers; his proof was subsequently superseded by Georg Cantor (1873). Charles Hermite (1873) first proved that e is transcendental, and Ferdinand von Lindemann (1882) showed that π is transcendental. Lindemann's proof was much simplified by Weierstrass (1885), still further by David Hilbert (1893), and was finally made elementary by Adolf Hurwitz and Paul Gordan.

The development of calculus in the 18th century used the entire set of real numbers without having defined them cleanly. The first rigorous definition was given by Georg Cantor in 1871. In 1874 he showed that the set of all real numbers is uncountably infinite, but the set of all algebraic numbers is countably infinite. Contrary to widely held belief, his first method was not his famous diagonal argument, which he published in 1891; see Cantor's first uncountability proof.
