History

The program STUDENT, written in 1964 by Daniel Bobrow for his PhD dissertation at MIT, is one of the earliest known attempts at natural language understanding by a computer. Eight years after John McCarthy coined the term artificial intelligence, Bobrow's dissertation (titled Natural Language Input for a Computer Problem Solving System) showed how a computer could understand simple natural language input to solve algebra word problems.

A year later, in 1965, Joseph Weizenbaum at MIT wrote ELIZA, an interactive program that carried on a dialogue in English on any topic, the most popular being psychotherapy. ELIZA worked by simple parsing and substitution of key words into canned phrases, and Weizenbaum sidestepped the problem of giving the program a database of real-world knowledge or a rich lexicon. Yet ELIZA gained surprising popularity as a toy project and can be seen as a very early precursor to current commercial systems such as those used by Ask.com.
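
To make that mechanism concrete, the following is a minimal Python sketch of the keyword-spotting-and-substitution idea: match a keyword pattern, reflect pronouns in the captured fragment, and splice it into a canned response. The rules, templates, and pronoun table here are invented for illustration and are not Weizenbaum's actual DOCTOR script.

    import random
    import re

    # Pronoun reflections so a captured fragment reads naturally when
    # echoed back ("I am X" -> "Why do you say you are X?").
    REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
                   "you": "I", "your": "my"}

    # Illustrative keyword rules: a regex with one capture group, plus
    # canned response templates the captured text is spliced into.
    RULES = [
        (re.compile(r"\bi need (.*)", re.I),
         ["Why do you need {0}?", "Would it really help you to get {0}?"]),
        (re.compile(r"\bi am (.*)", re.I),
         ["Why do you say you are {0}?", "How long have you been {0}?"]),
        (re.compile(r"\bmy (.*)", re.I),
         ["Tell me more about your {0}."]),
    ]

    def reflect(fragment):
        # Swap first- and second-person words in the captured fragment.
        return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

    def respond(sentence):
        for pattern, templates in RULES:
            match = pattern.search(sentence)
            if match:
                return random.choice(templates).format(reflect(match.group(1)))
        # No keyword matched: fall back to a content-free prompt,
        # much as ELIZA did.
        return "Please tell me more."

    print(respond("I am worried about my exams"))
    # e.g. "Why do you say you are worried about your exams?"

Note how little the program "knows": the appearance of understanding comes entirely from surface pattern matching, which is exactly the shortcut Weizenbaum took.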

In 1969, Roger Schank at Stanford University introduced conceptual dependency theory for natural language understanding. This model, partially influenced by the work of Sydney Lamb, was extensively used by Schank's students at Yale University, such as Robert Wilensky, Wendy Lehnert, and Janet Kolodner.

In 1970, William A. Woods introduced the augmented transition network (ATN) to represent natural language input. Instead of phrase structure rules, ATNs used an equivalent set of finite-state automata that were called recursively. ATNs, and their more general form called "generalized ATNs", continued to be used for a number of years.
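
The recursive-call idea can be sketched as a plain recursive transition network, which is the skeleton of an ATN. The toy grammar and lexicon below are invented for illustration, and the sketch deliberately omits the registers, tests, and actions that Woods attached to arcs (the "augmentation" in the name).

    # Each network maps a state to its outgoing arcs. An arc is a triple
    # (kind, label, next_state): "CAT" consumes one word of the given part
    # of speech, while "PUSH" recursively invokes another network first.
    NETWORKS = {
        "S":  {"q0": [("PUSH", "NP", "q1")],
               "q1": [("PUSH", "VP", "q2")],
               "q2": []},
        "NP": {"q0": [("CAT", "DET", "q1"), ("CAT", "NOUN", "q2")],
               "q1": [("CAT", "NOUN", "q2")],
               "q2": []},
        "VP": {"q0": [("CAT", "VERB", "q1")],
               "q1": [("PUSH", "NP", "q2")],
               "q2": []},
    }
    FINAL = {"S": {"q2"}, "NP": {"q2"}, "VP": {"q1", "q2"}}

    LEXICON = {"the": "DET", "a": "DET",
               "arm": "NOUN", "block": "NOUN", "robot": "NOUN",
               "moves": "VERB", "lifts": "VERB"}

    def traverse(net, state, words, pos):
        # Yield every input position reachable from `state` in `net`.
        if state in FINAL[net]:
            yield pos
        for kind, label, nxt in NETWORKS[net][state]:
            if kind == "CAT" and pos < len(words) and LEXICON.get(words[pos]) == label:
                yield from traverse(net, nxt, words, pos + 1)
            elif kind == "PUSH":
                # Call the sub-network, then resume in this one -- the
                # recursion that stands in for phrase structure rules.
                for newpos in traverse(label, "q0", words, pos):
                    yield from traverse(net, nxt, words, newpos)

    def accepts(sentence):
        words = sentence.lower().split()
        return any(p == len(words) for p in traverse("S", "q0", words, 0))

    print(accepts("the robot moves the block"))  # True
    print(accepts("moves the robot"))            # False

Each network here is an ordinary finite-state automaton; the "PUSH" arcs that call other networks (and, transitively, themselves) are what give the formalism the expressive power of a context-free grammar.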

In 1971, Terry Winograd finished writing SHRDLU for his PhD thesis at MIT. SHRDLU could understand simple English sentences about a restricted world of children's blocks and use them to direct a robotic arm to move items. The successful demonstration of SHRDLU provided significant momentum for continued research in the field. Winograd continued to be a major influence in the field with the publication of his book Language as a Cognitive Process. At Stanford, Winograd was later the adviser for Larry Page, who co-founded Google.

In the 1970s and 1980s, the natural language processing group at SRI International continued research and development in the field. A number of commercial efforts based on this research were undertaken; e.g., in 1982 Gary Hendrix formed Symantec Corporation, originally as a company developing a natural language interface for database queries on personal computers. However, with the advent of mouse-driven graphical user interfaces, Symantec changed direction. A number of other commercial efforts were started around the same time, e.g., by Larry R. Harris at the Artificial Intelligence Corporation and by Roger Schank and his students at Cognitive Systems Corp. In 1983, Michael Dyer developed the BORIS system at Yale, which bore similarities to the work of Roger Schank and W. G. Lehnert.
