Dynamical System

A dynamical system is a concept in mathematics where a fixed rule describes the time dependence of a point in a geometrical space. Examples include the mathematical models that describe the swinging of a clock pendulum, the flow of water in a pipe, and the number of fish each springtime in a lake.

At any given time a dynamical system has a state given by a set of real numbers (a vector) that can be represented by a point in an appropriate state space (a geometrical manifold). Small changes in the state of the system correspond to small changes in these numbers. The evolution rule of the dynamical system is a fixed rule that describes what future states follow from the current state. The rule is deterministic: for a given time interval, exactly one future state follows from the current state.
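As a minimal sketch of these ideas, consider a discrete-time system whose state is a single real number. The logistic map is a standard illustrative choice (it is not taken from the text above): the state x in [0, 1] might represent a normalized fish population, and the fixed rule x → r·x·(1 − x) deterministically produces the next state from the current one. The parameter value r = 3.5 here is an arbitrary assumption for demonstration.

```python
def evolution_rule(x: float, r: float = 3.5) -> float:
    """Fixed rule: one future state follows from the current state."""
    # Logistic map, a classic example of a deterministic evolution rule.
    return r * x * (1.0 - x)


def trajectory(x0: float, steps: int) -> list[float]:
    """Iterate the rule to produce the orbit of the initial state x0."""
    states = [x0]
    for _ in range(steps):
        states.append(evolution_rule(states[-1]))
    return states


# Determinism: the same initial state always yields the same trajectory.
orbit = trajectory(0.2, 5)
print(orbit)
```

Because the rule is deterministic, running `trajectory(0.2, 5)` twice gives identical orbits; randomness plays no role in the evolution.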

