In computer science, a problem is said to have optimal substructure if an optimal solution can be constructed efficiently from optimal solutions of its subproblems. This property is used to determine the usefulness of dynamic programming and greedy algorithms for a problem.
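A classic illustration of this property is the shortest-path problem: every prefix of a shortest path is itself a shortest path to the intermediate node. The following sketch (the graph and names are made up for illustration) checks this with a breadth-first search:

```python
# Illustrative sketch (graph is hypothetical): in an unweighted graph,
# shortest paths have optimal substructure -- every prefix of a shortest
# path from `src` is itself a shortest path to its intermediate endpoint.
from collections import deque

graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": ["e"],
    "e": [],
}

def bfs_dist(src):
    """Shortest-path distances from src via breadth-first search."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

dist = bfs_dist("a")
path = ["a", "b", "d", "e"]  # one shortest a -> e path, length 3
# Every prefix of the path is a shortest path to its own endpoint:
assert all(dist[node] == i for i, node in enumerate(path))
print(dist["e"])  # 3
```

This is exactly why an optimal solution can be assembled from optimal solutions of subproblems: the best route to "e" extends the best route to "d".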
Typically, a greedy algorithm is used to solve a problem with optimal substructure if it can be proved by induction that the greedy choice is optimal at each step (Cormen et al., pp. 381–2). Otherwise, provided the problem also exhibits overlapping subproblems, dynamic programming is used. If no suitable greedy algorithm exists and the problem lacks overlapping subproblems, a lengthy but straightforward search of the solution space is often the best alternative.
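The distinction matters in practice. As a sketch (the denominations here are chosen deliberately so that the greedy choice fails), consider making change with coins {1, 3, 4}: a greedy algorithm is not optimal, but dynamic programming, which exploits optimal substructure together with overlapping subproblems, is:

```python
# Illustrative sketch: making change for `amount` with hypothetical coin
# denominations {1, 3, 4}, chosen so that the greedy strategy fails.

def greedy_change(amount, coins=(4, 3, 1)):
    """Greedy: always take the largest coin that still fits."""
    count = 0
    for c in coins:
        count += amount // c
        amount %= c
    return count

def dp_change(amount, coins=(1, 3, 4)):
    """Dynamic programming: uses optimal substructure -- the optimal
    coin count for `amount` is 1 + the minimum, over coins c, of the
    optimal count for `amount - c`."""
    best = [0] + [float("inf")] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a:
                best[a] = min(best[a], best[a - c] + 1)
    return best[amount]

print(greedy_change(6))  # 3 coins (4 + 1 + 1) -- greedy is suboptimal here
print(dp_change(6))      # 2 coins (3 + 3)
```

The problem has optimal substructure in both cases, but only the inductive proof of greedy-choice optimality (which fails here) would justify the greedy algorithm; dynamic programming needs no such proof.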
In the application of dynamic programming to mathematical optimization, Richard Bellman's Principle of Optimality rests on the idea that, to solve a dynamic optimization problem running from some starting period t to some ending period T, one implicitly has to solve subproblems starting from later dates s, where t &lt; s &lt; T. This is an example of optimal substructure. The Principle of Optimality is used to derive the Bellman equation, which relates the value of the problem starting at t to the value of the problem starting at s.
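A minimal sketch of this idea, assuming a toy deterministic finite-horizon problem (the state space, actions, and reward are invented for illustration): backward induction computes V_t(x) = max_a [ r(x, a) + V_{t+1}(f(x, a)) ], so the value at period t is built from the already-solved values of the later subproblems.

```python
# Illustrative sketch (toy problem, not from the source): backward induction
# on the Bellman equation V_t(x) = max_a [ r(x, a) + V_{t+1}(f(x, a)) ].
# The "state" is a position 0..4 on a line, actions move left/stay/right,
# and the per-period reward is simply the new position.

T = 3                    # horizon: periods 0 .. T-1
states = range(5)
actions = (-1, 0, 1)     # move left, stay, move right

def step(x, a):
    """Deterministic transition f(x, a), clamped to the state space."""
    return min(max(x + a, 0), 4)

# V[T] is the terminal value, zero everywhere.
V = {T: {x: 0.0 for x in states}}
for t in range(T - 1, -1, -1):
    V[t] = {x: max(step(x, a) + V[t + 1][step(x, a)] for a in actions)
            for x in states}

# The value at period t already embeds the optimal values of all later
# subproblems -- the optimal-substructure property the text describes.
print(V[0][0])  # 6.0
```

Each pass of the loop solves the subproblem starting at a later date first, exactly as the Principle of Optimality prescribes.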