Amdahl's law, also known as Amdahl's argument, is named after computer architect Gene Amdahl and is used to find the maximum expected improvement to an overall system when only part of the system is improved. It is often used in parallel computing to predict the theoretical maximum speedup from using multiple processors. Amdahl presented it at the AFIPS Spring Joint Computer Conference in 1967.
The speedup of a program using multiple processors in parallel computing is limited by the time needed for the sequential fraction of the program. For example, if a program needs 20 hours on a single processor core, and a particular 1-hour portion cannot be parallelized while the remaining 19 hours (95%) of the work can be, then regardless of how many processors are devoted to the parallelized execution of the program, the minimum execution time cannot be less than that critical 1 hour. Hence the theoretical speedup is limited to at most 20×.
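The arithmetic behind that ceiling can be sketched with the usual statement of the law (the symbols p, for the parallelizable fraction of the work, and N, for the number of processors, are a labeling choice introduced here, not taken from the text above):

    S(N) = 1 / ((1 − p) + p/N)

With p = 0.95, letting N grow without bound gives S → 1 / (1 − 0.95) = 20, the 20× limit described above; the 1 hour of serial work remains the floor on execution time no matter how many processors are used.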