
Amdahl's law

By Pat Brans

What is Amdahl's law?

Amdahl's law -- also known as Amdahl's argument -- is an intuitive observation and an associated formula. Named after computer scientist Gene Amdahl, the observation is that the performance improvement that can be gained through parallel processing is limited by the part of a system that's inherently sequential -- that is, the set of operations that must be run in series.

The formula, which predicts the maximum performance improvement that can be expected from parallel computing, is the following:

Smax = 1 / ((1-p) + p/s)

Where the following applies:

- Smax is the maximum possible speedup of the overall task.
- p is the proportion of the task's execution time that can benefit from the improvement -- for parallel processing, the part that can be parallelized.
- s is the speedup of the improved part -- for parallel processing, roughly the number of processors, assuming the parallel part scales perfectly.
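To make the formula concrete, here is a minimal Python sketch. The function name and parameter names are our own, chosen to mirror the symbols above:

    def amdahl_speedup(p, s):
        # Amdahl's law: maximum overall speedup when a fraction p
        # of the execution time is sped up by a factor s.
        return 1 / ((1 - p) + p / s)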

Amdahl's law is attributed to Amdahl, an American computer scientist and high-tech entrepreneur, who worked first on mainframe computers at IBM and then founded his own companies, including Amdahl Corp. Much of his work involved improving computer performance using a variety of techniques, including parallel processing.

In 1967, Amdahl gave a presentation at the American Federation of Information Processing Societies (AFIPS) Spring Joint Computer Conference describing ways of predicting the limits of parallel processing. The ideas outlined in his presentation later became known as Amdahl's law.

The formula can be applied in a broader sense to other techniques that improve latency, not just parallel computing. If the overall task consists of a part that can benefit from an improvement and another part that can't benefit from the same improvement, Amdahl's law can make a prediction about the maximum savings in execution time of the overall task.
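For example -- with illustrative numbers of our own -- if 40% of a task's time is spent in a part that a change makes twice as fast, the formula gives Smax = 1 / ((1 - 0.4) + 0.4/2) = 1 / 0.8 = 1.25, a 25% overall improvement no matter how small the improved part's runtime becomes.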

What is an example of how Amdahl's law can be applied?

Amdahl's law can be illustrated with a simple example. If a program is composed of a set of instructions that take 10 hours to run in series, and a one-hour portion of that program can't be parallelized, the absolute best you can expect is to approach the limit of 10x improvement in execution time. That's because no matter how much you parallelize the nine hours that can run in parallel, you will never be able to do away with the one hour that must be run in sequence.
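Plugging the example into the sketch above (p = 0.9, since nine of the 10 hours can be parallelized) shows the predicted speedup approaching, but never reaching, 10x as the parallel part gets faster:

    for s in (2, 10, 100, 1_000_000):
        print(s, round(amdahl_speedup(0.9, s), 3))
    # 2 -> 1.818
    # 10 -> 5.263
    # 100 -> 9.174
    # 1000000 -> 10.0 (the 10x ceiling, to three decimal places)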

How is Amdahl's law different from Gustafson's law?

Gustafson's law -- also called Gustafson-Barsis' law -- doesn't overturn Amdahl's law. Instead, it adds predictions about the improvement in execution time as more cores are added to execute a task that benefits from parallel computing. The formula starts with the theoretical performance of the task on a single processor as the baseline and makes predictions about system performance as processors are added.

The formula and associated ideas were described by John L. Gustafson in his article "Reevaluating Amdahl's Law," published in May 1988 in Communications of the ACM; in it, he credited his colleague Edwin H. Barsis, which is why the law also bears Barsis' name.

Amdahl's law assumes the problem size is fixed. But in practice, as more resources become available, programmers solve more complex problems to fully exploit the improvements in computing power. So, in reality, the time spent in the part of the task that can benefit from parallel computing often grows much faster than the time spent in the sequential part.

Gustafson's law, on the other hand, assumes that even though the speedup of a fixed-size problem is limited -- as Amdahl predicted -- larger problems can be solved in the same amount of time by increasing the number of processors.
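Gustafson's law is commonly written as S = (1 - p) + p * N, where N is the number of processors and p is the parallel fraction of the scaled workload. The short Python sketch below -- our own comparison, using the same nominal p = 0.9 for both laws -- shows Amdahl's prediction flattening out while Gustafson's keeps growing:

    def gustafson_speedup(p, n):
        # Gustafson's law: scaled speedup when the workload grows so
        # that a fraction p of it runs in parallel on n processors.
        return (1 - p) + p * n

    for n in (4, 16, 64):
        print(n, round(amdahl_speedup(0.9, n), 2),
              round(gustafson_speedup(0.9, n), 2))
    # n=4:  Amdahl 3.08,  Gustafson 3.7
    # n=16: Amdahl 6.4,   Gustafson 14.5
    # n=64: Amdahl 8.77,  Gustafson 57.7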


11 Apr 2023
