
1 Performance The need for speed

2 Aren’t today’s computers fast enough?
Justification for better performance:
• complex applications: text → graphics → video
• real-time applications
• computational science
Speaker notes: Computers are being applied to problems previously solved by humans (e.g., animated motion pictures, genome sequencing, modeling software for engineering). Science keeps discovering new fields; the weather forecasting problem, for example, must cover the Earth's entire surface area of 510,065,600 km².

3 How do we create faster computation?
1) Faster Hardware 2) Faster Algorithms

4 How to achieve Moore’s Law?
Faster Hardware
1) Miniaturization (limitations: manufacturing & the speed of light)
2) Multiple processors (supercomputers)
3) Different technologies: optical computers? biological computers? quantum computers?
Speaker notes: Remind students of Moore's Law. Miniaturization increases speed because it decreases the distance electricity must travel. Current manufacturing uses 32 nm (nano = one billionth) laser etching, about 1/3000 of the thickness of a human hair. Nanotechnology may lead to growing circuits from molecules.

5 How would you know which algorithm is faster?
Faster Algorithms
• Empirical approach: benchmarks. A benchmark is a program execution used to measure execution time (a single experimental result).
• Analytic approach: algorithm speed can be estimated by counting
...the number of instructions executed
...the number of variable/memory assignments
...the number of data comparisons
Speaker notes: Talk about the advantages and disadvantages of benchmarks vs. estimates. The public often trusts benchmarks, but estimates provide more insight.
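To make the two approaches concrete, here is a minimal Python sketch (not from the original slides; the function and data are illustrative) that times a linear search as a benchmark and, at the same time, counts its data comparisons analytically.

```python
import time

def linear_search(items, target):
    """Scan left to right; return (found, comparisons)."""
    comparisons = 0
    for value in items:
        comparisons += 1          # analytic approach: count data comparisons
        if value == target:
            return True, comparisons
    return False, comparisons

data = list(range(1_000_000))

# Empirical approach: a benchmark is a single timed execution.
start = time.perf_counter()
found, probes = linear_search(data, 999_999)
elapsed = time.perf_counter() - start

print(f"benchmark: {elapsed:.4f} seconds; analytic count: {probes} comparisons")
```

The measured time varies from machine to machine (the benchmark), while the comparison count is the same everywhere (the estimate), which is the trade-off noted above.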

6 Counting the Cost Recall linear search & binary search
Number of probes (comparisons):

Number of items    Linear (worst)    Binary (worst)    Linear (expected)    Binary (expected)
1                  1                 1                 1                    1
7                  7                 3                 3.5                  3
63                 63                6                 31.5                 6
1000               1000              10                500                  10
N                  N                 log2 N            N/2                  log2 N

Speaker notes: Sometimes these are called "cost estimates" because they approximate cost in computing time. Explain linear search and binary search in the context of finding a student record in a collection of papers sorted by student ID number. For a million items, binary search takes only 20 probes vs. a half million for linear search. Note that 1000 processors could search 1000 items with a single probe per processor.
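A small Python sketch (illustrative, not part of the deck) reproduces the worst-case columns above by counting probes: linear search may touch every item, while binary search halves the remaining range with each probe.

```python
from math import log2

def linear_probes_worst(n):
    # Worst case: the target is last or absent, so every item is probed.
    return n

def binary_probes_worst(n):
    # Each probe halves the remaining range, so roughly log2(n) probes.
    probes, remaining = 0, n
    while remaining > 0:
        probes += 1
        remaining //= 2
    return probes

for n in (1, 7, 63, 1000, 1_000_000):
    print(f"{n:>9} items: linear {linear_probes_worst(n):>9}, "
          f"binary {binary_probes_worst(n):>2} (log2 n = {log2(n):.1f})")
```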

7 Algorithm to find the maximum
Consider 8 numbers: a b c d e f g h
Comparisons:
max of a & b, max of c & d, max of e & f, max of g & h
max of a - d, max of e - h
max of a - h
Total number of comparisons?

8 Algorithm to find the maximum
What about 7 numbers? a b c d e f g
max of a & b, c & d, e & f
max of a - d, e - g
max of a - g
Total number of comparisons?

9 Algorithm to find the maximum
What about 6 numbers? a b c d e f
max of a & b, c & d, e & f
max of a - d
max of a - f
Total number of comparisons?
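The three slides above describe the same pairwise idea for 8, 7, and 6 numbers. A minimal Python sketch of that approach (the function name is my own) finds the maximum round by round and counts the comparisons, giving 7, 6, and 5 respectively.

```python
def tournament_max(values):
    """Find the maximum by comparing pairs, round by round.

    Returns (maximum, comparisons); N values always need N - 1 comparisons.
    """
    comparisons = 0
    round_values = list(values)
    while len(round_values) > 1:
        next_round = []
        # Compare adjacent pairs; an unpaired value advances without a comparison.
        for i in range(0, len(round_values) - 1, 2):
            comparisons += 1
            next_round.append(max(round_values[i], round_values[i + 1]))
        if len(round_values) % 2 == 1:
            next_round.append(round_values[-1])
        round_values = next_round
    return round_values[0], comparisons

for n in (8, 7, 6):
    print(n, "numbers:", tournament_max(list(range(n))))
```

The pairs within one round are independent of each other, which is why extra processors can work on them at the same time.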

10 Algorithm to find the maximum
Number of items    Number of comparisons
1                  0
2                  1
6                  5
7                  6
8                  7
20                 19
100                99
N                  N-1

The performance of this algorithm is similar to linear search. However, using many processors doesn't help as much for the maximum finder: even with 1000 processors, finding the maximum of 1000 items still takes about log2 1000 ≈ 10 rounds of comparisons, whereas 1000 processors can search 1000 items with a single probe each.

11 Sorting Algorithms
Sorting algorithms rearrange items from smallest to largest (or largest to smallest).
One sorting algorithm: repeatedly find the maximum and move it immediately ahead of all prior maximums.
Example (sort 100 values):
Step 1 - find the maximum of 100 values: 99 comparisons
Step 2 - find the maximum of 99 values: 98 comparisons
Step 3 - find the maximum of 98 values: 97 comparisons
• • •
Total comparisons for sorting 100: 99 + 98 + ... + 1 = 4,950
Total comparisons for sorting N: (N-1) + (N-2) + (N-3) + ... + 1 = N*(N-1)/2
Speaker notes: Consider 1000 items and 1000 processors: only 500 have any immediate work; the others must wait until the first round of comparisons is complete. The performance with N processors is essentially the same as binary search.
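A short Python sketch (illustrative, the names are mine) of the sorting algorithm just described: repeatedly find the maximum of the still-unsorted items and move it into place, counting comparisons along the way; for 100 values it reports 4,950, matching N*(N-1)/2.

```python
import random

def sort_by_repeated_max(values):
    """Repeatedly find the maximum of the unsorted portion and move it into place.

    Returns (sorted_list, comparisons): (N-1) + (N-2) + ... + 1 = N*(N-1)/2.
    """
    items = list(values)
    comparisons = 0
    for end in range(len(items) - 1, 0, -1):
        max_index = 0
        for i in range(1, end + 1):     # find the maximum of items[0..end]
            comparisons += 1
            if items[i] > items[max_index]:
                max_index = i
        # Move the maximum immediately ahead of the previously placed maximums.
        items[max_index], items[end] = items[end], items[max_index]
    return items, comparisons

_, count = sort_by_repeated_max(random.sample(range(1000), 100))
print("comparisons for 100 items:", count)   # 100 * 99 / 2 = 4950
```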

12 Comparing Algorithm Performance
[Graph comparing the growth of binary search, linear search, and the sorting algorithm.]
Notice that the differences become greater as the number of items increases.

13 Are there slower algorithms?
Consider an algorithm to crack your password. One such algorithm attempts every possible combination of keystrokes (94 possible characters per position).

Analysis:
Password length    Comparisons
1                  94
2                  94 x 94 = 8,836
3                  94 x 94 x 94 = 830,584
6                  94^6 ≈ 6 x 10^11
N                  94^N
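A minimal Python sketch of the brute-force idea (my own illustration, assuming the 94 non-whitespace printable ASCII characters as the keystroke set): it tries every combination in order and counts the attempts, which grow as 94^N.

```python
from itertools import product
from string import printable

# The 94 printable, non-whitespace ASCII characters.
ALPHABET = [c for c in printable if not c.isspace()]
assert len(ALPHABET) == 94

def crack(password, max_length=4):
    """Try every keystroke combination up to max_length; return (guess, attempts)."""
    attempts = 0
    for length in range(1, max_length + 1):
        for combo in product(ALPHABET, repeat=length):
            attempts += 1
            if "".join(combo) == password:
                return password, attempts
    return None, attempts

print(crack("ab"))                 # found within 94 + 94*94 attempts
for n in (1, 2, 3, 6):
    print(f"length {n}: {94 ** n:,} possible passwords")
```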

14 Comparing Algorithm Performance
[Graph comparing the growth of the password cracker with the other algorithms.]

15 Functional Growth

n (linear search)    log2 n (binary search)    n^2 (sort)    94^n (password cracker)
1                    0                         1             94
4                    2                         16
8                    3                         64            ≈ 6 x 10^15
12                   3.6                       144           (note 1)
16                   4                         256           (note 2)
20                   4.3                       400
24                   4.6                       576
28                   4.8                       784
32                   5                         1024

The 94^n column is not practical for any large number of items.
note 1 - roughly five years of computation for a 1 petaflop supercomputer
note 2 - about 5 times the age of the universe
Speaker notes: IBM just recently claimed to break the 1 petaflop barrier.
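The table can be regenerated with a few lines of Python (my own sketch); it simply evaluates log2 n, n^2, and 94^n for the listed values of n.

```python
from math import log2

print(f"{'n':>3} {'log2 n':>7} {'n^2':>6} {'94^n':>10}")
for n in (1, 4, 8, 12, 16, 20, 24, 28, 32):
    print(f"{n:>3} {log2(n):>7.1f} {n ** 2:>6} {94.0 ** n:>10.1e}")
```

The 94^n column outgrows any practical computing budget almost immediately, which is the point of notes 1 and 2.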

16 Important Results
#1 -- There are algorithms that take too long to be practically executed on a computer.
#2 -- These slow algorithms tend to be of the type that take a brute-force approach, attempting all possible combinations.
#3 -- Moore's Law cannot fix the problem.

17 So what do we know?
Many algorithms can be processed by modern computers.
Some algorithms will only become practical with faster computers.
Computers will never be fast enough for some algorithms.

