Algorithm complexity is concerned with how fast or slow a particular algorithm performs.
Not-so-novice definition: Complexity is defined as a numerical function T(n): time versus the input size n. This defines the time taken by an algorithm without depending on implementation details. A given algorithm will take different amounts of time on the same inputs depending on factors such as processor speed, instruction set, disk speed, and brand of compiler, among others. The way around this is to estimate the efficiency of each algorithm asymptotically.
Algorithm complexity is something designed to compare two algorithms at the idea level — ignoring low-level details such as the implementation programming language, the hardware the algorithm runs on, or the instruction set of the given CPU.
Layman's-term definition: measure running time by counting the number of "basic operations".
Terminology: profilers measure running time in milliseconds and can help us optimize our code by spotting bottlenecks. The NetBeans profiler is a module that provides full-featured profiling functionality for the NetBeans IDE. The profiling functions include CPU, memory and thread profiling as well as basic JVM monitoring, allowing developers to be more productive in solving memory- or performance-related issues.
As algorithms are programs that perform just a computation, complexity analysis allows us to measure how fast a program is when it performs computations. Examples of purely computational operations include numerical floating-point operations such as addition and multiplication, and searching within a database that fits in RAM for a given value, among others.
COMPLEXITY ANALYSIS Complexity analysis is also a tool that allows us to explain how an algorithm behaves as the input grows larger. Given a different input, how will the algorithm behave? If an algorithm takes 1 second to run for an input of size 1000, how will it behave if we double the input size? Will it run just as fast, half as fast, or four times slower? In practical programming this is important, as it allows us to predict how an algorithm will behave when the input data becomes larger.
COMPLEXITY ANALYSIS EXAMPLE Suppose an algorithm for a web application works well with 1000 users and we measure its running time. Using algorithm complexity analysis, a programmer can have a pretty good idea of what will happen once we get 2000 users instead. For algorithmic competitions, complexity analysis gives us insight into how long our code will run on the largest test cases used to check our program's correctness. So, by measuring the program's behavior for a small input, a programmer can get a good idea of how it will behave for larger inputs. (analogy – cake)
COUNTING INSTRUCTIONS CASE 1: working on the exercises using C++ for practice. Given an input array A of size n, the following code finds the largest element:

int M = A[0];
for (int i = 0; i < n; ++i) {
    if (A[i] >= M) {
        M = A[i];
    }
}
FUNDAMENTAL INSTRUCTIONS The first thing to do is count how many fundamental instructions this piece of code executes. Analyzing this piece of code, break it up into simple instructions: things that can be executed by the CPU directly, or close to that. Assume the processor can execute each of the following operations as one instruction:
- Assigning a value to a variable
- Looking up the value of a particular element in an array
- Comparing two values
- Incrementing a value
- Basic arithmetic operations such as addition and multiplication
Next topic / Assignment: Define & Cite Example
ALGORITHMIC ANALYSIS
Best Case Analysis
Worst Case Analysis
Average Case Analysis
Analysis of Algorithm The determination of the amount of resources (time and storage) necessary to execute them. It helps programmers to understand and suggest improvements. Algorithms tend to become shorter, simpler, and more elegant during the analysis process. Usually, the efficiency or running time of an algorithm is stated as a function relating the input to the number of steps (time complexity) or storage locations (space complexity).
Analysis of Algorithm Provides theoretical estimates for the resources needed by any algorithm which solves a given computational problem. The estimates provide an insight into reasonable directions of search for efficient algorithms.
Complete running-time analysis of an algorithm:
1. Implement the algorithm completely.
2. Determine the time required for each basic operation.
3. Identify unknown quantities that can be used to describe the frequency of execution of the basic operations.
4. Develop, if possible, a realistic model of the input to the program.
5. Analyze the unknown quantities, assuming the modeled input.
Worst, Average and Best CASE ANALYSIS
Definition Worst-case complexity – is the function defined by the maximum number of steps taken on any instance of size n. Average-case complexity – is the function defined by the average number of steps taken on any instance of size n. Best-case complexity – is the function defined by the minimum number of steps taken on any instance of size n.
Definition Worst-case complexity – implies the slowest time to complete, with pessimal inputs chosen. Average-case complexity – is the amount of some computational resource used by the algorithm, averaged over all possible inputs. Best-case complexity – implies the fastest time to complete, with optimal inputs chosen.
Examples – Sorting Algorithm
Worst-case: the data is sorted in reverse order, so every element is out of place.
Average-case: the data is in random order.
Best-case: the data is already sorted.
Examples - Linear Search
Worst-case: the element to be searched for is not present in the array, so the search function compares it with every element of the array one by one.
Average-case: assume all cases are uniformly distributed (including the case where the element is not present); sum the steps over all cases and divide by (n + 1).
Best-case: the desired element is the first element of the list.
Notations of Algorithm Big O notation characterizes functions according to their growth rates: different functions with the same growth rate may be represented using the same O notation. The letter O is used because the growth rate of a function is also referred to as order of the function. Associated with big O notation are several related notations, using the symbols o, Ω, ω, and Θ, to describe other kinds of bounds on asymptotic growth rates.
Notation / Definition
Constant Time O(1) – An algorithm is said to run in constant time if it requires the same amount of time regardless of the input size. Ex.: an algorithm written to do the addition of two numbers.
Linear Time O(n) – An algorithm is said to run in linear time if its execution time is directly proportional to the input size; time grows linearly as the input size increases. Ex.: an algorithm that scans a list once, such as finding the largest number in an unsorted list. (Note: sorting a list is not a linear-time example; comparison sorting takes O(n log n).)
Notation / Definition
Logarithmic Time O(log n) – An algorithm is said to run in logarithmic time if its execution time is proportional to the logarithm of the input size. Ex.: binary search on a sorted array.
Polylogarithmic Time O((log n)^k) – Polylogarithmic functions occur as the order of memory used by some algorithms.
Assignment
Pigeonhole principle
Permutations and combinations
Discrete probability
Recurrence relations
Divide and conquer
Inclusion and exclusion