Published by Allan Greene. Modified over 9 years ago.
2
FUNDAMENTALS OF ALGORITHMS. FUNDAMENTALS OF DATA STRUCTURES. TREES. GRAPHS AND THEIR APPLICATIONS. STORAGE MANAGEMENT.
3
DATA STRUCTURE Data: raw facts about objects. Structure: a way of storing, organising and manipulating data.
4
A logical (or mathematical) model of a particular organisation of data. It specifies how the data should be organised, how the flow of data should be controlled, and how a data structure should be designed.
5
THREE ACTIVITIES: storing, accessing, and manipulating data from one form to another.
6
Two types of data structures: 1) Primitive data structures: int, char, float. 2) Non-primitive data structures: arrays, structures and files.
7
TOPICS COVERED: 1) Algorithms. 2) Analysis of algorithms. 3) Analysis of algorithms using data structures. 4) Performance analysis: 4.1) Time complexity. 4.2) Space complexity. 4.3) Amortized time complexity. 5) Asymptotic notation.
8
1.1) Definition: An algorithm consists of a finite set of explicit and unambiguous steps which, when carried out for a given set of initial conditions, produce the corresponding output and terminate in a finite time.
9
1.2) Characteristics: Each instruction should be unique and concise. Each instruction should be precise and must not repeat infinitely. Repetition of the same task should be avoided. The results should be available to the user after the algorithm terminates.
10
2.1) Steps to be followed: Analysis is more reliable than experimentation or testing. Analysis helps to select better algorithms. Analysis predicts performance. Analysis identifies scope for improvement of algorithms.
11
3.1) Problem solving. 3.2) Top-down design.
12
Problem solving is part of the thinking process. It describes the steps to proceed from a given state to a desired goal state. 3.1.1) Problem solving phases. 3.1.2) Problem solving strategies.
13
Developing an effective program, by one or more users, to obtain a solution to a problem is known as problem solving.
14
Problem solving phases: 1) Problem specification. 2) Analysis & data structure. 3) Program design. 4) Implementation. 5) Debugging & testing. 6) Program correctness. 7) Maintenance.
15
Divide and conquer method. Dynamic programming. Backtracking. Brute-force algorithms.
16
DEFINITION: It is a method for designing the layout of anything that starts with the complete item and then breaks it down into smaller and smaller subtasks.
17
Problem decomposition. Collecting the information required for each task. Writing function declarations. Termination of the algorithm. The result of top-down design is a tree-like structure.
18
Main task
├─ Sub task 1
│   ├─ Sub task 1(a)
│   └─ Sub task 1(b)
└─ Sub task 2
    ├─ Sub task 2(a)
    ├─ Sub task 2(b)
    └─ Sub task 2(c)
19
Measured by the amount of resources it uses: time and space. ▪ Time – number of steps. ▪ Space – number of units of memory storage.
20
Time complexity is the amount of time needed by a program to complete its task. It can be divided into two parts: 1) Compile time. 2) Run time.
21
Three types of time complexity: 1) Worst-case. 2) Average-case. 3) Best-case. 1) Worst-case: the function defined by the maximum number of steps taken on any problem of size n.
22
FORMULA: T(n) = O(f(n)). 2) Average-case: the function defined by the average number of steps taken on any problem of size n. FORMULA: T(n) = Θ(f(n)).
23
3) Best-case: the function defined by the minimum number of steps taken on any problem of size n. FORMULA: T(n) = Ω(f(n)).
24
The amount of memory consumed by the algorithm until it completes its execution. Character = 1 unit. Integer = 2 units. Float = 4 units. Three spaces: instruction space, data space, environment space.
25
NOTATION: A method used to estimate the efficiency of an algorithm. 1) Big-oh notation. 2) Big-omega notation. 3) Big-theta notation. 4) Little-oh notation. 5) Little-omega notation.
26
5.1) Big-oh notation: O → order of; Big → very large values of n. Used to define the worst-case running time of an algorithm. Example: f(n) = O(g(n)), where f(n) and g(n) are positive functions of n.
27
5.2) Big-omega notation: used to define the best-case running time of an algorithm. Example: f(n) = Ω(g(n)), where f(n) and g(n) are positive functions of n.
28
5.3) Big-theta notation: lies in between the big-oh and big-omega notations. Example: f(n) = Θ(g(n)), where f(n) and g(n) are positive functions of n.
29
5.4) Little-oh notation: o → order of. The growth rate of f(n) is strictly less than the growth rate of g(n). Example: f(n) = o(g(n)).
30
5.5) Little-omega notation: the growth rate of f(n) is strictly greater than the growth rate of g(n). Example: f(n) = ω(g(n)).
31
Many algorithms are hard to analyze mathematically. Big-oh analysis only tells you how the running time grows with the size of the problem, not how efficient the algorithm is.