Notes on Labs and Assignments
Commenting Revisited
You have probably been able to "get away" with poor inline documentation in labs:
- Tasks are straightforward
- Often only one key method is involved
On assignments this is not going to work:
- Bigger variety of solutions
- Tougher to follow as a marker (or user)
The Header Comment
It is good practice to include an initial comment in the source file for each class:

    /* AlphaChars.java
       Author: Aaron Hunter
       Date: September 26, 2006
       CMPT 126 - Lab 2
       Provides methods to check if a string contains letters. */
Method Descriptions
It is also good practice to include a comment preceding each method:

    /* Checks if a single character is a letter. */
    public static boolean isAlpha(char c) {
        if (((c >= 'a') && (c <= 'z')) || ((c >= 'A') && (c <= 'Z')))
            return true;
        else
            return false;
    }
Another Method

    /* Checks if a string contains all letters. */
    public static boolean isAllAlpha(String s) {
        for (int i = 0; i < s.length(); i++) {
            // Check each character. If not a letter,
            // return false.
            if (!isAlpha(s.charAt(i)))
                return false;
        }
        // If the loop completes without returning false,
        // then everything is a letter and we should
        // return true.
        return true;
    }
Another Method

    /* Checks if a string contains all letters. */
    public static boolean isAllAlpha(String s) {
        for (int i = 0; i < s.length(); i++) {
            if (!isAlpha(s.charAt(i)))
                return false;
        }
        return true;
    }
Abstracting the General Problem
Given a method hasProperty(Type x) that checks if x has some property and returns true if it does, and given a variable xArray of type Type[]:
- Does every element of xArray have the property?
- Does some element of xArray have the property?
- Do no elements of xArray have the property?
Solution – All

    public static boolean allHaveProperty(Type[] xArray) {
        for (int i = 0; i < xArray.length; i++) {
            if (!hasProperty(xArray[i]))
                return false;
        }
        return true;
    }
Solution – Some

    public static boolean someHaveProperty(Type[] xArray) {
        for (int i = 0; i < xArray.length; i++) {
            if (hasProperty(xArray[i]))
                return true;
        }
        return false;
    }
Solution – None

    public static boolean noneHaveProperty(Type[] xArray) {
        for (int i = 0; i < xArray.length; i++) {
            if (hasProperty(xArray[i]))
                return false;
        }
        return true;
    }
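A side observation (a sketch added here, not from the original slides): "none" is just the negation of "some", so an alternative implementation could simply reuse someHaveProperty from the previous slide:

    // Alternative sketch: "no element has the property" is the same as
    // "it is not the case that some element has the property".
    public static boolean noneHaveProperty(Type[] xArray) {
        return !someHaveProperty(xArray);
    }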
Lab 2 – Basic Comments
Use comments:
- comments before methods
- comments about important bits of code, e.g. why are you casting a char to an int?
Use sub-methods:
- if you are copying and pasting lots of code, maybe you need another method
Make the code testable (see the sketch below)
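To illustrate "make the code testable" (a sketch added here, not the official lab solution; the class name AlphaChars is assumed from the earlier header comment), a small main method lets the class be run directly as a quick test harness:

    public class AlphaChars {

        public static boolean isAlpha(char c) {
            return ((c >= 'a') && (c <= 'z')) || ((c >= 'A') && (c <= 'Z'));
        }

        public static boolean isAllAlpha(String s) {
            for (int i = 0; i < s.length(); i++) {
                if (!isAlpha(s.charAt(i)))
                    return false;
            }
            return true;
        }

        // A minimal test harness: run the class directly to spot-check a few cases.
        public static void main(String[] args) {
            System.out.println(isAlpha('q'));        // expected: true
            System.out.println(isAlpha('7'));        // expected: false
            System.out.println(isAllAlpha("hello")); // expected: true
            System.out.println(isAllAlpha("h3llo")); // expected: false
        }
    }

Running java AlphaChars prints the four results, which can be checked against the expected values in the comments.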
Running Time
Speed
When writing programs, we often want them to be fast. Several things affect this:
- The algorithm that is implemented
- The way the algorithm is implemented
- Programming language used
- Capabilities of the hardware
We won't worry about the 3rd and 4th points.
Algorithm vs. Implementation
The algorithm is the step-by-step procedure used to solve a problem.
- One problem can be solved by different algorithms, e.g. linear search vs. binary search
- One algorithm can be implemented in different ways in Java, e.g. linear search with a for loop vs. linear search with a while loop (see the sketch below)
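To make the second point concrete, here is a sketch (added for illustration; the method names are made up) of the same linear-search algorithm implemented two ways:

    // Same algorithm, two implementations: return the index of target in a,
    // or -1 if it is not there.
    public static int linearSearchFor(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target)
                return i;
        }
        return -1;
    }

    public static int linearSearchWhile(int[] a, int target) {
        int i = 0;
        while (i < a.length) {
            if (a[i] == target)
                return i;
            i++;
        }
        return -1;
    }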
Algorithm vs. Implementation
So… the algorithm is the procedure that is used to solve the problem, and the implementation of the algorithm is the way the algorithm is described in Java (or some other programming language).
Algorithms are the kind of thing we can describe with pseudo-code; implementations are described in Java.
Implementation
For a given algorithm, there will be many ways it can be implemented:
- e.g. loop forwards or backwards, order of if conditions, how to split into methods, lazy/active boolean operators…
- some of these affect the speed of the program (one example is sketched below)
No rules here: programming experience helps, and so does knowledge of system architecture, compilers, interpreters, language features, …
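A minimal sketch of one such choice, assuming "lazy" boolean operators refers to Java's short-circuit && (as opposed to the non-short-circuit &); expensiveCheck is a made-up placeholder, not anything from the course:

    // With &&, the right-hand side is only evaluated when the left-hand side
    // is true, so the (possibly expensive) call is often skipped; with &,
    // both sides are always evaluated.
    public static boolean lazyCheck(int[] a, int i) {
        return i < a.length && expensiveCheck(a[i]);   // short-circuits; safe for out-of-range i
    }

    public static boolean eagerCheck(int[] a, int i) {
        return i < a.length & expensiveCheck(a[i]);    // always evaluates both; may throw if i is out of range
    }

    private static boolean expensiveCheck(int value) {
        return value % 2 == 0;  // stand-in for some costly test
    }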
Algorithm Analysis
To analyze an algorithm is to determine the amount of "resources" needed for execution. Typically there are two key resources:
- Time
- Space (memory)
We will only be concerned with running time today.
Algorithm Analysis
Donald Knuth (1938–)
- pioneer in algorithm analysis
- wrote "The Art of Computer Programming" (Vols. I–III)
- mails $2.56 for every error found
- developed the TeX typesetting system
- Bible analysis through random sampling
- "Beware of bugs in the above code; I have only proved it correct, not tried it."
Algorithm Analysis
The inherent running time of an algorithm almost always overshadows the implementation:
- e.g. there is nothing we can do to make Power1 run as fast as Power2 (for large values of y)
- e.g. for sorted arrays, binary search is always faster (for large arrays, in the worst case)
Measuring Running Time
To evaluate the efficiency of an algorithm:
- We can't just time it: different architectures, hidden processes, etc.
- We need something that allows us to generalize
Idea: count the number of "steps" required for an input of size n.
This will be measured in terms of "Big-O" notation.
Measuring Running Time
We define algorithms for any input (of appropriate type).
What is the size of the input?
- For numerical inputs… it is just the input value n
- For string inputs… normally the length
What is a step? Normally it is one command.
Actual Running Time
Consider the following pseudocode algorithm (a Java version is sketched below):

    Given input n
    int[][] n_by_n = new int[n][n];
    for i<n, j<n set n_by_n[i][j] = random();
    print "The random array is created";
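A Java sketch of the same pseudocode (added for concreteness; Math.random() stands in for the unspecified random() call):

    public static void buildRandomArray(int n) {
        int[][] nByN = new int[n][n];           // 1 step: declare an n-by-n array
        for (int i = 0; i < n; i++) {           // n^2 steps in total:
            for (int j = 0; j < n; j++) {       // fill every entry with a random value
                nByN[i][j] = (int) (Math.random() * 100);
            }
        }
        System.out.println("The random array is created");  // 1 step
    }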
Actual Running Time
How many steps does it take for input n?
- 1 step: declare an n-by-n array
- n² steps: put random numbers in the array
- 1 step: print out the final statement
Total steps: n² + 2
Note: the extra 2 steps don't carry the same importance as the n² steps; as n gets big, the extra two steps are negligible.
Actual Running Time
We also think of constants as negligible:
- we want to say n² and c·n² have "essentially" the same running time as n increases
- more accurately: the same asymptotic running time
Commercial programmers would argue here… constants can matter in practice.
Actual Running Time
But for large n, the constants don't matter nearly as much. Plug in n = 1000:
- n² + 500 = 1,000,500          (only 500 above n²)
- 2n²      = 2,000,000          (1,000,000 above n²)
- n³       = 1,000,000,000      (999,000,000 above n²)
- 2ⁿ       ≈ 10^301             (≈ 10^301 above n²)
Big-O Notation
Running time will be measured with Big-O notation.
Big-O is a way to indicate how fast a function grows.
e.g. "Linear search has running time O(n) for an array of length n" indicates that linear search takes on the order of n steps.
Big-O Notation
When we say an algorithm has running time O(n):
- we are saying it runs in the same time as other functions with time O(n)
- we are describing the running time ignoring constants
- we are concerned with large values of n
Big-O Rules
- Ignore constants: O(c · f(n)) = O(f(n))
- Ignore smaller powers: O(n³ + n) = O(n³)
- Logarithms cost less than a power: think of log n as equivalent to n^0.000…001
  O(n^(a+0.1)) > O(n^a · log n) > O(n^a)
  e.g. O(n log n + n) = O(n log n)
  e.g. O(n log n + n²) = O(n²)
Why Big-O?
Look at what happens for large inputs:
- small problems are easy to do quickly; big problems are more interesting
- a larger function makes a huge difference for big n
It ignores irrelevant details:
- constants and lower-order terms depend on the implementation
- don't worry about that until we've got a good algorithm
Running Time Graphs
Determining Running Time
We need to count the number of "steps" to complete:
- consider the worst case for input of size n
- a "step" must take constant (O(1)) time
Often the count is:
- iterations of the inner loop × work per iteration
- recursive calls × work per call
(a nested-loop example is sketched below)
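A small sketch of the loop-counting rule (added for illustration, not from the original slides): constant work inside a doubly nested loop over n elements gives O(n²) steps in the worst case.

    // Counts how many pairs (i, j) with i < j have a[i] > a[j].
    // Outer loop: n iterations; inner loop: up to n iterations each;
    // constant work per iteration, so O(n^2) steps in the worst case.
    public static int countInversions(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++) {
            for (int j = i + 1; j < a.length; j++) {
                if (a[i] > a[j])
                    count++;
            }
        }
        return count;
    }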
Why Does log Keep Coming Up?
By default, we write log n for log₂ n.
High school math: log_b c = e means b^e = c.
So:
- log₂ n is the inverse of 2ⁿ
- log₂ n is the power of 2 that gives result n
- e.g. log₂ 1024 = 10, because 2^10 = 1024
Why Does log Keep Coming Up?
Exponential algorithm – O(2ⁿ):
- increasing the input by 1 doubles the running time
Logarithmic algorithm – O(log n):
- the inverse of doubling… doubling the input size increases the running time by 1
Intuition: O(log n) means that every step in the algorithm divides the problem size in half.
Example 1: Search
Linear search:
- checks each element in the array
- does some other stuff in the Java implementation… but just a constant number of steps
- O(n) - "order n"
Binary search:
- chops the array in half with each step: n, n/2, n/4, n/8, …, 2, 1
- takes log n steps: O(log n) - "order log n"
(a binary-search sketch follows below)
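A sketch of binary search on a sorted int array (added for reference; this is the standard iterative version, not necessarily the exact code used in the course):

    // Returns the index of target in the sorted array a, or -1 if absent.
    // Each iteration halves the remaining range: n, n/2, n/4, ..., 1,
    // so the loop runs O(log n) times.
    public static int binarySearch(int[] a, int target) {
        int low = 0;
        int high = a.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;  // midpoint, written to avoid overflow
            if (a[mid] == target)
                return mid;
            else if (a[mid] < target)
                low = mid + 1;
            else
                high = mid - 1;
        }
        return -1;
    }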
Example 2
Power 1: x^y = x · x^(y-1)
- makes y recursive calls: O(y)
Power 2: x^y = x^(y/2) · x^(y/2)
- makes log y recursive calls: O(log y)
- had to be careful not to compute x^(y/2) twice: that would have created an O(y) algorithm
- instead: calculate it once and store it in a variable
(both versions are sketched below)
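Sketches of both methods (added for reference; the names power1 and power2 follow the slide's Power 1 / Power 2, the exact course code may differ, and the odd-exponent case needs an extra factor of x):

    // Power 1: x^y = x * x^(y-1), so y recursive calls: O(y).
    public static long power1(long x, int y) {
        if (y == 0)
            return 1;
        return x * power1(x, y - 1);
    }

    // Power 2: x^y = x^(y/2) * x^(y/2), computed once and stored,
    // so about log y recursive calls: O(log y).
    public static long power2(long x, int y) {
        if (y == 0)
            return 1;
        long half = power2(x, y / 2);   // compute x^(y/2) only once
        if (y % 2 == 0)
            return half * half;
        else
            return x * half * half;     // odd y: one extra factor of x
    }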
Computational Complexity
All of this falls in the larger field of computational complexity theory.
Historically:
- recursion theory: what can be computed? (not everything, it turns out)
- complexity theory: given a computable function, how much time and space are needed?
Computational Complexity
Polynomial good, exponential bad:
- polynomial time = O(n^k) for some fixed k
- exponential time = basically O(2ⁿ)
This is a big jump in time, and it will not be bridged by "faster computers":
- a polynomial algorithm on a modern computer: < 1 second for large n
- an exponential algorithm can take centuries
- 1000-times-faster computers… still centuries
Non-Deterministic Algorithms
Allow the algorithm to "guess" a value and assume it is right:
- Deterministic: check if a given student is enrolled in CMPT 126
- Non-deterministic: find a student enrolled in CMPT 126
Intuitively: finding an example is harder than checking an example.
The big question: can non-deterministic polynomial algorithms be captured with deterministic ones?
Non-Deterministic Algorithms
Commonly called the "P = NP" problem:
- open for over 30 years
- currently there is a 1-million-dollar prize for a solution (from the Clay Mathematics Institute)
- all modern cryptography and e-commerce relies on the (unproven) assumption that no efficient solution exists, i.e. that P ≠ NP