CS621/CS449 Artificial Intelligence Lecture Notes
Instructor: Prof. Pushpak Bhattacharyya
Set 6: 22/09/2004, 24/09/2004, 1/10/2004, 6/10/2004



Prof. Pushpak Bhattacharyya, IIT Bombay, 22/09/2004 to 6/10/2004, CS-621/CS-449 Lecture Notes

Contents
– Complexity of Feedforward Network (FFN) training
– Positive Linear Confinement (PLC) problem
– Computational complexity basics
– PLC is NP-Complete

Complexity of Feedforward Network (FFN) Training
Observation
– Almost all problems that use an FFN with backpropagation (BP) require a very large amount of training time.
Blum and Rivest (1992)
– Training a 3-node FFN is NP-complete.

Decision Problem for FFN Training
Given a set of labeled patterns, do there exist two hyperplanes such that patterns of only one kind are confined to a "quadrant" (intersection of half-spaces)? In other words:
– Can a 3-node NN be trained on the given positive and negative examples?
– Do there exist two hyperplanes, defined by hidden units h0 and h1, which confine ALL and ONLY the points of one kind? (Linear Confinement problem)

Positive Linear Confinement (PLC) Problem
The PLC decision problem is computationally hard, i.e., as the number of points grows, the time taken to find the two planes grows exponentially.
Equivalent statement:
– Training a 3-node NN will take exponential time (unless P = NP).

Computational Complexity Basics
Time Complexity
– The number of steps taken by a Turing Machine (TM) to complete the computation, as a function of the length of the input.
[Figure: a Turing Machine, a finite-state head moving over an infinite tape]

Computational Complexity Basics
Space Complexity
– The number of tape cells needed to complete the computation, as a function of the length of the input.
Kolmogorov Complexity
– The length of the shortest program (as a function of the input length) that expresses the computation.

Complexity Expressions for AI Problems
– Most problems in AI are pattern recognition problems.
– AI problems involve enumerating a large number of cases.
– Numerical problems are described in terms of space and time complexity, whereas AI problems are often described using Kolmogorov complexity.

Complexity Classes
Class P
– The class of decision problems that can be solved in polynomial time by a deterministic TM.
Class NP
– The class of decision problems that can be solved in polynomial time by a non-deterministic TM.
– Equivalently, for this class of problems, once a candidate solution (certificate) is given, it is easy to verify it in polynomial time.
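The "easy to verify" property of NP can be made concrete with a toy decision problem. A minimal sketch in Python (SUBSET-SUM is chosen purely as an illustration; it does not appear in the lecture): finding a solution may require exponential search, but checking a claimed certificate takes a single linear pass.

```python
def verify_subset_sum(numbers, target, certificate):
    """Check a certificate for SUBSET-SUM: a list of distinct indices
    into `numbers` claimed to sum to `target`. Runs in linear time."""
    if len(set(certificate)) != len(certificate):
        return False                                 # repeated indices not allowed
    if not all(0 <= i < len(numbers) for i in certificate):
        return False                                 # index out of range
    return sum(numbers[i] for i in certificate) == target

print(verify_subset_sum([3, 7, 1, 8], 11, [0, 3]))   # 3 + 8 = 11 -> True
print(verify_subset_sum([3, 7, 1, 8], 11, [1, 2]))   # 7 + 1 = 8  -> False
```

The same asymmetry (hard to find, easy to check) is exactly what the NP-membership argument for PLC later relies on.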

Complexity Classes
– Since deterministic TMs are a special case of non-deterministic TMs, P ⊆ NP.
Complexity Class
– A class of problems that share the same complexity measure.
– There exist "transformations" (reductions) of lesser complexity among the problems in the class.

Complexity Classes
Completeness within a complexity class
– If one of the complete problems within the class can be solved in less time than the complexity bound indicates, then all of them can be solved in that amount of "less" time.
– This holds because of the existence of transformations of lower complexity among the problems.

Transformation Procedure
A problem of known complexity is chosen.
– The transformation procedure should take less time than the inherent complexity of the known-complexity problem.

Arbitrary instance of the "known complexity" problem
    → Transformation procedure
    → Constructed instance of the problem whose complexity is desired

Set Splitting Problem
Here we introduce the Set Splitting problem, which is later used to prove that PLC is NP-Complete.
Given a set S = {s1, s2, s3, …, sn} and subsets Ci of S (Ci ⊆ S), can S be split into two subsets S1 and S2 such that S1 ∪ S2 = S, S1 ∩ S2 = ∅, and no Ci is wholly contained in S1 or in S2?
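The definition above can be made concrete with a small sketch (names are illustrative, not from the slides): an exponential brute-force search over all two-colourings of S, together with the polynomial-time verifier that puts Set Splitting in NP.

```python
from itertools import product

def is_valid_split(s1, s2, subsets):
    """Polynomial-time verifier: a split is valid iff no C_i lies
    wholly inside S1 or wholly inside S2."""
    return all(not (c <= s1) and not (c <= s2) for c in subsets)

def set_splitting(elements, subsets):
    """Exhaustive (exponential) search over all 2^n two-colourings of S."""
    elements = list(elements)
    for bits in product([0, 1], repeat=len(elements)):
        s1 = {e for e, b in zip(elements, bits) if b == 0}
        s2 = {e for e, b in zip(elements, bits) if b == 1}
        if is_valid_split(s1, s2, subsets):
            return s1, s2
    return None  # no valid split exists

# Illustrative instance: S = {1, 2, 3}, C1 = {1, 2}, C2 = {2, 3}
print(set_splitting({1, 2, 3}, [{1, 2}, {2, 3}]))
```

Note that a singleton Ci makes the instance unsatisfiable, since a one-element subset always lies wholly inside one side of any split.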

PLC is NP-Complete
1. PLC belongs to NP
– Given two hyperplanes, one can check easily (in polynomial time) whether they confine all and only the positive points.
2. Transformation: the Set Splitting problem transforms to PLC in polynomial time
– Corresponding to the n elements of S = {s1, s2, s3, …, sn}, set up a coordinate system with n axes x1, x2, …, xn.
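Point 1 amounts to a simple polynomial-time check. A minimal sketch, with the modeling assumption that each hyperplane is a weight vector w with threshold b and that a point is "confined" when it lies on the positive side (w · x > b) of BOTH planes:

```python
def confines(h0, h1, positives, negatives):
    """Polynomial-time check that the two hyperplanes confine ALL and
    ONLY the positive points. Each hyperplane is a pair (w, b);
    cost is O(#points * dimension)."""
    def inside(x):
        return all(sum(wi * xi for wi, xi in zip(w, x)) > b for w, b in (h0, h1))
    return all(inside(p) for p in positives) and not any(inside(q) for q in negatives)

# Toy 2-D check: confine only the origin to the lower-left "quadrant"
h0 = ((-1, 0), -0.5)   # -x1 > -0.5, i.e. x1 < 0.5
h1 = ((0, -1), -0.5)   # -x2 > -0.5, i.e. x2 < 0.5
print(confines(h0, h1, [(0, 0)], [(1, 0), (0, 1)]))   # True
```

The verifier touches each point once per plane, so membership in NP follows directly.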

PLC is NP-Complete
Transformation of an SS problem instance to an instance of PLC:
1. Mark the origin as positive.
2. Mark the unit-distance points on the axes as negative.
Example: S = {s1, s2, s3}, C1 = {s1, s2}, C2 = {s2, s3}
+ (0,0,0)
- (1,0,0)
- (0,1,0)
- (0,0,1)

PLC is NP-Complete
3. For each Ci, mark the point Σ_{sj ∈ Ci} ej as positive, where ej is the unit vector along the j-th axis (i.e., the characteristic vector of Ci).
+ (0,0,0)
- (1,0,0)
- (0,1,0)
- (0,0,1)
+ (1,1,0)
+ (0,1,1)
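The whole transformation is itself easy to code, which also makes its polynomial cost evident. A sketch (elements of S represented as 1..n; using C1 = {s1, s2} and C2 = {s2, s3}, which match the positive points listed above):

```python
def ss_to_plc(n, subsets):
    """Construct the PLC instance for a Set Splitting instance over
    S = {s_1, ..., s_n}: origin -> positive; unit vectors -> negative;
    the characteristic vector of each C_i -> positive."""
    unit = lambda j: tuple(int(k == j) for k in range(1, n + 1))
    positives = [tuple([0] * n)]
    negatives = [unit(j) for j in range(1, n + 1)]
    positives += [tuple(int(k in c) for k in range(1, n + 1)) for c in subsets]
    return positives, negatives

# Reproduces the labelled points above:
pos, neg = ss_to_plc(3, [{1, 2}, {2, 3}])
print(pos)   # [(0, 0, 0), (1, 1, 0), (0, 1, 1)]
print(neg)   # [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
```

The construction writes n + |C| + 1 points of dimension n, so it runs in polynomial time, as the reduction requires.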

PLC is NP-Complete
The transformation implies:
– If part: if the set S can be split, then two hyperplanes can be found which confine all and only the positive points.
– Only-if part: if two hyperplanes can confine ALL and ONLY the positive points, then S can be split.

Example of the SS-to-PLC Transformation
For the previous set-splitting example, one possible solution for the corresponding instance of PLC is:
P1: -x1 + 3x2 - x3 = -1/2
P2: 3x1 - x2 + 3x3 = -1/2
(Note that all positive points lie on the same side of each plane as the origin.)
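This claimed solution can be checked mechanically. A short sketch, taking the positive side of each plane to be the side containing the origin, i.e. where the plane's left-hand side exceeds -1/2:

```python
# Check the slide's two planes against the six labelled points.
def positive_side(w, x):
    """True if point x is on the origin's side of the plane w . x = -1/2."""
    return sum(wi * xi for wi, xi in zip(w, x)) > -0.5

P1 = (-1, 3, -1)   # coefficients of -x1 + 3x2 - x3
P2 = (3, -1, 3)    # coefficients of  3x1 - x2 + 3x3
positives = [(0, 0, 0), (1, 1, 0), (0, 1, 1)]
negatives = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

assert all(positive_side(P1, p) and positive_side(P2, p) for p in positives)
assert all(not positive_side(P1, q) or not positive_side(P2, q) for q in negatives)
print("planes confine all and only the positive points")
```

Each negative point is cut off by at least one plane (P1 excludes (1,0,0) and (0,0,1); P2 excludes (0,1,0)), while every positive point lies strictly inside both half-spaces.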