1
Pushpak Bhattacharyya, CSE Dept., IIT Bombay. Lecture 38: PAC Learning, VC Dimension; Self Organization
2
VC-dimension
Gives a necessary and sufficient condition for PAC learnability.
3
Def: Let C be a concept class, i.e., it has members c1, c2, c3, … as concepts in it. [Figure: concepts c1, c2, c3 contained in the class C]
4
Let S be a subset of U (the universe). If all the subsets of S can be produced by intersecting S with the ci's, then we say C shatters S.
5
The highest-cardinality set S that can be shattered gives the VC-dimension of C: VC-dim(C) = |S|. VC-dim: Vapnik-Chervonenkis dimension.
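Since the definitions above are finite and combinatorial, a brute-force check is easy to write down. A minimal sketch in Python, assuming a finite universe and a finite concept class; shatters and vc_dimension are illustrative names, not from the lecture:

from itertools import combinations

def shatters(C, S):
    """C: iterable of concepts (each a frozenset); S: a set of points.
    C shatters S iff every subset of S equals S & c for some concept c."""
    S = frozenset(S)
    achieved = {S & c for c in C}   # subsets of S realized by concepts in C
    return len(achieved) == 2 ** len(S)

def vc_dimension(C, U):
    """Largest |S|, S a subset of U, that C shatters (finite U only)."""
    for d in range(len(U), -1, -1):
        if any(shatters(C, S) for S in combinations(U, d)):
            return d
    return 0

# Example: threshold concepts {x : x >= t} on U = {1, 2, 3}.
U = [1, 2, 3]
C = [frozenset(x for x in U if x >= t) for t in range(1, 5)]
print(vc_dimension(C, U))   # 1: thresholds shatter only single points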
6
2-dimensional surface; C = {half planes}. [Figure: the x-y plane]
7
|S| = 1 can be shattered: S1 = {a}; subsets {a}, Ø. [Figure: a single point a in the x-y plane]
8
|S| = 2 can be shattered: S2 = {a, b}; subsets {a, b}, {a}, {b}, Ø. [Figure: points a, b in the x-y plane]
9
|S| = 3 can be shattered: S3 = {a, b, c}. [Figure: points a, b, c in the x-y plane]
11
|S| = 4 cannot be shattered: S4 = {a, b, c, d}. [Figure: points A, B, C, D in the x-y plane.] For example, no half plane can pick out just the two diagonally opposite points (the XOR labeling), so four points in convex position cannot be shattered.
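To see the four-point claim computationally, one can test every labeling of four points for half-plane separability via a small LP feasibility check: a labeling is realizable iff some w, b satisfy y_i (w . x_i + b) >= 1 for all i. A minimal sketch, assuming SciPy is available; half_plane_separable and the unit-square points are illustrative choices:

import numpy as np
from itertools import product
from scipy.optimize import linprog

def half_plane_separable(X, y):
    """X: (n, 2) points; y: labels in {+1, -1}. LP feasibility check."""
    n = len(X)
    # Variables: [w1, w2, b]. Constraints: -y_i*(w.x_i + b) <= -1.
    A_ub = -y[:, None] * np.hstack([X, np.ones((n, 1))])
    b_ub = -np.ones(n)
    res = linprog(c=np.zeros(3), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * 3)
    return res.success

# Four points in convex position (a unit square):
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
for labels in product([-1, 1], repeat=4):
    y = np.array(labels)
    if not half_plane_separable(X, y):
        print("not realizable:", labels)   # exactly the two XOR labelings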
12
A concept class C is learnable for all probability distributions and all concepts in C if and only if the VC-dimension of C is finite. If the VC-dimension of C is d, then… (next slide)
13
(a) For 0 < ε < 1 and sample size at least
max[ (4/ε) log(2/δ), (8d/ε) log(13/ε) ],
any consistent function A: S → C is a learning function for C.
(b) For 0 < ε < 1/2 and sample size less than
max[ ((1−ε)/ε) ln(1/δ), d(1 − 2(ε(1−δ) + δ)) ],
no function A: S → H, for any hypothesis space H, is a learning function for C.
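Part (a) gives a concrete sufficient sample size. A minimal sketch of computing that bound in Python; sufficient_sample_size is an illustrative name, and the base-2 logarithm is an assumption (the Blumer et al. bound is usually stated with log base 2):

from math import ceil, log

def sufficient_sample_size(eps, delta, d):
    """max[(4/eps) log2(2/delta), (8d/eps) log2(13/eps)] from part (a)."""
    m1 = (4 / eps) * log(2 / delta, 2)
    m2 = (8 * d / eps) * log(13 / eps, 2)
    return ceil(max(m1, m2))

print(sufficient_sample_size(eps=0.1, delta=0.05, d=3))  # half planes: d = 3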
14
Book
1. M. H. G. Anthony and N. Biggs, Computational Learning Theory, Cambridge Tracts in Theoretical Computer Science, 1997.
Papers
1. L. G. Valiant, A theory of the learnable, Communications of the ACM 27(11):1134-1142, 1984.
2. A. Blumer, A. Ehrenfeucht, D. Haussler, and M. Warmuth, Learnability and the VC-dimension, Journal of the ACM, 1989.
16
Biological Motivation [Figure: the brain]
17
3 layers: cerebrum, cerebellum, higher brain. [Figure: the brain with cerebrum, cerebellum, and higher brain labeled]
18
[Figure: hierarchy of needs, from bottom to top: food, rest (survival); achievement, recognition; contributing to humanity; search for meaning]
19
3 layers: cerebrum (crucial for survival), cerebellum, higher brain (responsible for higher needs).
20
Side areas: auditory information processing. Back of the brain: vision. There is a lot of resilience: the visual and auditory areas can do each other's job.
21
Left Brain and Right Brain Dichotomy [Figure: left brain vs. right brain]
22
Left brain: logic, reasoning, verbal ability. Right brain: emotion, creativity. In music, the words engage the left brain and the tune engages the right brain. There are maps in the brain: the limbs are mapped to brain areas.
23
Character Recognition [Figure: input neurons feeding an output grid]
24
Self-Organization, or the Kohonen network, fires a group of neurons instead of a single one. The group "somehow" produces a "picture" of the cluster. Fundamentally, SOM is competitive learning, but weight changes are applied over a neighborhood: find the winner neuron, then apply the weight change to the winner and its "neighbors" (a sketch of the winner-selection step follows).
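A minimal sketch of the winner-selection (competitive) step, assuming Euclidean distance between the input and each weight vector; find_winner is an illustrative name:

import numpy as np

def find_winner(weights, x):
    """weights: (p, n) array, one weight vector per output neuron;
    x: (n,) input. Returns the index of the closest weight vector."""
    distances = np.linalg.norm(weights - x, axis=1)
    return int(np.argmin(distances))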
25
[Figure: winner neuron at the center of concentric contours.] Neurons on the contour are the "neighborhood" neurons.
26
Weight change rule for SOM:
W(n+1) = W(n) + η(n) ( I(n) − W(n) )
applied to the winner neuron P and every neuron in its neighborhood P ± δ(n).
Neighborhood δ(n): a decreasing function of n.
Learning rate η(n): also a decreasing function of n, with 0 < η(n) < η(n−1) ≤ 1.
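Putting the rule together, here is a minimal training sketch. The 1-D output grid, the Euclidean winner selection, and the particular decay schedules for η(n) and δ(n) are all assumptions; the slide only requires that both decrease with n:

import numpy as np

def som_train(inputs, p, epochs=20, eta0=0.5, delta0=3, seed=0):
    """inputs: (m, n) data; p: output neurons on a 1-D grid.
    Returns the trained (p, n) weight matrix."""
    rng = np.random.default_rng(seed)
    W = rng.random((p, inputs.shape[1]))
    for epoch in range(epochs):
        eta = eta0 / (1 + epoch)                      # eta(n): decreasing
        delta = round(delta0 * (1 - epoch / epochs))  # delta(n): shrinking
        for x in inputs:
            # Competitive step: winner = closest weight vector.
            winner = int(np.argmin(np.linalg.norm(W - x, axis=1)))
            # Cooperative step: update winner and neighbors P +- delta(n).
            lo, hi = max(0, winner - delta), min(p, winner + delta + 1)
            W[lo:hi] += eta * (x - W[lo:hi])
    return W

After training, nearby inputs end up mapped to nearby neurons on the grid, which is the "picture" of the clusters mentioned above.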
27
Pictorially: [Figure: winner neuron with a neighborhood of radius δ(n)]. Convergence for Kohonen networks has not been proved except in the one-dimensional case.
28
[Figure: SOM architecture: n input neurons fully connected, with weights Wp, to an output layer of p neurons; inputs from clusters A, B, C map to different groups of output neurons.]