1 CS 182 Sections 101 - 102 Leon Barrett (http://www2.hi.net/s4/strangebreed.htm)

3 Announcements
a3 part 1 is due on Thursday (submit as a3-1)

4 Where we stand
Last Week
– Learning
– Backprop
This Week
– Color
– Cognitive linguistics

5 Back-Propagation Algorithm
We define the error term for a single node to be t_i − y_i, where t_i is the target.
[Figure: a node with net input x_i, activation f, inputs y_j weighted by w_ij, and output y_i]
x_i = ∑_j w_ij y_j
y_i = f(x_i)
Sigmoid: f(x) = 1 / (1 + e^−x)
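A minimal sketch of this forward computation in Python (illustrative only; the function names are mine, not from the lecture):

import math

def sigmoid(x):
    # f(x) = 1 / (1 + e^-x): the activation used throughout these slides
    return 1.0 / (1.0 + math.exp(-x))

def node_output(weights, inputs):
    # x_i = sum_j w_ij * y_j, then y_i = f(x_i)
    x_i = sum(w * y for w, y in zip(weights, inputs))
    return sigmoid(x_i)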

6 Gradient Descent
[Figure: error surface over two weights, axes i_1 and i_2; the global minimum is marked: this is your goal]
It should really be 4-D (3 weights), but you get the idea.
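The picture corresponds to the usual update w ← w − η · ∂E/∂w, stepping each weight downhill on the error surface. A sketch under that assumption (the gradient values themselves would come from backprop):

def gradient_descent_step(weights, grads, learning_rate=0.5):
    # Move each weight a small step against its error gradient:
    # w <- w - eta * dE/dw
    return [w - learning_rate * g for w, g in zip(weights, grads)]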

7 Equations of Backprop
The weight updates are shown on the following slides; the important equations are highlighted in green.
Note the momentum equation:
– dW(t) = change in weight at time t
– dW(t−1) = change in weight at time t−1
– so, using momentum:
– dW(t) = −learning_rate * −input * delta(i) + momentum * dW(t−1)
– the first part comes from the derivations on the slides below
– the second part is the momentum term
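As a sketch, the momentum equation in Python (the momentum coefficient 0.9 is an arbitrary illustrative value, not from the slide):

def momentum_update(delta_i, node_input, prev_dW, learning_rate=0.5, momentum=0.9):
    # dW(t) = -learning_rate * -input * delta(i) + momentum * dW(t-1)
    # The first term is the plain backprop step (the two minus signs
    # cancel); the second reuses the previous change dW(t-1) to smooth
    # the trajectory across updates.
    return -learning_rate * -node_input * delta_i + momentum * prev_dW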

8 The output layer
[Figure: units k → j → i connected by weights w_jk and w_ij]
E = Error = ½ ∑_i (t_i − y_i)²
t_i: target
The derivative of the sigmoid is just f′(x_i) = y_i (1 − y_i).
With learning rate η, the output-layer weight update is
Δw_ij = η (t_i − y_i) y_i (1 − y_i) y_j = η δ_i y_j
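The same rule as a hedged sketch in code (assuming the standard sigmoid-unit update reconstructed above):

def output_delta(t_i, y_i):
    # delta_i = (t_i - y_i) * f'(x_i), and for the sigmoid
    # f'(x_i) = y_i * (1 - y_i)
    return (t_i - y_i) * y_i * (1.0 - y_i)

def update_output_weight(w_ij, y_j, t_i, y_i, learning_rate=0.5):
    # w_ij <- w_ij + eta * delta_i * y_j
    return w_ij + learning_rate * output_delta(t_i, y_i) * y_j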

9 The hidden layer
[Figure: units k → j → i connected by weights w_jk and w_ij]
E = Error = ½ ∑_i (t_i − y_i)²
t_i: target
For a hidden unit j, the error signal arrives back through the output weights:
δ_j = (∑_i w_ij δ_i) y_j (1 − y_j), and Δw_jk = η δ_j y_k
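And the hidden-layer delta, again as a sketch:

def hidden_delta(y_j, output_deltas, outgoing_weights):
    # delta_j = (sum_i w_ij * delta_i) * y_j * (1 - y_j):
    # the output-layer deltas flow back through the weights w_ij
    back = sum(w * d for w, d in zip(outgoing_weights, output_deltas))
    return back * y_j * (1.0 - y_j)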

10 Let’s just do an example
A single sigmoid output unit y_0 has inputs i_1 and i_2, a bias input b = 1, and weights w_01 = 0.8, w_02 = 0.6, w_0b = 0.5. The target function:

i_1 i_2 | y_0
 1   1  |  1
 1   0  |  1
 0   1  |  1
 0   0  |  0

E = Error = ½ ∑_i (t_i − y_i)², so here E = ½ (t_0 − y_0)²
For the input i_1 = 0, i_2 = 0 (target t_0 = 0):
x_0 = 0.8·0 + 0.6·0 + 0.5·1 = 0.5
y_0 = 1/(1 + e^−0.5) = 0.6224
E = ½ (0 − 0.6224)² = 0.1937
Suppose the learning rate is α = 0.5; applying the output-layer rule to the bias weight gives w_0b = 0.4268.
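The slide’s arithmetic can be checked with a few lines of Python (a sketch; variable names mirror the slide):

import math

i1, i2, b = 0.0, 0.0, 1.0         # inputs and bias
t0 = 0.0                          # target for this row of the table
w01, w02, w0b = 0.8, 0.6, 0.5     # initial weights
alpha = 0.5                       # learning rate

x0 = w01 * i1 + w02 * i2 + w0b * b    # 0.5
y0 = 1.0 / (1.0 + math.exp(-x0))      # ~0.6224
E = 0.5 * (t0 - y0) ** 2              # ~0.1937
delta0 = (t0 - y0) * y0 * (1.0 - y0)  # ~-0.1463
w0b_new = w0b + alpha * delta0 * b    # ~0.4269, the slide's 0.4268 up to rounding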

11 Biological learning
1. What is Hebbian learning?
2. Where has it been observed?
3. What is wrong with Hebbian learning as a story of how animals learn?
– hint: it’s the opposite of what’s wrong with backprop

12 Hebb’s Rule: neurons that fire together wire together
Long Term Potentiation (LTP) is the biological basis of Hebb’s Rule.
Calcium channels are the key mechanism.
[Figure: LTP and Hebb’s Rule; conditions under which synapses strengthen or weaken]
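For contrast with backprop, a Hebbian update fits in one line (a sketch; the learning rate is an arbitrary choice):

def hebbian_update(w, pre, post, learning_rate=0.1):
    # Neurons that fire together wire together: the weight grows with
    # the product of pre- and post-synaptic activity. Note there is no
    # target or error term anywhere, which is exactly the limitation
    # the next slide illustrates.
    return w + learning_rate * pre * post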

13 Why is Hebb’s rule incomplete?
Here’s a contrived example: should you “punish” all the connections?
[Figure: a network linking the nodes taste bud, tastes rotten, eats food, gets sick, and drinks water]

14
During normal low-frequency transmission, glutamate interacts with NMDA, non-NMDA (AMPA), and metabotropic receptors.
With high-frequency stimulation, calcium comes in.

15 Recruitment learning
What is recruitment learning?
Why do we need it in our story?
How does it relate to triangle nodes?

16 Models of Learning
– Hebbian ~ coincidence
– Recruitment ~ one trial
– Supervised ~ correction (backprop)
– Reinforcement ~ delayed reward
– Unsupervised ~ similarity

