
Brain, Mind, and Computation. Part III: Cognitive Brain Networks. Brain-Mind-Behavior Seminar.




1

2 Brain, Mind, and Computation. Part III: Cognitive Brain Networks. Brain-Mind-Behavior Seminar, May 25, 2011. Byoung-Tak Zhang, Biointelligence Laboratory, Computer Science and Engineering & Brain Science, Cognitive Science, and Bioinformatics Programs & Brain-Mind-Behavior Concentration Program, Seoul National University. http://bi.snu.ac.kr/

3 Lecture Overview. Part I: Computational Brain (how does the brain encode and process information?). Part II: Brain-Inspired Computation (how can we build intelligent machines inspired by brain processes?). Part III: Cognitive Brain Networks (how do brain networks perform cognitive processing?).

4 Brain Networks

5 [Sporns et al., 2004] [Honey et al., 2010]

6

7

8

9 Language Processing and ERP. ERPs related to semantic and syntactic processing (from Gazzaniga et al., 2008).

10 Graphical Models

11 Machine Learning: Models
• Symbolic Learning: Version Space Learning, Case-Based Learning, Inductive Logic Programming
• Neural Learning: Multilayer Perceptrons, Self-Organizing Maps, Adaptive Resonance Theory, Independent Component Analysis
• Evolutionary Learning: Evolution Strategies, Evolutionary Programming, Genetic Algorithms, Genetic Programming, Molecular Programming
• Probabilistic Learning: Bayesian Networks, Markov Random Fields, Hidden Markov Models, Boltzmann Machines, Gaussian Mixture Models, Latent Variable Models, Generative Topographic Mapping, Self-Organizing Latent Lattice Models
• Other Methods: Decision Trees, Support Vector Machines (SVMs), Kernel Machines, Boosting Algorithms, Mixture of Experts, Reinforcement Learning (POMDP)

12 Probabilistic Graphical Models (PGMs)
• Undirected graphical models: Boltzmann Machines, Markov Random Fields, Factor Graphs, Hypernetworks
• Directed graphical models: Bayesian Networks, Hidden Markov Models, Latent Variable Models, Directed Hypernetworks

13 Recognition vs. Recall Memory. (Figure: the item "six" retrieved by recognition vs. by recall.)

14 Properties of PGMs
• Generative: can generate sample data from the model (cf. discriminative)
• Inference: can predict the values of any unobserved variables (cf. classification)
• Comprehensible: easy to understand the structure underlying the data (cf. perceptrons or kernel machines)
• Unsupervised (or self-supervised): can learn from unlabeled data, except for CRFs (cf. supervised)
• Multiple use: models the joint probability distribution of the variables (except for CRFs), so inputs (knowns) and outputs (unknowns) can be switched (cf. classification)

15 Compositionality: From Simple PGMs to Higher-Order PGMs. (Figure: three graphical models over the variables G, F, J, A, S: (1) Naïve Bayes, (2) Bayesian network, (3) higher-order PGM.)
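The figure itself cannot be recovered from the transcript, but the progression it illustrates can be sketched as factorizations over the slide's variables G, F, J, A, S. The naive Bayes and higher-order forms below follow standard definitions; the particular Bayesian-network parent sets and factor subsets are illustrative guesses, not the slide's actual graphs.

(1) Naive Bayes (all features conditionally independent given S):
P(G,F,J,A,S) = P(S)\,P(G \mid S)\,P(F \mid S)\,P(J \mid S)\,P(A \mid S)

(2) Bayesian network (a product of conditionals over assumed parent sets):
P(G,F,J,A,S) = P(G)\,P(F \mid G)\,P(J \mid G)\,P(A \mid F,J)\,P(S \mid A,J)

(3) Higher-order PGM (explicit factors over larger variable subsets):
P(G,F,J,A,S) \propto \psi_1(G,F,J)\,\psi_2(J,A,S)\,\psi_3(G,A,S)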

16 Hypernetworks. Reference: B.-T. Zhang, Hypernetworks: A molecular evolutionary architecture for cognitive learning and memory, IEEE Computational Intelligence Magazine, 3(3):49-63, 2008.

17 From Simple Graphs to Hypergraphs. A hypergraph G = (V, E) with V = {v1, v2, v3, …, v7} and E = {E1, E2, E3, E4, E5}, where E1 = {v1, v3, v4}, E2 = {v1, v4}, E3 = {v2, v3, v6}, E4 = {v3, v4, v6, v7}, E5 = {v4, v5, v7}.

18 Hypernetworks. A hypernetwork is a hypergraph with weighted edges. It is defined as a triple H = (V, E, W), where V = {v1, v2, …, vn}, E = {E1, E2, …, En}, and W = {w1, w2, …, wn}. An m-hypernetwork consists of a set V of vertices and a subset E of V^[m], i.e. H = (V, V^[m], W), where V^[m] is the set of subsets of V whose elements have precisely m members and W is the set of weights associated with the hyperedges. A hypernetwork H is said to be k-uniform if every edge Ei in E has cardinality k. A hypernetwork H is k-regular if every vertex has degree k. Remark: an ordinary graph is a 2-uniform hypergraph with wi = 1. [Zhang, DNA12-2006]
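As a concrete illustration, the hypergraph from the previous slide together with the definitions above can be written in a few lines of Python. This is a minimal sketch for illustration only (not code from the cited papers), and the weight values in W are made up.

# The hypergraph G = (V, E) from the previous slide, plus illustrative
# weights W turning it into a hypernetwork H = (V, E, W).
V = {"v1", "v2", "v3", "v4", "v5", "v6", "v7"}
E = {
    "E1": frozenset({"v1", "v3", "v4"}),
    "E2": frozenset({"v1", "v4"}),
    "E3": frozenset({"v2", "v3", "v6"}),
    "E4": frozenset({"v3", "v4", "v6", "v7"}),
    "E5": frozenset({"v4", "v5", "v7"}),
}
W = {"E1": 1.0, "E2": 0.5, "E3": 2.0, "E4": 1.5, "E5": 1.0}  # made-up weights

def degree(v):
    # Degree of a vertex = number of hyperedges that contain it.
    return sum(1 for edge in E.values() if v in edge)

def is_k_uniform(k):
    # Every hyperedge has exactly k vertices.
    return all(len(edge) == k for edge in E.values())

def is_k_regular(k):
    # Every vertex has degree exactly k.
    return all(degree(v) == k for v in V)

print(degree("v4"))      # 4: v4 lies in E1, E2, E4, E5
print(is_k_uniform(3))   # False: |E2| = 2 and |E4| = 4
print(is_k_regular(2))   # False: degrees range from 1 (v2, v5) to 4 (v4)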

19 Hypernetworks: Model Structure. (Figure: a hypernetwork over vertices x1-x15.) Vertices can represent molecules, genes, proteins, neurotransmitters, neurons, cell assemblies, or words; edges represent interactions: genetic, signaling, metabolic, synaptic, cortical, or cognitive. [Zhang, DNA-2006] [Zhang, FOCI-2007]

20 Hypernetwork as a Probabilistic Model of Distributed Parallel Associative Memory. (Figure: a hypernetwork over vertices x1-x15.) [Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]
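The model on this slide appears only as a figure, so the following is a hedged sketch of the usual energy-based form of this kind of model (in the spirit of [Zhang, IEEE CIM, 2008], though the exact notation there may differ): each stored hyperedge E_i with weight w_i contributes whenever its pattern matches the data vector x = (x1, ..., x15).

P(\mathbf{x} \mid W) = \frac{1}{Z(W)} \exp\!\Big( \sum_i w_i \, f_i(\mathbf{x}) \Big),
\qquad
Z(W) = \sum_{\mathbf{x}} \exp\!\Big( \sum_i w_i \, f_i(\mathbf{x}) \Big),

where f_i(x) = 1 if x agrees with hyperedge E_i on all of its vertices and 0 otherwise. Associative recall then amounts to conditioning: clamp the observed variables and read off the most probable values of the unobserved ones.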

21 Deriving the Learning Rule. [Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

22 Derivation of the Learning Rule.
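The derivation on these two slides is given only as figures. Under the exponential-family form sketched above, the standard maximum-likelihood calculation (an assumption consistent with that form, not a transcription of the slides) yields the usual data-minus-model update for a data set D:

\frac{\partial}{\partial w_i} \sum_{\mathbf{x} \in D} \log P(\mathbf{x} \mid W)
= |D| \big( \langle f_i \rangle_{\text{data}} - \langle f_i \rangle_{\text{model}} \big)
\quad\Longrightarrow\quad
\Delta w_i = \eta \big( \langle f_i \rangle_{\text{data}} - \langle f_i \rangle_{\text{model}} \big),

so a hyperedge's weight grows when it matches the training data more often than the current model expects, and shrinks otherwise.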

23 Features of Hypernetworks
• Higher-order terms: explicit representation → fast learning (cf. Bayesian networks)
• Structural learning: evolving complex networks → discovery of modules (cf. Markov random fields)
• Population coding: collection of modules → incremental learning (cf. numerical CPT)
• Compositionality: creation of new modules → symbolic computation (cf. connectionist models)
• Self-supervised: can learn from unlabeled data → no need for labeling (cf. supervised)
• Reconfigurable architecture: run-time self-assembly → anytime inference (cf. fixed architecture)

24 Difference from Bayesian Networks. (Figure: (1) a Bayesian network and (2) a hypernetwork over the variables G, F, J, A, S.) Hypernetworks explicitly represent higher-order terms; edges can be directed, undirected, or mixed; and graph structures (modules) are discovered automatically.

25 Difference from Markov Networks. In Markov networks, the joint distribution is written as a product of potential functions over the maximal cliques of the graph: P(x) = (1/Z) ∏_C ψ_C(x_C), with Boltzmann-distribution potentials ψ_C(x_C) = exp(−E(x_C)) defined by an energy function E, and partition function Z = Σ_x ∏_C ψ_C(x_C).
Similarity: hyperedges define potential functions (components) like cliques do, and the distribution is represented as a product of potential functions.
Difference: novel hyperedges are constructed from data (whereas cliques are given); both the model structure and the parameters are evolved (whereas cliques are fixed); and hyperedges can be ordered (cliques are not).

26 Learning (example). Four training items over x1-x15 with label y:
Item 1: x1 = x4 = x10 = x12 = 1, all other xi = 0, y = 1
Item 2: x2 = x3 = x9 = x14 = 1, all other xi = 0, y = 0
Item 3: x3 = x6 = x8 = x13 = 1, all other xi = 0, y = 1
Item 4: x8 = x11 = x15 = 1, all other xi = 0, y = 0
Hyperedges sampled from the items in successive rounds:
Round 1 (from item 1): {x1, x4, x10, y=1}, {x1, x4, x12, y=1}, {x4, x10, x12, y=1}
Round 2 (from item 2): {x2, x3, x9, y=0}, {x2, x3, x14, y=0}, {x3, x9, x14, y=0}
Round 3 (from item 3): {x3, x6, x8, y=1}, {x3, x6, x13, y=1}, {x6, x8, x13, y=1}
Round 4 (from item 4): {x8, x11, x15, y=0}
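A minimal Python sketch of how hyperedges like those in the rounds above could be sampled from a training item: pick a few of the item's active variables together with its label and add the resulting hyperedge to the library. The exact sampling strategy is an assumption for illustration, not the lab's implementation.

import random
from collections import Counter

def sample_hyperedges(item, k=3, n_samples=3):
    x, y = item                      # x: dict of variable name -> 0/1, y: class label
    active = [v for v, val in x.items() if val == 1]
    edges = []
    for _ in range(n_samples):
        picked = tuple(sorted(random.sample(active, min(k, len(active)))))
        edges.append((picked, y))
    return edges

# One of the slide's training items: x1 = x4 = x10 = x12 = 1, y = 1.
item1 = ({f"x{i}": int(i in {1, 4, 10, 12}) for i in range(1, 16)}, 1)

library = Counter()
for edge in sample_hyperedges(item1):
    library[edge] += 1               # repeated hyperedges accumulate weight
print(library)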

27 Evolving DNA Hypernetworks

28 Cognitive Hypernetworks: Application Examples

29 Digital Videos for Teaching Machines. Characteristics: multimodal (language, vision, audio); "situated" contexts; "naturalistic"; dynamic; "quasi-real"; continuous; educational.

30 A Language Game. "? still ? believe ? did this." → "I still can't believe you did this." "We ? ? a lot ? gifts." → "We don't have a lot of gifts."
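A hedged sketch of this game in Python: fill the '?' blanks in a query by matching word trigrams (order-3 hyperedges) harvested from a training corpus. The two-sentence corpus and the matching rule are illustrative only, not the lab's actual implementation.

from collections import Counter

corpus = ["i still can't believe you did this",
          "we don't have a lot of gifts"]

def trigrams(words):
    return [tuple(words[i:i + 3]) for i in range(len(words) - 2)]

memory = Counter(g for line in corpus for g in trigrams(line.split()))

def complete(query):
    words = query.split()
    changed = True
    while changed:                               # keep filling until no blank can be resolved
        changed = False
        for i, w in enumerate(words):
            if w != "?":
                continue
            best = None
            for g, count in memory.items():
                for j in range(3):               # the blank may sit at any slot of the trigram
                    lo = i - j
                    if lo < 0 or lo + 3 > len(words):
                        continue
                    window = words[lo:lo + 3]
                    fits = all(window[t] in (g[t], "?") for t in range(3))
                    if fits and (best is None or count > best[0]):
                        best = (count, g[j])
            if best is not None:
                words[i] = best[1]
                changed = True
    return " ".join(words)

print(complete("? still ? believe ? did this"))  # -> i still can't believe you did this
print(complete("we ? ? a lot ? gifts"))          # -> we don't have a lot of gifts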

31 Text Corpus: TV Drama Series. Friends, 24, House, Grey's Anatomy, Gilmore Girls, Sex and the City. 289,468 sentences (training data); 700 sentences with blanks (test data). Examples: "I don't know what happened." "Take a look at this." … "What ? ? ? here." "? have ? visit the ? room." …

32 Hypernetwork Memory of Language

33 Evolutionary Hypernets for Linguistic Memory
"Why ? you ? come ? down ?" → "Why are you go come on down here"
"? appreciate it if ? call her by ? ?" → "I appreciate it if you call her by the way"
"Would you ? to meet ? ? Tuesday ?" → "Would you nice to meet you in Tuesday and"
"? gonna ? upstairs ? ? a shower" → "I'm gonna go upstairs and take a shower"
"? have ? visit the ? room" → "I have to visit the ladies' room"
"? ? ? decision" → "to make a decision"
"? still ? believe ? did this" → "I still can't believe you did this"
References: Zhang and Park, Self-assembling hypernetworks for cognitive learning of linguistic memory, International Conf. on Cognitive Science (ICCS-2008), WASET, pp. 134-138, 2008. Zhang, Cognitive learning and the multimodal memory game: Toward human-level machine learning, IEEE World Congress on Computational Intelligence (WCCI-2008), 2008.

34 Memories for Friends and Prison Break. Sample sentences recalled for the keyword "mother", from the Friends corpus and the Prison Break corpus (shown as two columns on the original slide): you're mother killed herself / it's my mother was shot by a woman at eight / we're just gonna go to your mother that i love it / feeling that something's wrong with my mother and father / she's the single mother i put this on my friend's mother / apparently phoebe's mother killed herself / thanks for pleasing my mother killed herself / i'm your mother told you / this is an incredible mother / that's not his mother or his hunger strike / holy mother of god woman / i like your mother and father on their honeymoon suite with her and never called your mother really did like us / is my mother was shot by a drug dealer / tells his mother and his family / she's the mother of my eyes / speak to your mother used to be / tells his mother made it pretty clear on the floor has / speak to your mother never had life insurance / she's the mother of lincoln's child / she's the mother of my own crap to deal with you / just lost his mother is fine / just lost his mother and his god / tells his mother and his stepfather / she's the mother of my time / his mother made it clear you couldn't deliver fibonacci / she's the mother of my brother is facing the electric chair / same guy who was it your mother before you do it / they gunned my mother down

35 Learning Languages from Kids Videos. Goal: (1) natural language generation at the sentence level based on a probabilistic graphical model, and (2) natural language processing without explicit grammar rules.
• Training data: kids video scripts
• Sentence structure: converting sentences into graph structure
• Application: sentence completion and generation
Script sequence → generated sentence:
Timothy → "I like it too nora."
Hello Kitty → "I like it too mom."
Looney Toons → "I like it too this time you're a diving act today."
Dora → "I like it too this time you're a hug."

36 Generated Sentences and Evolved Grammar
Generated sentences:
• (Good) On my first day of school
• (Good) Yes timothy it is time to go to school
• (Good) Thomas and Percy enjoy working in the spotlight
• (Good) Well it is morning
• (Bad) He couldn't way to go outside and shoot
• (Bad) The gas house gorillas are a lot of fun players
Grammar rules analyzed from the generated sentences:
• G1: S = NP + VP
• G2: NP = PRP
• G3: S = VP
• G4: PP = IN + NP
• G5: NP = NN
• G6: NP = DP + NN
• G7: ADVP = RB
• G8: NP = NP + PP
• G9: SBAR = S

37 Cognitive Music Composition. (Figure: a hypernetwork library composed of hyperedges of various lengths; the hyperedges are weighted, with weights such as 0.7 to 1.6 shown in the figure; hyperedges are selected according to their weights; a continuous piece of music is generated from a given cue; weights are updated by comparison with the original song; and hyperedges are replaced as the library evolves.) [H.-W. Kim et al., FUZZ-IEEE 2009]

38 Experimental Results. Scores generated by evolutionary hypernetworks trained on American music (A), Scottish music (B), music by Korean singer Kim (C), and music by Korean singer Shin (D), each continuing the same cue (shown left of the bar line in the middle) taken from "Swanee River", the famous American folk song.

39 Toward Human-Level Machine Learning: Multimodal Memory Game (MMG). (Figure: image, sound, and text modalities; an Image-to-Text generator (I2T) and a Text-to-Image generator (T2I); text hints and image hints; and a machine learner playing the game.) Example dialogue text used in the game: "But, I'm getting married tomorrow. Well, maybe I am... I keep thinking about you. And I'm wondering if we made a mistake giving up so fast. Are you thinking about me? But if you are, call me tonight." [Zhang, IEEE CIM, 2008] [Zhang, AAAI SSS, 2009]
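A hedged sketch of one direction of the game loop in Python. The Learner class, the file names, and the scoring are hypothetical placeholders, not the lab's actual game or API; the point is only the loop: guess the item in the other modality, see the revealed answer, and strengthen that pairing.

from collections import defaultdict

class Learner:
    def __init__(self):
        self.assoc = defaultdict(float)            # (text, image) -> association strength

    def text_to_image(self, text, options):
        # T2I direction of the game; the I2T direction would be symmetric.
        return max(options, key=lambda img: self.assoc[(text, img)])

    def observe(self, text, image):
        self.assoc[(text, image)] += 1.0           # revealed answer strengthens the pair

pairs = [("scene_wedding.jpg", "but i'm getting married tomorrow"),
         ("scene_phone.jpg", "call me tonight")]   # hypothetical (image, text) pairs

learner, score = Learner(), 0
for _ in range(5):                                 # repeated rounds of play
    for image, text in pairs:
        options = [img for img, _ in pairs]
        if learner.text_to_image(text, options) == image:
            score += 1                             # the game keeps a score for the learner
        learner.observe(text, image)

print(score)                                       # recall improves after the first round
print(learner.text_to_image("call me tonight", [img for img, _ in pairs]))  # scene_phone.jpg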

40 Learning by Playing: learning the image from the given text. (Game screenshot: a text query, candidate images to click, and the current score.)

41 Learning by Playing: learning the text from the given image. (Game screenshot: an image query, candidate texts to click, and the current score.)

42 Image-to-Text Recall Examples. (Figure: query images, their matching and completion traces, and the recalled answers.) Answers: "I don't know what happened." "There's a kitty in my guitar case." "Maybe there's something I can do to make sure I get pregnant." Intermediate matching and completion fragments: "Maybe there's something there's something I … I get pregnant", "There's a a kitty in … in my guitar case", "I don't know don't know what know what happened".

43 Text-to-Image Recall Examples. (Figure: text queries, their matching and completion traces, and the recalled answer images.) Query texts: "I don't know what happened." "Take a look at this." "There's a kitty in my guitar case." "Maybe there's something I can do to make sure I get pregnant."

