Brain, Mind, and Computation
Part III: Cognitive Brain Networks
Brain-Mind-Behavior Seminar, May 25, 2011
Byoung-Tak Zhang
Biointelligence Laboratory, Computer Science and Engineering & Brain Science, Cognitive Science, and Bioinformatics Programs & Brain-Mind-Behavior Concentration Program, Seoul National University

Lecture Overview
Part I: Computational Brain. How does the brain encode and process information?
Part II: Brain-Inspired Computation. How can we build intelligent machines inspired by brain processes?
Part III: Cognitive Brain Networks. How do brain networks perform cognitive processing?

Brain Networks

(Figures from [Sporns et al., 2004] and [Honey et al., 2010].)

Language Processing and ERP
ERPs related to semantic and syntactic processing (figure from Gazzaniga et al., 2008).

Graphical Models

Machine Learning: Models
Symbolic Learning: Version Space Learning; Case-Based Learning; Inductive Logic Programming
Neural Learning: Multilayer Perceptrons; Self-Organizing Maps; Adaptive Resonance Theory; Independent Component Analysis
Evolutionary Learning: Evolution Strategies; Evolutionary Programming; Genetic Algorithms; Genetic Programming; Molecular Programming
Probabilistic Learning: Bayesian Networks; Markov Random Fields; Hidden Markov Models; Boltzmann Machines; Gaussian Mixture Models; Latent Variable Models; Generative Topographic Mapping; Self-Organizing Latent Lattice Models
Other Methods: Decision Trees; Support Vector Machines (SVMs); Kernel Machines; Boosting Algorithms; Mixture of Experts; Reinforcement Learning (POMDPs)

Probabilistic Graphical Models (PGMs)
Undirected: Boltzmann Machines, Markov Random Fields, Factor Graphs, Hypernetworks
Directed: Bayesian Networks, Hidden Markov Models, Latent Variable Models, Directed Hypernetworks

Recognition vs. Recall Memory
(Figure: the pattern "six" illustrating recognition versus recall.)

Properties of PGMs
- Generative: can generate sample data from the model (cf. discriminative models).
- Inference: can predict the values of any unobserved variables (cf. classification).
- Comprehensible: the structure underlying the data is easy to understand (cf. perceptrons or kernel machines).
- Unsupervised (or self-supervised): can learn from unlabeled data, except for CRFs (cf. supervised learning).
- Multiple use: models the joint probability distribution of the variables (except for CRFs), so inputs (knowns) and outputs (unknowns) can be switched (cf. classification); see the sketch below.
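The "multiple use" property is easy to demonstrate concretely. Below is a toy sketch (not from the slides; the joint-probability values are hypothetical) showing that once the joint distribution is modeled, either variable can serve as the query target:

```python
# Toy joint distribution over two binary variables (values hypothetical).
joint = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.20, (1, 1): 0.40}

def conditional(known, value):
    """P(other variable | known variable = value), computed from the joint."""
    idx = 0 if known == "x" else 1
    sliced = {xy[1 - idx]: p for xy, p in joint.items() if xy[idx] == value}
    z = sum(sliced.values())              # local normalizer
    return {v: p / z for v, p in sliced.items()}

print(conditional("x", 1))  # predict y from x (the "classification" direction)
print(conditional("y", 1))  # inputs and outputs switched: predict x from y
```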

Compositionality: From Simple PGMs to Higher-Order PGMs
(Figure: three models over the variables G, F, J, A, S: (1) Naïve Bayes, (2) Bayesian network, (3) higher-order PGM.)
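Written out as factorizations, the progression looks roughly as follows. The exact edges drawn in the slide's figure are an assumption; only the three model families are given by the deck:

```latex
% (1) Naive Bayes: every feature conditionally independent given S.
P(S,G,F,J,A) = P(S)\,P(G \mid S)\,P(F \mid S)\,P(J \mid S)\,P(A \mid S)

% (2) Bayesian network: a sparser directed factorization (edges illustrative).
P(S,G,F,J,A) = P(S)\,P(G \mid S)\,P(F \mid G, S)\,P(J \mid S)\,P(A \mid J)

% (3) Higher-order PGM: explicit interaction terms over variable subsets.
P(S,G,F,J,A) = \frac{1}{Z}\exp\!\Big(\sum_{k}\;\sum_{i_1 < \dots < i_k}
    w_{i_1 \dots i_k}\, x_{i_1} \cdots x_{i_k}\Big)
```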

Hypernetworks
Hypernetworks: A molecular evolutionary architecture for cognitive learning and memory, B.-T. Zhang, IEEE Computational Intelligence Magazine, 3(3):49-63, 2008.

From Simple Graphs to Hypergraphs
G = (V, E)
V = {v1, v2, v3, ..., v7}
E = {E1, E2, E3, E4, E5}
E1 = {v1, v3, v4}; E2 = {v1, v4}; E3 = {v2, v3, v6}; E4 = {v3, v4, v6, v7}; E5 = {v4, v5, v7}
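As a data structure, the slide's example hypergraph can be encoded directly. A minimal Python sketch (not from the original deck):

```python
# The slide's example hypergraph G = (V, E), encoded directly.
# A hyperedge is an arbitrary-size vertex subset, unlike an ordinary edge.
V = {f"v{i}" for i in range(1, 8)}

E = {
    "E1": frozenset({"v1", "v3", "v4"}),
    "E2": frozenset({"v1", "v4"}),
    "E3": frozenset({"v2", "v3", "v6"}),
    "E4": frozenset({"v3", "v4", "v6", "v7"}),
    "E5": frozenset({"v4", "v5", "v7"}),
}

# Degree of a vertex = number of hyperedges containing it.
degree = {v: sum(v in e for e in E.values()) for v in sorted(V)}
print(degree["v4"])   # 4: v4 appears in E1, E2, E4, E5

# A hypergraph is k-uniform if every hyperedge has cardinality k;
# this one is not (edge sizes range from 2 to 4).
print({name: len(e) for name, e in E.items()})
```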

Hypernetworks
A hypernetwork is a hypergraph with weighted edges. It is defined as a triple H = (V, E, W), where V = {v_1, v_2, ..., v_n}, E = {E_1, E_2, ..., E_n}, and W = {w_1, w_2, ..., w_n}. An m-hypernetwork consists of a set V of vertices and a subset E of V^[m], i.e. H = (V, V^[m], W), where V^[m] is the set of subsets of V whose elements have precisely m members and W is the set of weights associated with the hyperedges. A hypernetwork H is said to be k-uniform if every edge E_i in E has cardinality k. A hypernetwork H is k-regular if every vertex has degree k. Remark: an ordinary graph is a 2-uniform hypergraph with w_i = 1. [Zhang, DNA-2006]

Hypernetworks: Model Structure
(Figure: a hypernetwork over variables x1, ..., x15.)
Vertex: molecule, gene, protein, neurotransmitter, neuron, cell assembly, word.
Edge (interaction): genetic, signaling, metabolic, synaptic, cortical, cognitive.
[Zhang, DNA-2006] [Zhang, FOCI-2007]

Hypernetwork as a Probabilistic Model of Distributed Parallel Associative Memory
(Figure: a hypernetwork over variables x1, ..., x15.)
[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Deriving the Learning Rule
[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Derivation of the Learning Rule
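The equations on these two slides appear as figures in the original deck and did not survive transcription. As a hedged reconstruction (the notation is an assumption, chosen to be consistent with the Boltzmann distribution referenced on the Markov-network slide below): the hypernetwork defines an energy-based distribution over hyperedge match indicators, and maximizing the log-likelihood yields the familiar data-minus-model gradient:

```latex
% Hedged reconstruction, not a transcription of the lost equations.
P(\mathbf{x} \mid W) = \frac{1}{Z(W)}\exp\!\Big(\sum_i w_i\,\phi_i(\mathbf{x})\Big),
\qquad
\phi_i(\mathbf{x}) = \prod_{v \in E_i} x_v
\quad \text{(hyperedge } E_i \text{ matches } \mathbf{x}\text{)}

\frac{\partial}{\partial w_i}\log P(D \mid W)
 = \langle \phi_i \rangle_{\mathrm{data}} - \langle \phi_i \rangle_{\mathrm{model}}
\;\;\Longrightarrow\;\;
\Delta w_i \propto \langle \phi_i \rangle_{\mathrm{data}}
             - \langle \phi_i \rangle_{\mathrm{model}}
```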

Features of Hypernetworks
- Higher-order terms: explicit representation enables fast learning (cf. Bayesian networks).
- Structural learning: evolving complex networks enables discovery of modules (cf. Markov random fields).
- Population coding: a collection of modules enables incremental learning (cf. numerical CPTs).
- Compositionality: creation of new modules enables symbolic computation (cf. connectionist models).
- Self-supervised: can learn from unlabeled data, with no need for labeling (cf. supervised learning).
- Reconfigurable architecture: run-time self-assembly enables anytime inference (cf. fixed architectures).

Difference from Bayesian Networks
(Figure: (1) a Bayesian network vs. (2) a hypernetwork over the variables G, F, J, A, S.)
- Hypernetworks explicitly represent higher-order terms.
- Edges can be directed, undirected, or mixed.
- Graph structures (modules) are automatically discovered.

Difference from Markov Networks
In Markov networks, the joint distribution is written as a product of potential functions over the maximal cliques of the graph (a Boltzmann distribution):

P(\mathbf{x}) = \frac{1}{Z} \prod_{C} \psi_C(\mathbf{x}_C),
\qquad
\psi_C(\mathbf{x}_C) = \exp\{-E(\mathbf{x}_C)\},
\qquad
Z = \sum_{\mathbf{x}} \prod_{C} \psi_C(\mathbf{x}_C)

where Z is the partition function, \psi_C a potential function, and E the energy function.

Similarity:
- Hyperedges define potential functions (components), like cliques.
- The distribution is represented as a product of potential functions.
Difference:
- Novel hyperedges are constructed from data (cliques are given).
- Both model structures and parameters are evolved (cliques are fixed).
- Hyperedges can be ordered (cliques are not).
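A tiny worked instance of the product-of-potentials form above, with hypothetical clique potentials and the partition function computed by brute-force enumeration:

```python
# Two maximal cliques over three binary variables: C1={x1,x2}, C2={x2,x3}.
# Potentials are exponentials of (negated) clique energies; values are toy.
from itertools import product
from math import exp

def psi1(x1, x2): return exp(-(0.5 if x1 != x2 else 0.0))
def psi2(x2, x3): return exp(-(0.5 if x2 != x3 else 0.0))

# Partition function: sum of the unnormalized product over all states.
Z = sum(psi1(x1, x2) * psi2(x2, x3)
        for x1, x2, x3 in product([0, 1], repeat=3))

def p(x1, x2, x3):
    return psi1(x1, x2) * psi2(x2, x3) / Z

# The probabilities sum to 1 once divided by Z.
print(p(1, 1, 1), sum(p(*x) for x in product([0, 1], repeat=3)))
```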

Learning
Four training items over 15 binary variables (x1, ..., x15) with labels:
1. x1=1, x4=1, x10=1, x12=1 (all others 0), y=1
2. x2=1, x3=1, x9=1, x14=1, y=0
3. x3=1, x6=1, x8=1, x13=1, y=1
4. x8=1, x11=1, x15=1, y=0
Over three rounds, 3-variable hyperedges are sampled from each item and added to the library, e.g.:
{x1, x4, x10 | y=1}, {x1, x4, x12 | y=1}, {x4, x10, x12 | y=1},
{x2, x3, x9 | y=0}, {x2, x3, x14 | y=0}, {x3, x9, x14 | y=0},
{x3, x6, x8 | y=1}, {x3, x6, x13 | y=1}, {x6, x8, x13 | y=1},
{x8, x11, x15 | y=0}
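A minimal sketch of the sampling loop this slide depicts, under stated assumptions (uniform random sampling of order-3 hyperedges, sampling counts used as weights; none of this code is from the original system):

```python
# Round-based hyperedge sampling: each round draws k-variable hyperedges
# from every training example, tagged with that example's label.
import random
from collections import Counter

# The four training items from the slide, abbreviated to active variables.
data = [
    ({"x1", "x4", "x10", "x12"}, 1),
    ({"x2", "x3", "x9", "x14"}, 0),
    ({"x3", "x6", "x8", "x13"}, 1),
    ({"x8", "x11", "x15"}, 0),
]

library = Counter()  # (hyperedge, label) -> weight (here, a sampling count)

def learn(rounds, k=3, samples_per_item=3):
    for _ in range(rounds):
        for active, label in data:
            pool = sorted(active)
            for _ in range(samples_per_item):
                if len(pool) >= k:
                    edge = frozenset(random.sample(pool, k))
                    library[(edge, label)] += 1

learn(rounds=3)
for (edge, label), w in library.most_common(5):
    print(sorted(edge), f"y={label}", f"w={w}")
```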

Evolving DNA Hypernetworks

Cognitive Hypernetworks: Application Examples

Digital Videos for Teaching Machines
- Multimodal: language, vision, audio
- "Situated" contexts
- "Naturalistic": dynamic
- "Quasireal": continuous
- Educational

A Language Game
? still ? believe ? did this. → I still can't believe you did this.
We ? ? a lot ? gifts. → We don't have a lot of gifts.
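One way such blank-filling can work with hyperedge memories is sketched below (a toy reconstruction, not the authors' code): trigram hyperedges harvested from a small corpus vote on each "?" token.

```python
# Trigram hyperedges vote on blanks given their left and right neighbors.
from collections import Counter, defaultdict

corpus = [
    "i still can't believe you did this",
    "we don't have a lot of gifts",
]

# Library: (left word, right word) -> counts of the middle word.
middle = defaultdict(Counter)
for sent in corpus:
    w = ["<s>"] + sent.split() + ["</s>"]   # sentence-boundary markers
    for a, b, c in zip(w, w[1:], w[2:]):
        middle[(a, c)][b] += 1

def complete(query):
    w = ["<s>"] + query.split() + ["</s>"]
    for _ in range(len(w)):                  # repeated passes handle adjacent blanks
        for i in range(1, len(w) - 1):
            if w[i] == "?" and "?" not in (w[i - 1], w[i + 1]):
                votes = middle[(w[i - 1], w[i + 1])]
                if votes:
                    w[i] = votes.most_common(1)[0][0]
    return " ".join(w[1:-1])

print(complete("? still ? believe ? did this"))
# -> "i still can't believe you did this" (given this toy corpus)
```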

Text Corpus: TV Drama Series
Friends, 24, House, Grey's Anatomy, Gilmore Girls, Sex and the City
289,468 sentences (training data); 700 sentences with blanks (test data)
Training examples: "I don't know what happened." "Take a look at this." ...
Test examples: "What ? ? ? here." "? have ? visit the ? room." ...

Hypernetwork Memory of Language

Evolutionary Hypernets for Linguistic Memory
Why ? you ? come ? down ? → Why are you go come on down here
? appreciate it if ? call her by ? ? → I appreciate it if you call her by the way
Would you ? to meet ? ? Tuesday ? → Would you nice to meet you in Tuesday and
? gonna ? upstairs ? ? a shower → I'm gonna go upstairs and take a shower
? have ? visit the ? room → I have to visit the ladies' room
? ? ? decision → to make a decision
? still ? believe ? did this → I still can't believe you did this

Zhang and Park, Self-assembling hypernetworks for cognitive learning of linguistic memory, Int. Conf. on Cognitive Science (ICCS-2008), WASET, 2008.
Zhang, Cognitive learning and the multimodal memory game: Toward human-level machine learning, IEEE World Congress on Computational Intelligence (WCCI-2008), 2008.

Memories for Friends and Prison Break
Sentence completions generated for the keyword "mother" from hypernetwork memories trained on each corpus (representative examples; the slide's two-column layout pairs each corpus with its own completions):
Corpus: Friends. "apparently phoebe's mother killed herself"; "we're just gonna go to your mother"; "thanks for pleasing my mother"; "i like your mother and father on their honeymoon"; "she's the single mother"
Corpus: Prison Break. "my mother was shot by a woman at eight"; "she's the mother of lincoln's child"; "his mother made it clear you couldn't deliver fibonacci"; "my brother is facing the electric chair"; "they gunned my mother down"

Learning Languages from Kids Video
Goal: (1) natural language generation at the sentence level based on a probabilistic graphical model, and (2) natural language processing without explicit grammar rules.
- Training data: kids' video scripts
- Sentence structure: sentences converted into graph structure
- Application: sentence completion and generation

Script source → generated sentence:
- Timothy: "I like it too nora."
- Hello Kitty: "I like it too mom."
- Looney Toons: "I like it too this time you're a diving act today."
- Dora: "I like it too this time you're a hug."

Generated Sentences and Evolved Grammar
Generated sentences:
- (Good) On my first day of school
- (Good) Yes timothy it is time to go to school
- (Good) Thomas and Percy enjoy working in the spotlight
- (Good) Well it is morning
- (Bad) He couldn't way to go outside and shoot
- (Bad) the gas house gorillas are a lot of fun players
Grammar rules analyzed from the generated sentences:
- G1: S = NP + VP, G2: NP = PRP
- G3: S = VP, G4: PP = IN + NP
- G5: NP = NN, G6: NP = DP + NN
- G7: ADVP = RB, G8: NP = NP + PP
- G9: SBAR = S

Cognitive Music Composition [H.-W. Kim et al., FUZZ-IEEE 2009]
- A hypernetwork library is composed of weighted hyperedges of various lengths (melodic fragments).
- Given a cue, a continuous melody is generated by selecting hyperedges according to their weights.
- Hyperedges are replaced (the library evolves): weights are updated by comparison with the original song.
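A minimal sketch of this generate-and-reinforce loop (the note library and all names are hypothetical; the actual system in [H.-W. Kim et al., FUZZ-IEEE 2009] is more elaborate):

```python
# Weighted, variable-length hyperedges over note sequences (length >= 2).
library = {
    ("C", "D", "E"): 1.0,
    ("D", "E", "F", "G"): 1.0,
    ("E", "G"): 1.0,
}

def continue_melody(cue, steps):
    melody = list(cue)
    for _ in range(steps):
        # Candidate edges whose prefix matches the melody's tail.
        cands = [(w, e) for e, w in library.items()
                 if tuple(melody[-(len(e) - 1):]) == e[:-1]]
        if not cands:
            break
        w, edge = max(cands)          # weight-based selection
        melody.append(edge[-1])
    return melody

def reinforce(generated, original, lr=0.1):
    # Evolve the library: reward edges consistent with the original song.
    for e in library:
        hit = any(tuple(original[i:i + len(e)]) == e
                  for i in range(len(original) - len(e) + 1))
        library[e] += lr if hit else -lr

melody = continue_melody(["C", "D"], steps=3)   # e.g. ['C', 'D', 'E', 'G']
reinforce(melody, original=["C", "D", "E", "F", "G"])
print(melody, library)
```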

Experimental Results
Scores generated by evolutionary hypernetworks trained on American (A), Scottish (B), Korean singer Kim (C), and Korean singer Shin (D) songs, each continuing the same cue (left of the bar line in the middle) taken from "Swanee River", the famous American folk song.

Toward Human-Level Machine Learning: Multimodal Memory Game (MMG)
(Figure: image, sound, and text streams feed a machine learner through two generators; hints in either modality can be provided.)
- Image-to-Text Generator (I2T): recalls text from an image query.
- Text-to-Image Generator (T2I): recalls an image from a text query.
Example text: "But, I'm getting married tomorrow. Well, maybe I am... I keep thinking about you. And I'm wondering if we made a mistake giving up so fast. Are you thinking about me? But if you are, call me tonight."
[Zhang, IEEE CIM, 2008] [Zhang, AAAI SSS, 2009]

LEARNING BY PLAYING
Learning the image from the given text: the player sees a text query and clicks the right image option. (Score: 01)

LEARNING BY PLAYING
Learning the text from the given image: the player sees an image query and clicks the right text option. (Score: 02)

Image-to-Text Recall Examples
Query (image) → matching & completion → answer:
- "I don't know don't know what know what happened" → I don't know what happened
- "There's a a kitty in ... in my guitar case" → There's a kitty in my guitar case
- "Maybe there's something there's something I ... I get pregnant" → Maybe there's something I can do to make sure I get pregnant

Text-to-Image Recall Examples
Text queries matched and completed to retrieve images:
- I don't know what happened
- Take a look at this
- There's a kitty in my guitar case
- Maybe there's something I can do to make sure I get pregnant