Brain, Mind, and Computation Part III: Cognitive Brain Networks (Brain-Mind-Behavior Seminar)


Brain, Mind, and Computation Part III: Cognitive Brain Networks
Brain-Mind-Behavior Seminar, May 16, 2012
Byoung-Tak Zhang
Biointelligence Laboratory
Computer Science and Engineering & Brain Science, Cognitive Science, and Bioinformatics Programs & Brain-Mind-Behavior Concentration Program
Seoul National University

Lecture Overview
- Part I: Computational Brain. How does the brain encode and process information?
- Part II: Brain-Inspired Computation. How can we build intelligent machines inspired by brain processes?
- Part III: Cognitive Brain Networks. How do brain networks perform cognitive processing?

Brain Networks

(Figures from [Sporns et al., 2004] and [Honey et al., 2010])

Language Processing and ERP
ERPs related to semantic and syntactic processing (from Gazzaniga et al., 2008)

Graphical Models

Machine Learning: Models
- Symbolic Learning: Version Space Learning, Case-Based Learning, Inductive Logic Programming
- Neural Learning: Multilayer Perceptrons, Self-Organizing Maps, Adaptive Resonance Theory, Independent Component Analysis
- Evolutionary Learning: Evolution Strategies, Evolutionary Programming, Genetic Algorithms, Genetic Programming, Molecular Programming
- Probabilistic Learning: Bayesian Networks, Markov Random Fields, Hidden Markov Models, Boltzmann Machines, Gaussian Mixture Models, Latent Variable Models, Generative Topographic Mapping, Self-organizing Latent Lattice Models
- Other Methods: Decision Trees, Support Vector Machines (SVMs), Kernel Machines, Boosting Algorithms, Mixture of Experts, Reinforcement Learning (POMDP)

Probabilistic Graphical Models (PGMs)
- Undirected: Boltzmann Machines, Markov Random Fields, Factor Graphs, Hypernetworks
- Directed: Bayesian Networks, Hidden Markov Models, Latent Variable Models, Directed Hypernetworks

Recognition vs. Recall Memory
(Figure: recognition vs. recall of the pattern "six")

Properties of PGMs
- Generative: can generate sample data from the model (cf. discriminative)
- Inference: can predict the values of any unobserved variables (cf. classification)
- Comprehensible: easy to understand the structure underlying the data (cf. perceptrons or kernel machines)
- Unsupervised (or self-supervised): can learn from unlabeled data, except for CRFs (cf. supervised)
- Multiple use: models joint probability distributions of variables (except for CRFs); inputs (knowns) and outputs (unknowns) can be switched (cf. classification; see the sketch below)
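The "multiple use" property is easy to see concretely: a single joint distribution answers queries in either direction. A minimal sketch in Python, with hypothetical probabilities:

```python
# Illustrative sketch (hypothetical numbers): one joint distribution P(x, y)
# supports both directions of inference, as the "multiple use" property says.

# Joint probability table over two binary variables.
joint = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.20, (1, 1): 0.40,
}

def conditional(joint, fix_index, fix_value):
    """Condition the joint on one variable and normalize over the other."""
    slice_ = {k: p for k, p in joint.items() if k[fix_index] == fix_value}
    z = sum(slice_.values())
    return {k: p / z for k, p in slice_.items()}

# Predict y from x ("forward") and x from y ("backward") from the same model.
print(conditional(joint, fix_index=0, fix_value=1))  # P(y | x=1)
print(conditional(joint, fix_index=1, fix_value=1))  # P(x | y=1)
```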

Motivation: Cognitive Learning and Memory
- Continuity. Learning is a continuous, lifelong process. "The experiences of each immediately past moment are memories that merge with current momentary experiences to create the impression of seamless continuity in our lives" [McGaugh, 2003]
- Glocality. "Perception is dependent on context," and it is important to maintain both global and local, i.e. glocal, representations [Peterson and Rhodes, 2003]
- Compositionality. "The brain activates existing metaphorical structures to form a conceptual blend, consisting of all the metaphors linked together" [Feldman, 2006]

Compositionality: From Simple PGMs to Higher-Order PGMs
(Figure: the same five variables G, F, J, A, S modeled as (1) a Naive Bayes model, (2) a Bayesian network, and (3) a high-order PGM)
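For reference, the standard factorizations behind the three model classes in the figure (textbook forms, assumed rather than taken from the slide):

```latex
\begin{align*}
  \text{(1) Naive Bayes:} \quad
    & P(c, x_1, \dots, x_n) = P(c) \prod_{i=1}^{n} P(x_i \mid c) \\
  \text{(2) Bayesian network:} \quad
    & P(x_1, \dots, x_n) = \prod_{i=1}^{n} P\big(x_i \mid \mathrm{pa}(x_i)\big) \\
  \text{(3) Higher-order PGM:} \quad
    & P(\mathbf{x}) = \frac{1}{Z} \prod_{k} \psi_k(\mathbf{x}_{S_k})
      \quad \text{with factors over variable subsets } S_k \text{ of any order}
\end{align*}
```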

Hypernetworks
Zhang, B.-T., "Hypernetworks: A molecular evolutionary architecture for cognitive learning and memory," IEEE Computational Intelligence Magazine, 3(3):49-63, 2008.

From Simple Graphs to Hypergraphs
G = (V, E)
V = {v1, v2, v3, …, v7}
E = {E1, E2, E3, E4, E5}
E1 = {v1, v3, v4}
E2 = {v1, v4}
E3 = {v2, v3, v6}
E4 = {v3, v4, v6, v7}
E5 = {v4, v5, v7}

Hypernetworks
A hypernetwork is a hypergraph of weighted edges. It is defined as a triple H = (V, E, W), where V = {v1, v2, …, vn}, E = {E1, E2, …, En}, and W = {w1, w2, …, wn}.
An m-hypernetwork consists of a set V of vertices and a subset E of V^[m], i.e. H = (V, V^[m], W), where V^[m] is a set of subsets of V whose elements have precisely m members and W is the set of weights associated with the hyperedges.
A hypernetwork H is said to be k-uniform if every edge Ei in E has cardinality k. A hypernetwork H is k-regular if every vertex has degree k.
Remark: an ordinary graph is a 2-uniform hypergraph with wi = 1. [Zhang, DNA]
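A minimal data-structure sketch of the triple H = (V, E, W), built from the hypergraph example on the previous slide; the weight values are hypothetical, since the slides define only the structure:

```python
# A minimal sketch of the hypernetwork triple H = (V, E, W) defined above,
# using the vertices and hyperedges E1..E5 from the previous slide.
# The weights are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class Hypernetwork:
    vertices: set
    hyperedges: list   # each hyperedge is a frozenset of vertices
    weights: list      # weights[i] is the weight of hyperedges[i]

    def is_k_uniform(self, k: int) -> bool:
        """True if every hyperedge has cardinality k."""
        return all(len(e) == k for e in self.hyperedges)

    def degree(self, v) -> int:
        """Number of hyperedges containing vertex v."""
        return sum(v in e for e in self.hyperedges)

H = Hypernetwork(
    vertices={f"v{i}" for i in range(1, 8)},
    hyperedges=[
        frozenset({"v1", "v3", "v4"}),        # E1
        frozenset({"v1", "v4"}),              # E2
        frozenset({"v2", "v3", "v6"}),        # E3
        frozenset({"v3", "v4", "v6", "v7"}),  # E4
        frozenset({"v4", "v5", "v7"}),        # E5
    ],
    weights=[1.0, 1.0, 1.0, 1.0, 1.0],        # hypothetical weights
)

print(H.is_k_uniform(3))  # False: the edges have mixed cardinality
print(H.degree("v4"))     # 4: v4 appears in E1, E2, E4, E5
```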

Hypernetworks: Model Structure
(Figure: a hypernetwork over variables x1 through x15)
Vertex: molecule, gene, protein, neurotransmitter, neuron, cell assembly, word
Edge (interaction): genetic, signaling, metabolic, synaptic, cortical, cognitive
[Zhang, DNA-2006] [Zhang, FOCI-2007]

Biomolecular Networks and Hypernets

Cortical Networks (Brain Networks)

Hypernetwork as a Probabilistic Distributed Associative Memory
(Figure: hyperedges over variables x1 through x15)
[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Population Coding
The average population activity A(t) of neurons.
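The formula for A(t) did not survive the transcript; a standard definition of population activity (e.g., Gerstner and Kistler), offered here as an assumption about what the slide showed:

```latex
% Standard definition of population activity; an assumed reconstruction,
% not necessarily the slide's exact formula.
\begin{equation}
  A(t) \;=\; \lim_{\Delta t \to 0} \frac{1}{\Delta t}\,
             \frac{n_{\mathrm{act}}(t;\, t + \Delta t)}{N}
\end{equation}
% N: number of neurons in the population;
% n_act(t; t+Δt): number of spikes the population fires in (t, t+Δt].
```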

Hypernetwork as a Probabilistic Model of Distributed Parallel Associative Memory
(Figure: hyperedges over variables x1 through x15)
[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Deriving the Learning Rule
[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Derivation of the Learning Rule
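The equations of the derivation were lost with the slide graphics. As a sketch, assuming the model takes the Boltzmann form cited on the "Difference from Markov Networks" slide below, the maximum-likelihood learning rule would follow the usual data-minus-model pattern:

```latex
% A sketch of a maximum-likelihood derivation for a Boltzmann-form model;
% this is an assumed reconstruction, not necessarily Zhang's exact notation.
\begin{align}
  P(\mathbf{x} \mid W)
    &= \frac{1}{Z(W)} \exp\!\Big( \sum_i w_i\, f_i(\mathbf{x}) \Big),
    \qquad Z(W) = \sum_{\mathbf{x}} \exp\!\Big( \sum_i w_i\, f_i(\mathbf{x}) \Big) \\
  \frac{\partial}{\partial w_i} \log P(D \mid W)
    &= \sum_{\mathbf{x} \in D} \Big(
         f_i(\mathbf{x}) \;-\; \mathbb{E}_{P(\mathbf{x} \mid W)}\big[ f_i(\mathbf{x}) \big]
       \Big)
\end{align}
% f_i(x): indicator that hyperedge i matches x. Raising w_i whenever the
% data expectation exceeds the model expectation climbs the likelihood,
% as in Boltzmann machine learning.
```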

Features of Hypernetworks
- Higher-order terms: explicit representation → fast learning (cf. Bayesian networks)
- Structural learning: evolving complex networks → discovery of modules (cf. Markov random fields)
- Population coding: collection of modules → incremental learning (cf. numerical CPT)
- Compositionality: creation of new modules → symbolic computation (cf. connectionist models)
- Self-supervised: can learn from unlabeled data → no need for labeling (cf. supervised)
- Reconfigurable architecture: run-time self-assembly → anytime inference (cf. fixed architecture)

Difference from Bayesian Networks
(Figure: the same five variables G, F, J, A, S as (1) a Bayesian network and (2) a hypernetwork)
- Hypernetworks explicitly represent higher-order terms
- Edges can be directed, undirected, or mixed
- Graph structures (modules) are automatically discovered

Difference from Markov Networks
In Markov networks, the joint distribution is written as a product of potential functions over the maximal cliques of the graph, a Boltzmann distribution:

$$P(\mathbf{x}) = \frac{1}{Z} \prod_{C} \psi_C(\mathbf{x}_C), \qquad \psi_C(\mathbf{x}_C) = \exp\{-E_C(\mathbf{x}_C)\}, \qquad Z = \sum_{\mathbf{x}} \prod_{C} \psi_C(\mathbf{x}_C)$$

with partition function Z, potential functions ψ_C, and energy functions E_C.
Similarity:
- Hyperedges define potential functions (components) like cliques
- The distribution is represented as a product of potential functions
Difference:
- Novel hyperedges are constructed from data (in Markov networks, cliques are given)
- Both model structures and parameters are evolved (cliques are fixed)
- Hyperedges can be ordered (cliques are not)

Learning
(Figure: four training items, each a 15-bit vector x1..x15 with a label y. Over three learning rounds, order-3 hyperedges are sampled from each item's active variables together with its label, e.g. {x1, x4, x10, y=1}, {x1, x4, x12, y=1}, and {x4, x10, x12, y=1} from the first item, and {x8, x11, x15, y=0} from the fourth.)
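A minimal sketch of the sampling step the figure illustrates: drawing random order-3 hyperedges (variable-value pairs plus the class label) from one training item into the library. The sample counts and function names are illustrative, not taken from the slide:

```python
# Sketch of hyperedge sampling during hypernetwork learning: from each
# labeled 15-bit example, draw random order-k hyperedges into the library.
# Parameter choices here are illustrative assumptions.

import random

def sample_hyperedges(example, label, k=3, n_samples=3, rng=random):
    """Draw n_samples random k-subsets of (index, value) pairs from one example."""
    indices = list(range(len(example)))
    edges = []
    for _ in range(n_samples):
        chosen = sorted(rng.sample(indices, k))
        edge = tuple((i, example[i]) for i in chosen) + (("y", label),)
        edges.append(edge)
    return edges

# The first of the four data items in the figure (x1..x15, y = 1).
x = [1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0]
library = sample_hyperedges(x, label=1)
for e in library:
    print(e)
```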

Evolving DNA Hypernetworks

Connection to AI and Cognitive Science
[Zhang, IEEE Computational Intelligence Magazine, August 2008]

Cognitive Hypernetworks: Application Examples

Digital Videos for Teaching Machines
- Multimodal: language, vision, audio
- "Situated" contexts
- "Naturalistic" and dynamic
- "Quasireal" and continuous
- Educational

Language

A Language Game
? still ? believe ? did this. → I still can't believe you did this.
We ? ? a lot ? gifts. → We don't have a lot of gifts.
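A toy sketch of how such completion can work with stored linguistic hyperedges: harvest word trigrams from a corpus, then fill each blank from the best-matching trigram. The corpus lines and the trigram order are illustrative assumptions:

```python
# Toy sketch of matching-and-completion with word trigrams standing in for
# linguistic hyperedges. Corpus contents and trigram order are assumptions.

from collections import Counter

corpus = [
    "i still can't believe you did this",
    "we don't have a lot of gifts",
]

# Harvest trigram hyperedges with their frequencies.
trigrams = Counter()
for sent in corpus:
    w = sent.split()
    for i in range(len(w) - 2):
        trigrams[tuple(w[i:i + 3])] += 1

def complete(query):
    """Replace each '?' using the most frequent trigram matching its context."""
    words = query.split()
    for i, tok in enumerate(words):
        if tok != "?":
            continue
        left = words[i - 1] if i > 0 else None
        right = words[i + 1] if i + 1 < len(words) else None
        candidates = [(n, g) for g, n in trigrams.items()
                      if (left is None or g[0] == left)
                      and (right is None or g[2] == right)]
        if candidates:
            words[i] = max(candidates)[1][1]   # middle word of best trigram
    return " ".join(words)

print(complete("we ? have a lot ? gifts"))  # -> "we don't have a lot of gifts"
```

On this tiny corpus the sketch reproduces the slide's second example; the actual hypernetwork uses hyperedges of many orders and a weighted, evolving library rather than raw counts.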

Text Corpus: TV Drama Series
Friends, 24, House, Grey's Anatomy, Gilmore Girls, Sex and the City
289,468 sentences (training data); 700 sentences with blanks (test data)
Training examples: "I don't know what happened." "Take a look at this." …
Test examples: "What ? ? ? here." "? have ? visit the ? room." …

Hypernetwork Memory of Language

Evolutionary Hypernets for Linguistic Memory
Why ? you ? come ? down ? → Why are you go come on down here
? appreciate it if ? call her by ? ? → I appreciate it if you call her by the way
Would you ? to meet ? ? Tuesday ? → Would you nice to meet you in Tuesday and
? gonna ? upstairs ? ? a shower → I'm gonna go upstairs and take a shower
? have ? visit the ? room → I have to visit the ladies' room
? ? ? decision → to make a decision
? still ? believe ? did this → I still can't believe you did this

Zhang and Park, "Self-assembling hypernetworks for cognitive learning of linguistic memory," International Conference on Cognitive Science (ICCS-2008), WASET.
Zhang, "Cognitive learning and the multimodal memory game: Toward human-level machine learning," IEEE World Congress on Computational Intelligence (WCCI-2008), 2008.

Memories for Friends and Prison Break
Corpora: Friends and Prison Break; keyword: "mother". Sample recalled completions from the two corpora:
- you're mother killed herself
- it's my mother was shot by a woman at eight
- we're just gonna go to your mother that i love it
- feeling that something's wrong with my mother and father
- she's the single mother
- i put this on my friend's mother
- apparently phoebe's mother killed herself
- thanks for pleasing my mother killed herself
- i'm your mother told you
- this is an incredible mother
- that's not his mother or his hunger strike
- holy mother of god woman
- i like your mother and father on their honeymoon suite with her and never called
- your mother really did like us
- is my mother was shot by a drug dealer
- tells his mother and his family
- she's the mother of my eyes
- speak to your mother used to be
- tells his mother made it pretty clear on the floor has
- speak to your mother never had life insurance
- she's the mother of lincoln's child
- she's the mother of my own crap to deal with you
- just lost his mother is fine
- just lost his mother and his god
- tells his mother and his stepfather
- she's the mother of my time
- his mother made it clear you couldn't deliver fibonacci
- she's the mother of my brother is facing the electric chair
- same guy who was it your mother before you do it
- they gunned my mother down

Learning Languages from Kids Video
Goal: (1) natural-language generation at the sentence level based on the probabilistic graphical model, and (2) natural-language processing without explicit grammar rules.
- Training data: kids video scripts
- Sentence structure: converting sentences into graph structure
- Application: sentence completion and generation

Script sequence | Generated sentence
Timothy | I like it too nora.
Hello Kitty | I like it too mom.
Looney Toons | I like it too this time you're a diving act today.
Dora | I like it too this time you're a hug.

Generated Sentences and Evolved Grammar
Generated sentences:
- (Good) On my first day of school
- (Good) Yes timothy it is time to go to school
- (Good) Thomas and Percy enjoy working in the spotlight
- (Good) Well it is morning
- (Bad) He couldn't way to go outside and shoot
- (Bad) the gas house gorillas are a lot of fun players
Grammar rules analyzed from the generated sentences:
- G1: S = NP + VP
- G2: NP = PRP
- G3: S = VP
- G4: PP = IN + NP
- G5: NP = NN
- G6: NP = DP + NN
- G7: ADVP = RB
- G8: NP = NP + PP
- G9: SBAR = S

Sentence Generation Accuracy
- Corpus: scripts from kids videos (Miffy, Looney, Caillou, Dora Dora, Macdonald, Thomas & Friends, Timothy, Pooh)
- Corpus: video scripts (kids videos plus the sitcom Friends; 120K sentences)
- In each phase, the corpus is incremented by adding one video script:
  D1 = Miffy, D2 = D1 + Looney, D3 = D2 + Caillou, D4 = D3 + Dora Dora, D5 = D4 + Macdonald, D6 = D5 + Thomas, D7 = D6 + Timothy, D8 = D7 + Pooh, D9 = D8 + Friends
- Learning: building a language model based on a hypernetwork
- Task: sentence completion from a partial sentence

Evolution of Grammar Rules
Grammar learning curve: the KL divergence between the distribution of the training corpus (P) and the distribution of the generated sentences (Q), measured over phases D1 through D9. The right-hand curve shows that the number of occurrences of grammar rules increases as training progresses (G* = grammar rule *).
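The divergence used for the learning curve, in its standard form (the slide names the quantity; the formula itself did not survive the transcript):

```latex
% Kullback-Leibler divergence in its standard form.
\begin{equation}
  D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
\end{equation}
% P: distribution of the training corpus; Q: distribution of the generated
% sentences. The divergence falling toward zero means the generated
% sentences increasingly match the corpus statistics.
```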

Music

Cognitive Music Composition [H.-W. Kim et al., FUZZ-IEEE 2009]
(Figure: a library of weighted hyperedges of musical-note sequences)
- A hypernetwork library composed of hyperedges of various lengths, each carrying a weight
- Hyperedges are selected by weight to generate a continuous melody from a presented cue
- Hyperedges are replaced (the library evolves): weights are adjusted by comparison with the original piece

Experimental Results
Scores generated by evolutionary hypernetworks trained on American (A), Scottish (B), Korean singer Kim (C), and Korean singer Shin (D) songs, given a cue (left of the bar in the middle) from "Swanee River," the famous American folk song.

Vision

Toward Human-Level Machine Learning: Multimodal Memory Game (MMG)
(Figure: from a drama clip's image, sound, and text, e.g. "But, I'm getting married tomorrow. Well, maybe I am... I keep thinking about you. And I'm wondering if we made a mistake giving up so fast. Are you thinking about me? But if you are, call me tonight.", a machine learner trains an Image-to-Text Generator (I2T) and a Text-to-Image Generator (T2I), exchanging text hints and image hints.)
[Zhang, IEEE CIM, 2008] [Zhang, AAAI SSS, 2009]

LEARNING BY PLAYING
Learning the image from the given text.
(Game screen: a text query with image options; "Click the Right Option"; Score: 01)

LEARNING BY PLAYING
Learning the text from the given image.
(Game screen: an image query with text options; "Click the Right Option"; Score: 02)

Result 1: Humans for T2I Learning
(Figure: accuracy vs. number of sessions)

Result 2: Humans for I2T Learning
(Figure: accuracy vs. number of sessions)

Result 3: Machines for I2T Learning [Fareed et al., 2009]
(Figure: accuracy vs. number of epochs)

Image-to-Text Recall Examples
Given an image query, overlapping text fragments are matched and completed into the answer:
- "Maybe there's something / there's something I / … / I get pregnant" → Maybe there's something I can do to make sure I get pregnant
- "There's a / a kitty in / … / in my guitar case" → There's a kitty in my guitar case
- "I don't know / don't know what / know what happened" → I don't know what happened

Text-to-Image Recall Examples
Text queries are matched and completed to retrieve the answer images:
- I don't know what happened
- Take a look at this
- There's a kitty in my guitar case
- Maybe there's something I can do to make sure I get pregnant

Text-to-Image Associative Retrieval [S.-Y. Lee, 2009]

Image-to-Text Associative Retrieval [S.-Y. Lee, 2009]