Lecture 7: Cognitive Science
(Title slide figure: a Necker cube)

- Many phenomena look as though mental representations are what's important, not the stimuli themselves.
- Your mental representation determines what you 'see'.


- The nature of concepts: how do these things figure in our mental life?
- What's the nature of the representation?

- Vision – not much to work with.

(Figure 2 from Marr & Nishihara (1978: 274))


- Background: we've been assuming the 'classical' model of cognitive architecture.
  - The mind works directly with mental symbolic representations of various sorts (phrase-structure trees for syntax, family-resemblance clusters for concepts, etc.).
- Beginning of the 1980s:
  - Move cognitive science closer to neuroscience.
  - The mind works with artificial neural networks, not representations.
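To make the contrast concrete, here is a small illustrative sketch (mine, not from the lecture): the same toy judgment written first as an explicit symbolic rule and then as a single connectionist unit. The feature names, weights, and threshold are all hypothetical.

```python
# Illustrative contrast (hypothetical features and weights, not from the lecture).

# Classical / symbolic: an explicit rule over discrete symbols.
def is_bird_symbolic(features):
    return "has_feathers" in features and "lays_eggs" in features

# Connectionist: a weighted sum of graded feature activations passed through
# a threshold; the "knowledge" lives in the weights rather than in a rule.
WEIGHTS = {"has_feathers": 0.9, "lays_eggs": 0.6, "flies": 0.4, "barks": -1.2}

def is_bird_connectionist(activations, threshold=1.0):
    net_input = sum(WEIGHTS[f] * a for f, a in activations.items())
    return net_input > threshold

print(is_bird_symbolic({"has_feathers", "lays_eggs"}))                 # True
print(is_bird_connectionist({"has_feathers": 1.0, "lays_eggs": 1.0}))  # True  (0.9 + 0.6 > 1.0)
print(is_bird_connectionist({"barks": 1.0, "flies": 0.2}))             # False (-1.2 + 0.08 < 1.0)
```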

A connectionist model in action (from Stillings et al. (1995))

A connectionist model in action (from Stillings et al. (1995))
(Figure labels: input layer, hidden layer, output layer)
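As a rough sketch of what such a figure depicts, the code below passes activation from an input layer through a hidden layer to an output layer. The layer sizes, random weights, and sigmoid activation are assumptions chosen for illustration; the network shown in Stillings et al. (1995) may differ in all of these details.

```python
import numpy as np

rng = np.random.default_rng(0)               # fixed seed so the example is reproducible

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical layer sizes; the figure's actual network may be larger or smaller.
n_input, n_hidden, n_output = 3, 4, 2
W1 = rng.normal(size=(n_hidden, n_input))    # input -> hidden connection weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_output, n_hidden))   # hidden -> output connection weights
b2 = np.zeros(n_output)

def forward(x):
    """Propagate activation from the input layer to the output layer."""
    hidden = sigmoid(W1 @ x + b1)            # each hidden unit sums its weighted inputs
    output = sigmoid(W2 @ hidden + b2)       # output units do the same over hidden activations
    return output

x = np.array([1.0, 0.0, 1.0])                # an arbitrary input pattern
print(forward(x))                            # activations of the two output units
```

In a trained network the weights would be set by a learning rule such as back-propagation rather than drawn at random.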

The XOR Function

Input 1   Input 2   Output
   0         0        0
   0         1        1
   1         0        1
   1         1        0
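XOR is the standard example of why hidden units matter: its outputs are not linearly separable, so no single unit computing a weighted sum of the two inputs can produce the table above. A minimal sketch (my own hand-set weights, not a trained network) shows how one hidden layer of two threshold units solves it:

```python
def step(x):
    """Threshold activation: the unit fires (1) when its net input exceeds 0."""
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    # Hidden unit h1 behaves like OR, h2 like AND (weights and biases set by hand).
    h1 = step(1.0 * x1 + 1.0 * x2 - 0.5)
    h2 = step(1.0 * x1 + 1.0 * x2 - 1.5)
    # Output unit fires when h1 is on and h2 is off, i.e. exactly one input is 1.
    return step(1.0 * h1 - 1.0 * h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))   # reproduces the truth table: 0, 1, 1, 0
```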