Lecture 10 – Semantic Networks


Lecture 10 – Semantic Networks 1
Two questions about knowledge of the world
1. How are concepts stored? We have already considered prototype and other models of concept representation.
2. How are relations among concepts stored? Our knowledge is not just knowledge of things – it is knowledge of relations among things.

Lecture 10 – Semantic Networks 2
Issues to consider
A. Two things we want our knowledge store to do:
- Help us make decisions
- Help us make inferences
B. Two ways we could store knowledge:
- As a list of facts
- In some sort of structure that encodes relations among concepts

Lecture 10 – Semantic Networks 3
A. Two things we want our knowledge store to do
Any model of knowledge representation must explain our ability to make decisions. Example of a decision: Is a mouse a mammal? Yes. But how do I know? What is the cognitive operation that lets me get the right answer?

Lecture 10 – Semantic Networks 4
A. Two things we want our knowledge store to do
Any model of knowledge representation must explain our ability to make inferences. Example of an inference: Does a mouse bear live young? A mouse is a mammal. Mammals bear live young. Therefore, a mouse bears live young.

Lecture 10 – Semantic Networks 5
B. Two ways we could store knowledge
1. A list:
- A cuckoo is a bird
- A tractor is a farm vehicle
- Molasses is a food
Decisions could take a long time. No attempt to relate one fact to another. No explanation of how we make inferences.

Lecture 10 – Semantic Networks 6
Problems with representing knowledge as a list:
i. As the list gets longer, it gets harder to retrieve any given piece of information. More to search through; more searching takes more time.
ii. Our knowledge would never be more than the sum of the items in the list.
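To make the contrast concrete, here is a minimal sketch in Python (the lecture itself contains no code) of knowledge stored as a flat list of facts. The facts, relation labels, and the `verify` helper are illustrative assumptions. Verification is a linear search, so it slows down as the list grows, and nothing in the store supports answers beyond the facts explicitly listed.

```python
# A minimal sketch of knowledge stored as a flat list of facts.
# Facts and relation names are illustrative, not from the lecture.
facts = [
    ("cuckoo", "is a", "bird"),
    ("tractor", "is a", "farm vehicle"),
    ("molasses", "is a", "food"),
]

def verify(subject, relation, obj):
    """Linear search: the longer the list, the longer the search."""
    steps = 0
    for fact in facts:
        steps += 1
        if fact == (subject, relation, obj):
            return True, steps
    return False, steps

print(verify("tractor", "is a", "farm vehicle"))  # (True, 2)
# No inference: nothing licenses "a tractor is a vehicle" unless it is listed.
print(verify("tractor", "is a", "vehicle"))       # (False, 3)
```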

Lecture 10 – Semantic Networks 7
B. Two ways we could store knowledge
2. In a structure: e.g., put all the vehicle knowledge in one spot. Do the same with all the food knowledge, and so on. Within the vehicle knowledge, have ‘regions’ for family vehicles, farm vehicles, commercial vehicles, and so on.

Lecture 10 – Semantic Networks 8
Making decisions
Structure in our knowledge store should make it easier (faster) to make a decision: Is a tractor a vehicle? It’s a farm vehicle. Farm vehicles are vehicles.

Lecture 10 – Semantic Networks 9
Making inferences
Suppose we learn that:
- Tractors have large tires
- Combines have large tires
We can now generalize: farm vehicles have large tires. Do hay-balers have large tires? Yes.
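Here is a minimal sketch of the structured alternative. The names (`category_of`, `category_properties`) and the specific entries are assumptions for illustration, not part of the lecture: items are grouped under categories, and properties attached to a category apply to everything beneath it, which is what licenses the hay-baler inference.

```python
# A minimal sketch of a structured store: knowledge grouped by category,
# with properties attached to the categories themselves.
category_of = {
    "tractor": "farm vehicle",
    "combine": "farm vehicle",
    "hay-baler": "farm vehicle",
    "farm vehicle": "vehicle",
}

category_properties = {
    "farm vehicle": {"has large tires"},  # generalized from tractors and combines
    "vehicle": {"has wheels"},
}

def is_a(item, category):
    """Decision: climb the category structure instead of scanning a flat list."""
    while item in category_of:
        item = category_of[item]
        if item == category:
            return True
    return False

def has_property(item, prop):
    """Inference: an item inherits the properties of the categories above it."""
    while item in category_of:
        item = category_of[item]
        if prop in category_properties.get(item, set()):
            return True
    return False

print(is_a("tractor", "vehicle"))                    # True
print(has_property("hay-baler", "has large tires"))  # True, never directly experienced
```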

Lecture 10 – Semantic Networks 10
Structure in knowledge
We then know that hay-balers have large tires not because we experienced it, but because we deduced it. But what does the structure that permits this look like? The most widely accepted answer is a network: a semantic network.

Lecture 10 – Semantic Networks 11
Network models of semantic memory
1. Quillian (1968), Collins & Quillian (1969) – the first network model of semantic memory
2. Collins & Loftus (1975) – a revised network model of semantic memory
3. Neural network models (later in the term)

Lecture 10 – Semantic Networks 12
Quillian’s model
Quillian was a computer scientist. He wanted to build a program that would read and ‘understand’ English text. To do this, he had to give the program the knowledge a reader has. Constraint: in those days, computers were slow and memory was very expensive.

Lecture 10 – Semantic Networks 13
Basic elements of Quillian’s model
1. Nodes – represent concepts. They are ‘placeholders’; they contain nothing.
2. Links – connections between nodes.

Lecture 10 – Semantic Networks 14
[Slide diagram: a fragment of Quillian’s network, with nodes Animal, Mammal, Bird, Fly, Feathers, Wings, Air, and Live young, connected by labelled links such as isa, has, breathes, and bears.]

Lecture 10 – Semantic Networks 15
Things to notice about Quillian’s model
- All links were equivalent; they are all the same length.
- The structure was rigidly hierarchical.
- Time to retrieve information was based on the number of links traversed.
- Cognitive economy – properties stored only at the highest possible level (e.g., birds have wings).
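A minimal sketch of a Quillian-style network with these features: hierarchical isa links, properties stored once at the highest applicable node (cognitive economy), and verification cost indexed by the number of links traversed. The exact set of property links (e.g., that a Bird can Fly) is assumed from standard examples rather than taken verbatim from the slide.

```python
# A minimal sketch of a Quillian-style hierarchical network with cognitive
# economy: each property is stored once, at the highest node it applies to.
# The specific links are illustrative assumptions.
isa = {"Mammal": "Animal", "Bird": "Animal"}          # hierarchical 'isa' links
properties = {
    "Animal": {("breathes", "Air")},
    "Bird": {("has", "Wings"), ("has", "Feathers"), ("can", "Fly")},
    "Mammal": {("bears", "Live young")},
}

def verify(concept, relation, value):
    """Check a property by climbing 'isa' links; return (answer, links traversed).
    More links traversed predicts a longer verification time."""
    links = 0
    node = concept
    while node is not None:
        if (relation, value) in properties.get(node, set()):
            return True, links
        node = isa.get(node)
        links += 1
    return False, links

print(verify("Bird", "has", "Wings"))     # (True, 0): stored at Bird itself
print(verify("Bird", "breathes", "Air"))  # (True, 1): inherited from Animal
print(verify("Mammal", "has", "Wings"))   # (False, 2): not found anywhere
```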

Lecture 10 – Semantic Networks 16
Problems with Quillian’s model
1. How to explain the typicality effect? Is a robin a bird? Is a chicken a bird? It is easier to say ‘yes’ to robin. Why?
2. How to explain that it is easier to report that a bear is an animal than that a bear is a mammal? (The hierarchy predicts the opposite, since ‘mammal’ is fewer links away.)
3. Cognitive economy – do we learn by erasing links?

Lecture 10 – Semantic Networks 17
What’s new in Collins & Loftus (1975)?
A. Structure – responded to data accumulated since the original Collins & Quillian (1969) paper:
- got rid of the hierarchy
- got rid of cognitive economy
- allowed links to vary in length (not all equal)

Lecture 10 – Semantic Networks 18
[Slide diagram: a Collins & Loftus-style network with no strict hierarchy, containing nodes such as animal, mammal, bird, robin, ostrich, bat, cow, feathers, wings, fly, and skin, connected by links of varying length.]

Lecture 10 – Semantic Networks 19
What’s new in Collins & Loftus (1975)?
B. Process – spreading activation:
- Activation: the arousal level of a node
- Spreading: activation travels down links
- The mechanism used to extract information from the network
- Allowed a neat explanation of a very important empirical effect: priming
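Below is a minimal sketch of spreading activation over links of varying strength. The node names, link strengths, decay rule, and the `spread` function are all illustrative assumptions; Collins and Loftus describe the idea verbally rather than as this particular algorithm.

```python
# A minimal sketch of spreading activation. Shorter (stronger) links pass on
# more activation. Node names and strengths are illustrative.
links = {
    "bread":  {"butter": 0.8, "food": 0.5},
    "butter": {"bread": 0.8, "food": 0.5},
    "nurse":  {"doctor": 0.8, "hospital": 0.6},
}

def spread(source, steps=2, decay=0.5):
    """Inject activation at `source` and let it spread along links for a few steps."""
    activation = {source: 1.0}
    frontier = {source: 1.0}
    for _ in range(steps):
        next_frontier = {}
        for node, act in frontier.items():
            for neighbour, strength in links.get(node, {}).items():
                passed = act * strength * decay
                activation[neighbour] = activation.get(neighbour, 0.0) + passed
                next_frontier[neighbour] = next_frontier.get(neighbour, 0.0) + passed
        frontier = next_frontier
    return activation

print(spread("bread"))  # 'butter' ends up partially activated
```

In a sketch like this, reading the prime ‘bread’ leaves residual activation on ‘butter’, so less additional activation is needed to recognize the target, which is the network account of the priming effect introduced on the next slides.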

Lecture 10 – Semantic Networks 20
Priming
An effect on the response to one stimulus (the target) produced by processing another stimulus (the prime) immediately before. If the prime is related to the target (e.g., bread–butter), reading the prime improves the response to the target.
RT (unrelated) – RT (related) = priming effect. Sometimes the effect is measured on accuracy instead.

Lecture 10 – Semantic Networks 21
Priming

           Related    Unrelated    Task
Prime      bread      nurse        read only
Target     BUTTER     BUTTER       read, respond

The difference in RT between the two types of trials is the priming effect. Responses in the related condition are faster.
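As a tiny worked example of the measurement itself, with made-up reaction times rather than real data:

```python
# Made-up reaction times (ms), just to show the arithmetic:
# priming effect = mean RT (unrelated) - mean RT (related).
related_rts   = [520, 535, 510, 540]   # e.g., bread -> BUTTER trials
unrelated_rts = [580, 565, 590, 575]   # e.g., nurse -> BUTTER trials

mean = lambda xs: sum(xs) / len(xs)
priming_effect = mean(unrelated_rts) - mean(related_rts)
print(f"Priming effect: {priming_effect:.1f} ms")  # positive = related is faster
```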

Lecture 10 – Semantic Networks 22
Why is the priming effect important?
The priming effect is an important observation that models of semantic memory must account for. Any model of semantic memory must be the kind of thing that could produce a priming effect. A network through which activation spreads is such a model. (Score one point for networks.)

Lecture 10 – Semantic Networks 23
Review
- Knowledge has structure.
- Our representation of that structure makes new knowledge available (things not experienced).
- The most popular models are network models, containing links and nodes.
- Nodes are empty; they are just placeholders.

Lecture 10 – Semantic Networks 24
Review
- Knowledge is stored in the structure – the pattern of links and the lengths of the links.
- The pattern and lengths of links reflect experience (learning).
- Network models provide a handy explanation of priming effects.
- Note: modern neural network models are different in some respects.