Chapter 6: Learning

Classical Conditioning
Ivan Pavlov
Terminology: Unconditioned Stimulus (UCS), Conditioned Stimulus (CS), Unconditioned Response (UCR), Conditioned Response (CR)

Classical conditioning explains how a neutral stimulus can acquire the capacity to elicit (draw forth) a response originally elicited by another stimulus. Ivan Pavlov, a prominent Russian physiologist who did Nobel Prize-winning research on digestion in the early 1900s, discovered (partly by accident) that dogs will salivate in response to the sound of a tone. In doing so, he discovered classical, or Pavlovian, conditioning. The UCS is a stimulus that elicits an unconditioned response without previous conditioning (Pavlov's meat powder). The UCR is an unlearned reaction to a UCS that occurs without previous conditioning (salivating to the meat powder). The CS is a previously neutral stimulus that has acquired the capacity to elicit a conditioned response (the sound of a tone). The CR is a learned reaction to a conditioned stimulus (salivating to the tone).

Figure 6.1 Classical conditioning apparatus

Figure 6.2 The sequence of events in classical conditioning

Classical Conditioning: Additional Terminology
Trial = pairing of UCS and CS
Acquisition = initial stage in learning
Stimulus contiguity = occurring together in time and space

In classical conditioning research, a trial is a pairing of the UCS and the CS. (How many times have the tone and the meat powder been paired?) Some behaviors are learned after only one trial or pairing, while others take many trials. Acquisition refers to the initial stage of learning a response, that is, acquiring the response. Conditioning has been shown to depend on stimulus contiguity: the stimuli occurring together in time and space.

Classical Conditioning: Additional Terminology
Three types of classical conditioning:
Simultaneous conditioning: CS and UCS begin and end together
Short-delayed conditioning: CS begins just before the UCS; the two end together
Trace conditioning: CS begins and ends before the UCS is presented

So when do you sound the tone in a classical conditioning task? What works best? Of the three types (simultaneous, short-delayed, and trace), short-delayed conditioning appears to promote acquisition of a classically conditioned response best; ideally the delay should be very brief, about half a second.
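The distinction among these procedures is purely a matter of timing. As a rough, hypothetical illustration (not part of the original slides), the short Python sketch below writes each arrangement out as CS and UCS onset/offset times; the specific numbers are arbitrary, and only the relative ordering matters.

    # Hypothetical CS/UCS timelines (in seconds) for the three procedures
    # described above. The values are arbitrary illustrations; only the
    # relative timing of CS and UCS matters.
    PROCEDURES = {
        "simultaneous":  {"cs": (0.0, 5.0), "ucs": (0.0, 5.0)},  # begin and end together
        "short_delayed": {"cs": (0.0, 5.0), "ucs": (0.5, 5.0)},  # CS leads by ~0.5 s
        "trace":         {"cs": (0.0, 1.0), "ucs": (3.0, 5.0)},  # CS ends before UCS begins
    }

    for name, events in PROCEDURES.items():
        cs_on, cs_off = events["cs"]
        ucs_on, ucs_off = events["ucs"]
        print(f"{name}: CS {cs_on:.1f}-{cs_off:.1f} s, UCS {ucs_on:.1f}-{ucs_off:.1f} s")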

Classical Conditioning in Everyday Life
Conditioned fears
Other conditioned emotional responses
Conditioning and physiological responses
Conditioning and drug effects

Figure 6.3 Classical conditioning of a fear response

Processes in Classical Conditioning
Extinction
Spontaneous recovery
Renewal effect
Stimulus generalization
Discrimination
Higher-order conditioning

Extinction occurs when the CS and UCS are no longer paired and the response to the CS weakens. We know the response is still there, just not active, because of spontaneous recovery: an extinguished response reappears after a period of nonexposure to the CS. The renewal effect occurs when a response is extinguished in a different environment than the one in which it was acquired; the extinguished response reappears if the animal is returned to the original environment where acquisition took place. This is one reason conditioned fears and phobias are difficult to extinguish permanently (Hermans et al., 2006). Generalization occurs when conditioning extends to additional stimuli that are similar to the CS; for example, in Watson and Rayner's study, Little Albert was conditioned to fear a white rat but later came to be afraid of many white, furry objects. Discrimination is the opposite of generalization: the response occurs to a specific stimulus, and similar stimuli do not elicit it. Higher-order conditioning occurs when a CS functions as if it were a UCS to establish new conditioning; for example, condition a dog to salivate to a tone, then pair the tone with a light until the light alone elicits salivation.
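The slide describes acquisition and extinction qualitatively. One common way to make the idea concrete (this is an illustration, not part of the original deck) is a simple error-correction update on associative strength, loosely in the spirit of the Rescorla-Wagner model: the CS-UCS association grows toward an asymptote on paired trials and decays toward zero on CS-alone trials. The learning-rate and asymptote values below are arbitrary.

    # Minimal sketch of acquisition and extinction as changes in associative
    # strength (loosely Rescorla-Wagner style). Parameter values are arbitrary.
    def run_block(strength, ucs_present, n_trials, alpha=0.3, asymptote=1.0):
        """Update CS->CR associative strength over a block of trials."""
        history = []
        for _ in range(n_trials):
            target = asymptote if ucs_present else 0.0   # UCS sets the target level
            strength += alpha * (target - strength)      # error-correction update
            history.append(round(strength, 3))
        return history

    acquisition = run_block(0.0, ucs_present=True, n_trials=10)              # tone + food
    extinction = run_block(acquisition[-1], ucs_present=False, n_trials=10)  # tone alone

    print("acquisition:", acquisition)   # rises toward 1.0
    print("extinction :", extinction)    # falls back toward 0.0

Spontaneous recovery and the renewal effect are not captured by a single-strength sketch like this, which is one reason extinction is treated as new, context-dependent learning rather than erasure of the original association.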

Figure 6.7 Acquisition, extinction, and spontaneous recovery

Figure 6.10 Higher-order conditioning

Operant Conditioning or Instrumental Learning
Edward L. Thorndike (1913) – the law of effect
B.F. Skinner (1953) – principle of reinforcement
Operant chamber
Emission of responses
Reinforcement contingencies
Cumulative recorder

Thorndike's law of effect states that if a response in the presence of a stimulus leads to satisfying effects, the association between the stimulus and the response is strengthened. This law became the cornerstone of Skinner's theory. Skinner's principle of reinforcement holds that organisms tend to repeat those responses that are followed by favorable consequences, or reinforcement: an event that follows a response and increases the organism's tendency to make that response. Skinner created a prototype experimental procedure using animals and an operant chamber, or "Skinner box": a small enclosure in which an animal can make a specific response that is recorded while the consequences of the response are systematically controlled. Rats, for example, press a lever. Because operant responses tend to be voluntary, they are said to be emitted rather than elicited. Reinforcement contingencies are the circumstances, or rules, that determine whether responses lead to the presentation of reinforcers. The cumulative recorder creates a graphic record of responding and reinforcement in a Skinner box as a function of time.
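A cumulative record is nothing more than a running total of responses plotted against time, so a steeper slope means a faster response rate. A tiny illustrative sketch (not from the slides; the lever-press timestamps are invented):

    # Minimal sketch of the record a cumulative recorder produces: a running
    # count of responses over time. Timestamps are made-up lever presses (s).
    press_times = [1.2, 2.5, 2.9, 3.1, 5.8, 6.0, 6.1, 6.3, 9.7]

    cumulative_record = [(t, count) for count, t in enumerate(press_times, start=1)]

    for t, count in cumulative_record:
        print(f"{t:5.1f} s  {count:2d} responses so far")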

Figure 6.12 Reinforcement in operant conditioning

Figure 6.13 Skinner box and cumulative recorder

Basic Processes in Operant Conditioning
Acquisition
Shaping
Extinction
Stimulus control
Generalization
Discrimination

As in classical conditioning, acquisition refers to the initial stage of learning. Operant responses are usually learned through a gradual process called shaping, which consists of reinforcing closer and closer approximations of a desired response; shaping is the key to training pet tricks (see the sketch below). Extinction in operant conditioning refers to the gradual weakening and disappearance of a response tendency because the response is no longer followed by a reinforcer; stop giving food when the rat presses the lever and you get a brief surge of responding followed by a gradual decline toward zero. Stimuli that precede a response can exert considerable influence over operant behavior, essentially becoming signals that a reinforcer is coming. Discriminative stimuli are cues that influence operant behavior by indicating the probable consequences of a response (slow down when the highway is wet, ask Mom for a favor when she is in a good mood). Discrimination occurs when an organism responds to one stimulus but not to a similar one, while generalization occurs when a new stimulus is responded to as if it were the original: a cat that runs to the sound of the can opener (which signals food) but not to the sound of the mixer shows discrimination; get a new blender and the cat runs to it, and that is generalization.
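Shaping is easiest to see as a procedure: reinforce any response that falls within a criterion band around the target, then tighten the criterion as behavior improves. The toy simulation below (not from the slides; all names and numbers are invented) treats the "behavior" as a lever-hold duration that drifts toward whatever gets reinforced.

    import random

    # Toy illustration of shaping by successive approximations. The animal's
    # typical hold duration drifts toward reinforced responses, and the
    # reinforcement criterion is tightened block by block. Values are arbitrary.
    random.seed(1)

    TARGET = 2.0       # desired hold duration (seconds)
    tendency = 0.2     # animal's typical response at the start
    criterion = 1.8    # reinforce if |response - TARGET| < criterion (loose at first)

    for block in range(6):
        for _ in range(20):                                # 20 responses per block
            response = max(0.0, random.gauss(tendency, 0.3))
            if abs(response - TARGET) < criterion:         # close enough: reinforce
                tendency += 0.2 * (response - tendency)    # reinforcement shifts behavior
        criterion = max(0.2, criterion * 0.7)              # demand a closer approximation
        print(f"block {block}: typical response ~{tendency:.2f} s, criterion +/-{criterion:.2f} s")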

Figure 6.14 A graphic portrayal of operant responding

Table 6.1 Comparison of Basic Processes in Classical and Operant Conditioning

Reinforcement: Consequences that Strengthen Responses
Primary reinforcers – satisfy biological needs
Secondary reinforcers – conditioned reinforcement

Operant theorists distinguish between primary reinforcers, events that are inherently reinforcing because they satisfy biological needs, and secondary reinforcers, events that acquire reinforcing qualities by being associated with primary reinforcers. Primary reinforcers in humans include food, water, warmth, sex, and perhaps affection expressed through hugging and close bodily contact. Secondary reinforcers in humans include money, good grades, attention, flattery, praise, and applause.

Schedules of Reinforcement
Continuous reinforcement
Intermittent (partial) reinforcement
Ratio schedules: fixed and variable
Interval schedules: fixed and variable

A schedule of reinforcement determines which occurrences of a specific response result in the presentation of a reinforcer. Continuous reinforcement occurs when every instance of a designated response is reinforced (faster acquisition, faster extinction). Intermittent reinforcement occurs when a designated response is reinforced only some of the time (greater resistance to extinction). Ratio schedules require the organism to make the designated response a certain number of times to gain each reinforcer: a fixed-ratio schedule gives a reinforcer after a fixed number of non-reinforced responses, while a variable-ratio schedule gives a reinforcer after a variable number of non-reinforced responses. Interval schedules require a period of time to pass between reinforcers: a fixed-interval schedule reinforces the first response that occurs after a fixed time interval has elapsed, while a variable-interval schedule reinforces the first response after a variable time interval has elapsed. More than 50 years of research on these schedules has yielded an enormous amount of information about how organisms respond to them; a rough sketch of the four intermittent rules appears below.
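The four intermittent schedules differ only in the rule that decides whether a particular response earns a reinforcer: a count-based rule (ratio) or a clock-based rule (interval), with the requirement either fixed or varying around an average. The sketch below is illustrative only; the class names and requirement values are my own, not from the text.

    import random

    # Illustrative sketch of the four intermittent schedules as rules that
    # decide whether a given response is reinforced. Ratio schedules count
    # responses; interval schedules consult a clock. Values are arbitrary.

    class FixedRatio:
        """FR-n: reinforce every nth response."""
        def __init__(self, n):
            self.n, self.count = n, 0
        def respond(self, now=None):
            self.count += 1
            if self.count >= self.n:
                self.count = 0
                return True
            return False

    class VariableRatio:
        """VR-n: reinforce after a varying number of responses averaging n."""
        def __init__(self, n):
            self.n, self.count = n, 0
            self.needed = random.randint(1, 2 * n - 1)
        def respond(self, now=None):
            self.count += 1
            if self.count >= self.needed:
                self.count, self.needed = 0, random.randint(1, 2 * self.n - 1)
                return True
            return False

    class FixedInterval:
        """FI-t: reinforce the first response made after t seconds have elapsed."""
        def __init__(self, t):
            self.t, self.ready_at = t, t
        def respond(self, now):
            if now >= self.ready_at:
                self.ready_at = now + self.t
                return True
            return False

    class VariableInterval:
        """VI-t: like FI, but the required wait varies around an average of t seconds."""
        def __init__(self, t):
            self.t, self.ready_at = t, random.uniform(0, 2 * t)
        def respond(self, now):
            if now >= self.ready_at:
                self.ready_at = now + random.uniform(0, 2 * self.t)
                return True
            return False

    # Example: an animal responding once per second for a minute on each schedule.
    for schedule in (FixedRatio(5), VariableRatio(5), FixedInterval(10), VariableInterval(10)):
        earned = sum(schedule.respond(now=second) for second in range(1, 61))
        print(f"{type(schedule).__name__}: {earned} reinforcers for 60 responses")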

Figure 6.17 Schedules of reinforcement and patterns of response

Consequences: Reinforcement and Punishment
Increasing a response:
Positive reinforcement = response followed by a rewarding stimulus
Negative reinforcement = response followed by removal of an aversive stimulus
Escape learning
Avoidance learning
Decreasing a response:
Punishment
Problems with punishment

Responses can be strengthened either by presenting rewarding stimuli (positive reinforcement) or by removing aversive stimuli (negative reinforcement). Some theorists have questioned the value of this distinction (Baron & Galizio, 2005; Iwata, 2006), arguing that it is ambiguous and unnecessary: the behavior of rushing home to get out of the cold (negative reinforcement) could also be viewed as rushing home to enjoy the warmth (positive reinforcement). Negative reinforcement regulates escape and avoidance learning. In escape learning, an organism learns to perform a behavior that decreases or ends aversive stimulation (turning on the air conditioner once it is hot). In avoidance learning, an organism learns to prevent or avoid some aversive stimulation (turning on the air conditioner before it gets too hot). Punishment occurs when an event following a response weakens the tendency to make that response. Punishment involves much more than disciplinary procedures: wear a new outfit and have friends laugh at it, and the laughter is punishing. Punishment may involve presentation of an aversive stimulus (spanking) or removal of a rewarding stimulus (taking away TV privileges). Among the problems associated with punishment are that it can trigger strong emotional responses (anxiety, anger, resentment, hostility) and that physical punishment can lead to an increase in aggressive behavior.

Figure 6.18 Positive reinforcement versus negative reinforcement

Figure 6.19 Escape and avoidance learning

Figure 6.20 Comparison of negative reinforcement and punishment

Changes in Our Understanding of Conditioning
Biological constraints on conditioning: instinctive drift, conditioned taste aversion, preparedness and phobias
Cognitive influences on conditioning: signal relations, response-outcome relations
Evolutionary perspectives on learning

Newer research has greatly changed the way we think about conditioning, with both biological and cognitive influences having been discovered. Instinctive drift occurs when an animal's innate response tendencies interfere with conditioning (the raccoon that would rather rub coins together than deposit them to obtain the reinforcer). Conditioned taste aversions can be acquired readily, after only one trial and even when the stimuli are not contiguous (becoming ill hours after eating a food), suggesting that a biological mechanism is at work. Martin Seligman noted that some phobias are more easily conditioned than others, which led to the concept of preparedness: we are biologically prepared to learn to fear objects or events that pose inherent danger. Rescorla's work on signal relations shows that the predictive value of a CS is an influential factor governing classical conditioning. Research on response-outcome relations shows that when a response is followed by a desired outcome, the response is more easily strengthened if it appears to have caused (to predict) the outcome: if you study for an exam while listening to Smash Mouth and make an A, it is the studying, not the listening, that is strengthened. Signal-relations and response-outcome research suggest that cognitive processes play a larger role in conditioning than once believed. The evolutionary perspective on learning assumes that an organism's biological heritage places certain constraints on the learning process; some theorists see learning mechanisms as specialized adaptations designed to solve particular types of adaptive problems for particular species.

Figure 6.22 Conditioned taste aversion

Preparedness and Phobias
Evolution has programmed organisms to acquire certain fears
Species-specific predispositions
Evolved module for fear learning

Arne Öhman and Susan Mineka (2001) have elaborated on the theory of preparedness, outlining the key elements of what they call an evolved module for fear learning. They assert that this module is (1) preferentially activated by stimuli related to survival threats in evolutionary history, (2) automatically activated by these stimuli, (3) relatively resistant to conscious efforts to suppress the resulting fears, and (4) dependent on neural circuitry running through the amygdala.

Observational Learning: Basic Processes
Albert Bandura (1977, 1986)
Observational learning
Vicarious conditioning
Four key processes: attention, retention, reproduction, motivation
Acquisition vs. performance

Albert Bandura outlined the theory of observational learning. In observational learning, vicarious conditioning occurs when an organism watches another organism (a model) being conditioned; it can occur for both classical and operant conditioning. For observational learning to take place, four key processes are at work: the organism must pay attention to the model, retain the information observed, and be able to reproduce the behavior; finally, an observed response is unlikely to be reproduced unless the organism is motivated to do so, that is, unless it believes there will be a payoff. Bandura distinguishes between acquisition (having the response in your repertoire) and performance (actually engaging in the behavior), and asserts that reinforcement influences the performance of already acquired responses more than the acquisition of new ones.

Figure 6.25 Observational learning

Observational Learning and the Media Violence Controversy
Studies demonstrate that exposure to TV and movie violence increases the likelihood of physical aggression, verbal aggression, aggressive thoughts, and aggressive emotions. The association between media violence and aggression is nearly as strong as the correlation between smoking and cancer.

Figure 6.27 Comparison of the relationship between media violence and aggression to other correlations