Learning and Conditioning
I. The Assumptions of Behaviorism
Behaviorists: psychologists who insist that psychology should study only observable, measurable behaviors, not mental processes.
Learning: any relatively permanent change in behavior that is based upon experience.
A. Behaviorists are deterministic.
B. Behaviorists believe that mental explanations are ineffective.
C. Behaviorists believe that the environment plays a powerful role in molding behavior.
II. Two Key Types of Behaviorists
A. Radical Behaviorists: believe that internal states are caused by events in the environment or by genetics.
B. Methodological Behaviorists: study only events that they can measure and observe, but sometimes use those observations to make inferences about internal events.
1) Intervening Variable: something that cannot be directly observed yet links a variety of procedures to a variety of possible responses.
III. Pavlov and Classical Conditioning
A. Classical Conditioning: learning based on the association of a stimulus that does not ordinarily elicit a particular response with another stimulus that does elicit the response. Applies to involuntary responses.
1) Classical Conditioning Terminology
Unconditioned Stimulus (UCS): an event that consistently and automatically elicits an unconditioned response. (Food)
Unconditioned Response (UCR): an action that the unconditioned stimulus automatically elicits. (Salivation)
Conditioned Stimulus (CS): a formerly neutral stimulus that, having been paired with the unconditioned stimulus, comes to elicit the same response. That response depends upon its consistent pairing with the UCS. (Bell)
Conditioned Response (CR): the response elicited by the conditioned stimulus as a result of the training. Usually it closely resembles the UCR. (Salivation)
B. An unfamiliar neutral stimulus enhances conditioning.
C. Acquisition: the process that establishes or strengthens a conditioned response.
D. Extinction: repeatedly presenting the conditioned stimulus without the unconditioned stimulus, leading to a decrease in and eventual elimination of the conditioned response.
E. Spontaneous Recovery: the temporary return of an extinguished response.
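The slides describe acquisition and extinction only qualitatively. One common way to make the process concrete is an error-driven learning rule of the Rescorla-Wagner type, in which the associative strength of the CS is nudged toward the maximum the UCS will support on every trial. The model, the parameter values, and the variable names below are illustrative assumptions, not anything the slides commit to.

```python
# Minimal sketch of acquisition and extinction with an error-driven rule
# (a Rescorla-Wagner-style assumption; the slides specify no formal model).

def update(v, us_present, alpha=0.3, lam_us=1.0, lam_no_us=0.0):
    """One trial: move associative strength V a fraction of the way toward lambda."""
    lam = lam_us if us_present else lam_no_us
    return v + alpha * (lam - v)

v = 0.0  # associative strength of the CS (e.g., the bell)

# Acquisition: the CS is repeatedly paired with the UCS (food).
for trial in range(10):
    v = update(v, us_present=True)
print(f"after acquisition, V = {v:.2f}")  # climbs toward 1.0

# Extinction: the CS is presented alone, without the UCS.
for trial in range(10):
    v = update(v, us_present=False)
print(f"after extinction,  V = {v:.2f}")  # decays back toward 0.0
```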
F. Simultaneous Conditioning: the conditioned stimulus and the unconditioned stimulus are presented at the same time.
G. Compound Conditioning: two or more conditioned stimuli are presented together with the unconditioned stimulus.
H. Stimulus Generalization: the extension of a conditioned response from the training stimulus to similar stimuli.
I. Stimulus Discrimination: the process of learning to respond differently to two stimuli because they produce two different outcomes.
J. Drug Tolerance and Classical Conditioning: Pavlov believed that after enough training the CS essentially became the UCS, rather than simply a signal that the UCS was coming. Later research determined that this is NOT the case; the CS does indeed become a signal that the UCS is coming.
K. Blocking Effects: it is difficult to condition the same response in an animal to more than one stimulus once an association has been made to a previously presented stimulus.
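Blocking falls out naturally of the same kind of error-driven model sketched above, extended to compound stimuli: once a pretrained stimulus already predicts the UCS, the prediction error on compound trials is near zero, so the added stimulus gains little strength. Again, this is an illustrative assumption rather than anything the slides specify.

```python
# Sketch of the blocking effect with a compound-stimulus version of the
# error-driven rule used above (an illustrative assumption, not from the slides).

alpha, lam = 0.3, 1.0
V = {"A": 0.0, "B": 0.0}  # associative strengths of two conditioned stimuli

def compound_trial(present):
    """All stimuli present on a trial share one prediction error."""
    error = lam - sum(V[s] for s in present)
    for s in present:
        V[s] += alpha * error

# Phase 1: stimulus A alone is paired with the UCS until it predicts it well.
for _ in range(20):
    compound_trial(["A"])

# Phase 2: the compound A+B is paired with the UCS.
for _ in range(20):
    compound_trial(["A", "B"])

print(f"V(A) = {V['A']:.2f}, V(B) = {V['B']:.2f}")
# A stays near 1.0 while B stays low: the earlier association to A
# "blocks" conditioning of the same response to B.
```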
Behaviorism: Classical Conditioning
John Watson: Conditioning of Fear in the orphan boy "Little Albert"
1. Albert liked the furry rat.
2. The rat was presented along with a loud CRASH!
3. Albert cried because of the noise.
4. Eventually, the sight of the rat alone made Albert cry.
IV. Thorndike and Operant Conditioning
A. Thorndike's Law of Effect: an animal is more likely to repeat a behavior if it led to favorable consequences, even if the animal does not understand why.
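The Law of Effect is often glossed computationally as "responses followed by reward become more probable." The sketch below is a minimal, hedged illustration of that gloss, not anything Thorndike formulated: each response carries a value, consequences update the values, and higher-valued responses are chosen more often. The action names and the toy environment are hypothetical.

```python
# A minimal action-value illustration of the Law of Effect (an assumption
# made for exposition; Thorndike stated the law verbally, not as an algorithm).
import random

values = {"press_lever": 0.0, "groom": 0.0, "wander": 0.0}

def consequence(action):
    """Hypothetical environment: only pressing the lever delivers food."""
    return 1.0 if action == "press_lever" else 0.0

def choose():
    """Mostly pick the currently best-valued response, occasionally explore."""
    if random.random() < 0.1:
        return random.choice(list(values))
    return max(values, key=values.get)

for _ in range(200):
    action = choose()
    reward = consequence(action)
    values[action] += 0.1 * (reward - values[action])  # favorable outcomes raise the value

print(values)  # "press_lever" ends up with by far the highest value
```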
B. Operant Conditioning: learning based on the association of a behavior with its consequences. The individual learns from the consequences of "operating" in the environment. Applies to voluntary responses.
C. Forms of Operant Conditioning
1) Reinforcer: an event that follows a response and increases the future probability of that response.
Primary Reinforcers: reinforcing because of their own properties. (Food)
Secondary Reinforcers: reinforcing because of previous experiences. (Money) "I've learned through experience that I can exchange money for food."
2) Punishment: an event that decreases the probability of a response. (Pain)
Operant Conditioning: Reinforcement
Increases the likelihood of a behavior recurring.
Positive: giving a reward (e.g., candy for finishing a task).
Negative: removing something aversive (e.g., no chores for getting an A+ on homework).
Operant Conditioning: Punishment
Decreases the likelihood of a behavior recurring.
Positive: adding something aversive (e.g., extra chores).
Negative: removing something pleasant (e.g., taking away car keys).
D. Extinction: occurs if responses stop producing reinforcements.
E. Stimulus Generalization: occurs when a subject responds to a new stimulus that is similar to the originally reinforced stimulus. The more similar the new stimulus is to the old one, the more strongly the subject is likely to respond.
F. Discriminative Stimulus: a stimulus that indicates which response is appropriate or inappropriate.
G. Belongingness: the concept that certain stimuli are classified together, or are more readily associated with certain outcomes than with others.
V. Skinner and the Shaping of Behavior
A. Shaping: establishes a new response by reinforcing successive approximations to it.
B. Schedule of Reinforcement: a set of rules or procedures for the delivery of reinforcement.
C. Continuous Reinforcement: provides reinforcement every time a response occurs.
D. Intermittent Reinforcement: sometimes a particular response is reinforced and other times it is not.
1) Ratio: the delivery of reinforcement depends on the number of responses given by the individual.
2) Interval: the delivery of reinforcement depends on the amount of time that has passed since the last reinforcement.
E. Four Subcategories of Intermittent Reinforcement
1) Fixed-Ratio Schedule: provides reinforcement only after a certain "fixed" number of correct responses have been made.
2) Variable-Ratio Schedule: provides reinforcement after a variable number of correct responses.
3) Fixed-Interval Schedule: provides reinforcement for the first response made after a specific time interval.
4) Variable-Interval Schedule: provides reinforcement for the first response made after a variable amount of time has elapsed.
Extinction tends to take longer when an individual has been on an intermittent schedule, especially a variable one, than after a continuous schedule. One explanation is that on an intermittent schedule, a lack of reinforcement does not signal that all reinforcement has ended; it is harder to tell when reinforcement is truly over.
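The four schedules differ only in the rule that decides whether a given response earns reinforcement. A compact way to see the difference is to write each schedule as a small decision rule over response counts and elapsed time. This is an illustrative sketch; the class names and parameter values (ratios, intervals, time units) are arbitrary assumptions, not part of the slides.

```python
# Sketch of the four intermittent schedules as reinforcement-decision rules.
import random

class FixedRatio:
    """Reinforce every nth response."""
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self, now):
        self.count += 1
        if self.count == self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    """Reinforce after a number of responses that varies around a mean."""
    def __init__(self, mean_n):
        self.mean_n = mean_n
        self.count, self.target = 0, random.randint(1, 2 * mean_n)
    def respond(self, now):
        self.count += 1
        if self.count >= self.target:
            self.count, self.target = 0, random.randint(1, 2 * self.mean_n)
            return True
        return False

class FixedInterval:
    """Reinforce the first response made after a fixed time has elapsed."""
    def __init__(self, t):
        self.t, self.last = t, 0.0
    def respond(self, now):
        if now - self.last >= self.t:
            self.last = now
            return True
        return False

class VariableInterval:
    """Reinforce the first response made after a delay that varies around a mean."""
    def __init__(self, mean_t):
        self.mean_t, self.last = mean_t, 0.0
        self.wait = random.uniform(0, 2 * mean_t)
    def respond(self, now):
        if now - self.last >= self.wait:
            self.last, self.wait = now, random.uniform(0, 2 * self.mean_t)
            return True
        return False

# Ten responses, one per second, on a fixed-ratio-3 schedule:
fr3 = FixedRatio(3)
print([fr3.respond(now=t) for t in range(10)])  # every third response is reinforced
```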
VI. Applications of Operant Conditioning
A. Animal Training
B. Breaking Bad Habits
1) Behavior Modification: the clinician determines which reinforcers sustain an undesirable behavior, then tries to change the behavior by reducing the opportunities for reinforcement of the unwanted behavior and providing reinforcers for a more acceptable behavior.
VII. Other Kinds of Learning
A. Conditioned Taste Aversions
B. Social Learning: we learn about many behaviors before we attempt them for the first time by observing the behaviors of others and by imagining the consequences of our own.
C. Bandura's "Bobo" Doll Study
D. Self-Efficacy in Social Learning: we tend to imitate people we admire and who are perceived as similar to us in some fashion.