Principles of Behavior Sixth Edition Richard W. Malott Western Michigan University Power Point by Nikki Hoffmeister

Chapter 18 Interval Schedules

What is a Fixed-Interval Schedule? Fixed-Interval (FI) Schedule of Reinforcement: A reinforcer is contingent on the first response after a fixed interval of time since the last opportunity for reinforcement.
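The FI contingency is simple enough to sketch in code. Below is a minimal, hypothetical Python simulation (the function name and the convention of timing each interval from the reinforced response are my assumptions, not from the text): only the first response at or after the interval has elapsed produces a reinforcer.

```python
def fixed_interval(response_times, interval):
    """Hypothetical sketch of a fixed-interval (FI) schedule.

    response_times: times (in seconds, ascending) at which responses occur.
    interval: the fixed interval, in seconds.
    Returns the times of reinforced responses: only the first response
    after the interval has elapsed is reinforced.
    """
    reinforced = []
    next_opportunity = interval  # first interval timed from session start
    for t in response_times:
        if t >= next_opportunity:
            reinforced.append(t)              # this response produces the reinforcer
            next_opportunity = t + interval   # the interval starts over
    return reinforced

# FI 60-s: responses at 10 s and 30 s go unreinforced; the response at
# 65 s is the first one after the interval, so it pays off, and so on.
print(fixed_interval([10, 30, 65, 70, 130, 200], interval=60))  # [65, 130, 200]
```

Note that responding before the interval is up costs nothing but earns nothing, which is why the scallop's early pause loses the subject no reinforcers.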

What type of responding results from an FI schedule? Fixed-Interval Scallop: A fixed-interval schedule often produces a scallop – a gradual increase in the rate of responding, with responding occurring at a high rate just before reinforcement is available. No responding occurs for some time after reinforcement.

Fixed-Interval Scallop

Joe’s Term Paper Sid assigned a term paper the first day of class Joe has 15 weeks to complete the project The following figure is a cumulative record of Joe’s work under this schedule Weeks are plotted on the abscissa Cumulative number of hours he worked are on the ordinate

FI Scallop of Joe’s Paper Writing

Joe’s Term Paper Joe spent no time preparing the paper in the first 7 weeks Finally, in the 8th week, he spent 5 hours preparing He did more the next week And even more the next week He spent the final week in a frenzy of long hours in the library This is an FI scallop, right? Wrong.

Contrasting Fixed-Interval and Term-Paper Schedules

Feature                                    Fixed-Interval   Term Paper
Does early responding affect anything?     No               Yes
Do you get more if you work harder?        No               Yes
Is the relevant response class clear?      Yes              No
Are there calendars and clocks?            No               Yes
Is there a deadline?                       No               Yes
Is the reinforcer too delayed?             No               Yes

Congress Example What’s the cumulative record of passing laws by the US Congress? A scallop, just like the pigeon pecking a key on an FI schedule. Members of Congress return from their annual recess (the reinforcer). They pause for a few months, then pass the first law, and gradually pass laws at an increasing rate until just before the next recess.

Is law-passing on an FI schedule? No. In this analysis, we ask the same questions we did in the term-paper example.

Feature                                              Fixed-Interval   Congress
Does early responding affect anything?               No               Yes
Do you get more if you work harder?                  No               Yes
Is the reinforcer contingent on the final response?  Yes              No
Immediate reinforcer?                                Yes              No
Is there a deadline?                                 No               Yes
Clocks or calendars?                                 No               Yes

Other Non-Examples: The TV Schedule
Before: You have no opportunity to see your favorite TV show
SΔ: Calendar and clock say 9:30 AM Monday
SD: Calendar and clock say 11:30 PM Saturday
Behavior: You turn the TV to channel 8
After (in SD): You have the opportunity to see your favorite TV show
After (in SΔ): You have no opportunity to see your favorite TV show

Analysis
Problem 1:
–You have a calendar and clock; Rudolph doesn’t
–If you didn’t, you might respond like Rudolph: responding more and more quickly as time passed
Problem 2:
–You have a deadline; Rudolph doesn’t

Other Non-Examples: The Paycheck
Before: You have no paycheck
SΔ: It has been 1 week since the last paycheck
SD: It has been 2 weeks since the last paycheck
Behavior: You pick up your paycheck
After (in SD): You have a paycheck
After (in SΔ): You have no paycheck

A Good Example You’re watching Seinfeld Commercials come on You switch to Jerry Springer But you keep switching back and forth with increasing frequency as the commercial interval wears on One of your flips is reinforced by Seinfeld

Superstition in the Pigeon Skinner put a pigeon in a Skinner box. He operated the feeder every 15 seconds, regardless of what the bird was doing. The first time, just prior to the feeder being presented, the pigeon had made an abrupt counterclockwise turn. The pigeon did the same thing the next time, right before the feeder came up.

Results The bird performed a stereotyped pattern of behavior: –rapid and persistent counterclockwise turns.

What is a Fixed-Time Schedule? Fixed-Time Schedule of Reinforcer Delivery: A reinforcer is delivered after the passage of a fixed period of time, independently of the response.
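The contrast with FI is that a fixed-time schedule needs no response at all. A minimal, hypothetical sketch (names are my own) makes the independence from responding explicit: the delivery times depend only on the clock.

```python
def fixed_time(session_length, period):
    """Hypothetical sketch of a fixed-time (FT) schedule: reinforcers
    are delivered every `period` seconds regardless of what the
    subject does, so responding plays no role at all."""
    return list(range(period, session_length + 1, period))

# FT 15-s over a 60-s session: four deliveries, no response required --
# the same schedule Skinner used in the superstition demonstration.
print(fixed_time(60, 15))  # [15, 30, 45, 60]
```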

What is Superstitious Behavior? Superstitious Behavior: Behaving as if the response causes some specific outcome, when it really does not.

What is a Variable-Interval Schedule? Variable-Interval (VI) Schedule of Reinforcement: A reinforcer is contingent on the first response after a variable interval of time since the last opportunity for reinforcement.

VI Schedules The opportunity for reinforcement comes as a direct function of the passage of time. Thus, it is a time-dependent schedule. The lengths of the intervals between opportunities are varied.

VI Schedules Although the opportunity for reinforcement occurs as a function of time alone, the subject must make the response after the interval is over for reinforcement to occur. Time alone will not bring about the reinforcer.
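The two-part VI rule described above, in which time alone sets up the opportunity but a response must still collect it, can be sketched as a small Python simulation. This is a hypothetical illustration: the function name and the choice of an exponential distribution for the varied intervals are my assumptions, not from the text.

```python
import random

def variable_interval(response_times, mean_interval, seed=42):
    """Hypothetical sketch of a variable-interval (VI) schedule.

    The next opportunity for reinforcement becomes available after a
    varied interval (sampled here from an exponential distribution
    around mean_interval).  Time alone only sets up the opportunity;
    a response after the interval is over is required to actually
    produce the reinforcer.
    """
    rng = random.Random(seed)  # fixed seed so the varied intervals are repeatable
    reinforced = []
    next_opportunity = rng.expovariate(1 / mean_interval)
    for t in response_times:
        if t >= next_opportunity:
            reinforced.append(t)  # first response after the interval is reinforced
            next_opportunity = t + rng.expovariate(1 / mean_interval)
    return reinforced

# A steady responder on a VI 30-s schedule: which particular responses
# pay off depends on the randomly varied intervals.
responses = [5 * i for i in range(1, 40)]   # one response every 5 s
delivered = variable_interval(responses, mean_interval=30)
```

Because the opportunity can come at any moment, steady moderate responding is the pattern that harvests reinforcers most reliably, which is just what the next slide describes.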

What type of responding does a VI schedule produce? Variable-Interval Responding: Variable-interval schedules produce a moderate rate of responding, with almost no post-reinforcement pausing.

VI Responding

Responses can produce reinforcers in 2 ways: 1.Continuous Reinforcement: Every response produces a reinforcer. 2.Intermittent Reinforcement: Only some responses produce a reinforcer.

4 Classic Intermittent Schedules

           Fixed            Variable
Ratio      Fixed-Ratio      Variable-Ratio
Interval   Fixed-Interval   Variable-Interval

Intermittent Reinforcement & Extinction Resistance to Extinction and Intermittent Reinforcement: Intermittent reinforcement makes the response more resistant to extinction than does continuous reinforcement.

What is Resistance to Extinction? Resistance to Extinction: The number of responses or amount of time before a response extinguishes. The more an intermittent schedule differs from continuous reinforcement, the more the behavior resists extinction.

Interval Schedules vs. Time Schedules

                      Fixed Interval   Fixed Time
Involves time         Yes              Yes
Requires a response   Yes              No

Comparing & Contrasting Ratio & Variable Schedules of Reinforcement

Schedule         Reinforcer Follows           Behavior
RATIO            A number of responses
Fixed Ratio      Fixed number of responses    After reinforcement, no responding occurs for some time; then it occurs at a high, steady rate until the next reinforcer.
Variable Ratio   Variable number of responses Responding occurs at a high, steady rate, with almost no post-reinforcement pausing.

Schedule            Reinforcer Follows                              Behavior
INTERVAL            The first response after a time interval
Fixed Interval      First response after a fixed time interval      No response occurs immediately after reinforcement; then the rate increases slowly as the interval advances (scallop).
Variable Interval   First response after a variable time interval   A consistent, steady rate of responding occurs, with almost no post-reinforcement pausing.

Schedule     Reinforcer Follows                                        Behavior
TIME         A time period, whether or not there is a response
Fixed Time   A fixed-time period, whether or not there is a response   No behavior, unless superstitious behavior results from accidental reinforcement of the response of interest.

Cumulative Records of 4 Schedules

Intermittent Reinforcement & Extinction Why does intermittent reinforcement increase resistance to extinction? It’s easy for the rats to “tell the difference” between CRF and extinction. –During CRF, all responses produce reinforcers –During extinction, none of them do

Also… It’s hard for the rats to “tell the difference” between intermittent reinforcement and extinction –During intermittent reinforcement, only an occasional response produces a reinforcer –During extinction, none of them do The rats quickly discriminate between CRF and extinction They greatly generalize between intermittent reinforcement and extinction

Intermediate Enrichment
Response Strength:
–Response frequency
–Resistance to extinction
–Behavioral momentum
Skinner rejected “response strength,” claiming it is a reification. When you have 2 different ways of measuring the same thing, that thing is probably a reification:
–Those different measures may not agree

Example
A pigeon has 2 keys in the box:
–One is on a CRF schedule
–The other is on a VI 1-minute schedule
According to the resistance-to-extinction measure of response strength:
–The response on the VI 1 schedule is stronger, because it is more resistant to extinction
But the pigeon will most likely continue pecking the CRF key more often than the VI 1 key

On DickMalott.com Chapter 18 Advanced Enrichment Section –Why Do Limited Holds Work the Way They Do? –Why Does Intermittent Reinforcement Increase Resistance to Extinction?

Join us for Chapter 19: Concurrent Contingencies