Operant Conditioning: Schedules and Theories of Reinforcement
Now that we have discussed punishment and reinforcement, it is time to discuss just HOW reinforcers can and should be delivered. In other words, there is more to consider than just WHAT the reinforcer should be!
Think about this! If you were going to reinforce your puppy for going to the bathroom outside, how would you do it? Would you give him a Milk Bone every time? Some of the time? Would you keep doing it the same way or would you change your method as you go along?
What is a schedule of reinforcement? A schedule of reinforcement is the response requirement that must be met in order to obtain reinforcement. In other words, it is what you have to do to get the goodies!
Continuous vs. Partial Reinforcement: A continuous reinforcement schedule is one in which each specified response is reinforced. A partial reinforcement schedule is one in which only some responses are reinforced.
Partial Reinforcement Schedules: When you want to reinforce based on a certain number of responses occurring (for example, doing a certain number of math problems correctly), you can use a ratio schedule. When you want to reinforce the first response after a certain amount of time has passed (for example, when a teacher gives a midterm test), you can use an interval schedule.
Four Types of Partial Schedules: Ratio schedules (fixed ratio and variable ratio) and interval schedules (fixed interval and variable interval).
Fixed Ratio Schedule: On a fixed ratio schedule, reinforcement is contingent upon a fixed, predictable number of responses. Characteristic pattern: a high rate of response, with a short pause following each reinforcer. Reading a chapter and then taking a break is an example. A good strategy for “getting started” is to begin with an easy task.
Fixed Ratio: FR. Fixed ratio is abbreviated “FR,” and a number showing how many responses must be made to get the reinforcer is added. Ex. FR 5 (5 responses needed to get a reinforcer).
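As an illustrative aside (not part of the original slides, and all names here are invented), the FR rule is simple enough to sketch as a tiny simulation:

```python
def fr_schedule(n):
    """Fixed ratio: reinforce every n-th response."""
    count = 0

    def respond():
        nonlocal count
        count += 1
        if count == n:  # requirement met: deliver the reinforcer
            count = 0   # start counting toward the next reinforcer
            return True
        return False

    return respond

# FR 5: only every 5th response earns the reinforcer
respond = fr_schedule(5)
outcomes = [respond() for _ in range(10)]
# the 5th and 10th responses are reinforced; all others are not
```

Note that the requirement is fully predictable, which is what produces the characteristic post-reinforcement pause described above.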
Variable Ratio Schedule: On a variable ratio schedule, reinforcement is contingent upon a varying, unpredictable number of responses. Characteristic pattern: a high and steady rate of response, with little or no post-reinforcer pausing. Hunting, fishing, golfing, shooting hoops, and telemarketing are examples of behaviors on this type of schedule.
Variable Ratio: VR. Variable ratio is abbreviated “VR,” and a number showing the average number of responses that must be made to get the reinforcer is added. Ex. VR 50 (an average of 50 responses needed to get a reinforcer: it could be the very next try, or it could take 72!). Gambling is the classic example!
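Continuing the illustrative sketches (again, invented names, not from the slides), a VR schedule works like the FR one except that the hidden requirement is redrawn at random after every reinforcer, so only its average is fixed:

```python
import random

def vr_schedule(mean, rng=random):
    """Variable ratio: reinforce after an unpredictable number of
    responses whose required count averages out to `mean`."""
    target = rng.randint(1, 2 * mean - 1)  # uniform draw, so the mean is `mean`
    count = 0

    def respond():
        nonlocal count, target
        count += 1
        if count >= target:  # requirement met: deliver the reinforcer
            count = 0
            target = rng.randint(1, 2 * mean - 1)  # new hidden requirement
            return True
        return False

    return respond

# VR 50 with a fixed seed: about 1 reinforcer per 50 responses on average
respond = vr_schedule(50, random.Random(0))
hits = sum(respond() for _ in range(100_000))
```

Because the next reinforcer could always be one response away, there is no safe moment to pause, which matches the high, steady response rate of gamblers at a slot machine.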
Fixed Interval Schedules: On a fixed interval schedule, reinforcement is contingent upon the first response after a fixed, predictable period of time. Glancing at your watch during class provides an example, as does the mailman delivering your mail at a specific time every day.
Fixed Interval: FI. Fixed interval is abbreviated “FI,” and a number showing how much time must pass before the reinforcer is available is added: FI 30-min (reinforcement is available for the first response after 30 minutes have passed). Ex. Looking down the tracks for the train if it comes every 30 minutes.
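In the same illustrative spirit (invented names, not from the slides), the FI rule keys on elapsed time rather than a response count: responses made too early simply go unreinforced.

```python
def fi_schedule(interval):
    """Fixed interval: reinforce the first response made after
    `interval` time units have elapsed since the last reinforcer."""
    last = 0.0  # time of the last reinforcer (session starts at time 0)

    def respond(now):
        nonlocal last
        if now - last >= interval:  # interval has elapsed: reinforce this response
            last = now
            return True
        return False

    return respond

# FI 30-min: responses at minutes 10, 20, 31, 40, and 65
respond = fi_schedule(30)
outcomes = [respond(t) for t in (10, 20, 31, 40, 65)]
# only the responses at minute 31 and minute 65 earn the reinforcer
```

The early glances at minutes 10 and 20 earn nothing, which is why responding on FI schedules tends to bunch up near the end of each interval.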
Variable Interval Schedule: On a variable interval schedule, reinforcement is contingent upon the first response after a varying, unpredictable period of time. Characteristic pattern: a moderate, steady rate of response with little or no post-reinforcement pause. Looking down the street for the bus if you are waiting and have no idea how often it comes provides an example!
Variable Interval: VI. Variable interval is abbreviated “VI,” and a number showing the average time interval that must pass before the reinforcer is available is added: VI 30-min (reinforcement is available for the first response after an average of 30 minutes has passed). Ex. Hilary’s boyfriend, Michael, gets out of school and turns on his phone some time between 3:00 and 3:30 – the “reward” of his answering his phone puts her calling behavior on a VI schedule, so she calls every few minutes until he answers.
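One last illustrative sketch (invented names, not from the slides): a VI schedule is the FI rule with the waiting time redrawn at random after each reinforcer, so only the average interval is fixed. Hilary’s redialing is modeled below as one response per minute.

```python
import random

def vi_schedule(mean_interval, rng=random):
    """Variable interval: reinforce the first response made after an
    unpredictable interval that averages `mean_interval`."""
    available_at = rng.uniform(0, 2 * mean_interval)  # uniform draw, mean = `mean_interval`

    def respond(now):
        nonlocal available_at
        if now >= available_at:  # reinforcer has "set up"; this response collects it
            available_at = now + rng.uniform(0, 2 * mean_interval)  # new hidden wait
            return True
        return False

    return respond

# VI 30-min, responding (calling) once per minute, like Hilary redialing
respond = vi_schedule(30, random.Random(1))
hits = sum(respond(minute) for minute in range(100_000))
```

Since the reinforcer could become available at any moment but extra responses before that moment earn nothing, the best strategy is steady, moderate checking, exactly the pattern described above.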