The Probability of Reinforcer Delay as a Determinant of Preference for Variability Michelle Ennis Soreth, Concetta Mineo, Jeffrey Walsh, Thomas Budroe, & Alec Ward


Preference for Variability
Organisms generally prefer working in situations with variable outcomes over situations with fixed outcomes. This preference is largely determined by the occasional quick or large payoff embedded in the variable situation.

Past Research
Pigeons prefer working on VI schedules over FI schedules that have the same arithmetic mean (Herrnstein, 1964).
– This led to the conclusion that the value of the VI reinforcers is weighted differently than the value of the FI reinforcers.
– The arithmetic mean may not be the best way to characterize the value of the VI schedule and its reinforcers.
However, when the occasional short intervals were removed from the VI schedules, the preference for the FI did NOT become exclusive (Andrzejewski et al., 2005; Soreth & Hineline, 2009).
– This suggests that the occasional quick payoff is not the sole determinant of the preference for variability.

Method
A concurrent-chains arrangement with fixed-interval (FI) and random-interval (RI) terminal-link alternatives. The RI schedule maintained a rate of reinforcement half that of the FI alternative.
– RI 30: reinforcer produced on average once every 30 s.
– FI 15: reinforcer always produced by the first response after 15 s.
The RI schedule never produced a component interval shorter than the interval on the FI schedule.
– Shortest interval available on RI = 15 s, often longer.
– Interval ALWAYS arranged on FI = 15 s.
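The terminal-link contingencies described above can be illustrated with a short simulation. The sketch below is a minimal, hypothetical Python construction, not the authors' actual programming: it assumes the RI 30 component is arranged as a constant-probability schedule sampled in 15-s steps, which happens to reproduce the properties listed above (a 30-s mean, a 15-s minimum, and a .50 chance of the minimum interval).

```python
import random

def fi_interval(fixed_delay=15.0):
    """Fixed-interval terminal link: the reinforcer is always arranged
    at the same programmed delay (15 s for FI 15)."""
    return fixed_delay

def ri_interval(step=15.0, p=0.5):
    """One plausible RI construction (an assumption, not stated on the
    poster): at the end of each 15-s step the reinforcer is set up with
    probability p. Interval lengths are therefore 15, 30, 45, ... s,
    with mean step / p (= 30 s here) and minimum equal to step."""
    steps = 1
    while random.random() >= p:
        steps += 1
    return steps * step

# Quick check of the programmed properties described in the Method.
samples = [ri_interval() for _ in range(100_000)]
print(f"mean RI interval     ~ {sum(samples) / len(samples):.1f} s")       # ~30 s
print(f"shortest RI interval = {min(samples):.0f} s")                      # 15 s
print(f"Pr[minRI]            ~ {samples.count(15.0) / len(samples):.2f}")  # ~.50
```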

Four pigeons were exposed to the procedure in daily experimental sessions.
– 40 choice trials per day.
– Preference was assessed as the percentage of RI terminal-link trials per session.
The probability of obtaining the smallest programmed delay to reinforcement on the RI schedule, Pr[minRI], is to be manipulated across conditions. In the current condition, the probability of producing the shortest RI interval was .50. Future testing will include .03 and .97 probabilities.
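Because Pr[minRI] is the manipulated variable (.50 now, .03 and .97 planned), it may help to see how that single probability constrains the rest of the RI interval distribution. The sketch below is an illustrative calculation under one added assumption the poster does not state: that the overall programmed mean of the RI schedule is held at 30 s in every Pr[minRI] condition. Under that assumption, the average length of the non-minimum intervals is fully determined by Pr[minRI].

```python
def mean_of_longer_delays(p_min, min_delay=15.0, target_mean=30.0):
    """If Pr[minRI] = p_min and the overall RI mean is held at
    target_mean (an assumption), the non-minimum intervals must
    average (target_mean - p_min * min_delay) / (1 - p_min)."""
    return (target_mean - p_min * min_delay) / (1.0 - p_min)

for p in (0.03, 0.50, 0.97):
    print(f"Pr[minRI] = {p:.2f} -> longer delays average {mean_of_longer_delays(p):.1f} s")
# 0.03 -> ~30.5 s, 0.50 -> 45.0 s, 0.97 -> 515.0 s
```

The steep jump at .97 shows why it matters whether the overall rate of reinforcement is equated across Pr[minRI] conditions; since the poster does not specify this, treat these numbers as hypothetical.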

[Terminal-link diagram: the terminal link begins at 0 s. FI 15: 100% chance the reinforcer (S R+) is available at 15 s. RI 30: 50% chance S R+ is available at 15 s; 50% chance the reinforcer delay is longer than 15 s.]
Future work in this experiment will additionally expose the pigeons to the Pr[minRI] manipulation with RI 60 s vs. FI 30 s and RI 90 s vs. FI 45 s.

References
Andrzejewski, M. E., Cardinal, C. D., Field, D. P., Flannery, B. A., Johnson, M., Bailey, K., & Hineline, P. N. (2005). Pigeons' choices between fixed and variable interval schedules: Utility of variability? Journal of the Experimental Analysis of Behavior, 83.
Herrnstein, R. J. (1964). Aperiodicity as a factor in choice. Journal of the Experimental Analysis of Behavior, 7.
Soreth, M. E., & Hineline, P. N. (2009). The probability of small schedule values and preference for random-interval (RI) schedules.