Do artificial surveillance cues increase generosity? Two meta-analyses

Stefanie B. Northover (1,2), William C. Pedersen (3), Adam B. Cohen (2), and Paul W. Andrews (1)
(1) McMaster University, (2) Arizona State University, (3) California State University, Long Beach

Introduction
When people know they are being watched, they are more generous (Kurzban, 2001). Several papers seemingly show that even artificial cues of being watched affect behavior (e.g., Haley & Fessler, 2005). However, some studies have failed to replicate this surveillance cue effect (e.g., Matsugasaki, Tsukamoto, & Ohtsubo, 2015). In light of these mixed results, we conducted two meta-analyses investigating the effect of artificial surveillance cues on generosity.

Method
We followed the meta-analysis procedures outlined by Lipsey and Wilson (2001). An effect size was calculated for each experiment, comparing generosity in the surveillance cue condition with the control condition. Effect sizes were weighted using a random effects model, and an overall effect size, standard error (SE), and 95% confidence interval (CI) were calculated for each meta-analysis (a schematic numerical example is sketched below).

Proportion who gave meta-analysis
Measure: proportion of participants who gave something rather than nothing
27 experiments; 19,512 participants

Amount given meta-analysis
Measure: mean amount of resources (usually money) given by participants to others
26 experiments; 2,732 participants
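The sketch below is a minimal Python illustration of inverse-variance random-effects pooling of the kind described in the Method section, followed by conversion of a pooled log odds ratio to an odds ratio. It is not the authors' analysis code: the poster specifies only Lipsey and Wilson's (2001) procedures, so the DerSimonian-Laird between-study variance estimator used here is an assumed, commonly used choice, and the per-experiment effect sizes and variances are hypothetical placeholders, not data from the meta-analyses.

    # Illustrative random-effects meta-analysis sketch.
    # NOTE: the study-level numbers below are hypothetical placeholders,
    # NOT the data analyzed in the poster's meta-analyses.
    import math

    effects = [0.60, -0.20, 0.45, 0.05, -0.35]   # per-experiment log odds ratios (hypothetical)
    variances = [0.10, 0.08, 0.15, 0.05, 0.12]   # per-experiment sampling variances (hypothetical)

    # Fixed-effect (inverse-variance) weights and pooled mean,
    # needed to estimate the between-study variance (tau^2).
    w_fixed = [1.0 / v for v in variances]
    fixed_mean = sum(w * e for w, e in zip(w_fixed, effects)) / sum(w_fixed)

    # DerSimonian-Laird estimate of between-study variance.
    q = sum(w * (e - fixed_mean) ** 2 for w, e in zip(w_fixed, effects))
    df = len(effects) - 1
    c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
    tau_sq = max(0.0, (q - df) / c)

    # Random-effects weights, pooled effect size, SE, and 95% CI.
    w_random = [1.0 / (v + tau_sq) for v in variances]
    pooled = sum(w * e for w, e in zip(w_random, effects)) / sum(w_random)
    se = math.sqrt(1.0 / sum(w_random))
    ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se

    # A pooled log odds ratio is converted to an odds ratio by exponentiating.
    print(f"pooled log OR = {pooled:.2f}, SE = {se:.2f}")
    print(f"95% CI: {ci_low:.2f} to {ci_high:.2f} (no effect if the CI spans 0)")
    print(f"odds ratio = {math.exp(pooled):.2f}")

With these placeholder numbers the pooled 95% CI straddles zero, the same "no effect" pattern reported in the Results below; exponentiating the poster's reported pooled log odds ratio of 0.16 gives exp(0.16) ≈ 1.17, the odds ratio quoted there.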
Results: Proportion who gave
Log odds ratio: 0.16 (SE = 0.10); 95% CI: −0.04 to 0.35
Participants in the surveillance cue conditions were 1.17 times more likely to give than participants in the control conditions, but the effect size is not significantly different from zero: no effect.
Figure note: Positive effect sizes indicate that the proportion of participants who gave was greater in the surveillance cue condition.

Results: Amount given
Mean difference: 0.03 (SE = 0.05); 95% CI: −0.08 to 0.13
Participants in the surveillance cue conditions were slightly more generous than participants in the control conditions, but the effect size is not significantly different from zero: no effect.
Figure note: Positive effect sizes indicate that the amount given was greater in the surveillance cue condition. *Studies cited in manuscript.

Conclusion
No evidence that artificial cues of being watched increase generosity. Two meta-analyses found no evidence that artificial surveillance cues increase generosity, either by increasing how generous individuals are or by increasing the probability that individuals will show any generosity at all.

Image caption: Images of eyes, such as this one (Pedersen, 2016), have been used as surveillance cues by several studies.

Acknowledgements
The authors thank the researchers who provided statistics, data, and/or other information necessary to conduct the meta-analyses. The authors also thank three anonymous reviewers for their constructive comments and advice.

References
Haley, K. J., & Fessler, D. M. T. (2005). Nobody's watching? Subtle cues affect generosity in an anonymous economic game. Evolution and Human Behavior, 26, 245–256.
Kurzban, R. (2001). The social psychophysics of cooperation: Nonverbal communication in a public goods game. Journal of Nonverbal Behavior, 25(4), 241–259.
Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Thousand Oaks, CA: Sage Publications.
Matsugasaki, K., Tsukamoto, W., & Ohtsubo, Y. (2015). Two failed replications of the watching eyes effect. Letters on Evolutionary Behavioral Science, 6, 17–20.
Pedersen, R. (2016). No effects of artificial surveillance cues or social proofs on survey participation rates. Working paper, University of Copenhagen.

Further information
Northover, S. B., Pedersen, W. C., Cohen, A. B., & Andrews, P. W. (2016). Artificial surveillance cues do not increase generosity: Two meta-analyses. Evolution and Human Behavior. doi:10.1016/j.evolhumbehav.2016.07.001
Contact: stefanie.northover@asu.edu
Website: stefanienorthover.com