The Search for Significance: A Practical Guide to P-Hacking


1 The Search for Significance: A Practical Guide to P-Hacking @Neuro_Skeptic neuroskeptic@googlemail.com http://blogs.discovermagazine.com/neuroskeptic

5 The Nine Circles of Dante’s Inferno
First Circle: Limbo
Second Circle: Lust
Third Circle: Gluttony
Fourth Circle: Greed
Fifth Circle: Anger
Sixth Circle: Heresy
Seventh Circle: Violence
Eighth Circle: Fraud
Ninth Circle: Treachery

7 The Nine Circles of Scientific Hell
First Circle: Limbo
Second Circle: Overselling
Third Circle: Post-Hoc Storytelling
Fourth Circle: P-Value Fishing
Fifth Circle: Creative Use of Outliers
Sixth Circle: Plagiarism
Seventh Circle: Non-Publication of Data
Eighth Circle: Partial Publication of Data
Ninth Circle: Inventing Data

8 P-Fishing Fourth Circle: P-Value Fishing “Those who tried every statistical test in the book until they got a p value less than 0.05 find themselves here, an enormous lake of murky water. Sinners sit on boats and must fish for their food. Fortunately, they have a huge selection of different fishing rods and nets. Unfortunately, only one in 20 fish are edible, so they are constantly hungry.”
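
The “one in 20” is the α = 0.05 threshold. As a rough illustration, assuming (unrealistically) that the tests are independent, the chance of landing at least one spurious p < .05 grows quickly with the number of tests tried:

```python
# Chance of at least one false positive when running k independent tests
# on pure noise at alpha = 0.05. Independence is an assumption for the
# arithmetic; re-analyses of the same data are correlated, but the
# qualitative point (the odds climb fast) still holds.
alpha = 0.05
for k in (1, 5, 10, 20):
    p_any = 1 - (1 - alpha) ** k
    print(f"{k:2d} tests -> P(at least one p < .05) = {p_any:.2f}")
# 1 test -> 0.05, 5 -> 0.23, 10 -> 0.40, 20 -> 0.64
```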

9 P-Fishing Also known as… ▫P-Hacking ▫Questionable Research Practices (QRPs) ▫Torturing the data ▫Outcome reporting bias ▫Undisclosed flexibility ▫Researcher Degrees of Freedom ▫…and more.

10 Related to… Publication bias P-hacking is about using multiple methods or attempts to find a significant result. Publication bias is the tendency to only publish significant results. Both go hand in hand in practice. But each could, in theory, occur without the other.

11-15 P-Hacking Works!
Collect some data.
Try many statistical tests on the same data.
Or try many variants of the same data (e.g. removing ‘outliers’).
Or try looking at different variables within the dataset.
Report the analyses that give the most favourable results (usually the lowest p-values).
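
A minimal simulation of the recipe above (my own sketch, not the talk’s demonstration; the group sizes and the particular analysis variants are arbitrary assumptions). Two groups are drawn from the same distribution, several analyses are tried, and only the smallest p-value is “reported”:

```python
import numpy as np
from scipy import stats

# P-hacking on pure noise: there is NO true effect, but we try several
# analysis variants per dataset and keep the best-looking p-value.
rng = np.random.default_rng(0)
n_experiments, n = 5000, 30
false_positives = 0

for _ in range(n_experiments):
    a, b = rng.normal(size=n), rng.normal(size=n)
    pvals = [
        stats.ttest_ind(a, b).pvalue,                                 # standard t-test
        stats.mannwhitneyu(a, b, alternative="two-sided").pvalue,     # try a different test
        stats.ttest_ind(a[np.abs(a) < 2], b[np.abs(b) < 2]).pvalue,   # drop "outliers"
        stats.ttest_ind(a[:20], b[:20]).pvalue,                       # analyse a subset
    ]
    if min(pvals) < 0.05:   # report only the most favourable analysis
        false_positives += 1

print(f"False positive rate: {false_positives / n_experiments:.2%}")
# Noticeably above the nominal 5%, even though the variants are correlated.
```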

21-24 “P-Hack the numbers, HARK the text”
Hypothesizing After the Results Are Known.
Allows any significant result to become an interesting, hypothesis-confirming finding.
HARKing is not to be confused with revising or rejecting hypotheses in the light of new data – which is essential (!)
Rather, HARKing means that hypotheses are never tested. The hypotheses are always “one step ahead” of the data.

25 And now a demonstration…

26 fMRI Simulator

27 Why P-Hacking Is So Effective There are many choices (‘researcher degrees of freedom’) in data analysis. For example, in a simple task-based fMRI data analysis, Joshua Carp found 7,000 combinations of analysis parameters (a conservative estimate). Carp, J. (2012). On the plurality of (methodological) worlds: estimating the analytic flexibility of fMRI experiments. Frontiers in Neuroscience.
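
To see how the combinations pile up, here is a toy enumeration. The option names and values are invented for illustration; they are not Carp’s actual parameter space, which was far richer:

```python
from itertools import product

# A handful of hypothetical fMRI preprocessing/analysis choices.
choices = {
    "slice_timing":        ["on", "off"],
    "motion_regressors":   [6, 12, 24],
    "smoothing_fwhm_mm":   [4, 6, 8, 10],
    "highpass_filter_s":   [100, 128],
    "hemodynamic_model":   ["canonical", "canonical+derivs", "FIR"],
    "autocorrelation":     ["AR(1)", "none"],
}

pipelines = list(product(*choices.values()))
print(len(pipelines))  # 2*3*4*2*3*2 = 288 pipelines from just six choices
```

Six decision points already give 288 distinct pipelines; a realistic analysis has many more, which is how the count reaches the thousands.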

28 How To Spot It The p-curve… Simonsohn, U., Nelson, L. D., & Simmons, J. P. (2013). P-curve: a key to the file-drawer. Journal of Experimental Psychology: General. Try it now! http://www.p-curve.com/app2/
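
The idea: among results reported as significant, a real effect piles p-values up near zero (a right-skewed curve), while p-hacking to just clear the threshold piles them up near .05. The sketch below is a crude illustration of the binning step only; the actual p-curve app linked above applies formal statistical tests rather than just counting:

```python
import numpy as np

def p_curve(p_values, bins=(0, .01, .02, .03, .04, .05)):
    """Crude p-curve: bin the significant (p < .05) p-values."""
    sig = [p for p in p_values if p < 0.05]
    counts, _ = np.histogram(sig, bins=bins)
    labels = [".00-.01", ".01-.02", ".02-.03", ".03-.04", ".04-.05"]
    return dict(zip(labels, counts))

# Hypothetical sets of reported p-values:
print(p_curve([.001, .004, .012, .02, .03, .049]))   # mass near zero: consistent with a true effect
print(p_curve([.041, .044, .046, .048, .049, .03]))  # mass just under .05: a warning sign of p-hacking
```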

30 Although it’s complicated “Publication bias and underpowered studies might be a bigger problem for science than inflated Type 1 error rates…”
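
One way to see why: the positive predictive value of a “significant” finding depends on statistical power and on the prior odds that the hypothesis is true, not just on α. The prior and power numbers below are illustrative assumptions only:

```python
# Back-of-the-envelope positive predictive value of a significant result.
def ppv(prior, power, alpha=0.05):
    true_pos = prior * power          # true effects correctly detected
    false_pos = (1 - prior) * alpha   # null effects crossing the threshold
    return true_pos / (true_pos + false_pos)

print(f"{ppv(prior=0.5, power=0.8):.2f}")  # well-powered, plausible hypotheses -> ~0.94
print(f"{ppv(prior=0.1, power=0.2):.2f}")  # underpowered, long-shot hypotheses -> ~0.31
```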

31-33 The Root of the Problem (and Fixes)
Smulders YM (2013). A two-step manuscript submission process can reduce publication bias. Journal of Clinical Epidemiology.
Chambers CD (2013). Registered Reports: a new publishing initiative at Cortex. Cortex.

38 Happy hacking! @Neuro_Skeptic neuroskeptic@googlemail.com http://blogs.discovermagazine.com/neuroskeptic

