Psychological Research and Scientific Method, Part 3: Designing Psychological Investigations

Selection & application of appropriate research methods. Should I use quantitative or qualitative methods? What is my research aim? What is my hypothesis (directional or non-directional)? What are my IV and DV, and are my variables operationalised?

Key considerations when designing psychological investigations:
1. Choosing an appropriate research method. Is the data qualitative or quantitative? If the former, transcripts may be used; if the latter, an experiment.
2. Deciding on the number of Ps. Consider finances & practicalities.
3. Using an appropriate sampling method. The target population should be identified & a representative sample should be used.
4. How to debrief Ps. Always consider ethics... is deception necessary? Should they know you are there (observations!)?
5. How should I record the data & which techniques should be used? Numerical analysis, a written record, video of interviews, or a combination? How should it be coded? Should any results be omitted?

Pilot study... why is this necessary? It is an important step in designing a good research study, characterised as 'a small-scale trial run of a specific research investigation in order to test out the planned procedures & identify any flaws & areas for improvement'. It provides feedback so that any issues can be cleared up before the full study.

The relationship between researcher & participants. Studying complex behaviour can create several 'social' issues that can affect the outcome of the investigation, such as:
1. Demand characteristics: Ps behaving in a way that they perceive will help or distort the investigation.
2. Participant reactivity: faithful/faithless participants, evaluation apprehension, or social desirability bias. A change in behaviour because you think you are being evaluated in a positive/negative way (try harder/not hard enough!).
3. Investigator effects: the undesired effect of the researcher's expectations or behaviour on participants or on the interpretation of data.

So I don't have to retype it: see AS Research Methods for information on experimental design, extraneous variables, methodology, ethics, and sampling techniques. You should be able to identify, explain, and give at least two advantages & two disadvantages of all of the above!

Issues of reliability & validity. Types of reliability:
1. Internal reliability: consistency of measures within the test itself; no lone ranger item gets in and messes with the investigation!
2. External reliability: consistency of measures from one occasion to another. Can a test be relied upon to generate the same or similar results?
3. Researcher reliability: the extent to which the researcher acts entirely consistently when gathering data in an investigation. Also known as experimenter reliability in experimental conditions & inter-rater/inter-observer reliability.

Assessing researcher reliability (intra). Intra-researcher reliability is achieved if the investigator has performed consistently. This is assessed by scoring/measuring on more than one occasion and obtaining the same or similar results. If using observations or other non-experimental methods, it is assessed by comparing two sets of data obtained on separate occasions and examining the degree of similarity in the scores.

Assessing researcher reliability (inter). Researchers need to act in similar ways. All observers need to agree, so they record their data independently and then correlate the scores to establish the degree of similarity. Inter-observer reliability is achieved if there is a statistically significant positive correlation between the scores of the different observers.
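
As a rough illustration only (the observer ratings below are invented for the example), the correlation check could be run in Python with scipy; the same approach works for intra-researcher reliability by correlating one researcher's scores from two separate occasions.

```python
from scipy.stats import pearsonr

# Hypothetical ratings of the same 10 behaviour samples by two independent observers
observer_a = [4, 7, 5, 8, 6, 3, 9, 5, 7, 6]
observer_b = [5, 7, 4, 8, 6, 4, 9, 6, 7, 5]

# Pearson correlation between the two sets of scores
r, p_value = pearsonr(observer_a, observer_b)
print(f"Inter-observer correlation: r = {r:.2f}, p = {p_value:.3f}")

# A statistically significant positive correlation (commonly around +.8 or higher)
# would be taken as evidence of inter-observer reliability.
if r >= 0.8 and p_value < 0.05:
    print("Acceptable inter-observer reliability.")
else:
    print("Reliability questionable: check operational definitions and retrain observers.")
```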

Improving researcher reliability. Variability introduces extraneous variables, so it is important to ensure high intra- and inter-researcher reliability by:
1. Careful design of the study, e.g. use of piloting, as this can improve precision and make the investigation less open to interpretation.
2. Careful training of researchers in procedures & materials, so that variability among researchers is reduced. Operational definitions should be used and understood by all, and researchers should know how to carry out procedures and record data.

Assessing & improving internal & external reliability.
Split-half method: splitting the test into two halves after the data have been collected. The two sets of scores are then correlated; if the result is statistically significant, this indicates reliability. If it is not significant, weak items are removed & the test is re-checked. The overall aim is to reach about +.8 and so establish internal reliability.
Test-retest method: presenting the same test on different occasions with no feedback after the first presentation. The time between presentations is important too; it can't be too short or too long! If the correlation between the scores is statistically significant, the test is deemed stable; if not, items are checked for consistency & reliability is retested until an acceptable correlation can be obtained.
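
A minimal sketch of how both checks might be computed, assuming each row of the hypothetical `scores` array holds one participant's item responses; the data and the +.8 target below are illustrative only.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical item scores: each row is one participant, each column one test item
scores = np.array([
    [3, 4, 3, 5, 4, 4, 5, 3],
    [2, 2, 3, 2, 3, 2, 2, 3],
    [5, 4, 5, 5, 4, 5, 5, 4],
    [3, 3, 2, 3, 3, 2, 3, 3],
    [4, 5, 4, 4, 5, 4, 4, 5],
])

# Split-half: correlate totals from the odd-numbered and even-numbered items
odd_half = scores[:, ::2].sum(axis=1)
even_half = scores[:, 1::2].sum(axis=1)
r_split, _ = pearsonr(odd_half, even_half)
print(f"Split-half reliability: r = {r_split:.2f} (aiming for about +.8)")

# Test-retest: correlate total scores from the first and second administrations
test_1 = scores.sum(axis=1)                   # first occasion
test_2 = test_1 + np.array([1, 0, -1, 1, 0])  # hypothetical second occasion
r_retest, p_retest = pearsonr(test_1, test_2)
print(f"Test-retest reliability: r = {r_retest:.2f}, p = {p_retest:.3f}")
```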

Techniques to assess & improve internal validity.
Face validity: on the surface of things, does the investigation test what it claims to be testing?
Content validity: does the content of the test cover the whole topic area? A detailed, systematic examination of all components is carried out until it is agreed that the content is appropriate.
Concurrent validity: when the criterion measures are obtained at the same time as the test scores. This indicates the extent to which the test scores accurately estimate an individual's current state with regard to the criterion. E.g. a test that measures levels of depression would be said to have concurrent validity if it measured the current level of depression experienced by the participant.
Predictive validity: when the criterion measures are obtained at a time after the test. Examples of tests with predictive validity are career or aptitude tests, which are helpful in determining who is likely to succeed or fail in certain subjects or occupations.
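
Concurrent and predictive validity are typically assessed by correlating test scores with the criterion measure; a minimal sketch, using made-up questionnaire scores and a hypothetical clinician rating as the concurrent criterion:

```python
from scipy.stats import pearsonr

# Hypothetical scores on a new depression questionnaire
test_scores = [12, 25, 8, 31, 18, 22, 5, 27]

# Criterion measure collected at the same time (concurrent validity),
# e.g. a clinician's rating of each participant's current depression
clinician_rating = [10, 27, 9, 30, 16, 24, 6, 25]

r, p = pearsonr(test_scores, clinician_rating)
print(f"Concurrent validity: r = {r:.2f}, p = {p:.3f}")

# For predictive validity the logic is the same, except the criterion
# (e.g. later exam or job performance) is measured at a later point in time.
```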

Just a little extra on ethics! There are several ethics committees:
1. Departmental Ethics Committee (DEC): carries out approval of undergraduate & postgraduate proposals. At least 3 members should be involved who do not have a vested interest in the research but have the appropriate expertise to look over the proposal. They approve or reject the proposal, or ask for modifications if they deem it necessary. They may refer the proposal to the IEC if researchers from other disciplines are needed.

Institutional Ethics Committee (IEC). Formed of psychologists and researchers from other disciplines, it has wider expertise & practical knowledge of the possible issues (e.g. law or insurance). The chair of the IEC will approve or reject the proposal, or ask for resubmission upon modification.

External Ethics Committee (EEC).
1. Some research cannot be approved by either the DEC or the IEC and will need the EEC.
2. This is likely to be the case for proposals involving Ps from the NHS, etc.
3. The National Research Ethics Service (NRES) is responsible for the approval process.
4. The EEC consists of experts with no vested interest in the research.
Monitoring the guidelines serves as a final means of protecting Ps. If psychologists are found to be contravening the guidelines, they can be suspended, have their licence to practise removed, or be expelled from the society (BPS).