Elspeth Slayter, Associate Professor, School of Social Work, Salem State University
- Administrative matters & check-in
- Themes in qualitative research
- Measurement in quantitative research
- Online activities: do the readings, listen to the lecture, and critique one measurement in a quantitative article that was part of your annotated bibliography
Choose an intervention thoughtfully, with or without research; implement; evaluate.
Research: qualitative or quantitative.
Program evaluation: process/formative or outcome/summative.
All are types of research:
- Looking up references and compiling existing information
- Practice evaluation (a.k.a. program evaluation)
- Social research that informs social work practice in some way
- Cross-sectional: cannot determine causation
- Longitudinal
- Pre-test to post-test
- Post-test only
Match the question to the method!
REVIEW: Research design 101: Sampling
- Exposure/treatment group (i.e., the sample)
- Control/comparison group
- No control or comparison group
Population (a.k.a. universe): the complete listing of a set of elements having a given characteristic(s) of interest.
Sample: infers population characteristics from a subset of the population.
- Saves money
- Saves time
- Can be more accurate, since you don't need the whole population
Sampling frame: a list of population members.
- You may get a complete listing, but often the population and the sampling frame are different
- Example: DCF Lowell caseworkers
- Terminology heads-up: differences between the sampling frame and the population are referred to as "sample frame error" or "sampling frame error"
Probability: the mathematical likelihood that an event will occur, ranging from 0 to 1.
Application to sampling: the larger the sample, the more likely it is to be representative of the whole population.
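This intuition can be made precise with a standard result from sampling theory (not on the slide itself): under simple random sampling, the standard error of a sample mean shrinks as the sample size grows,

\mathrm{SE}(\bar{x}) = \frac{\sigma}{\sqrt{n}}

where \sigma is the population standard deviation and n is the sample size, so quadrupling the sample size cuts the standard error in half.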
Sampling approaches:
- Qualitative (QL): purposive
- Quantitative (QN): probability and non-probability
Importance of distinguishing between qualitative and quantitative: theme identification vs. concept measurement.
- Beyond counting…
- Letting the themes "rise up" from the data
- Letting the collective voice of study respondents define what the themes are in answer to the questions
- Can be confusing: demographic data are sometimes included in qualitative studies
Horwitz and Padgett, Hawkins, Abrams & Davis:
- Are we thinking about measurement, or themes?
- Describe the themes found in this article
- How did the authors "get to" these themes?
- Horwitz: Do you agree with the themes? Is there anything missing?
Moving from concepts to observations:
- It's about operationalization: moving from ideas to reality
- It's about objectivity: working to objectively measure things the same way
- Does everyone agree on "know it when you see it"?
- As objective as possible vs. subjective personal judgment
- The process of making distinctions
- Where variables come from (the process of concept operationalization in quantitative research)
- Putting value labels on variables
- The importance of consistency with the literature (i.e., what other researchers do)
Standardized:
- Existing research has come to a conclusion about how to measure a concept
- Accepted screening or assessment tools
- Accepted measurement tools or approaches
Unstandardized:
- A measurement approach that has not been accepted in the literature
- Often a new concept, or a new-to-research concept that is in the process of moving toward a standardized measurement approach
- Only used as a last resort
Characteristics of standardized measures:
- Uniform administration
- Uniform scoring
- Measured/observed the same way
- Defined the same way
- Tested in a given population
1) "Have you ever felt you should Cut down on drinking (drug use)?" 2) "Have people Annoyed you by criticizing your drinking (drug use)?" 3) "Have you ever felt Guilty about your drinking (drug use)?" 4) "Have you ever taken a drink (used drugs) in the morning to steady your nerves or get rid of a hangover (“Eye opener”)? Scoring : Answering YES to 2 questions provides strong indication for substance abuse or dependency Answering YES to 3 questions confirms the likelihood of substance abuse or dependency 17
Options for defining a concept:
- Clinical definition: functional, cognitive
- ICD-9-CM definition
- DSM-IV definition
- Observation
- Standardized testing
Use the best option you can, consistently.
Caveat: some research tests or explores how people define something (e.g., resiliency, success).
"Are you satisfied with our services?"
- How it is asked matters
- Who asks it matters
- Asking the big question might not get the real picture
- What do we intuitively know about what causes someone to be satisfied?
- Measures are usually an estimate of, or "proxy" for, satisfaction
Validity: to what degree does this measurement measure what it is supposed to measure? Does it measure it accurately?
- Content validity: enough measurement of the concept's sub-dimensions
- Face validity: on its face, does it look valid?
- Criterion validity: the score on this measure predicts a known result
- Construct validity: does it measure the construct it says it does?
Reliability: to what degree does this measure consistently function to measure the concept?
Internal consistency example (a scale about enjoying riding bikes):
- "I like riding bikes" – yes
- "I have enjoyed riding bikes in the past" – yes
- "I hate riding bikes" – no
Cronbach's alpha (α) is the usual statistic for internal consistency.
Rule of thumb:
- An alpha of 0.6–0.7 is acceptable
- An alpha of 0.8 or higher is good
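For reference, the formula behind Cronbach's alpha (not shown on the slide) for a scale of k items is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)

where \sigma^2_{Y_i} is the variance of item i and \sigma^2_X is the variance of the total scale score; alpha increases as the items covary more strongly with one another.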
- Why will measurement occur?
- What will be measured?
- How will the measure be operationalized?
- What are the pros and cons of any given measurement?
- Justification for the measurement choice
Thinking about measurement in quantitative research: Rodriguez & Murphy
- Parenting stress: standardized or unstandardized?
- Validity? Reliability?
- The original sample on which the measure was "normed"