
Data Collection: Data is your friend

Agenda
- Action research check-up
- Measures (aka, ways to collect data)
- Midterms

Public Service Announcement APA style guide websites

Action Research Projects: Where you should be
- Finding research related to, but not necessarily identical to, your topic
- Research questions may shift

Action Research Projects: Organizing Sources for the Literature Review
Example:
- Topic A (Sources A, C, G, H, K)
  - Subtopic 1 (Sources G, K)
  - Subtopic 2 (Sources A, G, H)
  - Subtopic 3 (Sources A, C, H, K)
- Topic B (Sources B, D, E, F, G, H)
  - Subtopic 1 (Sources B, F, G, H)
  - Subtopic 2 (Sources D, E, F, G)

Action Research Projects: Collecting Data
- What types of data should I collect to answer my research question?
- What types of data should I collect to help answer the "why" questions in my study?

Measures of Data Collection
- Interviews
- Questionnaires
- Observations
- Tests

Measures (Means of Data Collection) You must match the instrument to the research question!

Measures: Examples to Critique
- Questionnaire – Psychological School Membership Survey, used with middle school students
- Interview protocol – for teachers & counselors regarding professional development issues
- Observation instrument – PDE 430 for student teachers
What are 2 benefits and 2 limitations of this measure?

Interviews What are some important things happening in this video related to interviewing?

Interviews
Advantages:
- Establish rapport & enhance motivation
- Clarify responses through additional questioning
- Capture the depth and richness of responses
- Allow for flexibility
- Reduce "no response" and/or "neutral" responses
Disadvantages:
- Time consuming
- Expensive
- Small samples
- Subjective – interviewer characteristics, contamination, bias

Questionnaires
- Used to obtain a subject's perceptions, attitudes, beliefs, values, opinions, or other non-cognitive traits
- Example: psychology questionnaires measuring aspects of happiness, "signature strengths," and well-being

Questionnaires: Scales
A scale is a continuum that describes a subject's responses to a statement:
- Likert
- Checklists
- Ranked items

Questionnaires: Likert Scales
- Response options require subjects to indicate the extent to which they agree with a statement
- Debate over odd vs. even numbers of response options
- Statements must reflect extreme positive or extreme negative positions

Questionnaires: Checklists and Ranked Items
- Checklists – respondents choose among options
- Ranked items – respondents place options in sequential order, which avoids marking everything high or low

Questionnaires: Problems with Measuring Non-Cognitive Traits
- Difficulty clearly defining what is being measured (e.g., self-concept vs. self-esteem)
- Response set – responding the same way to every item (e.g., all 4's on the CATS)
- Social desirability / faking ("PC filter") – agreeing with statements because of the negative consequences associated with disagreeing

Questionnaires: Controlling Problems
- Equal numbers of positively and negatively worded statements
- Alternating positive and negative statements
- Providing confidentiality or anonymity to respondents
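
The reverse-coding implied by mixing positively and negatively worded statements can be sketched in Python. The item names, responses, and 5-point scale below are illustrative assumptions, not taken from the slides:

```python
# Sketch: scoring a 5-point Likert questionnaire that mixes positively and
# negatively worded statements. Item names and responses are hypothetical.

SCALE_MAX = 5  # 1 = strongly disagree ... 5 = strongly agree

def reverse_code(response, scale_max=SCALE_MAX):
    """Flip a negatively worded item so a high score always indicates the
    positive end of the trait (on a 5-point scale: 5 -> 1, 4 -> 2, ...)."""
    return scale_max + 1 - response

# One respondent's answers; "_neg" marks negatively worded statements.
responses = {"q1_pos": 4, "q2_neg": 2, "q3_pos": 5, "q4_neg": 1}

total = sum(
    reverse_code(score) if item.endswith("_neg") else score
    for item, score in responses.items()
)
print(total)  # 4 + 4 + 5 + 5 = 18
```

A respondent who answers "all 4's" (a response set) would score high on positive items but low after the negative items are reverse-coded, which is exactly why the balanced wording helps detect the problem.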

Designing Questionnaires: Online resources

Observations
- Direct observations of behaviors provide a first-hand account (ameliorating the self-report issues of questionnaires)
- Natural or controlled settings (e.g., classroom vs. lab, as in child attachment studies)
- Structured or unstructured observations (e.g., frequency counts vs. narrative record)
- Detached or involved observers
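
A structured observation using frequency counts, as contrasted with a narrative record, can be sketched like this; the behavior codes and event log are hypothetical:

```python
# Sketch: a structured classroom observation recorded as frequency counts.
# Behavior codes and the event log are hypothetical.
from collections import Counter

# One behavior code recorded per observation interval.
event_log = ["on_task", "on_task", "off_task", "hand_raised",
             "on_task", "off_task", "on_task"]

counts = Counter(event_log)
print(counts["on_task"], counts["off_task"], counts["hand_raised"])  # 4 2 1
```

Because each interval yields one predefined code, this is a low-inference measure: two trained observers watching the same intervals should produce nearly identical tallies.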

Observations: Inference
- Low inference – requires little if any inference on the observer's part (e.g., student participation)
- High inference – requires high levels of inference on the observer's part (e.g., teacher effectiveness, PDE Form 430)

Observations: Controlling Observer Effects
- Observer bias – controlled by training, checking inter-rater reliability (e.g., Cronbach's alpha), and using multiple observers
- Contamination – knowledge of the study influences the observation; controlled by training, targeting specific behaviors, keeping observers unaware of the expected outcomes, and keeping observers "blind" to which group is which
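
The inter-rater reliability check mentioned above can be sketched numerically. This computes Cronbach's alpha treating each rater as an "item"; the rating matrix is hypothetical:

```python
# Sketch: Cronbach's alpha as an index of consistency across raters.
# Rows are observed subjects, columns are raters; all values hypothetical.
import statistics

def cronbach_alpha(ratings):
    """alpha = k/(k-1) * (1 - sum(per-rater variance) / variance(totals))."""
    k = len(ratings[0])                      # number of raters
    rater_scores = list(zip(*ratings))       # one tuple of scores per rater
    item_var = sum(statistics.pvariance(r) for r in rater_scores)
    totals = [sum(row) for row in ratings]   # each subject's summed rating
    return (k / (k - 1)) * (1 - item_var / statistics.pvariance(totals))

ratings = [
    [4, 4, 5],   # subject 1, as scored by three observers
    [2, 3, 2],
    [5, 5, 5],
    [3, 3, 4],
]
print(round(cronbach_alpha(ratings), 2))  # 0.94
```

Values near 1.0 indicate the observers are rank-ordering subjects consistently; a low alpha would suggest more training or a clearer observation protocol is needed.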

Observations: Observer Effects
- Halo effect – initial ratings influence subsequent ratings
- Hawthorne effect – increased performance results from awareness of being part of a study
- Leniency – wanting everyone to do well
- Central tendency – rating everyone in the middle
- Observer drift – gradually shifting away from the original observation criteria, so pertinent information goes unrecorded

Tests: PSSA Writing Assessment
- Purpose
- Domains
- Scoring
- Questions to consider

Score Interpretation of Tests
- Know the norming group on a norm-referenced test
- Self-report information is not very reliable
- Inferences must be limited to what is being tested
- Grade-equivalency scores should not be interpreted to indicate grade assignment
- Comparisons on a norm-referenced test cannot be made to populations outside the given norming group

Tests: Standard Scores
- Standard scores are transformations of raw scores into easily interpreted standard metrics
- Z-score – how many standard deviations a data value lies from the mean
- All standard scores are interpreted relative to the scores of others in the norming group
- Example: Barbara's SAT score of 700 is very good relative to the norm group because it is two standard deviations above the mean (roughly the 98th percentile)
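
The SAT example can be checked numerically. Note that two standard deviations above the mean corresponds to about the 98th percentile under a normal distribution; the mean of 500 and SD of 100 are the classic SAT norming values assumed here:

```python
# Sketch: z-score and percentile for a norm-referenced score.
# Assumes the classic SAT norming values (mean 500, SD 100).
from statistics import NormalDist

mean, sd = 500, 100
score = 700

z = (score - mean) / sd                 # standard deviations above the mean
percentile = NormalDist().cdf(z) * 100  # share of norm group scoring below

print(z, round(percentile, 1))  # 2.0 97.7
```

The same two lines interpret any standard score: the z-score locates the raw score within the norming group, and the normal CDF converts that location into a percentile rank.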

Tests Standardized tests Uniform procedures for administration, scoring, and interpreting test scores Benefits and limitations? Validity and reliability?

Benefits
- Forces teachers to keep students on track with specific standards; quality control; assures students are learning the intended skills
- Red flag for a poor teacher who is negligent
- Consistency across school districts; scores serve as a consistent measure
- Consistency of subject matter
- Accountability of teachers
- Identifies areas where students are lacking so teachers can address those deficits
- Forces students to be accountable

Limitations
- "If test scores improve (as I believe they will) and parent and policy makers are pleased with the results, are children receiving a better education?" (Cuban, 1983, p. 696)
- Once the expectation is known, teaching narrows to just what is covered on the test; restricts content taught ("teaching to the test")
- Alignment between test and standard; is the test valid? Which came first, the standard or the assessment?
- How to provide remediation opportunities for those not meeting the standard without cheating the "smart" kids
- Doesn't measure creativity
- Leads to labeling of students
- Penalizes poor test takers

Validity & Reliability What are issues of validity and reliability that must be addressed in relation to standardized testing? PSSA issues

Standardized Testing Resources
- AERA statement about standardized tests
- AERA "Research Points" (Spring 2003): rch_Points/RP_Spring03.pdf
- Article by James Popham: http://school.familyeducation.com/educational-testing/educational-philosophy/38778.html
- Dept. of Education guide for policy makers and educators