1
Field Guide to Living a Data-Rich Life
Webinar 7: Designing or Choosing Instruments: Standardized Assessments
Hosted by Durham's Partnership for Children, with support from the North Carolina Partnership for Children
Presented by Compass Evaluation and Research
2
Welcome
Webinar 7 in our year-long data quality series.
Last time (webinar 6), we discussed:
- The use of observations for collecting data
- Examples of standardized protocols for collecting observation data
- Tips for collecting data if you don't have a standardized protocol
- The importance of training
Today, we will:
- Recap types of data
- Discuss the use of standardized assessments in program evaluations
- Discuss the difference between norm-referenced and criterion-referenced data
3
Recap: Start with the End in Mind
- Think about what data you will need at the end of your program
- To what extent has the program addressed "big picture" needs in the community? Issues such as student achievement, poverty, and child welfare
- To what extent has the program addressed "little picture" needs in systems development and service delivery? Issues such as access to high quality, evidence-based programs or the affordability of care
- Helpful tools: Logic Model, Theory of Change
4
Recap: Start with the End in Mind
- Use a logic model to identify outcomes. Examples: improve the affordability of high quality child care, or enhance child development
- Identify specific measures for outcomes:
  - Improved affordability: change over time in the number and percent of lower income children in higher quality care
  - Enhanced development: change over time in the number and percent of children whose development is on or ahead of expectations
- Identify specific tools for specific measures:
  - Improved affordability: subsidy enrollment figures
  - Enhanced development: a child assessment instrument such as the Ages and Stages Questionnaire or the Brigance (there are many child assessment tools)
5
Recap: Instruments We've Covered So Far
Surveys (interviews and focus groups)
6
Recap: Instruments We've Covered So Far
Observations
7
Standardized Assessments
- An assessment (or test) completed by an individual; or, with young children, completed by an adult based on the abilities and development a child demonstrates
- Provide one or two types of information: norm-referenced scores and criterion-referenced scores
- Are not created equal! Read carefully for:
  - The specific developmental domains and skills being assessed or tested
  - How the instrument was developed and tested
8
Examples
Language assessment can include:
- General language
- Phonological awareness
- Vocabulary
- Receptive language
- Expressive language
- Verbal IQ
- Letter naming
- And others…
9
Examples
Social-emotional assessment can include:
- Attachment
- Behavior problems
- Emotional regulation
- Social skills
- Self-regulation
- Internalizing or externalizing symptoms
- And others…
10
Pros and Cons of Standardized Assessments

Pros (with corresponding cons):
- Establishes expectations or standards for performance; however, this may lead to "teaching to the test" in order to meet standards.
- A standardized approach allows meaningful comparison of scores over time; however, the process for setting the standards must be meaningful and informed for all groups potentially taking the test.
- Objective assessment (it is difficult to influence the results); however, it may not capture everything that is important about an individual's abilities or growth.
- A relatively efficient means of assessing abilities and skills; however, test conditions on the day of the test may influence performance.
11
Norm-Referenced Scores
- Examine an individual's abilities relative to comparable individuals (same age, same demographic groups)
- Require a norming population: data collected for the purpose of establishing "typical" development or abilities
- Produce scores along a bell curve
- Allow you to compare an individual to the skills and abilities of the norming population
- Example: percentile ranking. Does an individual perform at or above 75%, 80%, or 90% of "typical" individuals? (see the sketch below)

Reference: Huitt, W. (1996). Measurement and evaluation: Criterion- versus norm-referenced testing. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. Retrieved [date], from http://www.edpsycinteractive.org/topics/measeval/crnmref.html
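To make the percentile-ranking idea concrete, here is a minimal Python sketch (not part of the original webinar) that compares one individual's score to a hypothetical norming sample; the scores and the function name are made up for illustration.

```python
def percentile_rank(score, norming_scores):
    """Percent of the norming sample scoring at or below the given score."""
    at_or_below = sum(1 for s in norming_scores if s <= score)
    return 100 * at_or_below / len(norming_scores)

# Hypothetical norming data and one individual's raw score.
norming_scores = [12, 15, 18, 20, 21, 22, 24, 25, 27, 30]
print(percentile_rank(24, norming_scores))  # 70.0 -> performs at or above 70% of the norming sample
```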
12
Bell Curve
13
Criterion-Referenced Scores
- Examine individual development or abilities according to pre-established benchmarks or lists of specific skills
- A bell curve for performance may or may not exist
- Example: proficiency testing. Can an individual perform a specific set of tasks?
- Determine proficiency regardless of the performance of other individuals

Reference: Huitt, W. (1996). Measurement and evaluation: Criterion- versus norm-referenced testing. Educational Psychology Interactive. Valdosta, GA: Valdosta State University. Retrieved [date], from http://www.edpsycinteractive.org/topics/measeval/crnmref.html
14
Example: Provider Training Series
- Outcome: improved teacher knowledge
- Measure: percent of providers who score at or above 85% on a knowledge assessment
- Tool: teacher knowledge assessment
- Criterion-referenced approach: 85% was the score identified as proficient
- Only one assessment is necessary to establish an outcome finding (see the sketch below)
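A minimal Python sketch of this criterion-referenced measure, assuming hypothetical provider scores (the cut score of 85% comes from the example above; everything else is invented for illustration):

```python
CUT_SCORE = 85  # criterion identified as "proficient"

# Hypothetical post-training scores (percent correct) for eight providers.
provider_scores = [92, 78, 85, 88, 70, 95, 86, 90]

proficient = [s for s in provider_scores if s >= CUT_SCORE]
print(f"{100 * len(proficient) / len(provider_scores):.0f}% of providers met the criterion")
# -> 75% (6 of 8 providers at or above the cut score)
```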
15
Example: Provider Training Series
- Outcome: improved teacher knowledge
- Measure: percent of providers who score at or above the 90th percentile of a random sample of providers who take the same assessment
- Tool: teacher knowledge assessment
- Norm-referenced approach: comparing program participants to a random group of providers. Do program participants exhibit "average" or better performance?
- One assessment is necessary to establish an outcome finding (see the sketch below)
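A minimal Python sketch of this norm-referenced measure, assuming hypothetical data for both the random comparison sample and the program participants:

```python
import statistics

comparison_scores = [55, 60, 62, 68, 70, 72, 75, 78, 80, 88]  # hypothetical random sample of providers
participant_scores = [74, 81, 90, 85, 69, 92]                  # hypothetical program participants

# 90th percentile of the comparison group (last of the nine decile cut points).
cutoff = statistics.quantiles(comparison_scores, n=10)[-1]

at_or_above = [s for s in participant_scores if s >= cutoff]
share = 100 * len(at_or_above) / len(participant_scores)
print(f"cutoff={cutoff:.1f}; {share:.0f}% of participants score at or above it")
```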
16
Example: Provider Training Series
- Outcome: improved teacher knowledge
- Measure: percent of providers who improve their knowledge between pre- and post-assessment
- Tool: teacher knowledge assessment
- Tends to be a criterion-referenced approach, but:
  - There may not be an indication of "proficiency" or an acceptable score
  - There may not be comparisons to other providers
- Two assessments are necessary to establish an outcome finding (see the sketch below)
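A minimal Python sketch of this gains measure, assuming hypothetical pre- and post-assessment scores for six providers:

```python
# Hypothetical (pre, post) score pairs for six providers.
pre_post = [(62, 75), (70, 68), (55, 80), (81, 90), (77, 77), (60, 72)]

improved = [pair for pair in pre_post if pair[1] > pair[0]]
print(f"{100 * len(improved) / len(pre_post):.0f}% of providers improved between pre- and post-assessment")
# -> 67% (4 of 6 pairs show a gain)
```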
17
Recap
Criterion
- Measure: percent of providers who score at or above a pre-defined standard on an assessment
- The publisher has used a random sample of individuals to establish age- or skill-proficiency
- Ensure the published assessment is identified as "criterion referenced"
Norm
- Measure: percent of providers who score at or above a pre-specified rating of a random sample of providers who take the same assessment
- The publisher has used a random sample of individuals to identify "average" or typical performance
- Ensure the published assessment is identified as "norm referenced"
Gains
- Measure: percent of providers who improve their knowledge between pre- and post-assessment
- Can be conducted without a reference population
18
Interpreting Criterion-Referenced Scores
- Focus is on content and proficiency
- Assessment is developed with a focus on a threshold level of skills and abilities (as compared to a typical range of skills and abilities)
- Common categories for interpretation are Failing, In Progress, Proficient, and Advanced
- Technical information should describe the approach for setting standards
- Standards may not be sensitive to the needs of specific sub-groups
19
Interpreting Norm-Referenced Scores
- Focus is on performance relative to other test-takers
- Assessment is developed to identify a typical range of skills and abilities (as compared to a threshold level of skills and abilities necessary to be considered proficient)
- Can generate information such as percentile rankings, T-scores, standard scores, scale scores, and stanines
- Technical information should describe the approach for identifying the norming population
- The norming population may or may not contain individuals who are similar to your specific population
20
Mean and Standard Deviation
- Mean: the average score
  - Sum all scores and divide by the number of scores to get the average
  - Includes extremely high and extremely low scores (i.e., the outliers)
- Standard deviation: how varied are the scores around the mean?
  - About 68% of test-takers fall within +/- 1 standard deviation of the mean
  - About 95% of test-takers fall within +/- 2 standard deviations
  - +/- 3 to 4 standard deviations account for nearly all test-takers
(see the sketch below)
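A minimal Python sketch, using simulated normally distributed scores (not data from the webinar), showing the mean, standard deviation, and the 68% / 95% rule of thumb:

```python
import random
import statistics

random.seed(0)
# Simulated scores with a mean of 100 and a standard deviation of 15.
scores = [random.gauss(mu=100, sigma=15) for _ in range(10_000)]

mean = statistics.mean(scores)
sd = statistics.stdev(scores)

within_1sd = sum(1 for s in scores if abs(s - mean) <= 1 * sd) / len(scores)
within_2sd = sum(1 for s in scores if abs(s - mean) <= 2 * sd) / len(scores)

print(f"mean={mean:.1f}, sd={sd:.1f}")
print(f"within +/-1 SD: {within_1sd:.0%}, within +/-2 SD: {within_2sd:.0%}")  # roughly 68% and 95%
```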
21
There can be wider and narrower distributions of scores around a mean

                             Narrower Distribution   Wider Distribution
Mean score (range of 1-10):            7                     7
Total number of scores:                5                     5
Scores:                          8, 6, 7, 8, 6         10, 4, 7, 10, 4

The area under the curve is the same. (see the sketch below)
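A minimal Python sketch using the two score sets from the table above: both have the same mean, but the wider distribution has a larger standard deviation.

```python
import statistics

narrower = [8, 6, 7, 8, 6]
wider = [10, 4, 7, 10, 4]

for label, scores in [("narrower", narrower), ("wider", wider)]:
    # pstdev treats the list as the full set of scores (population standard deviation).
    print(f"{label}: mean={statistics.mean(scores):.1f}, sd={statistics.pstdev(scores):.2f}")
# narrower: mean=7.0, sd=0.89
# wider:    mean=7.0, sd=2.68
```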
22
Interpreting Norm-Referenced Scores
- Raw score: the number or percent correct of all scorable items
- It is common to use a percentile ranking: the percent of scores that are at or below your score
- 90th percentile: you have scored at or above 90% of the comparison group of test-takers
23
Training is Critical in Using Published Assessments
- Look or ask for the publisher's Technical Manual
- Read, re-read, and get training on:
  - Scoring procedures
    - How to generate a raw score for each individual
    - What types of score conversions are possible
      - Example: percentile ranking
      - Advanced: T-score, Z-score (or standard score), scaled score, stanine, Normal Curve Equivalent
  - Interpreting the scores: what can you say once you have the raw data and score conversions? (see the sketch below)
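A minimal Python sketch of common score conversions, assuming a hypothetical norming sample and one individual's raw score (a published assessment's Technical Manual would define its own conversion tables; this only illustrates the arithmetic behind z-scores, T-scores, and percentile ranks):

```python
import statistics

norming_scores = [40, 45, 48, 50, 52, 55, 58, 60, 63, 69]  # hypothetical norming sample
mean = statistics.mean(norming_scores)
sd = statistics.stdev(norming_scores)

raw = 60  # one individual's raw score
z = (raw - mean) / sd                  # z-score: distance from the mean in standard deviations
t = 50 + 10 * z                        # T-score: rescaled to mean 50, standard deviation 10
pct = 100 * sum(1 for s in norming_scores if s <= raw) / len(norming_scores)

print(f"raw={raw}, z={z:.2f}, T={t:.1f}, percentile rank={pct:.0f}")
# -> raw=60, z=0.68, T=56.8, percentile rank=80
```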
24
Recap
- Standardized assessments are supported with data from a reference, or norming, population
- They assess what an individual knows or can do
- They can be criterion-referenced or norm-referenced
  - Criterion-referenced: designed to test for specific skills; the level of proficiency is determined by setting standards for knowledge and skills
  - Norm-referenced: designed to compare skills and abilities to those of a typical population; position relative to peers; performance at, above, or below the average skills expressed by peers
- Training is critical for scoring and interpreting scores
  - Technical Manual
  - Publisher-provided training
25
Questions?
26
Next Steps
The webinar series is designed to help you create a plan to capture, manage, analyze, and use high quality data:
- July 27, 2016: Choosing and Using Sampling in Your Evaluation
- August 30, 2016: Best Practices in Data Collection and Management
- September 28, 2016: Finding the Value in Evaluation: Cultural Relativity and Bias
- October 25, 2016: Using Data: Effective Reporting and Grant Writing
- November 30, 2016: Thinking Beyond Your Program: Evaluating Systems and Collaborations
27
Additional questions?
Taylor Webber-Fields
Durham's Partnership for Children
Phone: 919-403-6960 (extension 219)
Email: taylor@dpfc.net
28
Additional questions?
Feel free to contact me, too!
Sarah Heinemeier
Compass Evaluation and Research
Phone: 919-308-5019
Email: sarahhei@compasseval.com