Welcome! A-Team Session 4 Fall 2007
Overview for Today
1 Minute Message review
Outcomes Feedback
Methodology Review
Assessment Steps: Sampling, Validity/Reliability, Analysis, Ethics/Reporting
Take Home
Session I: Outcomes & Overview (What is Assessment, Assessment Language, Comp Model & Foundation, SALDOs)
Session II: Overview of Outcome Design (Creating Outcomes; Service, Learning, & Development)
Session III: Critiquing Outcomes (3 M's; Intro: Assessment Steps)
Session IV: Assessment Steps (cont): Sampling, Ethics, Reporting
Session V: Assessment Plans
Session VI: Assessment Plan Presentation; Session Evaluation
Foundation Session Outcomes:
Demonstrate an understanding of "A" language & literature
Define and plan an assessment project for your department
Increase technical skills: Perseus, report writing, qual/quan
Perform 1+ assessment projects
Serve as an assessment resource for your department and team
One Minute Message
Any further questions or concerns about methods for your assessment projects?
Outcomes Feedback
Measurable: remember to specify in your outcome(s) how you will measure them
Manageable: think about timelines
Meaningful: great job on this one!
Ideal order of assessment versus reality
Methodology Review Activity
Is the word or phrase in reference to qualitative or quantitative methods?
Assessment Steps (cont.)
So now you have a method; what's next? Let's talk about sampling.
What's a sample?
Why is determining your sample important?
What determines your strategy?
Quantitative Sampling
Random sampling: why is this important to quantitative assessments?
Convenience sampling
Stratified sampling
Sample size
Response rates
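The sampling strategies above can be sketched in a few lines of Python. This is a minimal illustration, not part of the session materials: the roster of students and the `class_year` grouping are hypothetical, made up purely to show how simple random sampling differs from stratified sampling (where you sample within each subgroup so small groups are still represented).

```python
import random
from collections import defaultdict

def simple_random_sample(population, n, seed=None):
    """Every member has an equal chance of selection."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_sample(population, key, n_per_stratum, seed=None):
    """Sample separately within each subgroup (stratum), so
    small groups still appear in the final sample."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for member in population:
        strata[key(member)].append(member)
    sample = []
    for group in strata.values():
        sample.extend(rng.sample(group, min(n_per_stratum, len(group))))
    return sample

# Hypothetical roster: (student_id, class_year), with seniors underrepresented
roster = [(i, year) for i, year in enumerate(
    ["first-year"] * 40 + ["sophomore"] * 30 + ["junior"] * 20 + ["senior"] * 10)]

srs = simple_random_sample(roster, 20, seed=1)
strat = stratified_sample(roster, key=lambda s: s[1], n_per_stratum=5, seed=1)
```

With a simple random sample, seniors (10% of the roster) may be barely represented; the stratified sample guarantees five from each class year.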
Qualitative Sampling
Purposeful sampling: why is this important to qualitative assessments?
Snowball sampling
Stratified sampling
Sample size: contextual; about data saturation
Quantitative Analysis
Frequency distributions (percentages, counts)
Descriptive statistics (means): useful in tracking reports and perceived responses on Likert measurements
Inferential statistics: t-tests examine whether or not relationships are statistically significant (e.g., looking at differences in self-concept scores between different types of disabilities)
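As a concrete sketch of these three levels of analysis, the short Python example below computes a frequency distribution for hypothetical Likert responses and a Welch two-sample t statistic for two made-up groups of self-concept scores. The data, group labels, and function names are illustrative assumptions, not from the session; the point is only to show what "frequencies, means, then inference" looks like in practice.

```python
import math
from collections import Counter
from statistics import mean, variance

def frequency_distribution(responses):
    """Counts and percentages for each distinct response value."""
    counts = Counter(responses)
    total = len(responses)
    return {value: (c, 100 * c / total) for value, c in sorted(counts.items())}

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic: is the difference in means
    large relative to the variation within each group?"""
    va, vb = variance(sample_a), variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    standard_error = math.sqrt(va / na + vb / nb)
    return (mean(sample_a) - mean(sample_b)) / standard_error

# Hypothetical Likert responses (1-5) and self-concept scores for two groups
likert = [5, 4, 4, 3, 5, 5]
group_a = [72, 68, 75, 70, 74, 69, 71, 73]
group_b = [65, 62, 68, 64, 66, 63, 67, 61]

print(frequency_distribution(likert))
print(round(welch_t(group_a, group_b), 2))
```

As a rule of thumb, a |t| well above 2 suggests the group difference is unlikely to be chance alone, though a proper test also uses the degrees of freedom to get a p-value.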
Qualitative Analysis
Data are rich and complex (interviews, focus groups, documents, observations)
Coding: open, categorical, thematic
Looking for themes to emerge
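The move from open codes to themes can be illustrated mechanically. In this sketch the interview excerpts and the code labels are invented examples; the tally simply shows how, after a first open-coding pass, counting recurring codes lets themes emerge from the data rather than being imposed in advance.

```python
from collections import Counter

# Hypothetical open codes assigned to interview excerpts during a first pass
coded_excerpts = [
    ("I never knew where to go for help", "navigating services"),
    ("My advisor checked in every week", "staff support"),
    ("The tutoring center hours never fit my schedule", "navigating services"),
    ("RAs made the floor feel like home", "staff support"),
    ("I figured out the forms on my own", "self-reliance"),
]

# Categorical step: tally how often each code appears across excerpts
theme_counts = Counter(code for _, code in coded_excerpts)
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} excerpt(s)")
```

In real qualitative analysis this counting is only a starting point; the themes themselves come from reading and re-reading the data.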
Validity and Reliability
Reliability refers to the extent to which an experiment, test, or any measuring procedure yields the same result on repeated trials.
Validity refers to the degree to which a study accurately reflects or assesses the specific concepts that the researcher is attempting to measure.
What do R & V look like in qualitative and quantitative processes?
For Quantitative:
R: concerned with consistency of measurements; use pilot tests.
V: support for generalization of results to the general population; use a random sample, correct sample size, and strong response rate.
For Qualitative:
R: does it make sense; are themes consistent with the data collected? Use member checking, peer examination, and triangulation of data.
V: how applicable the findings are to other situations for the reader (cannot generalize); use rich, thick data and multiple sites.
Ethics
Accuracy: are you measuring what you want to measure?
Quality: projects should be conducted with rigorous methods and a high level of skill
Confidentiality/Anonymity
Informed Consent
Role Conflicts
Ownership/Dissemination of data
Reporting and Using Results
Consider your audience:
Language
Familiarity with subject matter
Time to read
Ability to act
Politics
Reporting and Using Results: Format and Content
Executive Summary
Short Report (3-5 pages)
Full Report
Supplemental Reports
Meaningful Title
Recommendations
Visual Aids
Action/Implementation Plan
Final Thoughts
Take Home Assignment
One Minute Message