Welcome! A-Team Session 4 Fall 2007
Overview for Today
- One Minute Message review
- Outcomes feedback
- Methodology review
- Assessment steps: sampling, validity/reliability, analysis, ethics/reporting
- Take-home assignment
Session-by-Session Overview
- Session I (Outcomes & Overview): What is assessment; assessment language; Comp Model & Foundation; SALDOs
- Session II (Overview of Outcome Design): Creating outcomes; service, learning, & development
- Session III (Critiquing Outcomes): The 3 M's; intro to assessment steps
- Session IV (Assessment Steps, cont.): Sampling, ethics, reporting
- Session V: Assessment plans
- Session VI: Assessment plan presentations; session evaluation

Foundation Session Outcomes
- Demonstrate an understanding of "A" language & literature
- Define and plan an assessment project for your department
- Increase technical skills: Perseus, report writing, qualitative/quantitative methods
- Perform one or more assessment projects
- Serve as an assessment resource for your department and the team
One Minute Message Any further questions or concerns about methods for your assessment projects?
Outcomes Feedback
- Measurable: remember to specify in the outcome itself how you will measure it
- Manageable: think about timelines
- Meaningful: great job on this one!
- Keep in mind the ideal order of assessment versus the reality
Methodology Review Activity
Does each word or phrase refer to qualitative or quantitative methods?
Assessment Steps (cont.)
So now you have a method; what's next? Let's talk about sampling.
- What is a sample?
- Why is determining your sample important?
- What determines your strategy?
Quantitative Sampling
- Random sampling: why is this important to quantitative assessments?
- Convenience sampling
- Stratified sampling
- Sample size
- Response rates
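The two main quantitative strategies above, simple random sampling and stratified sampling, can be sketched in code. This is a minimal illustration only; the roster, class-year labels, sample size, and sampling fraction are all invented for the example:

```python
import random

random.seed(42)  # fixed seed so the example is reproducible

# Hypothetical roster: 200 students tagged by class year.
roster = [{"id": i, "year": random.choice(["FR", "SO", "JR", "SR"])}
          for i in range(200)]

# Simple random sample: every student has an equal chance of selection,
# which is what supports generalizing results to the whole population.
simple_sample = random.sample(roster, 40)

def stratified_sample(population, key, fraction):
    """Sample proportionally within each subgroup (stratum) so that
    small subgroups are not missed by chance."""
    strata = {}
    for person in population:
        strata.setdefault(person[key], []).append(person)
    sample = []
    for group in strata.values():
        k = max(1, round(len(group) * fraction))
        sample.extend(random.sample(group, k))
    return sample

# Stratified sample: roughly 20% drawn from each class year.
strat_sample = stratified_sample(roster, "year", 0.20)
```

The trade-off in miniature: the simple random sample is easiest to defend statistically, while the stratified sample guarantees every class year appears even when a subgroup is small.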
Qualitative Sampling
- Purposeful sampling: why is this important to qualitative assessments?
- Snowball sampling
- Stratified sampling
- Sample size: contextual; it's about data saturation
Quantitative Analysis
- Frequency distributions (percentages, counts)
- Descriptive statistics (e.g., means): useful in tracking reports and for perceived responses on Likert measurements
- Inferential statistics (e.g., t-tests): examine whether relationships are statistically significant (for example, differences in self-concept scores across different types of disabilities)
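The three levels of analysis above can be sketched with a few lines of code. All the data here are invented for illustration (hypothetical Likert responses and made-up self-concept scores for two groups), and the t statistic shown is Welch's form for two independent samples:

```python
from collections import Counter
from statistics import mean, stdev
from math import sqrt

# Hypothetical Likert responses (1 = strongly disagree ... 5 = strongly agree).
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

# Frequency distribution: counts and percentages for each response option.
counts = Counter(responses)
percentages = {k: 100 * v / len(responses) for k, v in sorted(counts.items())}

# Descriptive statistic: a single summary number such as the mean.
avg = mean(responses)

# Inferential statistic: compare two groups' (invented) self-concept scores.
group_a = [72, 68, 75, 70, 74]
group_b = [65, 63, 69, 66, 64]

def t_statistic(a, b):
    """Welch's t statistic for two independent samples."""
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(a) - mean(b)) / se

t = t_statistic(group_a, group_b)
```

In practice you would compare the t statistic against a critical value (or compute a p-value) to decide whether the group difference is statistically significant; a statistics package handles that step.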
Qualitative Analysis
- Data are rich and complex (interviews, focus groups, documents, observations)
- Coding: open, categorical, thematic
- Looking for themes to emerge from the data
Validity and Reliability
- Reliability: the extent to which an experiment, test, or any measuring procedure yields the same result on repeated trials
- Validity: the degree to which a study accurately reflects or assesses the specific concepts the researcher is attempting to measure
What do reliability and validity look like in quantitative and qualitative processes?

Quantitative
- Reliability: concerned with the consistency of measurements. Strategy: pilot tests.
- Validity: support for generalizing results to the broader population. Strategies: use a random sample, the correct sample size, and a strong response rate.

Qualitative
- Reliability: does the analysis make sense; are the themes consistent with the data collected? Strategies: member checking, peer examination, triangulation of data.
- Validity: how applicable are the findings to other situations for the reader (results cannot be generalized)? Strategies: use rich, thick description and multiple sites.
Ethics
- Accuracy: are you measuring what you want to measure?
- Quality: projects should be conducted with rigorous methods and a high level of skill
- Confidentiality/anonymity
- Informed consent
- Role conflicts
- Ownership/dissemination of data
Reporting and Using Results
Consider your audience:
- Language
- Familiarity with the subject matter
- Time to read
- Ability to act
- Politics
Reporting and Using Results: Format and Content
- Executive summary
- Short report (3-5 pages)
- Full report
- Supplemental reports
- Meaningful title
- Recommendations
- Visual aids
- Action/implementation plan
Final Thoughts
- Take-home assignment
- One Minute Message