Sociology Outcomes Assessment


Sociology Outcomes Assessment
Wade Cole & Claudia Geist
Former & Current Directors of Graduate Studies
Department of Sociology

Outline
What did we do? The review process & procedures
What did we find? A summary of our analysis and results
What’s next? Recommendations for the future

Sociology ELOs
ELO 1: Understand what sociology is, as a social science discipline.
ELO 2: Utilize sociological theories to guide research and improve understanding of social phenomena and human behavior.
ELO 3: Learn to use a variety of research methods as a means of understanding the social world and human interaction.
ELO 4: Apply sociological and social-science perspectives to the understanding of real-world problems or topics (e.g., issues of diversity, health, globalization, crime & law, sustainability).
ELO 5: Communicate effectively about sociological issues, making well-organized arguments supported by relevant evidence.
We considered three expected learning outcomes in our Spring 2018 report.

Process
First, we identified courses from which to sample student artifacts. For each ELO, we solicited student artifacts from one required course and one elective course; most sociology courses are upper-division.
ELO 2: an upper-division required course and an upper-division elective
ELO 4: a lower-division required course and an elective
ELO 5: a required course and an elective

Process
The DUGS contacted instructors for these courses, asking them to submit six student artifacts from assignments they used to evaluate the stated ELO. Instructors submitted two artifacts they deemed high quality, two of intermediate quality, and two of low quality. Artifacts consisted of written assignments, final papers, and essays. After the artifacts were anonymized, they were distributed to Committee members for evaluation.
From each of these six courses, we asked instructors to submit six student artifacts: two they deemed high quality, two intermediate quality, and two low quality. Artifacts included written assignments, final papers, and essays. I anonymized the artifacts and submitted them to committee members for review. Three people independently rated each artifact.

Process
Evaluators were provided with a scoring sheet for rating the artifacts. Scoring sheets asked evaluators to consider how well the assignment assesses the specified ELO. For each of the six artifacts, evaluators rated ELO mastery using the following four-point scale:
0 = Poor (there is no evidence that the ELO was addressed)
1 = Emerging/Low (initial but substandard effort to address the ELO)
2 = Competent/Mid (ELO achieved with reasonable proficiency)
3 = Exemplary/High (artifact demonstrates mastery of the ELO)
I developed a scoring sheet for rating the artifacts. Raters were first asked to evaluate how well the assignment itself assesses the specified learning outcome. Then, for each artifact, evaluators rated mastery of the corresponding ELO using this four-point scale.
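As a concrete illustration of the data this process produces, here is a minimal sketch of how the scoring sheets might be encoded for the agreement analyses that follow. The artifact IDs, rater IDs, and scores are hypothetical, not the study's data.

```python
# Hypothetical encoding of the scoring sheets: one row per (artifact, rater),
# with ELO mastery scored on the slide's 0-3 scale. All values are invented.
import pandas as pd

SCALE = {0: "Poor", 1: "Emerging/Low", 2: "Competent/Mid", 3: "Exemplary/High"}

ratings = pd.DataFrame({
    "artifact": ["A1", "A1", "A1", "A2", "A2", "A2"],
    "rater":    ["R1", "R2", "R3", "R1", "R2", "R3"],
    "score":    [3, 2, 3, 0, 1, 1],
})

# Pivot to an artifacts-by-raters matrix, the shape most kappa routines expect.
wide = ratings.pivot(index="artifact", columns="rater", values="score")
print(wide)
```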

Analysis
After compiling these scores, we conducted a series of interrater agreement analyses to determine whether instructors and raters evaluated artifacts similarly. Here is one example.

Kappa interrater agreement scores

         Raters only   Raters plus instructor   Raters plus instructor†
ELO 2    .120*         .246***                  .249**
ELO 4    .497**        .498***                  n/a
ELO 5    .130          .243***                  .327***

* p<.05, ** p<.01, *** p<.001 (two-tailed). †For this analysis, ratings of 0 (poor) were recoded as 1 (emerging/low) to render them comparable with instructors' classification of artifacts as high, intermediate, or low. No coefficient is estimated for ELO 4 because none of its artifacts received a rating of 0.

Kappa scores range from 0 (the level of agreement one would expect by chance) to 1 (perfect agreement). Intermediate values can be interpreted as follows:
.00 – .20   Slight
.21 – .40   Fair
.41 – .60   Moderate
.61 – .80   Substantial
.81 – 1.0   Almost perfect

Ratings for ELOs 2 and 5 showed only slight to fair levels of agreement. Scores for ELO 4 are somewhat better, in the moderate range. None, however, reach substantial levels of agreement. Of course, these analyses are based on a small sample of three raters per ELO, plus the instructor's own assessment.
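The slides do not name the exact estimator; a common choice for three or more raters is Fleiss' kappa, assumed in the following sketch. The data matrix is invented for illustration, and the recoding step mirrors the dagger (†) footnote above. This is not the committee's actual code.

```python
# A minimal sketch of the kappa analysis, assuming Fleiss' kappa for three
# raters. Rows are artifacts, columns are raters, cells are 0-3 scores.
import numpy as np
from statsmodels.stats import inter_rater as irr

data = np.array([
    [3, 2, 3],
    [0, 1, 1],
    [2, 2, 1],
    [1, 0, 1],
    [3, 3, 2],
    [2, 1, 2],
])

table, _ = irr.aggregate_raters(data)            # per-artifact category counts
kappa = irr.fleiss_kappa(table, method="fleiss")

# Dagger (†) variant: recode 0 (poor) as 1 (emerging/low) so rater scores
# align with the instructors' high/intermediate/low classification.
recoded = np.where(data == 0, 1, data)
table_r, _ = irr.aggregate_raters(recoded)
kappa_r = irr.fleiss_kappa(table_r, method="fleiss")

def interpret(k):
    """Map a kappa value to the interpretation bands listed on the slide."""
    for cutoff, label in [(0.20, "Slight"), (0.40, "Fair"), (0.60, "Moderate"),
                          (0.80, "Substantial"), (1.00, "Almost perfect")]:
        if k <= cutoff:
            return label

print(kappa, interpret(kappa))
print(kappa_r, interpret(kappa_r))
```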

Analysis
One benefit of kappa scores is that they can be disaggregated to determine where there is more or less agreement. This table reports disaggregated kappa scores.

Disaggregated kappa interrater agreement scores

        Rating   Raters only   Raters plus instructor†
ELO 2   Poor     .768***       n/a
        Low      .518**        .484***
        Mid      -.178         .063
        High     .000          .188
ELO 4   Low      .822**        .880***
        Mid      .250          .395
        High     .345*
ELO 5   Low      -.091         .308*
        Mid      -.037         .121
        High     .182          .348**

Consider ELO 2: raters (and instructors) showed substantial agreement on artifacts deemed poor or low, but they did not agree about what constitutes an intermediate or high degree of mastery. Likewise, raters for ELO 4 showed very high levels of agreement on low-level work, but they differed in their assessments of mid- and high-quality artifacts. Agreement was weakest for ELO 5, where raters showed a fair amount of agreement on artifacts deemed "low" but disagreed extensively across all other ratings.
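The slides do not specify how the disaggregation was done; one standard option, assumed in the sketch below, is Fleiss' per-category kappa, which measures agreement separately for each rating level. The example counts are invented.

```python
# Per-category (disaggregated) kappa in the style of Fleiss (1971): agreement
# on each rating level, computed from an artifacts-by-categories count table.
import numpy as np

def per_category_kappa(table):
    """table[i, j] = number of raters assigning artifact i to category j."""
    table = np.asarray(table, dtype=float)
    N, _ = table.shape
    n = table.sum(axis=1)[0]            # raters per artifact (assumed constant)
    p = table.sum(axis=0) / (N * n)     # overall proportion of each category
    num = (table * (n - table)).sum(axis=0)
    den = N * n * (n - 1) * p * (1 - p)
    return 1 - num / den                # one kappa per category

# Example: three raters scoring six artifacts on the 0-3 scale.
counts = np.array([
    [0, 0, 1, 2],   # artifact 1: one rater said "mid", two said "high"
    [1, 2, 0, 0],
    [0, 1, 2, 0],
    [1, 2, 0, 0],
    [0, 0, 1, 2],
    [0, 1, 2, 0],
])
print(per_category_kappa(counts))
```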

Summary of Findings
Interrater agreement for ELO 4 was acceptable, but raters tended to agree that students were not very successful in achieving this outcome.
ELO 4: Apply sociological and social-science perspectives to the understanding of real-world problems or topics.
Agreement was much weaker for ELOs 2 and 5.
ELO 2: Utilize sociological theories to guide research and improve understanding.
ELO 5: Communicate effectively about sociological issues.

Recommendations Faculty should discuss how ELOs are interpreted and assessed, with the goal of generating more agreement across instructors, courses, and raters. There seems to be tacit agreement regarding what constitutes low-quality work, but it is more difficult to identify work that is proficient or exemplary. We may want to develop rubrics for evaluating student mastery of ELOs, to improve interrater reliability and to guide instructors as they design student assessments.

Recommendations We should revisit the ELOs with an eye toward distinguishing them more clearly from one another. What does it mean to “utilize sociological theories” (ELO 2) and “apply sociological perspectives” (ELO 4)? How are theories and perspectives different? In qualitative feedback, reviewers noted that assignments designed to assess ELO 2 were not always well-suited to the task, precisely because they asked students to consider perspectives and concepts rather than theories per se.

Recommendations We plan to standardize the department’s assessment procedures and methodologies, which will permit longitudinal analyses in the future. It is essential to compile and analyze comparable data over time to establish trends and track progress (or the lack thereof). The Committee will reanalyze ELOs 1 and 3 this spring using the framework established in this report.

Recommendations The Committee might also consider abandoning the practice of soliciting “stratified samples” of student artifacts from instructors. This strategy primes reviewers to expect patterned variation in the artifacts they evaluate. Raters know that they will be given two examples each of high-, intermediate-, and low-quality work for review, and this knowledge may bias their own independent assessments of the artifacts. If anything, this approach may overstate levels of interrater agreement.
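To make this concern concrete, here is a toy simulation (all quantities are invented, not drawn from the study) contrasting raters who force-rank each six-artifact batch into the expected 2/2/2 strata with raters who score independently against fixed cutoffs. Under these assumptions, forced ranking tends to produce noticeably higher kappa.

```python
# Toy simulation of the priming worry: raters who know each batch contains
# exactly two low, two mid, and two high artifacts force-rank their scores,
# which can inflate kappa relative to independent scoring.
import numpy as np
from statsmodels.stats import inter_rater as irr

rng = np.random.default_rng(0)
true_quality = np.array([0.10, 0.15, 0.50, 0.55, 0.85, 0.90])  # six artifacts

def rate(force_rank):
    noisy = true_quality + rng.normal(0, 0.15, size=6)  # one rater's noisy read
    if force_rank:
        order = np.argsort(noisy)
        scores = np.empty(6, dtype=int)
        scores[order] = [1, 1, 2, 2, 3, 3]   # exactly 2 low / 2 mid / 2 high
        return scores
    return np.digitize(noisy, [1 / 3, 2 / 3]) + 1  # independent fixed cutoffs

for force_rank in (True, False):
    kappas = []
    for _ in range(500):                     # average over simulated batches
        data = np.column_stack([rate(force_rank) for _ in range(3)])  # 3 raters
        table, _ = irr.aggregate_raters(data)
        kappas.append(irr.fleiss_kappa(table, method="fleiss"))
    print("force_rank =", force_rank, "mean kappa =", np.mean(kappas))
```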