
1 Welcome to the UA Assessment Showcase 2012! Sponsored by the Office of the Provost, the Assessment Coordinating Council (ACC), and the Office of Instruction and Assessment

2 Actionable Assessment of Academic Programs: Principles and Practices for Usable Results. Jo Beld, Professor of Political Science, Director of Evaluation & Assessment. Assessment Showcase, April 17, 2012

3 Agenda
- Principles: Conceptual frameworks
- Practices: Making assessment useful
- Practices: Engaging faculty
- Politics and policy: The bigger picture

4 Conceptual frameworks Utilization-focused assessment (Patton, 2008): Focus on intended uses by intended users

5 Conceptual frameworks Backward design (Wiggins & McTighe, 2005): “Beginning with the end in mind”

6 Conceptual frameworks Traditional assessment design:
- Choose an assessment instrument
- Gather and summarize evidence
- Send a report to someone

7 Conceptual frameworks Backward assessment design:
- Identify intended users and uses
- Define and locate the learning
- Choose assessment approach

8 Conceptual frameworks Once you’ve defined your outcomes, start planning your assessment project here

9 Making assessment useful Studio art major
- Developed evaluation form for senior exhibit that doubles as assessment instrument
- Addressed disconnect between student and faculty criteria for artistic excellence
- Revised requirements for the major
- Refocused common foundation-level courses

10 Making assessment useful Chemistry major
- Using ACS exam as final in Chem 371: Physical Chem
- Students outperform national average and do well in kinetics despite limited coverage in course
- Chem 371 being retooled to focus on thermodynamics and quantum mechanics

11 Making assessment useful History major
- Gathering evidence in 2011-12 (voluntarily!) to examine sequencing in the major
- Examining ability to understand and work with historiography in new intermediate seminars for the major

% "exemplary" ability to...                      First-year seminars    Senior seminars
Differentiate primary from secondary source      45%                    91%
Use primary source to develop argument           19%                    60%

12 Making assessment useful Statistics concentration
- Collaboratively designed final exam question and grading rubric in Stats 270 to examine interpretation and communication of results; two faculty graded essays
- Instructor adjusted teaching in response to findings

13 Making assessment useful Management Studies concentration
- Quiz scores: Teams outperform best individual students
- Course evaluations: Students believe they learned "much" or "exceptional amount" by working together in teams (73%)
- Team-based learning being extended to other courses

Mean results – Management Studies 251 course quizzes
             Highest individual score    Team quiz score
Section A    4.36                        4.79
Section B    4.39                        4.74

14 Making assessment useful Interdisciplinary programs
- Collaboratively developed assessment questionnaire
- Considering direct assessment of interdisciplinary proficiency using common rubric with program-level portfolio
- Will consider whether all programs should have capstone course or experience

15 Making assessment useful Benefits for individual courses:
- Setting priorities for content/instruction
- Revising/expanding assignments
- Clarifying expectations for students
- Enhancing "scaffolding"
- Piloting or testing innovations
- Affirming current practices

16 Making assessment useful Benefits for the program as a whole:
- Strengthening program coherence
- Sending consistent messages to students
- Revising program requirements
- Extending productive pedagogies
- Affirming current practices

17 Making assessment useful More program benefits:
- Telling the program's story to graduate schools and employers
- Enhancing visibility to disciplinary and inter-disciplinary associations
- Supporting grant applications
- Meeting requirements for specialized accreditation

18 Making assessment useful Benefits for faculty members:
- Efficiencies in curriculum and instruction
- Confidence that what you're doing is working
- Collaboration and collegiality within and across departments
- Professional development for early faculty
- Better integration of adjunct faculty

19 Making assessment useful How might assessment be useful for an individual course, your program as a whole, or your faculty colleagues?

20 Engaging faculty
- Consider your colleagues
- De-mystify assessment
- Reduce costs and enhance rewards

21 Engaging faculty Consider your colleagues: Faculty roles, commitments, and disciplinary identities offer both incentives and disincentives to engage in assessment

22 Engaging faculty Your colleagues as practitioners of their disciplines: Studio Art

Incentives                                                     Disincentives
Art is publicly displayed and evaluated                        Studio artists work alone rather than collaboratively
Discipline values creativity, risk-taking and inventiveness    "Assessment" connotes "counting"

23 Engaging faculty Your colleagues as practitioners of their disciplines: Chemistry

Incentives                                                Disincentives
Habit of collecting and interpreting observable evidence  More than two variables is scary
Collaborative inquiry is practiced extensively            Less experience with qualitative inquiry

24 Engaging faculty Your colleagues as practitioners of their disciplines: Political Science

Incentives                                Disincentives
Embracing multiple types of scholarship   Awareness of limits of social scientific research
Embracing multiple methods of inquiry     View of assessment as a policy innovation that will eventually "go away"

25 Engaging faculty Demystifying assessment:
- Not scholarship of teaching and learning
- Not individual teaching evaluation
- Not student satisfaction data
- Not necessarily quantitative
- Not rocket science (unless that's what you teach!)

26 Engaging faculty
"Direct" assessment: Evidence of what students actually know, can do, or care about
"Indirect" assessment: Evidence of learning-related experiences or perceptions

27 Engaging faculty Common direct assessment "artifacts":
- Theses, papers, essays, abstracts
- Presentations and posters
- Oral or written examination items
- Responses to survey or interview questions that ask for examples of knowledge, practice, or value

28 Engaging faculty Common indirect assessment "artifacts":
- Course mapping, course-taking patterns or transcript analysis
- Responses to survey or interview questions about experiences, perceptions, self-reported progress, or impact of program experiences
- Reflective journals

29 Engaging faculty But wait!! Aren’t we observing student work all the time anyway? What’s the difference between grading and assessment?

30 Engaging faculty Grading summarizes many outcomes for one student. Assessment summarizes one outcome for many students.

31 Engaging faculty The purpose of assessment is to provide systematic, summarized information about the extent to which a group of students has realized one or more intended learning outcomes

32 Engaging faculty Reducing the costs of assessment:
- Use what you've already got
- Borrow freely
- Integrate assessment into work you are already doing
- Share the work broadly
- Limit your agenda

33 Engaging faculty Reaping the rewards of assessment:
- Address questions that matter to faculty
- Build in collaboration
- Pair direct with indirect methods
- Choose approaches that "multi-task"
- Dedicate time for discussion and application

34 Engaging faculty Plan intentionally for use of results:
- Borrow strategies from past successes in collective departmental action
- Focus reporting on planned actions, not on the evidence itself
- Weight Watchers, not The Biggest Loser
- Dedicate time and resources for action

35 Engaging faculty What can you do in your program to:
- Link assessment to faculty identities and incentives
- De-mystify assessment
- Reduce costs OR enhance benefits?

36 The bigger picture

37 Accreditation by a federally-recognized accreditation agency is required for access to federal student aid. Recognition requires accreditors to evaluate whether an institution maintains clearly-specified educational objectives and is successful at meeting them.

38 The bigger picture Guiding values of new HLC criteria: “A commitment to assessment would mean assessment at the program level that proceeds from clear goals, involves faculty at all points in the process, and analyzes the assessment results; it would also mean that the institution improves its programs…on the basis of those analyses.”

39 The bigger picture

40 Table Talk – Questions for Dr. Beld. Table facilitators (a.k.a. FLC members):
- Paul Blowers, Chemical & Environmental Engineering
- Eliud Chuffe, Spanish & Portuguese
- Faiz Currim, Management Information Systems
- Wendy Davis, Animal Science
- Ryan Foor, Agricultural Education
- Herman Gordon, Cellular & Molecular Medicine
- Christopher Johnson, Educational Technology
- Amy Kimme-Hea, English
- Carl Maes, College of Optical Sciences
- Katrina Miranda, Chemistry & Biochemistry
- John Murphy, Pharmacy Practice & Science
- Teresa Polowy, Russian & Slavic Studies
- Claudia Stanescu, Physiology
- Hal Tharp, Electrical & Computer Engineering
- Deb Tomanek, Office of Instruction & Assessment

41 Assessment & Research in Student Affairs Angela Baldasare, Ph.D. Director, Divisional Assessment & Research baldasar@email.arizona.edu

42 Starting from Scratch: From Outcomes to Assessment Activities Aurelie Sheehan, Ph.D. Director, Creative Writing asheehan@email.arizona.edu

43 Unlocking Assessment: Linking Findings to Outcomes David Cuillier, Ph.D. Director, University of Arizona School of Journalism cuillier@email.arizona.edu

44 Our Process
1. Define learning outcomes
2. Measure at overall program level
3. Link findings specifically to outcomes
4. Make adjustments (report & faculty retreat)
5. Feedback loop – see if it worked

45 EXAMPLES…

46 Outcome #10: Technology
MEASURE: 2009 survey of multimedia knowledge, on a scale of 0-9:
Photoshop      6.24
Final Cut      2.59
Dreamweaver    0.76
Soundslides    0.47
Audacity       0.35
CSS            0.35
Flash          0.24
FINDING: Need more Soundslides/Audacity training
ADJUSTMENT: Created multimedia class in 2010
FEEDBACK LOOP: To survey students again in 2012

47 Outcome #9: Writing
MEASURE: Survey of intern supervisors
FINDING: Positive trajectory on student writing
ADJUSTMENT: Keep doing what we're doing

48 Lessons learned
1. Faculty buy-in through program-level view
2. One person responsible
3. Model assessment plans (e.g., Elon)
4. Explicit reporting – state it clearly
5. Focus on findings, not methods

49 Program Assessment: Raw Data to Findings Ingrid Novodvorsky, Ph.D. Director, College of Science Teacher Preparation Program novod@email.arizona.edu

50 TEACHER PREPARATION PROGRAM Student Learning Outcomes – Core Understandings (These describe attributes of a well-prepared science teacher.)

Core Understanding                        Course Where Assessed
Science content understanding             Subject methods course, STCH 410, STCH 496a
Understanding of adolescent development   STCH 310, STCH 496a
Coherent curriculum decisions             STCH 420, STCH 496a
Productive learning environment           STCH 410, STCH 496a
Clear communication                       STCH 410, STCH 420, STCH 496a
Acknowledgement of teaching complexity    STCH 410, STCH 420, STCH 496a
Reflective practice                       STCH 410, STCH 420, STCH 496a

51 TEACHER PREPARATION PROGRAM (Core Understandings table repeated from slide 50)

52 TEACHER PREPARATION PROGRAM STCH 410 Mentor Teacher Evaluation Interns are assessed on a three-point scale on five Core Understandings.

53 TEACHER PREPARATION PROGRAM STCH 410 Mentor Teacher Evaluation

54 TEACHER PREPARATION PROGRAM (Core Understandings table repeated from slide 50)

55 TEACHER PREPARATION PROGRAM Student Teaching Exit Evaluation (University Supervisor/Mentor Teacher). Student teachers are assessed on a four-point scale on each Core Understanding.

56
Core Understanding | 2003-04 (N=12) | 2004-05 (N=7) | 2005-06 (N=13) | 2006-07 (N=10) | 2007-08 (N=15) | 2008-09 (N=15) | 2009-10 (N=11) | 2010-11 (N=13)
Content understanding | 3.40 | 3.45 | 3.50 | 3.50 | 3.57 | 3.77 | 3.59 | 3.75
Understanding of adolescent development | 3.47 | 3.58 | 3.62 | 3.43 | 3.38 | 3.36 | 3.32 | 3.58
Coherent curriculum decisions | 3.67 | 3.78 | 3.71 | 3.48 | 3.40 | 3.51 | 3.39 | 3.43
Productive learning environment | 3.63 | 3.83 | 3.69 | 3.65 | 3.33 | 3.33 | 3.45 | 3.46
Clear communications | 3.58 | 3.89 | 3.74 | 3.79 | 3.78 | 3.83 | 3.86 | 3.83
Acknowledgement of teaching complexity | 3.58 | 3.89 | 3.79 | 3.80 | 3.53 | 3.47 | 3.36 | 3.58
Reflective practice | 3.66 | 3.81 | 3.92 | 3.72 | 3.81 | 3.76 | 3.73 | 3.74

57 (Exit Evaluation data table repeated from slide 56)

58 TEACHER PREPARATION PROGRAM Program Assessment: Raw Data to Findings – Questions?

59 Live Long & Assess! Sponsored by the Office of the Provost, the Assessment Coordinating Council (ACC), and the Office of Instruction and Assessment

