What “Counts” as Evidence of Student Learning in Program Assessment?
Presentation transcript:

Dr. Marsha Watson, Director of Assessment
Dr. Kenny Royal, Assistant Director of Measurement & Analysis
Dr. Julie F. Johnson, Assessment Specialist

Completion Dates (two tracks: not actively engaged vs. actively engaged in program level assessment):
• Sept 2009: Program level student learning outcomes revised and/or updated
• Dec 2009: Assessment strategy in place
• Jan-Mar 2010: Assessment strategy implemented
• April 2010: Assessment results available for faculty reflection and action
• May 2010: First cycle completed and improvement plans submitted (not actively engaged); at least one cycle completed and improvement plans submitted (actively engaged)
• September 2010: First annual LEARNING Improvement awards announced
• May 2011: Two cycles completed (not actively engaged); at least two cycles completed (actively engaged)
• August 2011: SACS Compliance Audit begins
• September 2011: Second annual LEARNING Improvement awards announced

• How are your stated student learning outcomes appropriate to your mission, programs, degrees, and students?
• What evidence do you have that students achieve your stated learning outcomes?
• In what ways do you analyze and use evidence of student learning?
• How do you ensure shared responsibility for student learning and for assessment of student learning?
• How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning?
• In what ways do you inform the public and other stakeholders about what and how well your students are learning?

• University assessment
  ◦ Campus-wide assessment of student learning at the program level (e.g., General Education)
  ◦ University assessment is the primary charge of the Office of Assessment
• University assessment is separate and distinct from evaluation of teaching effectiveness
  ◦ Evaluation of teaching effectiveness is the responsibility of departments/colleges
• Assessment data are analyzed and reported only in the aggregate
• You can’t assess everything all the time!
  ◦ Plan for assessment that is practical, given current time and resource constraints
  ◦ Assess 1 or 2 outcomes per year

• Assessment vs. evaluation
  ◦ Assessment requires us to “take a step back” from the interaction between student and teacher
  ◦ Grades are evaluations and are generally not used for assessment
• Team approach to evaluation
  ◦ Essentially a juried assessment, in that more than one individual is scoring/evaluating
  ◦ A periodic, objective validation process of some kind is required to ensure validity and reliability

• Three levels of assessment
  ◦ Course
  ◦ Program
    ▪ Undergraduate majors/programs
    ▪ General education program
    ▪ Graduate majors/programs
  ◦ Institutional
• Course, program, and institutional outcomes should be aligned, but they are not identical

• Focused on curricular and environmental improvement
• Formative and summative, direct and indirect methods
• Curriculum mapping, program improvement

• Focus on broad skills developed over time
  ◦ Not restricted to a single course or learning experience
• Demonstrate acquisition of specific disciplinary/professional knowledge and skills necessary after graduation
  ◦ Ask: “What makes a graduate of the program able to function and learn in a specific discipline/profession after the degree?”
• Measurable
  ◦ Confirmable through evidence

• Measures must be appropriate to outcomes
  ◦ Avoid cumbersome data-gathering
  ◦ Use both direct and indirect methods
    ▪ Indirect methods measure a proxy for student learning
    ▪ Direct methods measure actual student learning
  ◦ “Learning” = what students know (content knowledge) + what they can do with what they know

• Information that tells you something directly or indirectly about the topic of interest
• Evidence is neutral: neither “good” nor “bad”
  ◦ Requires context to be meaningful
• Two types of assessment evidence
  ◦ Direct (“authentic”) and indirect
• Best practice calls for multiple methods

• Students show achievement of learning goals through performance of knowledge and skills:
  ◦ Scores and pass rates on licensure/certification exams
  ◦ Capstone experiences
    ▪ Individual research projects, presentations, performances
    ▪ Collaborative (group) projects/papers that tackle complex problems
  ◦ Score gains between entry and exit
  ◦ Ratings of skills provided by internship/clinical supervisors
  ◦ Substantial course assignments that require performance of learning
  ◦ Portfolios
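For a concrete sense of how two of these direct measures, entry-to-exit score gains and licensure pass rates, might be summarized for program-level reporting, here is a minimal Python sketch. The scores, the record layout, and the 75-point passing threshold are all hypothetical and only illustrate the arithmetic.

```python
# Hypothetical sketch: summarizing two direct measures in the aggregate.
# Records, score ranges, and the passing threshold are illustrative only.
entry_exit_scores = [
    {"entry": 62, "exit": 81},
    {"entry": 70, "exit": 78},
    {"entry": 55, "exit": 74},
]
licensure_results = [88, 91, 67, 74, 95]  # exam scores; 75 assumed passing

# Average score gain between entry and exit (a direct measure of learning).
gains = [r["exit"] - r["entry"] for r in entry_exit_scores]
avg_gain = sum(gains) / len(gains)

# Pass rate on the licensure/certification exam.
pass_rate = sum(score >= 75 for score in licensure_results) / len(licensure_results)

# Report only aggregate results, never individual students.
print(f"Average entry-to-exit gain: {avg_gain:.1f} points (n={len(gains)})")
print(f"Licensure exam pass rate: {pass_rate:.0%} (n={len(licensure_results)})")
```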

• Indirect methods measure proxies for learning
  ◦ Data from which you can make inferences about learning but that do not demonstrate actual learning, such as perception or comparison data
  ◦ Surveys
    ▪ Student opinion/engagement surveys
    ▪ Student ratings of knowledge and skills
    ▪ Employer and alumni surveys, national and local
  ◦ Focus groups/exit interviews
  ◦ Course grades
  ◦ Institutional performance indicators
    ▪ Enrollment data
    ▪ Retention rates, placement data
    ▪ Graduate/professional school acceptance rates

• Create a visual map:
  ◦ Lay out program courses and learning outcomes (competencies) on a grid
    ▪ Refer to examples (Handouts)
  ◦ Identify the courses at which each competency is:
    ▪ Introduced
    ▪ Reinforced
    ▪ Emphasized

Basic Program Map Template

Courses: Course #1 (baseline assessment), Course #2, Course #3 (mid-program assessment), Course #4, Course #5 (capstone assessment)

Outcome 1: I  R  R  E  R
Outcome 2: R  R  E
Outcome 3: I  E  R  E
Outcome 4: E  R  R

I = Outcome is introduced; baseline, formative assessment
R = Outcome is reinforced; formative assessment
E = Outcome is emphasized; summative assessment
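The same grid can also be kept as a small data structure so a program can spot gaps mechanically. The Python sketch below is only an illustration: the course assignments shown for Outcome 2 are invented, and the completeness check (every outcome introduced, reinforced, and emphasized somewhere in the sequence) is just one possible review rule, not part of the template itself.

```python
# Illustrative curriculum map as a simple data structure.
# Course names and the I/R/E codes follow the template above; the
# specific course assignments for Outcome 2 are assumptions.
curriculum_map = {
    "Outcome 1": {"Course #1": "I", "Course #2": "R", "Course #3": "R",
                  "Course #4": "E", "Course #5": "R"},
    "Outcome 2": {"Course #1": "R", "Course #3": "R", "Course #5": "E"},
}

for outcome, courses in curriculum_map.items():
    codes = set(courses.values())
    missing = {"I", "R", "E"} - codes
    if missing:
        # Flag outcomes that are never introduced, reinforced, or emphasized.
        print(f"{outcome}: no course marked {', '.join(sorted(missing))}")
    else:
        print(f"{outcome}: introduced, reinforced, and emphasized across the curriculum")
```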

• Lets you discover the evidence you already have, such as:
  ◦ Institutional Research data
  ◦ Student Life data
  ◦ Exit surveys (seniors)
  ◦ Alumni surveys
• Start with the obvious … but don’t stop there
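An inventory like this can be recorded explicitly so gaps are easy to see. The following Python sketch is hypothetical: the outcome labels and evidence sources are invented for the example, and "direct"/"indirect" follows the distinction made earlier in these slides.

```python
# Hypothetical evidence inventory: map each outcome to existing evidence
# sources and flag outcomes that currently lack any direct evidence.
inventory = {
    "Written communication": [
        {"source": "Capstone paper (rubric-scored)", "type": "direct"},
        {"source": "Senior exit survey", "type": "indirect"},
    ],
    "Quantitative reasoning": [
        {"source": "Alumni survey", "type": "indirect"},
    ],
}

for outcome, sources in inventory.items():
    has_direct = any(s["type"] == "direct" for s in sources)
    status = "covered" if has_direct else "GAP: needs a direct measure"
    print(f"{outcome}: {len(sources)} source(s); {status}")
```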

• Institutional history
  ◦ “We’ve already done that, and it didn’t tell us anything!”
• Territory; politics
  ◦ Fighting for scant resources
• Institutional policy/culture about sharing information
  ◦ “I don’t want somebody ‘policing’ my classrooms!”

• Does the evidence address student learning issues appropriate to the institution?
• Does the evidence tell you something about how well the institution is accomplishing its mission and goals?
  ◦ The questions you have about student learning should guide your choice of appropriate existing evidence and identify gaps where a new type of evidence might be needed

• “This is a lot of work!”
  ◦ Use some sort of evidence inventory to help faculty understand how existing academic practices yield evidence
  ◦ Keep expectations reasonable, given limited time and resources
• Remember: it is not necessary to gather all the evidence all of the time

• “How do I know you won’t use this against me?”
  ◦ Be consistent and firm in the message that assessment is not faculty evaluation and that results will be reported only in the aggregate
  ◦ Remember: assessment results will link to allocation of resources, ideally through the strategic planning process

• Assessment is only a means to an end
  ◦ The purpose of assessment is continuous improvement of student learning
• The assessment cycle is complete when assessment results have been used successfully for evidence-based decision making

• Articulate expectations in the form of student learning outcomes
• Measure achievement of expectations
• Collect and analyze data
• Use evidence to improve learning
• Assess the effectiveness of improvement

• Unit Assessment Plan Template (Handout)
  ◦ Use this template as a foundation for your unit assessment plan, revising and reshaping as necessary