UK Office of Assessment

The LEARNING Initiative: Dual Track Implementation Strategy (Completion Dates)

Track 1: programs not actively engaged in program-level assessment
Track 2: programs actively engaged in program-level assessment

Sept 2009: Program-level student learning outcomes revised and/or updated
Dec 2009: Assessment strategy in place
Jan-Mar 2010: Assessment strategy implemented
April 2010: Assessment results available for faculty reflection and action
May 2010: Track 1 completes its first cycle and submits improvement plans; Track 2 completes at least one cycle and submits improvement plans
September 2010: First annual LEARNING Improvement awards announced
May 2011: Track 1 completes two cycles; Track 2 completes at least two cycles
August 2011: SACS Compliance Audit begins
September 2011: Second annual LEARNING Improvement awards announced

 Evidence that supports compliance must be:
◦ Reliable
◦ Current
◦ Verifiable
◦ Coherent
◦ Objective
◦ Relevant
◦ Representative
 Evidence should also:
◦ Entail interpretation and reflection
◦ Represent a combination of trend and snapshot data
◦ Draw from multiple indicators

 Measures must be appropriate to outcomes:
◦ Avoid cumbersome data-gathering
◦ Use both indirect and direct methods
◦ Learning = what students know (content knowledge) + what they can do with what they know (performance)

Assessment Inventory (fill out the degree program title, then skip to Page 2)

University of Kentucky Assessment Inventory for General Education and Degree Programs

College: ____________________
Department: ____________________
General Education/Degree Program: ____________________
Undergraduate/Graduate/Professional: ____________________

Part I: Inventory of Statements and Plans

1. Is there a written mission statement or statement of purpose for this program and/or the department or unit within which the program is located? ___ Yes ___ No
   If Yes, please copy and paste, attach a copy, or send a link.
2. Have you articulated student learning outcomes which describe what a student should know or be able to do when they have completed this program? ___ Yes ___ No
   If Yes, please copy and paste, attach a copy, or send a link.
3. Have you chosen a method(s) of assessment for measuring student learning outcomes? ___ Yes ___ No
   If Yes, please copy and paste, attach a copy, or send a link.
4. Do you have a document (such as a curriculum map) that links student learning outcomes to the program curriculum? ___ Yes ___ No
   If Yes, please copy and paste, attach a copy, or send a link.
5. Have you determined an assessment cycle and fully articulated an assessment plan? ___ Yes ___ No
   If Yes, please copy and paste, attach a copy, or send a link.
6. Does this program have an accreditation process(es) separate from SACS? ___ Yes ___ No

 Students show achievement of learning goals through performance of knowledge and skills:
◦ Scores and pass rates on licensure/certificate exams
◦ Capstone experiences
   Research projects, theses, dissertations, presentations, performances
   Collaborative (group) projects/papers that tackle complex problems
◦ Score gains between entry and exit (see the sketch after this list)
◦ Ratings of skills provided by internship/practicum supervisors
◦ Substantial course assignments that require performance of learning
◦ Portfolios
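To make the score-gains bullet concrete, here is a minimal sketch in Python, assuming matched entry and exit exam scores keyed by student ID; the data layout, IDs, and score values are hypothetical, not part of the workshop materials.

```python
# Minimal sketch: entry-to-exit score gains for a program outcome.
# The student IDs and scores below are invented for illustration.

entry_scores = {"s001": 62.0, "s002": 71.5, "s003": 58.0}
exit_scores = {"s001": 81.0, "s002": 85.5, "s003": 74.0}

# Pair scores only for students who completed both assessments.
paired = [(entry_scores[s], exit_scores[s])
          for s in entry_scores if s in exit_scores]

gains = [post - pre for pre, post in paired]
mean_gain = sum(gains) / len(gains)

print(f"n = {len(gains)}, mean gain = {mean_gain:.1f} points")
```

A real implementation would also account for attrition: students who leave the program before the exit measure never appear in the paired data, which can bias the gain upward.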

 Indirect methods measure proxies for learning:
◦ Data from which you can make inferences about learning but which do not demonstrate actual learning, such as perception or comparison data
◦ Surveys
   Student opinion/engagement surveys
   Student ratings of their knowledge and skills
   Employer and alumni perceptions, national and local
◦ Focus groups/exit interviews
◦ Aggregate analyses of course grade distributions (see the sketch after this list)
◦ Institutional performance indicators
   Enrollment data
   Retention rates, placement data
   Graduate/professional school acceptance rates
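For the grade-distribution bullet, a minimal sketch of rolling section-level grades up into an aggregate distribution; the course, section, and grade data are invented for illustration.

```python
from collections import Counter

# Hypothetical grade rosters for two sections of the same course.
sections = {
    "BIO 301-001": ["A", "A", "B", "B", "C", "D"],
    "BIO 301-002": ["A", "B", "B", "C", "C", "F"],
}

# Aggregate across sections, then report each grade's share.
totals = Counter()
for grades in sections.values():
    totals.update(grades)

n = sum(totals.values())
for grade in sorted(totals):
    print(f"{grade}: {totals[grade]}/{n} ({totals[grade] / n:.0%})")
```

As the slide notes, this remains an indirect measure: a grade distribution supports inferences about learning but does not itself demonstrate what students can do.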

 Curriculum-embedded, performance-based methods:
◦ Normal student work products already in place
   Individual and collaborative papers, projects, performances, etc.
◦ Internships, practicums, clinicals
   Supervisor/employer judgments
   Rubrics, crosswalks/matrices, panels of experts
◦ Instruments/methods that translate qualitative assessments into quantitative data (a crosswalk sketch follows this list)
   Locally developed tests
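One common way to implement the "translate qualitative assessments into quantitative data" item is a crosswalk that maps rating language onto a numeric scale. The four-point scale and rating labels below are assumptions for illustration, not taken from the workshop.

```python
# Hypothetical crosswalk: supervisor rating language -> numeric scale.
CROSSWALK = {
    "unsatisfactory": 1,
    "developing": 2,
    "proficient": 3,
    "exemplary": 4,
}

# Qualitative ratings collected from internship supervisors (invented).
ratings = ["proficient", "exemplary", "developing", "proficient"]

scores = [CROSSWALK[r] for r in ratings]
print(f"mean supervisor rating = {sum(scores) / len(scores):.2f} of 4")
```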

 Rubrics are essential for curriculum-embedded, performance-based assessment because they provide:
◦ A clear articulation of how expectations are linked to specific course and program outcomes
◦ A means of increasing consistency in grading across sections, courses, programs, and colleges
◦ An objective method of gathering quantitative data on complex performances of learning

 Analytic rubric: each performance indicator is assigned a numerical value, and the final score is the sum of the indicator values (scoring for both rubric types is sketched below)
◦ Extremely low inter-rater reliability
◦ Labor- and time-intensive
◦ Best used for "drilling down" into data
 Holistic rubric: a single score is assigned to the whole performance
◦ Very high inter-rater reliability
◦ Quick scoring (labor- and time-efficient)
◦ The single score masks exact performance on each indicator
◦ Best used for large-scale assessment
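A minimal sketch of the scoring difference between the two rubric types; the indicator names and point values are hypothetical.

```python
# Analytic rubric: each performance indicator gets its own value;
# the final score is the sum, so indicator-level detail is preserved.
analytic = {"thesis": 3, "evidence": 2, "organization": 4, "mechanics": 3}
analytic_total = sum(analytic.values())
print(f"analytic total: {analytic_total}, by indicator: {analytic}")

# Holistic rubric: one score for the whole performance. It is quicker
# to assign, but the single number masks which indicators were weak.
holistic = 3
print(f"holistic score: {holistic}")
```

This is the trade-off the slide describes: the analytic mapping retains per-indicator detail for drilling down, while the single holistic score scales cheaply for large-scale assessment.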

 Take stock of the assessments the program is already conducting
◦ The Assessment Inventory
 Analyze your program curriculum map to:
◦ Determine the best assessment points (see the sketch after this list)
◦ Map the major student work products (artifacts) already being generated in the program
 Then create new assignments/tasks as needed at strategic program assessment points
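A minimal sketch of analyzing a curriculum map for assessment points. It assumes the common Introduced/Developed/Mastered (I/D/M) coding, which the slides themselves do not specify; the outcome and course names are invented.

```python
# Hypothetical curriculum map: outcome -> {course: coverage level},
# coded I (Introduced), D (Developed), or M (Mastered).
curriculum_map = {
    "SLO 1": {"ENG 101": "I", "ENG 210": "D", "ENG 410": "M"},
    "SLO 2": {"ENG 101": "I", "ENG 305": "D", "ENG 490": "M"},
}

# Courses where an outcome reaches "M" are natural assessment points:
# student work produced there should show exit-level performance.
for outcome, courses in curriculum_map.items():
    points = [c for c, level in courses.items() if level == "M"]
    print(f"{outcome}: assess in {', '.join(points) or 'no M-level course'}")
```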

 Please fill out the Workshop Evaluation Form. Thanks!