Using WIDS to Document How Colleges Assess Learning Outcomes


Overview
- What is assessment?
- What are the drivers?
- How does curriculum fit in?
- How does WIDS help?

Assessment of Learning
“Assessment of student learning is a participatory, iterative process that:
- Provides data/information you need on your students’ learning
- Engages you and others in analyzing and using this data/information to confirm and improve teaching and learning
- Produces evidence that students are learning the outcomes you intended
- Guides you in making educational and institutional improvements
- Evaluates whether changes made improve/impact student learning, and documents the learning and your efforts.”
NCA Higher Learning Commission, from “Student Learning, Assessment and Accreditation: Criteria and Contexts,” presented at Making a Difference in Student Learning: Assessment as a Core Strategy, a Higher Learning Commission workshop, July 26-28, 2006.

Assessment of Learning
- Informed judgments about achievement of intended learning outcomes that are:
  - Based on evidence of what students can do and their capacity to apply what they know (data) at the completion of the learning experience (course, certificate, program, or other credential)
  - Valid, reliable, and fair
  - Built into the plan for teaching
- Data-driven evaluation that is actively used for continual improvement of teaching and learning

Sound Assessment: Valid, Reliable, Fair
Valid
- Outcomes based on (industry) standards
- Measures intended outcomes
- Measures application and critical thinking
Reliable
- Performance assessment based on consistent rubrics, scoring guides, and rating scales
- Consistent process: each learner is assessed in the same way as other learners
Fair
- Learners informed of expectations up front
- Feedback to learners

Summative/Formative Assessment
Summative (program level or end of course):
- Program quality
- Student credentials
Formative (course level):
- Feedback to student
- Improvement of learning/teaching
Both feed continuous improvement.

What are the drivers?
1. Learning and continuous improvement of teaching and learning
2. Accreditation (NCA, AQIP, industry)
3. Carl Perkins IV

HLC/AQIP asks five fundamental questions:
- How are your stated learning outcomes appropriate to your mission, programs, students, and degrees?
- How do you ensure shared responsibility for student learning and assessment of student learning?
- What evidence do you have that students achieve your stated learning outcomes?
- In what ways do you analyze and use evidence of student learning?
- How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning?
NCA Higher Learning Commission, from “Student Learning, Assessment and Accreditation: Criteria and Contexts,” presented at Making a Difference in Student Learning: Assessment as a Core Strategy, a Higher Learning Commission workshop, July 26-28, 2006.

Carl Perkins IV – The New Law
- The what: Both secondary and postsecondary education must develop a process for measuring technical skills that leads to an industry-recognized, state-recognized, or locally developed credential.
- The why: Both secondary and postsecondary education systems must provide, as a separate measure of accountability, the extent of skill development in Career and Technical Education (CTE) programs.
- The how: Both secondary and postsecondary education must develop valid and reliable assessments.

Carl Perkins IV – The New Law: Postsecondary Indicators
- Technical skill attainment
- Credential, certificate, or diploma
- Student retention or transfer
- Student placement
- Nontraditional participation
- Nontraditional completion

Bronze, Silver, Gold
WTCS will move toward a balance of silver and gold assessment of learning outcomes.

Carl Perkins IV – The New Law: Summary of Standards for Assessment of Technical Skill Attainment
- Gold standard (e.g., NCLEX, ASE, Barber/Cosmetology): an external, third-party assessment that objectively measures student attainment of industry-recognized skills upon graduation.
- Silver standard: a WTCS-approved assessment that objectively measures student attainment of industry-recognized skills upon graduation. It may be developed and/or implemented by the state system, or locally according to system guidelines and with industry (advisory committee) approval. It must be valid and reliable.
- Bronze: non-assessment indicators such as indirect measures:
  - GPA and/or course completion
  - Program completion (graduation)
  - Teacher-developed exams that do not meet the silver standard (not externally approved)
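Purely as an illustration of the tiering logic above (the function and field names are hypothetical, not an official WTCS rule set), the three standards can be read as a decision sequence:

```python
# Hypothetical sketch of the gold/silver/bronze decision sequence.
# Parameters and thresholds are illustrative, not official WTCS rules.
def classify_assessment(external_third_party: bool,
                        wtcs_approved: bool,
                        valid_and_reliable: bool) -> str:
    """Classify a technical skill assessment by the standard it meets."""
    if external_third_party:
        return "gold"    # e.g., NCLEX, ASE, Barber/Cosmetology
    if wtcs_approved and valid_and_reliable:
        return "silver"  # state- or locally developed per system guidelines
    return "bronze"      # indirect measures, non-approved teacher exams

print(classify_assessment(external_third_party=False,
                          wtcs_approved=True,
                          valid_and_reliable=True))  # -> silver
```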

Assessment Process and Curriculum
- Plan: establish outcomes, build assessments, design learning
- Do: deliver, assess, document
- Check: analyze assessment data, propose improvements
- Act: adjust design, adjust delivery, adjust assessment

The Learning Assessment Design Technology Tool Kit
- TracDat
- Blackboard
- iWebfolio
- eFolio
- Questionmark
- PeopleSoft
- Banner
- Datatel

Assessment Management Software (e.g., TracDat)
- Describe outcomes (learning/organizational)
- Build assessment plans
- Track continual improvement activities
- Document/report results

WIDS
- Establish learning outcomes
- Build assessments
- Design and plan learning

Enterprise System (e.g., PeopleSoft, Datatel, or Banner)
- Manage student records
- Manage enrollment/scheduling
- Manage budget, etc.

Assessment Software (e.g., Questionmark)
- Author Q&A assessments and surveys
- Deliver assessments/surveys
- Record learners’ performance (in terms of outcomes)

Electronic Portfolio (e.g., iWebfolio, MnSCU eFolio)
- Present evidence of individual learning/professional experience
- Document the connection between artifacts and outcomes
- Owned by the student

How the pieces connect: the WIDS Learning Design System populates the assessment, electronic portfolio, and enterprise systems with learning outcomes and performance standards. Assessment management software provides the feedback loop for continual improvement of learning design in WIDS. The web-based electronic portfolio system presents evidence of an individual’s learning and professional experience. The data exchanged includes learning outcomes, curriculum documents, rubrics, student learning documentation, continuous improvement feedback, and student data.
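The connections above amount to a shared data contract: every system keys its records to the same outcome identifiers and performance standards. A minimal sketch of what such an exported outcome record might look like (hypothetical names and fields, not an actual WIDS, TracDat, or Questionmark API):

```python
from dataclasses import dataclass, field

@dataclass
class LearningOutcome:
    """One learning outcome with its performance standards, roughly the
    record a design system exports to downstream systems."""
    outcome_id: str
    statement: str
    level: str  # "course", "program", or "external"
    performance_standards: list[str] = field(default_factory=list)

def export_outcome(outcome: LearningOutcome) -> dict:
    """Serialize an outcome so assessment, portfolio, and enterprise
    systems can record results against the same identifiers."""
    return {
        "id": outcome.outcome_id,
        "statement": outcome.statement,
        "level": outcome.level,
        "standards": outcome.performance_standards,
    }

# The shared record that every downstream system keys its data to.
po3 = LearningOutcome(
    outcome_id="PO-3",
    statement="Apply ethical, legal and regulatory concepts to oral health care",
    level="program",
    performance_standards=["Follows state licensing laws and regulations"],
)
print(export_outcome(po3))
```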

WIDS Model
- Who are the learners? (WHO)
- What do they need to be able to achieve? (WHAT)
- How will we know and show when they’ve achieved it? (WHEN)
- How will they get there? (HOW)
Three of the four components answer assessment questions.

Integrating the Outcomes
- External (drives learning design):
  - Job task analysis (DACUM)
  - External standards (research-based)
- Program level (summative assessment of exit learning outcomes):
  - Program outcomes
  - Core abilities
  - General education outcomes
- Course level (formative assessment of program outcomes):
  - Linked competencies (performance assessments)

Dental Hygiene Example
- External (ADA Dental Hygiene Education Standard, drives learning design): Graduates are competent in applying ethical, legal and regulatory concepts to the provision and/or support of oral health care services.
- Program level (Dental Hygiene program outcome, summative assessment of learning outcomes): Incorporate into dental hygiene practice professional laws, regulations and policies established by the licensing state and regulatory agencies.
- Course level (Dental Hygiene competency, formative assessment of the program outcome): Respond to a request to perform a task that is not legally permitted to be delegated to a dental hygienist.
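A minimal sketch of this alignment as a linked structure, using the slide’s Dental Hygiene statements (the dictionary layout and function name are illustrative, not the WIDS data model):

```python
# Hypothetical sketch: the three levels of the Dental Hygiene example
# linked from course competency up to the external standard.
alignment = {
    "external_standard": (
        "ADA Dental Hygiene Education Standard: graduates are competent in "
        "applying ethical, legal and regulatory concepts to the provision "
        "and/or support of oral health care services."
    ),
    "program_outcome": (
        "Incorporate into dental hygiene practice professional laws, "
        "regulations and policies established by the licensing state and "
        "regulatory agencies."
    ),
    "course_competencies": [
        "Respond to a request to perform a task that is not legally "
        "permitted to be delegated to a dental hygienist.",
    ],
}

def trace_alignment(a: dict) -> None:
    """Print how each course competency rolls up to the program outcome
    and, in turn, to the external standard."""
    for competency in a["course_competencies"]:
        print(competency)
        print("  supports ->", a["program_outcome"])
        print("  aligns with ->", a["external_standard"])

trace_alignment(alignment)
```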

Challenges
- How do we communicate the WTCS long-standing commitment to industry input and standards?
- How do we continue the WTCS commitment to performance assessment whenever possible (versus testing)?
- How do we ensure that we use quality assessments, and that we don’t regress to testing regurgitation of information?
- How do we ensure that we are doing what our model and assessment plans imply?

Using WIDS Program Design: Design. Document. Align.
1. Create program design files for all programs.
2. Develop exit learning outcomes:
   - Core abilities (mission level)
   - Program outcomes (discipline level)
   - General education outcomes (optional)
3. Determine summative assessments.
4. Define criteria for summative assessments.
5. Link courses to exit learning outcomes.

Using WIDS Learning Design: Design. Document. Align.
6. Create performance assessment tasks (with scoring guides and rubrics) to support summative assessment:
   - Embed them in program courses, using the program file as the master file.
   - Create a separate summative assessment course file for the program.

Using WIDS Analyzer: Design. Document. Align.
7. Generate Analyzer reports:
   - Learning Outcomes Matrix
   - Program Course List Matrix
   - Assessment Task Outcomes Matrix

Learning Outcomes Matrix
- Maps outcomes (rows) to courses (columns)
- Documents where each outcome is addressed
- Identifies summative assessments
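As a rough illustration of what this matrix captures (hypothetical outcome and course names, not actual WIDS Analyzer output), a mark shows each course where an outcome is addressed:

```python
# Hypothetical sketch of a learning outcomes matrix: rows are exit
# learning outcomes, columns are courses, and an X marks each course
# where the outcome is addressed. All names are illustrative only.
courses = ["DH-101", "DH-201", "DH-290 (summative)"]
links = {
    "Apply legal and regulatory concepts": {"DH-101", "DH-290 (summative)"},
    "Provide preventive oral health services": {"DH-201", "DH-290 (summative)"},
}

print(f"{'Outcome':45}" + "".join(f"{c:>22}" for c in courses))
for outcome, linked in links.items():
    marks = "".join(f"{('X' if c in linked else '-'):>22}" for c in courses)
    print(f"{outcome:45}" + marks)
```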

Assessment Task Outcomes Matrix
- Lists performance assessment tasks by course
- Shows each assessment and the outcome it is linked to (its target)

Maximize the potential of WIDS to support assessment planning:
- Create/maintain Program Outcome Summaries for all new and existing programs.
- Develop/maintain Course Outcome Summaries for all courses.
- Design performance assessment tasks for course-level (formative) and beyond-the-course (summative) assessments.
- Use WIDS Analyzer to build learning outcome matrices that document how outcomes are linked with programs and courses.
- Present WIDS documentation as a central feature in accreditation.