 “…the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development.”  Palomba & Banta, 1999, p. 4

 Start with the culture – faculty must be comfortable with the culture of assessment before they feel comfortable with assessment itself!
 Determine the role that assessment will play in college processes – and what it WON’T be used for!
   Improving Student Learning
   Program Evaluation
   Budgeting
   NOT Faculty Evaluation

 Value Campus Culture & History (assessment is not one-size-fits-all)
 Respect and Empower People – especially Faculty!
 Value Assessment by providing appropriate resources and infrastructure
 Value innovation & risk-taking to improve teaching (even if it fails)

“I have not failed. I've just found 10,000 ways that won't work.” ― Thomas A. Edison

Assessment: “I think we are about to turn the corner on this whole ‘Assessment’ thing…”

 Is linked to decision-making about the curriculum – Palomba & Banta
 Measures real-life gaps in desired skills & performance – Swing et al.
 Leads to reflection and action by faculty – Palomba & Banta

 Institutions and faculty are good at collecting data – just not as good at using data to drive curricular changes
 This part of the assessment cycle is often called “closing the loop”
 What loop – and who left it open?

 Start with asking the right questions
 Questions to help faculty focus their assessment activities:
   What should students learn?
   How well are they learning it?
   What evidence do you have?
   What are you doing with the evidence?
 Assessment Cycle: Question → Plan → Collect & Score → Analyze/Reflect → Report/Act

 The right question is:
   Meaningful – a question faculty want to know the answer to, where knowing the answer will help them impact student learning
   Measurable – work at asking a question that faculty can answer; usually that means narrowing down the question
   Manageable – keep the question and the data-collection process manageable; this isn’t the only or primary job faculty have

 Once faculty have determined the actual question to be answered, it will drive the assessment methodology:
   Pre/post test
   Embedded test questions
   Project-based assessment
   Portfolios
   Surveys
   Performance, etc.
 How often will they collect the data, and in what classes? What makes sense? (Remember: keep it manageable)
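For the pre/post-test option above, the core analysis is just the per-student gain. A minimal sketch (not from the presentation; all scores are hypothetical):

```python
# Illustrative sketch: comparing pre- and post-test scores for one
# course section. Scores are invented for the example.

def average_gain(pre_scores, post_scores):
    """Mean per-student gain from pre-test to post-test."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

pre = [55, 62, 48, 70, 66]
post = [68, 75, 60, 72, 80]

print(f"Average gain: {average_gain(pre, post):.1f} points")
```

Keeping the analysis this simple is consistent with the “manageable” criterion: a spreadsheet column of gains answers the question without a statistics package.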

 First, encourage faculty to take time to organize the data
 Excel is an easy, accessible tool available to 99% of all faculty
 Offer data-management workshops: what goes in a row or column, and what analyses you can run in Excel
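The “rows and columns” idea from such a workshop can be sketched in a few lines: one row per student, one column per outcome score, then a per-column average. The student and outcome names below are hypothetical:

```python
# Minimal sketch of spreadsheet-style assessment data:
# one row per student, one column per outcome score.

records = [
    {"student": "A", "outcome1": 3, "outcome2": 4},
    {"student": "B", "outcome1": 2, "outcome2": 3},
    {"student": "C", "outcome1": 4, "outcome2": 4},
]

def column_average(rows, column):
    """Average of one outcome column across all student rows."""
    values = [row[column] for row in rows]
    return sum(values) / len(values)

for col in ("outcome1", "outcome2"):
    print(col, round(column_average(records, col), 2))
```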

 Qualitative data is an appropriate tool for many disciplines
 Map out the requirements of the assignment
 Search for themes in student responses
 Track how often key curricular themes appear
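Tracking how often key curricular themes appear can start as a simple keyword tally before any deeper coding. A hedged sketch, with invented themes and responses:

```python
# Illustrative sketch: tallying how often key curricular themes
# appear in free-text student responses. All text is invented.

responses = [
    "The design uses contrast and alignment to guide the eye.",
    "I chose this layout for its alignment and balance.",
    "Color contrast makes the message readable.",
]

themes = ["contrast", "alignment", "balance"]

def theme_counts(texts, keywords):
    """Count how many responses mention each keyword (case-insensitive)."""
    counts = {k: 0 for k in keywords}
    for text in texts:
        lowered = text.lower()
        for k in keywords:
            if k in lowered:
                counts[k] += 1
    return counts

print(theme_counts(responses, themes))
```

A raw keyword count is only a first pass; human reading still decides whether a mention reflects real understanding.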

 If you are an IR person – or big on stats – close your ears for a minute…
 Rather than worrying so much about reliability, validity, and statistical significance, think about these assessment projects as Action Research
 Action Research focuses on getting information that enables faculty to change conditions in a particular situation in which they are personally involved
   Seeks to solve a problem, or
   Inform local practice – specifically, classroom/course practice!

 Identify the research question
 Gather the necessary information
 Analyze and interpret the information
 Develop an action plan
 Sound familiar?? – Think Cycle of Assessment

 Move beyond “averages”
 Look at the spread of the data
   What does the spread indicate?
   Is the data evenly distributed?
   Are there large gaps? Where do they exist?
 Has the faculty member or department decided what is an acceptable level of performance?
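The questions above can be answered with a handful of descriptive statistics. A sketch, assuming invented scores and a hypothetical benchmark of 70:

```python
# Illustrative sketch of looking past the average: spread, gaps in the
# distribution, and the share of students meeting a benchmark.
import statistics

scores = [52, 55, 58, 60, 61, 88, 90, 92]  # invented class scores
benchmark = 70  # hypothetical acceptable level of performance

mean = statistics.mean(scores)
stdev = statistics.stdev(scores)

# The largest gap between adjacent sorted scores can reveal a split
# class (here, a cluster in the 50s-60s and another in the 80s-90s).
sorted_scores = sorted(scores)
largest_gap = max(b - a for a, b in zip(sorted_scores, sorted_scores[1:]))

share_meeting = sum(s >= benchmark for s in scores) / len(scores)

print(f"mean={mean:.1f} stdev={stdev:.1f} "
      f"largest_gap={largest_gap} meeting_benchmark={share_meeting:.0%}")
```

Note how the mean alone (about 70) would hide the fact that most of this invented class falls well below the benchmark.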

Curricular Change
 Follow the data trail – and then talk to invested participants
 Data leads to students not understanding a concept:
   What is the benchmark of performance?
   Curriculum mapping – where does the concept occur?
   How is the concept taught? (pedagogy)
   Where is the concept reinforced? (scaffolding)
   What changes can we (faculty/department) make to the curriculum to help students understand and apply the concept?
   How will we measure this curricular change to see if it is successful?

 Case Study – Visual Communications
 General Education Curriculum Assessment Cycle: Visual Communications
   Mass exodus of visual communications classes – faculty couldn’t come up with acceptable assignments to show how they were assessing visual communications
   Focused discussion with department chairs/faculty groups on the issue
   Faculty felt unqualified to “teach” visual communications
   Response: professional development, in-service workshops, teaching circles, etc.

 If the focus throughout the process has been on student learning (versus report writing), faculty will be more open to making curricular changes
 Next assessment cycle – what difference did the change make? Was there a difference in performance?
   Make sure appropriate time has elapsed for the changes to take effect
   Make sure the measurement is parallel to the previous assessment

 “When are we done with a learning outcome?”
   Did you see improvement?
   Did you meet your benchmark performance?
   Are you satisfied?
   Do you see a greater need/question that needs to be asked?

 Reports and data analysis that don’t focus on student learning are a waste of paper
 Faculty must be engaged in making sense of and interpreting assessment results – administration can’t do it for them
 Share successes
   With permission – share assessment results from other departments/disciplines
   Get faculty to tell their success stories – it carries more weight with their peers

 Effective assessment takes time to plan, implement, and sustain
 Don’t expect instant results – reliable data takes time to gather
 Make sure the assessment is asking a question that can be answered
   This means being narrow in scope – rather than throwing as much as possible at the wall and seeing what sticks!

 While it is important for faculty to “own” assessment, support needs to come from administration:
   Recognizing faculty efforts
   Attending faculty functions
   Providing appropriate resources
   Considering policy implications

Questions??
Sheri H. Barrett, EdD
Director of Outcomes Assessment
Johnson County Community College