Planning For, Interpreting & Using Assessment Data Gary Williams, Ed.D. Instructional Assessment Specialist, Crafton Hills College


Planning For, Interpreting & Using Assessment Data Gary Williams, Ed.D. Instructional Assessment Specialist, Crafton Hills College Fred Trapp, Ph.D. Administrative Dean, Institutional Research/Academic Services, Long Beach City College October, 2007

Goals of the Presentation –“De-mystify” the assessment process –Provide practical approaches & examples for assessing student learning –Answer questions posed by attendees that pertain to their assessment challenges

What’s It All About? An ongoing process aimed at understanding and improving student learning. Faculty making learning expectations explicit and public. Faculty setting appropriate standards for learning quality.

What’s It All About? Systematically gathering, analyzing and interpreting evidence to determine how well student performance matches agreed upon faculty expectations & standards. Using results to document, explain and improve teaching & learning performance. Tom Angelo AAHE Bulletin, November 1995

Roles of Assessment “We assess to assist, assess to advance, assess to adjust”: –Assist: provide formative feedback to guide student performance –Advance: summative assessment of student readiness for what’s next –Adjust: continuous improvement of curriculum, pedagogy. - Ruth Stiehl, The Assessment Primer: Creating a Flow of Learning Evidence (2007)

Formulating Questions for Assessment Curriculum designed backwards; Students’ journey forward: –What do students need to DO “out there” that we’re responsible for “in here?” (Stiehl) Subsequent roles in life (work or future study, etc.) –How do students demonstrate the intended learning now? –What kinds of evidence must we collect and how do we collect it?

Assessment Questions & Strategies - Factors to Consider: Meeting Standards –Does the program meet or exceed certain standards? –Criterion-referenced, commonly against state or national standards Comparing to Others –How does the student or program compare to others? –Norm-referenced, against other students, programs, or institutions

Assessment Questions & Strategies - Factors to Consider: Measuring Goal Attainment –Does the student or program do a good job at what it sets out to accomplish? –Internal reference: goals and educational objectives compared to actual performance. –Formative and student-centered. –Professional judgment about evidence is common.

Assessment Questions & Strategies - Factors to Consider: Developing Talent and Improving Programs –Has the student or program improved? –How can the student’s program and learning experience be improved even further? –Formative and developmental. –Variety of assessment tools and sources of evidence.

Choosing Assessment Tools Depends upon the “unit of analysis” –Course –Program –Degree/general education –Co-curricular Also depends upon overall learning expectations

Formulating Assessment Strategies:

Direct vs. Indirect Evidence Direct –What the student can actually do or demonstrate they know –Can be witnessed with our own eyes –Setting is structured/contained Indirect –What students say they can do –Focus on the learning process or environment –Things from which learning is inferred –Setting is not easily contained/structured

Qualitative vs. Quantitative Qualitative –Words –Categorization of performance into groups –Broad emergent themes –Holistic judgments Quantitative –Numbers –Individual components and scores –Easier calculations and comparisons plus presentation to a public audience
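The two columns meet in practice: holistic qualitative judgments can still be tallied for the "easier calculations and comparisons" the quantitative column promises. A minimal Python sketch (the rubric categories and ratings below are hypothetical, not from the presentation):

```python
from collections import Counter

# Hypothetical holistic rubric ratings for one assignment.
ratings = ["exceeds", "meets", "meets", "developing", "meets",
           "exceeds", "developing", "meets"]

counts = Counter(ratings)  # categorize performance into groups
total = len(ratings)
for category in ("exceeds", "meets", "developing"):
    pct = 100.0 * counts[category] / total
    print(f"{category}: {counts[category]} ({pct:.0f}%)")
# → exceeds: 2 (25%)
#   meets: 4 (50%)
#   developing: 2 (25%)
```

The category labels stay qualitative; only the summary for a public audience becomes numeric.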

Formative vs. Summative Formative –Assessment for learning –“In-progress” –Provides corrective feedback –Establishes foundational learning for the next step Summative –Assessment for evaluative purposes –“After the fact” –Determines progress/achievement/proficiency –Readiness for the next step/role/learning experience

Means of Assessment (Quantitative Judgments) Cognitive –Standardized exams –Locally developed exams Attitudes/beliefs –Opinion surveys of students, graduates, employers

Means of Assessment (Qualitative Judgments) Cognitive –Embedded classroom assignments Behavior/performances (skills applications) –Portfolios –Public performances –Juried competitions –Internships –Simulations –Practical demonstrations Attitudes/beliefs –Focus groups

Interpreting Results - How Good Is Good Enough? Norm Referencing –Comparing student achievement against other students doing the same task Criterion Referencing –Criteria and standards of judgment developed within the institution
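A toy sketch of the two reference frames (the cohort scores and the 75-point cutoff are hypothetical, not from the presentation):

```python
def percentile_rank(score, cohort):
    """Norm referencing: standing relative to other students on the same task."""
    below = sum(1 for s in cohort if s < score)
    return 100.0 * below / len(cohort)

def meets_criterion(score, cutoff):
    """Criterion referencing: judged against a locally set standard."""
    return score >= cutoff

cohort = [55, 62, 70, 70, 74, 78, 81, 85, 90, 93]  # hypothetical exam scores

print(percentile_rank(78, cohort))     # → 50.0 (half the cohort scored lower)
print(meets_criterion(78, cutoff=75))  # → True (meets the local standard)
```

The same score of 78 looks middling in norm terms yet fully acceptable in criterion terms, which is why "how good is good enough?" has to be decided before the data arrive.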

Are Results Valid and Reliable? –Validity –Reliability –Authentic assessment –Important questions or merely easy questions? –Do the results inform teaching and learning?
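Reliability can be given a number. As one sketch only (the slides do not prescribe a method, and the item scores below are hypothetical), here is a classical split-half estimate with the Spearman-Brown correction:

```python
def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(items):
    """Correlate odd-item and even-item half scores, then step the
    correlation up to full test length with Spearman-Brown."""
    odd = [sum(row[0::2]) for row in items]
    even = [sum(row[1::2]) for row in items]
    r = pearson(odd, even)
    return 2 * r / (1 + r)

# Rows = students, columns = item scores (hypothetical data).
scores = [
    [1, 1, 0, 1, 1, 0],
    [1, 0, 1, 1, 0, 1],
    [0, 1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0, 0],
]
print(round(split_half_reliability(scores), 2))  # → 0.97
```

A high coefficient only says the instrument is consistent; whether it asks important questions rather than easy ones is the validity judgment, which no formula supplies.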

How Does Assessment Data Inform Decision-Making? Goal: Making sound curricular and pedagogical decisions, based on evidence Assessment questions are tied to instructional goals. Assessment methods yield data that is valid & reliable. A variety of measures are considered. Assessment is an ongoing cycle.

Assessment Process

Collaboration Among Faculty, Administration & Researchers Assessment, the auto, and a road trip: an analogy –Who should drive the car? –Who provides the car, gas, insurance and maintenance? –Who brings the maps, directions, repair manual, tool kit, first aid kit, and stimulates the conversation along the journey?

Why Faculty Are the Drivers –Faculty have the primary responsibility for facilitating learning (delivery of instruction) –Faculty are already heavily involved in assessment (classroom, matriculation) –Faculty are the content experts –Who knows better what students should learn than faculty?

Who Provides the Car and Keeps Gas in It? Administrators!

Role of Administrators –Establish that an assessment program is important at the institution –Ensure the college’s mission and goals reflect a focus on student learning –Institutionalize the practice of data-driven decision making (curriculum change, pedagogy, planning, budget, program review) –Create a neutral, safe environment for dialogue

Where Does IR Sit in the Car?

Roles of Researchers –Serve as a resource on assessment methods –Assist in the selection/design and validation of assessment instruments –Provide expertise on data collection, analysis, interpretation, reporting, and use of results –Facilitate dialogue: train and explain –Help faculty improve their assessment efforts

Faculty DON’Ts… –Avoid the SLO process or rely on others to do it for you. –Rely on outdated evaluation/grading models to tell you how your students are learning. –Use only one measure to assess learning. –Criticize or inhibit the assessment efforts of others.

Faculty DOs... –Participate in the SLO assessment cycle. –Make your learning expectations explicit. –Use assessment opportunities to teach as well as to evaluate. –Dialogue with colleagues about assessment methods and data. –Focus on assessment as a continuous improvement cycle.

Questions From the Field…