Assessing General Education Workshop for College of the Redwoods Fred Trapp August 18, 2008

Assessment Key Concepts
- Program Outcomes / Topic Areas
- Student Learning Outcomes
- Means of Assessment / Evidence
- Criteria for Assessment
- Standards of Success / Quality of Performance: Exemplary, Satisfactory, Unsatisfactory

GE Program Outcomes/Topics
- Broad descriptions
- Categories of learning outcomes
- End toward which efforts are directed

Examples of GE Program Outcomes
- Critical Thinking
- Effective Communication
- Global Awareness
- Personal Responsibility
(Academic Senate, College of the Redwoods)

Student Learning Outcomes
- Results in terms of specific student learning, development, and performance (Braskamp and Braskamp, 1997)
- Answer the question: "What do we expect of our students?" (CSU Report, 1989)
- Describe actual skills, understandings, behaviors, attitudes, and values expected of students

Examples of GE Student Learning Outcomes
- Math: Use arithmetical, algebraic, geometric, and statistical methods to solve problems.
- Personal Responsibility: Demonstrate facility in making value judgments and ethical decisions.
- Global/cultural context: Analyze issues from multiple perspectives.
- Teamwork: Listen to, acknowledge, and build on the ideas of others.

Make Student Learning Outcomes
- Public and visible: the syllabus
- Relevant and meaningful: "If my class were the only one a student took in the discipline …"
- Motivating and supportive of learning

Course Rating Scheme
For each GE learning outcome, rate your course:
0 - Course does not include instruction and assessment of this outcome.
1 - Course includes instruction or practice of the outcome, and performance/knowledge of this outcome is assessed.
2 - Course includes instruction or practice in the outcome, performance/knowledge is assessed, and 20% or more of the course focuses on it.
3 - Course includes instruction or practice in the outcome, performance/knowledge is assessed, and 1/3 or more of the course focuses on it.
See the example matrix; a rough sketch of such a matrix follows below.
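To make the rating scheme concrete, here is a minimal sketch, in Python, of how a course-by-outcome matrix could be recorded and summarized. It is illustrative only: the course names, outcome list, and ratings are invented and are not from the workshop or from College of the Redwoods.

    # Hypothetical course-by-outcome rating matrix using the 0-3 scale above.
    # Rows are courses, columns are GE learning outcomes, cells are ratings.
    matrix = {
        "ENGL 1A": {"Critical Thinking": 2, "Effective Communication": 3, "Global Awareness": 1},
        "MATH 30": {"Critical Thinking": 3, "Effective Communication": 1, "Global Awareness": 0},
        "HIST 8":  {"Critical Thinking": 2, "Effective Communication": 2, "Global Awareness": 3},
    }

    # Summarize how many courses give each outcome substantial attention (rating of 2 or 3).
    outcomes = sorted({o for ratings in matrix.values() for o in ratings})
    for outcome in outcomes:
        substantial = sum(1 for ratings in matrix.values() if ratings.get(outcome, 0) >= 2)
        print(f"{outcome}: {substantial} of {len(matrix)} courses rated 2 or higher")

A real course alignment grid is simply this matrix laid out as a table, with one row per course and one column per GE outcome.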

Means of Assessment/Evidence
- Student work that demonstrates achievement of outcomes (assignments, projects, presentations, papers, responses to questions, etc.)
- Designed for the appropriate level of learning expectations (outcomes)
- Opportunity for different ways of demonstrating learning

Examples of Means of Assessment/Evidence
- Critical Thinking
  - Role play or case study
  - Project or problem-solving assignment
- Math
  - Mathematical and statistical projects and papers
- Personal Responsibility
  - A written account
  - A multimedia presentation or display board
  - An audio tape

Criteria for Assessment
- Basis on which student work is evaluated
- Support faculty in making objective evaluations
- Represent powerful professional judgments of faculty
- Guide student learning efforts (if you share the criteria in advance)

Examples of Criteria for Assessment
- Math
  - Accuracy
  - Complexity
  - Clarity and Coherence
- Personal responsibility
  - Complexity (broad, multifaceted, interconnected)
  - Conscious Awareness
- Global/cultural context
  - Range of Cultures
  - Reflectivity and Integration
- Teamwork
  - Respect
  - Flexibility

Standards of Success/Quality of Performance
- Describe different levels of performance on the criteria
- Support faculty in making objective evaluations
- Describe specific indicators of the criteria
- Promote understanding of the criteria

Examples of Standards of Success/Quality of Performance
Math (Accuracy)
- Satisfactory: Contains few errors, and those errors do not significantly undermine the quality of the work. Considers and uses data, models, tools, or processes that reasonably and effectively address issues or problems.
- Unsatisfactory: One or more errors significantly undermine the quality of the work. Uses data, models, tools, or processes in inappropriate or ineffective ways.
Global/cultural context (Complexity)
- Excellent: Consistently views sophisticated and significant dilemmas and issues with a broad focus and from multiple perspectives.
- Satisfactory: Usually views sophisticated and significant dilemmas and issues with a broad focus, but may sometimes use a narrower focus and fewer perspectives.
- Unsatisfactory: Mainly views issues and dilemmas in simple terms, usually with a limited focus and minimal perspectives.

Assessment Key Concepts: Example
GE program outcome/topic area
- Personal responsibility
Learning outcomes
- Students demonstrate facility in making value judgments and ethical decisions
- Students describe and assume personal responsibility in collaborative endeavors, and respect and support the responsibilities of others

Personal Responsibility
Means of assessment/evidence
- Written code with discussion of two different life decisions based on the code
- Multimedia presentation
- Letter of application for a professional position
- Dramatization of ethical issues
Criteria for assessment
- Reflection
- Multiple perspectives

Personal Responsibility
Standards of success/quality of performance
- Excellence in Reflection: Consistently raises questions, checks assumptions, connects with previous experiences, acknowledges biases and values, and engages in self-assessment
- Excellence in Multiple Perspectives: Examines the thinking and experiences of others, considers those affected by decisions, and considers diverse courses of action
(A minimal scoring sketch for this example follows below.)
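Read together, the criteria and standards above form a simple rubric: each criterion has level descriptors, and a reader assigns one level per criterion for each piece of student work. The Python sketch below is purely illustrative; only the "excellence" descriptors are abbreviated from the slides, and the level names and the scored work sample are hypothetical.

    # Minimal sketch: record one reader's rubric judgments for the
    # Personal Responsibility example. The descriptors are abbreviated from
    # the slides; the scored work sample is hypothetical.
    LEVELS = ("Exemplary", "Satisfactory", "Unsatisfactory")

    criteria = {
        "Reflection": "raises questions, checks assumptions, connects with previous "
                      "experiences, acknowledges biases and values, self-assesses",
        "Multiple Perspectives": "examines others' thinking, considers those affected "
                                 "by decisions, considers diverse courses of action",
    }

    # One reader's judgment of a single piece of student work (hypothetical).
    scores = {"Reflection": "Exemplary", "Multiple Perspectives": "Satisfactory"}

    for criterion, descriptor in criteria.items():
        level = scores[criterion]
        assert level in LEVELS, f"unexpected level: {level}"
        print(f"{criterion}: {level} (exemplary work {descriptor})")

Recording judgments in a structured form like this makes it easier to aggregate results across readers and sections later in the process.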

Assessment Process

Aligning Curriculum and Pedagogy with Learning Outcomes
- Outcomes and criteria as planning focus
- Course alignment grids
- Teachers talking teaching
See the sample course alignment grid.

Collect Evidence of Student Performances
- Individual faculty collect representative samples of student work (3 Exemplary, 3 Satisfactory, 3 Unsatisfactory)
(CSU Monterey Bay approach)

Review and Analyze Evidence
- Faculty as a group (faculty learning community)
- Read holistically to determine whether outcomes are achieved (reliability)
- Several readings to identify examples of criteria (validity)
- Final reading for insights about pedagogy, class structure and environment, and learning supports
(CSU Monterey Bay approach)

Collect Evidence of Student Performances
- Research office/assessment committee selects classes for assessment work
- Individual faculty forward student work to the research office
- Research office makes copies and returns originals for grading
- Next term, the research office draws a sample of the work (see the sampling sketch below)
(Johnson County CC approach)
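As a rough illustration of the sampling step, here is a minimal Python sketch of how a research office might draw a fixed-size random sample from the work collected in each selected class. This is an assumption for illustration, not Johnson County CC's actual procedure; the courses, submission IDs, and sample size are invented.

    import random

    # Hypothetical inventory of collected student work, keyed by course.
    collected_work = {
        "ENGL 1A": ["e01", "e02", "e03", "e04", "e05"],
        "MATH 30": ["m01", "m02", "m03", "m04"],
        "HIST 8":  ["h01", "h02", "h03", "h04", "h05", "h06"],
    }

    random.seed(2008)      # fixed seed so the draw can be reproduced and audited
    SAMPLE_PER_COURSE = 2  # invented sample size

    sample = {
        course: random.sample(ids, min(SAMPLE_PER_COURSE, len(ids)))
        for course, ids in collected_work.items()
    }

    for course, ids in sample.items():
        print(f"{course}: drew {', '.join(ids)}")

A stratified draw like this (a few pieces per class) keeps the reading load manageable while still representing every selected section.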

Review and Analyze Evidence
- A cross-disciplinary team of faculty applies holistic rubrics in reviewing samples of work
- Assemble and discuss findings, reflecting on their implications for future action
(Johnson County CC approach)

Collect Evidence of Student Performances
- Individual faculty decide on an assignment fitting the GE theme
- Faculty (individually and collaboratively) build a grading rubric
- Faculty teach their course and grade the work, but keep a course portfolio of their experiences
(Raymond Walters CC approach)

Review and Analyze Evidence
- Faculty meet in departments at the end of the year to exchange experiences and notes from their course portfolios
- Department writes a summary report of findings and recommendations for future action
(Raymond Walters CC approach)

Process Results: Improving Learning
- Documentation of student achievement of outcomes (someone has to keep the data)
- Faculty as a group
- Identification of curricular gaps/foci and pedagogical weaknesses/strengths
- Clarification of outcomes, criteria, and standards
- Redesign of means of assessment/evidence