General Education Assessment


General Education Assessment: Critical Thinking Skills Competency

Assessment at AC

Assessment Methods: Planning and Evaluation Tracking (PET); General Education Assessment
Assessment Overview Resources: General Education Methodology; AC Outcomes Assessment Information and Training

At Amarillo College, we assess student learning through the PET forms and through General Education Assessment. We will only discuss General Education Assessment today, which uses existing, embedded assignments that are submitted by instructors for assessment purposes. If you would like to know the entire general education assessment background, from course selection to how the report findings are used, that information is detailed in our General Education Methodology. If you would like to read more about the overall assessment process, you can visit the AC Outcomes Assessment Information and Training page.

2011-2012 Competencies and Rubrics (Based on UEAC Core Objectives)

New Areas:
- Communication Skills
- Critical Thinking Skills
- Empirical and Quantitative Skills
- Teamwork

Future Areas:
- Personal Responsibility
- Social Responsibility

Committee Purpose
- Evaluate student attainment of competencies
- Pinpoint student strengths, weaknesses, and areas for improvement
- Pinpoint points of interest found in artifacts
- Evaluate the evaluation process
- Competency information and rubrics
- General process evaluation

Each competency has a built-in benchmark indicating what academic-course-taking students who have earned 30 or more credit hours at Amarillo College should be able to accomplish. We need to pinpoint student strengths and weaknesses and analyze artifact findings in order to achieve continuous improvement. The competency information and rubric were created primarily by the Instructional Sub-Committee, based on the new THECB Undergraduate Education Advisory Committee (UEAC) rubrics and on other institutional rubrics and information provided by the Office of Assessment and Development. AC has set a methodology in place for how rubrics are collected and assessed. However, in addition to evaluating the students, we appreciate your participation in evaluating the process itself.

Critical Thinking Skills Committee Members (from the 2011 General Education Competency Member List)
- Carol Summers (co-chair; also serves on the Instructional Assessment Committee)
- Michael Kopenits (co-chair)
- Kim Boyd
- Mike Bellah
- John Chaka

Each committee is made up of five members, two of whom serve as co-chairs. The co-chairs can answer any questions related to assessment, and they are the only members with the authority to "throw out" artifacts that lack clear directions and/or cannot be assessed. (Individual committee members cannot opt out of assessing artifacts because even weaker artifacts typically need to be assessed in order for areas of improvement to be identified.) The co-chairs must agree before discarding an artifact set and must decide which artifacts will replace the discarded ones. The co-chairs are also responsible for collecting and compiling the committee results and must be in agreement on the final product before submitting the results and findings to the Assessments Coordinator. At least one member of each committee also serves on the Instructional Assessment Committee; this is important because the Instructional Assessment Committee creates and revises rubrics and acts as a voice to the institution.

Committee Expectations
- Make a game plan for artifact assessment
- Check to ensure the 100 chosen artifacts can be assessed
- Develop internal deadlines for when the artifacts will be assessed and/or set up meeting times to assess the artifacts
- Complete the entire artifact evaluation by May 1st

Will you assess artifacts individually or as a team? When teams assess artifacts, there is typically only one resulting group score for each artifact. When individuals assess artifacts, each team member rates each artifact, and a co-chair or the co-chairs compile and average the results. Will you use the Excel spreadsheet you were provided? Verify that the artifacts on the J drive can be assessed (read the assignment and glance at the artifacts to make sure they can be assessed using the definitions and rubric tool you are provided).
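The compile-and-average step described above, where co-chairs combine individual raters' scores into one score per artifact, can be sketched in a few lines of code. This is a hypothetical illustration only: the function name, the rater and artifact identifiers, and the sample scores are invented for the example, and the committee's actual Excel-based workflow may differ.

```python
def average_artifact_scores(scores_by_rater):
    """Average each artifact's rubric scores across raters.

    scores_by_rater maps a rater name to a dict of
    {artifact_id: score}. Returns {artifact_id: mean score
    across the raters who scored that artifact}.
    """
    totals = {}   # artifact_id -> running sum of scores
    counts = {}   # artifact_id -> number of raters who scored it
    for ratings in scores_by_rater.values():
        for artifact_id, score in ratings.items():
            totals[artifact_id] = totals.get(artifact_id, 0) + score
            counts[artifact_id] = counts.get(artifact_id, 0) + 1
    return {aid: totals[aid] / counts[aid] for aid in totals}

# Made-up example: two raters scoring two artifacts on a 1-4 rubric scale.
scores = {
    "rater_a": {"artifact_01": 3, "artifact_02": 2},
    "rater_b": {"artifact_01": 4, "artifact_02": 2},
}
print(average_artifact_scores(scores))
# -> {'artifact_01': 3.5, 'artifact_02': 2.0}
```

The same averaging is easy to reproduce with a spreadsheet AVERAGE formula; the point of the sketch is simply that each artifact's final score is the mean of the scores given by the raters who assessed it.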

Competency Overview
- 2-in-1 rubric: involves critical and/or creative thinking skills
- Broad statements
- Diverse group of assignments submitted for assessment

We collect artifacts one year in advance, so the work submitted last year came from instructors who intended it to fulfill the old critical thinking competency. When we changed competencies, we went through the submissions and attempted to find artifacts that would fit the new competency statement. To have one rubric that works for both critical and innovative problem-solving methods, the statements were written broadly and "OR" statements were used on the rubric.

Concept Definitions Inquiry A close examination or interpretation of a matter. Critical inquiry may involve the analytical interpretation of evidence and arguments. Interpretive inquiry may include an investigation into alternative points of view. Brainstorming methods or novel and untested solutions to a problem can be a part of the inquiry process.

Concept Definitions Analysis A critical examination of explanations and problem-solving methods. Analysis involves the ability to dissect, fully understand, and explain individual ideas. Analysis can also be used innovatively by pinpointing problem-solving methods found through the examination of a problem, task, etc.

Concept Definitions Synthesis Interlacing individual argument components so that a meaningful, coherent whole can be formed. Synthesis can use logical deductions to form scientific/mathematical arguments. Synthesis can also be used to effectively present a new or existing concept.

Concept Definitions Product The result produced by using evidence to form a coherent conclusion or the result produced by taking an innovative approach to a given task. The product is the end result and as such should either supply a coherent conclusion, solution, and/or product based on evidence or should use innovation to form a new and well-structured conclusion, solution, and/or product.

Points of Consideration: Evaluate the Students, Not the Instructors. Please do not grade students based on the perceived level of difficulty of the assignment (e.g., if an assignment is only required to be one page long, the examination of a matter may not be as thorough as in a 17-page essay, but the argument could be just as strong). When you make your comments, you are welcome to give insight on ways that student skills could be better assessed by the faculty (e.g., "we need to require a higher level of content analysis in our literature-based classrooms"), but if you recognize work as being from a particular class, please do not make comments like "Kristin 1301 needs to get her act together and ask higher-level questions."

Submission Checklist
- All artifacts scored
- Student strengths identified
- Student weaknesses identified
- Process improvement suggestions provided
- Ideas for improvement/interesting findings identified

Questions

Co-chairs:
- Carol Summers (cbsummers@actx.edu; 371-5416)
- Michael Kopenits (mskopenits@actx.edu; 371-5491)

Assessments Coordinator:
- Kristin McDonald-Willey (kmw@actx.edu; 371-5420)

If you have questions, you can contact one of your co-chairs. If you are unable to reach your co-chairs, or if your co-chairs have questions, please contact me.