Gen Ed Assessment Critical Thinking Outcome Multiple Choice Question (MCQ) Development Project in the Social Sciences. Based on slides from Dec., Laura Blasi, Ph.D., Director, Institutional Assessment

Critical Thinking – The Gen Ed Outcome
When testing Critical Thinking in the General Education program at Valencia, we focus on the three indicators addressing (1) bias, (2) use of evidence, and (3) context. This continues a pattern of faculty work focused on Critical Thinking since 2002.

Accomplishment 1 – Materials and Website – Valencia College Gen Ed Faculty Resources Specific to Critical Thinking
 planning/institutional-assessment/loa/ResourcesCriticalThinking.cfm

Purpose of the Multiple Choice Question (MCQ) Project
The current MCQ project: (1) invests the money in our faculty and a homegrown test-item bank that is emerging from our experience with students in Gen Ed; (2) increases the college's capacity for reliability testing (in our IR office), moving away from reliance on consultants; (3) assures that faculty concerns about external reliability using pilot data are addressed by recognized experts in the field; (4) provides an option after Social Science faculty discovered that nationally normed exams can cost $6 per student or higher.
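The "reliability testing" mentioned in point (2) typically refers to an internal-consistency statistic for dichotomously scored items, such as KR-20 (Kuder–Richardson Formula 20). The sketch below is illustrative only, with a hypothetical function name and made-up pilot data; it is not Valencia's actual procedure:

```python
# Illustrative sketch: KR-20 internal-consistency reliability for a
# pilot MCQ administration. Rows = students, columns = items
# (1 = correct, 0 = incorrect). Data and function name are hypothetical.
def kr20(responses):
    k = len(responses[0])                     # number of items
    n = len(responses)                        # number of students
    totals = [sum(row) for row in responses]  # each student's total score
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n  # population variance
    # Sum of p*q across items: p = proportion answering correctly, q = 1 - p
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in responses) / n
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

# Hypothetical pilot data: 6 students answering 5 items
pilot = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
]
print(round(kr20(pilot), 3))  # prints 0.542
```

Values closer to 1.0 indicate that the items hang together as a measure of a single construct; a low or negative value on pilot data would flag items for the expert review described in point (3).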

Standards for reviewing the questions specific to the outcome

Different forms of questions are possible. Examples can be taken from the standardized tests used across the country, for example:
 Excerpt a study
 Excerpt a dialogue, speech, or a current event
 Provide a premise and test student assumptions about the idea

Applying the Standards
Imagine you are a 2nd-year student… Notice the question begins with a reference to Political Science, but it is broad enough to be accessible to students who have not taken Political Science.

Bias (analyze others' and one's own)

Context (beyond knowing it is important – examine its relevance when presenting ideas)

Bias (beyond recognizing it, analyzing it)

Evidence

Next steps needed
 Questions by Jan. 31
 Pilot: February–March
 Expert analysis: April
 Discussion of results: Assessment Day
 50 questions to develop (we can include review of those we have)
 Work is distributed and faculty-led
 Questions stand up to peer review applying external standards
 Our "Self Test" questions hold up when applied to the items (internal standards)
 The MCQ creation strategies from the Steve Downing workshop – not discipline-specific – are followed (external standards)
 The timeline for the pilot takes into account student administration and the validation study by USF

Dr. Steve Downing: Tips for Developing Multiple Choice Questions Across Disciplines (examples)
 Write a clear "testing point" or objective for the item [context, bias, evidence]
 Pose a clear question; review, edit, rewrite
 Focus on important/essential information
 Ensure the question can be answered without reading the options
 Write clear, concise items; avoid superfluous information
 Include most information in the stem, avoiding lengthy options
 Don't use trick questions
 Test higher-order cognitive knowledge (he refers to Bloom's Taxonomy): application, problem solving, judgment, synthesis

 Questions?