Assessment and Course Redesign in Community College Geosciences


Assessment and Course Redesign in Community College Geosciences
Course Design, Improving Diversity, and Transfer Opportunities in Geoscience
Wake Technical Community College, Geology Department
November 18, 2017
Dr. Kenneth L. Howard, LG

Course Design and Evaluation
Design: select a textbook, write the syllabus, and develop presentation methods.
Assessment of student outcomes: measure outcomes by numerous means and assign a grade.
Course evaluation: evaluate student success, course content, methodology, and our own success based on the results.
Redesign: use the data gathered to modify and improve our teaching and assessment methods.

Assessment of Student Outcomes
Homework assignments
Classroom and laboratory exercises
Term papers
Testing: quizzes, unit tests, and the final exam

How Do We Use Test Results?
The primary purpose of testing is to assign a grade: students are evaluated against a grading scale with a 70% correct-response level selected.
But we must also use our testing to assess our own course content, our testing procedures, and our success as instructors.

How Do We Ask a Question?
Does it matter how we ask? We commonly ask a variety of question types reflecting different levels of student understanding and difficulty of content: multiple-choice, short-answer, matching, and essay questions.
Generally, we evaluate questions for content and level of difficulty using Bloom's Taxonomy levels as our basic guide: Recall, Understand, Apply, Evaluate.

Can You Predict Student Performance on These Comparable Questions?
On the accompanying figure, which letter is likely over oceanic crust that is similar in age to that beneath letter E? ______
a. A and B  b. B  c. C  d. D
On the accompanying figure, which letter is over the oldest oceanic crust? ______
a. A  e. E
(Bloom Level 2.7, Inferring)

Did you get the right question? Student success rates: 85% and 52%.

Short Answer Versus Multiple Choice
Clay minerals formed from the destruction of feldspars illustrate which weathering process? ___________ (Bloom Level 1.2, Recalling)
Conversion of feldspar to clay illustrates which weathering process?
a. oxidation  b. mechanical  c. hydrolysis  d. syntropical
(Bloom Level 1.1, Recognition)
Which question has the higher student success rate?

Did you pick the easier one?
Short answer result: 43%
Multiple choice result: 57%

What One Little Word Can Do
Which of the following features is younger than Fault 1? _______
a. the lava flow  b. the granite  c. tilted layers  d. lava flow and granite  e. tilted layers and granite
Student success rate: 11%
What do you think the modification was? What do you think the change in student success was?

The word is ONLY.
Which of the following features is younger than Fault 1? _______
a. only the lava flow  b. only the granite  c. tilted layers  d. lava flow and granite  e. tilted layers and granite
Student success rate after modification: 45%

How Do We Use Test Results to Assess Our Course Content?
We assess a number of issues with tests: course content, course learning objectives, student understanding, our own performance, and student performance.
What is the difficulty of the question?
Is the question asked in such a way that the student should understand the information requested?
Do we collect and track sufficient data to make sure we can measure success and change?

Testing and Assessment
Common objectives and exam questions at the course level: since 2010, the WTCC Geology Group has tested students on 12 course objectives with multiple-choice questions on final exams. Data are compiled and reviewed at the end of each semester to establish long-term trends.
Common questions aimed at specific content, knowledge, and comprehension levels: more narrowly targeted learning outcomes are evaluated with questions that are content-specific and Bloom-level-specific. Each assessment targets general knowledge, critical thinking, and scientific literacy for each content area.

Common Exam Question Results
Since Fall 2010, test scores on nine objectives have fallen (by 1–15 points), one objective has not changed, and two objectives have increased (by 1–3 points).
Students currently achieve 70% success on only 5 of the 12 objectives.

Common Questions Aimed at Specific Content or Understanding
We target a specific area of course content with (1) low-level questions to measure general knowledge and (2) questions that specifically assess critical-thinking skills and scientific literacy.
These are presented to students as ten-question quizzes in an online format.
Results are analyzed at the end of each semester, and one particular area of low performance is selected for follow-on intervention.
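
The end-of-semester selection step above can be sketched in a few lines of Python. This is a minimal illustration, not the department's actual analysis code; the content-area names and all proficiency values except the 35.6% sediment-sorting result are hypothetical placeholders.

```python
# Sketch of end-of-semester quiz analysis: pick the lowest-performing
# content area as the follow-on intervention target.
# All names/values are illustrative except sediment sorting (0.356),
# which is the reported SLO#2/PLO#6 result.
quiz_proficiency = {
    "mineral identification": 0.71,  # hypothetical
    "plate boundaries": 0.64,        # hypothetical
    "sediment sorting": 0.356,       # reported result
    "relative age dating": 0.58,     # hypothetical
}

def select_intervention_target(proficiency_by_area):
    """Return the content area with the lowest student proficiency."""
    return min(proficiency_by_area, key=proficiency_by_area.get)

print(f"Follow-on intervention target: {select_intervention_target(quiz_proficiency)}")
```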

Scientific Literacy Intervention
Initial finding (from SLO #2 / PLO #6 quiz results): students do not understand sorting of sediments and answer a scientific-literacy question at a 35.6% level of proficiency.
First-level intervention: modify the question; we learned that the problem is not with the question (student success 36.8%).
Second-level intervention (ongoing): increase teacher effort to explain the concept in either lecture or laboratory format. First-semester results are encouraging (student success 41.6%) but not decisive.

Assessment of Our Teaching Prowess
Tracking pre- and post-course student knowledge is essential to understanding the effectiveness of our methodology.
Scientific literacy: a test of general science knowledge (general science, physics, chemistry, biology, and earth sciences) given on the first and last days of the course.
Repeat questions: questions from semester tests are repeated verbatim on the final exam, and we examine the difference in success.
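
A short sketch of how first-day/last-day literacy scores might be summarized. The raw difference is what the slides report; the normalized gain (a standard pre/post metric, not mentioned in the talk) is shown as a common complement, since it accounts for how much room for improvement a class had. The pre/post values here are hypothetical, not WTCC data.

```python
# Pre/post scientific-literacy comparison (scores are percent correct).
def simple_gain(pre, post):
    """Raw change in percent correct between first-day and last-day tests."""
    return post - pre

def normalized_gain(pre, post):
    """Fraction of the possible improvement actually achieved."""
    return (post - pre) / (100.0 - pre)

pre, post = 55.0, 64.0  # hypothetical class averages
print(f"simple gain: {simple_gain(pre, post):.1f} points")
print(f"normalized gain: {normalized_gain(pre, post):.2f}")
```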

Changes in Scientific Literacy

Repeat Questions on Final Exam

Topic                   Semester   Final   Difference
Regression                  43.1    59.1        +16.0
Crystallization             47.5    62.2        +14.7
Hydrolysis                  57.8    61.8         +4.0
Perched H2O Table           26.8    26.7         -0.1
Continental Rift            68.4    85.0        +16.6
Metamorphism                52.4    58.2         +5.8
Age Dating                  78.8    71.7         -7.1
Rock Forming Minerals       87.1    91.0         +3.9
Angle of Repose             50.2    67.3        +13.1
Earthquake Distance         68.8    92.4        +23.6
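
Bookkeeping like the table above is easy to automate. A minimal sketch, using the semester/final success rates from the table, that recomputes each topic's gain and flags topics where final-exam performance actually fell:

```python
# Repeat-question tabulation: success rates (percent correct) on the
# semester test vs. the same question repeated verbatim on the final.
repeat_rates = {
    "Regression": (43.1, 59.1),
    "Crystallization": (47.5, 62.2),
    "Hydrolysis": (57.8, 61.8),
    "Perched H2O Table": (26.8, 26.7),
    "Continental Rift": (68.4, 85.0),
    "Metamorphism": (52.4, 58.2),
    "Age Dating": (78.8, 71.7),
    "Rock Forming Minerals": (87.1, 91.0),
    "Angle of Repose": (50.2, 67.3),
    "Earthquake Distance": (68.8, 92.4),
}

def gains(rates):
    """Difference (final - semester) per topic, rounded to one decimal."""
    return {topic: round(final - sem, 1) for topic, (sem, final) in rates.items()}

def declined(rates):
    """Topics where final-exam success fell below the semester-test result."""
    return sorted(topic for topic, g in gains(rates).items() if g < 0)

print(declined(repeat_rates))
```

Flagged topics (here, Age Dating and Perched H2O Table) are natural candidates for the interventions described on the redesign slide.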

Redesign Based on Results
How do we use the information that we gather? Interventions directed at specific topics are designed to give students a better understanding of the material:
Change lecture emphasis and presentation.
Create a classroom or lab exercise.
Assign homework on the topic.
Adopt a new textbook.
When all else fails, change the question.

Summary of Observations
Assessment is not always a straightforward process.
Interventions do not always achieve the desired results.
Most students will not be successful with some course content, despite intervention.
After long periods of stability in outcomes, it's time to rethink the entire process.

The End