Assessing Student Understanding of Physical Hydrology (#0691)

Adam J. Castillo a,c; Jill Marshall a; Meinhard B. Cardenas b

a Department of Curriculum and Instruction, The University of Texas at Austin, 1912 Speedway Stop D5700, Austin, Texas, 78712, USA
b Department of Geological Sciences, The University of Texas at Austin, 2275 Speedway Stop C9000, Austin, Texas, 78712, USA
c Corresponding author

Presentation transcript:

Objectives
- Contribute to the effort to articulate a common knowledge base
- Identify effective instructional strategies
- Identify desired learning outcomes of a physical hydrology course
- Evaluate curriculum reform in physical hydrology
- Determine a means of assessing whether those learning goals have been met, and prototype an instrument
- Identify "misconceptions, preconceptions, and difficulties" prior to the curriculum development effort [1]

Methodology

Setting & Participants
- A physical hydrology course at a large research university
- Two sections: one upper-division undergraduate & one graduate
- No lab or field component
- Course components: in-class lectures, homework sets, exams, and a course project
- Informed consent obtained (15 undergraduates & 10 graduates consented; participants equally split between M/F)

Learning Goals for the Physical Hydrology Course
1. Quantitative, process-based understanding of hydrologic processes
2. Experience with different methods in hydrology
3. Learning, problem-solving, and communication skills

Goals Translated into Questions for the Assessment Tool
Q1. What are the important physical processes involved in hydrology? Describe how they affect hydrologic systems in as much detail as you can.
Q2. What are the relevant physical laws that govern hydrology, and how do these laws determine hydrological processes? Describe them in as much detail as you can.
Q3. You have been hired as a consultant by __ to (1) assess how urbanization and the current drought have affected a local spring and (2) predict what the effects will be in the future if the drought continues. What information would you need to gather? What measurements would you make? What analyses would you perform? Be as specific as you can.
Acknowledgements
This research was supported by a U.S. National Science Foundation CAREER Grant (EAR ). We thank the graduate students and researchers who applied the rubric, namely Kevin Befus, Kuldeep Chaudhary, Wen Deng, Alec Norman, Lichun Wang, and Peter Zamora, and the students in the classes where the assessment tool was applied.
Discussion

Question 1
- Most students entered the course with only a rudimentary understanding of the processes involved in hydrology
- Understanding of the water cycle, especially its physical drivers, proved challenging
- Difficulties included incorporating groundwater and surface water–groundwater interactions, perceiving the hydrological system as static, and taking a non-systems approach
- Students often either neglected or exaggerated biological & human interactions

Question 2
- Most students entered the course with only a rudimentary understanding of the laws that govern hydrology

Question 3
- Understanding was somewhat higher in the 'methods' dimension, but still did not approach the 'full' level of understanding

Post-test Results
- Student understanding increased, as assessed by the drafted rubric [5]
- The majority of students fell into the 'basic' or 'full' understanding categories

Limitations
- Small sample; participants were all from one institution
- The rubric was being piloted and was still under development
- Students cannot show the full range of what they know & are able to do in a limited time and on only one type of assessment [4]

Final Comments
- There was substantial inter-rater agreement among a group of experts from the discipline of hydrology who were not responsible for the development of the rubric (see the agreement sketch after this list)
- The instrument should serve as a 'strawman' proposal to be critiqued
- Results give an indication of ways in which instruction might be redesigned to target specific difficulties & highlight important themes in a coherent manner
- A well-designed pre- and post-assessment can be used to conclude whether a given instructional intervention has caused a change in understanding in a given group of students
- Results are not necessarily generalizable
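The poster reports "substantial inter-rater agreement" but does not name the statistic used. Below is a minimal sketch, assuming Fleiss' kappa (a common agreement measure when more than two raters code the same artifacts); the four rubric levels and all ratings shown are illustrative placeholders, not the study's actual codes.

```python
# A minimal sketch of Fleiss' kappa for agreement among several raters.
# Rubric levels and ratings are hypothetical, not the study's data.
from collections import Counter

LEVELS = ["none", "rudimentary", "basic", "full"]  # assumed rubric levels

# Each inner list: the level assigned to one student response
# by each of three raters.
ratings = [
    ["rudimentary", "rudimentary", "basic"],
    ["basic", "basic", "basic"],
    ["full", "basic", "full"],
    ["none", "rudimentary", "rudimentary"],
]

def fleiss_kappa(ratings, levels):
    n_subjects = len(ratings)
    n_raters = len(ratings[0])
    counts = [Counter(r) for r in ratings]  # raters putting subject i in category j
    # Observed agreement, averaged over subjects
    p_bar = sum(
        (sum(c[lvl] ** 2 for lvl in levels) - n_raters) / (n_raters * (n_raters - 1))
        for c in counts
    ) / n_subjects
    # Chance agreement from the marginal category proportions
    p_j = [sum(c[lvl] for c in counts) / (n_subjects * n_raters) for lvl in levels]
    p_e = sum(p ** 2 for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

print(f"Fleiss' kappa = {fleiss_kappa(ratings, LEVELS):.2f}")
```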
Results: [figures omitted from transcript]

Background
- Hydrology has become increasingly interdisciplinary & technologically complex
- There are a limited number of studies of student understanding of hydrology & how it evolves over the course of schooling [5]
- Calls for examining, evaluating & enhancing hydrology education at both the upper-division and graduate levels [2,6,7]
- Early effort: an extensive survey of topics covered in college hydrology courses, provided as a resource to the community [3]
- Recent calls to "analyze, synthesize, and unite hydrology education" [8]
- Recent effort: university hydrology educators surveyed about current teaching methods & ways that curriculum and instructor preparation could be improved [8]
- A major challenge within the hydrology community: identifying "common principles, core knowledge, and approaches" [8]

Research Question
How can we characterize and assess upper-division and graduate student thinking about physical hydrology?

Rubric for Assessing Student Understanding of Physical Hydrology: [table of rubric categories omitted from transcript]

Rubric Development Process

Implementation
- Obtain informed consent
- Administer the pre-/post-test

Analysis of Student Artifacts
- R1 – Open coding
- R2 – Rubric used to code a sample of artifacts
- R3 – Team used the rubric to code a set of pre-/post-tests, then met to negotiate consensus (see the sketch after this section)

Development/Revision of Rubric
- R1 – Codes grouped by theme; four categories established; rubric drafted
- R2 – Revisions made to the rubric
- R3 – Guidelines created; revisions made to both the assessment & the rubric
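The poster states that student understanding increased as assessed by the drafted rubric, but does not show how coded levels were aggregated into the pre/post comparison. Here is a minimal sketch of one way to summarize paired pre-/post-test codes, assuming four ordered levels; the 'none' level and all paired codes are hypothetical.

```python
# A minimal sketch of tallying pre-/post-test rubric codes and a simple
# descriptive shift score. Level names and paired codes are hypothetical;
# the poster does not specify its aggregation method.
from collections import Counter

LEVELS = ["none", "rudimentary", "basic", "full"]  # ordered, assumed
RANK = {lvl: i for i, lvl in enumerate(LEVELS)}

# (pre_level, post_level) for each student on one rubric dimension
paired_codes = [
    ("rudimentary", "basic"),
    ("rudimentary", "full"),
    ("none", "basic"),
    ("basic", "basic"),
]

pre = Counter(p for p, _ in paired_codes)
post = Counter(q for _, q in paired_codes)
print("pre: ", {lvl: pre[lvl] for lvl in LEVELS})
print("post:", {lvl: post[lvl] for lvl in LEVELS})

# Mean change in ordinal rank; positive values mean movement toward
# fuller understanding. Descriptive only, not a causal claim.
shift = sum(RANK[q] - RANK[p] for p, q in paired_codes) / len(paired_codes)
print(f"mean level shift: {shift:+.2f}")
```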