Assessment: Measuring the Impact of our Programs on Students

Presentation transcript:

Assessment: Measuring the Impact of our Programs on Students
Cathy Manduca and Ellen Iverson, SERC at Carleton College

Why Assess?
“I don’t know what we are doing right.”
“We only have 120 hours.”

Performance Evaluation
Does the program address an important challenge?
What is the quality of the implementation?
What is its utility? Are participants using it?
Is it being used in different situations?
Is it being shared by participants?
(John McLaughlin)

Why Assess?
“I don’t know what we are doing right.”
“We only have 120 hours.”
“We will lose majors if we do X.”
“Change is accomplished one funeral at a time.”
“That would really help me with the dean.”
“It must work because other institutions are doing it.”

Adoption of a Practice
What results does it yield?
How do you do it?
Where is it flexible?
How much does it cost?
What aspects of context influence use?
(John McLaughlin)

Powers of Observation Applied to Complex Systems

Experimental Design

Assessment and Geoscience Research
Experimental design: What do I want to know? How can I obtain this information? What do I think is going on? How can I test this idea?
Analysis: Is the experimental apparatus working? Is the experiment yielding the desired information? How can I verify my results?
Interpretation: How do I determine causality? Is this the only interpretation?

Observing Complex Systems
Why are undergraduate research experiences valuable?
What do I think is going on?
What can I observe to test my theory?
How can I quantify my observations?

Geologic Strategies in Assessment
Multiple working hypotheses
Observations: describing, classifying, coding
Inferring process and cause
Probing complex systems

Understanding your model through concept maps
Articulate and map the change process.
Go from your assumptions to the desired goal.
Present a plausible rationale for how the program works.
Analyze the likelihood of success.
Clarify what should be measured: when, how, and by whom.
Depict the links between the resources, activities, or services and the desired outcomes.
Speaker notes: This is a critical thinking exercise to support your action plan. A participant at one of the road checks said that assessing is easy, but if the program fails, what is plan B? Mapping the model forces you through the what-ifs and the measurements (when, how, and by whom), and shows whether the whole chain holds together.

Internship logic model example
Students participate in the internship initiative →
Students gain from the program (knowledge, skills, attitudes, networks, resources) →
Who then impact: employers’ perception of the program and other students’ perception of the program →
Which impact students’: success in finding a job in geoscience at graduation, and longevity at their first job.
Speaker notes: This is a simplified example for adding a student internship component to a program. The KSAs are: Knowledge of the industry; Skills such as communication, the ability to work on a team, and balancing competing demands; and Attitudes such as greater confidence and greater awareness of the relevance of their education.
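
Not part of the original slides: a minimal sketch, in Python, of the logic model above written as a plain data structure. The stage labels and variable names are invented for illustration; the value of writing the model down this way is that each stage becomes an explicit place to attach a measurement.

# Hypothetical sketch: the internship logic model as a plain data structure.
# All names here are invented; this is not code from the presentation.
internship_logic_model = {
    "Activity": ["Students participate in the internship initiative"],
    "Students gain": ["Knowledge", "Skills", "Attitudes", "Networks", "Resources"],
    "Who then impact": [
        "Employers' perception of the program",
        "Other students' perception of the program",
    ],
    "Which impact students'": [
        "Success in finding a job in geoscience at graduation",
        "Longevity at first job",
    ],
}

# Walking the stages in order makes the assumed causal chain explicit;
# every printed line is a candidate for "what should be measured, when,
# how, and by whom."
for stage, entries in internship_logic_model.items():
    print(stage)
    for entry in entries:
        print("  -", entry)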

Are you implementing your model? Is your model correct?
Is what you are mapping what you think should happen?
What are the impacts?
Are the impacts caused by your planned actions?

Instruments to measure interventions
Qualitative (self-report and artifacts):
Open-ended survey questions
Embedded assessment
Artifacts such as reports or online discussions
Interviews and focus groups
Observations
Quantitative (counting):
Data sets
Surveys
Speaker notes: Counting covers how many students, how many employers, how many internships, the percent participating, and the percent satisfied (on a Likert scale). An open-ended survey or interview could be an end-of-internship survey of the student and the employer. Embedded assessment could be a journal the student keeps or online discussion artifacts; the assessment can also be its own intervention, for example students giving an oral report on the internship back to the introductory geoscience class. Focus groups held ahead of time give a sense of expectations and structure.
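
As an illustration only (none of this appears in the slides), the counting measures in the notes above reduce to simple arithmetic. A toy tally in Python, with invented numbers:

# Invented example data for the counting measures named above.
students_in_program = 120
interns = 45

# 5-point Likert satisfaction responses from a hypothetical
# end-of-internship survey (1 = very dissatisfied ... 5 = very satisfied).
satisfaction = [5, 4, 4, 3, 5, 2, 4, 5, 4, 3]

pct_participating = 100 * interns / students_in_program
# A common convention: count responses of 4 or 5 as "satisfied".
pct_satisfied = 100 * sum(1 for r in satisfaction if r >= 4) / len(satisfaction)

print(f"Participating: {pct_participating:.0f}% of students")
print(f"Satisfied (4 or 5): {pct_satisfied:.0f}% of respondents")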

Why Assess?
“I don’t know what we are doing right.”
“We only have 120 hours.”
“We will lose majors if we do X.”
“Change is accomplished one funeral at a time.”
“That would really help me with the dean.”
“It must work because other institutions are doing it.”
What is the purpose? Who is the audience? What will make a compelling argument?