MEASURES OF SUCCESS: Assessment and Evaluation

MEASURES OF SUCCESS: Assessment and Evaluation
CIRTL Measures for Success
Chris Pfund, based on presentations from Carol L. Colbeck and Sue Daffinrud
Colbeck Copyright 2002

Definitions
Assessment: the process of measuring something (e.g., student knowledge, skills, and attitudes within a classroom setting).
Assessment instrument: the method or device used for measuring (e.g., formally, a test; informally, observations of and interactions with students).
Evaluation: a judgment of something based on results from data gathered through assessment.

LEVELS OF EVALUATION
- Participation
- Satisfaction
- Learning
- Application
- Impact

PARTICIPATION: Who? What? Why?
Demographics: disciplinary major; race/ethnicity/gender; GPA
Experience: number of years/classes in the discipline; other professional development experience
Motivation: own goals; external encouragement
Evaluation and research questions:
- What percentage of the targeted population is participating?
- What are the demographics, experience, and motivation of non-participants?
- Does participation change over time? If so, among which groups?
- How can we increase participation?
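The participation questions above reduce to simple tabulation. A minimal sketch in Python (the records, field names, and groups are hypothetical, not CIRTL data) of computing an overall participation rate and rates by group:

```python
from collections import Counter

# Hypothetical records: one dict per member of the targeted population,
# with a flag for whether that person participated in the program.
population = [
    {"discipline": "Biology", "gender": "F", "participated": True},
    {"discipline": "Biology", "gender": "M", "participated": False},
    {"discipline": "Physics", "gender": "F", "participated": True},
    {"discipline": "Physics", "gender": "M", "participated": True},
]

def participation_rate(records, group_key=None):
    """Overall participation fraction, or a fraction per value of
    group_key (e.g. 'discipline' or 'gender')."""
    if group_key is None:
        return sum(r["participated"] for r in records) / len(records)
    totals, participants = Counter(), Counter()
    for r in records:
        totals[r[group_key]] += 1
        participants[r[group_key]] += r["participated"]
    return {group: participants[group] / totals[group] for group in totals}

print(participation_rate(population))                # overall fraction
print(participation_rate(population, "discipline"))  # fraction per discipline
```

Tracking these rates across program offerings answers the "does participation change over time, and among which groups?" question directly.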

SATISFACTION: Value for participants
Content: Did the program address participants' reasons for coming (own goals, external encouragement)? Were learning objectives specified and addressed? Did the program deliver the content the developers hoped and planned to deliver?
Instruction/facilitation: Did the process foster participant interest (instructor style, teaching methods)? Did the instructor's style engage participants, convince them of his or her suitability to deliver the information, and model the desired attitudes and skills?
Evaluation questions:
- Did participants like the program? Did they consider it worthwhile?
- How can the process be improved to keep participants coming and to attract new ones?

LEARNING
Knowledge gains: Has awareness increased? (content knowledge, theoretical foundations, empirical research)
Conceptual change: Are there changes in attitudes, beliefs, or confidence?
Skill development: Can tasks be performed? (behaviors)
Evaluation questions:
- Did participants learn what was intended?
- To what extent did the program contribute to their learning? What else contributed?

APPLICATION
Attempt: Are new knowledge, attitudes, and skills tried independently?
Implement: Does application follow the desired processes and procedures?
Adjust: Does the participant analyze and adjust appropriately for context?

IMPACT: Is the program worthwhile?
On undergraduate students: increased satisfaction; deeper learning; application in the lab, on the job, and on the GREs
On graduate student participants: increased confidence in teaching; a wider array of career opportunities; more integration of teaching and research
On the institution: improved recruitment and retention of graduate students; improved recruitment and retention of undergraduate students; a stronger community around research, teaching, and learning
(Colbeck 2003)

SAMPLE EVALUATION QUESTIONS
- What types of students are most likely to participate?
- How do relationships between participant characteristics, experience, and teaching methods affect their satisfaction? Their learning?
- How do relationships between institutional context, class content, and individual characteristics affect participants' independent application of what they learned?
- Under what conditions are undergraduate students most likely to report gains in satisfaction and learning from a participant who has undergone professional development training?
(Colbeck 2003)

Student Assessment of Learning Gains (SALG)
The SALG instrument asks students to rate the extent to which aspects of a course helped them learn and the extent to which they are achieving the course's objectives. The SALG website lets you survey students online.
Available at: http://www.salgsite.org/

The SALG has five main questions:
1. How much did each of the following aspects of the course help you in your learning?
2. How well do you think that you now understand each of the following?
3. How has this class added to your skills in each of the following?
4. To what extent did you make gains in each of the following?
5. How much of the following will you carry with you to your other classes?