Evaluation Styles: Logic Model, Generic Social, Developmental, Generic Learning, Digital, Triangulation

Presentation transcript:

Evaluation Styles
Logic Model, Generic Social, Developmental, Generic Learning, Digital, Triangulation
Public Engagement with Research Unit, University of Southampton

Generic Social
- Clearly identify the core purpose/outcome of your activity.
- Develop an evaluative question that reflects this core purpose by thinking about what participants and staff would do, think and say if the project outcome had been met.
- Ask open questions (ones that do not invite a yes/no answer) that give respondents scope to say what is important to them.
- Adapt your data-collection methods to the audience; for example, if working with children, visual methods may be more effective than a questionnaire.

Logic Model
- Inputs (what is required to achieve the aims)
- Activities (what the project does with the resources)
- Outputs
- Short-term outcomes
- Longer-term outcomes
- Measurement tools
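The chain above is essentially a structured record: resources in, measurable change out. A minimal sketch of one way to capture it in code, assuming nothing beyond the slide's own terms (the class name, field names and all example values are illustrative, not part of the original):

from dataclasses import dataclass

@dataclass
class LogicModel:
    """One logic model: resources in, measurable change out."""
    inputs: list[str]                # what is required to achieve the aims
    activities: list[str]            # what the project does with the resources
    outputs: list[str]               # direct, countable products of the activities
    short_term_outcomes: list[str]   # changes expected soon after the activity
    longer_term_outcomes: list[str]  # changes expected over a longer horizon
    measurement_tools: list[str]     # how each outcome will be evidenced

# Hypothetical example, purely for illustration:
workshop = LogicModel(
    inputs=["facilitator time", "venue", "materials budget"],
    activities=["run three public engagement workshops"],
    outputs=["3 workshops delivered", "60 participants reached"],
    short_term_outcomes=["participants report increased understanding"],
    longer_term_outcomes=["participants change practice or behaviour"],
    measurement_tools=["exit questionnaire", "follow-up interviews"],
)

Writing the model down in one structure like this makes gaps obvious, for instance an outcome listed with no corresponding measurement tool.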

Digital
- Photographs
- Video
- Video diaries
- Blogs
- Audio recording
- Podcasting
- Mobile phones (texting, apps, etc.)
- Internet platforms
- Web polls
- Email

Developmental
- Innovation
- Radical redesign
- Replication
- Complex issues
- Crises

Generic Learning
- Knowledge and understanding
- Skills
- Behaviour and progression
- Enjoyment, inspiration and creativity
- Attitudes and values
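These outcome categories can double as a coding frame when analysing free-text feedback. A minimal sketch under that assumption; the keyword lists and the sample response are illustrative placeholders, not part of any published coding scheme:

from enum import Enum

class GenericLearningOutcome(Enum):
    """The outcome categories listed on the slide."""
    KNOWLEDGE_AND_UNDERSTANDING = "knowledge and understanding"
    SKILLS = "skills"
    BEHAVIOUR_AND_PROGRESSION = "behaviour and progression"
    ENJOYMENT_INSPIRATION_CREATIVITY = "enjoyment, inspiration and creativity"
    ATTITUDES_AND_VALUES = "attitudes and values"

def tag_response(text: str) -> set[GenericLearningOutcome]:
    """Very rough keyword tagging of one free-text response (illustrative only)."""
    keywords = {
        GenericLearningOutcome.KNOWLEDGE_AND_UNDERSTANDING: ["learned", "understand"],
        GenericLearningOutcome.SKILLS: ["can now", "able to"],
        GenericLearningOutcome.ENJOYMENT_INSPIRATION_CREATIVITY: ["enjoyed", "inspired"],
    }
    lowered = text.lower()
    return {outcome for outcome, words in keywords.items()
            if any(w in lowered for w in words)}

print(tag_response("I really enjoyed it and now understand how vaccines work."))

In a real evaluation the tagging would be done by a researcher rather than by keywords, but storing the categories as a fixed enumeration keeps coding consistent across coders.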

Triangulation
- Using different methods to collect information
- Asking different people the same thing, to gain a well-rounded perspective in the evaluation
- Mixing quantitative and qualitative methods (e.g. surveys, focus groups, observation, questionnaires, tracked attendance figures)
- Bringing together different theoretical approaches to interpret the outcomes of research
- Can also describe several researchers combining their observations of the same evidence at the same time (e.g. a gallery observation by a team)
- Mixing primary and secondary evidence
- Can help to protect against built-in bias within evaluation methods
- Avoids reliance on the written or spoken word, which can be a barrier for some participants (part of the 'Mosaic Approach')
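In practice, triangulated evidence from different methods has to be brought back together around the same evaluative question so that convergence can be checked. A minimal sketch of one way to organise that, assuming hypothetical data (all question text, method names and evidence strings are invented for illustration):

from collections import defaultdict

# Each finding records the method that produced it, so agreement
# across independent methods can be checked per evaluative question.
findings = [
    {"question": "Did understanding improve?", "method": "survey",
     "evidence": "72% self-reported improvement (n=120)"},
    {"question": "Did understanding improve?", "method": "focus group",
     "evidence": "Participants explained key concepts unprompted"},
    {"question": "Did understanding improve?", "method": "observation",
     "evidence": "Visitors spent longer at explanatory panels"},
]

by_question = defaultdict(list)
for f in findings:
    by_question[f["question"]].append(f)

for question, items in by_question.items():
    methods = {f["method"] for f in items}
    print(f"{question} ({len(methods)} independent methods)")
    for f in items:
        print(f"  [{f['method']}] {f['evidence']}")
    if len(methods) >= 3:
        print("  -> findings converge across methods (triangulated)")

The design point is simply that each piece of evidence carries its method of origin; a conclusion supported by only one method stands out immediately, which is exactly the built-in bias the slide warns about.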