Slide 1: WP4. Development of Evaluation framework
Manoli Pifarré, Universitat de Lleida, Catalonia, Spain
Slide 2: WP4. Development of Evaluation framework
WP4 will collect and analyse scientific evidence on how the innovative and good practices designed and implemented have had an impact on students' development of DA skills and on teachers' practices.
Our evaluation has to be: systematic, consistent, reliable, replicable.
Slide 3: In the bid we said that in this meeting we would agree on:
- The focus for pilot project evaluation (e.g. pupil participation/enjoyment, skills acquisition, teaching methods, etc.)
- The research approach: use of quantitative methods (e.g. tests, questionnaires) and qualitative methods (e.g. classroom observation, interviews with teachers/pupils)
- Responsibilities for developing common tools such as questionnaires or observation guides
Evaluation will run alongside pilot project delivery. It will be reviewed at PM3, after the first wave of pilots.
Slide 4: FOCUS FOR PILOT EVALUATION – RESEARCH QUESTIONS
- Research questions about the impact on students
- Research questions about the impact on teachers' practices, attitudes and perceptions
Slide 5: FOCUS AND APPROACH IN RELATION TO THE IMPACT ON STUDENTS
Focus: what changes and how, both pre-post and during project implementation; ICT use; individual and collaborative work.
Quantitative methods: test content (ad hoc), problem-based test (ad hoc), questionnaire, students' work during the project. These address DA content and skills, weather content, attitudes, anxiety, collaboration skills, creative skills and ICT use.
Qualitative methods: focus group, interview, student's report, observations.
Slide 6: What is measured in each item
For each instrument (test content (ad hoc), problem-based test (ad hoc), questionnaire, students' work):
- Define what is measured in each item
- Elaborate correction rubrics for each item, well defined and able to be used by other teachers
- Search for resources
Slide 7: Focus group, interview, student's report, classroom observation, classroom videotapes
For each:
- Define the objectives of the focus group or interview
- Define the questions
- Elaborate correction rubrics, well defined and able to be used by other teachers
- Look for resources
Slide 8: RESEARCH QUESTIONS ABOUT THE IMPACT ON STUDENTS
DA content & DA skills
- Research questions: What are the rudimentary ideas of DA skills, and how are they expressed among students at different ages? How do these rudimentary ideas change after students' participation in SPIDAS DA learning scenarios at different ages?
- Method/instrument: test ad hoc (DA content knowledge and implementation of DA skills), for primary, lower secondary and higher secondary; pre-post, individually. Students' worksheets during the project. Classroom observations.
Weather content
- Research question: What is students' knowledge about XXX (the weather content taught during the project)?
- Method/instrument: test ad hoc.
Slide 9: RESEARCH QUESTIONS ABOUT THE IMPACT ON STUDENTS (continued)
Attitudes, anxiety, conceptions
- Research questions: How does the SPIDAS DA learning scenario affect students' attitudes towards DA, mathematics and weather? What are students' perceptions about the role of DA in addressing real-world problems related to weather and climate change?
- Method/instrument: questionnaire; pre-post, individually.
Collaboration and creative skills
- Research question: How does the SPIDAS DA learning scenario increase the collaborative and creative skills needed to solve an ill-defined, real-life problem?
- Method/instrument: test ad hoc with student's report; observation and/or videotaping; pre-post, collaboratively; classroom observation/videotape.
ICT
- Research questions: How can new visualizations using technological tools enrich students' conceptual understanding of DA processes? What technological affordances can shape students' development of DA competences, and how?
- Method/instrument: students' report during the project (students write a report about which visualizations helped them most); follow-up interview (focus group) during the project, asking a group how they have worked; poster presentation.
Slide 10: Tomorrow: discussion and agreement on how we are going to work
- What data sources do we want to collect?
- What methods are we going to use to collect these data?
- What procedure are we going to follow?
- How can we collaborate in designing the evaluation instruments?
- How do we simultaneously design the project activities (also part of WP3) and the evaluation of their impact on students' learning?
Slide 11: Data sources & methods
Pre-project evaluation:
- Questionnaire (individual): attitudes, anxiety, conceptions
- Test ad hoc (individual): DA content & DA skills, weather content
- Open problem, test ad hoc (group): collaborative skills, creative skills
During-project evaluation (process):
1. Students' worksheets and final project product
2. Students' report during the project about which ICT visualizations helped them most
3. Follow-up interview (focus group): ask a group about how they have worked (ICT visualizations, DA skills, collaborative skills, creativity)
4. Classroom observations
Post-project evaluation:
- Questionnaire (individual): attitudes, anxiety, conceptions
- Test ad hoc (individual): DA content & DA skills, weather content
- Open problem, test ad hoc (group): collaborative skills, creative skills
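The pre/post instruments above could feed a simple paired comparison per student. The deck does not prescribe an analysis method, so the paired t-test in the sketch below is an assumption, and the scores are illustrative placeholders, not project data:

```python
# Minimal sketch of a pre/post comparison for one instrument, assuming each
# student gets a numeric score before and after the SPIDAS learning scenario.
# The scores below are illustrative placeholders, not project results.
from scipy import stats

pre_scores = [12, 15, 9, 14, 11, 10, 13, 16]    # hypothetical pre-test scores
post_scores = [15, 18, 11, 17, 14, 12, 15, 19]  # same students, post-test

# Paired t-test: did scores change after participation?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
mean_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)

print(f"mean gain: {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```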
Slide 12: Designing a rubric – What is a rubric?
Rubrics measure a student's skills and knowledge. They contain rating scales, describing levels of quality, to use for grading student work. Mertler (2001) defined them as "scoring guides, consisting of specific pre-established performance criteria, used in evaluating student work on performance assessments."
Generally there are two types of rubrics: holistic and analytic. A holistic rubric scores an entire product as a whole, rather than evaluating its different components. With an analytic rubric, a teacher evaluates individual parts of the final product; the individual scores are then added together to obtain one final score, as in the sketch below.
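A minimal sketch of analytic-rubric scoring under these definitions; the criteria, level labels and maximum score are hypothetical examples, not the project's agreed rubric:

```python
# Analytic rubric sketch: each criterion is rated on a 1-4 quality scale and
# the per-criterion ratings are summed into one final score.
# Criterion names and level labels are hypothetical, not SPIDAS instruments.
RUBRIC = {
    "identifies relevant data": ("missing", "partial", "adequate", "thorough"),
    "interprets visualizations": ("missing", "partial", "adequate", "thorough"),
    "justifies conclusions": ("missing", "partial", "adequate", "thorough"),
}

def analytic_score(ratings):
    """Sum per-criterion ratings (1-4) into a single analytic score."""
    for criterion, level in ratings.items():
        if criterion not in RUBRIC or not 1 <= level <= len(RUBRIC[criterion]):
            raise ValueError(f"invalid rating for {criterion!r}: {level}")
    return sum(ratings.values())

# One student's ratings across the three criteria (maximum possible: 12)
print(analytic_score({
    "identifies relevant data": 3,
    "interprets visualizations": 2,
    "justifies conclusions": 4,
}))  # -> 9
```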
Slide 13: How to scale
1. Get inspiration: look at models.
2. Consider the final product: create examples of good and poor work.
3. List criteria: brainstorm a list of criteria and important components of the answer / project work.
4. Pack and unpack criteria: you may generate a long list of criteria; the next step is to combine criteria that may be similar. Try to avoid large categories or ones that hide items you would like to emphasize.
5. Articulate levels of quality: for each criterion, list four levels of quality from the lowest to the highest. Consult Bloom's Taxonomy for help with this.
6. Create a draft rubric.
7. Revise the draft.
Slide 14: Designing a Rubric
Slide 15: Classroom observation: designing the items to observe creativity skills
Divergent phase (open mind):
- Generate ideas
- Develop ideas
- Propose new approaches to solve the problem
- Link information/ideas
Convergent phase (closure/conclusion):
- Understand the structure/patterns
- Group ideas
- Organize information
- Analyse data
- Summarize/synthesize ideas
- Make decisions
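One way these items could be turned into an observation coding sheet is to tally each indicator per group during a session; the tallying scheme sketched below is an assumption for illustration, not a SPIDAS-agreed instrument:

```python
# Sketch of an observation coding sheet for the creativity-skill items above:
# observers tally how often each divergent/convergent indicator appears.
# The coding scheme is illustrative, not the project's agreed instrument.
from collections import Counter

DIVERGENT = ["generate ideas", "develop ideas",
             "propose new approaches", "link information/ideas"]
CONVERGENT = ["understand patterns", "group ideas", "organize information",
              "analyse data", "summarize ideas", "make decisions"]

def code_session(observed_events):
    """Count how often each divergent/convergent indicator was observed."""
    counts = Counter(observed_events)
    return {
        "divergent": {k: counts[k] for k in DIVERGENT},
        "convergent": {k: counts[k] for k in CONVERGENT},
    }

# Hypothetical sequence of events noted while observing one group
events = ["generate ideas", "generate ideas", "link information/ideas",
          "analyse data", "make decisions"]
print(code_session(events))
```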
Slide 16: Classroom observation: designing the items to observe the implementation of DA skills