Using Technologies to Support Assessment Fabio R. Aricò
Assessment Fiesta 17 Sep 2014
ABOUT MYSELF
Fabio R. Aricò, Lecturer in Macroeconomics, School of Economics, University of East Anglia
Education: BSc Business and Economics, University of Pavia; MSc Economics, University of Warwick; PhD Economics, University of Warwick; PGCert in Higher Education Practice
Interests: Economics of Education; Higher Education Policy and Practice; Student Satisfaction and NSS; Widening Access Policies; Learning Technologies; Pluralism in Social Sciences; Academic Self-Efficacy
Work experience: University of Pavia, Associate Tutor; University of Warwick, Teaching Fellow; University of St Andrews, Lecturer in Economics
TWEET AWAY! @FabioArico #LTAFiesta
When Student Confidence Clicks
Academic Self-Efficacy and Learning in HE Fabio R. Aricò
ACKNOWLEDGEMENTS
UEA-HEFCE Widening Participation Teaching Fellowship
HEA – Teaching Development Grant Scheme (Individual)
Mentor and co-author: Dr Duncan Watson
Senior Research Associate: Dr Kathleen Lane
Research Associate: Chris Thomson
UG Research Assistants: Zainab Ahmed, Jack Kelehar
AIMS AND OBJECTIVES
Awareness: wise use of learning technologies in assessment; self-efficacy and self-assessment as success factors
Disseminate: offer teaching ideas and share my experience; the HEA mission and my commitment
Feedback: what do you think? Can this be improved?
Exchange: create a ‘community of practice’
Partnership: let’s work together!
OUTLINE
1. Student Response Systems (clickers)
2. An overview of the “Introductory Economics” module at UEA
   a) learning environment & clickers
   b) methods that do not use clickers (but use the VLE!)
3. Blended surveying: continuous dialogue with the students
4. Some learning analytics (if we have time)
ETHICAL REMARK
You will be presented with data collected during teaching sessions. The students involved have given informed consent for me to analyse their responses and to present the results of that analysis. I can also assist with ethical queries: please ask me.
1. Using Student Response Systems (clickers)
Do you know about clickers? Do you know how to use them?
Yes, I use them in my teaching.
Yes, but I do not currently use them in my teaching.
I have an idea of what clickers do, but a very basic one.
I have no clue about what clickers do and how they can be used.
Do you use a Virtual Learning Environment? For what?
Yes, mainly as a repository for handouts/teaching material.
Yes, as a repository but also to assess students online.
Yes, as a repository but also to interact with my students.
No, I do not use a VLE.
SRS interaction with feedback to the students
CLICKER MANAGEMENT
School of Economics protocol:
Each student receives a clicker during orientation week, with an information sheet. Clicker collection is not compulsory.
Each clicker ID is associated with a student ID.
Ethical procedure followed – informed consent to use the data.
Students use their clickers across their 3 years of study and return them to the School at the end of their studies.
If a student loses or breaks their clicker, they pay a replacement cost (but can obtain a new one).
CLICKER MANAGEMENT
Statistics for 2013-14, Year 1:
180 students enrolled
13 students did not collect their clickers (the majority of them overseas students)
1 clicker lost, paid for, and replaced
1 clicker returned (after chasing)
1 clicker missing
CORE IDEAS ABOUT CLICKERS
Clickers increase engagement and student satisfaction – yes, it works and it is well tested.
The ‘clicker’ novelty wears off quickly – possibly, but the majority of students keep on using them and enjoy them.
You cannot be over-ambitious in what you teach – if you switch to clickers you will need to cut some of the material.
Use a countdown timer to get responses in faster – recent research suggests an 80% response rate is the cut-off point.
It is not all about technology; it is still about good teaching – see Nielsen et al. (2013), Research in Learning Technology, ALT.
2. “Introductory Economics”: the learning environment
a) clicker-based teaching
TEACHING PROTOCOL – the module
Introductory Macroeconomics: Year 1, compulsory, year-long module (~180 students)
Lectures: traditional frontal teaching (10 per semester)
Seminars: small-group, pre-assigned problem sets (4 per semester)
Workshops: large-group, problem-solving sessions (4 per semester)
Support Sessions: non-compulsory drop-in sessions (4 per semester)
TEACHING PROTOCOL – lectures
Interaction via clicker technology:
revision questions + understanding questions
closing questions: was the lecture enjoyable and interesting? was the material difficult?
online report of the clicking session + feedback
Lecture difficulty indicator: 66% (+8%). Please look out for additional resources coming online very shortly: video tutorials on the IS-LM model will be available soon. I would like you to reflect on the feedback requested on the IS-LM model and try to identify your OWN difficulties. If many of you are confident about understanding and mastering the material, we need to make this belief become a reality. For those of you who are not confident: why is this the case? Come and discuss it with me.
TEACHING PROTOCOL – lectures
TEACHING PROTOCOL – workshops
Workshops: a peer-instructed, flipped-classroom approach
Choose the most accurate definition of inflation:
Inflation is defined as the global increase in price levels.
Inflation is defined as the rate of change of the price level.
Inflation is defined as the difference between price levels in 2 consecutive years.
How confident do you feel about your answer?
Very confident.
Confident.
Somewhat confident.
Not confident.
So, did you get the right answer?
Compare your answer with the delegates sitting next to you for a minute, and then we will re-poll the question.
Choose the most accurate definition of inflation:
Inflation is defined as the global increase in price levels.
Inflation is defined as the rate of change of the price level.
Inflation is defined as the difference between price levels in 2 consecutive years.
TEACHING PROTOCOL – workshops
Workshops, the standard algorithm:
1. Preliminary preparation question
2. Quiz questions + confidence questions (no solutions)
3. Peer-instruction learning
4. Quiz questions + solutions
5. Problem-set questions (no clicking)
6. Feedback questions:
   - what was the cause of mistakes/problems?
   - did you enjoy using clickers?
   - were clickers useful to your learning?
Followed by an online report of the clicking session + feedback.
WHAT DO YOU THINK?
Assuming that time and resources were available, would you be keen on using (or intensifying the use of) SRS technology?
Can you think of applications of SRS in your teaching practice?
How would you use them? How frequently?
In which teaching context (lecture/workshop…)?
Which questions would you ask?
2. “Introductory Economics”: the learning environment
b) non-clicker-based methods + VLE
TEACHING PROTOCOL – extra-curricular
Extra-curricular activities to promote engagement and self-efficacy:
Module Facebook page + Blackboard pages – ‘challenges’ to encourage further study; interaction and participation
Voluntary in-lecture presentations (5 minutes) – to exploit demonstration effects
Campus vouchers (for engagement, not attainment)
TEACHING PROTOCOL – seminars
Preliminary Seminar Quizzes (paper-based):
3-4 revision/understanding questions
2 confidence/self-assessment questions
open-answer comments
Followed by an online report of Seminar Quiz solutions and overall performance, with individual performance available and a response to the open-answer comments.
TEACHING PROTOCOL – seminars
3. Blended Surveying: continuous dialogue with the students
BLENDED SURVEYING: a by-product of the project
I found myself interacting with the students more and more:
I found out what students like and dislike in much finer detail.
I had the chance to respond to their opinions in real time – sometimes it is enough to explain why things cannot be done.
I found out that an ‘end of module’ questionnaire is not enough – are we asking the right questions? At the right time? In the right way?
Students recognised this: “Best thing: the support provided by all the lecturers, teachers and the amount of feedback that is asked for shows that the staff care a lot for our learning experience.”
BLENDED SURVEYING: the idea
In a highly structured and diversified blended-learning environment we need an equally sophisticated blended-surveying approach.
Contact hours: lectures, small-group seminars, large-group workshops, office hours, and support meetings.
Modes of delivery: frontal teaching, seminar discussion, peer-instructed workshop practice, video-assisted individual study, VLE-delivered material.
How can we assess the effectiveness of all this?
BLENDED SURVEYING: principles
1) SIMULTANEITY – evaluate the process of learning as it occurs. Students do not need to recall events: they just share their feelings in real time.
2) CONSISTENCY – assess teaching using the same devices through which teaching is delivered. This enables simultaneity, and seamlessly blends the teaching, learning, and evaluation processes.
3) CONTINUITY – use the process along the whole teaching period. Make adjustments; detect changes in opinion.
4) CIRCULARITY – close the feedback loop. Talk to the students; acknowledge changes; explain why some things cannot change.
4. Some Learning Analytics: how to visualise the data, and the findings of my analysis
(We did not cover this during the workshops, but you can find some additional ideas in the following slides.)
DATASETS
A student-by-question matrix (Student × Q1, Q2, Q3, …) recording:
performance per student and confidence by student – a longitudinal study across all lectures, seminars, and workshops
performance per question and confidence by question
intermediate and final attainment outcomes: course test, final exam, …
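A dataset of this shape can be sketched as a small pandas frame. The column names and values below are hypothetical, for illustration only, not the actual clicker-software export:

```python
import pandas as pd

# Illustrative sketch of the student-by-question clicker dataset;
# every name and number here is invented.
responses = pd.DataFrame({
    "student":    ["S01", "S01", "S02", "S02"],
    "question":   ["Q1", "Q2", "Q1", "Q2"],
    "correct":    [1, 0, 1, 1],        # 1 = right answer, 0 = wrong
    "confidence": [4, 2, 3, 4],        # 4 = very confident ... 1 = not confident
})

# The four summaries named on the slide:
perf_per_student  = responses.groupby("student")["correct"].mean()
perf_per_question = responses.groupby("question")["correct"].mean()
conf_by_student   = responses.groupby("student")["confidence"].mean()
conf_by_question  = responses.groupby("question")["confidence"].mean()
```

The long ("tidy") layout makes the per-student and per-question aggregations symmetric: each is a single groupby over a different key.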
Some ideas to visualise the data
[Chart: Week 5 – % correct responses per question, 1st round vs 2nd round]
You can represent results by question…
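A by-question chart of this kind can be sketched with matplotlib. The question labels and percentages below are made up for illustration:

```python
import matplotlib
matplotlib.use("Agg")            # render off-screen, no display needed
import matplotlib.pyplot as plt
import numpy as np

questions = ["Q1", "Q2", "Q3", "Q4"]    # hypothetical question labels
round1 = np.array([45, 60, 30, 70])     # % correct, 1st round (invented)
round2 = np.array([75, 85, 55, 90])     # % correct after peer instruction (invented)

x = np.arange(len(questions))
fig, ax = plt.subplots()
ax.bar(x - 0.2, round1, width=0.4, label="1st round")
ax.bar(x + 0.2, round2, width=0.4, label="2nd round")
ax.set_xticks(x)
ax.set_xticklabels(questions)
ax.set_ylabel("% correct responses")
ax.legend()
fig.savefig("by_question.png")
```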
Some ideas to visualise the data
[Chart: Week 5 – % correct responses in the 2nd round against the 1st round, per student]
You can represent results by student…
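The per-student view is naturally a scatter of 2nd-round against 1st-round scores: points above the 45-degree line improved after peer discussion. A sketch on randomly generated, purely illustrative scores:

```python
import matplotlib
matplotlib.use("Agg")            # render off-screen
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
round1 = rng.uniform(20, 80, size=40)                        # invented 1st-round scores
round2 = np.clip(round1 + rng.uniform(0, 30, size=40), 0, 100)

fig, ax = plt.subplots()
ax.scatter(round1, round2)
ax.plot([0, 100], [0, 100], linestyle="--")                  # 45-degree no-change line
ax.set_xlabel("% correct responses, 1st round")
ax.set_ylabel("% correct responses, 2nd round")
fig.savefig("by_student.png")
```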
Some ideas to visualise the data
[Chart: workshop longitudinal data]
RESULT 1: peer-instruction and learning
We found that the learning gains from peer-instruction:
are higher in the group of low-performing students – so peer-instruction works in getting everybody to the same level;
are not associated with self-efficacy and self-assessment skills – so everybody has a chance to learn, even those who are poor at self-assessing;
are positively correlated with exam performance – so it seems that peer-instruction can display some long-run effects. This result warrants further investigation.
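A learning-gain analysis of this kind can be sketched as follows. The per-student numbers are invented, and I assume Hake's normalized gain as the measure, which is not necessarily the exact one used in the study:

```python
import numpy as np

# Hypothetical per-student data: share of correct answers before and after
# peer instruction, plus final exam marks (all invented).
pre  = np.array([0.40, 0.50, 0.30, 0.60])
post = np.array([0.55, 0.75, 0.40, 0.90])
exam = np.array([55.0, 65.0, 50.0, 75.0])

# Normalized learning gain (Hake): improvement as a share of the
# improvement that was still possible.
gain = (post - pre) / (1.0 - pre)

# Correlation between gain and exam performance.
r = np.corrcoef(gain, exam)[0, 1]
```

Dividing by `1 - pre` keeps the gain comparable across students who started from different levels, which matters when comparing low- and high-performing groups.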
RESULT 2: assessing self-assessment
What is the relationship between attainment and confidence? Are students able to self-assess their performance?
Compare two self-assessment set-ups:
Seminars – paper-based self-assessment: seminar quizzes + confidence questions
Workshops – clicker-based self-assessment: the peer-instructed, re-iterated algorithm
RESULT 2: Seminars What is the relationship between attainment and confidence?
RESULT 2: Workshops What is the relationship between attainment and confidence?
RESULT 2: summary
In Seminar Quizzes: high-attainment students display higher confidence; low-attainment students are not able to self-assess their performance.
In Workshop sessions: low-attainment students display lower confidence.
How should we interpret this asymmetry?
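One way to quantify this asymmetry is a within-group correlation between being correct and feeling confident. The sketch below uses invented responses, with confidence on the 1-4 scale used in the polls, and is only an illustration of the calculation, not the study's data:

```python
import numpy as np

# Hypothetical responses, split into high- and low-attainment groups.
correct    = np.array([1, 1, 1, 0, 0, 0, 1, 0])
confidence = np.array([4, 4, 3, 2, 3, 4, 3, 4])
attainment = np.array(["hi", "hi", "hi", "hi", "lo", "lo", "lo", "lo"])

calibration = {}
for group in ("hi", "lo"):
    mask = attainment == group
    # Correlation between accuracy and confidence within the group:
    # positive = well calibrated, near zero or negative = poorly calibrated.
    calibration[group] = np.corrcoef(correct[mask], confidence[mask])[0, 1]
```

In this invented example the high-attainment group comes out well calibrated and the low-attainment group does not, mirroring the seminar-quiz pattern described above.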
RESULT 2: summary
In Seminar Quizzes: 3 or 4 questions, paper-based quiz, 5-6 minutes, not anonymous; 1 confidence assessment for overall performance.
In Workshop sessions: 5-10 questions, clicker responses, slower pace, quasi-anonymous; 1 confidence assessment for each question asked.
RESULT 2: conclusion
Low-attainment students encounter more difficulties in self-assessing their performance in an environment where:
they self-assess their ‘overall’ performance on a composite task;
they are exposed to questions for a shorter period of time;
they are exposed to fewer questions, not anonymously.
Focus-group interviews (differentiated by attainment group) confirm that low-attainment students display lower self-assessment skills – an important finding for intervention!