
1 Evaluation Methods for Mobile Learning
Mike Sharples
Learning Sciences Research Institute, University of Nottingham
www.nottingham.ac.uk/lsri/msh

2 What is mobile learning?
Learning with portable technology
– Focus on the technology
– Could be in a fixed location, such as a classroom
Learning across contexts
– Focus on the learner
– Could use portable or fixed technology
– How people learn across locations and transitions
Learning in a mobile world
– Focus on the mobile society
– How to understand people and technology in constant mobility
– How to design learning for the mobile society

3 Can mobile learning be effective?
We think so!
– Classroom response systems (Draper, Dufresne, Roschelle)
– Group learning with wireless mobiles and phones (Nussbaum et al., Dillenbourg)
– Classroom handheld simulation games (Colella, Virus Game)
– Mobile guides (Tate Modern, Caerus, Mobile Bristol)
– Connecting learning in formal and informal settings (Butterfly Watching, MyArtSpace)
But there is a lack of convincing studies of mobile learning
– Attitude surveys and interviews: "they say they enjoy it"
– Observations: "they look like they are learning"
– With a few exceptions (e.g. Nussbaum et al.)

4 Issues in evaluating mobile learning
It may be mobile
– Tracking activity across locations
It may be distributed
– Multiple participants in different locations
It may be informal
– How can we distinguish learning from other activities?
It may be extended
– How can we evaluate long-term learning?
It may involve a variety of personal and institutional technologies
– Mobile and fixed phones, desktop machines, laptops, public information systems
There may be specific ethical problems
– How can and should we monitor everyday activity?

5 What do you want to know?
Usability
– Well-tested methods:
  Expert evaluations (e.g. heuristic evaluation and cognitive walkthrough)
  Lab-based comparisons
Usefulness
– Hard: depends on the educational aims and context
– Field-based interviews, observations and walk-throughs
– Ethnographic analysis
– Critical incident studies (including focus group replay)
Learning outcome measures (see the analysis sketch after this slide)
– Control group
– Pre-test, intervention, post-test, delayed post-test
Logbooks and diaries
– Logbooks of activity
– Diary plus diary-interview, used successfully by Vavoula and others for intensive study of everyday learning over time
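Where the slide lists control-group and pre/post-test designs, here is a minimal sketch of how the resulting gain scores might be compared. The scores, group sizes and the choice of a t-test are illustrative assumptions, not data or methods from any study cited in this deck:

```python
# Sketch: comparing learning gains between an intervention group
# (used the mobile tool) and a control group. All scores illustrative.
from scipy import stats

pre_intervention  = [42, 55, 48, 60, 51, 47]   # pre-test scores (%)
post_intervention = [68, 74, 70, 81, 72, 69]   # post-test scores (%)
pre_control       = [45, 52, 49, 58, 50, 46]
post_control      = [55, 60, 57, 66, 58, 54]

def gains(pre, post):
    """Raw learning gain per student (post-test minus pre-test)."""
    return [b - a for a, b in zip(pre, post)]

g_int = gains(pre_intervention, post_intervention)
g_ctl = gains(pre_control, post_control)

# Independent-samples t-test on the gain scores: did the intervention
# group improve significantly more than the control group? A delayed
# post-test would be analysed the same way, against the same pre-test.
t, p = stats.ttest_ind(g_int, g_ctl)
print(f"mean gain (intervention) = {sum(g_int) / len(g_int):.1f}")
print(f"mean gain (control)      = {sum(g_ctl) / len(g_ctl):.1f}")
print(f"t = {t:.2f}, p = {p:.3f}")
```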

6 Some evaluation methods (contd.)
Usefulness (contd.)
– Other feedback methods:
  Telephone probes
  Snap polls
  Interviews
  Focus groups
– Automatic logging (see the sketch after this slide):
  Recording where, when and how a mobile device is used
  Quantitative analysis of student learning actions (Trinder et al., 2005)
Learning outcome measures
– Control group
– Pre-test, intervention, post-test, delayed post-test
Attitude
– Attitude surveys:
  General attitude surveys are of little use: almost all innovations are rated between 3.5 and 4.5 on a 5-point Likert scale
  Specific questions can indicate issues (e.g. interface problems)
– Microsoft Desirability Toolkit:
  Users indicate their attitudes through their choice of cards
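A minimal sketch of the kind of automatic usage logging described above, recording where, when and how a device is used. The event fields, the JSON-lines store and the function name are assumptions for illustration, not any real toolkit's API:

```python
# Sketch: automatic usage logging on a mobile device.
import json
from datetime import datetime, timezone
from typing import Optional

LOG_PATH = "usage_log.jsonl"  # hypothetical on-device log file

def log_event(app: str, action: str, location: Optional[str] = None) -> None:
    """Append one timestamped usage event to the device log."""
    event = {
        "when": datetime.now(timezone.utc).isoformat(),  # when it was used
        "app": app,                                      # how: which tool
        "action": action,                                # how: what was done
        "where": location,                               # where, if known
    }
    with open(LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")

# Events like these could later be analysed quantitatively,
# e.g. counts of actions per tool, per location or per time of day:
log_event("web_browser", "page_view", location="library")
log_event("concept_mapper", "node_added", location="home")
```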

7 Case studies
Student Learning Organiser
– Long-term learning
MyArtSpace
– Learning across contexts
PI: Personal Inquiry
– Ethics

8 Interactive Logbook project
Corlett, D., Sharples, M., Chan, T. & Bull, S. (2005) Evaluation of a Mobile Learning Organiser for University Students. Journal of Computer Assisted Learning, 21, pp. 162-170.
17 MSc students, University of Birmingham, academic year 2002-3
Loaned an iPAQ with wireless LAN for personal use
Learning organiser:
– Time manager
– Course manager
– Communications
– Concept mapper
Standard tools:
– Email
– Instant messenger
– Web browsing
Free to download further software from the web

9 Evaluation methods
Questionnaires
– Administered at 1, 4 and 16 weeks, and at 10 months
Focus groups
– Following each of the questionnaires
Logbooks (see the aggregation sketch after this slide)
– Students kept logbooks for six weeks, covering:
  Students' attitudes towards the learning organiser
  Patterns of usage of the various applications (including any they had downloaded themselves)
  Patterns of usage of the technology, particularly with respect to wireless connectivity
  Ease-of-use issues
  Issues relating to institutional support for mobile learning devices
Videoed interactions
– To compare the concept map tools, three students were videoed carrying out an exercise, on which they later commented after reviewing the video
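A sketch of how logbook entries of this kind might be aggregated into the patterns of use reported on the following slides. The entry schema is an assumption; the entries shown are illustrative, not study data:

```python
# Sketch: aggregating logbook/diary entries into patterns of use
# by location and by application.
from collections import Counter

entries = [  # illustrative logbook entries
    {"week": 1, "location": "home",       "app": "email"},
    {"week": 1, "location": "department", "app": "timetable"},
    {"week": 2, "location": "travelling", "app": "media_player"},
    {"week": 2, "location": "department", "app": "web_browser"},
]

by_location = Counter(e["location"] for e in entries)
by_app = Counter(e["app"] for e in entries)
print("use by location:   ", by_location.most_common())
print("use by application:", by_app.most_common())
```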

10 Data
Usability
– Size, memory, battery life, speed, software usability, integration
Usefulness
– Of PDAs
– Of the Learning Organiser
– Of the concept mapping tools
Patterns of use
– Locations
– Changes over time

11 Frequency of use

12 Use of PDA in specific locations
Rank order of use for coursework (rank for other activities in brackets)

Location                  4 weeks    16 weeks   10 months
Home                      1= (1)     2 (1)      2 (1)
Department                1= (2)     1 (2)      1 (3)
University (elsewhere)    3 (4)      4 (4)      3 (4)
Travelling                4 (3)      3 (3)      4 (2)

13 Perceived usefulness of tools
Percentage (number) of students rating each tool "useful" or "very useful"

Tool                      4 weeks    16 weeks   10 months
Timetable                 59% (10)   64% (9)    82% (14)
Web browser               65% (11)   64% (9)    71% (12)
Instant messaging         59% (10)   50% (7)    71% (12)
Email                     76% (13)   79% (11)   65% (11)
Course materials          59% (10)   43% (6)    41% (7)
Supplementary materials   53% (9)    43% (6)    24% (4)
Concept mapper            35% (5)    14% (2)    0% (0)

14 Perceived impact on activities
Number of students naming each tool as having the greatest impact

Learning: Course materials (6), Browser (3), Timetable and deadlines (2), Writing/note taking (1), Calendar (1), Reader (1)
Personal organisation: Timetable and deadlines (6), Calendar (5), Writing/note taking (2), Email (2), Task manager (1)
Entertainment: Media player (7), Games (3), Messenger (2), Browser (1), Writing/note taking (1)

15 Results
Some usability problems, especially battery life
Most use of calendar, timetable and communications
PDA-optimised content was well used
Importance of connectivity
No clear demand for a specific "student learning organiser"
Concept mapping tools were not widely used
Not generally used while travelling
Ownership is important
Need for institutional support

16 MyArtSpace
Service on mobile phones for enquiry-led museum learning
Aim: to make school museum visits more engaging and educational
Students create their own interpretation of a museum visit, which they explore back in the classroom
Learning through structured enquiry and exploration
Museum test sites:
– Urbis (Manchester)
– The D-Day Museum (Portsmouth)
– The Study Gallery of Modern Art (Poole)
About 3000 children during 2006

17 How it works
In class before the visit, the teacher sets an inquiry topic
At the museum, children are loaned multimedia phones
Exhibits in the museum have two-letter codes printed beside them
Children can use the phone to:
– Type the code to 'collect' an object and see a presentation about it (see the sketch after this slide)
– Record sounds
– Take photos
– Make notes
– See who else has 'collected' the object
All the information collected or created is sent automatically to a personal website, which shows a list of the items
The website provides a record of the child's interpretation of the visit
In class after the visit, the children share the collected and recorded items and make them into presentations
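A sketch of the collect-by-code flow just described. The exhibit-code table, upload URL and payload fields are hypothetical illustrations, not the actual MyArtSpace service:

```python
# Sketch: a child types a two-letter code printed beside an exhibit;
# the phone looks it up and sends the collected item to the child's
# personal website. All names and the endpoint are hypothetical.
import json
import urllib.request
from datetime import datetime, timezone

EXHIBIT_CODES = {           # two-letter codes printed beside exhibits
    "dd": "D-Day landing craft",
    "sp": "Spitfire engine",
}
COLLECT_URL = "https://example.org/myartspace/collect"  # hypothetical

def collect(student_id: str, code: str) -> None:
    """Look up an exhibit code and post the collected item."""
    exhibit = EXHIBIT_CODES.get(code.lower())
    if exhibit is None:
        print("Unknown code, please try again")
        return
    item = {
        "student": student_id,
        "exhibit": exhibit,
        "collected_at": datetime.now(timezone.utc).isoformat(),
    }
    req = urllib.request.Request(
        COLLECT_URL,
        data=json.dumps(item).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # item then appears on the personal website

collect("student42", "dd")
```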

18 Lifecycle evaluation
Micro level: usability issues
– Technology usability
– Individual and group activities
Meso level: educational issues
– Learning experience as a whole
– Classroom-museum-home continuity
– Critical incidents: learning breakthroughs and breakdowns
Macro level: organisational issues
– Effect on the educational practice for school museum visits
– Emergence of new practices
– Take-up and sustainability

19 Evaluation
At each level:
Step 1 – what was supposed to happen
– Pre-interviews with stakeholders (teachers, students, museum educators)
– Documents provided to support the visits
Step 2 – what actually happened
– Observer logs
– Post-visit focus groups
– Analysis of video diaries
Step 3 – differences between Steps 1 and 2
– Reflective interviews with stakeholders
– Critical incident analysis

20 Three levels, in three stages, throughout the project
[Diagram: the three levels (micro, meso, macro) mapped against three stages (design, implement, deploy). The technology must be robust enough to support a full user trial, and the service deployed long enough to assess impact.]

21 Summary of results
The technology worked
– Photos, information on exhibits, notes, automatic sending to the website
Minor usability problems
Students liked the 'cool' technology
Students enjoyed the experience more than their previous museum visit
Students indicated that the phones made the visit more interactive
Teachers were pleased that students engaged with the inquiry learning task

22 Usability issues
+ Appropriate form factor
+ Device is a mobile phone, not a typical handheld museum guide
+ Collecting and creating items was an easy and natural process
– Mobile phone connection
– Text annotations
– Integration of the website with commercial software, e.g. PowerPoint

23 Educational issues
+ Supports curriculum topics in literacy and media studies
+ Encourages meaningful and enjoyable pre- and post-visit lessons
+ Encourages children to make active choices in what is normally a passive experience
– Teacher preparation: the teacher needs to understand the experience and run an appropriate pre-visit lesson
– Where to impose constraints: structure and restrict the collecting activity, or let children learn from organising the material back in the classroom
– Support for collaborative learning: "X has also collected" wasn't successful

24 Organisational issues
+ Museum appeal: attracting secondary schools to the museum
+ Student engagement: students spent longer on a MyArtSpace visit (90 mins compared to 20 mins)
+ Museum accessibility: ability to engage with museum content after the visit
– Museum staff engagement: burden on museum staff
– Business model: maintenance of phones, data charges, competition with other museum media

25 PI: Personal Inquiry
3-year project between Nottingham and the Open University
Support for inquiry science learning between formal and informal settings, at Key Stage 3
School for introducing and framing issues, and planning inquiries
Outside locations, the home and science centres for semi-structured investigations

26 PI ethics: general issues
Participatory design
– All participants will be willing volunteers
– Kept fully informed of the purpose
– Active participants in the design and evaluation
Permissions
– From the children, teachers and parents
– Studies in the home will be with the signed informed consent of all target children and their parents
– Other children in the family will be asked for their assent
– Project staff will be subject to enhanced CRB checks
– Researchers will not go unaccompanied into homes
Confidentiality
– All data will be anonymised (see the sketch after this slide)
– Participants and their schools will not be identified in publications or presentations (unless they wish to be)
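A minimal sketch of one way the anonymisation commitment could be met, pseudonymising participant identifiers with a salted hash before analysis or publication. The salt handling and naming scheme are assumptions, not the project's actual procedure:

```python
# Sketch: pseudonymising participant identifiers so that students
# and schools cannot be identified from published data.
import hashlib

SALT = b"project-secret-salt"  # would be stored securely, never published

def pseudonym(participant_id: str) -> str:
    """Stable, non-reversible pseudonym for a participant."""
    digest = hashlib.sha256(SALT + participant_id.encode("utf-8"))
    return "P" + digest.hexdigest()[:8]

# The same input always maps to the same pseudonym, so data from
# different sessions can still be linked without revealing identity:
print(pseudonym("jane.smith@school.example"))
```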

27 PI ethics: specific issues
Monitoring
– Children will be using the technology as part of their curriculum work, so teachers should be able to monitor the online activities as they occur and to inspect all the collected data
– Children will be fully informed about how their learning activities outside the classroom may be monitored by teachers and researchers
– Children will be able to decide where and when to collect data
– The system will not continuously monitor movement and activity; it will only log actions and data explicitly entered by the children (see the sketch after this slide)
Ownership of data, privacy and copyright
– All data collected will be subject to the provisions of the Data Protection Act 1998, in particular Section 33 relating to data collected for the purposes of research
– Material captured or created by the children will be subject to normal standards of copyright and fair use, and inappropriate material will be deleted
– Authors of teaching materials and field data will retain copyright and moral rights of authorship over their material
– A condition of participation will be that the project has the right to publish the material for academic and educational purposes (either crediting the authors or anonymising the material, where appropriate and by agreement)
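A sketch of the stated logging policy: no background tracking of movement or activity, only actions and data the children explicitly enter. All names and fields are illustrative assumptions:

```python
# Sketch: the only entry point into the log is an explicit user
# action; there is no background timer or location polling anywhere
# in this design.
from datetime import datetime, timezone
from typing import List, Dict

event_log: List[Dict] = []

def record_explicit_action(child_id: str, kind: str, payload: str) -> None:
    """Log only actions the child deliberately performs."""
    event_log.append({
        "who": child_id,   # pseudonymised before analysis
        "what": kind,      # e.g. 'note', 'photo', 'measurement'
        "data": payload,
        "at": datetime.now(timezone.utc).isoformat(),
    })

# Example: a child chooses to record a field measurement.
record_explicit_action("P3fa1c2d9", "note", "pond water pH = 6.8")
```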

28 Summary of methods
Interactive Logbook
– Usability: videoed interactions with comparative systems and reflective discussion
– Usefulness: questionnaires, focus groups, user logbooks
– Attitude: questionnaires
MyArtSpace
– Usability: heuristic evaluation
– Usefulness: structured interviews with stakeholders; videotaped observations and notes; critical incident analysis; focus group interviews with learners to discuss incidents
– Attitude: interviews with stakeholders
PI: Personal Inquiry
– Still to be determined, but will include: stakeholder panels, videotaped observations and critical incident analysis, and comparative tests of learning process and outcomes for selected tasks

