Piloting CAA: All Aboard Gavin Sim & Phil Holifield.


2 Overview
- Introduction: CAA at UCLAN
- Key Challenges
- Staff Uptake
- Framework
- Staff Development
- Students
- Other Stakeholders
- Conclusions and Discussion

3 Introduction
- Teaching and Learning strategy incorporated e-learning, mainly content development
- First summative CAA test: summer 2003, delivered in WebCT
- TRIADS and Questionmark evaluated for the pilot study
- Questionmark adopted: felt easier for staff to develop their own questions

4 Introduction
- Technical infrastructure analysed; concerns:
  - Scalability: expansion over time
  - Connectivity: internal and external colleges
  - Bandwidth: 10 Mbps available (a concern for multimedia)
- Purchased a dedicated server to host Questionmark, Internet Information Server and SQL Server
- Integration with other systems was a concern, but addressed later
- Piloting within the Department of Computing

5 Key Challenges
- Encouraging staff uptake
- Staff development
- Stakeholder acceptance (e.g. management)
- CAA's perceived ability to test a range of cognitive skills
- Practical issues: labs

6 Methodology
- Staff questionnaire: n=34, response rate 64%
  - Views in relation to CAA, support and training
- Framework developed based on Bloom's Taxonomy: 6 staff, 8 modules
- Student questionnaire: n=86, response rate 94%
  - Acceptance of the technique
  - Question styles
  - Language used
  - Usability

7 Staff Uptake
- Computing encompasses a range of subjects, from the technical (networking) to the subjective (HCI)
- CAA may lend itself more readily to assessment in specific disciplines
- Questionnaire revealed only five members of staff had used CAA, 3 actively using it
- To encourage uptake, CAA is being incorporated into the department's strategy: formative CAA for all Level 1 modules, with summative use optional

8 Staff Uptake
- Five staff now using CAA within the department
- Questionnaire revealed:
  - 91% use CAA for formative assessment
  - 56% for summative assessment
- The difference could be attributed to the level the lecturer teaches

9 Framework
- Analysing the structure of a module to identify how CAA could be incorporated
- [Diagram relating Bloom's Taxonomy, Learning Outcomes, the Syllabus and the assessment format (CAA or other)]

10 Framework
Number of learning outcomes at each level of Bloom's Taxonomy.
[Table: rows are the Bloom levels Knowledge, Comprehension, Application, Analysis, Synthesis and Evaluation; columns are the modules CO1652, CO1802, CO1804 (Level 1), CO2751, CO2752, CO2601 (Level 2), CO3707 (Level 3) and CO4707 (Level 4). The per-cell counts are not legible in the transcript.]

11 Framework
- The number of learning outcomes varies between modules, from 3 to 8
- Level 1 modules sit at the lower cognitive levels
- Level 2 module CO2601 (Technical Solutions and Business) requires students to demonstrate ability similar to that found on CO3707
- Next step: identify elements of the syllabus and their relationship to the learning outcomes
- This prevents unrelated content being integrated into the exam
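The tagging step behind this framework can be sketched as a small data structure. The module code below is from the slides, but the outcome-to-level assignments are invented purely to show the shape of the data, not the real CO3707 mapping.

```python
# Bloom's six levels, in order.
BLOOM_LEVELS = ["Knowledge", "Comprehension", "Application",
                "Analysis", "Synthesis", "Evaluation"]

# Hypothetical tagging of CO3707's learning outcomes (A-D) with Bloom
# levels; illustrative values only.
co3707_outcomes = {
    "A": "Evaluation",
    "B": "Analysis",
    "C": "Synthesis",
    "D": "Evaluation",
}

def outcomes_per_level(outcomes):
    """Count learning outcomes at each Bloom level (one column of the
    'learning outcomes per level' table)."""
    counts = {level: 0 for level in BLOOM_LEVELS}
    for level in outcomes.values():
        counts[level] += 1
    return counts
```

Summing such counts per module reproduces one column of the table on the previous slide.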

12 Framework
Example for CO3707: identify the parts of the syllabus that relate to the learning outcomes (A-D).
- 1 Consideration of primary users: A, B, C, D
- 2 Introduction to Multimedia: two of the four outcomes
- 3 Introduction to human systems: A, B, C, D
- 4 Multimedia Technology: three of the four outcomes
- 9 Importance of evaluation and choice of metrics: one outcome
(The exact outcome assignments for the partial rows, and rows 5-8, are not legible in the transcript.)
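The "prevent unrelated content entering the exam" check can be sketched the same way: link each syllabus element to the learning outcomes it supports and flag any element linked to none. The mappings below are illustrative, not the real CO3707 data.

```python
# Each syllabus element mapped to the learning outcomes (A-D) it supports.
# Illustrative data; the final element is a hypothetical example of
# content unrelated to any outcome.
syllabus_map = {
    "Consideration of primary users": {"A", "B", "C", "D"},
    "Introduction to human systems": {"A", "B", "C", "D"},
    "Multimedia Technology": {"A", "B", "C"},
    "Departmental history": set(),
}

def unrelated_elements(mapping):
    """Syllabus elements that relate to no learning outcome, and so
    should be kept out of the exam."""
    return [element for element, outcomes in mapping.items() if not outcomes]
```

Running the check over a module's full mapping gives the lecturer a quick audit before question writing begins.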

13 Framework
Number of syllabus elements at each level of Bloom's Taxonomy.
[Table: same layout as the learning-outcomes table, with rows for the Bloom levels and columns for the modules CO1652, CO1802, CO1804 (Level 1), CO2751, CO2752, CO2601 (Level 2), CO3707 (Level 3) and CO4707 (Level 4). The per-cell counts are not legible in the transcript.]

14 Framework
- Will be used on the MSc Web Development module
- The module is assessed entirely by coursework
- Formative test in the first semester:
  - Enables students to gain early feedback
  - Lecturer obtains an early indication of their progress
- The framework shows how staff can integrate CAA into modules, but further development is necessary

15 Staff Development
- Staff were asked: 'Would you be prepared to input the questions into the software yourself?'
  - 80% Yes
  - May not reflect the attitude of staff in other departments

16 Staff Development
- 74% felt lecturers need support in question design
- LDU organised staff development in CAA:
  - An Introduction to Computer Assisted Assessment
- CIF bid for funding to pay a developer to work with staff on developing multimedia questions
- 81% felt more time is required to write questions
  - Question banks and experience should reduce this time
- 61% of lecturers felt help with invigilation was essential

17 Staff Development
- Informal focus groups to discuss problems and share experiences:
  - How to accommodate students with special needs
  - Invigilation issues
  - Risk issues, e.g. server failure
- Without this, students' experience may differ from module to module

18 Students
- Attitude measured through a series of questionnaires
- Students were asked: 'Would you find this format of assessment an acceptable replacement for part of your final exam?'
- 5-point Likert scale: Strongly Disagree = 0, Strongly Agree = 4
- Mean = 2.9, SD = 0.9, 99% confidence interval ± 0.26
- Indicates a reasonable level of support

19 Students
- Research into computer anxiety and CAA (Liu et al. 2001; Zakrzewski & Steven 2000)
- Concern: students had no prior experience of Questionmark
- 'This format of assessment is more stressful than a paper-based test'
- Mean = 0.99, SD = 0.987, confidence interval ± 0.28
- Comments:
  - 'I prefer completing a test in this way as it is less intimidating'
  - 'As a computer geek I feel more at ease in front of a computer.' (final exam)
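The confidence intervals on these two Likert items can be reproduced with a one-line calculation. A sample size of n = 86 is assumed from the response rate reported in the methodology, and the t critical value is an approximation for a two-sided 99% interval.

```python
import math

def ci_half_width(sd, n, t_crit=2.635):
    """Half-width of a 99% confidence interval for a sample mean.
    t_crit ~ 2.635 is the two-sided 99% critical value for df = 85."""
    return t_crit * sd / math.sqrt(n)

acceptability = ci_half_width(0.9, 86)    # acceptability item, about 0.26
stressfulness = ci_half_width(0.987, 86)  # stress item, about 0.28
```

Both results match the intervals quoted on the slides, which supports reading n = 86 as the sample for these items.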

20 Students
- 'Did you have any difficulties accessing the test?'
  - 14% Yes
- Majority of problems: copying a password that included white space
- The software could trim white space
- Authentication could instead be achieved through an LDAP process
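The white-space fix suggested above is straightforward to sketch. Here `check_credentials` is a hypothetical stand-in for the real authentication call (for example an LDAP bind); it is not part of Questionmark's API.

```python
def normalise_password(raw: str) -> str:
    # Strip the leading/trailing white space students pick up when
    # copy-pasting a password from an email or handout.
    return raw.strip()

def login(username: str, pasted_password: str, check_credentials) -> bool:
    # Trim before checking, so a copied "  abc123 \n" still authenticates.
    return check_credentials(username.strip(),
                             normalise_password(pasted_password))
```

Trimming only the surrounding white space removes the most common copy-paste failure without altering the password itself.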

21 Students
- Questionmark used question-by-question delivery with standard templates
- The suitability of a number of templates is questionable, e.g. scrolling, navigation
- Idea: create a template bank

22 Students
Series of questions relating to the interface, each rated for mean and standard deviation (the values are not legible in the transcript):
- The test was easy to use
- It is easy to read the characters on the screen
- The screen layout is clear
- The screen layout is consistent
- The navigation was clear
- I always know where I am in the software
- The button location is consistent
- The order of the navigation buttons is logical
- The button names are meaningful
- The on-screen navigation is easily distinguished from the questions

23 Students
- 81 students completed the questionnaire; 30 provided qualitative feedback
- Requested a facility to go directly back to the previous question (mentioned 11 times)
- The 'Proceed' button was felt to be inappropriately placed near the main navigation
- These features will be incorporated into a forthcoming test, and further analysis will be conducted

24 Other Stakeholders
- Information Systems Services and management are kept informed through a steering committee
- The committee is responsible for reporting the findings of the evaluation for institution-wide deployment
- Without the support of management, additional resources will not be made available

25 Conclusions and Discussion
- Scepticism about CAA's appropriateness at Levels 3 and 4 for summative assessment
- The framework showed how it may be incorporated; further research is required
- Adopting CAA into the department's strategy increased uptake, but staff development is necessary
- Students responded positively to the experience
- The logging-in process could be improved
- A comparison of WebCT and Questionmark is planned

26 Questions