How Do We Measure Student Achievement During Placements?
Margaret Fisher, Ceppl Activity Lead / Senior Lecturer in Midwifery / Academic Lead, Placement Development Team, University of Plymouth
www.placementlearning.org
Email: mfisher@plymouth.ac.uk
2nd DIETS Conference, 26/9/08
Introduction and overview
- The importance of placements and practice assessment
- Evidence from the literature
- Ceppl project: Assessment of Practice
- Application in Midwifery – an electronic portfolio
- Summary and questions
The importance of placements and practice assessment
- Real-life exposure – environment and role models
- Enables assessment of practice skills
- Variety of placements
- Methods of assessment need to be valid, reliable and appropriate
Evidence from the literature
Assessment of practice is crucial in determining whether or not a student meets the criteria required of their profession, thus ensuring the safety of the public (UKCC 1999; Watkins 2000; Cowburn et al 2000).
Defining competence has long been a challenge (Cowan et al 2005).
Efforts to 'measure' competence and professional abilities have resulted in a wide variety of assessment methods (Baume and Yorke 2002; McMullan et al 2003).
Unless outcomes are clear, the result may be that the student focuses too heavily on completing the portfolio [or other tool] rather than learning from the experience itself (Scholes et al 2004).
Reflections on practice may form part of portfolio assessments, and this process may also contribute to the student's learning (Mountford and Rogers 1996).
So, important factors to consider are:
- clear purpose and outcomes
- effective and objective measurement of competence
- contribution of the assessment process to students' learning
Ceppl project: Assessment of Practice
- Longitudinal case studies
- Staff focus groups
- Literature search
- Trawl of websites
- Conference networking
Aim
To establish an evidence-based set of key principles and resources to guide Assessment of Practice, relevant across professional boundaries.
Research Questions
1. What are perceptions of the validity and reliability of the practice assessment methods used?
2. What are perceptions of the impact of the practice assessment process on the student learning experience?
Methodology
- 14 participants from Midwifery, Social Work and Emergency Care programmes (nurses and paramedics)
- Semi-structured interviews at the end of each year
- Longitudinal case study approach
- Single-case and cross-case analysis and synthesis of findings, using the 'Framework' technique (Ritchie and Spencer 1994)
Key themes
Methods used
1. Portfolios
2. Reflections
3. Tripartites / 3-way meetings
4. Criterion-referenced assessments
5. Conversations
6. Observations
7. OSCEs
1. Portfolios
✓ Provide focus
✓ Evidence of capability/achievement
✓ Encourage students as they see their progress
✓ Self-directed
✓ Motivate learning
× Prescriptive/restrictive ("tick boxes")
× Weighting of marks unbalanced/difficult to assess
× Potential to "cheat the system"
× Bulky (paper format)
× Heavy workload
2. Reflections
✓ Aid and extend learning
✓ Enable development and growth
✓ May be reliable
× Do not always reflect the reality of practice
× Potential to "blur the edges"
× Don't necessarily gain from "ticking the boxes"
× May be unreliable
3. Tripartites / 3-way meetings
✓ Useful checkpoint
✓ Opportunity to reflect on progress and learning
✓ Opportunity to get feedback from mentor and tutor
✓ Enable clarification of issues
✓ Student-centred
✓ Reliable if student and mentor have worked closely together
× Difficult to arrange
× May be challenging to express conflicting opinions
× Likened to a "parent's evening"
× Some students thought mentor and tutor should also have a private discussion
4. Criterion-referenced assessment
✓ Focused learning
✓ Best if continuous assessment
✓ Mostly valid, reliable and achievable
× Criteria not always relevant to placement
× Some criteria ambiguous/overly complex/unclear
× Dependent on the professional judgement and experience of the mentor
5. Conversations
✓ Useful feedback
✓ Demonstrate communication skills
× Difficult to organise
× Caused anxiety
6. Observations
✓ Benefit from feedback from different people
✓ Assess attitudes to service-users
✓ Valid and reliable
× Did not always reflect real practice
× Difficult to arrange/heavy workload
× Restrictive
× Inconsistency of assessors
× Would prefer to be shadowed for a day
7. OSCEs (Objective Structured Clinical Examinations)
✓ Reflect real practice
✓ Provide focus
✓ Consistent
✓ Enjoyable
✓ Well prepared
✓ Put students' knowledge to use
✓ Huge impact on learning
✓ Useful/best way of assessing practice
× Pressurised/stressful
× False environment
× Not holistic
Application in Midwifery – an electronic portfolio
- Portfolio work-party
- Decision to develop a part-paper (summative) and part-electronic (formative/evidence of learning) portfolio
- E-portfolio developed → wikis
- Pilot study
- Demonstration
Key findings from the pilot
- Guidelines: very positive evaluation by all, but face-to-face explanation recommended in addition
- Hyperlinks: a logical system; tricky to begin with but became easier with use; particularly useful when making external links (eg to national guidelines)
"Hyperlinks are good as it shows evidence of learning" (S)
- Uncertain how readily accessible the e-portfolio would be in the clinical area
- Students liked the fact that the personal tutor would have access and provide formative feedback
- Variety of learning styles and IT skills amongst student respondents, but this did not appear to affect whether students were able to cope with the new format
Summary