THE SIMBASE PROJECT A SUMMARY James Ansell

OVERVIEW OF PROJECT: 6 STAGES  WP1 MANAGEMENT  WP2 DEVELOPMENT OF IAM  WP3 PILOT OF ICT-BASED SIM ADOPTION  WP4 IMPACT MAXIMISATION  WP5 GENERAL DISSEMINATION  WP6 QUALITY AND EVALUATION

WP2 IMPACT ASSESSMENT MODEL AUTHORS: Ulf-Daniel Ehlers, Tatiana Shamarina-Heidenreich. 41 pages long!!! The model was designed according to 4 cornerstones: 1. Identifying criteria to describe the impact of simulation training 2. Identifying success factors and indicators of these 3. Defining the learning phases to follow during modelling activities 4. Selecting a set of standards MUST HAVE A STEP-BY-STEP DESCRIPTION OF HOW TO USE IT

WP2 IMPACT ASSESSMENT MODEL 1. Identifying criteria to describe the impact of simulation training* 2. Identifying success factors and indicators of these 3. Defining the learning phases to follow during modelling activities 4. Selecting a set of standards *From the literature, partners' practice and the requirements of the model (from these they conclude that the IAM has to represent the teaching and learning process as a flow of phases). From this, the PRIME model of learning has been adapted.

WP2 IMPACT ASSESSMENT MODEL  They conclude that: “The adoption of the IAM can improve current organisational practices and support innovation in the health care sector” “Also, model can be used to understand how medical students and employees develop and exploit their competences”

WP2 IMPACT ASSESSMENT MODEL  This 1st IAM will be refined based on validation results  Objectives of this document:  Define SBT and methods and chose one model for SIMBASE  Analyse parameters which influence success / failure  Define methodology to analyse the way SIM contributes to the learning process  Define quality of learning process

WP2 IMPACT ASSESSMENT MODEL  Current methods available to analyse SIM training  Kilpatrick model  Miller model  PRIME model  Process model ISO/IEC model  Portfolio methods  Assessment methods

WP2 IMPACT ASSESSMENT MODEL Kirkpatrick model  Assesses the response to and impact of an educational exercise Miller model  Pyramid of competence, useful for mapping assessment methods PRIME model  The one that they have followed most closely

WP2 IMPACT ASSESSMENT MODEL  PRIME MODEL  Levels 1-7  1 = Participants  2 = Assessment of educational activities  3 = Knowledge acquisition and attitudinal change  4 = Competence  5 = Performance  6 = Patient health  7 = Community health
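As an illustrative sketch only (not part of the SIMBASE materials), the seven PRIME levels above can be held as a simple lookup table, e.g. for tagging evaluation questions by the level they address:

```python
# Illustrative only: the seven PRIME evaluation levels as a lookup table.
PRIME_LEVELS = {
    1: "Participants",
    2: "Assessment of educational activities",
    3: "Knowledge acquisition and attitudinal change",
    4: "Competence",
    5: "Performance",
    6: "Patient health",
    7: "Community health",
}

def describe_level(level: int) -> str:
    """Return the PRIME outcome name for a level, validating the 1-7 range."""
    if level not in PRIME_LEVELS:
        raise ValueError(f"PRIME levels run 1-7, got {level}")
    return PRIME_LEVELS[level]

print(describe_level(4))  # Competence
```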

 Process model  To establish the details of quality development  The model can be used for the design and structure of training  Can be divided into:  Creating a context-specific quality profile  Specifying the individual process descriptions

WP2 IMPACT ASSESSMENT MODEL  They use this evidence to present 2 models for simulation  Model 1 – Process model, to understand how to plan  Model 2 – PRIME model, how to characterize competency and show different levels of learning different project participants can how Adapted to the aims of the SIMBASE project

WP2 IMPACT ASSESSMENT MODEL  PROCESS adapted model Planning phase Design Phase Implementation phase Evaluation

WP2 IMPACT ASSESSMENT MODEL  PRIME adapted model

WP3 PILOTING GUIDE  Outlines how the IAM can be implemented (43 pages!!)  Aims of WP3  Test IAM as a strategy design and assessment instrument  Define patient simulation training strategies  Implement priority actions from these  Access impact of actions on health care systems

WP3 PILOTING GUIDE  Methodology and timetables for pilots  Tools for recording the results of pilots  Types and methods for analysis  Guidelines for piloting co-ordinators in each location  Detailed success criteria for determining models suitability

WP3 PILOTING GUIDE  BEFORE (Planning & design phase of process model)  DURING (3 RD Phase of process model)  AND AFTER TRAINING

WP3 PILOTING GUIDE Before training:  Detect training requirements Appoint panel of experts (5-10) Arrange initial meeting Supply panel with details of discussion before meeting Panel during meeting to be given 5-15 questions Higher the number of participants the fewer the questions

WP3 PILOTING GUIDE Before training: Example questions include: “In your opinion, what are the main problems with…?” “If you had to select three aspects that urgently require improvement, what would they be?” “What, in your opinion, is the principal requirement of the patients who require to…?”

WP3 PILOTING GUIDE Before training:  Panel should be experts /have extensive knowledge of subject  The panel should contain 2 different profiles:  Scientific technical profile  Administrative profile  Information during meeting should be recorded on audio/video  Results of data gathering should be sent to panel (APPENDIX 1)

WP3 PILOTING GUIDE Before training: Identifying competencies “Once the area that needs improving has been defined, we need to identify the competencies and good practices that will allow improvement”

WP3 PILOTING GUIDE Before training:  In order to allow this they recommend:  Semi structured interviews with experts (APPENDIX 2)  Duration should be 30mins-1hr  Minimum number of interviews is 5 (reduce false –ve)

WP3 PILOTING GUIDE Before training: Selecting the competencies that can be trained & the methodology used To select competencies, the following need to be considered: Necessity Impact on health Impact on patient safety Response to simulation-based training Cost-effectiveness of training

WP3 PILOTING GUIDE  To do this need another panel!! (can use previous one)  Identify competencies that can be trained, tools/methods A maximum of one month should elapse between detection of first training requirements and the formation of this panel  Set up of panel numbers etc same as before

WP3 PILOTING GUIDE  Selecting the most suitable methodology and training scenarios  Should consider:  Cost  Applicability  Reduction of learning curve  Allow the evaluation of the acquisition of competencies  High degree of acceptance with trainee  Availability of resources

WP3 PILOTING GUIDE Acceptable types of training to evaluate: Face-to-face training (Role play, debate analysis, brainstorming, Didactic panels) E-training methodology Mixed methodology (face-to-face, e-training methodology) Staged simulation (using actors, clinical interviews) Virtual simulation Robotic simulation

WP3 PILOTING GUIDE  Designing training guide  Once we have detected the training requirements, selected the competencies to be trained, chosen the methodology and scenario the next stage is to: Design the training and produce the training guide

WP3 PILOTING GUIDE Design the training and produce the training guide  Panel should consider the following  Clear selection process of target students  Competencies to be trained  Requirements that will be covered  Final objectives  Methodologies used  Schedule  Evaluation system  Description of feedback procedure to training team  EXAMPLES IN APPENDIX 3 AND 4

WP3 PILOTING GUIDE  Selecting the training team CV screen (teaching ability, training skills etc) Personal interview Practical testing  Selecting the students need to consider:  Profile  Professional category  Previous experience  Ease of transferral  Heterogeneity  Need to evaluate prior knowledge & motivation of students (APPEN 9 & 10)

WP3 PILOTING GUIDE DURING THE TRAINING  Doesn't really give information about how to run an event (which WIMAT do all the time!)  No strict protocol to follow in this section, apart from: The trainers must:  Encourage processes of reflection  Obtain and transmit information  Use informal communication  Transmit support  Solve problems  Be familiar with a range of teaching methods

WP3 PILOTING GUIDE  Evaluations of motivation and expectation should be performed at the beginning and end of the training, to assess whether the teaching had a positive effect  Also a satisfaction survey for the trainers (APPENDIX 8)

WP3 PILOTING GUIDE AFTER THE TRAINING  Facilitate feedback channels using questionnaires (APPENDICES 6, 7 and 8) or virtual communities  Revise the training according to feedback  Accredit students with certification

WP3 PILOTING GUIDE AFTER THE TRAINING Pick one of the following:  Objective evaluation  Open-question test  On-going evaluation (APPENDIX 11)  In situ test  Evaluate the transfer of skills (recommended: self-questionnaires to participants)  Questionnaires should be completed within 2 weeks to 2 months (2/52 to 2/12)  IMPACT: to establish whether students motivate colleagues to come on the course
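The follow-up window above can be computed mechanically. This is an illustrative sketch under one assumption: "2/52 to 2/12" is read as 2 weeks to roughly 2 months (taken here as 60 days) after the training ends; the guide itself gives no exact day counts.

```python
from datetime import date, timedelta

# Illustrative assumption: the "2/52 to 2/12" window means 2 weeks to
# ~2 months (60 days) after training; exact day counts are not in the guide.
def followup_window(training_end: date) -> tuple:
    """Return (earliest, latest) dates for transfer-of-skills questionnaires."""
    return training_end + timedelta(weeks=2), training_end + timedelta(days=60)

start, end = followup_window(date(2011, 3, 1))
print(start, end)  # 2011-03-15 2011-04-30
```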

SUMMARY OF SKYPE MEETING  Stakeholder information to be completed  Before the Portugal meeting:  Familiarise with WP2 and WP3  Meeting to discuss the courses to pilot

QUESTIONS