WMO Competency Assessment Workshop for Aeronautical Meteorological Personnel
Russian State Hydrometeorological University, St. Petersburg, Russian Federation, June 18-22
Kent Johnson, CAeM Expert Team on Education, Training and Competencies, Meteorological Service of Canada
Workshop Goal
Develop a plan, tailored to your own met service, for implementation of competency assessment for aeronautical meteorological personnel by November 2013.
It is better to implement this year, to have time to address any deficiencies.
Workshop Facilitators
Kent Johnson, Meteorological Service of Canada
Paul Bugeac, ROMATSA, Romania
Maria Mamaeva, RSHU
Workshop Outline
Opening
Competencies for AMP
Competency Assessment Toolkit
Relationship to QMS
Modifying AMP competencies to fit requirements – making assumptions
Developing an implementation plan
Presentation of preliminary plans
Summary and evaluation
Workshop Format
Presentations from experts
Plenary or large-group discussions
Small working groups
Reports of group outcomes
Sharing of ideas and best practices
Participants' presentation of results
A few logistical details
Why are we here?
WMO and ICAO have adopted regulations requiring demonstration of competence of AMP.
All aeronautical met service providers must meet these competencies by 2013.
Teams of experts have taken the regulations and developed competency criteria.
These will be presented today by Chris Webster.
Why are we here?
A Competency Assessment Toolkit has been developed.
Learn from past workshops:
Nairobi (Sep 2010)
Barbados (Jul 2011)
Turkey (Sep 2011)
India (Oct 2011)
Hong Kong (Dec 2011)
This is an opportunity to test and implement the Toolkit.
The Toolkit will be discussed and studied in more detail tomorrow.
The Competency Assessment Toolkit (CAT) should be …
a recommended framework for meeting competency requirements
adaptable to any met service and to any AMF or AMO job
an important part of a quality management system (QMS)
a minimum competency level for all AMO and AMF
effective yet easy to implement
The Competency Assessment Toolkit
Developed by many experts from around the world.
Task Team (TT):
Kent Johnson (Canada)
Goama Ilboudo (ASECNA)
Nir Stav (Israel)
Paul Bugeac (Romania)
Michelle Hollister (Australia)
Many other contributors
Competence or performance criteria
Second-level criteria were introduced by Chris: 15 second-level criteria for AMF and 10 for AMO.
However, these must be further subdivided.
For example, “Forecast the following weather phenomena and parameters” could be considered as 9 different criteria:
temperature and humidity
wind, including temporal and spatial variability (wind shear, directional variability and gusts)
QNH
cloud (types, amounts, height of base and vertical extent)
precipitation (intensity, onset and duration, amount and types), and associated visibilities
fog or mist, including onset and duration, and associated reduced visibilities
other types of obscuration, including dust, smoke, haze, sand storms, dust storms, blowing snow, and associated visibilities
hazardous weather phenomena listed in Performance criterion 3.1
wake vortex advection and dissipation, as required
Competency Templates
TT-CAT started by defining a competency template for each performance criterion.
There are 40 templates for AMF and 20 for AMO (e.g. AMF wind, AMO visibility).
The first step in an implementation plan is to determine which criteria apply to your jobs.
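That first step can be sketched in code. This is a minimal illustration only; the template names, catalogue subsets, and job definitions below are hypothetical, not taken from the Toolkit: each job at a met service keeps only the competency templates that match its actual duties.

```python
# Hypothetical subsets of the 40 AMF and 20 AMO template catalogues.
AMF_TEMPLATES = {"wind", "visibility", "QNH", "cloud", "precipitation"}
AMO_TEMPLATES = {"wind", "visibility", "cloud", "present_weather"}

def applicable_templates(job_role, local_duties):
    """Return the competency templates relevant to one job at one met service."""
    catalogue = AMF_TEMPLATES if job_role == "AMF" else AMO_TEMPLATES
    # Intersection: only templates that are both in the catalogue
    # and part of this job's local duties apply.
    return catalogue & local_duties

# Example: a forecaster whose office does not issue QNH guidance.
print(sorted(applicable_templates("AMF", {"wind", "cloud", "precipitation"})))
# -> ['cloud', 'precipitation', 'wind']
```

The intersection makes the adaptation explicit: criteria outside a job's duties are dropped up front, and the assumptions behind that choice can be documented alongside the result.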
Templates are not Tools
A Toolkit requiring the use of up to 60 tools would be almost useless.
There is significant overlap; criteria can be assessed simultaneously.
Several tools have been developed to meet the performance criteria.
KNOW-TELL-DO
Knowledge does not imply competence.
Telling about “what you would do” is better.
Actually doing something well is the best evidence, but often not practical, such as for rare or seasonal events.
In many cases, telling “what I would do if …” is the most reasonable option; this is the experiential question technique.
KNOW-TELL-DO
Knowledge does not guarantee competence.
Competence today does not guarantee competence tomorrow.
No competency assessment technique is perfect.
Assumptions will always be needed; they must be documented and reviewed.
Types of Tools
direct observation (doing)
tests (knowing): traditional or multiple choice
experiential questions (telling): “what would you do if …?”
simulations (artificial doing)
portfolio evidence (doing in the past): “tell me about a time when …”
Experiential stories
Describe the steps that you would follow in forecasting mountain wave turbulence (for a less experienced or new forecaster), OR
Describe a time when you effectively forecast mountain wave turbulence (for a more experienced forecaster).
Mountain Wave Turbulence
A response could include:
examine the large-scale flow pattern, particularly near ridge-top level
look for flow perpendicular to the barrier
check a representative sounding (real or prog) for stability, shear, wind reversal, etc.
use satellite imagery or other tools
examine aircraft reports
consider the need for a warning or SIGMET
attempt to make the prediction as precise as possible (smallest 3-D extent)
consider impacted clients and contact them directly if part of local procedures
Competency Matrix
A simple way to visualise performance criteria and tools.
Each tool satisfies one or more criteria.
Shows that there are options for assessing some criteria.
Serves as a toolbox guide.
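The matrix idea can also be sketched as a small data structure. In this hedged illustration the tool names and criterion numbers are hypothetical, not the Toolkit's own: each tool maps to the criteria it can assess, inverting the map shows the assessment options per criterion, and a coverage check flags criteria no tool reaches.

```python
# Hypothetical competency matrix: tool -> set of performance criteria it assesses.
MATRIX = {
    "direct observation":     {"1.1", "1.2", "3.1"},
    "experiential questions": {"3.1", "3.2"},
    "simulation":             {"1.2", "3.2", "4.1"},
}

def tools_per_criterion(matrix):
    """Invert the matrix: criterion -> tools that can assess it."""
    inverted = {}
    for tool, criteria in matrix.items():
        for criterion in criteria:
            inverted.setdefault(criterion, set()).add(tool)
    return inverted

def uncovered(matrix, required):
    """Criteria in `required` that no tool in the matrix assesses."""
    covered = set().union(*matrix.values())
    return required - covered

print(sorted(tools_per_criterion(MATRIX)["3.1"]))
# -> ['direct observation', 'experiential questions']
print(uncovered(MATRIX, {"1.1", "5.1"}))
# -> {'5.1'}
```

The inverted view mirrors what the slide calls a "toolbox guide": for each criterion it lists the available tools, and the coverage check makes any assessment gap explicit before the plan is finalised.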
Later … work in small groups
Look at the Toolkit and the second-level competencies.
Answer two questions:
What processes already exist which could contribute to the Toolkit?
As of today, what seems to be the most difficult aspect of the Toolkit?
Be prepared to share group ideas.