Integrating Needs Analysis, Assessment & Evaluation of Training

1 Integrating Needs Analysis, Assessment & Evaluation of Training
Catherina Opperman & Marius Meyer 2009

2 FOCUS
[Diagram: Needs Analysis, Assessment and Evaluation positioned within the Design and Training cycle, with a Skills Audit feeding in.]

3 STRATEGIC LEARNING ALIGNED TO TRAINING
Identify skills gaps:
- Current skills that are lacking or need further development
- Future skills that the organisation will need in the next two to five years

4 STRATEGIC LEARNING ALIGNED TO TRAINING
Skills gaps & training needs:
- Corporate training needs: apply across the entire organisation, e.g. communication skills
- Job-specific training needs: specific to a particular area or position, e.g. product knowledge or marketing skills

5 STRATEGIC LEARNING ALIGNED TO TRAINING
Identify strategic learning objectives from:
- The Business Plan
- The Workplace Skills Plan
- The Strategic HRD Plan
Ensure that all job titles are represented on the plan.

6 STRATEGIC LEARNING ALIGNED TO TRAINING
Record and categorise these needs in a Strategic Learning and Development Plan, for example:
- BS: Business-specific, e.g. product knowledge
- IP: Interpersonal, e.g. communication skills
- IT: Computer skills, e.g. Excel
- ML: Management & leadership skills
A small data sketch of such a categorised plan follows below.
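Purely as an illustration (the category codes come from the slide; the record layout, job titles and field names are assumptions), such a plan could be captured as simple records:

```python
# Hypothetical sketch of a Strategic Learning and Development Plan.
# The category codes (BS, IP, IT, ML) come from the slide above;
# the record fields and sample entries are invented for illustration.
from dataclasses import dataclass

CATEGORIES = {
    "BS": "Business-specific",
    "IP": "Interpersonal",
    "IT": "Computer skills",
    "ML": "Management & leadership",
}

@dataclass
class TrainingNeed:
    job_title: str
    category: str  # one of the CATEGORIES keys
    description: str

plan = [
    TrainingNeed("Sales Consultant", "BS", "Product knowledge"),
    TrainingNeed("Administrator", "IT", "Excel"),
    TrainingNeed("Team Leader", "ML", "Leadership skills"),
]

# Summarise the plan per category for reporting.
for code, label in CATEGORIES.items():
    count = sum(1 for n in plan if n.category == code)
    print(f"{code} ({label}): {count} need(s)")
```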

7 MAIN ELEMENTS OF A JOB ANALYSIS
- Purpose of the job: job demands, responsibilities, accountability, activities, procedures, processes
- Tasks: how often they are performed, the sequence of tasks, the equipment needed, and how the tasks relate to other jobs

8 DEVELOP THE JOB PROFILE
Focus on the job position, not the employee.
Divide the position into the knowledge, skills & behaviours that are essential.

9 METHODS TO COLLECT DATA
- Job interviews
- Observing employees in the workplace
- Interviewing individuals
- Interviewing groups
- Interviewing supervisors & technical specialists
- Questionnaires
- Self-assessment questionnaires

10 JOB PROFILE ANALYSIS & PERFORMANCE MANAGEMENT
The next step is to link the job profile to performance management in order to assess specific competencies & address gaps that require training.
Identify gaps, assess & evaluate by linking output requirements to the individual's performance management plan.

11 TERMINOLOGY
Needs assessment covers several types of analysis, often referred to collectively as needs analysis:
- Performance analysis
- Gap analysis
- Target population analysis

12 Needs assessment refers to the three elements of organisational, person and task analysis.

13 DIFFERENT CATEGORIES OF TRAINING NEEDS
- Macro level: skills needs of a sector
- Meso level: needs of a single organisation
- Micro level: individual needs

14 PERFORMANCE GAPS
Gaps exist as a result of:
- Performance gaps in the way individuals work
- Management gaps in the way people are managed
- Organisation gaps in the manner in which organisations are designed

15 STAKEHOLDERS IN HRD & ASSESSMENT
- Government
- External management
- Training committee
- Trade unions
- ETD providers
- Critical interest groups
- Community groups
- Assessors
- Moderators
- Learners
- Mentors/coaches
- Customers
- Past participants
- Professional bodies
- SDF (skills development facilitator)
- Business partners
- Suppliers

16 THE NEEDS ANALYSIS PROCESS
Obtain data from two sources:
- The business plan: key strategies & broad competencies
- The skills of individual employees: assessing these against the business plan identifies gaps in performance

17 REPORT RESULTS AND MAKE RECOMMENDATIONS
Feedback will depend on the nature and extent of the issue/problem and can be given verbally or as a comprehensive document with charts & graphs presented to top management.
Recommendations should take into account:
- The training strategy, methods & costs
- The influence of culture & management buy-in to training

18 SELECT OR DESIGN A LEARNING INTERVENTION
If the recommendations are accepted, use the identified needs to select or design a learning intervention.
Learners need to be aware of the objectives of training as they relate to the purpose and expected outcomes of the learning activities:
- What they should know and be able to do
- The criteria that indicate an acceptable level of performance
- The criteria against which they will be assessed
- The scope of learning

19 DESIGNING A LEARNING PROGRAMME
Take the following into account:
- The long-term vision & mission of the organisation
- Strategic short-term goals & strategies
- The key skills & competencies needed to meet those goals
- The skills priorities of the organisation
- The objectives of learning

20 DESIGNING A LEARNING PROGRAMME
Also take into account:
- Learning outcomes
- Assessment instruments
- Assessment procedures
- The numeracy & literacy levels of learners

21 OTHER TYPES OF ANALYSIS
- Organisation or situation analysis
- Target population analysis
- Task analysis
- Skills & competency analysis
- Skills audit / skills gap analysis (SGA)

22 OTHER TYPES OF ANALYSIS: TASK ANALYSIS
A task analysis form covers five levels of function (illustrated in the sketch below):
1. Job title
2. Specific duty
3. Tasks under each duty
4. Sub-tasks
5. Knowledge
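As a sketch only (the slide specifies just the five levels; the sample job, duty and tasks are invented), the hierarchy could be recorded as a nested structure:

```python
# Hypothetical task analysis record showing the five levels of function
# named on the slide; the sample job and its contents are invented.
task_analysis = {
    "job_title": "Sales Consultant",
    "duties": [
        {
            "specific_duty": "Handle customer enquiries",
            "tasks": [
                {
                    "task": "Log the enquiry",
                    "sub_tasks": ["Open a CRM record", "Capture contact details"],
                    "knowledge": ["CRM system", "Product range"],
                },
            ],
        },
    ],
}

# Walk the hierarchy from job title down to the knowledge items.
for duty in task_analysis["duties"]:
    for task in duty["tasks"]:
        print(task_analysis["job_title"], ">", duty["specific_duty"],
              ">", task["task"], ">", ", ".join(task["knowledge"]))
```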

23 SKILLS AUDIT PROCESS
Steps in the skills audit process:
1. Identify current & future priorities from the business plan: core work & future strategies
2. Identify the essential skills for current & future business
3. Assess current skill levels using a rating scale, for example: expert / satisfactory / limited / no experience
4. Identify action to address the skills gap
A scoring sketch for steps 3 and 4 follows below.
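Purely as an illustration of steps 3 and 4 (the rating scale comes from the slide; mapping it to numbers, and the required levels themselves, are assumptions):

```python
# Hypothetical skills audit scoring. The rating scale comes from the
# slide; the numeric mapping and required levels are invented.
RATING = {"no experience": 0, "limited": 1, "satisfactory": 2, "expert": 3}

required = {"communication": "satisfactory", "Excel": "expert"}
current = {"communication": "limited", "Excel": "satisfactory"}

# Steps 3-4: compare current ratings to required ones and flag gaps.
for skill, needed in required.items():
    gap = RATING[needed] - RATING[current[skill]]
    if gap > 0:
        print(f"Gap in {skill}: needs '{needed}', currently '{current[skill]}'")
```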

24 OTHER TYPES OF ANALYSIS
Assessment methods:
- Employee self-assessment
- 360-degree assessment
- Panel assessments
- Formal assessment: diagnostic, formative, summative & recognition of prior learning (RPL)

25 COMPETENCE
The ability to perform whole work roles, against standards of performance, by working with your hands (skills), head (knowledge) & heart (attitude, values & behaviour).

26 ASSESSMENT STRATEGY
- Analysis of source documents (policy & plan)
- Assessment context (environment/circumstances)
- Assessment activities
- Logistics
- Scope: the range of the assessment
- Instructions to assessors & candidates
- Evidence requirements
- Methods to gather evidence

27 OVERALL ASSESSMENT STRATEGY
1. Plan for the assessment
2. Prepare the learner
3. Conduct the assessment
4. Assess the evidence & store it
5. Evaluate the process
6. Provide feedback
7. Review at regular intervals

28 COMPETENCY-BASED ASSESSMENT
Outcomes-based education (OBE) is a learner-centred process:
- What learners are to learn is clearly defined
- Learners' progress is based on demonstrated achievement
- Learners' needs are accommodated through multiple teaching and learning strategies and assessment tools
- Each learner is given the time and assistance to realise his/her potential

29 PLAN THE ASSESSMENT PROCESS
1. The candidate & assessor meet
2. Draw up an assessment plan
3. Conduct the assessment
4. Evaluate the evidence

30 PLAN THE ASSESSMENT PROCESS (continued)
5. Record the decision
6. Provide feedback
7. Plan reassessment if necessary
8. Certification
9. Record the results

31 PLAN THE ASSESSMENT PROCESS
Stages of planning the assessment:
1. Initial planning
2. Pre-assessment meeting with the learners
3. Draft an assessment plan
4. Prepare the candidates
5. Collect the evidence
6. Give feedback

32 PLAN THE ASSESSMENT PROCESS
Examples of special education needs:
- Illiterate learners
- Semi-literate learners
- Innumerate learners
- Physically disabled learners
- Blind learners
- Deaf learners

33 PLAN THE ASSESSMENT PROCESS
- Plan against contingencies that may arise
- Obtain the results of previous assessments
- Design activities & instruments that are appropriate to the outcomes & resources
- Prepare the assessment documentation for the recording of information

34 PLAN THE ASSESSMENT PROCESS
- Potential unfair barriers are identified and plans are made to address these barriers without compromising the validity of the assessment
- The required physical and human resources are ready and available for use, and logistical arrangements are confirmed

35 PLAN THE ASSESSMENT PROCESS
- Provision for moderation is made in accordance with the relevant assessment policies and ETQA requirements
- A variety of assessment methods are described and compared in terms of their strengths, weaknesses and applications

36 PLAN THE ASSESSMENT PROCESS
- The appeals procedure
- The re-assessment procedure
- Follow-up and support
- The signatures of all the relevant parties & the dates

37 PLAN THE ASSESSMENT PROCESS
Prepare the candidates:
- Use an appropriate language level
- Carry out checks to ensure candidates are ready for assessment
- Ensure assessments are in line with policies
- Provide opportunities for input from candidates

38 ASSESSMENT PRACTICES
Validation of assessment:
- Verification of the process
- Statistical analysis
- Examination of the assessment instrument
- Sampling of evidence of applied competence
- Observation of processes
- Site visits
- Interviews

39 ASSESSMENT PRACTICES
The principles of good assessment practice:
- Fairness
- Validity
- Reliability
- Practicability

40 ASSESSMENT PRACTICES
The quality of evidence is judged against the 'VACS' criteria:
- Valid
- Authentic
- Current
- Sufficient

41 ASSESSMENT PRACTICES
Types of evidence:
This refers to the method that will be used to collect evidence. Will it be collected by directly observing the learner, by making use of secondary sources of evidence, or from historical evidence?

42 ASSESSMENT PRACTICES
Appeals can be brought against:
- Unfair assessments
- Invalid assessments
- Unreliable assessments
- The assessor's judgement, if it is considered biased
- Inadequate expertise and experience on the part of the assessor, if it influenced the assessment decision
- Unethical assessment practices

43 ASSESSMENT PRACTICES
Guide and support learners:
- Advise
- Coach
- Tutor
- Counsel
- Mentor

44 ASSESSMENT INSTRUMENT
Design the assessment instrument according to an assessment strategy that includes the following elements:
1. Determine the purpose of learning
2. Analyse the needs of learners
3. Decide on the learning objectives and outcomes
4. Select the content that will support the achievement of the outcomes
5. Decide on the activities, methods and media for learning and development

45 ASSESSMENT PROCESS
Apply an integrated assessment strategy:
- Include the policy documents
- Provide for RPL, reassessment, special needs & language requirements
- Identify the level of the learner's competence
- Design activities to include knowledge, skills & attitudes

46 ASSESSMENT PROCESS
Identify methods:
- Knowledge
- Observation
- A third form of evidence
Design instruments:
- Case study
- Discussions
- Portfolio

47 COLLECT THE EVIDENCE
Identify the sources to collect evidence from:
- Direct evidence
- Indirect evidence
- Historical evidence

48 MARKING MEMORANDUM
A marking memorandum is used to score the outcomes of an assessment. It includes:
- The marks assigned per question
- The total marks of the assignment & paper
The marking memorandum can be combined with model answers, as in the sketch below.
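A minimal sketch, assuming an invented set of questions and mark allocations (none of these specifics come from the source):

```python
# Hypothetical marking memorandum: marks per question plus model answers.
memorandum = {
    "Q1": {"marks": 10, "model_answer": "Define needs analysis ..."},
    "Q2": {"marks": 15, "model_answer": "List the VACS criteria ..."},
    "Q3": {"marks": 25, "model_answer": "Describe the skills audit process ..."},
}

# The total for the paper is the sum of the per-question allocations.
total = sum(q["marks"] for q in memorandum.values())
print(f"Total marks: {total}")  # -> Total marks: 50
```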

49 PURPOSE OF EVALUATION
- Establish the value of learning
- Decide whether to continue with a programme
- Obtain information on how to improve future learning programmes
- Improve delivery mechanisms to be more efficient and less costly

50 WHAT IS EVALUATION?
"The collection, analysis, and interpretation of information about any aspect of a programme of education or training as part of a recognised process of judging its effectiveness, its efficiency, and any other outcomes it may have." (Mary Thorpe)

51 ELEMENTS OF AN EVALUATION STRATEGY
- Background: purpose
- Scope of evaluation: levels & activities
- Output: measures for evaluation
- Methodology: integration with other processes

52 ELEMENTS OF AN EVALUATION STRATEGY
- Work plan: roles & responsibilities
- Evaluation instruments: questionnaires, etc.
- Resources: human, technology, budget
- Reporting & communication

53 AREAS OF MEASUREMENT: EXAMPLES
- Customer support
- Systems
- Product knowledge
- Technology

54 MEASUREMENT IMPACT
- Quality: how good? What standards and criteria apply?
- Quantity: how many?
- Cost: what are the cost implications? Savings?
- Time: how quick / how long? When? Faster?
What can you increase or decrease? A pre/post comparison sketch follows below.
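One illustrative way to quantify an increase or decrease (the metrics and figures here are invented) is to compare a measure before and after training:

```python
# Hypothetical before/after measurement of training impact.
# The metric names and values are invented for illustration.
before = {"errors_per_week": 12, "orders_per_day": 40}
after = {"errors_per_week": 7, "orders_per_day": 52}

for metric, old in before.items():
    change = (after[metric] - old) / old * 100
    direction = "increase" if change > 0 else "decrease"
    print(f"{metric}: {abs(change):.1f}% {direction}")
```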

55 TRAINING COSTS
- Gap analysis
- Design & development
- Facilitation fees
- Assessment
- Evaluation
- Fixed costs
- Other costs

56 EVALUATION MODELS
- Kirkpatrick
- Phillips
- Nadler
- Brinkerhoff
- Holton
- Coetsee and Van Zyl
Kirkpatrick's four levels are sketched below.
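Of the models the slide names, Kirkpatrick's four levels are the most widely cited; the level names below are standard, while the example measure attached to each is an illustrative assumption:

```python
# Kirkpatrick's four evaluation levels; the example measures are
# illustrative assumptions, not prescribed by the model itself.
KIRKPATRICK = {
    1: ("Reaction", "post-course satisfaction questionnaire"),
    2: ("Learning", "pre/post knowledge test scores"),
    3: ("Behaviour", "on-the-job observation some months after training"),
    4: ("Results", "business measures such as sales or error rates"),
}

for level, (name, measure) in KIRKPATRICK.items():
    print(f"Level {level} - {name}: e.g. {measure}")
```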

57 EVALUATION PROCESS: DATA COLLECTION METHODS
- Checklists
- Questionnaires
- Surveys
- Needs analysis
- Performance audit
- Case studies
- Focus groups

58 EVALUATION PROCESS: DATA COLLECTION METHODS
- Meetings
- Interviews
- Observation
- Role plays
- Simulations
- Work samples

59 ROI FORMULA
ROI% = ((benefits – costs) / costs) × 100, with benefits and costs expressed in Rand (R).
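A minimal sketch of the formula in code, with invented Rand amounts:

```python
# ROI as per the formula above: ((benefits - costs) / costs) * 100.
def roi_percent(benefits: float, costs: float) -> float:
    return (benefits - costs) / costs * 100

# Invented example: R150 000 in benefits against R100 000 in costs.
print(f"ROI = {roi_percent(150_000, 100_000):.0f}%")  # -> ROI = 50%
```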

60 NON-TRAINING VARIABLES
External influences that have an impact on the measurement are referred to as non-training variables. Influences such as new systems and products, trends and business cycles pose threats to validity when measuring ROI.

61 NON-TRAINING VARIABLES
- Organisational influences
- Technology
- Human influences
- Other influences

62 REPORTING
- Executive summary
- Background of the project
- Purpose, intent and design of the evaluation study
- Results
- Discussion
- Costs and benefits
- Conclusions, recommendations and options

