2015 Pre-Examiner Training and Preparation Course
The Journey Begins (Continues)!
Before We Begin… Logistics: safety, location of exits, restrooms, parking lot, use of phone (please place on stun)
First, some logistics. It is very important that we know two key things: where the exits are located in case of emergency and where the restrooms are located. [Give locations.] Smoking is not allowed in the building. [Give locations of smoking areas, if any.] As with all meetings, we ask that you turn your cell phones and pagers to vibrate, or preferably off. If you need to keep your ringer on for emergency reasons, please let us know. Explain the parking lot, which is posted on the wall.
Complete Training Evaluation Forms AND Plus/Delta
Set up FLIP CHART at appropriate exit point. Ask examiners to note plus/delta on Post-it notes and stick them on the flip chart in the appropriate column as they exit the room at the end of the day!
Quality Texas Meeting Behaviors
Be respectful Share openly Have only one conversation at a time Mute phones Return from breaks on time Take care of personal needs Clean up after yourself Stay on point Work toward consensus Understand that silence means affirmation Employ “ELMO” as needed Have fun These meeting behaviors have been developed as part of the leadership system for Quality Texas. They represent the kinds of behaviors we expect from our leaders and from each other. Ground Rules
Learning Outcomes
Gain an understanding of the Framework (review)
Apply the Six-Step Independent Review Process to an Award Application: choosing Key Factors, Strengths/OFIs, Scoring, and Key Themes
Write better feedback comments using the Comment Guidelines
Improve your BRAND!
Course Overview
Tuesday – Quick review of the Baldrige Framework; review the Organizational Profile; Key Factors; discuss the Six-Step IR Process; review Item 1.1
Wednesday – Six-Step IR Process; Items 2.1 and 5.2
Thursday – Six-Step IR Process; review 7.1 and 7.5; Scorebook Navigator class
Introductions – At Your Table
Who you are! Name / City What you do! Organization Job title Experience with QTF or Baldrige Criteria Your expectations of this Training NOTE to facilitators: It’s important not to lose time early in the day, so be quick in running this section. Introduce yourself and your co-facilitators, as well as any Quality Texas personnel present. Due to the anticipated size of the classes, to save time, have the Examiners make introductions at their table, rather than across the entire room. After table introductions are complete, discuss expectations from the entire group.
About Quality Texas Laying Concrete
Quality Texas Foundation
Mr. Ryan Gonzales, Director of Operations (512)
Ms. Lin Wrinkle, Director of Administration (512)
Dr. Mac McGuire, Chief Executive Officer (512)
201 Woodland Park, Georgetown, Texas
Key Facts About Quality Texas Foundation
President Reagan-era concept; Malcolm Baldrige, Secretary of Commerce, died in a rodeo accident. State concept 1990; started 1992; endorsed by then-Governor Ann Richards; November 1992 huge kickoff in Houston. Founded in 1994 as a 501(c)(3) not-for-profit corporation. Full-time staff of two people, one person part time, and around 300 statewide volunteers. The Quality Texas Foundation is a non-profit 501(c)(3) corporation that evolved from a concept introduced by Governor Ann Richards in 1990. Cooperative efforts between the Governor's office, the Texas Department of Commerce, and Texas businesses made it possible for the new Quality Texas Foundation to organize and deliver quality awareness seminars across the state. In the first two years, seminars were presented to 1,800 individuals representing more than 700 organizations. The QTF is endorsed by the governor's office. At the same time, EDS Corporation assigned an executive to lead development of the state quality award. A committee was formed with representatives from organizations across Texas. The committee created the Texas Award for Performance Excellence, open to government, education, healthcare, nonprofit, and business organizations. The American Productivity & Quality Center was chosen as the original award administrator, and applications were first accepted in 1993. There are currently two full-time staff working for the QTF: a Chief Executive Officer and a Director of the Performance Excellence Program. The work is primarily done through 300 volunteers across the State of Texas. The organization is funded through corporate and individual donations, training, and award participant fees. QTF is recognized as one of the best state award programs, with more national-level winners than any other state.
Key Facts About Quality Texas Foundation
Funded through Memberships and Sponsorships, Customized training workshops, Performance Excellence Program participant fees, Annual Texas Quest Conference Recognized as the most effective state program, with more Baldrige recipients than any other state (19); 52 state winners
QTF’s Vision and Mission
Vision: The Quality Texas Foundation, the preeminent state program, will continue to lead the way nationally in innovation by establishing Communities of Excellence (CoE) throughout the state. Mission: QTF exists to assist individuals and organizations in their continuous improvement efforts thereby positively impacting our communities, state, and nation.
Services We Provide: Assessment and Feedback; Recognition and Awards
Examiner Training (You are here!) Applicant Training (how to apply; how to write; how to assess; site visit training; COE) Training and Coaching Solutions Customized training and coaching based on customer needs within their organization Public training workshops Membership Levels w/various benefits per level to include Corporate/Individual options with discounts
Why Organizations Apply
Outside evaluation of organizational goals, objectives, and values Improve financials and employee engagement Helps build a common, holistic, and systematic view of the organization Framework provides a common language and standardized method to examine processes and performance; Receive objective feedback Tracks progress Get to the “next level” of performance Fosters benchmarking within and across industries
Six Categories – ADLI; One Category - LeTCI
Updated June 15
CORE VALUES: Systems Perspective; Visionary Leadership; Customer-Focused Excellence; Valuing People; Organizational Learning & Agility; Focus on Success; Managing for Innovation; Management by Fact; Societal Responsibility; Ethics and Transparency; Delivering Value and Results
ASSESSMENT PROCESS:
Independent Review – Attend Training; Read Application; Review Criteria; ID Key Factors; Strengths & OFIs; Score; Feedback Report
Consensus Review – Category Champion leads Consensus discussion and scoring; Consensus Scorebook; Determination of Site Visit Issues
Site Visit – Verify / Clarify
DELIVERABLES: Feedback Report for Judges (Award) and Applicant (all levels); Recommendation to Board of Directors (Award level only)
NEXT STEPS: Judges Review (Award); Board of Directors Decisions (Award); Feedback to Applicant (all levels); Annual Awards Ceremony (all levels)
RESOURCES: Criteria Books; Application; QTF Website; QTF Staff; Training; Scoring Guidelines; Scorebook Navigator; MBNQA Website
Six Categories – ADLI; One Category – LeTCI
This is a graphic representation of the stages of evaluation for a Quality Texas application. Review each wave and its key components. Identify the Core Values, which will be discussed later during the training. Review the "waves" of the assessment process: Independent Review (goes from bottom up; read the steps in IR), Consensus Review, and Site Visit (read the steps). (Slide revision note from the authors: the Consensus wave should read "Category Champ leads consensus discussion and scoring, Consensus Scorebook, Determination of Site Visit Issues.") NOTE: There's a copy of this slide and the following one on page 3 of the Examiner Prep Guide.
PERFORMANCE EXCELLENCE
APPLICATION LEVELS
All levels receive a written Feedback Report with Strengths and Opportunities for Improvement. Resources at all levels: Criteria, Glossary, Training, Scorebook Navigator, Application.
Beginner (new level) – Organizational Profile only
Engagement – Org. Profile + 10-page application; no site visit; Feedback Report; scoring factors: ADLI (Basic) / LeTCI
Commitment – Org. Profile + 20-page application; no site visit; Feedback Report; scoring factors: ADLI (Overall) / LeTCI
Progress – Org. Profile + 30-page application; site visit (if purchased by applicant); Feedback Report; scoring factors: ADLI (some multiple) / LeTCI
Award – Org. Profile + 50-page application; site visit; Feedback Report; scoring factors: Approach, Deployment, Learning, Integration / Levels, Trends, Comparisons, Integration
The focus of writing an application is to receive feedback for improvement. Quality Texas has four levels of applications (plus the new Beginner level, Organizational Profile only), which reflect the organization's progress on its improvement journey. The Engagement Level is for organizations early in their improvement journey—just a 10-page application that addresses basic requirements and performance levels. The Commitment Level is for organizations a little farther along and has a 20-page application that explores the organization's approaches and its levels of performance. The Progress Level is for maturing organizations and has a 30-page application that examines approaches and deployment as well as levels and trends of performance. The Award Level is provided for organizations that have been actively working to address the criteria fully. It's a 50-page application, which may result in a site visit. This application will examine approach, deployment, learning, and integration and will look at levels, trends, comparisons, and integration of its results. You'll note that all levels of application require an Organizational Profile. We'll review what that is, and its importance to you as an examiner, later this morning. ALL LEVELS OF APPLICATION RESULT IN WRITTEN FEEDBACK FOR IMPROVEMENT!
NOTE: This is a good time to stop and ask for questions or to clear up any confusion – remember, you're dealing with new examiners here who haven't yet gone through the process. APPLICATIONS RECEIVED: October–March (5 cycles).
Examiners’ Roles and Responsibilities
Provide analysis and feedback to lead the organization to the next level of maturity. Examination Process Team Members: Examiners, Senior Examiners, Scorebook Editor, Back-up Team Leader, Team Leader, Process Coach. Each team has a Subject Matter Expert or two (can be any role). This slide provides an overview of team composition. Please stress that examiners are not alone. We are all invested in everyone being successful; see all members of the team as your teammates. Ask examiners to explain the roles of the following: Senior Examiner (anybody who has served one year as an examiner and been on a team); Scorebook Editor (team member who will perform edits and submit the final feedback report to QT); Back-up Team Leader (team member who serves to help the Leader); Team Leader (team member responsible for the team, its deadlines, and delivering a strong feedback report to QT); Process Coach (team member who focuses on the processes that the team uses; the only member of the team who does not do an Independent Review); Subject Matter Expert (team member serving in any role who has work experience in the field of the business being reviewed; not every team member will come from the applicant's sector). Each team must also have a Subject Matter Expert; this "Expert" can fill any of the roles on the team.
Benefits of Being an Examiner
Learn Quality Texas/Baldrige Performance Excellence Criteria; training and experience Learn validated best practices; leading organizations; all industry sectors Expand professional network; improve YOUR brand; valuable professional credential Develop assessment, analysis, writing, teamwork and leadership skills Give back to the community, state, and nation by helping organizations be successful through useful feedback (patriotism) The Quality Texas staff and Training Faculty are dedicated to creating a positive experience for you so that you can reap the benefits of being a Quality Texas Examiner.
Conditions of Involvement: Absolute Essentials to the Credibility, Success and Prestige of the Assessment and Feedback Process. Commit to the entire process (until the feedback is presented to the applicant). Maintain Confidentiality. Complete the Training and Case Study. Honor Time Commitments. Represent Quality Texas (not your organization). Follow the Code of Conduct; Avoid Conflicts of Interest. Stress that the success of the program depends on their commitment to see it through! Since Quality Texas relies on its volunteers—its examiners—to create the feedback report, each examiner has to adhere to timelines and guidelines for our customers—the applicants—to get the much-desired feedback report! Ask the tables to discuss for 1-2 minutes what happens if any of these are not done. (The answer should be that the feedback report, site visit, etc. will be compromised and the product will not be delivered or its quality will be suspect.)
Value for the Examiner/Sponsor
Work with a diverse team; reach consensus Network with other quality/business experts to build professional friendships Understand/apply the Baldrige Framework to a variety of organizations; your own? Develop analytical/consensus-building skills Attend annual Texas Award Banquet and Conference (discounted rate) – June, 2016, Holiday Inn Riverwalk, San Antonio, TX Special recognition at conference While there is certainly value for the organization that will receive the feedback report, there is also value to YOU as an examiner and to the organization that sponsored you! Please read through these and choose the one that you think is the most valuable. Ask for a few examiners to share which they think are the most valuable.
Examiner Career Path (Position / Time / Experiences)
Examiner – 1-3 years – BU TL, FBW, 1 SV, Examiner Training each year
Senior Examiner – 4-6 years – TL, 2 SV+, Examiner Training each year
Alumni Examiner – 6+ years – TL, Process Coach, 3+ SV, Examiner Training each year, various committees
Judge – TL, Process Coach, 3+ SV, Examiner Training each year, various committees
Board of Directors – Corporate sponsorship, TL, Process Coach, 3+ SV, Examiner Training each year, various committees
Fellow – 10+ years – Dedication and continual volunteerism for QTF issues
QTF EXAMINATION & FEEDBACK PROCESS INDEPENDENT REVIEW
Here’s where the independent review stage fits into the overall assessment process.
QTF EXAMINATION & FEEDBACK PROCESS CONSENSUS REVIEW
Here’s where the consensus review stage fits into the overall assessment process.
QTF EXAMINATION & FEEDBACK PROCESS Site Visit
Here’s where the site visit stage fits into the overall assessment process.
Walk Through Baldrige Framework Booklet
Take a look at your Baldrige Framework Book
Baldrige Excellence Framework
Criteria: Systems Perspective – 1
Basic, Overall, Areas to Address – 2
Point Values per Category/Item – 3
Organizational Profile – 4-6
Categories 1-7 – 7-29
Scoring System – 30-33
Baldrige Excellence Framework
Process Scoring Guidelines – 34
Results Scoring Guidelines – 35
Responding to the Criteria – 36-38
Core Values and Concepts – 39-43
Changes from – 44-46
Glossary of Key Terms – 47-54
QTF LEVELS
Award Next Level (OP)
Award Level* – 50 pages plus 5 (OP)
Progress Recognition – 30 plus 5 (OP)
Commitment Recognition – 20 plus 5 (OP)
Engagement Recognition – 10 plus 5 (OP)
Beginning Recognition – 5 (OP)
How Do I Evaluate Process Items?
Process items are evaluated using four factors: Approach Deployment Learning Integration ADLI Now that we’ve spent some time looking at the Criteria, let’s turn our attention to the evaluation of process Items. You may recall that the Items in Categories 1-6 address an organization’s processes. These four factors represent the dimensions that are used in conducting an analysis of any particular process. Each of these terms can be found in the glossary. They have precise meanings in the context of the Baldrige framework, so we will spend some time exploring each one. You will also see these factors in the Scoring Guidelines NOTE: All discussion of results and LeTCI have been moved to the second day. You can mention that here if you wish.
Approach (A) “Approach” comprises the methods used to carry out the process
The appropriateness of the methods vs. the requirements The effectiveness of the methods The degree to which the approach is repeatable and based on reliable data and information (i.e., systematic) Approach refers to the methods used by an organization to address the TAPE Criteria Item requirements in Categories 1–6. Approach is where it begins; without a clear approach, deployment has nothing to build on. Approach includes the appropriateness of the methods to the Item requirements and the effectiveness of their use. Remember that the Criteria do not prescribe what approaches an organization should have; rather, you evaluate the effectiveness of those approaches based on what the organization tells you is important to them. Group discussion: Looking at the Green sheet – when does the approach score go up? As an Examiner, you will determine if the applicant describes a process/approach. Then you will evaluate the degree to which an applicant’s approach is systematic; that is, repeatable and based on reliable data and information. Once we have determined the applicant has an approach, then we can look at the deployment. LYNN: I don’t understand this exercise! ACTIVITY: Read a section and ask about each evaluation factor as follows: Read 1.1a(1). Identify key concepts related to approach. What is missing? Look at the written comment to see how the analysis leads to the comments. OFIs – how do you improve?
Deployment (D) “Deployment” is the extent to which
Your approach addresses item requirements relevant and important to the organization Your approach is applied consistently The approach is used by all appropriate work units Go over these questions that examiners should ask themselves: Is the approach used in all locations? Is it used by all appropriate personnel, work units, departments? Is it used consistently?
Learning (L) Learning comprises
The refinement in approach through cycles of evaluation and improvement The encouragement of breakthrough change through innovation, and The sharing of refinements and innovation with other work units and processes in the organization Go over these questions that examiners should ask themselves: Has the approach been evaluated and improved? In a systematic manner? Is there evidence of breakthrough change, or innovation? Is there evidence that learning has been shared with other organizational units and/or partners, as appropriate, so that they might benefit from it? Is there evidence that learning is embedded into how the organization operates?
Integration (I) “Integration” is the extent to which
Your approach is aligned with organizational needs (OP/ processes) Your measures, information, and improvement systems are complementary across processes and work units Your plans, processes, results, analyses, learning, and actions are “harmonized” across processes and work units to support organization-wide goals Go over these questions that examiners should ask themselves: How well is the approach aligned with organizational needs (i.e., those identified in the Organizational Profile and Process Items)? Are the measures, information, and improvement systems complementary across processes and work units? How well does the approach align and integrate with other approaches to support achievement of organizational goals?
Results: 45% of the Total Score
APEX Application Writing Workshop, September 2009
We’ve covered the evaluation factors you’ll consider when assessing the process Categories – Categories 1 through 6. But there’s one more Category to discuss, and it’s a big one. Results count for a whopping 45% of the score in an application, so it’s important that we assess them properly. When we get to results, we’re no longer looking at ADLI. Instead, we’re looking at some different factors: levels, trends, comparisons, and integration.
Results Evaluation Factors LeTCI
Levels – Current performance on a meaningful measurement scale
Trends – Numerical data that show the direction and rate of improvement (slope over time)
Comparisons – Your performance relative to that of other appropriate organizations, competitors, or organizations similar to yours; relative to industry leaders or benchmarks
Integration – Measures (segmentation) addressing important performance requirements relating to customers, products/services, markets, processes, or action plans identified in your OP and in process Items; future performance; harmonization across processes and work units to support organization-wide goals
This slide shows what those new factors are. Levels: current performance. Trends: performance over time (emphasize that you should expect to see at least three data points presented in order to discern a trend). Comparisons: it’s expected that the applicant will present appropriate competitive and comparative data for many of its measures whenever such comparisons exist; this helps us gauge the applicant’s leadership position. Integration: many measures are asked to be defined in Categories 1–6. The applicant gets to define what’s important to them to measure, within reason. Are the measures you’re seeing presented in Category 7 the most important measures for the applicant and its industry, or are they just presenting high-performing fluff? Segmentation: results should be segmented by appropriate groups, as the applicant has defined them in the Organizational Profile (workforce segments, product/service lines, customer types, etc.). More details on segmentation are on the following slide.
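Since a trend is defined above as the direction and rate of improvement (slope over time), a small numerical illustration may help. The sketch below is purely illustrative and not part of the QTF process; the measure name and data points are hypothetical, and it simply fits a least-squares slope to at least three data points.

```python
# Illustrative sketch only: a "trend" as the least-squares slope of a
# performance measure over time. The data below are hypothetical.
def trend_slope(years, values):
    """Return the least-squares slope of values over years (units per year)."""
    n = len(years)
    if n < 3:
        # At least three data points are needed to discern a trend.
        raise ValueError("need at least three data points")
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

# Hypothetical customer-satisfaction levels over four years:
print(trend_slope([2012, 2013, 2014, 2015], [72.0, 75.5, 79.0, 82.5]))  # 3.5
```

A positive slope shows sustained improvement; comparing slopes across segments, or against a competitor's published data, is one informal way to connect trends with comparisons.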
Segmentation Segments can be defined by, among other things:
Customers (students/families for education; patients/families for healthcare)
Market or product offerings
Location
Workforce group (employees, tenure, admin, hourly, etc.)
Size of the group in question
The applicant defines their segments (OP); we assess the results of those segments.
Remind the examiners that the applicant will define most of their appropriate segments in the Organizational Profile, as shown here. So these segments become prime candidates to include as Key Factors in results Items.
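To make segmentation concrete, here is a toy sketch (not a QTF tool; the segment names and scores are invented for illustration) of grouping a single results measure by applicant-defined workforce segments rather than reporting one blended number:

```python
from collections import defaultdict

# Illustrative sketch only: hypothetical engagement scores, each tagged
# with an applicant-defined workforce segment (per the Org. Profile).
results = [
    ("admin", 4.25), ("hourly", 3.5), ("admin", 4.75),
    ("hourly", 4.0), ("clinical", 4.0),
]

by_segment = defaultdict(list)
for segment, score in results:
    by_segment[segment].append(score)

# Average per segment -- the kind of segmented breakdown an examiner
# looks for in Category 7 results.
averages = {seg: sum(v) / len(v) for seg, v in by_segment.items()}
print(averages)  # {'admin': 4.5, 'hourly': 3.75, 'clinical': 4.0}
```

The point of the sketch is only that a single overall average can hide a lagging segment (here, "hourly"), which is exactly what segmented results reveal.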
Comparisons Comparisons can be defined by, among other things:
Inside the industry Competitive comparisons Outside the industry
QTF/Baldrige Examination Process
TRUST THE PROCESS! (KEY TEACHING POINT)
Step 1: Read the Criteria.
Read the Baldrige Framework for the Award Level; read the QTF Beginner, Engagement, Commitment, or Progress Level Criteria otherwise
Step 2: Determine Most Relevant Key Factors
Four to Six Key Factors taken from OP, Eligibility, or from Application
Step 3: Read & Analyze the Application
Read the Application; mark as appropriate
Step 4: Identify Strengths/OFIs
Around 6 comments Strengths and OFIs
Step 5: Write Feedback Ready Comments
Remember: NERD!
N – Nugget of importance
E – Evidence/example to support comments
R – Relevance to the applicant
D – Done!
Step 6: Determine the Scoring Range and Score
Best Fit
Don’t Block a Winner
Tie goes to the applicant
Getting Ready to Learn - Welcome
Scoring: Not like your previous education! Go to Process Scoring Page 34
UNDERSTANDING SYSTEMATIC APPROACH
UNDERSTANDING DEPLOYMENT
UNDERSTANDING IMPROVEMENT/LEARNING/INNOVATION
UNDERSTANDING INTEGRATION
Go to Results Scoring Page 35
What is different about Results Scoring?
Questions to this Point??
So let’s begin our Examination/Evaluation
Importance of the Organizational Profile
Reader’s digest version of the application Sets the stage for what the applicant says is important Frames our comments (feedback report), strengths/OFIs, scoring, and Key Themes Assists with our scoring band descriptors (Award) Do not read into or take away from what is stated
How to determine Key Factors
Review what the applicant says is important in the eligibility form Review the OP Be aware throughout the application of other key factors that are discussed What is really important? Do not rewrite the OP
Exercise: Determine Key Factors
Activity (time in min):
1. Select a scribe, timekeeper and reporter (1)
2. Discuss at your table as a group your thoughts on the key factors for this applicant from the Organizational Profile (10)
3. Highlight Key Factors from the OP and select a few to write down; Table Anchors guide this process
4. Record the requirements on chart paper (20)
5. Report Out (1 minute per table) (5)
45 Minutes Total
Step 1: Read the Criteria
This is just a quick set-up for the exercise on the following page. It’s a good place to ask returning examiners about some of their techniques for this step. Remind the class that nobody has the Criteria memorized. It’s a good idea to refresh yourself on what is there – and what is not.
Exercise: Step 1 Read the Criteria
Activity (time in min):
1. Select a scribe, timekeeper and reporter (1)
2. Discuss at your table as a group your thoughts on the key requirements for Item 1.1 from the Baldrige Framework Manual (30)
3. Review individually your assigned Criteria Item [1.1a(1), (2), (3), 1.1b(1), (2)]
4. Record the requirements on chart paper
5. Report Out (1 minute per table) (10)
60 Minutes Total
Step 2: Determine Most Relevant Key Factors
This is just a quick set-up for the exercise on the following slide. It’s a good place to ask returning examiners about some of their techniques for this step. Question for the group: Why 4 – 6 KFs? Why not more or less than this? What happens with your comments if you choose 8 KFs, for example?
Exercise: Step 2 Determine the Most Relevant Key Factors
Activity (time in min):
1. Select a scribe, timekeeper and reporter (1)
2. Review individually your assigned Criteria Item [1.1a(1), (2), (3), 1.1b(1), (2)] (5)
3. Discuss at your table as a group and come to agreement on the relevant 4 to 6 key factors for your assigned Item; write your selected key factors on a flip chart for use in subsequent exercises (capture key words and phrases) (20)
4. Report Out (1 minute per table) (10)
45 Minutes Total
Facilitator’s Note: Have the examiners turn to page 8 in the Examiners Guide. The full instructions are there. Also remind them that the Key Factors from the Baldrige case study begin on the next page (8). They will be referring back to these over the course of the next three sets of exercises. For report out, contrast the observations for tables that had the same Item. Explain why identifying different Key Factors may lead examiners to come up with different comments in their Independent Review than their teammates.
Step 3: Read and Analyze the Application
This is just a quick set-up for the exercise on the following page. It’s a good place to ask returning examiners about some of their techniques for this step. In particular, make sure it’s emphasized to at least skim over the entire application before diving into Item-level evaluation.
Exercise: Step 3 Read and Analyze the Application
Activity (time in min):
1. Select a scribe, timekeeper and reporter (1)
2. Review individually your assigned Criteria Item [1.1a(1), (2), (3), 1.1b(1), (2)] (20)
3. Discuss at your table as a group your observations on the applicant’s response against the Criteria questions
4. Report Out (1 minute per table) (5)
45 Minutes Total
Facilitator’s Note: Have the examiners turn to page 14 in the Examiners Guide. The full instructions are there. For report out, just ask for some input from experienced examiners on their techniques for reading and analyzing an application.
Step 4: Identify Strengths/OFIs
This is just a quick set-up for the exercise on the following page. It’s a good place to ask returning examiners about some of their techniques for this step. “C” = Comments; “SG” = Scoring Guidelines (i.e., refer to the Scoring Guidelines for possible language to use in the comment to indicate the applicant’s level of maturity)
Exercise: Step 4 Identify Strengths/OFIs
Activity (time in min):
1. Select a scribe, timekeeper and reporter (1)
2. Use a round robin approach to discuss/highlight all potential strengths/OFIs identified by your table mates [1.1a(1), (2), (3), 1.1b(1), (2)] (15)
3. From these, discuss at your table and select as a group a total of around 2 strengths and OFIs that you feel are most important for the applicant; record these on chart paper (5)
4. Report Out (1 minute per table) (10)
45 Minutes Total
Facilitator’s Note: Have the examiners turn to page 15 in the Examiners Guide. The full instructions are there. For report out, just contrast the observations for tables that had the same Item. Discuss why these differences might lead to different scores in Step 6.
Feedback Comments: Our KEY Product for the Applicant!
What Feedback Ready Comments Should Do:
Let the applicant know what it does well—and what it needs to improve to take it to the next level.
1. Criteria requirements lead to an equitable assessment.
2. Attention to the particular organization leads to meaningful feedback.
3. Assessment factors (ADLI, LeTCI) yield insights on organizational maturity.
Feedback comments are important to your customers. They validate strengths and provide good feedback on how to take their organization to the next level and continually grow. By following the Criteria requirements, we ensure that all applicants get a fair assessment against the Criteria. By considering the applicant’s Key Factors in constructing and prioritizing our comments, we make the feedback more meaningful and useful to them. By using ADLI in our assessment, we help the applicant understand their current level of maturity and what it might take to get them to higher levels of performance.
Six Feedback Comment “Musts”
Polite, professional, non-prescriptive Based on the Criteria, Key Factors, Scoring Guidelines Addresses one topic per comment Begins with “nugget” that shows why this is important to the applicant Has evidence/example and relevance (actionable information) (based on scoring guidelines, Key Factors, Core Values, criteria notes) Comments align with and support the score Review briefly the 6 “musts” for a comment.
Well-Written Comments: “NERD”
N – Nugget – Up-front comment based on the Item Criteria or a Key Factor (i.e., explains why this is important to the applicant)
E – Evidence or Example – State what the applicant has (Strength) or is missing or has not addressed (OFI)
R – Relevance – Why is this comment important? If an OFI, include actionable (but not prescriptive) information that will help the applicant improve.
D – Done – Read to see if the comment makes sense and helps the organization understand its current status
Sum up comment-writing tips by talking about “NERD.” A well-written Process OFI comment has four elements:
1. Acknowledgement if the applicant does have an approach in place. This is optional, but does provide a nice lead-in. (Note to facilitators: This may be an appropriate place to caution about strengths embedded in OFIs but not listed as separate strengths; doing this tends to drive scores down.)
2. A description of the gap(s) in the applicant’s approach/deployment, based on whether it addresses the Item requirements and is systematic, effective, and deployed.
3. An explanation of “so what,” or why it is important to the applicant’s success based on the Item Criteria or a Key Factor.
4. A description of actionable information that may help the applicant move forward.
All Strengths, OFIs, and rationale are valuable to the applicant. “So whats” and actionable information are the added value we provide to help the applicant move forward.
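For readers who like checklists, the NERD elements above can be sketched as a simple completeness check. This is purely illustrative and not a QTF tool; the field names are my own shorthand for Nugget, Evidence, and Relevance, and the sample comment text is invented:

```python
# Illustrative sketch only: the "Done" step of NERD as a check that the
# nugget, evidence, and relevance elements are all present and non-empty.
def is_feedback_ready(comment):
    return all(comment.get(key, "").strip()
               for key in ("nugget", "evidence", "relevance"))

draft = {
    "nugget": "Systematic workforce planning addresses a key strategic challenge.",
    "evidence": "",  # evidence/example still missing
    "relevance": "Without an example, the applicant cannot act on the feedback.",
}
print(is_feedback_ready(draft))  # False
```

The real "Done" step is of course a human read-through for sense and tone; the sketch only captures the mechanical part, that no element may be left blank.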
71
Elements of a Well-Written Comment
Now for an incredibly important part of this training: comment writing. NOTE: We've expanded this section and placed it at the beginning of Step 5 in the process to be just-in-time and to emphasize the importance of well-written comments.
72
The Feedback Report: The Product
The report should be:
- Useful to the applicant
  - Comments should be clear and actionable
  - Focused on the applicant's most important Key Factors
- Encouraging
  - Not adversarial (it's not an audit)
  - Helps the applicant reach the next level
  - Makes the applicant an advocate for the program
- Respectful
  - Polite in tone
  - Not judgmental or prescriptive
Review these points. Ask if there are any examiners in the room whose organization has applied to Quality Texas at any level (or to Baldrige). If so, ask them their thoughts on the importance of a useful feedback report.
73
How to Give Useful Feedback
APEX Application Writing Workshop
Do:
- Base your comments on the Criteria
- Reference the evaluation factors
- Include an opening "nugget" to give the comment significance
- Keep each comment to a single issue
- Make Key Factor references
- Be accurate; check your statements
- Recognize the page limitations
- Give the benefit of the doubt
- Be polite in tone
- Include a few examples
- Check spelling and grammar
Don't:
- Be judgmental or prescriptive
- Forget linkages
- Forget the Key Factors
- Stray from the Criteria
- Make conflicting strength and OFI statements
Don't forget to:
- Check your score against the balance and content of your comments
- Check your facts
- Check grammar and spelling
- Remember that you're providing a service
Ask participants to read through the "do" list and choose two that they believe are critical to providing feedback. Take a show of hands for the top two. Do the same for the "don't" list.
74
Sample Process Strength
2.2a(4) To mitigate the strategic challenge of competitors wishing to hire its engaged workforce, the applicant ensures that workforce plans support any needed changes. For example, through data and budget analysis and surveys, the applicant takes a proactive approach to workforce capacity and allocates instructional staff to areas of greatest need through "vertical teamwork."
(Slide callouts label the nugget, relevance, and examples within the comment.)
Read the comment and identify the nugget, the relevance, and the examples. Note that there is only one point of relevance. The examiner chose to put it at the beginning of the comment because they determined that this is where it has the most impact: it "personalizes" the comment for the applicant. Could you have improved on this comment? If so, how? The examiners chose the one example that best makes their point. Depending on the comment, they might choose to include more. What evaluation factors are the examiners citing as a strength? (A, citing an approach; maybe I, since it addresses a strategic challenge?)
75
Sample Process OFI
4.2a(1) It is not clear how the applicant systematically transfers knowledge specific to the needs of parents and volunteers in support of the PhilP that all are accountable for student performance. For example, parents and volunteers do not appear to be included in teachers' grade-level discussions, and parents do not appear to have access to teachers' online forums, blogs, and classroom support server (Figure 4.2-1).
(Slide callouts label the nugget, relevance, and example within the comment.)
Identify the nugget, relevance, and examples. Note the order: why do you think the examiners chose this order? What evaluation factors are the examiners citing? (Lack of A.) Could you have improved on this comment? If so, how?
(Slides 76-77 repeat the sample process strength and OFI above, adding their lengths: the strength comment is 53 words / 372 characters with spaces; the OFI comment is 64 words / 415 characters with spaces.)
78
Sample Results Strength
7.3a(1) Strong results for key measures of workforce capability and capacity may help strengthen the engagement factor of having sufficient resources to get the job done. Rates of certification (Figure 7.3-1) and student-teacher ratio (Figure 7.3-2) have improved over the periods shown, reaching 100% or close to 100% for all segments, and the student-teacher ratio in elementary and middle schools compares favorably to that of a Baldrige Award winner.
(Slide callouts label the nugget, relevance, and examples within the comment.)
Identify the nugget, relevance, and examples. Note the order: others are possible. What evaluation factors are the examiners citing? (T and C; note that these don't have to be in the nugget.)
79
Sample Results OFI
7.2a Results related to key student requirements, such as stimulating creative thought, treating students fairly, and maintaining a safe school, are missing. Tracking such results may give the applicant insights into how to retain families and how to attract families to the district.
(Slide callouts label the nugget, relevance, and examples within the comment.)
Identify the nugget, relevance, and examples. Note the order: others are possible. What evaluation factors are the examiners citing? (Missing results.)
80
Step 5: Write Feedback Ready Comments
So, with those examples out of the way, let's have you try your hand at developing fully formed feedback comments, ready to present to the applicant.
81
Exercise: Step 5 Write Feedback Ready Comments
Activity (time in min):
1. Select a scribe, timekeeper, and reporter (1)
2. Draft a feedback-ready strength and OFI, in two groups (15)
3. Give feedback to table mates; redraft the comment using their input (10)
4. Record the final feedback-ready comment on chart paper
5. Report out (1 minute per table)
55 minutes total.
Facilitator's Note: Have the examiners turn to page 19 in the Examiner's Guide; the full instructions are there. All facilitators should spend time at each table coaching the examiners on their comment writing. We have allowed sufficient time in the agenda to get some decent comments developed in this step. The objective is to have truly feedback-ready comments produced at the end of each of the four Step 5 exercises we will do this week, so don't rush completion of this step. Test the final feedback-ready comments with the class:
- Is the comment understandable?
- Is it within the bounds of the Criteria?
- Is it polite in tone?
- Is it specific enough to be actionable by the applicant?
- Does it use Criteria and/or scoring language as appropriate so the applicant understands its significance?
- Does it address the evaluation factors (ADLI or LeTCI)?
- Does it help the applicant move to higher levels of performance?
- Does it avoid being prescriptive (going beyond the Criteria) or judgmental?
82
Introduction to Scoring
Process Scoring Guidelines and Results Scoring Guidelines. Always use the Glossary for definitions.
Process scores address Approach, Deployment, Learning, and Integration (ADLI).
Results scores address Levels, Trends, Comparisons, and Integration (LeTCI).
Facilitator's Note: This is a brief introduction to scoring; we'll spend a bit more time on it tomorrow. The reason it's brief is that examiners, both new and returning, often agonize too long over scoring. Pushing them to come to a quick conclusion forces them to get over this mental block. Another key ingredient of a successful examiner is understanding how to score an application. Trainers, hold up the scoring sheet (one side is Process, the other is Results) or point out pages 32 and 33 in the Criteria booklet. Ask participants to turn to the scoring guide in the Criteria booklet. Have the participants follow as you show them where approach, deployment, learning, and integration are scored. Then explain that process and results are scored differently—they even have different criteria! Results are on the back (make sure participants are on the right page) and evaluate levels, trends, comparisons, and integration. Explain levels (a point on a scale), trends (at least three data points), comparisons (with other entities), and integration. Discuss (and get input from returning examiners) good and bad practices for determining scores.
(Example scores shown on the slide: 8.0, 8.5, 8.5, 9.0, 6.5.)
83
Step 6: Determine the Scoring Range and Score
This is just a quick set-up for the exercise on the following page. Point out that the graphic shows two things: first is the range, second is the score within that range. Scoring tends to go quicker if you can first determine the most appropriate range.
84
Look at Scoring Guidelines in Baldrige Framework
Page 34 (Process) and page 35 (Results)
85
Exercise: Step 6 Scoring
Activity (time in min):
1. Select a scribe, timekeeper, and reporter (1)
2. Using your comments from Step 4 and the Scoring Guidelines, discuss as a group the most appropriate scoring range for your assigned Item (10)
3. Next, discuss the appropriate score within that range
4. Record the scoring range and score on chart paper and be prepared to discuss why you chose that score
5. Report out (1 minute per table) (5)
35 minutes total.
Facilitator's Note: Point the examiners to page 21 in their workbooks. The detailed instructions are there; remind them that they should be using the Process Scoring Guidelines. FYI: Since everyone likes to know how their score matches up with that of the Baldrige TST, here are their scores: for 2.1, the 30-45% range; for 6.2, 50-65%.
86
Follow Six Step Process
Complete for 2.1 [2.1a(1)(2)(3)(4) and 2.1b(1)(2)]
Complete for 5.2 [5.2a(1)(2)(3)(4) and 5.2b(1)(2)(3)]
Complete for 7.1 [7.1a, 7.1b(1)(2), and 7.1c]
Complete for 7.5 [7.5a(1)(2)]
87
Round-Robin Feedback What did you learn?
Do you have the confidence to begin (or continue) the assessment stages? Any final questions or concerns? Final summary.
89
Class will begin at 8:30 a.m. Please remember to sign in each day
Class will begin at 8:30 a.m.! Please remember to sign in each day! Thanks and enjoy your evening!
90
Scorebook Navigator Links to the IR and consensus manuals for Scorebook Navigator are on the Examiner Resources page Suggested narrative: We won’t cover the workings of Scorebook Navigator today. You already have some experience with it since you used it to do your pre-work in preparation for Examiner Training. It’s a fairly intuitive program to use, but like any software that you don’t use frequently, it’s easy to forget the steps. We have two manuals on the Quality Texas website on the Examiner Resource page, one for Independent Review and one for consensus. Refer to these manuals “just in time” as you’re ready to log on to Scorebook Navigator to begin work on your application. You’ll see that the program in consensus consolidates all the IR comments for you to make it easy for you to do your job as a Category lead. Side note: Examiners might wonder what The Alliance for Performance Excellence is. Explain that it’s the association of regional, state and local Baldrige-based programs that work together to foster use of Baldrige. Quality Texas is a member of the Alliance, which manages Scorebook Navigator for its members.
91
Scorebook Navigator
92
Scorebook Navigator
93
Last Things…Almost!
- Be sure to complete your training surveys and forms!
- Please contact Quality Texas if you need help.
- Let's thank our Training Faculty!
- Please take a few moments to clean up your areas.
- Please remember to post your Plus/Delta comments!
- Have a safe trip home!
94
CONGRATULATIONS! It's time to celebrate your successful completion of Examiner Training! Call names, distribute certificates, and take class photos.
95
We appreciate YOU! Welcome to the Quality Texas Family
Distribution of certificates – all should stay for class photo
96
Always thank the students – they’re all volunteers!