2015 Pre-Examiner Training and Preparation Course


2015 Pre-Examiner Training and Preparation Course: The Journey Begins (Continues)!

Before We Begin… Logistics: safety, location of exits, use of phones (please place on stun), restrooms, parking lot. First, some logistics. It is very important that we know two key things: where the exits are located in case of emergency, and where the restrooms are located. [Give locations.] Smoking is not allowed in the building. [Give locations of smoking areas, if any.] As with all meetings, we ask that you turn your cell phones and pagers to vibrate, or preferably off. If you need to keep your ringer on for emergency reasons, please let us know. Explain the parking lot, which is posted on the wall.

Complete Training Evaluation Forms AND Plus/Delta. Set up a flip chart at the appropriate exit point. Ask examiners to note plus/delta items on Post-it notes and stick them on the flip chart in the appropriate column as they exit the room at the end of the day.

Quality Texas Meeting Behaviors (Ground Rules): Be respectful; share openly; have only one conversation at a time; mute phones; return from breaks on time; take care of personal needs; clean up after yourself; stay on point; work toward consensus; understand that silence means affirmation; employ "ELMO" (Enough, Let's Move On) as needed; have fun. These meeting behaviors have been developed as part of the leadership system for Quality Texas. They represent the kinds of behaviors we expect from our leaders and from each other.

Learning Outcomes: Gain an understanding of the Framework; review and apply the Six-Step Independent Review Process to an Award application, including choosing Key Factors, identifying Strengths/OFIs, scoring, and selecting Key Themes; write better feedback comments using the Comment Guidelines; improve your BRAND!

Course Overview. Tuesday: quick review of the Baldrige Framework; review the Organizational Profile and Key Factors; discuss the Six-Step IR Process; review Item 1.1. Wednesday: Six-Step IR Process; Items 2.1 and 5.2. Thursday: Six-Step IR Process; review Items 7.1 and 7.5; Scorebook Navigator class.

Introductions – At Your Table. Who you are: name, city. What you do: organization, job title. Experience with QTF or the Baldrige Criteria. Your expectations of this training. NOTE to facilitators: It's important not to lose time early in the day, so be quick in running this section. Introduce yourself and your co-facilitators, as well as any Quality Texas personnel present. Given the anticipated size of the classes, to save time, have the examiners make introductions at their tables rather than across the entire room. After table introductions are complete, discuss expectations with the entire group.

About Quality Texas – Laying Concrete

Quality Texas Foundation
Mr. Ryan Gonzales, Director of Operations – ryangonzales@quality-texas.org; (512) 940-8282
Ms. Lin Wrinkle, Director of Administration – linwrinkle@quality-texas.org; (512) 818-3901
Dr. Mac McGuire, Chief Executive Officer – drmac@quality-texas.org; (512) 656-8946
201 Woodland Park, Georgetown, Texas 78633-2007 – www.quality-texas.org

Key Facts About Quality Texas Foundation. National concept under President Reagan, 1987-1989; named for Malcolm "Mac" Baldrige, Secretary of Commerce, who died in a rodeo accident in 1987. State concept 1990; started 1992; endorsed by then-Governor Ann Richards; huge kickoff in Houston in November 1992. Founded in 1994 as a 501(c)(3) not-for-profit corporation. Full-time staff of two people, one part-time person, and around 300 statewide volunteers. The Quality Texas Foundation is a nonprofit 501(c)(3) corporation that evolved from a concept introduced by Governor Ann Richards in 1990. Cooperative efforts between the Governor's office, the Texas Department of Commerce, and Texas businesses made it possible for the new Quality Texas Foundation to organize and deliver quality-awareness seminars across the state. In the first two years, seminars were presented to 1,800 individuals representing more than 700 organizations. The QTF is endorsed by the governor's office. At the same time, EDS Corporation assigned an executive to lead development of the state quality award. A committee was formed with representatives from organizations across Texas. The committee created the Texas Award for Performance Excellence, open to government, education, healthcare, nonprofit, and business organizations. The American Productivity & Quality Center was chosen as the original award administrator, and applications were first accepted in 1993. There are currently two full-time staff working for the QTF: a Chief Executive Officer and a Director of the Performance Excellence Program. The work is primarily done through 300 volunteers across the State of Texas. The organization is funded through corporate and individual donations, training, and award participant fees. QTF is recognized as one of the best state award programs, with more national-level winners than any other state.

Key Facts About Quality Texas Foundation. Funded through memberships and sponsorships, customized training workshops, Performance Excellence Program participant fees, and the annual Texas Quest Conference. Recognized as the most effective state program, with more Baldrige recipients than any other state (19), and 52 state winners.

QTF's Vision and Mission. Vision: The Quality Texas Foundation, the preeminent state program, will continue to lead the way nationally in innovation by establishing Communities of Excellence (CoE) throughout the state. Mission: QTF exists to assist individuals and organizations in their continuous improvement efforts, thereby positively impacting our communities, state, and nation.

Services We Provide. Assessment, feedback, recognition, and awards. Examiner training (you are here!). Applicant training (how to apply; how to write; how to assess; site visit training; CoE). Training and coaching solutions: customized training and coaching based on customer needs within their organization, plus public training workshops. Membership levels with various benefits per level, including corporate/individual options with discounts.

Why Organizations Apply. Outside evaluation of organizational goals, objectives, and values. Improved financials and employee engagement. Helps build a common, holistic, and systematic view of the organization. The Framework provides a common language and a standardized method to examine processes and performance. Receive objective feedback. Track progress. Get to the "next level" of performance. Foster benchmarking within and across industries.

Six Categories – ADLI; One Category – LeTCI (updated June 2015). This is a graphic representation of the stages of evaluation for a Quality Texas application.
Core Values: systems perspective; visionary leadership; customer-focused excellence; valuing people; organizational learning and agility; focus on success; managing for innovation; management by fact; societal responsibility; ethics and transparency; delivering value and results.
Assessment process: Independent Review (attend training; read the application; review the Criteria; identify Key Factors; identify Strengths and OFIs; score); Consensus (Category Champion leads the consensus discussion and scoring; consensus scorebook; determination of site visit issues); Site Visit (training; verify/clarify; feedback report).
Deliverables: feedback report for the judges (Award) and the applicant (all levels); recommendation to the Board of Directors (Award level only).
Next steps: judges' review (Award); Board of Directors decisions (Award); feedback to the applicant (all levels); annual awards ceremony (all levels).
Resources: Criteria books, the application, the QTF website, QTF staff, training, the Scoring Guidelines, Scorebook Navigator, and the MBNQA website.
Review each wave and its key components. Identify the Core Values, which will be discussed later during the training. Review the "waves" of the assessment process: Independent Review (read the steps from the bottom up), Consensus Review, and Site Visit. NOTE: There's a copy of this slide and the following one on page 3 of the Examiner Prep Guide.

Performance Excellence Application Levels. Scoring factors: Process = ADLI (Approach, Deployment, Learning, Integration); Results = LeTCI (Levels, Trends, Comparisons, Integration). All levels receive a feedback report of Strengths and Opportunities for Improvement.
Engagement: Organizational Profile + 10-page application; no site visit; feedback report (ADLI basic; LeTCI).
Commitment: Organizational Profile + 20-page application; no site visit; feedback report (ADLI overall; LeTCI).
Progress: Organizational Profile + 30-page application; site visit if purchased by the applicant; feedback report (ADLI, some multiple requirements; LeTCI).
Award: Organizational Profile + 50-page application; site visit; feedback report (full ADLI; LeTCI).
There is also a new Beginning level (Organizational Profile only).
Resources: Criteria, Glossary, training, Scorebook Navigator, application. Applications are received October-March (5 cycles).
The focus of writing an application is to receive feedback for improvement. Quality Texas has four levels of applications, which reflect an organization's progress on its improvement journey. The Engagement level is for organizations early in their improvement journey: a 10-page application that addresses basic requirements and performance levels. The Commitment level is for organizations a little farther along and has a 20-page application that explores the organization's approaches and its levels of performance. The Progress level is for maturing organizations and has a 30-page application that examines approaches and deployment as well as levels and trends of performance. The Award level is for organizations that have been actively working to address the Criteria fully; it is a 50-page application, which may result in a site visit. This application examines approach, deployment, learning, and integration, and looks at levels, trends, comparisons, and integration of results. You'll note that all four levels of application require an Organizational Profile; we'll review what that is, and its importance to you as an examiner, later this morning. ALL FOUR LEVELS OF APPLICATION RESULT IN WRITTEN FEEDBACK FOR IMPROVEMENT! NOTE: This is a good time to stop and ask for questions or to clear up any confusion; remember, you're dealing with new examiners here who haven't yet gone through the process.

Examiners' Roles and Responsibilities. Provide analysis and feedback to lead the organization to the next level of maturity. Examination process team members: Examiners, Senior Examiners, Scorebook Editor, Back-up Team Leader, Team Leader, Process Coach. Each team also has a Subject Matter Expert or two (who can serve in any role). This slide provides an overview of team composition. Please stress that examiners are not alone; we are all invested in everyone being successful, so see all members of the team as your teammates. Ask examiners to explain the following roles: Senior Examiner (anybody who has served one year as an examiner and been on a team); Scorebook Editor (team member who performs edits and submits the final feedback report to QT); Back-up Team Leader (team member who helps the Team Leader); Team Leader (team member responsible for the team, its deadlines, and delivering a strong feedback report to QT); Process Coach (team member who focuses on the processes the team uses; the only member of the team who does not do an Independent Review); Subject Matter Expert (team member, serving in any role, who has work experience in the field of the business being reviewed; not every team member will come from the applicant's sector).

Benefits of Being an Examiner. Learn the Quality Texas/Baldrige Performance Excellence Criteria through training and experience. Learn validated best practices from leading organizations across all industry sectors. Expand your professional network; improve YOUR brand; gain a valuable professional credential. Develop assessment, analysis, writing, teamwork, and leadership skills. Give back to the community, state, and nation by helping organizations be successful through useful feedback (patriotism). The Quality Texas staff and Training Faculty are dedicated to creating a positive experience for you so that you can reap the benefits of being a Quality Texas Examiner.

Conditions of Involvement: absolute essentials to the credibility, success, and prestige of the assessment and feedback process. Commit to the entire process (until the feedback is presented to the applicant). Maintain confidentiality. Complete the training and case study. Honor time commitments. Represent Quality Texas (not your organization). Follow the Code of Conduct; avoid conflicts of interest. Stress that the success of the program depends on their commitment to see it through! Since Quality Texas relies on its volunteers, the examiners, to create the feedback report, each examiner has to adhere to timelines and guidelines so that our customers, the applicants, get the much-desired feedback report. Ask the tables to discuss for 1-2 minutes what happens if any of these are not done. (The answer should be that the feedback report, site visit, etc. will be compromised and the product will not be delivered, or its quality will be suspect.)

Value for the Examiner/Sponsor. Work with a diverse team; reach consensus. Network with other quality/business experts to build professional friendships. Understand and apply the Baldrige Framework to a variety of organizations (including your own?). Develop analytical and consensus-building skills. Attend the annual Texas Award Banquet and Conference at a discounted rate (June 2016, Holiday Inn Riverwalk, San Antonio, TX). Special recognition at the conference. While there is certainly value for the organization that will receive the feedback report, there is also value to YOU as an examiner and to the organization that sponsored you. Please read through these and choose the one that you think is the most valuable. Ask a few examiners to share which they think is most valuable.

Examiner Career Path (Position – Time – Experiences):
Examiner – 1-3 years – BU TL, FBW, 1 SV, examiner training each year
Senior Examiner – 4-6 years – TL, 2+ SV, examiner training each year
Alumni Examiner – 6+ years – TL, Process Coach, 3+ SV, examiner training each year, various committees
Judge – TL, Process Coach, 3+ SV, examiner training each year, various committees
Board of Directors – corporate sponsorship, TL, Process Coach, 3+ SV, examiner training each year, various committees
Fellow – 10+ years – dedication and continual volunteerism for QTF issues

QTF EXAMINATION & FEEDBACK PROCESS: INDEPENDENT REVIEW. Here's where the independent review stage fits into the overall assessment process.

QTF EXAMINATION & FEEDBACK PROCESS: CONSENSUS REVIEW. Here's where the consensus review stage fits into the overall assessment process.

QTF EXAMINATION & FEEDBACK PROCESS: SITE VISIT. Here's where the site visit stage fits into the overall assessment process.

Walk Through the Baldrige Framework Booklet. Take a look at your Baldrige Framework book.

Baldrige Excellence Framework (page references): Criteria: systems perspective – p. 1; basic, overall, and areas to address – p. 2; point values per Category/Item – p. 3; Organizational Profile – pp. 4-6; Categories 1-7 – pp. 7-29; scoring system – pp. 30-33.

Baldrige Excellence Framework (continued): Process Scoring Guidelines – p. 34; Results Scoring Guidelines – p. 35; responding to the Criteria – pp. 36-38; Core Values and Concepts – pp. 39-43; changes from 2013-2014 – pp. 44-46; Glossary of Key Terms – pp. 47-54.

QTF Levels (application page counts, each plus a 5-page Organizational Profile; each level of recognition leads to the next, up to the Award level):
Award level – 50 pages + 5 (OP)
Progress level – 30 pages + 5 (OP)
Commitment level – 20 pages + 5 (OP)
Engagement level – 10 pages + 5 (OP)
Beginning level – 5 pages (OP only)

How Do I Evaluate Process Items? Process Items are evaluated using four factors: Approach, Deployment, Learning, and Integration (ADLI). Now that we've spent some time looking at the Criteria, let's turn our attention to the evaluation of process Items. You may recall that the Items in Categories 1-6 address an organization's processes. These four factors represent the dimensions used in analyzing any particular process. Each of these terms can be found in the Glossary. They have precise meanings in the context of the Baldrige framework, so we will spend some time exploring each one. You will also see these factors in the Scoring Guidelines. NOTE: All discussion of results and LeTCI has been moved to the second day. You can mention that here if you wish.

Approach (A). "Approach" comprises: the methods used to carry out the process; the appropriateness of the methods to the requirements; the effectiveness of the methods; and the degree to which the approach is repeatable and based on reliable data and information (i.e., systematic). Approach refers to the methods used by an organization to address the TAPE Criteria Item requirements in Categories 1-6. Approach is where it begins: without a clear approach, deployment has nothing to build on. Approach includes the appropriateness of the methods to the Item requirements and the effectiveness of their use. Remember that the Criteria do not prescribe what approaches an organization should have; rather, you evaluate the effectiveness of those approaches based on what the organization tells you is important to it. Group discussion: looking at the green sheet, when does the approach score go up? As an examiner, you will determine whether the applicant describes a process/approach. Then you will evaluate the degree to which the applicant's approach is systematic; that is, repeatable and based on reliable data and information. Once we have determined that the applicant has an approach, we can look at deployment. ACTIVITY: Read a section and ask about each evaluation factor as follows: read 1.1a(1); identify key concepts related to approach; ask what is missing; look at the written comment to see how the analysis leads to the comments; for OFIs, ask how the applicant could improve.

Deployment (D). "Deployment" is the extent to which: your approach addresses Item requirements relevant and important to the organization; your approach is applied consistently; and the approach is used by all appropriate work units. Go over these questions that examiners should ask themselves: Is the approach used in all locations? Is it used by all appropriate personnel, work units, and departments? Is it used consistently?

Learning (L). Learning comprises: the refinement of the approach through cycles of evaluation and improvement; the encouragement of breakthrough change through innovation; and the sharing of refinements and innovations with other work units and processes in the organization. Go over these questions that examiners should ask themselves: Has the approach been evaluated and improved, in a systematic manner? Is there evidence of breakthrough change, or innovation? Is there evidence that learning has been shared with other organizational units and/or partners, as appropriate, so that they might benefit from it? Is there evidence that learning is embedded in how the organization operates?

Integration (I). "Integration" is the extent to which: your approach is aligned with organizational needs (OP/processes); your measures, information, and improvement systems are complementary across processes and work units; and your plans, processes, results, analyses, learning, and actions are "harmonized" across processes and work units to support organization-wide goals. Go over these questions that examiners should ask themselves: How well is the approach aligned with organizational needs (i.e., those identified in the Organizational Profile and process Items)? Are the measures, information, and improvement systems complementary across processes and work units? How well does the approach align and integrate with other approaches to support achievement of organizational goals?

Results: 45% of the Total Score (LeTCI). We've covered the evaluation factors you'll consider when assessing the process Categories, Categories 1 through 6. But there's one more Category to discuss, and it's a big one. Results count for a whopping 45% of the score in an application (in the Baldrige point structure, 450 of the 1,000 available points), so it's important that we assess them properly. When we get to results, we're no longer looking at ADLI. Instead, we're looking at some different factors: Levels, Trends, Comparisons, and Integration.

Results Evaluation Factors: LeTCI. Levels: current performance on a meaningful measurement scale. Trends: numerical data that show the direction and rate of improvement (slope over time). Comparisons: your performance relative to that of appropriate other organizations, such as competitors or organizations similar to yours, and relative to industry leaders or benchmarks. Integration: measures (with segmentation) that address important performance requirements relating to customers, products/services, markets, processes, and action plans identified in your OP and in process Items; projections of future performance; and harmonization across processes and work units to support organization-wide goals. This slide shows what those new factors are. Levels: current performance. Trends: performance over time (emphasize that you should expect to see at least three data points in order to discern a trend). Comparisons: it's expected that the applicant will present appropriate competitive and comparative data for many of its measures whenever such comparisons exist; this helps us gauge the applicant's leadership position. Integration: many measures are asked to be defined in Categories 1-6. The applicant gets to define what's important to measure, within reason. Are the measures presented in Category 7 the most important measures for the applicant and its industry, or are they just high-performing fluff? Segmentation: results should be segmented by appropriate groups, as the applicant has defined them in the Organizational Profile (workforce segments, product/service lines, customer types, etc.). More details on segmentation are on the following slide. A minimal worked example of reading a trend follows below.
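To make "direction and rate of improvement" concrete, here is a minimal sketch. It is not part of the QTF process, and the delivery-rate figures are hypothetical; it simply shows why at least three data points are needed before a slope over time means anything.

```python
# Illustrative sketch only: a trend read as a least-squares slope over
# at least three equally spaced (e.g., yearly) observations.

def trend_slope(values):
    """Least-squares slope of equally spaced observations."""
    n = len(values)
    if n < 3:
        raise ValueError("need at least three data points to discern a trend")
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Hypothetical applicant measure: three years of on-time delivery rate (%).
yearly = [88.0, 91.5, 94.0]
slope = trend_slope(yearly)
print(f"direction: {'improving' if slope > 0 else 'flat or declining'}, "
      f"rate: {slope:.1f} percentage points per year")  # -> improving, 3.0
```

Two data points can always be joined by a straight line, which is why examiners look for three or more before calling something a trend.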

Segmentation. Segments can be defined by, among other things: customers (students/families for education; patients/families for healthcare); market or product offerings; location; workforce group (employees, tenure, administrative, hourly, etc.); and the size of the group in question. The applicant defines its segments (in the OP); we assess the results for those segments. Remind the examiners that the applicant will define most of its appropriate segments in the Organizational Profile, as shown here. These segments then become prime candidates to include as Key Factors in results Items. A small illustration follows below.
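A tiny illustration (hypothetical numbers, not from any applicant) of why segmented results matter: an aggregate figure can hide a lagging segment that the OP says is important.

```python
# Hypothetical satisfaction results, segmented by workforce groups the
# applicant defined in its Organizational Profile.
satisfaction_by_segment = {
    "teachers": 0.92,
    "administrative staff": 0.90,
    "hourly staff": 0.74,  # lags the other segments
}

# Simple unweighted average for illustration (a real overall figure would
# weight each segment by its size).
overall = sum(satisfaction_by_segment.values()) / len(satisfaction_by_segment)
print(f"overall: {overall:.0%}")  # ~85% looks healthy in aggregate

# Segmentation surfaces the gap an examiner would comment on:
for segment, score in satisfaction_by_segment.items():
    flag = "  <- potential OFI" if score < 0.80 else ""
    print(f"{segment}: {score:.0%}{flag}")
```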

Comparisons. Comparisons can be drawn from, among other things: inside the industry; competitive comparisons; and outside the industry.

QTF/Baldrige Examination Process TRUST THE PROCESS! (KEY TEACHING POINT)

Step 1: Read the Criteria. Read the Baldrige Framework for the Award level; read the QTF Beginning, Engagement, Commitment, or Progress level Criteria otherwise.

Step 2: Determine the Most Relevant Key Factors. Four to six Key Factors, taken from the OP, the eligibility form, or the application.

Step 3: Read and Analyze the Application. Read the application; mark it up as appropriate.

Step 4: Identify Strengths/OFIs. Around six comments total, split between Strengths and OFIs.

Step 5: Write Feedback-Ready Comments. Remember NERD! N – nugget of importance; E – evidence/example to support the comment; R – relevance to the applicant; D – done!

Step 6: Determine the Scoring Range and Score. Best fit; don't block a winner; the tie goes to the applicant.

Getting Ready to Learn – Welcome. Scoring: not like your previous education! Go to Process Scoring, page 34.

UNDERSTANDING SYSTEMATIC APPROACH

UNDERSTANDING DEPLOYMENT

UNDERSTANDING IMPROVEMENT/LEARNING/INNOVATION

UNDERSTANDING INTEGRATION

Go to Results Scoring, page 35. What is different about results scoring?

Questions to this Point??

So let’s begin our Examination/Evaluation

Importance of the Organizational Profile. The Reader's Digest version of the application. Sets the stage for what the applicant says is important. Frames our comments (feedback report), Strengths/OFIs, scoring, and Key Themes. Assists with our scoring band descriptors (Award). Do not read into it or take away from what is stated.

How to Determine Key Factors. Review what the applicant says is important in the eligibility form. Review the OP. Be aware, throughout the application, of other key factors that are discussed. Ask: what is really important? Do not rewrite the OP.

Exercise: Determine Key Factors (45 minutes total)
1. Select a scribe, timekeeper, and reporter (1 min)
2. Discuss at your table, as a group, your thoughts on the key factors for this applicant from the Organizational Profile (10 min)
3. Highlight Key Factors from the OP and select a few to write down; table anchors guide this process
4. Record the key factors on chart paper (20 min)
5. Report out, 1 minute per table (5 min)

Step 1: Read the Criteria. This is just a quick set-up for the exercise on the following slide. It's a good place to ask returning examiners about some of their techniques for this step. Remind the class that nobody has the Criteria memorized; it's a good idea to refresh yourself on what is there, and what is not.

Exercise: Step 1 – Read the Criteria (60 minutes total)
1. Select a scribe, timekeeper, and reporter (1 min)
2. Discuss at your table, as a group, your thoughts on the key requirements for Item 1.1 from the Baldrige Framework manual
3. Review individually your assigned Criteria Item [1.1a(1), (2), (3), 1.1b(1), (2)]
4. Record the requirements on chart paper (30 min)
5. Report out, 1 minute per table (10 min)

Step 2: Determine the Most Relevant Key Factors. This is just a quick set-up for the exercise on the following slide. It's a good place to ask returning examiners about some of their techniques for this step. Question for the group: Why 4-6 Key Factors? Why not more or fewer? What happens to your comments if you choose 8 Key Factors, for example?

Exercise: Step 2 – Determine the Most Relevant Key Factors (45 minutes total)
1. Select a scribe, timekeeper, and reporter (1 min)
2. Review individually your assigned Criteria Item [1.1a(1), (2), (3), 1.1b(1), (2)] (5 min)
3. Discuss at your table as a group and come to agreement on the 4 to 6 key factors most relevant to your assigned Item (20 min)
4. Write your selected key factors on a flip chart for use in subsequent exercises, capturing key words and phrases (10 min)
5. Report out, 1 minute per table
Facilitator's note: Have the examiners turn to page 8 in the Examiner's Guide; the full instructions are there. Also remind them that the Key Factors from the Baldrige case study begin on the next page. They will refer back to these over the next three sets of exercises. For the report out, contrast the observations of tables that had the same Item. Explain why identifying different Key Factors may lead examiners to come up with different comments in their Independent Review than their teammates.

Step 3: Read and Analyze the Application. This is just a quick set-up for the exercise on the following slide. It's a good place to ask returning examiners about some of their techniques for this step. In particular, emphasize that examiners should at least skim the entire application before diving into Item-level evaluation.

Exercise: Step 3 – Read and Analyze the Application (45 minutes total)
1. Select a scribe, timekeeper, and reporter (1 min)
2. Review individually your assigned Criteria Item [1.1a(1), (2), (3), 1.1b(1), (2)] (20 min)
3. Discuss at your table, as a group, your observations on the applicant's response against the Criteria questions
4. Report out, 1 minute per table (5 min)
Facilitator's note: Have the examiners turn to page 14 in the Examiner's Guide; the full instructions are there. For the report out, just ask for some input from experienced examiners on their techniques for reading and analyzing an application.

Step 4: Identify Strengths/OFIs. This is just a quick set-up for the exercise on the following slide. It's a good place to ask returning examiners about some of their techniques for this step. "C" = Comments; "SG" = Scoring Guidelines (i.e., refer to the Scoring Guidelines for possible language to use in the comment to indicate the applicant's level of maturity).

Exercise: Step 4 – Identify Strengths/OFIs (45 minutes total)
1. Select a scribe, timekeeper, and reporter (1 min)
2. Use a round-robin approach to discuss and highlight all potential Strengths/OFIs identified by your table mates [1.1a(1), (2), (3), 1.1b(1), (2)] (15 min)
3. From these, discuss at your table and select as a group a total of around two Strengths and OFIs that you feel are most important for the applicant
4. Record these on chart paper (5 min)
5. Report out, 1 minute per table (10 min)
Facilitator's note: Have the examiners turn to page 15 in the Examiner's Guide; the full instructions are there. For the report out, contrast the observations of tables that had the same Item. Discuss why these differences might lead to different scores in Step 6.

Feedback Comments: Our KEY Product for the Applicant!

What Feedback-Ready Comments Should Do: let the applicant know what it does well, and what it needs to improve to take it to the next level. (1) Criteria requirements → an equitable assessment. (2) The particular organization → meaningful feedback. (3) Assessment factors (ADLI/LeTCI) → insights on organizational maturity. Feedback comments are important to your customers. They validate strengths and provide good feedback on how to take the organization to the next level and continually grow. By following the Criteria requirements, we ensure that all applicants get a fair assessment against the Criteria. By considering the applicant's Key Factors in constructing and prioritizing our comments, we make the feedback more meaningful and useful to them. By using ADLI in our assessment, we help the applicant understand its current level of maturity and what it might take to reach higher levels of performance.

Six Feedback Comment "Musts": (1) polite, professional, and non-prescriptive; (2) based on the Criteria, Key Factors, and Scoring Guidelines; (3) addresses one topic per comment; (4) begins with a "nugget" that shows why this is important to the applicant; (5) includes evidence/examples and relevance (actionable information based on the Scoring Guidelines, Key Factors, Core Values, and Criteria notes); (6) aligns with and supports the score. Review briefly the six "musts" for a comment.

Well-Written Comments: "NERD". N – Nugget: an up-front statement based on the Item Criteria or a Key Factor (i.e., it explains why this is important to the applicant). E – Evidence or Example: state what the applicant has (Strength) or what is missing or not addressed (OFI). R – Relevance: why is this comment important? If an OFI, include actionable (but not prescriptive) information that will help the applicant improve. D – Done: read the comment to see that it makes sense and helps the organization understand its current status. Sum up comment-writing tips by talking about "NERD." A well-written process OFI comment has these elements: an acknowledgement if the applicant does have an approach in place (this is optional, but it provides a nice lead-in; note to facilitators: this may be an appropriate place to caution about strengths embedded in OFIs but not listed as separate strengths, since doing this tends to drive scores down); a description of the gap(s) in the applicant's approach/deployment, based on whether it addresses the Item requirements and is systematic, effective, and deployed; an explanation of the "so what," i.e., why it is important to the applicant's success based on the Item Criteria or a Key Factor; and a description of actionable information that may help the applicant move forward. All Strengths, OFIs, and rationale are valuable to the applicant. "So whats" and actionable information are the added value we provide to help the applicant move forward.

Comment Writing: Elements of a Well-Written Comment. Now for an incredibly important part of this training: comment writing. NOTE: We've expanded this section and put it at the beginning of Step 5 in the process to be just-in-time and to emphasize the importance of well-written comments.

The Feedback Report: The Product. The report should be: useful to the applicant (comments should be clear and actionable, and focused on the applicant's most important key factors); encouraging (not adversarial; it's not an audit; it should help the applicant reach the next level and make the applicant an advocate for the program); and respectful (polite in tone, not judgmental or prescriptive). Review these points. Ask if there are any examiners in the room whose organization has applied to Quality Texas at any level (or to Baldrige). If so, ask them their thoughts on the importance of a useful feedback report.

How to Give Useful Feedback.
Do: base your comments on the Criteria; reference the evaluation factors; include an opening "nugget" to give the comment significance; keep each comment to a single issue; make Key Factor references; be accurate and check your statements; recognize the page limitations; give the benefit of the doubt; be polite in tone; include a few examples; check spelling and grammar.
Don't: be judgmental or prescriptive; forget linkages; forget the Key Factors; stray from the Criteria; make conflicting Strength and OFI statements.
Don't forget to: check your score against the balance and content of your comments; check your facts; check grammar and spelling; remember that you're providing a service.
Ask participants to read through the "do" list and choose two that they believe are critical to providing feedback. Take a show of hands for the top two. Do the same for the "don't" list.

Sample Process Strength [2.2a(4)]: "To mitigate the strategic challenge of competitors wishing to hire its engaged workforce, the applicant ensures that workforce plans support any needed changes. For example, through data and budget analysis and surveys, the applicant takes a proactive approach to workforce capacity and allocates instructional staff to areas of greatest need through 'vertical teamwork.'" (53 words, 372 characters with spaces.) Read the comment and identify the nugget, the relevance, and the examples. Note that there is only one point of relevance; the examiner chose to put it at the beginning of the comment, having determined that this is where it has the most impact: it "personalizes" the comment for the applicant. Could you have improved on this comment? If so, how? The examiners chose the one example that best makes their point; depending on the comment, they might choose to include more. What evaluation factors are the examiners citing as a strength? (A, citing an approach; maybe I, since it addresses a strategic challenge?)

Sample Process OFI [4.2a(1)]: "It is not clear how the applicant systematically transfers knowledge specific to the needs of parents and volunteers in support of the PhilP that all are accountable for student performance. For example, parents and volunteers do not appear to be included in teachers' grade-level discussions, and parents do not appear to have access to teachers' online forums, blogs, and classroom support server (Figure 4.2-1)." (64 words, 415 characters with spaces.) Identify the nugget, relevance, and examples. Note the order: why do you think the examiners chose this order? What evaluation factors are the examiners citing? (Lack of A.) Could you have improved on this comment? If so, how?


Sample Results Strength [7.3a(1)]: "Strong results for key measures of workforce capability and capacity may help strengthen the engagement factor of having sufficient resources to get the job done. Rates of certification (Figure 7.3-1) and student-teacher ratio (Figure 7.3-2) have improved over the periods shown, reaching 100% or close to 100% for all segments, and the student-teacher ratio in elementary and middle schools compares favorably to that of a Baldrige Award winner." Identify the nugget, relevance, and examples. Note the order: others are possible. What evaluation factors are the examiners citing? (T and C; note that these don't have to be in the nugget.)

Sample Results OFI [7.2a]: "Results related to key student requirements, such as stimulating creative thought, treating students fairly, and maintaining a safe school, are missing. Tracking such results may give the applicant insights into how to retain families and how to attract families to the district." Identify the nugget, relevance, and examples. Note the order: others are possible. What evaluation factors are the examiners citing? (Missing results.)

Step 5: Write Feedback-Ready Comments. So, with those examples out of the way, let's have you try your hand at developing fully formed, ready-to-present feedback comments.

Exercise: Step 5 – Write Feedback-Ready Comments (55 minutes total)
1. Select a scribe, timekeeper, and reporter (1 min)
2. Draft a feedback-ready Strength and OFI (2 groups) (15 min)
3. Give feedback to table mates
4. Redraft the comment using their input (10 min)
5. Record the final feedback-ready comment on chart paper
6. Report out, 1 minute per table
Facilitator's note: Have the examiners turn to page 19 in the Examiner's Guide; the full instructions are there. All facilitators should spend time at each table coaching the examiners on their comment writing. We have allowed sufficient time in the agenda to get some decent comments developed in this step. The objective is to have truly feedback-ready comments produced at the end of each of the four Step 5 exercises we will do this week, so don't rush completion of this step. Test the final feedback-ready comments with the class: Is the comment understandable? Within the bounds of the Criteria? Polite in tone? Specific enough to be actionable by the applicant? Does it use Criteria and/or scoring language as appropriate so the applicant understands its significance? Does it address the evaluation factors (ADLI or LeTCI)? Does it help the applicant move to higher levels of performance? Does it avoid being prescriptive (outside the Criteria) or judgmental?

Introduction to Scoring. Use the Process Scoring Guidelines and the Results Scoring Guidelines; always use the Glossary for definitions. Process scores address Approach, Deployment, Learning, and Integration (ADLI). Results scores address Levels, Trends, Comparisons, and Integration (LeTCI). Facilitator's note: This is a brief introduction to scoring; we'll spend a bit more time on it tomorrow. The reason it's brief is that examiners, both new and returning, often agonize too long over scoring; pushing them to come to a quick conclusion forces them to get over this mental block. Another key ingredient of a successful examiner is understanding how to score an application. Trainers: hold up the scoring sheet (one side is Process, the other Results) or point out pages 34 and 35 in the Criteria booklet. Ask participants to turn to the scoring guide in the Category booklet. Have the participants follow along as you show them where approach, deployment, learning, and integration are scored. Then explain that process and results are scored differently; they even have different criteria! Results are on the back (make sure participants are on the right page) and evaluate levels, trends, comparisons, and integration. Explain levels (a point on a scale), trends (at least three data points), comparisons (with other entities), and integration. Discuss (and get input from returning examiners) good and bad practices for determining scores.

Step 6: Determine the Scoring Range and Score. This is just a quick set-up for the exercise on the following slide. Point out that the graphic shows two things: first the range, then the score within that range. Scoring tends to go quicker if you first determine the most appropriate range. A hedged sketch of this two-step idea follows below.
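To illustrate the two-step idea, here is a minimal sketch. It is illustrative only: the band boundaries match the ranges cited elsewhere in this training (e.g., 30-45% and 50-65%), but the maturity labels are paraphrases rather than official Scoring Guidelines text, and the multiples-of-5 convention is an assumption.

```python
# Step 1: pick the best-fit scoring band; Step 2: pick a score within it.

SCORING_BANDS = [
    (0, 5, "no systematic approach evident"),
    (10, 25, "beginning of a systematic approach"),
    (30, 45, "systematic approach responsive to basic requirements"),
    (50, 65, "systematic approach responsive to overall requirements"),
    (70, 85, "refined approach responsive to multiple requirements"),
    (90, 100, "refined, well-integrated approach"),
]

def pick_score(band_index: int, position: float) -> int:
    """Return a score inside the chosen band.

    position runs from 0.0 (barely in the band) to 1.0 (top of the band);
    the result is rounded to a multiple of 5 (assumed reporting convention).
    """
    low, high, _label = SCORING_BANDS[band_index]
    raw = low + position * (high - low)
    return int(round(raw / 5) * 5)

# Example: the team agrees the Item best fits the 30-45% band ("the tie
# goes to the applicant" when hovering between bands) and sits mid-band.
print(pick_score(2, 0.5))  # -> 40
```

Deciding the band first keeps the debate anchored to the maturity descriptors; haggling over single points within a band matters much less than choosing the right band.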

Look at the Scoring Guidelines in the Baldrige Framework: page 34 (Process) and page 35 (Results).

Exercise: Step 6 – Scoring (35 minutes total)
1. Select a scribe, timekeeper, and reporter (1 min)
2. Using your comments from Step 4 and the Scoring Guidelines, discuss as a group the most appropriate scoring range for your assigned Item (10 min)
3. Next, discuss the appropriate score within that range
4. Record the scoring range and score on chart paper and be prepared to discuss why you chose that score
5. Report out, 1 minute per table (5 min)
Facilitator's note: Point the examiners to page 21 in their workbooks; the detailed instructions are there, and remind them to use the Process Scoring Guidelines. FYI: Since everyone likes to know how their score matches up with that of the Baldrige TST, here are their scores: for 2.1, the 30-45% range; for 6.2, 50-65%.

Follow the Six-Step Process. Complete for 2.1 [2.1a(1), (2), (3), (4) and 2.1b(1), (2)]. Complete for 5.2 [5.2a(1), (2), (3), (4) and 5.2b(1), (2), (3)]. Complete for 7.1 [7.1a, b(1), (2), c]. Complete for 7.5 [7.5a(1), (2)].

Round-Robin Feedback. What did you learn? Do you have the confidence to begin (or continue) the assessment stages? Any final questions or concerns? Final summary.


Class will begin at 8:30 a.m.! Please remember to sign in each day! Thanks and enjoy your evening!

Scorebook Navigator. Links to the IR and consensus manuals for Scorebook Navigator are on the Examiner Resources page. Suggested narrative: We won't cover the workings of Scorebook Navigator today. You already have some experience with it, since you used it to do your pre-work in preparation for Examiner Training. It's a fairly intuitive program to use, but like any software you don't use frequently, it's easy to forget the steps. We have two manuals on the Quality Texas website on the Examiner Resources page: one for Independent Review and one for consensus. Refer to these manuals "just in time" as you're ready to log on to Scorebook Navigator to begin work on your application. You'll see that in consensus the program consolidates all the IR comments to make it easy for you to do your job as a Category lead. Side note: Examiners might wonder what the Alliance for Performance Excellence is. Explain that it's the association of regional, state, and local Baldrige-based programs that work together to foster the use of Baldrige. Quality Texas is a member of the Alliance, which manages Scorebook Navigator for its members.


Last Things… Almost! Be sure to complete your training surveys and forms. Please contact Quality Texas if you need help. Let's thank our Training Faculty! Please take a few moments to clean up your areas. Please remember to post your Plus/Delta comments! Have a safe trip home!

CONGRATULATIONS! It's time to celebrate your successful completion of Examiner Training! Call names, distribute certificates, and take class photos.

We appreciate YOU! Welcome to the Quality Texas Family. Distribute certificates; all should stay for the class photo.

Always thank the students – they’re all volunteers!