
1 The Teaching Performance Assessment Consortium (TPAC) Andrea Whittaker, Ph.D. Stanford University September 2011

2 Agenda and Goals: update on the national project; Ohio policies and timeline.

3 Why Now? Blue Ribbon Panel – 10 Principles; PARCC and Smarter Balanced assessments.

4 Where TPAC fits in: TPAC is working to develop and implement at scale a way of assessing teaching that provides evidence of teaching effectiveness, supports teacher preparation program improvement, and informs policy makers about qualities of teaching associated with student learning. TPAC is ONE example of an assessment system that is designed to leverage the alignment of policies and support program renewal.

5 Accountability Reframed: How can we gather and use evidence of the qualities of teaching performance that inspire, engage, and sustain students as learners – to improve teaching and teacher preparation?

6 National Leadership: AACTE – overall project management, communication with programs. Stanford University – assessment development and technical support. Council of Chief State School Officers – policy development and support, communication with state education agencies (prior to March 2011).

7 Highlights of Pearson's Role in the TPA: Pearson has been selected as Stanford's operational partner. It will support Stanford and AACTE with assessment development and technical review; train and certify scorers; provide a scoring platform; and report results for the operational TPA.

8 Pearson's Role in the Field Test: development support for field testing; handbook and template publication; recruitment and training of scorers, scoring, and scorer compensation; benchmarking; reporting results; providing an electronic platform to manage TPA submissions.

9 Pearson's Role in Operational Use: Pearson will provide assessment services to deliver the TPA nationally and sustainably – web-based services that allow candidate registration, assembly of artifacts, faculty/supervisor feedback, final submission for official TPA scoring, and a score report; scoring services such as the recruitment, training, and certification of all scorers and scoring for all submitted TPA responses; reporting services such as the generation of all official score reports to candidates and institutions of record.

10 Partnering States

11 Standards and TPAC: Common Core alignment; InTASC alignment; NCATE/CAEP endorsement; SPA endorsement.

12 TPAC Lineage: National Board for Professional Teaching Standards (NBPTS) portfolio assessments – accomplished teachers; Connecticut BEST assessment system – teachers at end of induction; Performance Assessment for California Teachers (PACT) – pre-service teachers.

13 Role of K-12 Partners: NEA and AACTE affiliate state meetings; roles for cooperating teachers and school site principals; call for collaboration with IHEs.

14 State Policy Issues: Emerging recognition of state role and responsibility for educator effectiveness – states are revisiting policies and practices. TPAC is coming at this through: program improvement and accountability; the psychometric challenge – is the instrument usable?; informing policy development at critical levels, both program approval (measure of effectiveness) and initial licensure (candidate readiness).

15 State Policy Issues: States are realizing that valid, reliable, and predictive measures are critical to the success of any change, especially when student performance is the end objective. Early implementers: Washington, Minnesota, Tennessee, Illinois, Wisconsin, Ohio.

16 House Bill 1: Transfers responsibility for approving teacher preparation programs from the State Board to the Chancellor of the Board of Regents. Directs the Chancellor, jointly with the State Superintendent, to (1) establish metrics and programs for the preparation of educators and other school personnel, and (2) provide for inspection of the institutions. Through HB 1, Ohio is first in the nation to require a four-year induction program (Resident Educator).

18 Ohio Comprehensive System of Educator Accountability (flowchart):
Pre-Service Metrics – content knowledge: Praxis II; performance assessment: TPA. Not effective: more coursework or enter a different area of study. Effective: recommended for resident educator license.
Teacher Residency – formative assessment coupled with goal setting and coaching; annual summative assessment based on multiple measures of educator effectiveness, including student growth. Effective: recommended for Five Year Professional License. Not effective: PAR Program, then either continue with residency or employment terminated.
Annual Teacher Evaluation – formative assessments that inform PD and coaching support; annual summative assessment based on multiple measures of educator effectiveness, including student growth; performance outcome informs decisions on retention, dismissal, tenure, promotion, and compensation. Effective: continue as teacher. Not effective: PAR Program, then either continue as teacher or employment terminated.
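
The system above is essentially a chain of effective/not-effective decision gates. Purely as an illustration, here is a minimal sketch of the first two gates in Python; the Candidate record, function names, and boolean inputs are invented for this sketch, and the slide prescribes no implementation.

```python
# Illustrative sketch only: the effective/not-effective gates from the
# slide modeled as plain functions. Stage names and outcome labels come
# from the slide; everything else (Candidate, signatures, booleans) is
# hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    passed_praxis_ii: bool  # pre-service content-knowledge measure
    passed_tpa: bool        # pre-service performance assessment

def pre_service_gate(c: Candidate) -> str:
    """Pre-service metrics (Praxis II + TPA) gate entry to residency."""
    if c.passed_praxis_ii and c.passed_tpa:
        return "Recommended for resident educator license"
    return "More coursework or enter different area of study"

def residency_gate(effective: bool, par_successful: bool) -> str:
    """Residency summative rating; 'not effective' routes through PAR."""
    if effective:
        return "Recommended for Five Year Professional License"
    return "Continue with residency" if par_successful else "Employment terminated"

print(pre_service_gate(Candidate(passed_praxis_ii=True, passed_tpa=False)))
print(residency_gate(effective=False, par_successful=True))
```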

19 Ohio Alignment: The TPA has also been aligned to the Ohio Teacher Standards. Karen Herrington is working to align the TPA with the state/national standards alignment instrument that Ohio IHEs compiled in 2005-06.

20 Ohio’s LIneage Praxis III Assessment in Entry Year TeachingPraxis III Assessment in Entry Year Teaching Focus of Planning, Environment, Teaching for Learning and ProfessionalismFocus of Planning, Environment, Teaching for Learning and Professionalism Pathwise Training for Mentors assisting entry year teachers and incorporationPathwise Training for Mentors assisting entry year teachers and incorporation Transition of PIII to Resident Educator ProgramTransition of PIII to Resident Educator Program

21 TPA Architecture

22 Design Principles for Educative Assessment: discipline-specific and embedded in curriculum; student-centered – examines teaching practice in relationship to student learning; analytic – provides feedback and support along targeted dimensions; integrative – maintains the complexity of teaching; affords a complex view of teaching based on multiple measures.

23 TPA Architecture: a summative assessment of teaching practice; a collection of artifacts and commentaries; a "learning segment" of 3-5 days.

24 TPAC Artifacts of Practice:
Planning – instructional and social context; lesson plans; handouts, overheads, student work; planning commentary.
Instruction – video clips; instruction commentary.
Assessment – analysis of whole-class assessment; analysis of learning and feedback to two students; instructional next steps; assessment commentary.
Across all three tasks – daily reflection notes; analysis of teaching effectiveness commentary; evidence of academic language development.

25 Conceptual Framework of Assessment: What? – the candidate describes plans or provides descriptions or evidence of what the candidate or students did. So what? – rationale for plans in terms of knowledge of students and research/theory; explanation of what happened in terms of student learning or how teaching affected student learning. Now what? – what the candidate would do differently if they could do it over, next instructional steps based on assessment, and feedback to students.

26 Multiple Measures Assessment System: embedded signature assessments; observation/supervisory evaluation and feedback; child case studies; analyses of student learning; curriculum/teaching analyses; TPAC capstone assessment – integration of planning, instruction, assessment, and analysis of teaching, with attention to academic language.

27 Targeted Competencies:
Planning – planning for content understandings; using knowledge of students to inform teaching; planning assessments to monitor and support student learning.
Instruction – engaging students in learning; deepening student learning during instruction.
Assessment – analyzing student work; using feedback to guide further learning; using assessment to inform instruction.
Reflection – analyzing teaching effectiveness.
Academic Language – identifying language demands; supporting students' academic language development; evidence of language use.

28 Rubric Progression: early novice → highly accomplished beginner. Rubrics are additive and analytic. Candidates demonstrate: an expanding repertoire of skills and strategies; a deepening of rationale and reflection; teacher focus → student focus; whole class → generic groups → individuals.

29 Rubric Blueprint – Task name: Rubric Title. Guiding Question.
Level 1 – struggling candidate, not ready to teach.
Level 2 – some skill but needs more practice to be teacher-of-record.
Level 3 – acceptable level to begin teaching.
Level 4 – solid foundation of knowledge and skills.
Level 5 – stellar candidate (top 5%).

30 Rubric Sample – Eliciting and Monitoring Students' Mathematical Understandings:
Level 1 – Candidate talks throughout the clip(s) and students provide few responses. The candidate stays focused on facts or procedures with no attention to mathematical concepts and representations of content.
Level 2 – Candidate primarily asks surface-level questions and evaluates student responses as correct or incorrect. Candidate makes vague or superficial use of representations to help students understand mathematical concepts.
Level 3 – The candidate elicits student responses related to reasoning/problem solving. Candidate uses representations in ways that help students understand mathematical concepts.
Level 4 – Candidate elicits and builds on students' reasoning/problem solving to explicitly portray, extend, or clarify a mathematical concept. Candidate uses strategically chosen representations in ways that deepen student understanding of mathematical concepts.
Level 5 – All components of Level 4, plus: candidate facilitates interactions among students to evaluate their own ideas.

31 Academic Language: Academic language is different from everyday language, and some students are not exposed to this language outside of school. Much of academic language is discipline-specific. Unless we make academic language explicit for learning, some students will be excluded from classroom discourse and future opportunities that depend on having acquired this language.

32 Academic Language: Academic language is the oral and written language used in school that is necessary for learning content. This includes the "language of the discipline" (vocabulary and forms/functions of language associated with learning outcomes) and the "instructional language" used to engage students in learning content.

33 Academic Language Competencies Measured: understanding students' language development and identifying language demands; supporting language demands (form and function) to deepen content learning; identifying evidence that students understand and use targeted academic language in ways that support their language development and content learning.

34 Development Timeline: 2009-10 – small-scale tryout of tasks and feedback from users. 2010-11 – development of six pilot prototypes based on feedback; piloted in 20 states; user feedback gathered to guide revisions. 2011-12 – national field test of 13 prototypes, producing a technical report with reliability and validity studies and a bias and sensitivity review; national standard setting. 2012-13 – adoption of the validated assessment.

35 Pilot Data Analysis: scores (descriptive stats); scoring process; inter-rater reliability and agreement rates; examinee and faculty feedback; benchmark identification.
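
The inter-rater analyses named above reduce to comparing two scorers' rubric ratings on the same portfolios. Below is a minimal sketch of how agreement rates and one common chance-corrected statistic (Cohen's kappa) might be computed; the scores are invented, and the slide does not say which statistics the pilot actually used.

```python
# Hypothetical illustration: agreement between two raters scoring the same
# portfolios on a 1-5 rubric. Cohen's kappa is shown as one common choice;
# the TPAC pilot's actual statistics are not specified on the slide.
from collections import Counter

rater_a = [3, 4, 2, 5, 3, 3, 4, 2, 1, 4]  # made-up scores
rater_b = [3, 4, 3, 5, 3, 2, 4, 2, 2, 4]

n = len(rater_a)

# Exact agreement rate: fraction of portfolios given identical scores.
exact = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Adjacent agreement: scores within one rubric level of each other.
adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / n

# Cohen's kappa: agreement corrected for chance, based on each rater's
# marginal score distribution.
counts_a, counts_b = Counter(rater_a), Counter(rater_b)
p_chance = sum(counts_a[s] * counts_b[s] for s in set(rater_a) | set(rater_b)) / n**2
kappa = (exact - p_chance) / (1 - p_chance)

print(f"exact agreement:    {exact:.2f}")
print(f"adjacent agreement: {adjacent:.2f}")
print(f"Cohen's kappa:      {kappa:.2f}")
```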

36 Handbook Changes: deep focus on student learning; five-level rubric; clear organization, prompts, and alignment with rubrics; academic language reframing; analyzing teaching; subject-specific glossaries; professional look and interactive features.

37 Ohio Spring Pilot 2011: Three IHEs completed 150 portfolios in six content areas. 91 portfolios were scored by 35 calibrated faculty (university/school) scorers representing nine institutions. Results were returned to the three IHEs from a commonly used server. Feedback was sent to teacher candidates with scored portfolios in late summer.

38 Program/Unit Discussions: results shared with program representatives; discussions about strengths and challenges noted in the data; sharing of next steps for the coming academic year based upon the results.

39 Framing Reliability and Validity Research: current policies in play; evidence needed to support TPA use for accreditation and licensure decision-making; potential role for VAM and other predictive validity measures.

40 Field Test Design – design is driven by overall goals: data to enhance validity evidence; reports describing technical aspects and the set of validity and reliability studies; effectiveness and efficiency of scorer training materials and process; refinements to the assessment; reporting design and distribution. Support systems: portfolio management system and scoring management system. Participation/sampling plan: location (state-based or national population) and discipline-specific.

41 Field Test Analyses – field test data analysis and research areas: content validity (CV meetings held in July 2011; bias review scheduled for November 2011); construct validity (defining the construct of the TPA; factor analysis); consequential validity (candidates and programs); predictive validity (relationship between performance on the TPA and other measures, e.g., TPA scores and state teacher certification test scores).
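
To make the predictive validity study concrete: one simple analysis correlates candidates' TPA scores with scores on another measure, such as a state certification test. The sketch below uses invented numbers and a plain Pearson correlation; the slide does not specify the field test's actual statistical methods.

```python
# Hypothetical illustration of a predictive-validity check: correlating
# TPA total scores with another measure (e.g., a certification test).
# All numbers are invented; the slide does not specify the actual method.
import math

tpa_scores = [38, 42, 29, 51, 45, 33, 47, 40]           # made-up TPA totals
cert_scores = [168, 175, 152, 181, 177, 160, 176, 170]  # made-up test scores

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"r = {pearson_r(tpa_scores, cert_scores):.2f}")
```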

42 Field Test Participation – subject areas to be field tested: Elementary Literacy, Elementary Mathematics, English/Language Arts, History/Social Science, Secondary Mathematics, Science; Special Education, Early Childhood Development, Middle Grades (Science, ELA, Math, and History/Social Science), Art, Performing Arts (Music, Dance, Theater), Physical Education, and World Language. Other low-incidence draft handbooks will be available for trying out.

43 Field Test Participation: Pearson will support scorer training and scoring stipends for a national sample of 18,000 candidates. Scorer training and certification will be online (with some synchronous events). Scorers will include IHE faculty, field supervisors, cooperating teachers, principals, NBCTs, and others with pedagogical content knowledge and experience with beginning teacher development. Local, state, and national scoring.

44 Ohio’s Projections for 2011-2012 72% of Ohio’s IHEs are current TPA participants72% of Ohio’s IHEs are current TPA participants Additional IHEs have pending MOUs being completedAdditional IHEs have pending MOUs being completed Over 2100 portfolios, 13 of the14 content areas, are projected for completionOver 2100 portfolios, 13 of the14 content areas, are projected for completion

45 Timeline of Activities: release of revised handbooks – September 2011. Commitment/registrations to participate in the field test – Summer/Fall 2011. Pearson systems ready for registration, submissions, and scoring – scorer management system ready Spring 2012; candidate registration and TPA submission system ready TBD 2012.

47 Next Steps: join TPAC Online (Ning); field test commitments; technical assistance – AACTE affiliate meetings, ongoing webinars and Ning discussions; PACT/TPAC Implementation Conference – October 20-21 in San Diego; AACTE Annual Meeting – February 17-19, 2012.

48 Ohio’s Next Steps NE Region SW Region Host: Hiram College /University of AkronHost: University of Cincinnati Contact: Jennifer Miller/Lynn KlineContact: Chet Laine Date: October 28 or Nov 4 (TBA)Date: February 24 SE RegionNE Region Host: Franciscan UniversityHost: Bowling Green State University Contact: Mary Kathryn McVeyContact: Mary Murray Date: November 16Date: March (TBA) Central Region Host: Ohio Dominican University Contact: Bonnie Beach Date: January 6

49 Ohio’s Next Steps NE Region SW Region Host: University of AkronHost: University of Cincinnati Contact: Lynn KlineContact: Chet Laine Date: Nov. 9 Date: February 24 SE RegionNE Region Host: Franciscan UniversityHost: Bowling Green State University Contact: Mary Kathryn McVeyContact: Mary Murray Date: November 16Date: March 14 Central Region Host: Ohio Dominican University Contact: Bonnie Beach Date: January 6

50 Other TPAC Presentations: breakouts today – supporting students, engaging faculty, lessons learned from scoring, pilot year insights, academic language; TPAC 101 on Thursday; Thursday keynote – program renewal.

