The Teaching Performance Assessment Consortium (TPAC)
Andrea Whittaker, Ph.D.
Stanford University
September 2011

Agenda and Goals
• Update on the national project
• Ohio policies and timeline

Why Now?
• Blue Ribbon Panel – 10 principles
• PARCC and Smarter Balanced assessments

Where TPAC Fits In
TPAC is working to develop and implement at scale a way of assessing teaching that:
• Provides evidence of teaching effectiveness
• Supports teacher preparation program improvement
• Informs policy makers about qualities of teaching associated with student learning
TPAC is ONE example of an assessment system designed to leverage the alignment of policies and support program renewal.

Accountability Reframed
How can we gather and use evidence of the qualities of teaching performance that inspire, engage, and sustain students as learners – to improve teaching and teacher preparation?

National Leadership
• AACTE – overall project management, communication with programs
• Stanford University – assessment development and technical support
• Council of Chief State School Officers – policy development and support, communication with state education agencies (prior to March 2011)

Highlights of Pearson’s Role in the TPA
• Pearson has been selected as Stanford’s operational partner.
• Support Stanford and AACTE with assessment development and technical review.
• Train and certify scorers, provide a scoring platform, and report results for the operational TPA.

Pearson’s Role in the Field Test
• Development support for field testing
• Handbook and template publication
• Recruitment and training of scorers, scoring, and scorer compensation
• Benchmarking
• Reporting results
• Providing an electronic platform to manage TPA submissions

Pearson’s Role in Operational Use
Pearson will provide assessment services to deliver the TPA nationally and sustainably:
• Web-based services that allow candidate registration, assembly of artifacts, faculty/supervisor feedback, final submission for official TPA scoring, and a score report
• Scoring services, such as the recruitment, training, and certification of all scorers, and scoring of all submitted TPA responses
• Reporting services, such as the generation of all official score reports to candidates and institutions of record

Partnering States

Standards and TPAC
• Common Core alignment
• InTASC alignment
• NCATE/CAEP endorsement
• SPA endorsement

TPAC Lineage
• National Board for Professional Teaching Standards (NBPTS) portfolio assessments – accomplished teachers
• Connecticut BEST assessment system – teachers at the end of induction
• Performance Assessment for California Teachers (PACT) – pre-service teachers

Role of K-12 Partners
• NEA and AACTE affiliate state meetings
• Roles for cooperating teachers and school site principals
• Call for collaboration with IHEs

State Policy Issues
Emerging recognition of state role and responsibility for educator effectiveness – states are revisiting policies and practices.
TPAC is coming at this through:
• Program improvement and accountability
• The psychometric challenge – is the instrument usable?
• Informing policy development at critical levels: program approval (a measure of effectiveness) as well as initial licensure (candidate readiness)

State Policy Issues
States are realizing that valid, reliable, and predictive measures are critical to the success of any change, especially when student performance is the end objective.
Early implementers: Washington, Minnesota, Tennessee, Illinois, Wisconsin, Ohio

House Bill 1
• Transfers responsibility for approving teacher preparation programs from the State Board to the Chancellor of the Board of Regents
• Directs the Chancellor, jointly with the State Superintendent, to: (1) establish metrics and educator preparation programs for the preparation of educators and other school personnel, and (2) provide for inspection of the institutions
• Through HB1, Ohio is first in the nation to require a four-year induction program (Resident Educator)

Ohio Comprehensive System of Educator Accountability (flow diagram)
• Pre-service metrics: content knowledge (Praxis II) and performance assessment (TPA). Candidates judged not effective pursue more coursework or enter a different area of study; those judged effective are recommended for the resident educator license.
• Teacher Residency: formative assessment coupled with goal setting and coaching, plus an annual summative assessment based on multiple measures of educator effectiveness including student growth. Residents judged not effective enter the PAR program (continuing with residency if performance becomes effective, with employment terminated if it does not); those judged effective are recommended for the five-year professional license.
• Annual Teacher Evaluation: formative assessments that inform PD and coaching support, plus an annual summative assessment based on multiple measures of educator effectiveness including student growth; performance outcomes inform decisions on retention, dismissal, tenure, promotion, and compensation. Teachers judged not effective enter the PAR program (continuing as a teacher if performance becomes effective, with employment terminated if it does not).

Ohio Alignment
• The TPA has also been aligned to the Ohio Teacher Standards.
• Karen Herrington is working to align the TPA with the state/national standards alignment instrument that Ohio IHEs compiled.

Ohio’s Lineage
• Praxis III assessment in entry-year teaching
• Focus on Planning, Environment, Teaching for Learning, and Professionalism
• Pathwise training for mentors assisting entry-year teachers and incorporation
• Transition of Praxis III to the Resident Educator Program

TPA Architecture

Design Principles for Educative Assessment
• Discipline-specific and embedded in curriculum
• Student-centered: examines teaching practice in relationship to student learning
• Analytic: provides feedback and support along targeted dimensions
• Integrative: maintains the complexity of teaching
• Affords a complex view of teaching based on multiple measures

TPA Architecture
• A summative assessment of teaching practice
• Collection of artifacts and commentaries
• A “learning segment” of 3–5 days

TPAC Artifacts of Practice
Planning
• Instructional and social context
• Lesson plans
• Handouts, overheads, student work
• Planning Commentary
Instruction
• Video clips
• Instruction Commentary
Assessment
• Analysis of whole-class assessment
• Analysis of learning and feedback to two students
• Instructional next steps
• Assessment Commentary
Across all tasks
• Daily Reflection Notes
• Analysis of Teaching Effectiveness Commentary
• Evidence of Academic Language Development

Conceptual Framework of Assessment
• What? – candidate describes plans or provides descriptions or evidence of what the candidate or students did
• So what? – rationale for plans in terms of knowledge of students and research/theory; explanation of what happened in terms of student learning or how teaching affected student learning
• Now what? – what the candidate would do differently given the chance, next instructional steps based on assessment, feedback to students

Multiple Measures Assessment System
• Embedded signature assessments
• Observation/supervisory evaluation and feedback
• Child case studies
• Analyses of student learning
• Curriculum/teaching analyses
TPAC Capstone Assessment – integration of planning, instruction, assessment, and analysis of teaching, with attention to academic language

Targeted Competencies
PLANNING
• Planning for content understandings
• Using knowledge of students to inform teaching
• Planning assessments to monitor and support student learning
INSTRUCTION
• Engaging students in learning
• Deepening student learning during instruction
ASSESSMENT
• Analyzing student work
• Using feedback to guide further learning
• Using assessment to inform instruction
REFLECTION
• Analyzing teaching effectiveness
ACADEMIC LANGUAGE
• Identifying language demands
• Supporting students’ academic language development
• Evidence of language use

Rubric Progression
• Early novice  highly accomplished beginner
• Rubrics are additive and analytic
• Candidates demonstrate:
  • Expanding repertoire of skills and strategies
  • Deepening of rationale and reflection
• Teacher focus  student focus
• Whole class  generic groups  individuals

Rubric Blueprint
Each rubric names its task, a rubric title, and a guiding question, followed by five score levels:
• Level 1 – Struggling candidate, not ready to teach
• Level 2 – Some skill, but needs more practice to be teacher-of-record
• Level 3 – Acceptable level to begin teaching
• Level 4 – Solid foundation of knowledge and skills
• Level 5 – Stellar candidate (top 5%)
(A minimal lookup-table sketch of this blueprint follows below.)
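
For programs building local data tools around this blueprint, the five levels can be captured as a small lookup structure. The sketch below is purely illustrative: the data structure, the function name, and the example usage are assumptions for this note, not part of the official TPA design.

```python
# Illustrative sketch only: the five-level rubric blueprint from the slide
# above, captured as a simple lookup table. Names are assumptions for this
# example, not official TPA definitions.

RUBRIC_BLUEPRINT = {
    1: "Struggling candidate, not ready to teach",
    2: "Some skill but needs more practice to be teacher-of-record",
    3: "Acceptable level to begin teaching",
    4: "Solid foundation of knowledge and skills",
    5: "Stellar candidate (top 5%)",
}


def describe_level(level: int) -> str:
    """Return the blueprint descriptor for a rubric score of 1-5."""
    if level not in RUBRIC_BLUEPRINT:
        raise ValueError(f"Rubric levels run 1-5; got {level}")
    return RUBRIC_BLUEPRINT[level]


print(describe_level(3))  # Acceptable level to begin teaching
```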

Rubric Sample: Eliciting and Monitoring Students’ Mathematical Understandings
• Level 1 – Candidate talks throughout the clip(s) and students provide few responses. The candidate stays focused on facts or procedures with no attention to mathematical concepts and representations of content.
• Level 2 – Candidate primarily asks surface-level questions and evaluates student responses as correct or incorrect. Candidate makes vague or superficial use of representations to help students understand mathematical concepts.
• Level 3 – The candidate elicits student responses related to reasoning/problem solving. Candidate uses representations in ways that help students understand mathematical concepts.
• Level 4 – Candidate elicits and builds on students’ reasoning/problem solving to explicitly portray, extend, or clarify a mathematical concept. Candidate uses strategically chosen representations in ways that deepen student understanding of mathematical concepts.
• Level 5 – All components of Level 4, plus: Candidate facilitates interactions among students to evaluate their own ideas.

Academic Language
• Academic language is different from everyday language. Some students are not exposed to this language outside of school.
• Much of academic language is discipline-specific.
• Unless we make academic language explicit for learning, some students will be excluded from classroom discourse and future opportunities that depend on having acquired this language.

Academic Language
Academic language is the oral and written language used in school that is necessary for learning content. This includes the “language of the discipline” (vocabulary and forms/functions of language associated with learning outcomes) and the “instructional language” used to engage students in learning content.

Academic Language Competencies Measured
• Understanding students’ language development and identifying language demands
• Supporting language demands (form and function) to deepen content learning
• Identifying evidence that students understand and use targeted academic language in ways that support their language development and content learning

Development Timeline
• Small-scale tryout of tasks and feedback from users
• Development of six pilot prototypes based on feedback; piloted in 20 states, with user feedback gathered to guide revisions
• National field test of 13 prototypes, producing a technical report with reliability and validity studies and a bias and sensitivity review
• National standard setting
• Adoption of validated assessment

Pilot Data Analysis
• Scores (descriptive statistics)
• Scoring process
• Inter-rater reliability and agreement rates (see the sketch below)
• Examinee and faculty feedback
• Benchmark identification
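
To make the inter-rater reliability item concrete, the sketch below shows one common way to summarize double-scored portfolios: exact agreement, adjacent agreement, and a weighted Cohen's kappa. The scores are invented for illustration; this is not the pilot's actual data or analysis code.

```python
# Hypothetical double-scoring data for one rubric (scores 1-5 from two
# independent raters); not actual pilot results.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 2, 4, 3, 3, 2, 5, 3, 4, 2]
rater_b = [3, 3, 4, 3, 2, 2, 5, 4, 4, 2]

n = len(rater_a)
exact = sum(a == b for a, b in zip(rater_a, rater_b)) / n              # identical scores
adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / n  # within one level
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")       # chance-corrected

print(f"Exact agreement:    {exact:.0%}")
print(f"Adjacent agreement: {adjacent:.0%}")
print(f"Weighted kappa:     {kappa:.2f}")
```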

Handbook Changes
• Deep focus on student learning
• Five-level rubric
• Clear organization, prompts, and alignment with rubrics
• Academic language reframing
• Analyzing teaching
• Subject-specific glossaries
• Professional look and interactive features

Ohio Spring Pilot 2011
• Three IHEs completed 150 portfolios in six content areas
• 91 were scored by 35 calibrated faculty (university/school) scorers representing 9 institutions
• Results were returned to the three IHEs from a commonly used server
• Feedback was sent to teacher candidates with scored portfolios in late summer

Program/Unit Discussions
• Results shared with program representatives
• Discussions about strengths and challenges noted from the data results
• Sharing of next steps, based upon the results, for the coming academic year

Framing Reliability and Validity Research
• Current policies in play
• Evidence needed to support TPA use for accreditation and licensure decision-making
• Potential role for VAM and other predictive validity measures

Field Test Design
Design is driven by overall goals:
• Data to enhance validity evidence
• Reports to describe technical aspects and the set of validity and reliability studies
• Effectiveness and efficiency of scorer training materials and process
• Refinements to the assessment
• Reporting design and distribution
• Support systems: portfolio management system and scoring management system
• Participation/sampling plan – location (state-based or national population) and discipline-specific (see the sampling sketch below)
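
As a generic illustration of a state- and discipline-stratified participation plan, the sketch below draws up to k candidates per (state, subject area) cell. The candidate records, cell size, and field names are invented for this example and do not reflect the actual field-test sampling design.

```python
# Illustrative sketch of a stratified sampling plan: draw up to k candidates
# per (state, subject area) cell. All records and parameters are hypothetical.
import random
from collections import defaultdict

candidates = [
    {"id": 1, "state": "OH", "subject": "Elementary Literacy"},
    {"id": 2, "state": "OH", "subject": "Secondary Mathematics"},
    {"id": 3, "state": "WA", "subject": "Elementary Literacy"},
    {"id": 4, "state": "OH", "subject": "Elementary Literacy"},
    {"id": 5, "state": "MN", "subject": "Science"},
]

# Group candidates into strata keyed by (state, subject).
cells = defaultdict(list)
for c in candidates:
    cells[(c["state"], c["subject"])].append(c)

random.seed(0)
k = 2  # target sample size per cell (assumed for illustration)
sample = [c for group in cells.values() for c in random.sample(group, min(k, len(group)))]
print(len(sample), "candidates sampled across", len(cells), "state/subject cells")
```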

Field Test Analyses
Field test data analysis and research areas:
• Content validity – content validity meetings held in July 2011; bias review scheduled for November 2011
• Construct validity – defining the construct of the TPA; factor analysis
• Consequential validity – candidates and programs
• Predictive validity – relationship between performance on the TPA and other measures (e.g., TPA scores and state teacher certification test scores); see the sketch below
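
To make the predictive-validity item concrete: the relationship between TPA scores and another measure is commonly summarized with a correlation coefficient. The data and variable names below are invented for the example; they do not describe the actual field-test analysis plan.

```python
# Hypothetical scores: total TPA score and a state certification test score
# for the same ten candidates (invented values, for illustration only).
from scipy.stats import pearsonr

tpa_total = [38, 42, 35, 47, 40, 33, 45, 39, 44, 36]
cert_test = [168, 175, 160, 182, 171, 158, 179, 166, 180, 163]

r, p_value = pearsonr(tpa_total, cert_test)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
```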

Field Test Participation
Subject areas to be field tested:
• Elementary Literacy, Elementary Mathematics, English/Language Arts, History/Social Science, Secondary Mathematics, Science
• Special Education, Early Childhood Development, Middle Grades (Science, ELA, Math, and History/Social Science), Art, Performing Arts (Music, Dance, Theater), Physical Education, and World Language
• Other low-incidence draft handbooks will be available for trying out

Field Test Participation
• Pearson will support scorer training and scoring stipends for a national sample of 18,000 candidates
• Scorer training and certification online (some synchronous events)
• Scorers to include IHE faculty, field supervisors, cooperating teachers, principals, NBCTs, and others with pedagogical content knowledge and experience with beginning teacher development
• Local, state, and national scoring

Ohio’s Projections
• 72% of Ohio’s IHEs are current TPA participants
• Additional IHEs have pending MOUs being completed
• Over 2,100 portfolios, across 13 of the 14 content areas, are projected for completion

Timeline of Activities
• Release of revised handbooks – September 2011
• Commitment/registrations to participate in the field test – Summer/Fall 2011
• Pearson systems ready for registration, submissions, and scoring:
  • Spring 2012 – scorer management system ready
  • TBD 2012 – candidate registration and TPA submission system ready

Next Steps
• Join TPAC Online (Ning)
• Field test commitments
• Technical assistance:
  • AACTE affiliate meetings
  • Ongoing webinars and Ning discussions
• PACT/TPAC Implementation Conference – October, in San Diego
• AACTE Annual Meeting – February 17–19, 2012

Ohio’s Next Steps
• NE Region – Host: Hiram College/University of Akron; Contact: Jennifer Miller/Lynn Kline; Date: October 28 or Nov 4 (TBA)
• SW Region – Host: University of Cincinnati; Contact: Chet Laine; Date: February 24
• SE Region – Host: Franciscan University; Contact: Mary Kathryn McVey; Date: November 16
• NE Region – Host: Bowling Green State University; Contact: Mary Murray; Date: March (TBA)
• Central Region – Host: Ohio Dominican University; Contact: Bonnie Beach; Date: January 6

Ohio’s Next Steps
• NE Region – Host: University of Akron; Contact: Lynn Kline; Date: Nov. 9
• SW Region – Host: University of Cincinnati; Contact: Chet Laine; Date: February 24
• SE Region – Host: Franciscan University; Contact: Mary Kathryn McVey; Date: November 16
• NE Region – Host: Bowling Green State University; Contact: Mary Murray; Date: March 14
• Central Region – Host: Ohio Dominican University; Contact: Bonnie Beach; Date: January 6

Other TPAC Presentations
• Breakouts today:
  • Supporting students
  • Engaging faculty
  • Lessons learned from scoring
  • Pilot year insights
  • Academic language
• TPAC 101 on Thursday
• Thursday keynote – program renewal