New York State Physical Setting: Physics. A Review of the June 2006 Exam. NYS AAPT Fall Meeting, Binghamton, NY. J. Zawicki, SUNY Buffalo State College; T. Johnson.


Presentation transcript:

New York State Physical Setting: Physics
A Review of the June 2006 Exam
NYS AAPT Fall Meeting, Binghamton, NY
J. Zawicki, SUNY Buffalo State College
T. Johnson, Erie 1 BOCES, Data Warehouse
Mike DuPré, Biology Mentor Network

Slide 2: Assessment Purposes
- Teachers: measure knowledge; measure gain in knowledge; sorting (grading)
- Students/Parents: measure preparation (predict success)
- School District/State Education Department: degree requirements (benchmarks)
- Others…

Slide 3: [Concept map] Curriculum and Standards (frameworks, syllabi, guides, blueprints, benchmarks), the Instructional Program (instructional styles, print materials, equipment, facilities, technology, community), and the Assessment/Evaluation System (objective tests, performance assessments, portfolios, teacher observations, group activities, program evaluations), linked by alignment, validity, and correlation.

Slide 4: Types of Analysis
- Traditional: difficulty, discrimination, response pattern
- Rasch analysis: item difficulty equated to student ability; standard-setting benchmarks are essential
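The Rasch analysis named on this slide models each response as a function of the gap between a student's ability and an item's difficulty on a common logit scale. A minimal sketch of that idea only; the function and parameter names are illustrative and are not part of the exam analysis itself:

```python
import math

def rasch_probability(ability, difficulty):
    """Rasch (one-parameter logistic) model: probability that a student with the
    given ability answers an item of the given difficulty correctly.
    Ability and difficulty sit on the same logit scale, which is what allows
    item difficulty to be equated to student ability."""
    return 1 / (1 + math.exp(-(ability - difficulty)))

# A student whose ability equals the item's difficulty has a 50% chance of success.
print(rasch_probability(0.5, 0.5))   # 0.5
print(rasch_probability(1.5, 0.5))   # about 0.73
```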

Slide 5: Types of Analysis (continued)
- Cognitive level (Bloom's taxonomy): knowing, using, integrating
- Alignment of curriculum and assessment (Andrew Porter)
- Item format
[Graphic: revised Bloom's taxonomy levels, from Remembering and Understanding up through Applying, Analyzing, Evaluating, and Creating]
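Andrew Porter's alignment work, cited on this slide, is usually summarized with an alignment index computed from two content-by-cognitive-level matrices, one describing the curriculum or instruction and one describing the assessment. A hedged sketch of that calculation; the matrices below are invented for illustration and do not come from the June 2006 exam:

```python
import numpy as np

def porter_alignment(curriculum, assessment):
    """Porter alignment index between two content-by-cognitive-demand matrices.
    Each matrix is converted to cell proportions; the index is
    1 - (sum of absolute cell differences) / 2, from 0 (no overlap) to 1 (identical)."""
    x = np.asarray(curriculum, dtype=float)
    y = np.asarray(assessment, dtype=float)
    x /= x.sum()
    y /= y.sum()
    return 1 - np.abs(x - y).sum() / 2

# Illustrative counts: rows = content strands, columns = knowing / using / integrating
taught = [[4, 3, 1], [2, 2, 1], [3, 2, 2]]
tested = [[5, 2, 0], [3, 1, 0], [4, 1, 1]]
print(f"Alignment index: {porter_alignment(taught, tested):.2f}")
```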

Slide 6: Types of Analysis (continued): Teacher Review (Biology Mentor Network)
- Difficulties analyzed in the context of: student issues, testing issues, instructional issues
- Use of formative techniques to support conjectures

Slide 7: Concepts
- Difficulty: the percentage or proportion of students who are successful on an item
- Discrimination: how well an item differentiates between students who understand the subject and those who do not
- Validity: does an item measure student understanding of the intended concept?
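Both indices are straightforward to compute from scored response data. A minimal sketch, assuming items are scored 0/1 and using a point-biserial correlation with the total score as the discrimination measure (other statistics, such as upper-minus-lower group differences, are also common); the data array is invented for illustration:

```python
import numpy as np

def item_difficulty(item_scores):
    """Difficulty (p-value): proportion of students who answered the item correctly."""
    return float(np.mean(item_scores))

def item_discrimination(item_scores, total_scores):
    """Discrimination: point-biserial correlation between the item score and the
    total test score; higher values mean the item better separates students who
    understand the material from those who do not."""
    return float(np.corrcoef(item_scores, total_scores)[0, 1])

# Invented data: rows = students, columns = items, 1 = correct, 0 = incorrect
responses = np.array([
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 1],
])
totals = responses.sum(axis=1)
for i in range(responses.shape[1]):
    print(f"Item {i + 1}: difficulty = {item_difficulty(responses[:, i]):.2f}, "
          f"discrimination = {item_discrimination(responses[:, i], totals):.2f}")
```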

Slide 8: Concepts (continued)
- Reliability: can the results be replicated?
- Inter-rater
- Test/retest
- Internal consistency
- Criterion-referenced tests
- Latency
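For dichotomously scored items, internal consistency is often estimated with KR-20 (Cronbach's alpha reduces to it for 0/1 scoring). A minimal sketch, using the same kind of student-by-item score matrix as above; the scores are invented:

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson formula 20: internal-consistency reliability estimate for
    0/1-scored items. responses: 2-D array, rows = students, columns = items."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                         # number of items
    p = responses.mean(axis=0)                     # proportion correct per item
    q = 1 - p
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of students' total scores
    return (k / (k - 1)) * (1 - np.sum(p * q) / total_var)

# Invented scores: six students, three items
scores = np.array([[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 1], [0, 0, 0], [1, 1, 1]])
print(f"KR-20 = {kr20(scores):.2f}")
```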

Slide 9: Student Difficulty?
- Content knowledge?
- Literacy/reading comprehension?
- Question-interpretation skills?
- Misconception? (from previous instruction? from cultural contexts? insufficient reinforcement?)
- Effort?

Slide 10: Test Difficulty?
- Difficulty (facility) level?
- Discrimination?
- Placement on the exam?
- Visual distraction by nearby (graphic) items?
- Style of question?
- Flawed item?

Slide 11: Instructional Difficulty?
- You didn't teach the associated core major understandings.
- You didn't reinforce the core understandings enough.
- You taught the core content incorrectly.

Slide 12: Test Data: Discussion and Analysis
- Collecting data
- Analysis: difficulty, response pattern
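The response-pattern tables on the following slides report, for each item, its difficulty and the percentage of students choosing each option or leaving it blank. A small sketch of how such a distractor table can be built from raw answer data; the answer list and option labels are invented for illustration:

```python
from collections import Counter

def response_pattern(choices, options=("1", "2", "3", "4"), key=None):
    """Percentage of students selecting each option, plus NR (no response).
    choices: raw answers, e.g. ["1", "3", None, ...]; key: the credited option,
    in which case the item's difficulty is the percentage choosing it."""
    n = len(choices)
    counts = Counter(c if c in options else "NR" for c in choices)
    row = {opt: 100 * counts[opt] / n for opt in (*options, "NR")}
    if key is not None:
        row["difficulty"] = row[key]
    return row

# Invented example: ten students answering one multiple-choice item, credited answer "3"
answers = ["3", "3", "1", "3", "2", "3", None, "3", "4", "3"]
print(response_pattern(answers, key="3"))
```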

Slide 13: Interpreting Data

Slides 14-18: Multiple Choice, Easier Items
[Table on each slide: item number, difficulty, and the distribution of responses across choices 1-4 and NR (no response); data not shown.]

Slides 19-23: Multiple Choice, More Difficult Items
[Table on each slide: item number, difficulty, and the distribution of responses across choices 1-4 and NR (no response); data not shown.]

Slides 24-28: Constructed Response, Easier Items
[Table on each slide: item number and difficulty; data not shown.]

Slides 29-33: Constructed Response, More Difficult Items
[Table on each slide: item number and difficulty; data not shown.]

Slide 34: In Conclusion
- Summary of findings: conceptually challenging items; "inscription"; calculations and showing work…
- Future directions: revisiting standard setting… How well are the current standards working?
- Next steps… considerations within our classrooms

Slide 35: Resources from this presentation…
- Tmtgs/NYSS/Fall06
- Office Phone: (716)