New York State Physical Setting: Physics. A Review of the June 2006 Exam. NYS AAPT Fall Meeting, Binghamton, NY.


Slide 1: New York State Physical Setting: Physics. A Review of the June 2006 Exam. NYS AAPT Fall Meeting, Binghamton, NY. J. Zawicki, SUNY Buffalo State College; T. Johnson, Erie 1 BOCES, Data Warehouse; Mike DuPré, Biology Mentor Network.

Slide 2: Assessment Purposes
- Teachers: measure knowledge; measure gain in knowledge; sorting (grading)
- Students/Parents: measure preparation (predict success)
- School District/State Education Department: degree requirements (benchmarks)
- Others…

Slide 3: Curriculum, Standards, and Assessment (alignment diagram)
- Curriculum: frameworks, syllabi, guides, blueprints, benchmarks
- Assessment/Evaluation System: objective tests, performance assessments, portfolios, teacher observations, group activities, program evaluations
- Instructional Program: instructional styles, print materials, equipment, facilities, technology, community
- Linked to Standards through alignment, validity, and correlation

Slide 4: Types of Analysis
- Traditional: difficulty, discrimination, response pattern
- Rasch analysis: item difficulty equated to student ability; standard-setting benchmarks essential (see the sketch below)
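For readers less familiar with the Rasch approach named above, here is a minimal sketch of the dichotomous (one-parameter) Rasch model in Python; the ability and difficulty values are purely illustrative, not estimates from this exam.

import math

def rasch_probability(theta, b):
    """Dichotomous Rasch (1PL) model: probability that a student of
    ability theta answers an item of difficulty b correctly.
    Ability and difficulty share one logit scale, which is what lets
    item difficulty be equated to student ability."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustrative values only (not estimated from the June 2006 data):
print(rasch_probability(0.0, -2.0))  # ~0.88: easy item, average student
print(rasch_probability(0.0, 2.0))   # ~0.12: hard item, average student

Because students and items sit on the same scale, a standard-setting benchmark can be stated directly as a cut point on that scale, which is why well-chosen benchmarks are essential to the method.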

Slide 5: Types of Analysis (Continued)
- Cognitive level (Bloom's taxonomy): knowing, using, integrating
- Alignment: curriculum and assessment (Andrew Porter)
- Item format
(Taxonomy levels shown: Creating, Evaluating, Analyzing, Applying, Understanding, Remembering)

Slide 6: Types of Analysis (Continued)
Teacher Review (Biology Mentor Network)
- Difficulties analyzed in the context of: student issues, testing issues, instructional issues
- Use of formative techniques to support conjectures

Slide 7: Concepts
- Difficulty: the percentage or proportion of students who answer an item successfully
- Discrimination: how well an item differentiates between students who understand the subject and those who do not
- Validity: whether an item measures student understanding of the intended concept
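To make the first two concepts concrete, the Python sketch below computes an item's difficulty (p-value) and a point-biserial discrimination index from 0/1 item scores and total test scores; the data and function names are hypothetical, and other discrimination indices (such as upper-lower group comparisons) are equally common.

from statistics import mean, pstdev

def item_difficulty(item_scores):
    """Difficulty (p-value): proportion of students answering the item
    correctly, from a list of 0/1 item scores."""
    return mean(item_scores)

def item_discrimination(item_scores, total_scores):
    """Point-biserial discrimination: correlation between the 0/1 item
    score and each student's total test score. Values near zero (or
    negative) flag items that fail to separate stronger students from
    weaker ones."""
    mi, mt = mean(item_scores), mean(total_scores)
    cov = mean((i - mi) * (t - mt) for i, t in zip(item_scores, total_scores))
    return cov / (pstdev(item_scores) * pstdev(total_scores))

# Hypothetical five-student example (not exam data):
item = [1, 1, 0, 1, 0]
total = [78, 85, 52, 90, 61]
print(item_difficulty(item))                        # 0.6
print(round(item_discrimination(item, total), 2))   # high: item tracks total scores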

Slide 8: Concepts (Continued)
- Reliability: can the results be replicated?
  - Inter-rater
  - Test/re-test
  - Internal consistency
- Criterion-referenced tests
- Latency
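Internal consistency, in particular, is often reported as a coefficient such as KR-20 for dichotomously scored items; the sketch below is a generic illustration of that formula rather than the specific reliability analysis behind this exam.

def kr20(score_matrix):
    """Kuder-Richardson 20: an internal-consistency estimate for a test of
    dichotomously scored (0/1) items. score_matrix holds one row per
    student with one 0/1 entry per item."""
    n_students = len(score_matrix)
    n_items = len(score_matrix[0])
    totals = [sum(row) for row in score_matrix]
    mean_total = sum(totals) / n_students
    var_total = sum((t - mean_total) ** 2 for t in totals) / n_students
    # Sum of item variances p(1 - p) across items.
    pq_sum = 0.0
    for j in range(n_items):
        p = sum(row[j] for row in score_matrix) / n_students
        pq_sum += p * (1 - p)
    return (n_items / (n_items - 1)) * (1 - pq_sum / var_total)

# Tiny hypothetical response matrix: four students by three items.
print(round(kr20([[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]]), 2))  # 0.75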

Slide 9: Student Difficulty?
- Content knowledge?
- Literacy/reading comprehension?
- Question interpretation skills?
- Misconception?
  - From previous instruction?
  - From cultural contexts?
  - Insufficient reinforcement?
- Effort?

Slide 10: Test Difficulty?
- Difficulty (facility) level?
- Discrimination?
- Placement on exam?
- Visual distraction by nearby (graphic) items?
- Style of question?
- Flawed item?

Slide 11: Instructional Difficulty?
- You didn't teach the associated core major understandings.
- You didn't reinforce the core understandings enough.
- You taught the core content incorrectly.

Slide 12: Test Data – Discussion and Analysis
- Collecting data
- Analysis: difficulty, response pattern
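The tables on the following slides report, for each item, how many students chose each option (1-4), how many left it blank (NR), and the resulting difficulty. A minimal sketch of that tabulation is below; the input format and function name are assumptions made for illustration, not the presenters' actual tooling.

from collections import Counter

def response_pattern(answers, key, choices=("1", "2", "3", "4")):
    """Tally one multiple-choice item. answers holds each student's
    recorded choice ('' or None for no response); key is the credited
    choice. Returns the counts per choice (plus 'NR') and the difficulty,
    i.e. the proportion of all students choosing the key."""
    counts = Counter("NR" if a in ("", None) else a for a in answers)
    pattern = {c: counts.get(c, 0) for c in choices}
    pattern["NR"] = counts.get("NR", 0)
    return pattern, pattern[key] / len(answers)

# Hypothetical answer strings for one item (not the actual exam file):
answers = ["4", "4", "2", "4", "", "3", "4", "1", "4", "4"]
pattern, difficulty = response_pattern(answers, key="4")
print(pattern, round(difficulty, 2))  # choice counts plus NR; difficulty 0.6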

Slide 13: Interpreting Data

Slide 14: Multiple Choice, Easier
Item 27, difficulty 0.95. Responses: (1) 61, (2) 19, (3) 22, (4) 1903, NR 3.
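Reading a row: the difficulty is the proportion of papers credited with the keyed response, so a difficulty of 0.95 for item 27 corresponds to about 1903 correct out of roughly 2008 papers (1903 / 2008 ≈ 0.95).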

Slide 15: Multiple Choice, Easier
Item 23, difficulty 0.90. Responses: (1) 43, (2) 67, (3) 93, (4) 1802, NR 3.

Slide 16: Multiple Choice, Easier
Item 46, difficulty 0.89. Responses (1/2/3/4/NR): 4317939724 (digits run together in the source).

Slide 17: Multiple Choice, Easier
Item 18, difficulty 0.88. Responses: (1) 96, (2) 1764, (3) 78, (4) 63, NR 7.

Slide 18: Multiple Choice, Easier
Item 16, difficulty 0.86. Responses: (1) 62, (2) 132, (3) 1733, (4) 75, NR 6.

Slide 19: Multiple Choice, More Difficult
Item 44, difficulty 0.32. Responses: (1) 649, (2) 151, (3) 902, (4) 302, NR 4.

Slide 20: Multiple Choice, More Difficult
Item 25, difficulty 0.32. Responses: (1) 56, (2) 645, (3) 1045, (4) 258, NR 4.

Slide 21: Multiple Choice, More Difficult
Item 50, difficulty 0.39. Responses: (1) 775, (2) 492, (3) 220, (4) 512, NR 9.

Slide 22: Multiple Choice, More Difficult
Item 51, difficulty 0.42. Responses: (1) 486, (2) 267, (3) 394, (4) 851, NR 10.

Slide 23: Multiple Choice, More Difficult
Item 40, difficulty 0.47. Responses: (1) 940, (2) 722, (3) 164, (4) 178, NR 4.

Slide 24: Constructed Response, Easier
Item 64, difficulty 0.92. Score distribution: 0 credits: 152, 1 credit: 1854, 2 credits: 0.

Slide 25: Constructed Response, Easier
Item 53, difficulty 0.90. Score distribution: 0 credits: 208, 1 credit: 1798, 2 credits: 0.

Slide 26: Constructed Response, Easier
Item 61, difficulty 0.87. Score distribution: 0 credits: 144, 1 credit: 249, 2 credits: 1613.
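For rows with students earning 2 credits, the reported difficulty appears to be the mean score expressed as a fraction of maximum credit; for item 61, that is (249 × 1 + 1613 × 2) / (2 × 2006) ≈ 0.87.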

Slide 27: Constructed Response, Easier
Item 59, difficulty 0.86. Score distribution: 0 credits: 274, 1 credit: 1732, 2 credits: 0.

Slide 28: Constructed Response, Easier
Item 65, difficulty 0.86. Score distribution: 0 credits: 154, 1 credit: 250, 2 credits: 1602.

Slide 29: Constructed Response, More Difficult
Item 58, difficulty 0.37. Score distribution: 0 credits: 995, 1 credit: 524, 2 credits: 487.

Slide 30: Constructed Response, More Difficult
Item 72, difficulty 0.42. Score distribution: 0 credits: 964, 1 credit: 388, 2 credits: 654.

Slide 31: Constructed Response, More Difficult
Item 62, difficulty 0.50. Score distribution: 0 credits: 749, 1 credit: 506, 2 credits: 751.

Slide 32: Constructed Response, More Difficult
Item 63, difficulty 0.51. Score distribution: 0 credits: 980, 1 credit: 1026, 2 credits: 0.

Slide 33: Constructed Response, More Difficult
Item 68, difficulty 0.65. Score distribution: 0 credits: 410, 1 credit: 576, 2 credits: 1020.

Slide 34: In Conclusion
- Summary of findings
  - Conceptually challenging items
  - "Inscription"
  - Calculations, showing work…
- Future directions
  - Revisiting standard setting… How well are the current standards working?
  - Next steps…
- Considerations within our classrooms

Slide 35: Resources from this presentation…
- http://physicsed.buffalostate.edu/pubs/AAPTmtgs/NYSS/Fall06
- Email: zawickjl@buffalostate.edu
- Office phone: (716) 878-3800

