Diagnoser.com: Online learning & assessment tools for Physics

Presentation transcript:

Diagnoser.com: Online learning & assessment tools for Physics
Jim Minstrell, Facet Innovations, Seattle, WA

Agenda
- Introductions
- Complete a brief online student assessment
- Overview of Diagnoser.com
- Register and explore Diagnoser resources
- Q & A with teacher users
- Research results from Diagnoser.com use
In short: experience the Diagnoser online assessment, tour the Diagnoser Tools, explore Diagnoser yourself, and hear and see results of using Diagnoser.

Diagnostic assessment experience
- In your workshop packet, find your student User Name (a number, including dashes) and Password (a four-digit number).
- Go to Diagnoser.com and log in as a student.
- Please complete your assignment; the class data will become part of a demo later.
Forces as Interactions: Suppose you have been investigating the phenomena and ideas of this unit for a few days. While learning in the unit is still underway, I as the teacher (and perhaps you as a student) want to know whether you are understanding it.
Comments: Christina and others from the APEX Project will help you get onto the site if needed. I suggest you use a device with a larger screen than a cell phone. You will not be graded on this, but I need your help to populate the class data, which we will look at later in the workshop. As you complete your Question Set, note features of the items and save your questions about the QSets for later. Thank you for your participation.

What is Diagnoser?
- 67 science concept learning units
- 1,600 high-quality diagnostic assessment items
- All resources are organized around a common interpretive framework of 67 "Facet Clusters"
- Actionable assessment information:
  - all item responses are meaningful, tied to individual "Facets" of students' conceptual understanding
  - automated analysis of student responses
  - descriptive feedback
  - next lessons to complete, based on the Facet "diagnosis"
Comments: The 67 units are mostly physics/physical science. The 1,600 items come in 3 formats (multiple choice, numerical, open response). Facets of thinking are standards-based and derived from misconceptions research (more on the next slide). The information is actionable: responses are associated with Facets of thinking, analysis is automated, students receive descriptive feedback, and next lessons are based on the diagnoses.
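To make the "actionable assessment" idea concrete, here is a minimal Python sketch of response-to-Facet coding and automated diagnosis. The item IDs, answer choices, facet codes, and facet descriptions are placeholders, not actual Diagnoser content, and the code is an illustration rather than the system's real analysis.

```python
from collections import Counter

# Hypothetical mapping from (item_id, chosen_response) to a facet code.
# Real Diagnoser items and facet codes differ; these are placeholders.
RESPONSE_TO_FACET = {
    ("FAI_Q1", "A"): "01",  # goal facet (placeholder description)
    ("FAI_Q1", "B"): "63",  # problematic facet (placeholder)
    ("FAI_Q1", "C"): "90",  # problematic prerequisite idea (placeholder)
    ("FAI_Q2", "A"): "61",
    ("FAI_Q2", "B"): "01",
}

def diagnose(responses):
    """Return facet frequencies and up to two most common problematic facets.

    `responses` is a list of (item_id, chosen_response) pairs for one student.
    Codes 01-09 are treated as goal facets and 20+ as problematic, a
    simplification of the facet-cluster convention described above.
    """
    facets = [RESPONSE_TO_FACET[r] for r in responses if r in RESPONSE_TO_FACET]
    counts = Counter(facets)
    problematic = Counter({f: n for f, n in counts.items() if int(f) >= 20})
    return counts, problematic.most_common(2)

counts, top_problems = diagnose([("FAI_Q1", "B"), ("FAI_Q2", "A")])
print(counts)        # Counter({'63': 1, '61': 1})
print(top_problems)  # up to two problematic ideas to work on
```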

Diagnoser Tools
Tools in Diagnoser parallel the foci in typical planning of a unit, BUT unit size is based mainly on ONE KEY IDEA (see poster). All tools are based on the Facet Cluster (FC); its ideas are embedded in every assessment and instruction activity.
The Tools:
- Learning Goals, based on standards documents (NSES and AAAS, moving toward NGSS and AL COS)
- Elicitation Questions (EQs): initial, engaging questions and discussion that open up the science content and practice issues of the short unit; the teacher and students hear other students' thinking, which motivates learning
- Developmental Lessons: give students the opportunity to test their ideas, using problematic as well as goal ideas/Facets
- QSets: you just completed one within the Forces as Interactions unit. The teacher assigns the online formative assessment to students; the response data goes into the Teacher Report; and students get feedback on the questions and in a summary (up to 2 problematic ideas to work on)
- Prescriptive Activities: lessons to address specific problematic Facets
- Other QSets for more formative assessment
Comments: Tools in Diagnoser parallel the foci in typical planning of a unit. All tools are based on the Facet Cluster: Learning Goals, EQs, Developmental Lessons, QSets (online, for students), Prescriptive Activities, and QSets for more formative assessment.
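As a rough sketch of how these parallel tools can hang off a single Facet Cluster, the Python fragment below models a unit as a data structure. The field names and example values are illustrative assumptions, not the actual Diagnoser data model.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Unit:
    """One Diagnoser-style unit: every tool is tied to the same Facet Cluster.

    Field names are illustrative only, not the real Diagnoser schema.
    """
    key_idea: str
    facet_cluster: Dict[str, str]                        # facet code -> description
    learning_goals: List[str] = field(default_factory=list)
    elicitation_questions: List[str] = field(default_factory=list)
    developmental_lessons: List[str] = field(default_factory=list)
    question_sets: List[str] = field(default_factory=list)
    prescriptive_activities: Dict[str, str] = field(default_factory=dict)  # facet -> lesson

# Placeholder content, not the real "Forces as Interactions" cluster.
fai = Unit(
    key_idea="Forces as Interactions",
    facet_cluster={"01": "goal sub-proposition", "60": "problematic idea",
                   "90": "prerequisite difficulty"},
)
```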

Brief Background
- Developed over the last 30+ years, based on research on how students learn (Diagnoser.com launched in 2004)
- Development and research funded by the National Science Foundation and the Department of Education
- Resources vetted in multiple classrooms
- More than 9,000 teacher users (100,000 students) from more than 30 states and more than 15 countries
- More than 6 million student responses to diagnostic items
- Used by teachers, researchers, professional developers, and teacher educators

Facet Clusters: the "backbone" of Diagnoser assessment and instruction
- Facets of students' thinking form the backbone of the assessment and instruction.
- "Forces as Interactions" is one example of the 67 Facet Clusters for interpreting students' responses: what learners say and do.
- If students don't have the goal ideas and practices, what are the ideas and practices they do use?
- The research-based misconceptions (problematic ideas), the part below the line, are roughly ranked from the 90s, very troublesome to understanding the idea of the unit, to Facets that are closer to the learning goals (typically the 20s to 40s).
- Standards statements are parsed into sub-propositions (e.g., 01, 02, 03), so we can identify what part of the learning goal the learner seems to understand.
- Each major problematic Facet (e.g., 60) has examples under it (e.g., 61 to 64).
- In this Facet Cluster, the 90 is a problematic prerequisite idea.
- The Facet Cluster is not intended for use as a grading scale.
Comments: This is one example of the 67 Facet Clusters for interpreting students' thinking. The research-based misconceptions (problematic ideas) below the line are roughly ranked from the 90s (very troublesome) to Facets closer to the learning goals (typically the 20s to 40s). Standards statements are parsed into sub-propositions, so we can identify what part of the learning goal the learner seems to understand.
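Here is a minimal sketch of the numbering convention just described, with placeholder facet descriptions (the real "Forces as Interactions" cluster wording is not reproduced here):

```python
# Placeholder facet-cluster layout illustrating the numbering convention.
FACET_CLUSTER = {
    # Goal facets: the standard parsed into sub-propositions (01, 02, 03, ...)
    "01": "goal sub-proposition 1 (placeholder)",
    "02": "goal sub-proposition 2 (placeholder)",
    # Problematic facets, roughly ranked: 20s-40s are closer to the goal,
    # 60s are further away, 90s are very troublesome prerequisite ideas.
    "30": "partially productive but incomplete idea (placeholder)",
    "60": "major problematic idea (placeholder)",
    "61": "specific example of facet 60 (placeholder)",
    "90": "problematic prerequisite idea for this cluster (placeholder)",
}

def is_problematic(code: str) -> bool:
    """Goal facets are 01-09; everything from the 20s up is problematic."""
    return int(code) >= 20

def distance_from_goal(code: str) -> int:
    """Higher decade = further from the learning goal (not a grading scale)."""
    return int(code) // 10
```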

Online tour of the FAI unit
- Facets and Facet Clusters (the backbone of the system)
- Elicitation Questions and Discussion (primer)
  - EQ activity: opens the content (phenomenological) issues
  - Discussion: voicing student ideas; hearing other students' ideas and experiences

Online tour of the FAI unit
- Developmental Lessons:
  - address students' initial ideas and motivation to know
  - address the learning goals

Online tour of the FAI unit
- Online Question Sets:
  - average of 10 questions
  - 3 formats; multiple-choice and numerical items are diagnostic, based on the Facet Cluster
  - each response is coded to a Facet from this Facet Cluster

Online tour of the FAI unit
- Example of an open-ended item:
  - an opportunity to validate the diagnosis
  - an opportunity to learn new Facets

Teacher Report (summary format)
Data from the online assessment populates a report to students and a Teacher Report.
Teacher Report summary:
- the two most frequently occurring Facet diagnoses and the relevant Facet-based Prescriptive Activities
- frequencies of other problematic Facets
- ID numbers of the students who chose a response associated with each Facet at least twice (or at least once)
- de-emphasized in the lower left, the average percent correct of all students in that class
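The summary described above is essentially an aggregation over facet-coded responses. Here is a minimal Python sketch of that aggregation, assuming a simple student-to-facet-codes mapping; it is not the actual Diagnoser report code, and the student IDs and facet codes are invented for illustration.

```python
from collections import Counter, defaultdict

def summary_report(class_responses, min_count=2):
    """Build a teacher-summary-style view from facet-coded responses.

    `class_responses` maps student_id -> list of facet codes (one per item).
    Returns the two most frequent problematic facets and, for each facet,
    the students who triggered it at least `min_count` times.
    """
    facet_totals = Counter()
    students_by_facet = defaultdict(list)
    for student, facets in class_responses.items():
        per_student = Counter(f for f in facets if int(f) >= 20)  # problematic only
        for facet, n in per_student.items():
            facet_totals[facet] += n
            if n >= min_count:
                students_by_facet[facet].append(student)
    return facet_totals.most_common(2), dict(students_by_facet)

top_two, flagged = summary_report({
    "S01": ["01", "63", "63"],
    "S02": ["63", "90", "01"],
    "S03": ["01", "01", "30"],
})
print(top_two)   # [('63', 3), ('90', 1)]
print(flagged)   # {'63': ['S01']}
```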

Teacher Report (individual response details)
- A matrix of assessment items (across the top) mapped to student IDs (down the left side); each cell holds the diagnostic Facet code associated with that student's response to that item.
- Not just percent correct: if a response is not correct, what seems to be the student's idea?
- Reading across the table gives the teacher a glimpse of individual student performance.
- Cells of the matrix include the Facet diagnosis for multiple-choice and numerical items; the text of open-ended responses is accessible to the teacher.
- The table also includes each student's self-report of their understanding in the unit, and the percent correct of the items that student encountered.
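Below is a small Python sketch of that matrix layout, assuming "correct" means the response mapped to a goal facet; the data, item names, and output format are illustrative only, not the actual Diagnoser report.

```python
def detail_report(responses, goal_facets=("01", "02", "03")):
    """Print a student-by-item matrix of facet codes, plus % correct per student.

    `responses` maps student_id -> {item_id: facet_code}. This sketches the
    report layout only; percent correct here is the share of responses that
    mapped to a goal facet (an assumption for the example).
    """
    items = sorted({item for r in responses.values() for item in r})
    print("student", *items, "%correct", sep="\t")
    for student, codes in sorted(responses.items()):
        row = [codes.get(item, "-") for item in items]
        answered = [c for c in row if c != "-"]
        pct = 100 * sum(c in goal_facets for c in answered) / max(len(answered), 1)
        print(student, *row, f"{pct:.0f}%", sep="\t")

detail_report({
    "S01": {"Q1": "63", "Q2": "01"},
    "S02": {"Q1": "01", "Q2": "01"},
})
# Tab-separated output:
# student  Q1  Q2  %correct
# S01      63  01  50%
# S02      01  01  100%
```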

Online tour of the FAI unit
What do I do after interpreting the data?
- Prescriptive Activities specific to student difficulties, based on a diagnosis resulting in a particular Facet.

Common concerns/challenges of teachers engaging in formative assessment
- How do I interpret and prioritize the information I get from my students?
- What kind of feedback do I give my students? And how can I possibly do that all the time for everyone?
- Designing next instruction to target all the different issues that emerge is not feasible.
The formative assessment cycle: collect formative assessment responses, interpret them using the Facet Cluster, and act to address problematic as well as productive ideas and the learning goals.

What makes Diagnoser different from other online assessment resources?
- Supports formative assessment as a process (an integrated cycle of learning and assessment, not an event or a type of assessment).
- An interpretive framework of "Facets" and "Facet Clusters" supports a consistent, comparable, and descriptive interpretation of student work.
- Facets provide a common language for describing learning.
Comments: Recently we have also noted that Diagnoser could help classroom assessment participate in the larger system of learning assessment in a district, and even in high-stakes testing, making both more meaningful and actionable:
- embed a few common, or similarly diagnostic, questions throughout the various levels of assessment (class, school, district, state)
- use a common interpretive framework based on real student data

Exploring Diagnoser Tools
- Register as a teacher (save your teacher ID and password).
- Handouts are available for reference.
- Explore the Diagnoser.com site; individual Q&A.
- Those experienced with Diagnoser will help you.

Evidence for APEX PD Program Improvement
- Description of the APEX Program (content, pedagogy, and technology; action research)
- Effects of Diagnoser QSets being used in teachers' action research
- Effects of the APEX Program on teacher professional development
Comments: briefly describe from the slide.

APEX teacher use of the QSet data
- Formative assessment within the unit: inform decisions about next activities
- Summative assessment: checking for retention
Comments: how the APEX teachers are using Diagnoser.

Diagnoser data from one classroom (2014-15): results from the Forces as Interactions unit.

  QSet  Date       Facet 40  Facet 50  Facet 60  Facet 90   N   % Correct
  1     December      86%       0%       36%       5%      22      48%
  2     January      100%      40%        8%       4%      25      68%
        June          96%       --        --       --      24      82%

Report of multiple uses of QSets within one unit: Facets in the unit across the top; unit, date, and QSet used down the left; percent correct down the right side.
- Formative use in December, then Prescriptive Activities and winter break
- Formative use in January
- Summative use at the end of the school year (five months later)
Results:
- Percent correct increases through the school year (what to work on, and retention by the end of the year).
- The percentages for the problematic Facets change, but mostly decrease, through the year; the major problematic Facets evolve.
- This gives the teacher/researcher information about the learning progression (LP) within one unit.
The majority of the 70 physics teachers from an MSP in Alabama are using Diagnoser as one source of data to monitor progress (a collaboration among Alabama A & M, the University of Alabama, State of Alabama ASIM specialists, and the teachers and their schools and districts).
Vision: think about information that can affect teacher decisions; possible teacher sharing with an administrator to show progress; possible sharing and comparing with other teachers' results.
Far vision: suppose a framework (something like Facets) were used at all levels of the assessment system to help all levels find meaning in the data.
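To show how a report like the table above could be tallied, here is a minimal Python sketch. It assumes the facet percentages are the share of students diagnosed with each facet on that administration and that percent correct is averaged per student; the function name, data layout, and numbers in the usage example are hypothetical, not taken from the slide's data.

```python
from collections import Counter

def facet_table(administrations, facets=("40", "50", "60", "90")):
    """For each QSet administration, report the percent of students diagnosed
    with each facet, the number of students N, and the mean percent correct.

    `administrations` maps a label to a list of per-student records of the
    form (set of diagnosed facet codes, percent correct). The data layout is
    a guess at one plausible reading of the report shown on the slide.
    """
    table = {}
    for label, students in administrations.items():
        n = len(students)
        counts = Counter(code for diagnosed, _ in students for code in diagnosed)
        table[label] = {
            **{f: f"{100 * counts[f] / n:.0f}%" for f in facets},
            "N": n,
            "Correct": f"{sum(pc for _, pc in students) / n:.0f}%",
        }
    return table

# Hypothetical records: (facets diagnosed, percent correct) per student.
print(facet_table({
    "1 December": [({"60"}, 40), ({"40", "60"}, 55), ({"90"}, 50)],
    "2 January":  [({"40"}, 70), ({"40"}, 65)],
}))
```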

APEX teachers share their Diagnoser experiences
APEX teachers and ASIM specialists share the benefits and challenges of using Diagnoser in their classrooms.

Diagnoser & Physics Education Research on learner understanding
- Of 40 comparisons of the first assessment with the last assessment across 4 forces units, 23 showed statistically significant positive effects and 1 showed a significant negative effect.
- High-frequency problematic Facets change across time.
*These results are from one cohort, 2014-15.
Comments: as per the slide.
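The slide does not say which statistical test produced these comparisons. Purely as an illustrative sketch, the Python snippet below runs an unpaired Welch t-test on first-versus-last QSet percent-correct scores for one class; the function, data layout, and choice of test are assumptions and may not match the actual analysis.

```python
# Assumes SciPy is available; scipy.stats.ttest_ind is a standard Welch/t-test.
from scipy import stats

def compare_first_last(first_scores, last_scores, alpha=0.05):
    """Return (direction, significant) for one class's first vs last assessment.

    `first_scores` and `last_scores` are lists of per-student percent-correct
    values; an unpaired test is used here only as a stand-in for whatever
    method the project actually applied.
    """
    t, p = stats.ttest_ind(last_scores, first_scores, equal_var=False)
    direction = "positive" if t > 0 else "negative"
    return direction, p < alpha

# e.g., tally "significant positive" effects over many class-level comparisons:
# results = [compare_first_last(first, last) for first, last in class_pairs]
```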

Diagnoser data from one classroom (2014-15): results from the Forces as Interactions unit.

  QSet  Date       Facet 40  Facet 50  Facet 60  Facet 90   N   % Correct
  1     December      86%       0%       36%       5%      22      48%
  2     January      100%      40%        8%       4%      25      68%
        June          96%       --        --       --      24      82%

Report of multiple uses of QSets within one unit; note what happens across time to the problematic Facets.
- Formative use in December, then Prescriptive Activities and winter break
- Formative use in January
- Summative use at the end of the school year (five months later)
Results: percent correct increases through the school year (what to work on, and retention by the end of the year). The percentages for the problematic Facets change, but mostly decrease, through the year; the major problematic Facets evolve. This gives the teacher/researcher information about the learning progression within one unit.

Closing Remarks
1. Automated and coordinated tools for integrating teaching, learning, and assessment that support teachers and students in a continuous cycle of learning, with ongoing feedback to improve that cycle.
2. A shared and robust (research-based) framework of Facets for interpreting student responses and learning behind the assessment and learning tools that are developed.
3. A focus on creating high-quality (reliable and descriptive) classroom data that can inform and/or be integrated into other levels of assessment within the educational system.
Comments:
1. Automated tools, data, and analysis distributed to teachers. Diagnoser Tools are useful to students, teachers, administrators, professional developers, and researchers, and support Facet-based, diagnostic formative assessment and instruction.
2. A framework for interpreting what students say and do, specifically related to the ideas and reasoning of the unit. Facet-based interpretation of data serves teaching and learning decisions; all assessment and instruction tools are based on Facets.
3. Multiple uses of the assessments and the interpretive framework mark learning and identify the parts of curriculum and instruction that need tuning. We are getting consistent and reliable data from the classroom; this data can be the base that funnels up through the system. The data are meaningful at the classroom level AND meaningful to administrative systems for monitoring progress, with all levels of the system using the same framework for interpretation.

Acknowledgements
The results, analyses, and ideas in this presentation have been partially supported by NSF Grant DUE 1238192. The ideas are those of the presenter and may not be consistent with those of the Foundation or of the APEX Project.
Informing evidence-based decision making

Every response to a multiple-choice or numerical item is associated with a particular Facet; this example shows the Facet associated with each response.
- Question Sets average 10 items per set; all students respond to a subset of common items.
- The system branches to follow up (e.g., repeat a question) on student difficulties.
- Some paired items ask for an answer and the reasoning behind it (and check their consistency); some items include reasoning with the DCI.
- Students receive feedback after each item or pair of items.
Summary report for the student:
- number correct
- student self-rating
- two problematic ideas to work on
Some teachers ask students to print out this report and use it to guide their rethinking of the problematic ideas related to the learning goals: the student discusses the problematic idea, what evidence they have that the idea does not work, and what idea does work and what the evidence for it is.
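To make the branching and feedback flow concrete, here is a minimal Python sketch of administering one facet-coded item with a paired reasoning follow-up. The item keys (facet_by_choice, reasoning_followup, feedback_by_facet) are a hypothetical layout, not Diagnoser's actual delivery engine.

```python
def administer_item(item, get_response, show_feedback):
    """Administer one facet-coded item with optional follow-up branching.

    A sketch of the flow described above: the chosen response maps to a
    facet; if that facet is problematic and a paired follow-up exists, the
    student is asked for reasoning so the diagnosis can be checked for
    consistency; descriptive feedback is shown after the item (or pair).
    `get_response` and `show_feedback` stand in for the user interface.
    """
    answer = get_response(item["prompt"], item["choices"])
    facet = item["facet_by_choice"][answer]
    if int(facet) >= 20 and "reasoning_followup" in item:   # problematic facet
        followup = item["reasoning_followup"]
        reasoning = get_response(followup["prompt"], followup["choices"])
        reasoning_facet = followup["facet_by_choice"][reasoning]
        consistent = reasoning_facet == facet
    else:
        reasoning_facet, consistent = None, True
    show_feedback(item["feedback_by_facet"][facet])
    return {"facet": facet, "reasoning_facet": reasoning_facet,
            "consistent": consistent}
```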

Suppose we had classroom assessment as part of a larger system of assessment?
- Embed a few common, or similarly diagnostic, questions throughout the various levels of assessment (class, school, district, state).
- Use a common interpretive framework based on real student data.
Suppose our assessments (formative and summative) at all levels involved some items based on a common framework (like Facets) for interpreting the data. This is a recommendation to policy makers, in case there are any here.

Reading and using the Teacher Report: data and results from Diagnoser QSets
The goal of a good formative assessment process is not to collect all the student data that you can, but to "can" all the student data that you collect.

Building on Learner Thinking (BOLT)