Exploring Educational Standard Alignment: In Search of ‘Relevance’ (JCDL 2008). René Reitsma*, Byron Marshall*, Michael Dalton*, Martha Cyr†. *Oregon State University; †Worcester Polytechnic Institute.

Presentation transcript:

Exploring Educational Standard Alignment: In Search of ‘Relevance’ (JCDL 2008)
René Reitsma*, Byron Marshall*, Michael Dalton*, Martha Cyr†
*Oregon State University; †Worcester Polytechnic Institute

Problem: aligning DL learning objects with educational standards
– Need & tantalizing promise
– National Science Digital Library (NSDL) efforts & accomplishments
– Early results show low inter-rater reliability (IRR)
Hypothesis: low IRR is partially a methodological artifact
Proposal: a multifactor concept of ‘alignment’
Experiment: 10-factor alignment model
– High IRR
– Four-factor regression model (R² = .75) of ‘overall’ alignment

Aligning DL Learning Objects with Educational Standards
Expanding DL learning resource base; e.g.:
– National Science Digital Library (NSDL): 928 collections
– K-12: TeachEngineering.org, TeachersDomain.org, Engineering is Elementary, etc.
– NSF GK-12 program (ongoing)
≈84,500 math, science & technology standards (changing frequently)

Curriculum Standard Alignment Efforts
NSDL leadership:
– Jes&Co: Achievement Standards Network (ASN)
– Center for Natural Language Processing (CNLP): Curriculum Alignment Tool (CAT), Standard Alignment Tool (SAT)
– WGBH Teachers’ Domain: standard alignment & lexicon
Others:
– Academic Benchmarks
– AAAS/NSDL Strandmap server
– Etc.

NSDL-based Curriculum Alignment Services

How Good Are These Alignments?

Low Inter-Rater Reliability (IRR)
Devaul, H., Diekema, A.R., Ostwald, J. (2007)
Bar-Ilan, J., Keenoy, K., Yaari, E., Levene, M. (2007): “There is no average user, and even if the users have the same basic knowledge of a topic, they evaluate information in their own context…”
Hypothesis: low IRR is partially a methodological artifact
– Alignment is a multifactor, multidimensional concept
– Learning objects may align with certain dimensions but not with others
– Levins, R., Lewontin, R.C. (1980): “Abstraction becomes destructive when the abstract becomes reified… so that the abstract descriptions are taken for descriptions of the actual objects”

Dimensions of Alignment
Saracevic, T. (2007): ‘Relevance: A Review of the Literature and a Framework for Thinking on the Notion in Information Science. Part II: Nature and Manifestations of Relevance’
“Clues” mapped (by us!) to educational DLs:
– Content → topics & concepts
– Object → cost, learning object type, formatting
– Validity → trustworthiness
– Use/situational match → grade level, institutional requirements (e.g., testing procedures, professional development)
– Cognitive match → teacher qualifications, pedagogy
– Belief/affective match → emotional response

Hypotheses
H: One-dimensional alignment/relevance IRR is partially a methodological artifact.
– H-1: At least some dimensional IRRs will be high(er)
– H-2: Dimensional IRR will vary
– H-3: ‘Overall alignment/relevance’ IRR will be low, even when asked in the context of dimensional relevance testing

Experiment
‘Clue’ / alignment factor / statement:
– Affective match / R-1 Appeal: The document contains materials that are motivational or stimulating (interesting, appealing, or engaging) for students
– Content / R-2 Concepts: The document includes concepts, keywords, terms, and definitions from the standard
– Content / R-3 Background: The document provides interesting and important background material related to the standard
– Object / R-4 Grade level: The grade level of this material is appropriate for this task, or else I can easily adapt the materials in this document to my grade level
– Situational match / R-5 Nontextuals: I can use (a) nontextual component(s); e.g., figures, tables, images, videos, or graphics
– Situational match / R-6 Examples: I can use the real-world examples provided in the document in class
– Situational match / R-7 Hands-on: I can use one or more of the hands-on, active engineering activities
– Situational match / R-8 Attachments: I can use some of the attachments; e.g., score sheets, rubrics, test questions, etc.
– Situational match / R-9 References: I can use references or Internet links to relevant materials elsewhere
– R-10 Overall relevance: Overall, I consider this document relevant for this teaching assignment

Experiment (cont’d)
14 subjects, all familiar with the TeachEngineering system.
Two teaching tasks; e.g.:
– “As a third-grade Massachusetts teacher you are assigned to teach material related to the standard ‘Relate earthquakes, volcanic activity, mountain building, and tectonic uplift to plate movements.’ You have two hours of class time to spend on instruction.”
Judge the alignment of three curricular objects (R-1 – R-10, six-point Likert scale).
IRR criteria:
– IRR-1: both subjects score on the same side of the scale; i.e., both score ‘strongly agree,’ ‘agree,’ or ‘somewhat agree,’ or both score ‘somewhat disagree,’ ‘disagree,’ or ‘strongly disagree’
– IRR-2: same as IRR-1, except that the answers may not differ by more than one scale point
– IRR-3: both subjects answer the question identically
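The three IRR criteria above are simple agreement rules on a 6-point Likert scale. A minimal sketch (not the authors' code; function names and the 1–6 coding are my assumptions, with 1–3 as the "disagree" side and 4–6 as the "agree" side):

```python
# Illustrative sketch of the slide's three IRR criteria for a pair of
# raters scoring on a 6-point Likert scale (1 = strongly disagree ...
# 6 = strongly agree). Names and coding are hypothetical.

def irr_pair(a, b):
    """Return (IRR-1, IRR-2, IRR-3) agreement flags for two ratings."""
    irr1 = (a <= 3) == (b <= 3)        # same side of the scale
    irr2 = irr1 and abs(a - b) <= 1    # same side AND within one point
    irr3 = a == b                      # identical answers
    return irr1, irr2, irr3

def irr_rates(ratings_a, ratings_b):
    """Aggregate agreement rates over parallel lists of ratings."""
    n = len(ratings_a)
    flags = [irr_pair(a, b) for a, b in zip(ratings_a, ratings_b)]
    return tuple(sum(f[i] for f in flags) / n for i in range(3))
```

By construction IRR-3 agreement implies IRR-2, which implies IRR-1, so the three rates are increasingly strict.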

Results
91 IRR comparisons × 10 alignment dimensions × 6 alignments
– H-1: IRRs are relatively high; IRR-1 (binary): 64%–95%
– H-2: IRR variability
– H-3: overall relevance (R-10) among the weaker ones

How About Overall Relevance (R-10)?
[Pairwise correlation matrix (r) among the ten alignment dimensions, R-1 Appeal through R-10 Overall; diagonal entries = 1.0. The individual coefficients did not survive in this transcript.]
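A correlation matrix like the one on this slide can be built directly from the dimension-by-judgment score matrix. The sketch below uses made-up scores (the sample size of 84 is a hypothetical stand-in, not the study's data):

```python
# Minimal sketch: pairwise correlations among the ten alignment
# dimensions R-1..R-10, computed with numpy.corrcoef. Scores here are
# random placeholders, not the experiment's data.
import numpy as np

rng = np.random.default_rng(1)
scores = rng.integers(1, 7, size=(10, 84))  # 10 dimensions x 84 judgments
r = np.corrcoef(scores)                     # 10x10 matrix, diagonal = 1.0
```

Each row of `scores` is treated as one variable, so `r[i, j]` is the Pearson correlation between dimensions R-(i+1) and R-(j+1).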

MLR Model of Overall Relevance (R-10)
R² = .75
– ‘Overall alignment/relevance’ is meaningful as a complex variable.
– Some high-IRR alignment dimensions do not contribute to overall alignment.
[Coefficient table (β, std. error, t-value, p) for the intercept and the predictors R-2 Concepts (p < .01), R-3 Background, R-4 Grade level (p < .01), and R-7 Hands-on; the individual values did not survive in this transcript.]
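The regression reported here predicts the overall score (R-10) from four dimensional scores. A sketch of that kind of ordinary-least-squares fit, on synthetic data (the coefficients and sample size below are made up; the slide's R² of .75 comes from the authors' 14-subject experiment):

```python
# Sketch of a four-predictor OLS model of overall relevance (R-10)
# from R-2, R-3, R-4, and R-7 scores. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 84                                              # hypothetical sample size
X = rng.integers(1, 7, size=(n, 4)).astype(float)   # R-2, R-3, R-4, R-7 scores
beta_true = np.array([0.5, 0.2, 0.4, 0.1])          # made-up coefficients
y = X @ beta_true + rng.normal(0, 0.5, n)           # synthetic R-10

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef
r2 = 1 - resid.var() / y.var()
print(f"R^2 = {r2:.2f}")
```

With real judgments in `X` and `y`, `coef` would correspond to the slide's β column and `r2` to its reported fit.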

Conclusion
– K-12 educational DL content is expanding; educational standard alignment is needed.
– Innovative and promising resources are available, but the reported IRR of alignment assessments is low.
– We propose that ‘alignment’ is a complex concept:
– Recognize alignment dimensions
– The experiment suggests that dimension-specific IRR will be (much) higher
– ‘Overall’ alignment has a very specific interpretation
– What we need:
– Continued assessment and IRR collection
– Collections making their assessment data available
– Alignment methods that can assimilate ‘evidence’ from the multiple dimensions that comprise ‘alignment’ of a learning resource with a teaching standard