Workshop on Quality/Selectivity of the DLESE Collections
Framing the Question / History of the Discussion
Kim Kastens, June 30, 2003
Presentation transcript:

The Portals (Coolfont) Workshop decided that DLESE will be a two-level collection:
–The unreviewed collection: a broad collection of content which is relevant to Earth system education and meets minimum quality and technical standards
–The reviewed collection: a subset of high-quality teaching and learning materials which have been rigorously evaluated

The rationale for having a reviewed collection:
–For the user: guaranteed high-quality resources, even for a teacher without expertise in the field or time to “comparison shop”
–For the creator: inclusion in the reviewed section of DLESE can become a recognized stamp of professional approval
The rationale for having an unreviewed collection:
–For the user: access to a wider range of teaching and learning resources
–For the library builders: a pool from which to select the reviewed collection

OK, so how do we decide what goes into the reviewed collection?

From the Portals (Coolfont) workshop, the selection criteria:
–Accuracy, as evaluated by scientists
–Importance/significance
–Pedagogical effectiveness
–Well-documented
–Ease of use for students and faculty
–Motivational/inspirational for students
–Robustness/sustainability

Familiar review procedures: “traditional peer review” and “traditional educational evaluation”

Traditional “Peer Review”
–Reviewers are selected for their expertise by an editor.
–Reviewers examine the material, or a description of the material, in their home or office.
–Typically two reviews.

Traditional “Peer Review”: what’s wrong with this picture?

Traditional “Peer Review”: there are no students in this picture!

“Traditional Educational Evaluation”
–The evaluator (reviewer) is selected by the developer.
–The evaluator observes students in another teacher’s classroom and/or administers evaluation instruments.
–Typically one evaluator, several classes of students.

“Traditional Educational Evaluation”: what’s wrong with this picture?

“Traditional Educational Evaluation”: evaluation by independent professional evaluators is labor-intensive and expensive!

Community Review Concept: Premises
The materials in the “inner circle” of reviewed, DLESE-stamp-of-approval-bearing resources must be classroom-tested.
–However, testimony from the creator of a resource that learning has occurred in his or her classroom is insufficient.
–It is not realistic to pay for professional evaluators to go into classrooms to evaluate whether student learning has occurred for every potential DLESE resource.
–Experienced educators can tell whether or not their own students are learning effectively from an educational resource.
–It is easier to answer “Did your students learn?” than “Do you think students would learn?”

Community Review Concept: Premises (cont’d)
–In order to be useful, DLESE has to contain lots of resources; therefore it must grow fast.
–In the DLESE ecosystem, teachers, classrooms, and students will be abundant resources.
–The rate-limiting resources in DLESE’s growth will be money and the time of paid librarians/editors/gatekeepers and computer wizards.
–This is a digital library; we can and should take advantage of automated information gathering.

Community Review Concept: Procedure

Community Review Concept: Procedures (cont’d)
What happens to the questionnaire information?
–The creator receives all feedback from “YES” and “NO” respondents.
–Builders of the Discovery System receive feedback from “NO” respondents.
–Suggestions typed in the teaching-tips field are added to the Teachers’ section of the resource.
–The “editor” or “gate-keeper” is automatically notified and receives the full packet of reviews when the number of complete reviews exceeds N and the average, or weighted average, of the numerical scores exceeds Y.
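The editor-notification rule above is simple enough to sketch in code. The following is a hypothetical illustration, not DLESE's actual implementation: the function name, the review-record fields (`score`, `weight`, `complete`), and the default thresholds standing in for the slide's N and Y are all assumptions made for the example.

```python
def should_notify_editor(reviews, n_min=10, score_min=4.0):
    """Return True when a resource is ready to go to the editor/gate-keeper.

    reviews:   list of dicts such as {"score": 4.5, "weight": 1.0, "complete": True}
    n_min:     minimum number of complete reviews (the slide's "N")
    score_min: minimum weighted-average score (the slide's "Y")
    """
    # Only complete reviews count toward the threshold.
    complete = [r for r in reviews if r.get("complete")]
    if len(complete) <= n_min:
        return False
    # Weighted average of the numerical scores (weight defaults to 1.0,
    # which reduces to a plain average).
    total_weight = sum(r.get("weight", 1.0) for r in complete)
    weighted_avg = sum(r["score"] * r.get("weight", 1.0) for r in complete) / total_weight
    return weighted_avg > score_min
```

With per-reviewer weights left at 1.0 this is an ordinary average; weights would let the system count, say, reviews from verified educators more heavily, which touches the "bona fide educators" issue raised later in the talk.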

Community Review Concept: Procedure

Community Review Concept: Strengths
–Inclusive: the community builds the library.
–Scalable: hundreds or thousands of resources can be classroom-tested.
–Thorough: all seven Coolfont/Portals selection criteria are applied.
–Economical: scarce talents are applied at the end of the process, to the smallest number of items.

Community Review Concept: Issues
–How do we get educators to send in their reviews?
–How do we ensure that reviews come from bona fide educators?
–Would creators “spin” the review process by soliciting reviews from their friends?
–Would the merely-good early arrival tend to keep out the truly excellent later arrival?
–Some topics are inherently less inspirational/motivational than others; how do we avoid filtering out resources on such topics?
–What about off-the-wall, erroneous, or malicious reviews?

How can I become part of DLESE?
–… as a resource creator/contributor
–… as a user
–… as a reviewer/tester

Continue the conversation at: or