

2 The Portals (Coolfont) Workshop decided that DLESE will be a two-level collection:
– The unreviewed collection: a broad collection of content which is relevant to Earth System education and meets minimum quality and technical standards
– The reviewed collection: a subset of high-quality teaching and learning materials which have been rigorously evaluated

3 The rationale for having a reviewed collection:
– For the user: guaranteed high-quality resources, even for a teacher without expertise in the field or time to “comparison shop”
– For the creator: inclusion in the reviewed section of DLESE can become a recognized stamp of professional approval
The rationale for having an unreviewed collection:
– For the user: access to a wider range of teaching and learning resources
– For the library builders: a pool from which to select the reviewed collection

4 OK, so how do we decide what goes into the reviewed collection?

5 From the Portals (Coolfont) workshop, the selection criteria are:
– Accuracy, as evaluated by scientists
– Importance/significance
– Pedagogical effectiveness
– Well-documented
– Ease of use for students and faculty
– Motivational/inspirational for students
– Robustness/sustainability
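These seven criteria are what the numerical scores on the community review questionnaire (slide 20) would presumably capture. As a purely illustrative sketch, one review record built around the criteria might look like the following; the field names and the 1–5 scale are assumptions, not part of the workshop's output:

```python
from dataclasses import dataclass, fields

@dataclass
class CriterionScores:
    """One educator's ratings of a resource against the seven
    Portals (Coolfont) selection criteria (hypothetical 1-5 scale)."""
    accuracy: int                   # accuracy, as evaluated by scientists
    importance: int                 # importance/significance
    pedagogical_effectiveness: int
    documentation: int              # how well-documented the resource is
    ease_of_use: int                # ease of use for students and faculty
    motivation: int                 # motivational/inspirational for students
    robustness: int                 # robustness/sustainability

    def average(self) -> float:
        """Unweighted mean across all seven criteria."""
        values = [getattr(self, f.name) for f in fields(self)]
        return sum(values) / len(values)
```

A complete questionnaire response would also carry the yes/no answer and the free-text teaching tips that slide 20 routes to different recipients.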

6 Familiar review procedures:
– “Traditional peer review”
– “Traditional educational evaluation”

7 Traditional “Peer-Review”
– Reviewers are selected for their expertise by an editor.
– Reviewers examine the material, or a description of the material, in their home or office.
– Typically two reviews.

8 Traditional “Peer-Review”: What’s wrong with this picture?

9 Traditional “Peer-Review”: There are no students in this picture!

10 “Traditional Educational Evaluation”
– The evaluator (reviewer) is selected by the developer.
– The evaluator observes students in another teacher’s classroom and/or administers evaluation instruments.
– Typically one evaluator, several classes of students.

11 “Traditional Educational Evaluation”: What’s wrong with this picture?

12 “Traditional Educational Evaluation”: Evaluation by independent professional evaluators is labor-intensive and expensive!

13 Community Review Concept Premises
The materials in the “inner circle” of reviewed, DLESE-stamp-of-approval-bearing resources must be classroom-tested.
– However, testimony from the creator of a resource that learning has occurred in his or her classroom is insufficient.
– It is not realistic to pay for professional evaluators to go into classrooms to evaluate whether student learning has occurred for every potential DLESE resource.
– Experienced educators can tell whether or not their own students are learning effectively from an educational resource.
– It is easier to answer “Did your students learn?” than “Do you think students would learn?”


15 Community Review Concept Premises (cont’d)
– In order to be useful, DLESE has to contain lots of resources; therefore it must grow fast.
– In the DLESE ecosystem, teachers, classrooms, and students will be abundant resources.
– The rate-limiting resources in DLESE’s growth will be money and the time of paid librarians/editors/gatekeepers and computer wizards.
– This is a digital library; we can and should take advantage of automated information gathering.


17 Community Review Concept Procedure


20 Community Review Concept Procedures (cont’d)
What happens to the questionnaire information?
– The creator receives all feedback from “YES” and “NO” respondents.
– Builders of the Discovery System receive feedback from “NO” respondents.
– Suggestions typed in the teaching-tips field are added to the Teachers’ section of the resource.
– The “editor” or “gatekeeper” is automatically notified and receives the full packet of reviews when the number of complete reviews exceeds N and the average, or weighted average, of the numerical scores exceeds Y.
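The last bullet describes an automated trigger, so a small sketch may help. Assuming each stored review carries the yes/no answer, a numerical score, an optional reviewer weight, and the teaching-tips text (the class and function names, and the per-reviewer weighting, are assumptions rather than a DLESE specification), the routing and notification rules could be expressed as:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Review:
    resource_id: str
    used_successfully: bool   # the "YES"/"NO" answer on the questionnaire
    score: float              # numerical score (e.g. mean over the 7 criteria)
    weight: float = 1.0       # hypothetical per-reviewer weight
    teaching_tips: str = ""   # free-text suggestions for the Teachers' section

def route_feedback(review: Review) -> List[str]:
    """Decide who sees a single completed questionnaire."""
    recipients = ["creator"]                            # creator gets all feedback
    if not review.used_successfully:
        recipients.append("discovery_system_builders")  # "NO" responses only
    if review.teaching_tips:
        recipients.append("teachers_section")           # tips appended to the resource
    return recipients

def should_notify_editor(reviews: List[Review], n: int, y: float) -> bool:
    """Fire the editor/gatekeeper notification once more than N complete
    reviews exist and their (weighted) average score exceeds Y."""
    if len(reviews) <= n:
        return False
    total_weight = sum(r.weight for r in reviews)
    weighted_avg = sum(r.score * r.weight for r in reviews) / total_weight
    return weighted_avg > y
```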

21 Community Review Concept Procedure

22 Community Review Concept Strengths
– Inclusive: The community builds the library.
– Scalable: Hundreds or thousands of resources can be classroom-tested.
– Thorough: All seven Coolfont/Portals selection criteria are applied.
– Economical: Scarce talents are applied at the end of the process, to the smallest number of items.

23 Community Review Concept Issues
– How do we get educators to send in their reviews?
– How do we ensure that reviews come from bona fide educators?
– Would creators “spin” the review process by soliciting reviews from their friends?
– Would the merely good early arrival tend to keep out the truly excellent later arrival?
– Some topics are inherently less inspirational/motivational than others; how do we avoid filtering out resources on such topics?
– What about off-the-wall, erroneous, or malicious reviews?
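For the last issue, one possible mitigation (offered here only as an assumption about how the numerical scores might be aggregated, not as anything decided at the workshop) is a robust average that discards the extreme ratings before the threshold test of slide 20 is applied:

```python
from typing import List

def trimmed_mean(scores: List[float], trim: int = 1) -> float:
    """Average the scores after dropping the `trim` highest and lowest
    values, so that a single off-the-wall or malicious rating cannot
    move the aggregate much (illustrative only)."""
    if not scores:
        raise ValueError("no scores to aggregate")
    if len(scores) <= 2 * trim:
        return sum(scores) / len(scores)   # too few reviews to trim safely
    kept = sorted(scores)[trim:-trim]
    return sum(kept) / len(kept)
```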


25 How can I become part of DLESE?
– … as a resource creator/contributor
– … as a user
– … as a reviewer/tester

26 Continue the conversation at: collections@dlese.org or http://www.ldeo.columbia.edu/dlese/collections

