Slide 1
Measuring the Ephemeral: Evaluation of Informal STEM Learning Experiences
Broader Impacts Infrastructure Summit, Arlington, VA, April 16-18, 2014
Images courtesy of ISE PI Meeting 2012 attendees. From left to right: Geoffrey Haines-Stiles; Mohini Patel Glanz, NWABR; Scot Osterweil; April Luehmann
Slide 2
Jamie Bell, Trevor Nesbit, Kalie Sacco, Grace Troxel (Association of Science-Technology Centers)
John Falk (Oregon State University, Free-Choice Learning Program)
Kevin Crowley (University of Pittsburgh Center for Learning in Out-of-School Environments)
Kirsten Ellenbogen (Great Lakes Science Center)
Sue Ellen McCann (KQED Public Media)
Slide 3
CAISE: Convene, Connect, Characterize, Communicate
CAISE major initiatives for 2012-2015: Broader Impacts/ISE; Practice-and-Research; Evaluation Capacity Building
InformalScience.org | NSF AISL Program
Sectors served: Citizen Science; Cyber & Gaming; Media (TV, Radio, Film); Museums & Science Centers; Zoos, Botanical Gardens, Aquaria; Festivals, Cafes, Events; Youth & Community Programs
Slide 4
Framework for Evaluating Impacts of Informal Science Projects (2008)
Slide 5
Surrounded by Science: Learning Science in Informal Environments (National Academies Press, 2010)
Learning Science in Informal Environments: People, Places, and Pursuits (National Academies Press, 2009)
Slide 6
Learning Science in Informal Environments (2009), Six Learning Strands:
1. Developing interest
2. Understanding scientific knowledge
3. Engaging in scientific reasoning
4. Reflecting on science
5. Engaging in scientific practices
6. Identifying with the scientific enterprise
Slide 7
Taking Science to School: Learning and Teaching Science in Grades K-8 (2007)
K-8 school science learning strands (numbered to align with the six informal strands above):
1. (blank)
2. Understanding scientific explanations
3. Generating scientific evidence, explanations, and arguments
4. Reflecting on how science knowledge is produced and used in society
5. Participating in the practices of science: specialized talk, disciplinary tool use, representations
Slide 8
Informal learning environments are complex.
Challenges: a single experience cannot be isolated from its context; experimental design is impractical.
Opportunities: allows a wide range of outcomes; naturally learner-driven; inspires new methods and approaches.
Many different types of learning experiences and interactions can take place within a single exhibition.
Image attribution: How People Make Things Summative Evaluation, Camellia Sanford, University of Pittsburgh, 2009. Accessed March 20, 2014: http://informalscience.org/evaluation/ic-000-000-003-205/
Slide 9
Informal learning is collaborative and social.
Challenges: assessing individuals can be difficult.
Opportunities: helps us better understand how people learn.
High school students collaborate with an ecologist to study aquatic ecosystems.
Image attribution: Making Natural Connections: An Authentic Field Research Collaboration. Accessed March 20, 2014: http://informalscience.org/projects/ic-000-000-001-050
Slide 10
LIFTOFF Evaluation Framework
Data collection: number of programs offering STEM and the intensity of those experiences
Youth impacts: attitudes, motivation, identity
Professional impacts: attitudes, demand, confidence
Program quality impacts: frequency and intensity
Slide 11
12 Dimensions of Success, in four domains:
Features of the Learning Environment: Organization; Materials; Space Utilization
Activity Engagement: Participation; Purposeful Activities; Engagement with STEM
STEM Knowledge & Practices: STEM Content Learning; Inquiry; Reflection
Youth Development in STEM: Relationships; Relevance; Youth Voice
www.PEARweb.org
Slide 12
Resources for Evaluation: The PI’s Guide to Managing Evaluation InformalScience.org/evaluation/evaluation-resources/pi-guide
Slide 13
Resources for Evaluation: Evaluations on InformalScience.org InformalScience.org/evaluation
Slide 14
Building Informal Science Education (BISE) Project: Characterizing evaluations on InformalScience.org
[Bar chart: Frequency of data collection methods used in the BISE synthesis reports (n=427). Interviews (275) and surveys (261) were by far the most common methods, followed by timing and tracking (128), observation (73), focus groups (69), and artifact review (40). Less frequent methods included journals, web analytics, recorded conversations, participation data, card sorts, professional critiques, drawings, comment cards and books, interactive methods, concept maps, and the Delphi technique; some reports did not describe their method.]
Slide 15
Informal STEM Education Assessment Projects
Advancing Technology Fluency (PI: Brigid Barron, Stanford University)
Developing, Validating, and Implementing Situated Evaluation Instruments (DEVISE) (PI: Rick Bonney, Cornell University)
Common Instrument (PI: Gil Noam, Harvard University)
Framework for Observing and Categorizing Instructional Strategies (FOCIS) (PI: Robert Tai, University of Virginia)
Science Learning Activation Lab (PI: Rena Dorph, University of California, Berkeley)
SYNERGIES (PI: John Falk, Oregon State University)
Learn more at: http://informalscience.org/perspectives/blog/updates-from-the-field-meeting-on-assessment-in-informal-science-education
Slide 16
What Are We Measuring?
[Word cloud (Wordle) representing the constructs measured by the six informal STEM education assessment projects listed above.]
Slide 17
Learn more:
Jamie Bell, Project Director & PI of CAISE: jbell@astc.org
InformalScience.org
facebook.com/informalscience | @informalscience
Slide 18
InformalScience.org/about/informal-science-education/for-scientists