Measuring the Ephemeral: Evaluation of Informal STEM Learning Experiences
Broader Impacts Infrastructure Summit, Arlington, VA, April 16-18, 2014
Images courtesy of ISE PI Meeting 2012 attendees. From left to right: Geoffrey Haines-Stiles; Mohini Patel Glanz, NWABR; Scot Osterweil; April Luehmann
Jamie Bell, Trevor Nesbit, Kalie Sacco, Grace Troxel (Association of Science-Technology Centers) John Falk (Oregon State University, Free-Choice Learning Program) Kevin Crowley (University of Pittsburgh Center for Learning in Out-of-School Environments) Kirsten Ellenbogen (Great Lakes Science Center) Sue Ellen McCann (KQED Public Media)
CAISE: Convene, Connect, Characterize, Communicate
CAISE major initiatives: Broader Impacts/ISE, Practice and Research, Evaluation Capacity Building, InformalScience.org
NSF AISL Program sectors: Citizen Science; Cyber & Gaming; Media (TV, Radio, Film); Museums & Science Centers; Zoos, Botanical Gardens & Aquaria; Festivals, Cafes, Events; Youth & Community Programs
Framework for Evaluating Impacts of Informal Science Projects (2008)
Surrounded by Science: Learning Science in Informal Environments, National Academies Press, 2010
Learning Science in Informal Environments: People, Places, and Pursuits, National Academies Press, 2009
Learning Science in Informal Environments (2009)
Six Learning Strands:
1. Developing interest
2. Understanding scientific knowledge
3. Engaging in scientific reasoning
4. Reflecting on science
5. Engaging in scientific practices
6. Identifying with the scientific enterprise
Taking Science to School: Learning and Teaching Science in Grades K-8 (2007)
K-8 school science learning strands:
1. (blank)
2. Understanding scientific explanations
3. Generating scientific evidence, explanations, and arguments
4. Reflecting on how science knowledge is produced and used in society
5. Participating in the practices of science: specialized talk, disciplinary tool use, representations
Informal learning environments are complex.
Challenges:
- A single experience cannot be isolated from its context
- Experimental design is impractical
Opportunities:
- Allow a wide range of outcomes
- Naturally learner-driven
- Inspire new methods and approaches
Many different types of learning experiences and interactions can take place within a single exhibition.
Image attribution: How People Make Things Summative Evaluation, Camellia Sanford, University of Pittsburgh, 2009. Accessed March 20, 2014.
Informal learning is collaborative and social.
Challenges:
- Individual assessment can be difficult
Opportunities:
- Helps us better understand how people learn
High school students collaborate with an ecologist to study aquatic ecosystems.
Image attribution: Making Natural Connections: An Authentic Field Research Collaboration. Accessed March 20, 2014.
LIFTOFF Evaluation Framework
- Data Collection: number of programs offering STEM and the intensity of those experiences
- Youth Impacts: attitudes, motivation, identity
- Professional Impacts: attitudes, demand, confidence
- Program Quality Impacts: frequency and intensity
12 Dimensions of Success
- Features of the Learning Environment: Organization, Materials, Space Utilization
- Activity Engagement: Participation, Purposeful Activities, Engagement with STEM
- STEM Knowledge & Practices: STEM Content Learning, Inquiry, Reflection
- Youth Development in STEM: Relationships, Relevance, Youth Voice
Resources for Evaluation: The PI’s Guide to Managing Evaluation InformalScience.org/evaluation/evaluation-resources/pi-guide
Resources for Evaluation: Evaluations on InformalScience.org InformalScience.org/evaluation
Building Informal Science Education (BISE) Project: Characterizing evaluations on InformalScience.org
Frequency of data collection methods used in the BISE synthesis reports (n=427): Interview; Survey; Timing and tracking; Observation; Focus group; Artifact review; Journals; Web analytics; Recorded conversations; Participation data; Didn't describe method; Card sort; Professional critique; Drawings; Other; Comment cards and books; Interactive methods; Concept map; Delphi technique
Informal STEM Education Assessment Projects
- Advancing Technology Fluency (PI: Brigid Barron, Stanford University)
- Developing, Validating, and Implementing Situated Evaluation Instruments (DEVISE) (PI: Rick Bonney, Cornell University)
- Common Instrument (PI: Gil Noam, Harvard University)
- Framework for Observing and Categorizing Instructional Strategies (FOCIS) (PI: Robert Tai, University of Virginia)
- Science Learning Activation Lab (PI: Rena Dorph, University of California, Berkeley)
- SYNERGIES (PI: John Falk, Oregon State University)
Learn more at: meeting-on-assessment-in-informal-science-education
What are We Measuring? This Wordle represents constructs measured by six informal STEM education assessment projects.
Learn more… Jamie Bell, Project Director & PI of CAISE InformalScience.org
InformalScience.org/about/informal-science-education/for-scientists