Slide 1: Did We Succeed? Assessing a New Instruction Initiative for a Revamped Physics Lab

Carrie Leatherman, Western Michigan University, ALA Annual Conference 2014

Hello, everyone. Thank you for coming today. My name is Carrie Leatherman, and I am the Natural Sciences Librarian at Western Michigan University. My presentation today is titled "Did We Succeed? Assessing a New Instruction Initiative for a Revamped Physics Lab."
Slide 2: Overview

Developed an assessment plan for a new instruction initiative
- Why the initiative was needed
- Instruction and assessment plan
- Results of the assessment
- Assessment challenges

Let me start by giving you an overview of what I'm going to be talking about today. I will be describing how I developed an assessment plan for a new information literacy instruction initiative for a physics class at my home institution, WMU. First, I will explain why this new instruction initiative was needed. Then, I will briefly describe the instruction initiative, including how assessment of the initiative was planned for from the very beginning of this project. Next, I will give you the results of the assessment, which will answer the question "Did we succeed?" Last, I will talk about some of the assessment challenges I encountered when assessing this particular initiative, and how I handled them.
Slide 3: Why the initiative was needed

- Overview of the PHYS 3100 lab
- New learning outcomes for the lab
- Problems the first semester
- Students lacked information literacy skills
- Additional learning objective needed

First, let me tell you about why this new information literacy instruction initiative was needed. The initiative was for a physics lab called PHYS 3100: Intro to Modern Physics. PHYS 3100 is the third of a three-semester series of intro physics courses required for physics majors, physics education majors, aeronautics majors, and some engineering majors. Typically 20 students take PHYS 3100 each semester.

At the beginning of 2012, the instructor of the related physics lecture (PHYS 3090) wanted to revamp the lab so students would be better prepared for more advanced physics labs in the curriculum, and so students would have a better understanding of how to do lab research in the professional world. The instructor wanted to do this because there was anecdotal evidence that by the time physics majors reached their capstone class in senior year, they didn't have the skills to do physics laboratory work. So the lab instructors developed a new learning outcome for the lab. The goal of the old labs was to demonstrate the physical laws discussed in lectures, but the learning outcome of the revamped lab was that students will be able to practice the scientific method, by investigating the problem before the lab session, preparing and testing a hypothesis, deriving results and conclusions, and communicating those results in a lab report, in order to have the skills to do physics laboratory work.

But the first semester they tried the new approach, the students struggled. The students didn't know how to do the background research needed to investigate the problem before the lab. The professor asked me for help, and I suggested information literacy instruction. So we added another learning objective to the revamped lab: students will be able to search for background information relevant to their labs in order to understand the physics concepts tested in the lab and to perform the lab successfully.
Slide 4: Instruction and assessment plan

Learning outcomes for instruction. Students will be able to:
- Identify key concepts …in their lab in order to form research questions
- Select keywords …in order to use them as search terms in background information sources
- Identify the subdiscipline …being investigated in a lab in order to select an appropriate textbook from course reserves
- Understand that research is cyclical …in order to not give up too soon when searching in background info sources…

Instruction is more effective with assessment.

So with those lab learning outcomes in mind, I identified four learning outcomes for the information literacy instruction. Students will be able to:
1. Identify key concepts being investigated in their lab in order to form research question(s)
2. Select keywords from their research questions in order to use them as search terms in background information sources
3. Identify the subdiscipline of physics being investigated in a lab in order to select the appropriate textbook from their class's course reserves
4. Understand that research is cyclical, not linear, in order to not give up too soon when searching in background information sources!

I let the instructor know that in order for the instruction to be more effective in the long term, we really ought to assess the effect the information literacy instruction had on the students. I asked that the instructor require the students to cite their background information sources in their pre-lab write-ups, so I could eventually analyze what sources they were using before and after the information literacy instruction. The instructor agreed, and made it a requirement.
Slide 5: Assessment Plan

Assessment tools:
- Students' background information sources
- Rubric to quantify citation quality
- Student interviews

So to assess what effect the information literacy instruction had on the students, I used three tools. First, the background information sources the students cited in their pre-lab write-ups; I was trying to see if they were using more authoritative sources, like the scholarly online science encyclopedias linked on the class LibGuide. Second, a rubric to quantify the quality, or appropriateness, of the background information sources they used. And third, in-person interviews; the questions I asked the students are on a later slide.

The rubric for rating background information quality rated five components of each information source: (1) format, (2) literary content, (3) author type, (4) editorial process, and (5) purpose. The rubric was adapted from a study by Leeder, Markey, and Yakel (2012) at the University of Michigan. (Full citation: Leeder, C., Markey, K., & Yakel, E. (2012). A faceted taxonomy for rating student bibliographies in an online information literacy game. College & Research Libraries, 73(2).)
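To make the rubric step concrete, here is a minimal sketch of how a faceted rubric like this could be turned into a numerical score per citation. The facet names come from Leeder, Markey, and Yakel (2012), but the categories and point values below are illustrative assumptions for this sketch, not the published taxonomy's actual scale.

```python
# Illustrative point scales for two of the five facets. The category
# labels and point values here are assumptions, not the published scale.
FACET_SCALES = {
    "format": {
        "encyclopedia": 3,
        "monograph": 3,
        "scholarly journal": 3,
        "course material": 2,
        "blog": 1,
        "promotional": 0,
    },
    "author_type": {
        "academic professional": 3,
        "practitioner": 2,
        "unknown": 0,
    },
    # The remaining facets (literary content, editorial process, purpose)
    # would be added the same way.
}

def score_citation(citation: dict) -> int:
    """Sum the facet ratings for one cited source; unrated facets score 0."""
    return sum(
        scale.get(citation.get(facet, ""), 0)
        for facet, scale in FACET_SCALES.items()
    )

# Example: an online science encyclopedia entry by an academic author.
print(score_citation({"format": "encyclopedia",
                      "author_type": "academic professional"}))  # prints 6
```

Summing facet scores per citation gives a single number that can be averaged across the pre- and post-instruction write-ups for comparison.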
Slide 6: Information Literacy Instruction

- IL instruction at the beginning of the semester
- Class-specific LibGuide
- Textbooks on reserve
- Follow-up visit

So, the actual information literacy instruction for the class had several components:
- In-person instruction during the second week of lab
- 22 textbooks on course reserves that explained, in detail, various physics concepts being examined in the labs
- A class LibGuide that included links to several online science encyclopedias and short Jing videos on how to use them, plus links to our course reserves web site and short videos on how to use course reserves
- A follow-up visit to the lab in the tenth week of class, to see how using background information sources was going and what questions the students had

During the in-person instruction, I:
- Explained why using published reference works was more appropriate than using labs found online, Wikipedia, or other general sources
- Demonstrated how to take a lab title or topic, form a research question, select keywords, and search for them in sources
- Showed how to effectively use the online reference sources linked from the class LibGuide
- Emphasized that research is cyclical, not linear: the first search they do in an encyclopedia may not find anything helpful, but it may give them information on what other terms to search for
Slide 7: Assessment Plan

Interview questions:
- Used techniques from instruction session? If not, how did you find info?
- Changes to session?
- Advice for future students?

I asked the students:
- Did you use the techniques from the instruction session to find background information sources? If not, how did you find background information?
- Are there any changes to the instruction session you would recommend?
- What advice about doing background research for this lab would you give to students taking the class in the future?
Slide 8: Results - Citations

Information source format                            | Pre-instruction | Post-instruction, pre-follow-up | Post-follow-up
Blog                                                 | 3%              | 1%                              | 0%
Course material (non-WMU)                            | 26%             | 8%                              | 16%
Course material (WMU)                                | 10%             | 12%                             | 32%
Encyclopedias (non-WMU, reliable)                    | 13%             | 6%                              |
Encyclopedias (Wikipedia)                            |                 |                                 |
Encyclopedias (WMU)                                  | 28%             |                                 |
Informational video (author: academic professional) | 2%              |                                 |
Monograph (includes textbooks)                       | 21%             | 24%                             |
Promotional materials (non-WMU)                      |                 |                                 |
Promotional materials (WMU)                          |                 |                                 |
Scholarly journal                                    | 5%              | 4%                              |
Other                                                |                 |                                 |
Undetermined                                         |                 |                                 |
[Blank cells held values that did not survive in the source.]

Pre-instruction (1/1/2014 through 1/22/2014): 1 pre-lab write-up handed in; reports from 16 students, 2 of whom did not cite sources. Post-instruction, pre-follow-up (1/22/2014 through 3/19/2014): 4 pre-lab write-ups handed in; reports from 16 students; 2 students did not cite sources in 3 lab reports. Post-follow-up (3/19/2014 through 4/30/2014): reports from 12 students, 0 of whom did not cite sources.

For a future article, I need to:
- Make some of the rubric facets more granular: distinguish between WMU course materials and non-WMU materials in the rubric, and determine what to do with some web sites that are in grey areas
- Further analyze the patterns in citations, and apply rubric ratings to get a numerical value for pre- and post-instruction citations
- Note whether monographs are course reserves, textbooks in WMU holdings, or other textbooks
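For readers curious how percentages like those in the table could be computed from the coded citations, here is a small illustrative sketch. The records, field names, and values are hypothetical stand-ins, not the study's actual data.

```python
from collections import Counter

# Hypothetical citation records: in practice, one record per source cited
# in a pre-lab write-up, coded with its format and assessment period.
citations = [
    {"period": "pre-instruction", "format": "Course material (non-WMU)"},
    {"period": "pre-instruction", "format": "Blog"},
    {"period": "post-follow-up", "format": "Course material (WMU)"},
    {"period": "post-follow-up", "format": "Course material (WMU)"},
]

def format_percentages(records, period):
    """Percent of the period's citations falling in each source format."""
    formats = [r["format"] for r in records if r["period"] == period]
    counts = Counter(formats)
    return {fmt: round(100 * n / len(formats)) for fmt, n in counts.items()}

for period in ("pre-instruction", "post-follow-up"):
    print(period, format_percentages(citations, period))
```

Running the same tally over each assessment period makes pre/post shifts in source formats directly comparable.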
Slide 9: Results - Student Interviews

- Three of 24 students interviewed
- Observations from interviews:
  - Liked the online science encyclopedias
  - Didn't use course reserves
  - Liked in-person instruction, wouldn't watch videos

And here are the results of the student interviews. The students said they did use the sources they were told about in the instruction session, like the scholarly online science encyclopedias. But they also said they still used Google searches, Wikipedia, and lab manuals they found online, because sometimes the more scholarly sources assume a level of knowledge that they simply don't have yet. They almost need background sources to better understand the content of their background sources.

These three students didn't use course reserves because it was too time consuming to come to the library to use them. One student who said she came to the library a lot still didn't use course reserves. But aside from what these three students said, 14 of the 21 books on reserve were checked out a total of 40 times.

They said they liked the in-person instruction and wouldn't watch online videos even if they contained similar content: having someone in front of them made them pay attention, and they probably wouldn't watch the videos on their own if it wasn't required of them.
Slide 10: Did We Succeed? Preliminary results: Yes!

- Achieved learning outcomes
- Session was useful
- Complete the assessment cycle
- Apply results of assessment
- Start another cycle

So based on these results, did we succeed? I'd say yes. The students did use the more scholarly background information sources shown to them in the instruction session, and the interviews indicated that the students thought the instruction was useful.

My next steps will be to complete the assessment cycle by applying the results and adjusting my instruction for the fall 2014 PHYS 3100 lab, probably by reconsidering course reserves. Should we scan some of the textbook chapters and make those into e-reserves materials? Then I will start another assessment cycle in the fall, by providing information literacy instruction for the fall 2014 class and assessing how that instruction goes. There are also preliminary plans to do some sort of information literacy skills assessment of physics students in the senior capstone research class.
Slide 11: Assessment Challenges

- "Data inferiority complex"
- Choosing collaborators
- Communication breakdowns
- Mid-course adjustments
- Don't give up!

Now to switch gears a bit: I want to briefly talk about some of the challenges I encountered when assessing this initiative. I think these challenges are universal enough that most librarians doing assessment may encounter them.

"Data inferiority complex": this happens when you don't feel that the data you have, or could gather, is enough to assess a library program or initiative. Don't let this prevent you from trying to assess your initiative. The assessment I just told you about was not ideal: there were no control or experimental groups, and there was no elaborate data gathering. I simply used data that was part of the course assignments, and I was still able to get a snapshot of what effect the instruction had on the students' information literacy skills.

Choosing collaborators: how do I find or pick someone to work with when assessing an initiative? Aim for "low-hanging fruit." In my case, I was fortunate enough to be able to work with instructors who were open to assessing this initiative. If I hadn't been able to persuade them that assessment would benefit them and their students, I would probably have tried it with a different class. Maybe somewhere down the road this instructor would have changed their mind.

Communication breakdowns: the project I just told you about was not my first attempt at assessment for this class. The first semester I tried this, I had arranged for the professor to give me the students' citations at the end of the semester. When I asked him if I could have the citations, his response was "What citations?" Obviously, communication had broken down. I tried again another semester and it worked out.

Mid-course adjustments: I thought I was going to collaborate long-term with the professor I just mentioned, but it turned out that this wasn't a good match because he had so many other commitments. So I began talking with the physics department lab manager. It turns out he is in the physics education graduate program and is very open to the type of instruction and assessment I'm doing. I've begun collaborating with him, and it's been working really well.

The takeaway message: when you're doing assessment, there probably will be challenges. But don't give up!
Slide 12: Thank you!

Carrie Leatherman
carrie.leatherman@wmich.edu

Okay. Thank you. What questions do you have?