Watch it or Read it? Understanding Undergraduate Engineering Students' Learning Effectiveness and Preference for Video Tutorials versus Guide-on-the-Side Tutorials


Watch it or Read it? Understanding Undergraduate Engineering Students' Learning Effectiveness and Preference for Video Tutorials versus Guide-on-the-Side Tutorials
Marina Zhang & Kari Kozak, Lichtenberger Engineering Library, The University of Iowa
ASEE Annual Conference - Engineering Libraries Division, June 26, 2017, Columbus, OH
Good afternoon, everyone. I'm Marina Zhang, Engineering & Informatics Librarian at the University of Iowa Engineering Library. In this presentation, I'll introduce a research project that my co-author Kari Kozak and I worked on.

Introduction
Video tutorials and Guide-on-the-Side tutorials
As we know, video tutorials and text-and-image tutorials are widely used for teaching database searching skills in many academic libraries. Since last summer, our university libraries have adopted a web application called Guide on the Side, which helps users navigate a live database by providing instructions and activities on the left-hand side of the screen. We have also heard good feedback from colleagues who use Guide on the Side, so we think it might be a good way to increase the usage of online tutorials on our library website. Considering that some of our video tutorials are used often while others are not, we saw the need to conduct a usability study on different tutorial formats before converting existing tutorials to Guide-on-the-Side tutorials at scale.

Research Questions
Determine whether video and Guide-on-the-Side tutorials are effective, and which tutorial format is more effective.
Discover students' preference between the two tutorial formats.
There are two research questions in this study. One is to determine whether the tutorials are effective and which format is more effective. The other is to discover students' preference between the two tutorial formats.

Methodology
IRB Approval #: 201606746
Participants: 31 undergraduate engineering students
Study Design: Three tasks on Compendex database searching:
- Emailing citations
- Finding a controlled term for "artificial reality"
- Designing a search term covering computer, computing, computational, etc. using wildcards/stemming
Two tutorial formats with identical content
Prior to the study, we consulted our institutional review board and received approval for using human participants. During the study, we recruited 31 undergraduate engineering students who had no database searching experience with Compendex. We designed three tasks on Compendex database searching [read through the bullet points]. For tutorial formats, we chose video tutorials and Guide-on-the-Side tutorials. The video tutorials were created first, and each video was under 3 minutes. The Guide-on-the-Side tutorials were then created from the video tutorials with the same information.
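The wildcard/stemming task asks for a single search term that covers computer, computing, computational, and so on. As a rough illustration of how wildcard truncation works (the pattern below uses generic shell-style matching via Python's fnmatch, not Compendex's own syntax, which is not reproduced here):

```python
from fnmatch import fnmatch

# Candidate words; "biology" is included to show a non-match.
terms = ["computer", "computing", "computational", "compute", "biology"]

# A truncated stem with a wildcard matches every variant of the word family.
pattern = "comput*"
matches = [t for t in terms if fnmatch(t, pattern)]
print(matches)  # → ['computer', 'computing', 'computational', 'compute']
```

The same idea carries over to database search syntax: one truncated term retrieves all morphological variants in a single query.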

Study Design (cont'd)
6 possible combinations of tasks and tutorial formats, without considering order
Tasks: Emailing citations; Finding a controlled term for artificial reality; Designing a search term using wildcards or stemming
Tutorial Formats: Video tutorial; Guide-on-the-Side tutorial; No instruction
Each participant would do all three tasks. To ensure that each task and tutorial format condition was presented to participants equally often, we counterbalanced the combinations of tasks and tutorial formats. First, we matched each task with each tutorial format, giving 6 possible combinations. [click mouse] For example, one combination could be emailing citations with a video tutorial, finding a controlled term with Guide-on-the-Side, and designing a search term using wildcards/stemming with no instruction.

Study Design (cont'd)
A total of 36 possible combinations of tasks and tutorial formats when considering order
Another matching could be emailing citations with Guide on the Side, finding a controlled term with no instruction, and designing a search term using wildcards or stemming with a video tutorial. We also varied the order in which the tasks were presented. Since each combination has 6 possible orderings, there are 36 combinations in total.
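The counterbalancing arithmetic can be sketched with Python's itertools: pairing each of the three tasks with a distinct tutorial format is a permutation of the formats (3! = 6 assignments), and presenting each assignment's three task/format pairs in every possible order multiplies that by another 3!, giving 36 combinations.

```python
from itertools import permutations

tasks = ["emailing citations", "finding a controlled term", "wildcards/stemming"]
formats = ["video", "guide-on-the-side", "no instruction"]

# Each assignment pairs every task with a distinct format: 3! = 6 assignments.
assignments = [list(zip(tasks, p)) for p in permutations(formats)]
print(len(assignments))  # → 6

# Presenting the three task/format pairs in every order: 6 * 3! = 36 combinations.
combinations = [list(order) for a in assignments for order in permutations(a)]
print(len(combinations))  # → 36
```

Randomly assigning each participant to one of the 36 ordered combinations keeps both the task-format pairings and the task orders balanced across the sample.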

Study Design (cont'd)
Examples of Combinations
P1: Finding a controlled term (Guide-on-the-Side); Designing a search term using wildcards or stemming (no instruction); Emailing citations (video tutorial)
P2: Designing a search term using wildcards or stemming (video tutorial); Emailing citations (no instruction)
P3: Finding a controlled term (video tutorial); Designing a search term using wildcards or stemming (Guide-on-the-Side)
Each participant was randomly assigned to one of the combinations. [click mouse twice] For example, Participant No. 1 would find a controlled term with the help of Guide-on-the-Side, do wildcards/stemming with no instruction, and email the citation with the help of a video tutorial. [click mouse twice] Participant No. 2 would have the same tutorial formats but complete the tasks in a different order. [click mouse twice] Participant No. 3 would do the tasks in the same order but with different tutorial formats.

Methodology
Procedure: Consent Letter, Tasks, Post-Test Survey
The test took place in a conference room with a computer and a large monitor, and only one participant visited at a time. Each participant was instructed to read a consent letter, complete the tasks, and then fill out a post-test survey. At the end of the visit, the participant received a $5 gift card.

Results & Discussion
All participants successfully completed the test.

Correctness and time spent on tasks
Correctness for wildcards and stemming: a Kruskal-Wallis H test with Mann-Whitney U post-hoc tests detected significant differences for the control versus video group (p < 0.001) and the control versus Guide-on-the-Side group (p < 0.001).
Participants' performance was measured by correctness and the time spent on each task. Data points collected from participants were regrouped by tasks and tutorial formats. The green bar represents no instruction, the orange bar represents the video tutorial, and the blue bar represents Guide on the Side. [Click mouse] For the task of wildcards and stemming, correctness in the video group and the Guide-on-the-Side group was significantly higher than in the control group. This finding indicates that the tutorials effectively helped participants solve the task. Some participants in the control group said that they did not understand wildcards and stemming. Only one participant in the control group figured it out, because he googled "wildcards stemming" and located a community college library guide. Although other participants also tried googling it, they still could not solve the task.
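The p-values above come from a Kruskal-Wallis omnibus test followed by pairwise Mann-Whitney U post-hoc tests. As an illustration of the post-hoc step only, here is a self-contained sketch of a two-sided Mann-Whitney U test (normal approximation with tie correction) on hypothetical binary correctness scores; the group sizes, scores, and resulting p-value are illustrative, not the study's actual data, and in practice a library routine such as scipy.stats.mannwhitneyu would be used instead.

```python
from statistics import NormalDist

def rankdata(values):
    """Assign 1-based ranks; tied values share their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of tied positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def mann_whitney_u(a, b):
    """Two-sided Mann-Whitney U test via normal approximation, tie-corrected."""
    n1, n2 = len(a), len(b)
    pooled = list(a) + list(b)
    ranks = rankdata(pooled)
    r1 = sum(ranks[:n1])                 # rank sum of the first group
    u1 = r1 - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)            # smaller of the two U statistics
    # Tie correction term for the variance of U.
    counts = {}
    for v in pooled:
        counts[v] = counts.get(v, 0) + 1
    n = n1 + n2
    tie_term = sum(t ** 3 - t for t in counts.values())
    var_u = n1 * n2 / 12 * ((n + 1) - tie_term / (n * (n - 1)))
    z = (u - n1 * n2 / 2) / var_u ** 0.5  # u <= mean, so z <= 0
    p = 2 * NormalDist().cdf(z)
    return u, min(p, 1.0)

# Hypothetical binary correctness scores (1 = solved the task), not the study's data.
control = [0] * 9 + [1]   # 1 of 10 correct without instruction
video   = [1] * 9 + [0]   # 9 of 10 correct after the video tutorial
u, p = mann_whitney_u(control, video)
print(f"U = {u}, p = {p:.4f}")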

Rate how easy or difficult it was to understand the concepts presented in each tutorial
A post-test survey was used to evaluate participants' perceptions of the tasks and tutorial formats. First, we asked whether it was easy or difficult to understand the concepts in each tutorial. We removed two invalid data points for finding a controlled term with Guide on the Side: one participant did not go through the Guide-on-the-Side tutorial, and the other skipped this question in the survey. Most participants in both the video and Guide-on-the-Side groups found the concepts easy or somewhat easy to understand. [Click mouse] However, half of the participants who viewed Guide on the Side for controlled terms found it a little difficult to follow. No significant differences were observed.

How appropriate was the length of the tutorial?
When we asked whether each tutorial was appropriate in length, [click mouse] for emailing citations, 45% of participants who viewed the video thought it too long, but all participants who viewed Guide-on-the-Side thought it just about right. [double click mouse] For controlled terms, 82% of participants who viewed the video thought it just about right, while some participants who viewed Guide-on-the-Side thought it too long. [double click mouse] As for wildcards and stemming, 33% of participants in the video group thought it too short, but most participants in the Guide-on-the-Side group felt it just about right. No significant differences were observed.

Overall satisfaction with each tutorial
Most participants were satisfied or somewhat satisfied with the tutorials in either format. This result is consistent with the results on the ease of understanding the concepts presented in the different tutorial formats. No significant differences were found.

Perception of the ease or difficulty of completing each task
Task of emailing citations: significant differences for the control versus video tutorial group (p = 0.010) and the control versus Guide-on-the-Side group (p = 0.020).
In addition, we asked whether it was easy or difficult to complete each task. [click mouse] For emailing citations, participants in both the video group and the Guide-on-the-Side group found it much easier than the control group did. This indicates that both the video and Guide-on-the-Side tutorials effectively lowered the difficulty of the task. This result is also consistent with the correctness rates.

Perception of the ease or difficulty of completing each task (cont'd)
We checked our observation notes to see how participants in the control group failed. We found that they opened the "share this record" pop-up window but could not find the "email record" feature.

Conclusion
Both video tutorials and Guide-on-the-Side tutorials were effective: participants in the video and Guide-on-the-Side groups perceived the task of emailing citations as much easier than the control group did, and correctness for designing a search term using wildcards or stemming was significantly higher in the video and Guide-on-the-Side groups than in the control group.
No significant differences between the two tutorial formats.
Participants preferred Guide-on-the-Side tutorials (58%) over video tutorials (32%).
We conclude that both video tutorials and Guide-on-the-Side tutorials effectively helped the undergraduate engineering students learn database searching. [Read through the bullet points under "Both video tutorials..."] However, we did not find strong evidence of any differences between the two tutorial formats. When we asked about participants' preference for tutorial formats, 58% of participants preferred Guide-on-the-Side tutorials over video tutorials.

Future Research
Redesign tasks
Examine correlations between categorized content and tutorial formats
Examine correlations between English proficiency and students' preference for different tutorial formats
To improve this study, we will redesign the tasks, because the high correctness rate in the control group for finding a controlled term reveals an inappropriate task design. We will also look into a better initial screening process: in this study, we had to remove the data points collected from one undergraduate who had prior knowledge of database searching, but we did not know this until she completed the test. Next, we would like to continue examining the effectiveness of and preference for different tutorial formats in a similar study, but analyzing correlations between categorized content and tutorial formats. As the College of Engineering has a large population of international students, we would also like to see whether there is a correlation between English proficiency and students' preference for different tutorial formats.

Acknowledgements
Alyssa Grigsby, Digital Resources & Technical Services Librarian, Buena Vista University
Student Workers at Lichtenberger Engineering Library, The University of Iowa
Last but not least, I'd like to thank Alyssa Grigsby, who was an intern at our library three years ago, for creating the video tutorials. I'd also like to thank our student workers for taking a pre-test for our study.

Selected References
Mikkelsen, S. and E. McMunn-Tetangco, Guide on the Side: Testing the Tool and the Tutorials. Internet Reference Services Quarterly, 2014. 19(3-4): p. 271-282.
Mestre, L.S., Student Preference for Tutorial Design: A Usability Study. Reference Services Review, 2012. 40(2): p. 258-276.
Turner, B., C. Fuchs, and A. Todman, Static vs. Dynamic Tutorials: Applying Usability Principles to Evaluate Online Point-of-Need Instruction. Information Technology & Libraries, 2015. 34(4): p. 30-54.
Sachs, D.E., et al., Assessing the Effectiveness of Online Information Literacy Tutorials for Millennial Undergraduates. College & Undergraduate Libraries, 2013. 20(3/4): p. 327-351.
Mery, Y., et al., Evaluating the Effectiveness of Tools for Online Database Instruction. Communications in Information Literacy, 2014. 8(1): p. 70-81.

Thank You!