Effects of Targeted Troubleshooting Activities on Student Confidence in a Statistics Computer Lab

Presentation transcript:

Effects of Targeted Troubleshooting Activities on Student Confidence in a Statistics Computer Lab
Meredith A. Henry

THE CHALLENGE
Successful psychology students must be competent in conducting and reporting statistical analyses, yet students often struggle with, and hold negative attitudes toward, statistics and research design (Mills, 2004). One reason may be a lack of confidence, or intimidation, when confronted with analytical packages such as SAS (Statistical Analysis System). Previous experience teaching a graduate statistics lab also suggests that:
A) Students struggle with troubleshooting SAS code and analyses
B) Students express a preference for hands-on class activities
The current study tested the effectiveness of a series of exercises, designed to give students experience troubleshooting SAS programs, in increasing student confidence across different domains of statistical skill.

THE STUDENTS
PY 716-L is the lab section of the first course in a three-course graduate statistics sequence for students enrolled in UAB's three psychology graduate programs.

Student characteristics:
                      Fall 2014   Fall 2015
Total students            13          14
Male                       5           2
Beh'l-Neuroscience         3           4
Developmental
Medical-Clinical           6           7

Traditional course structure:
- Lectures give students appropriate code to run a variety of analyses in SAS 9.3.
- Assignments ask students to modify code to successfully run, interpret, and report analyses.
- A group project involves conceiving a study design, creating data, choosing analyses, running the analyses, interpreting the results, and reporting conclusions.

MAKE IT WORK EXERCISES
Make It Work (MIW) exercises are selections of "bad code" given to students, designed to include "common" SAS errors. Students' task: 1) use knowledge from lectures and the error messages in the SAS log window to fix the coding errors; 2) interpret the results of the analyses once they run; 3) report the results in appropriate written form. Exercises are assigned at the beginning of class; students have 10 minutes to complete them, and then the whole class reviews and discusses.
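To make the format concrete, here is a minimal sketch of what an MIW-style item could look like. It is not taken from the course materials: the data set, the variable names, and the particular errors are invented for illustration.

/* Hypothetical broken snippet of the kind an MIW item might contain.
   Error 1: the DATA statement is missing its semicolon.
   Error 2: PROC TTEST refers to SCORE instead of the data set SCORES.

   data scores
      input id group $ score;
      datalines;
   1 A 12
   2 B 15
   ;
   run;

   proc ttest data=score;
      class group;
      var score;
   run;
*/

/* Corrected version */
data scores;
   input id group $ score;
   datalines;
1 A 12
2 B 15
3 A 14
4 B 18
;
run;

proc ttest data=scores;   /* independent-samples t-test comparing groups A and B */
   class group;
   var score;
run;

In class, students would run the broken version, read the ERROR and NOTE lines in the SAS log, trace them back to the offending statements, and then interpret and write up the corrected analysis.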
ASSESSING CONFIDENCE
Pre-course survey: During the first class of the semester, students self-reported
- Overall confidence in using the SAS program
- Confidence in skills related to conducting analyses (e.g., "Rate your confidence in using SAS to conduct a dependent-samples t-test.")
- Confidence in skills related to troubleshooting analyses (e.g., "Rate your confidence troubleshooting issues related to SAS code for t-tests.")
Post-course survey: During the last class of the semester, students
- Self-reported on all items again
- Ranked 7 components of the course, including the Make It Work (MIW) exercises, from "most effective" to "least effective"
- Provided qualitative feedback on the MIW exercises

CHANGES IN AVERAGE STUDENT CONFIDENCE
Students became significantly more confident over the course of the semester. Repeated measures t-tests revealed that students in both years (Fall 2014; Fall 2015) reported more confidence post-course than pre-course in all three domains:
- Overall confidence: t(12) = 7.90, p < .001; t(11) = 13.00, p < .001
- Analysis: t(11) = 9.54, p < .001; t(11) = 11.46, p < .001
- Troubleshooting: t(12) = 8.70, p < .001; t(11) = 7.93, p < .001
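For concreteness, a paired (repeated measures) t-test of this kind could be run in SAS along the following lines. This is a sketch only: the data set and variable names (confidence, pre_conf, post_conf) and the ratings shown are hypothetical, not the study data.

/* Paired t-test of pre- vs. post-course confidence ratings (illustrative data) */
data confidence;
   input id pre_conf post_conf;
   datalines;
1 2 5
2 3 6
3 1 5
4 4 6
5 2 6
;
run;

proc ttest data=confidence;
   paired pre_conf*post_conf;   /* tests whether the mean pre-to-post change differs from zero */
run;

The PAIRED statement tests whether the mean within-student change differs from zero, which is the comparison summarized by the t statistics above.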
GROUP A vs. GROUP B COMPARISONS
Topics covered in the course were divided into two groups:
- Group A topics had a MIW exercise assigned in both 2014 and 2015.
- Group B topics had a MIW exercise assigned only in 2015.
In Fall 2014, students reported significantly more confidence gain for Group A topics than for Group B topics for both analytic [t(11) = 4.12, p < .005] and troubleshooting [t(12) = 2.50, p < .05] confidence. This suggested that the MIW exercises caused the greater confidence gains. If so, we would expect no significant difference between Group A and Group B topics in Fall 2015, when both groups had MIW exercises. However, we saw the same pattern of results: students still gained more confidence in both domains for Group A than for Group B topics [t(11) = 4.19, p < .005; t(11) = 2.73, p < .05]. In an attempt to explain this surprising finding, we looked back at the properties of the MIW exercises themselves.

PRACTICED vs. UNPRACTICED SKILLS
Statistics is a discipline that often "builds" upon itself: some SAS skills (e.g., entering data, running tests of normality), while topics of instruction in their own right, are also necessary parts of more advanced analyses. As such, some skills appear in multiple lectures, assignments, and MIW exercises. We proposed that students would gain more confidence for these "practiced" skills than for "unpracticed" skills. 2015 students did report significantly more confidence for "practiced" skills, t(11) = 3.68, p < .005. However, 2014 students reported no difference in confidence based on the amount of "practice", t(11) = 2.109, n.s.

WHAT DID THE STUDENTS THINK?
Students had a positive opinion of the MIW exercises: across both semesters, 60% of students ranked MIW as one of the top three most useful aspects of the course.
Qualitative feedback:
- "I found them helpful because they featured common mistakes."
- "Helpful—hands on experience with the code is always good."
- "They were helpful; a few more would have been great." [2014, emphasis added]
- "I think the more practice you get at picking up on those small errors can make a big difference."
- "I like them. I thought they were very helpful in understanding the syntax via troubleshooting. I just think doing more of them would be better. Maybe start the class with that?" [2014, emphasis added]
- "I liked the opportunity to work independently; I was able to process and retain the info better this way compared to during lecture."

CONCLUSIONS
By chance, many Group A topics were also "practiced" skills. Fall 2014 students were exposed to "practiced" skills multiple times in lectures, but did not have as many MIW opportunities for hands-on engagement. Fall 2015 students had MIW activities for all topics, but still ended up "practicing" some of them more. It is possible that the consistently higher gains in confidence for Group A skills and the failure of Fall 2014 students to benefit from "practice" both reflect the same mechanism. We propose that neither the MIW exercises nor repeated exposure to SAS skills alone is sufficient to optimize gains in student confidence. Rather, targeted, hands-on activities like the MIW exercises, administered multiple times, will lead to the greatest levels of student confidence in statistical skill.
Briefly: 1) the MIW exercises are effective in increasing student confidence; 2) greater confidence was associated with higher grades (r = 0.82, p = .001); and 3) students enjoyed the MIW exercises (see the qualitative feedback above). Thus, there is sufficient evidence supporting continued use and refinement of the MIW exercises to improve the statistics lab.

LIMITATIONS & FUTURE DIRECTIONS
Both sample sizes were small (N = 13; N = 14), which reduced power and made a control group unrealistic. Graduate students may have different attitudes towards statistics, which may make them more receptive to MIW. Future studies should look at larger and more diverse samples, and there should also be an attempt to study the possible MIW*practice interaction more objectively.

Contact: Meredith A. Henry
Email: mahenry@uab.edu
Phone: (205) 612-5560