How Technologically Literate Are EMCC Students?
Technological Literacy: A Comparison of 2012 and 2015 Assessments
SAAC Report, 11/25/15
Purpose and Methodology
- Spring 2012: A Technological Literacy assessment rubric was developed, designed to assess student technological literacy across the College.
- Spring 2015: Students were reassessed with the same rubric.
Purpose and Methodology
Seven common "deliverable components" were assessed:
- Data management
- References
- Assignment content
- Communication
- Layout
- Embedded objects
- Conventions
What Was Assessed?
- Data management: Was the respondent's name properly displayed, was the course identified, and was the file saved in the correct format?
- References: Were the appropriate number of references used, were all references listed in APA or MLA style, and were the references embedded in the document?
- Assignment content: Was there evidence of effective research, did the student achieve the purpose of the assignment, and was the correct audience targeted?
- Communication: Was the assignment successfully transmitted, and did the student provide evidence supporting the retrievability of the file?
- Layout: Did headings/subheadings reflect a logical structure, was there a consistent visual theme, were appropriate font color and size used, and were animations appropriately used?
- Embedded objects: Did the deliverable contain the following types of objects: pictures, video, audio files, animation, hyperlinks, and other (per instructor)?
- Conventions: Were phrases of appropriate length for the medium, was punctuation appropriate, was grammar usage appropriate, and was spelling accurate?
Participant Information

                 2012                                      2015
  Prefixes       8                                         7
  Sections       15                                        14
  Instructors    11                                        6
  Students
  Class Level    0 Dev Ed, 22% 100-level, 78% 200-level    1 Dev Ed, 28% 100-level, 68% 200-level
  Sophomores     19%                                       47%
  New Freshmen   42%                                       20%
  Lowest Areas   Embedded Objects, References
Overall Proficiency (scored at 3 or 4)
Overall Differences in the Means
The aggregate score on the Technological Literacy assessment was high in both years:
- 2012: 3.80 (on a 4-point scale)*
- 2015: 3.60 (on a 4-point scale)*
* The difference between the two years is statistically significant.
Comparing New Freshmen to Sophomores: 2015 Results
Data Management was the only component where mean scores for sophomores were statistically significantly different from those of new freshmen:
- New Freshmen: 3.83 (on a 4-point scale)*
- Sophomores: 3.98 (on a 4-point scale)*
* The difference is statistically significant.
Analysis: Limitations
- The results cannot be extrapolated to represent the college population due to the small sample size.
- Scoring methodologies differed between 2012 and 2015: in 2012, faculty did not identify which specific deliverable components were missing (only whether the total number was sufficient). This was corrected in 2015, but it may compromise the ability to accurately compare results between the two assessments.
- Proportional changes between new freshmen and sophomores further compromise the ability to accurately compare results between the two years.
- With only one instructor responsible for almost half of all 2015 assessments, the final results cannot be extrapolated to be representative of the college at large.
- Anecdotally, three instructors (representing more than half the students sampled) indicated that their grading rigor increased between 2012 and 2015.
Improvements for Next Cycle
Feedback from Leadership Council (11/25) and Classroom Conversations (12/2):
- Get more faculty involved.
- Review the rigor of the assessment (is it too easy?).
- Inform faculty that references and embedded objects were the lowest areas of proficiency for two cycles in a row.
- Work with Instructional Computing faculty members to create a tutorial video on how to embed objects into student work.
- Work with the Writing Center to integrate appropriate reference formatting across the curriculum.
Questions and Suggestions
Thanks for your input!