2008 Library Assessment Conference, Seattle, WA
Using iSkills to Measure Instructional Efficacy: One Example from the University of Central Florida
Penny Beile, University of Central Florida
Background
–Institutional description
–SACS Quality Enhancement Plan: an accreditation-driven initiative
–Four programs initially selected
–Multiple assessments
A (Very Brief) Comment on Types of Direct Measures

                 Objective        Interpretive
Costs            $$ to purchase   Labor to score
Administration   Large scale      Smaller numbers
Results          Wide and thin    Narrow and deep
Domain           Knowledge        Performance
Methods – Nursing
–Goal: collect baseline data, design curricular interventions, then reassess to evaluate instructional efficacy
–Nursing students matriculate as a cohort; ~120 students enter the BSN program each semester
–Analyze at the cohort level and across cohorts
Plan
–Cohort 1: entry (2007) – baseline, no intervention; design instruction to target deficiencies. Exit (2009) – maturation; possibly control for non-instructional variables later.
–Cohort 2: entry (2008) – intervention effect. Exit (2010) – growth in program.
–Cohort 3: entry (2009) – intervention effect. Exit (2011) – growth in program.
–Cohort 4: entry (2010) – intervention effect. Exit (2012) – growth in program.
Still Early in the Project
(Same cohort plan as the previous slide.)
Results of 2007 Administration
–114 students in class; 107 completed iSkills
–Scores ranged from 485 to 625 (M = 561.36, SD = 29.94)
–Established cut score of 575
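Given the reported mean and SD and the 575 cut score, one can estimate the share of the cohort falling below the cut. The sketch below uses only the statistics reported above; the assumption that scores are approximately normal is mine, added for illustration.

```python
from statistics import NormalDist

# Reported 2007 iSkills statistics (from the slide)
mean, sd, cut = 561.36, 29.94, 575

# Proportion of students expected to score below the 575 cut,
# assuming scores are roughly normally distributed
p_below = NormalDist(mu=mean, sigma=sd).cdf(cut)
print(round(p_below, 3))  # about 0.68, i.e. roughly two-thirds below the cut
```

Under that assumption, most of the cohort falls below the cut score, which is consistent with the deck's conclusion that instruction targeting deficiencies was needed.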
Results on 7 Dimensions
–Access: M = 64.27, SD = 16.84
–Communicate: M = 55.05, SD = 21.49
–*Create: M = 51.87, SD = 23.74
–Define: M = 59.07, SD = 20.21
–Evaluate: M = 65.38, SD = 18.53
–Integrate: M = 59.89, SD = 25.98
–*Manage: M = 49.30, SD = 29.03
(* marks the two lowest-scoring dimensions)
iSkills Correlations
–SAT: r(96) = .443, p < .01
–Course grade: r(113) = .251, p < .01
–GPA: r(113) = .035, p = .72
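The significance of each Pearson correlation can be checked from r and its degrees of freedom alone, using the standard conversion t = r·√df / √(1 − r²). The r values below are the ones reported on the slide; the conversion itself is textbook, not part of the original analysis.

```python
import math

def r_to_t(r: float, df: int) -> float:
    """t statistic for testing a Pearson r against zero (df = n - 2)."""
    return r * math.sqrt(df) / math.sqrt(1 - r * r)

# Correlations reported on the slide
print(round(r_to_t(0.443, 96), 2))   # SAT:          t ≈ 4.84
print(round(r_to_t(0.251, 113), 2))  # course grade: t ≈ 2.76
print(round(r_to_t(0.035, 113), 2))  # GPA:          t ≈ 0.37
```

With roughly 100 degrees of freedom the two-tailed critical t at α = .01 is about 2.63, so the first two t values support the reported p < .01 while the GPA correlation is clearly non-significant, matching p = .72.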
How the Data Are Being Used
–To identify where instruction is needed
–Over time, to assess the efficacy of interventions and instructional models
–To provide evidence that we are meeting our instructional goals
Implications for Practice
Assessment offers a critical and objective way to see how effective we are in meeting our instructional goals:
–Do libraries contribute to the academic mission of the institution?
–How effective are our current models?
–It can lead us to explore new frameworks
Using iSkills to Measure Instructional Efficacy: The CSU Experience
Stephanie Brasley, California State University, Office of the Chancellor
Background – California State University
–23 campuses
–Information Competence (IC) pioneers
–Sponsoring partner with ETS on the iSkills assessment
–IC Grant Program with an iSkills focus: 9 campuses
–Snapshot of use:
–California Maritime Academy – small campus
–CSU Los Angeles – medium-sized campus
–San Jose State – large campus
California Maritime Academy (CMA) – Approach
Mindy Drake – Advanced test used as a pretest
–Goal: baseline data and current skill set
–Test groups:
–Freshmen in COM 100 and engineering courses tested; 137 analyzed (57% of incoming freshmen)
–Seniors in capstone courses: 80 tested; 49 analyzed (32% of the senior population)
CMA - Deliverables –Information Fluency and Communication Literacy Learning Objectives –Rubric for assessing the development of information and communication technology skills within course assignments –Modified COM 100 & ENG 120 assignments and supplemental materials –Syllabus and iSkills-influenced learning objectives of the newly developed LIB 100: Information Fluency in the Digital World course
CMA – Summary Results
iSkills data used in 4 ways:
–Development of learning objectives
–Baseline for ICT literacy of incoming freshmen
–Determining the ICT literacy skill set of current seniors
–Catalyst for innovation in the design of ICT literacy instructional activities for freshmen
CMA – More Results
(Results chart not reproduced in this text version.)
CSU Los Angeles – Approach
Catherine Haras – Advanced test used as a pre/post test
–Goal: evaluate ICT literacy-related instructional interventions
–Target group: 234 students enrolled in Business Communications (juniors and seniors)
–Approx. 60% transfer students and 70% ESL students
–Study run over three quarters (Fall 2006, Winter 2007, Spring 2007)
CSU Los Angeles Study Methodology
All groups took the iSkills pretest and posttest.
–Treatment (day), Instructor A: Business 305 curriculum; 1.5 hr library lecture; two library workshops; information literacy project
–Treatment (evening), Instructor B: Business 305 curriculum; 1.5 hr library lecture; two library workshops
–Control, Instructor A: Business 305 curriculum; 1.5 hr library lecture
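With a pretest and posttest for every group, a natural first-pass analysis is to compare the mean gain (posttest minus pretest) across the treatment and control conditions. The sketch below illustrates only the computation; the scores are invented for illustration and are not the CSULA results.

```python
from statistics import mean

def mean_gain(pre, post):
    """Average posttest - pretest gain for one group of paired scores."""
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical paired iSkills scores (NOT the CSULA data)
treatment_day = mean_gain(pre=[520, 545, 560], post=[555, 570, 590])
control       = mean_gain(pre=[525, 540, 565], post=[530, 550, 570])
print(treatment_day, control)
```

A fuller analysis of the actual study data would test whether the gain difference between conditions is statistically significant, for example with an ANCOVA using the pretest as a covariate.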
CSU Los Angeles – Summary Results
(Chart comparing the day treatment, evening treatment, and control groups not reproduced in this text version.)
CSU Los Angeles – More Results
Submitted June 2008 to the Journal of Education for Business.
San Jose State University
Toby Matoush – Advanced test used as a posttest
–Goal: determine gaps and develop instructional interventions
–Test groups:
–Freshmen: MUSE students (59); Eng. 1B (100)
–Sophomores through juniors
Scores by Class Standing
(Table of iSkills scores by grade level – N, mean, SD, 95% confidence interval for the mean, minimum, and maximum – for 10th grade, freshman, sophomore, junior, senior, grad, other, and total; values not reproduced in this text version.)