1
2008 Library Assessment Conference, Seattle, WA
Using iSkills to Measure Instructional Efficacy: One Example from the University of Central Florida
Penny Beile, University of Central Florida
2
Background
– Institutional description
– SACS Quality Enhancement Plan
– Accreditation-driven initiative
– Four programs initially selected
– Multiple assessments
3
A (very brief) Comment on Types of Direct Measures…

                 Objective         Interpretive
Costs            $$ to purchase    Labor to score
Administration   Large scale       Smaller numbers
Results          Wide and thin     Narrow and deep
Domain           Knowledge         Performance
4
Methods – Nursing
– Goal is to collect baseline data, design curricular interventions, and reassess to evaluate instructional efficacy
– Nursing students matriculate as a cohort; ~120 students enter the BSN program each semester
– Analyze at the cohort level and across cohorts
5
2007-2012 Plan
– Cohort 1: Program entry (2007) – baseline, no intervention; design instruction to target deficiencies. Program exit (2009) – maturation; possibly control for non-instructional variables later
– Cohort 2: Program entry (2008) – intervention effect. Program exit (2010) – growth in program
– Cohort 3: Program entry (2009) – intervention effect. Program exit (2011) – growth in program
– Cohort 4: Program entry (2010) – intervention effect. Program exit (2012) – growth in program
6
Still Early in the Project
(Same 2007-2012 plan as above.)
7
Results of 2007 Administration
– 114 students in class; 107 completed iSkills
– Scores ranged from 485 to 625, m = 561.36, sd = 29.94 (computation sketched below)
– Established cut score of 575
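A minimal sketch of how summary statistics like these and a cut-score pass rate could be computed; the `scores` list is hypothetical placeholder data, not the actual 2007 cohort, and only the 575 cut score comes from the slide above.

```python
# Sketch: summarizing iSkills scores against a cut score.
# The scores list is hypothetical placeholder data, not the 2007 cohort.
import statistics

scores = [485, 540, 561, 575, 590, 610, 625]  # placeholder values
CUT_SCORE = 575  # threshold established for the program (from the slide above)

mean = statistics.mean(scores)
sd = statistics.stdev(scores)  # sample standard deviation (n - 1 denominator)
passing = sum(s >= CUT_SCORE for s in scores)

print(f"range: {min(scores)}-{max(scores)}, m = {mean:.2f}, sd = {sd:.2f}")
print(f"{passing}/{len(scores)} students at or above the cut score")
```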
8
Results on 7 Dimensions
– Access: m = 64.27, sd = 16.84
– Communicate: m = 55.05, sd = 21.49
– *Create: m = 51.87, sd = 23.74
– Define: m = 59.07, sd = 20.21
– Evaluate: m = 65.38, sd = 18.53
– Integrate: m = 59.89, sd = 25.98
– *Manage: m = 49.30, sd = 29.03
(* marks the two lowest-scoring dimensions; a ranking sketch follows)
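A minimal sketch of how the dimension means above could be scanned for the weakest areas; the dictionary literal simply restates the figures reported on this slide.

```python
# Sketch: ranking dimension means to flag the weakest areas.
# The means below restate the figures reported on this slide.
dimension_means = {
    "Access": 64.27, "Communicate": 55.05, "Create": 51.87,
    "Define": 59.07, "Evaluate": 65.38, "Integrate": 59.89,
    "Manage": 49.30,
}

# Sort ascending so the lowest-scoring dimensions come first.
ranked = sorted(dimension_means.items(), key=lambda kv: kv[1])
for name, mean in ranked[:2]:
    print(f"Target for instruction: {name} (m = {mean})")
# -> Manage (49.30) and Create (51.87), the starred dimensions above
```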
9
iSkills Correlations
– SAT: r(96) = .443, p < .01
– Course grade: r(113) = .251, p < .01
– GPA: r(113) = .035, p = .72
(computation sketched below)
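In the r(df) notation above, the degrees of freedom equal n − 2, so r(96) reflects 98 paired observations and r(113) reflects 115. A minimal sketch of the computation using `scipy.stats.pearsonr`; both score lists are hypothetical placeholders, not the UCF data.

```python
# Sketch: computing a Pearson correlation as reported above.
# Both lists are hypothetical paired scores, not the UCF data.
from scipy import stats

iskills = [561, 540, 590, 610, 485, 575, 600, 520]        # placeholder
sat     = [1100, 1010, 1250, 1300, 900, 1150, 1280, 980]  # placeholder

r, p = stats.pearsonr(iskills, sat)
df = len(iskills) - 2  # degrees of freedom in the r(df) notation above
print(f"r({df}) = {r:.3f}, p = {p:.3f}")
```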
10
How Data Are Being Used
– To identify where instruction is needed
– Over time, to assess the efficacy of interventions and instructional models
– To provide evidence that we are meeting our instructional goals
11
Implications for Practice
Assessment offers a critical and objective way to see how effectively we are meeting our instructional goals:
– Do libraries contribute to the academic mission of the institution?
– How effective are our current models?
– It can lead us to explore new frameworks
12
2008 Library Assessment Conference, Seattle, WA
Using iSkills to Measure Instructional Efficacy: The CSU Experience
Stephanie Brasley, California State University, Office of the Chancellor
sbrasley@calstate.edu
13
Background – California State University
– 23 campuses
– Information Competence (IC) pioneers
– Sponsoring partner with ETS on the iSkills assessment
– IC Grant Program, iSkills focus, 2006-2008: 9 campuses
– Snapshot of use:
  – California Maritime Academy – small campus
  – CSU Los Angeles – medium-sized campus
  – San Jose State – large campus
14
California Maritime Academy (CMA) – Approach
Mindy Drake – mdrake@csum.edu
– Advanced test used as a pre-test
– Goal: baseline data and current skill set
– Test groups:
  – Freshmen in Com 100 and Engr 120: 151 tested, 137 analyzed (57% of incoming freshmen)
  – Seniors in capstone courses: 80 tested, 49 analyzed (32% of the senior population)
15
CMA – Deliverables
– Information Fluency and Communication Literacy learning objectives
– Rubric for assessing the development of information and communication technology skills within course assignments
– Modified COM 100 & ENG 120 assignments and supplemental materials
– Syllabus and iSkills-influenced learning objectives for the newly developed LIB 100: Information Fluency in the Digital World course
16
CMA – Summary Results
iSkills data used in 4 ways:
– Development of learning objectives
– Baseline for ICT literacy of incoming freshmen
– Determining the ICT literacy skill set of current seniors
– Catalyst for innovation in the design of ICT literacy instructional activities for freshmen
17
CMA – More Results
18
CSU Los Angeles – Approach
Catherine Haras – charas@calstatela.edu
– Advanced test used as a pre-/post-test
– Goal: evaluate ICT literacy-related instructional interventions
– Target group: 234 students enrolled in Business Communications (juniors and seniors)
  – Approx. 60% transfer students and 70% ESL students
– Study run over three quarters (Fall 2006, Winter 2007, Spring 2007)
19
CSU Los Angeles Study Methodology
All three groups took the iSkills pretest and posttest.
– Treatment (day), Instructor A: Business 305 curriculum; 1.5 hr library lecture; two library workshops; information literacy project
– Treatment (evening), Instructor B: Business 305 curriculum; 1.5 hr library lecture; two library workshops
– Control, Instructor A: Business 305 curriculum; 1.5 hr library lecture
(one way such a design could be analyzed is sketched below)
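The slides do not specify the statistical model used; as one illustration only, a minimal sketch comparing pre-to-post gains across the three groups with a one-way ANOVA. All data values here are hypothetical placeholders.

```python
# Sketch: one plausible analysis of a pretest/posttest design like the one above.
# One-way ANOVA on gain scores; all data here are hypothetical placeholders.
from scipy import stats

# (pretest, posttest) pairs for each group -- placeholder values
treatment_day = [(540, 575), (555, 590), (560, 600), (530, 570)]
treatment_eve = [(545, 570), (550, 575), (560, 585), (535, 555)]
control       = [(542, 550), (558, 560), (548, 555), (539, 545)]

def gains(pairs):
    """Post-minus-pre gain for each student."""
    return [post - pre for pre, post in pairs]

f, p = stats.f_oneway(gains(treatment_day), gains(treatment_eve), gains(control))
print(f"F = {f:.2f}, p = {p:.3f}")
```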
20
CSU Los Angeles – Summary Results
[Chart comparing the Treatment (day), Control, and Treatment (evening) groups; data not reproduced in this text version]
21
CSU Los Angeles – More Results
Manuscript submitted June 2008 to the Journal of Education for Business
22
San Jose State University
Toby Matoush – tmatoush@sjsu.edu
– Advanced test used as a post-test
– Goal: determine gaps and develop instructional interventions
– Test groups:
  – Freshmen: MUSE students (59); Eng. 1B (100)
  – Sophomores through juniors
23
iSkills results by grade level:

Grade        N    Mean    Std. Dev.  95% CI Lower  95% CI Upper  Min  Max
10th grade     2  555.00  21.213     364.41        745.59        540  570
12th grade     4  520.00  48.305     443.14        596.86        480  590
Freshman     154  554.77  31.645     549.73        559.81        455  620
Sophomore     93  547.90  26.696     542.41        553.40        465  615
Junior       192  548.15  32.345     543.55        552.76        470  615
Senior       149  548.56  32.290     543.33        553.78        475  620
Grad           4  580.00  44.347     509.43        650.57        525  625
Other          5  544.00  29.240     507.69        580.31        515  575
Total        603  549.92  31.623     547.39        552.45        455  625

(the confidence intervals follow from the standard t-based formula; see the sketch below)
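A minimal sketch of the t-based 95% confidence interval computation, using the Freshman row's N, mean, and standard deviation from the table; it reproduces that row's bounds.

```python
# Sketch: reproducing a 95% confidence interval from a row's N, mean, and SD.
# Values below are the Freshman row of the table (n=154, mean=554.77, sd=31.645).
import math
from scipy import stats

n, mean, sd = 154, 554.77, 31.645
se = sd / math.sqrt(n)                 # standard error of the mean
t_crit = stats.t.ppf(0.975, df=n - 1)  # two-sided 95% critical value
lower, upper = mean - t_crit * se, mean + t_crit * se
print(f"95% CI: [{lower:.2f}, {upper:.2f}]")  # ~[549.73, 559.81], matching the table
```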