1
Achieving the Dream: Assessing Implementation
CCPRO, February 20, 2007
2
Themes of Achieving the Dream at GTCC
Focus on Student Success
The Learning College a Reality
Front Door Experience
Culture of Evidence
3
Culture of Evidence
Data collection, publication, analysis, and use to guide change is part of the college's culture.
4
Culture of Evidence: Tools
Use what we have
–Colleague, State Data Warehouse
–Annual surveys
–Performance Measures/Critical Success Factors
–Departmental data, Program Review data
–Existing survey data (Faces of the Future)
Ability to survey, conduct qualitative research, research literature
Analytic capability
Create custom database
5
Plan for Assessment/Evaluation from the beginning.
Use data to prepare content of interventions.
Determine how to define success. Define measures for that definition.
Inventory assessment resources.
Collect data to identify cohorts involved in any intervention.
Anticipate the need for control or comparison groups.
6
College GOALS
Graduation rate of 20%.
Overall fall-to-fall retention rate at or above 60%.
Developmental Education retention rate of 55%.
Improvement in gateway course success of 4% overall and 4% for all gender/ethnic groups.
7
Achieving the Dream supports: Front Door Experience (Intake, Advising, Initial Academic Experience)
Professional Development
Curriculum development
Orientation
Learning Communities
Study Skills classes
Mentoring
Transfer Advising Center
Realignment of Advising
8
Preparing Interventions
Literature review
Visits to other institutions
Benchmarking
Experience
Existing data sources
9
Preparing Interventions
Experience – What do we (faculty and staff) want them to know? What do we want to happen because they participated?
Orientation: Eng 112 class project, "What do you wish you had known?"
What do other colleges do?
Should we have criteria for participation?
How can we capture identifying information?
10
Cohort: Orientation
Student identifier(s)
Characteristics for comparison:
–Gender
–Ethnicity
–Age
–New/Previous College
–Motivation?
Measures of success
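A minimal sketch of how one such cohort record could be captured for later comparisons; the dataclass and its field names are illustrative assumptions, not the college's actual Colleague or data-warehouse schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OrientationCohortRecord:
    """One row per student in the orientation cohort (hypothetical fields)."""
    student_id: str                   # identifier used to match back to enrollment data
    attended_orientation: bool        # intervention flag
    gender: str
    ethnicity: str
    age: int
    new_student: bool                 # new vs. previous college experience
    motivation: Optional[str] = None  # hard to capture; often missing
    persisted_next_term: Optional[bool] = None  # measure of success, filled in later

# Example record for a hypothetical attendee.
example = OrientationCohortRecord("A0001", True, "F", "Black", 19, True)
```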
11
Orientation Persistence Rates

                              Persisted to Spring 2006     %      Total (N)
Did NOT Attend Orientation            1,148              64.3       1,784
Attended Orientation                    705              78.7         896
Total                                 1,853              69.1       2,680
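As a rough illustration of how a table like the one above can be tabulated once the cohort file exists, here is a pandas sketch; the tiny in-line dataset and the column names are hypothetical stand-ins for the actual extract.

```python
import pandas as pd

# Hypothetical per-student extract: one row per fall 2005 student,
# with flags for orientation attendance and spring 2006 persistence.
cohort = pd.DataFrame({
    "student_id": [1001, 1002, 1003, 1004, 1005],
    "attended_orientation": [True, False, True, False, False],
    "persisted_spring_2006": [True, True, False, False, True],
})

# Persisted headcount, group size, and persistence rate per group.
summary = (
    cohort.groupby("attended_orientation")["persisted_spring_2006"]
          .agg(persisted="sum", total="count")
)
summary["rate_pct"] = 100 * summary["persisted"] / summary["total"]

# Overall row, matching the "Total" line of the table above.
summary.loc["Total"] = [
    cohort["persisted_spring_2006"].sum(),
    len(cohort),
    100 * cohort["persisted_spring_2006"].mean(),
]
print(summary)
```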
12
Orientation Persistence Rates
13
Orientation Attendance/Success Rates in English
14
Orientation Evaluation Results
Fall, 2005 evaluation: Satisfaction
–Session I: 87.7%
–Session II: 91.1%
–Session III: 94.5%
They:
–Felt welcomed
–Thought staff were knowledgeable
–Understood the registration process
–Would encourage others to attend
–Wanted to register at the same time
15
Orientation Evaluation Results
Spring, 2006 evaluation: from satisfaction to asking "What did they learn?"
Need to add a tour
More hands-on: less material in the large group, more in smaller groups
16
Study Skills Classes: ACA 111
Fall, 2006
–Requirement in two programs: Paralegal, Office Systems Technology
Fall, 2007
–Three additional programs in Business Technologies
Innovative design:
–Front-load Gateway Courses
–Infuse program-specific content
17
Gateway Course Success Rates with ACA Linked Classes
Office Systems Technology
–Fall 2005: 87%
–Fall 2004: 44% (comparison group)
Paralegal
–Fall 2005: 94%
–Fall 2004: 60% (comparison group)
19
Male Students
20
Female Students
21
Culture of Evidence
What were the issues in "cleaning" these persistence data?
What are the issues inherent in setting up a comparison or control group?
22
Linked Classes 1999–2004
Reading 090/English 111
Fast Track grant from the State for the purpose of identifying ways to get students through college faster.
Success rates over nine semesters ranged from 73% to 100% and averaged 89%.
Denied permission to continue.
23
Curriculum Innovations
Split Math Class
–94% success rate (traditional success rate 60%)
–Now following through subsequent courses
Anecdotal data
Denied permission to continue
24
Spring 2006 Transitions Learning Community
English 111
–88.9% retention; control 82.2%
–83.3% success; control 60.7% success
Psychology 150
–88.9% completion; control 82.2%
–72% success; control 52% success
ACA 118 (College Success)
–88.9% completion; control 75.6%
–66.7% success; control 44.4% success
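The slide reports percentages only; with groups this small, it is worth checking whether an intervention/control gap could plausibly be chance. A hedged sketch using SciPy's Fisher exact test on made-up counts (the Ns below are placeholders, not the actual section sizes):

```python
from scipy.stats import fisher_exact

# Hypothetical counts only -- the slide gives rates, not headcounts.
# Rows: learning community vs. control; columns: succeeded vs. did not.
table = [[15, 3],    # e.g., 15 of 18 learning-community students succeeded
         [17, 11]]   # e.g., 17 of 28 control students succeeded

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```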
25
Fall 2006 Transitions Learning Community
26
Linked Classes Fall 2006
27
Mentoring
Male Mentoring
–Spring 2006: 22 participants
–Fall 2006: 21 participants
Female Mentoring
–Fall 2006: 25 participants
28
Mentoring
Spring 2006: Retention/Success
–Course Retention Rate: 81.4%
–Overall Course Success Rate: 39.5%
29
Mentoring: Persistence & Success
30
Mentoring & Learning Communities
What happens when you measure small numbers?
What are some of the multiple factors that can cloud interpretation?
What else can we do to get a truer and more useful interpretation of results?
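One concrete way to see what small numbers do is to put a confidence interval around a group's rate. This sketch computes a 95% Wilson score interval by hand; the 70% rate is arbitrary, and the two group sizes (22, roughly the mentoring headcount, versus 896, the orientation-attendee N) are used only for illustration.

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion -- wide when n is small."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# A mentoring-sized group vs. the orientation-attendee cohort, both at ~70% success.
for n in (22, 896):
    lo, hi = wilson_interval(round(0.70 * n), n)
    print(f"n={n:4d}: ~70% success is consistent with {lo:.1%} to {hi:.1%}")
```

For the small group the interval spans roughly 47% to 84%, which is one reason single-semester results for 20-odd students are hard to interpret on their own.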
31
Principles
Start with the end in mind: gather the data that will help with assessment.
Create the right cohort.
Collect the data frequently, uniformly.
Clean the data.
Accumulate data over time to overcome small-group bias.
Need a good picture of the overall intervention – ask questions about conditions.
32
Principles
Use your experience – Does it look right? Are there outliers? Why?
Get others to react; let your ego go and consider what they say.