Building an Information Community: IT and Research Working Together

Responsive Evaluation in the Community College: An Alternative Approach to Evaluating Programs
Nathan R. Durdella, PhD
Monterey, California
April 10, 2006
Presentation Overview
- Background, Design, & Methods
- Results: Project HOPE & MESA
- Findings & Conclusions
Background, Design, & Methods
Research Context and Problem
- Increasing institutional and accreditation requirements to document student outcomes
- Dominant model: systematic evaluation (Rossi, 1993)
  – Program objectives and outcomes
- Alternative evaluation models
  – Have recently been used successfully (Shapiro, 1988)
  – Responsive evaluation
Evaluation Models: Systematic vs. Responsive Evaluation
- Stake’s problem with systematic evaluation:
  – Systematic evaluation’s narrow focus on assessing a program’s goals, measurements, and standards (Shadish et al., 1991)
  – Systematic evaluations are best suited for summative evaluations
- Responsive evaluation’s focus:
  – The primary purpose should be “to respond to audience requirements for information” (Guba, 1978, p. 34)
  – Process-oriented issues: program implementation
  – Stakeholder-based: locally generated criteria
Stake’s Responsive Evaluation
Responsive evaluation’s prescriptive steps:
1. Program staff/participants “are identified and solicited for those claims” (Guba & Lincoln, 1989, p. 42)
2. Issues of program staff and participants are organized and brought to staff members for comment
3. Issues not resolved are used as “organizers for information collection” (Guba & Lincoln, 1989, p. 42)
4. The evaluator approaches each audience member with the evaluation results to resolve all issues
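The four steps read naturally as an iterative loop: gather stakeholder claims, negotiate what can be resolved through dialogue, collect data only on what remains open, then report back. Purely as an illustration of that workflow for an IT audience, and not anything presented in the study, the Python sketch below models the cycle; every name in it (Issue, solicit_claims, and so on) is invented for this example.

```python
# A minimal, hypothetical sketch of Stake's four responsive-evaluation
# steps as a workflow. All names here are invented for illustration;
# nothing in this code comes from the presentation itself.
from dataclasses import dataclass

@dataclass
class Issue:
    description: str
    raised_by: str
    resolved: bool = False

def solicit_claims(stakeholders: list[str]) -> list[Issue]:
    # Step 1: stakeholders are identified and solicited for their claims.
    return [Issue(f"concern voiced by {name}", name) for name in stakeholders]

def staff_comment_round(issue: Issue) -> bool:
    # Step 2: issues are organized and brought to staff for comment;
    # here we pessimistically assume dialogue alone resolves nothing.
    return False

def collect_information(issue: Issue) -> str:
    # Step 3: unresolved issues become "organizers for information
    # collection" (interviews and journals in the actual study).
    return f"evidence gathered on: {issue.description}"

def report_back(issue: Issue, evidence: str) -> bool:
    # Step 4: the evaluator brings results to each audience member
    # in an attempt to resolve the remaining issues.
    print(f"To {issue.raised_by}: {evidence}")
    return True

def responsive_evaluation(stakeholders: list[str]) -> list[Issue]:
    issues = solicit_claims(stakeholders)                # step 1
    for issue in issues:
        issue.resolved = staff_comment_round(issue)      # step 2
    for issue in [i for i in issues if not i.resolved]:  # steps 3-4
        issue.resolved = report_back(issue, collect_information(issue))
    return issues

if __name__ == "__main__":
    responsive_evaluation(["student A", "staff member B", "faculty C"])
```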
Research Questions
Two research questions:
1. How effectively does responsive evaluation theory work as a way to evaluate instructional support programs?
2. How does responsive evaluation articulate with systematic evaluation approaches?
Research Design and Methods
- Design: comparative, qualitative case study
- Data sources and sampling:
  – Interviews and journals
  – Two-step sampling procedure: purposeful, then random
- Case selection:
  – Institutions: Cerritos College & Santa Ana College (both Hispanic-Serving Institutions)
  – Programs: Project HOPE & MESA
- Data collection:
  – Interviews: 19 total subjects, 23 total interviews
  – Per program: 3 students, 2 staff, 2 faculty, 2–3 administrators
  – Program directors were interviewed 3 times each
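A possible reconciliation of the interview counts (an inference, not stated on the slide): 3 students + 2 staff + 2 faculty + 2–3 administrators gives 9–10 subjects per program, or 18–20 across the two programs, consistent with 19 total. If each program director is counted once among those subjects but sat for 3 interviews rather than 1, the two directors contribute 4 extra interviews: 19 + 4 = 23.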
Results: Project HOPE & MESA
Results: Project HOPE
1. Faculty resisted cultural pedagogy
   Project HOPE faculty:
   – “It’s a method of learning where you would approach your teaching looking at culture.”
   – “They don’t feel like it would have any impact on their students.”
   Faculty and administrators:
   – “We need to serve all of our students equitably.”
   – “Well, we’re not really a minority any more.”
2. Campus did not value Project HOPE
   Project HOPE staff:
   – “There are issues, I’d say, with respect to this program and the college in general about the value of it, the need for it, because I think there’s a prevailing thought that we already do all we can for students of color just by default, because we have such a diverse student population, to have programs like these.”
Results: Project HOPE (continued)
- Guidance counseling
  – “Well, now I know exactly what I’m supposed to be taking every semester and everything.”
- Parent and family participation
  – “[My mom] was telling my dad, ‘We have to do our taxes because they have to file.’ So now she knows what we’re talking about when we have to do our financial aid paperwork.”
- Health Occupations 100 as central
  – “I definitely know I want to stay in L.A. and really serve those communities in need.”
- Program communication and coordination
  – “There was nothing said or nothing exchanged.”
- Lack of faculty buy-in and participation
  – “The only thing I ever hear is why aren’t we part of this.”
Results: MESA Program
- MESA staff: central to students
  – “I know you really want to go, call me. If you can’t make it, call me. If you can’t come to class, tell me why. If you think you’re doing bad in class, just talk to me. We can work something out.”
- Major issue: program impact
  – In general, MESA students outperform other math/science students at Santa Ana College
- Successful program coordination
  – “We have an organized system.”
Results: MESA Program (continued)
Other emerging themes:
- Student finances: book loans & more
  – “I then use the money I saved to attend events sponsored by the Transfer Center.”
- MESA Study Center
  – “The MESA Study Center is a good place if one wants to share a friend’s company and eat lunch while one studies.”
- Program focus: no parent participation
  – “A big obstacle for me as well was the lack of information available to my parents.”
- Course scheduling, engineering
  – “These classes are not offered every semester.”
Findings & Conclusions
Findings: Responsive Evaluation
- Ongoing programs, categorically funded or institutionalized
- Program staff: cooperation, participation
- Programs: challenges, underlying problems
- Program processes, improvement
- Programmatic or institutional need
  – Not solely program impact
Further Findings: Responsive Evaluation
- Politically charged context
- Personality and power conflicts
  – Project HOPE: preexisting
  – UC: well-established MESA programs
- Responsiveness: no assurance the model responds to all stakeholders
  – Identification and development of issues
Findings: Responsive & Systematic Models
- The models articulate well
  – Project HOPE: prior evaluations vs. responsive evaluation
  – MESA: program impact
- Results are meaningful
  – Project HOPE: a new “face”
- But results can reinforce perceptions
  – MESA: few surprises, but useful
  – Student voices
Findings: Responsive Evaluator
- Balance between encouraging participation and maintaining control
  – Stakeholder-based models
- Initial phases: conditions must be present to conduct the evaluation
- Key: understanding programs as an insider while maintaining checks
- Presentation of results: critical
Conclusions: Responsive Evaluation in the Community College
- Institutional charge: respond to students, faculty, staff, and stakeholders
- Responsive evaluation: a powerful tool for community college programs
- Community colleges: limited resources
- Research offices: overburdened
Building an Information Community: IT and Research Working Together

Thank you for attending… Questions or comments?

Nathan R. Durdella, PhD
Cerritos College
ndurdella@cerritos.edu