Plenary - Eye on the Prize: Sharing and Using Program Review Results
Marilee J. Bresciani, Ph.D.
Professor, Postsecondary Education Leadership, and Co-Director of the Center for Educational Leadership, Innovation, and Policy
San Diego State University
3590 Camino Del Rio North
San Diego, California, U.S.A.
What is the purpose of your program review? What kinds of decisions do your program review findings allow you to make?
Visioning the Result of Your Program Review Process
Visioning requires:
the setting of intentions
the communication of those intentions
the sharing of those intentions
the changing of past behaviors and beliefs
the implementation of new ways of doing and thinking
Visioning Questions
What kinds of decisions do your program review findings allow you to make so that the purpose of your program review is fulfilled?
What kinds of data do you need to inform the kinds of decisions that affirm the purpose of your program review process?
Visioning Questions
Who needs to be involved …
  in the collecting of that data?
  in the interpretation of that data?
  in the synthesizing of that data?
  in the brainstorming around decisions that could be made?
  in the prioritization of the implementation plan and the resources to move that plan to fruition?
What professional development is needed to accomplish this?
Uses of OBPR Results (WASC Program Review Guidelines, 2009)
Developing program learning outcomes and identifying appropriate means for assessing their achievement
Better aligning department, college, and institutional goals
Refining departmental access and other interventions to improve retention/attrition and graduation rates
Uses, Cont. (WASC Program Review Guidelines, 2009)
Designing needed professional development programs, especially for faculty to learn how to develop and assess learning outcomes
Reorganizing or refocusing resources to advance specific research agendas
Re-assigning faculty/staff or requesting new lines
Uses, Cont. (WASC Program Review Guidelines, 2009)
Illuminating potential intra-institutional synergies
Developing specific plans for modifications and improvements
Informing decision making, planning, and budgeting, including resource re/allocation
Linking and, as appropriate, aggregating program review results to the institution’s broader quality assurance/improvement efforts
How do you envision using your results?
In Order for These Uses to Occur, an Institution Needs to… (Bresciani, 2006)
Set priorities around institutional values
Communicate a shared conceptual framework and common language
Systematically gather data that actually evaluates these priorities
Sharing & Using Program Review Results
Challenges occur because:
Results are not linked to outcomes, college/institutional goals, or strategic priorities
  A template fix is the solution here (p. 144)
Those who receive results are unsure what to do with them
  Presentation of the data
  Interpretation of the data
  Professional development
Sharing & Using Program Review Results, Cont.
Challenges occur because:
The routing for discussion of results and decisions is not clear
  Clarified by roles and responsibilities
  Communication routing
Expectations for use of the results are not clear
  Clarified by articulating how results will be used, by whom, and when
What are your challenges, and how could they be readily addressed?
Real-Life Considerations for the Write-Up
The Audience
  For whom is the data? Change language for different audiences if necessary
The Story
  What point are you trying to make, and for whom? What decision needs to be made, and who needs to make it?
The Format
  Depends on the audience
Reporting Strategies (from Gary Hanson, Ph.D.)
Know your data
Know your audience
Tell the story
Identify meaningful indicators to shape the story
Examine indicators for patterns
Begin with the end in mind
Tie the data to the outcomes
Involve the end users in the process
Reporting Strategies, Cont. (from Bresciani, Zelna, and Anderson, 2004)
Identify the values of your constituents and find out how your constituents prefer to see data and reports
  Especially important for IR people, who are the “keepers of the data”
  A continual process of refinement
Students (or those whom you evaluated) can be extremely helpful in your writing and dissemination of results and decisions made
Be sure to link the data and decisions made to the outcome and the program being assessed (Maki, 2001)
Reporting Strategies, Cont. (from Bresciani, Zelna, and Anderson, 2004)
Timing is everything when delivering results and decisions
Prepare to defend your outcome, your evaluation method, your results, and the decisions made based on those results
If you need help interpreting the data, get it
Consider multiple layers of reporting (broad and general to detailed and specific)
Sharing Examples - Reporting
What “need” generated the data requested? What is the purpose of the data request?
What story needs to be told?
Who will be using the data?
  What are their values?
  …preferences for receipt of data?
Who needs to be involved in preparing the data?
  …presenting the data?
What story are you trying to tell? What key points are you trying to make?
Sharing Examples - Providing Data
What data do your constituents want available to inform their decisions?
How often do they want it? In what format?
How transparent do they want the data to be?
What is the communication flow?
Do they need comparisons? What type of comparisons will be most meaningful to them?
Some Examples
1. Various Ways to Tell the Story
2. Using Dashboard Indicators to Inform OBPR Focus
Example Results
SLO 12) Apply research to practice, particularly in their area of specialization and focus

Student feedback and the evaluation of learning artifacts from 610, 795 A&B, as well as the learning portfolio, reflect a misalignment in the curriculum. We have made slight changes to the ED 690 course so that it better prepares students for ED 795 A & B by offering a stronger transition to writing a literature review and reflecting on a problem statement and purpose statement. Faculty will continue to discuss further opportunities to improve this alignment over the coming year. The faculty explored the possibility of offering ED 690 in spring only, with a section just for SA students and a section just for rehab students; however, this solution was not affordable.

Students reported wanting more control over the selection of their research topic, so student groups will be given more autonomy in selecting their research project topic. Students working in groups will continue to evaluate their peers. However, in response to student feedback, the peer evaluation rubric will be revised to include the aspects on which students wanted to evaluate their peers, and students will be educated more frequently on how the rubric scores will influence their peer grades.

Results indicate that many students felt they learned each aspect of a research project better in a group environment than individually. Students commented that they would prefer to be in a smaller group in the future, with a recommended group size of 4-5 students per project. The group size will be decreased from 8-9 students per group to a range of 3-5. We will re-evaluate this outcome in 2008 to see whether we have made any improvement.

Student Exit Survey – We will continue the exit survey each year but explicitly ask students to report the extent to which they learned each program SLO, rather than ask them for satisfaction with each course.
How do you perceive your {Primary Academic Advisor}?
(Percent of college total in bold)
Overall, rate your level of satisfaction with the assistance you have received from your {Primary Academic Advisor}
(Scale: 1 = poor, 2 = fair, 3 = good, 4 = excellent; significant main effect)
Advisors by College (continued)
Percent of students indicating the type of advisor they go to most often, by college.
Same Information
More Focused Information
Real-Life Reporting Reminders
Keep your audience in mind
If you have to draft varying reports/summaries of results for your varying audiences… do it
Report data in the context of issues or outcomes
Provide a detailed version and an “executive summary”
Use graphs wisely
Real-Life Reporting Reminders, Cont.
Timing IS really everything
Don’t underestimate the power of “trying out” drafts on key decision-makers
Interpret your data so that it informs program improvement, budgeting, planning, decision-making, or policies
Report limitations honestly
Sharing & Using Program Review Results Questions
In an ideal world, who would you want to review the results of program review?
Does your answer vary by type of program (professional accreditation or not) or level of program (undergraduate, graduate)?
What reflection questions would you provide those reviewing the results to guide their interpretation, and therefore use, of the results?
Sharing & Using Program Review Results Questions, Cont.
What are the articulated expectations for use of the results?
How would results be disseminated?
Who would be involved in interpreting the results, and what is their role?
Are there clear paths for communication flow of results, interpretation of results, decisions, and recommendations?
Sharing & Using Program Review Results Questions, Cont.
Is one type of data more influential than another in the interpretation of the results?
Is it clear who is involved in formulating decisions that are based on the interpretation of results?
Is it clear at which level the decision resides?
Mentor Group Reflection Questions
Is there anything I need to change on my OBPR template so that it is clear how results align with outcomes at each level?
Am I clear about who needs to see the results from program review, and how they prefer to see them, in order to inform the necessary decisions?
Do I need to provide any professional development for anyone so they know how to use the results to inform decisions?