Plenary - Eye on the Prize: Sharing and Using Program Review Results
Marilee J. Bresciani, Ph.D.
Professor, Postsecondary Education Leadership and Co-Director of the Center for Educational Leadership, Innovation, and Policy
San Diego State University, 3590 Camino Del Rio North, San Diego, California, U.S.A.

What is the purpose of your program review? What kinds of decisions do your program review findings allow you to make?

Visioning the Result of Your Program Review Process
Visioning requires:
- the setting of intentions
- the communication of those intentions
- the sharing of those intentions
- the changing of past behaviors and beliefs
- the implementation of new ways of doing and thinking

Visioning Questions
- What kinds of decisions do your program review findings allow you to make so that the purpose of your program review is fulfilled?
- What kinds of data do you need to inform the kinds of decisions that affirm the purpose of your program review process?

Visioning Questions, Cont.
Who needs to be involved:
- in the collecting of that data?
- in the interpretation of that data?
- in the synthesizing of that data?
- in the brainstorming around decisions that could be made?
- in the prioritization of the implementation plan and the resources to move that plan to fruition?
What professional development is needed to accomplish this?

Uses of OBPR Results (WASC Program Review Guidelines, 2009)
- Developing program learning outcomes and identifying appropriate means for assessing their achievement
- Better aligning department, college, and institutional goals
- Refining departmental access and other interventions to improve retention/attrition and graduation rates

Uses, Cont. (WASC Program Review Guidelines, 2009)
- Designing needed professional development programs, especially for faculty to learn how to develop and assess learning outcomes
- Reorganizing or refocusing resources to advance specific research agendas
- Re-assigning faculty/staff or requesting new lines

Uses, Cont. (WASC Program Review Guidelines, 2009)
- Illuminating potential intra-institutional synergies
- Developing specific plans for modifications and improvements
- Informing decision making, planning, and budgeting, including resource re/allocation
- Linking and, as appropriate, aggregating program review results to the institution's broader quality assurance/improvement efforts

How do you envision using your results?

In Order for These Uses to Occur, an Institution Needs to… (Bresciani, 2006)
- Set priorities around institutional values
- Communicate a shared conceptual framework and common language
- Systematically gather data that actually evaluates these priorities

Sharing & Using Program Review Results
Challenges occur because:
- Results are not linked to outcomes, college/institutional goals, or strategic priorities
  - A template fix is the solution here (p. 144)
- Those who receive results are unsure what to do with them
  - Presentation of the data
  - Interpretation of the data
  - Professional development

Sharing & Using Program Review Results, Cont.
Challenges occur because:
- The routing for discussion of results and decisions is not clear
  - Clarified by roles and responsibilities
  - Communication routing
- Expectations for the use of results are not clear
  - Clarify how results will be used, by whom, and when

What are your challenges, and how could they be readily addressed?

Real-Life Considerations for the Write-Up
- The Audience
  - For whom is the data?
  - Change language for different audiences if necessary
- The Story
  - What point are you trying to make, and for whom?
  - What decision needs to be made, and who needs to make it?
- The Format
  - Depends on the audience

Reporting Strategies (from Gary Hanson, Ph.D.)
- Know your data
- Know your audience
- Tell the story
- Identify meaningful indicators to shape the story
- Examine indicators for patterns
- Begin with the end in mind
- Tie the data to the outcomes
- Involve the end users in the process

Reporting Strategies, Cont. (from Bresciani, Zelna, and Anderson, 2004)
- Identify the values of your constituents and find out how they prefer to see data and reports.
  - Especially important for IR staff, who are the "keepers of the data"
  - A continual process of refinement
- Students (or those whom you evaluated) can be extremely helpful in the writing and dissemination of results and the decisions made.
- Be sure to link the data and decisions made to the outcome and the program being assessed (Maki, 2001).

Reporting Strategies, Cont. (from Bresciani, Zelna, and Anderson, 2004)
- Timing is everything when delivering results and decisions.
- Prepare to defend your outcome, your evaluation method, your results, and the decisions made based on those results.
- If you need help interpreting the data, get it.
- Consider multiple layers of reporting (from broad and general to detailed and specific).

Sharing Examples - Reporting
- What "need" generated the data request?
- What is the purpose of the data request?
- What story needs to be told?
- Who will be using the data?
  - What are their values?
  - What are their preferences for receiving the data?
- Who needs to be involved in preparing the data?
  - …in presenting the data?
- What story are you trying to tell?
- What key points are you trying to make?

Sharing Examples - Providing Data
- What data do your constituents want available to inform their decisions?
- How often do they want it?
- In what format?
- How transparent do they want the data to be?
- What is the communication flow?
- Do they need comparisons?
  - What type of comparisons will be most meaningful to them?

Some Examples
1. Various ways to tell the story
2. Using dashboard indicators to inform OBPR focus (a sketch follows below)
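As a purely illustrative sketch of the second example, the snippet below rolls survey responses up into dashboard-style indicators that could flag a program for a closer outcomes-based look. The column names, thresholds, and data are invented for illustration, and pandas is assumed to be available.

```python
# Hypothetical sketch: rolling program review survey data up into
# dashboard-style indicators. All names and values are invented.
import pandas as pd

# Assume one row per student response: program, a 1-4 satisfaction
# rating, and whether the respondent graduated on time.
responses = pd.DataFrame({
    "program": ["History", "History", "Biology", "Biology", "Biology"],
    "satisfaction": [3, 4, 2, 3, 2],            # 1=poor ... 4=excellent
    "graduated_on_time": [True, True, False, True, False],
})

# One row of indicators per program: mean satisfaction, on-time
# graduation rate, and number of responses.
dashboard = responses.groupby("program").agg(
    mean_satisfaction=("satisfaction", "mean"),
    on_time_rate=("graduated_on_time", "mean"),
    n=("satisfaction", "size"),
)

# Flag programs whose indicators fall below arbitrary thresholds;
# a flag would prompt a closer review, not a conclusion.
dashboard["needs_review"] = (
    (dashboard["mean_satisfaction"] < 2.5) | (dashboard["on_time_rate"] < 0.6)
)
print(dashboard)
```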

Example Results
SLO 12) Apply research to practice, particularly in their area of specialization and focus
- Student feedback and the evaluation of learning artifacts from 610 and 795 A & B, as well as the learning portfolio, reflect a misalignment in the curriculum. We have made slight changes to the ED 690 course so that it better prepares students for ED 795 A & B, offering a stronger transition into writing a literature review and reflecting on a problem statement and purpose statement. Faculty will continue to discuss further opportunities to improve this alignment over the coming year.
- The faculty explored offering ED 690 in spring only, with one section just for SA students and one just for rehab students; this solution was not possible because it was unaffordable.
- Students reported wanting more control over the selection of their research topic. Student groups will be given more autonomy in selecting their research project topic.
- Students working in groups will continue to evaluate their peers. However, in response to student feedback, the peer evaluation rubric will be revised to include the aspects students wanted to evaluate their peers on, and students will be educated more frequently on how the rubric scores will influence their peer grades.
- Results indicate that many students felt they learned each aspect of a research project better in a group environment than individually. Students commented that they would prefer smaller groups, recommending 4-5 students per project. Group size will be decreased from 8-9 students to 3-5.
- We will re-evaluate this outcome in 2008 to see whether we have made any improvement.
- Student Exit Survey: We will continue the exit survey each year but will explicitly ask students to report the extent to which they learned each program SLO, rather than their satisfaction with each course.

How do you perceive your {Primary Academic Advisor}?
[Table of responses by college; percent of college total in bold]

Overall, rate your level of satisfaction with the assistance you have received from your {Primary Academic Advisor}.
[Chart of mean ratings by college; scale: 1 = poor, 2 = fair, 3 = good, 4 = excellent; a significant main effect was found]
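The slide reports a significant main effect on the 1-4 satisfaction scale. As a minimal, hypothetical sketch of how such an effect could be tested (the ratings are invented and SciPy is assumed to be available), a one-way ANOVA comparing mean satisfaction across colleges:

```python
# Hypothetical sketch: testing for a main effect of college on advisor
# satisfaction ratings (1=poor ... 4=excellent). All data are invented.
from scipy import stats

ratings_by_college = {
    "Arts":        [3, 4, 3, 2, 4, 3],
    "Engineering": [2, 2, 3, 1, 2, 2],
    "Sciences":    [3, 3, 4, 3, 2, 3],
}

# One-way ANOVA: does mean satisfaction differ across colleges?
f_stat, p_value = stats.f_oneway(*ratings_by_college.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Significant main effect of college on satisfaction.")
```

Because the ratings are ordinal, a nonparametric alternative such as a Kruskal-Wallis test (scipy.stats.kruskal) would also be a defensible choice.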

Advisors by College (continued)
[Table: percent of students indicating the type of advisor they go to most often, by college]

Same Information
[The same data as the previous slide, presented in a different format]

More Focused Information
[A more focused view of the same data]

Real-Life Reporting Reminders
- Keep your audience in mind
- If you have to draft varying reports/summaries of results for your varying audiences… do it
- Report data in the context of issues or outcomes
- Provide a detailed version and an "executive summary"
- Use graphs wisely

Real-Life Reporting Reminders, Cont.
- Timing IS really everything
- Don't underestimate the power of "trying out" drafts on key decision-makers
- Interpret your data so that it informs program improvement, budgeting, planning, decision making, or policies
- Report limitations honestly

Sharing & Using Program Review Results: Questions
- In an ideal world, who would you want to review the results of program review?
  - Does your answer vary by type of program (professionally accredited or not) or by level of program (undergraduate, graduate)?
- What reflection questions would you provide those reviewing the results to guide their interpretation, and therefore their use, of the results?

Sharing & Using Program Review Results: Questions, Cont.
- What are the articulated expectations for use of the results?
- How would results be disseminated?
- Who would be involved in interpreting the results, and what is their role?
- Are there clear paths for the communication flow of results, interpretation of results, decisions, and recommendations?

Sharing & Using Program Review Results: Questions, Cont.
- Is one type of data more influential than another in the interpretation of the results?
- Is it clear who is involved in formulating decisions based on the interpretation of results?
- Is it clear at which level each decision resides?

Mentor Group Reflection Questions
- Is there anything I need to change on my OBPR template so that it is clear how results align with outcomes at each level?
- Am I clear about who needs to see the results from program review, and how they prefer to see them, in order to inform the necessary decisions?
- Do I need to provide professional development for anyone so they know how to use the results to inform decisions?