The Academic Program Review Bridging Standards 7 and 14 Middle States Annual Conference December 10, 2010.



Presenters
Mr. H. Leon Hill, Director of Institutional Research
Dr. Joan E. Brookshire, Associate Vice President of Academic Affairs

Overview
Framework to Address the APRs
Structure/Challenges/Approach
Examples of Metrics
Current Action Plan
Integration of End User Technology
Next Steps
Benefits of Our Approach
Questions

Assessment Cycle - 2005
Plan to meet → Meet to plan → Report out on planning → Plan to meet → Meet to plan

What we had to build on
Strong focus on programs.
State-mandated 5-year academic program review in need of revision.
Institutional Effectiveness Model (IEM) with performance indicators benchmarked through state and national databases.

Mission → Strategic Initiative: Access & Success → Institutional Effectiveness

IEM
Needed a way to assess how the College was performing on key metrics in relation to prior years/semesters and compared to other institutions.
Historical/trend data
Benchmark data - Pennsylvania & national peers

Institutional Effectiveness Model

Where we started
Restructured the Academic Program Review process
Incorporated the use of technology

Goal of the restructuring
Measure student performance as evidenced by results of assessment of student learning outcomes.
Measure program performance as evidenced by comparison of program performance to overall college performance on specific key indicators (current and aspirational).

Challenges
Usual issues with assessment in general.
Faculty had little knowledge of the College's performance indicators.
Organizational separation of assessment of institutional and student learning outcomes.

Approach
Began by building backwards from the IEM, mapping specific core indicators to program data and making additions where needed.

Examples of Metrics Used for APR

TARGETS           Caution   Acceptable   Aspirational
Graduation Rate   <19%      19%-23%      >23%

TARGETS           Caution   Acceptable   Aspirational
Transfer Rate     <29%      29%-32%      >32%
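The caution/acceptable/aspirational bands above amount to a simple threshold classification. A minimal sketch, with the band boundaries taken from the slides (the function and its names are illustrative, not part of the College's actual tooling):

```python
def classify(rate, caution_below, aspirational_above):
    """Classify a program rate against IEM-style target bands.

    All arguments are fractions, e.g. 0.19 for 19%.
    """
    if rate < caution_below:
        return "Caution"
    if rate > aspirational_above:
        return "Aspirational"
    return "Acceptable"

# Bands from the slides: graduation <19% / 19%-23% / >23%,
# transfer <29% / 29%-32% / >32%.
print(classify(0.21, 0.19, 0.23))  # a 21% graduation rate -> Acceptable
print(classify(0.35, 0.29, 0.32))  # a 35% transfer rate -> Aspirational
```

Keeping the boundaries as parameters lets each indicator carry its own targets, which matches the per-metric target rows shown above.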

Definitions of Success & Retention
Success = Grades of (A, B, C & P) / (A, B, C, D, P, F & W)
Retention = Grades of (W) / (A, B, C, D, P, F & W)
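The grade-based formulas above translate directly into code. A minimal sketch, assuming grades arrive as a list of letter strings; note the slide labels the W-share formula "Retention", while the helper name below describes what that ratio computes (all names are illustrative):

```python
SUCCESS_GRADES = {"A", "B", "C", "P"}
ALL_GRADES = {"A", "B", "C", "D", "P", "F", "W"}

def success_rate(grades):
    """Success = (A, B, C, P) / (A, B, C, D, P, F, W)."""
    attempted = [g for g in grades if g in ALL_GRADES]
    if not attempted:
        return 0.0
    return sum(g in SUCCESS_GRADES for g in attempted) / len(attempted)

def withdrawal_share(grades):
    """The slide's 'Retention' formula: W / (A, B, C, D, P, F, W)."""
    attempted = [g for g in grades if g in ALL_GRADES]
    if not attempted:
        return 0.0
    return attempted.count("W") / len(attempted)

section = ["A", "B", "C", "D", "F", "W", "P", "W", "C", "B"]
print(round(success_rate(section), 2))      # 0.6 (6 of 10 earned A/B/C/P)
print(round(withdrawal_share(section), 2))  # 0.2 (2 of 10 were W)
```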

Added a curricular analysis
How well program goals support the college's mission.
How well individual course outcomes reinforce program outcomes.
How well instruction aligns with the learning outcomes.

Evidence of closing the loop
Specific assessment results.
Changes made based on the assessment findings.
Changes made to the assessment plan.

Action Plan
Outcomes expected as a result of appropriate action steps.
Timelines and persons responsible for each action step.
Resources needed with specific budget requests.
Evaluation plan with expected benefits.

Bottom Line
Is there sufficient evidence that the program learning outcomes are being met?
Is there sufficient evidence that the program is aligned with the college on specific key indicators?

The Framework
[Diagram linking: Planning and Budgeting (Standard 2), APR Action Plan, APR Annual Report, Annual Academic Planning, Assessment Results, Curriculum Committee, President's Office, Board of Trustees]

Addition of Technology
Worked in concert with Information Technology to integrate iStrategy with the ERP (Datatel). This implementation permitted end users to obtain the data needed for program assessment without a middleman (IR and/or IT).

Next Steps in the Evolution of College and Program Outcomes

Example of APR Report Card

Examples of Course Success

Success in ACC

                2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Success        61.4%    57.1%    55.4%    55.3%    51.4%    44.2%    48.3%
% Non Success    38.6%    42.9%    44.6%    44.7%    48.6%    55.8%    51.7%

Success in ACC

                   2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Female Success    63.3%    57.5%    58.8%    57.7%    57.3%    51.8%    58.7%
% Male Success      59.8%    56.8%    53.2%    53.6%    47.2%    39.1%    42.1%

Success in Math

                2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Success        53.6%    46.3%    47.3%    45.7%    44.8%    43.3%    47.4%
% Non Success    46.4%    53.7%    52.7%    54.3%    55.2%    56.7%    52.6%

Success in Math

                             2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% African American Success    42.6%    37.7%    38.5%    25.8%    26.9%    29.9%    34.7%
% Caucasian Success           58.2%    51.8%    50.3%    52.7%    53.5%    48.4%    52.0%

Benefits
Builds a bridge between Standards 7 and 14.
Better data.
Putting data in the hands of faculty actively engages them in using data for decisions and planning.
IR time is better used.

Annual planning cycle developed.
Built a culture of assessment in several of the academic divisions.
Curricular changes that align with the graduation initiative.
Curricular and program improvement.
Created a college-wide model for improvement of student learning.

Evolution of the Dashboard
Creation of a Student Success Dashboard
Metrics:
Course-level success and retention (developmental and college-level)
Persistence (fall to spring and fall to fall)
Progression of various cohorts of students
College-level success in Math or English after developmental Math or English
Graduation
Transfer
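Of the dashboard metrics listed, persistence is the most mechanical to compute: the share of a term's enrolled students who reappear in a later term. A minimal sketch using sets of student IDs (the data and function names are illustrative, not the College's actual dashboard code):

```python
def persistence_rate(term_a_ids, term_b_ids):
    """Share of students enrolled in term A who are also enrolled in term B."""
    cohort = set(term_a_ids)
    if not cohort:
        return 0.0
    return len(cohort & set(term_b_ids)) / len(cohort)

# Hypothetical enrollment rosters.
fall_2009 = {101, 102, 103, 104, 105}
spring_2010 = {101, 103, 105, 201}
fall_2010 = {101, 105, 202}

print(persistence_rate(fall_2009, spring_2010))  # fall-to-spring: 0.6
print(persistence_rate(fall_2009, fall_2010))    # fall-to-fall: 0.4
```

The same intersection logic extends to the cohort-progression metric by tracking a fixed starting cohort across several successive terms.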

Graphic Representation for the SSD

Final Thoughts
It's not perfect, but it works for us.
Do the research on which tools are appropriate for your college.
Assessment of the core curriculum.
Launching of assessment software.
It all starts with asking the right question.
PRR 2010

Questions

Presenters
Mr. H. Leon Hill, Director of Institutional Research
Dr. Joan E. Brookshire, Associate Vice President of Academic Affairs