1 UST Support for Institutional Effectiveness: Information for Academic OA Plan Writers.

Presentation transcript:

1 UST Support for Institutional Effectiveness: Information for Academic OA Plan Writers

2 Quality Enhancement Institutional Effectiveness defined  an ongoing, comprehensive, and institutionally integrated system  of planning and assessment  designed to enhance and improve the institution, as well as to demonstrate to what degree the institution and its components have been effective in fulfilling or achieving their stated mission or purpose.

3 Our Quality Enhancement/ Institutional Effectiveness activities include :  Strategic Planning Activities (intentional activity related to creating our “desired UST future,” based on planning, broad involvement, responsible agents, implementation, tracking, & continuous internal & external reviews)  Program Review Activities  Outcomes Assessment Activities

4 Outcomes Assessment Plans include:  Your Department’s Mission Statement  Conceptual Goals related to what you are trying to achieve as the legacies of your work (integrative & summative)  Measurable Objectives that permit you to shed light on the extent to which you are achieving these legacies  Assessment Plan indicating how you will measure objectives  Annual Summary of Findings  Annual statement of How Findings are Being Used to help the unit improve/enhance its efforts to achieve the legacies/outcomes

5 Your Mission Statement  Describes your mission in narrative form.  Describes how your unit supports UST’s Mission/ Vision/ Strategic Agenda.

6 Conceptual Goals relate to Student Learning Outcomes  Generate 3-5 student learning goals that support your mission and UST’s Mission/ Vision/ Strategic Agenda.  Articulate the integrative learning goals that your curriculum is designed to achieve.  Faculty own the curriculum; this is the process by which faculty define the student learning outcomes the curriculum is designed to achieve.

7 Your Goals Drive Departmental Processes  Faculty should strive for consensus on student learning/ outcomes assessment goals.  Goals become a central focus of faculty activity.  Goals drive curriculum, teaching, and assessment.

8 Measurable Objectives permit you to shed light on the extent to which you are achieving your goals.  Simple: focus on a given goal.  Measurable: permit measurement.  Affordable: can be done within budgets, with people in place.  Realistic: are doable.  Timely: make sense in light of what you are doing.

9 SMART Objectives: Simple: Focus on a goal.  An objective should address one important function/aspect of a given conceptual goal…  Typically from one perspective.  You should have multiple objectives in order to provide multiple ways of shedding light on the extent to which a given goal is being reached.

10 SMART Objectives: Measurable: Permit measurement.  The objective should be written in measurable terms such that it acts as a standard or measuring stick by which you evaluate your efforts.  The objective is always focused on measuring an aspect of a given goal.

11 SMART Objectives: Affordable: Can be done within budgets, with people in place.  Most objectives should be something that you can manage within your current budget.  If new money is needed, consult your supervisors to test affordability.

12 SMART Objectives: Realistic: Are “doable.”  Consider all the resources available to you, including:  Financial  Person-power  Others  Set objectives that are “doable.”

13 SMART Objectives: Timely: Make sense in light of what you are doing.  The objectives should address issues of current importance such that the information obtained can be useful.  We need not expect to achieve and measure all things every year.

14 An Assessment Plan for each objective  You need an Assessment Plan for each Objective.  The Assessment Plan measures what is identified in the related objective.  Briefly describe the assessment tool and/or procedure (what, who, how, when).  The assessment you include in the plan does not have to be something you have already been doing. It may be an assessment activity you will implement for the first time in the OA Plan year.  Consult with colleagues, your Dean, the VPAA, professionals at other institutions, and SPIRE for ideas, development of new tools, etc.

15 Types of Assessment Data  Tests  Surveys  Performance Evaluations (of students)  Scored Simulations  Portfolios that contain numerous pieces of evidence of student work and are evaluated  Other evaluative measures

16 Tests as Assessments  Need to provide information of relevance to the faculty/curriculum.  Locally developed vs. nationally available:  Local: time intensive vs. perceived appropriateness.  National: lessened time commitment vs. relevance.  Consider the availability of reliable subtest scores, which provide useful feedback.  Classroom exams, graded by a single faculty member, are not effective measures of integrative learning.

17 Surveys as Assessments  Do not provide strong evidence of student learning; often used in OA Plans to assess service units.  Can focus on either perceptions or facts.  Can be completed by students, graduates, or others (e.g., undergraduates’ graduate school professors, employers).  Do not measure learning directly.  Useful as an adjunct for some purposes.

18 Performance Evaluations & Simulations  Require scoring rubrics to which faculty agree.  Evaluators must be trained in applying scoring rubrics.  High face validity, instructional validity, and student and instructor acceptability.  Time intensive to take and to score.  Multiple subscores preferred.  Necessitate reliability checks.  Scorers should not all be the students’ own faculty; scoring must have some independence from the teachers.
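The reliability checks mentioned above can be as simple as comparing two independent raters’ rubric scores on the same student artifacts. As an illustrative sketch (not part of the presentation; the function name and sample scores are hypothetical), percent agreement can be computed like this:

```python
def percent_agreement(scores_a, scores_b):
    """Fraction of artifacts on which two raters assigned the same rubric score."""
    if len(scores_a) != len(scores_b) or not scores_a:
        raise ValueError("score lists must be equal-length and non-empty")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Hypothetical rubric scores (1-4 scale) from two independent raters
rater1 = [3, 4, 2, 3, 4, 1]
rater2 = [3, 4, 3, 3, 4, 1]
agreement = percent_agreement(rater1, rater2)  # 5 of 6 artifacts match
print(f"Inter-rater agreement: {agreement:.0%}")
```

In practice a chance-corrected statistic such as Cohen’s kappa is often preferred, but simple percent agreement is an easy first check when training scorers.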

19 Portfolios  Assemblies of multiple types of information.  Raise the same concerns as performance assessments & simulations.  Need consensus, compilation, & summarization.  Labor intensive, but good for examining strengths and weaknesses, if multiple assessments are available.

20 Evaluating Data Quality and Process Issues  Perceived validity = feedback quality.  Usefulness of resultant changes.  Costs (time [student, faculty, & staff], effort, and money).  Impact on people in the department & the program.  Sustainability.  Justification of the program to prospective students and other constituencies.  Acceptability to accrediting bodies.

21 Frequently Utilized Means of Assessment For Administrative and Support Areas  Attitudinal measures of client satisfaction  Client/user survey  Direct measures or counts of area services  Volume of activity (number of persons served)  Level of efficiency (average time for responses)  Measure of quality (average errors per audit)  External Evaluation  Periodic assessment of the relationship of the department/unit’s efforts to “good and acceptable practice” by a “neutral” person who is knowledgeable in the field.  Student outcomes related measures (e.g., Student Satisfaction Survey findings)
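The direct service measures named on this slide (volume of activity, level of efficiency, measure of quality) reduce to simple arithmetic over a unit’s records. A minimal sketch, with entirely hypothetical counts and names, might look like:

```python
# Hypothetical service-unit records; all figures are illustrative,
# not drawn from the presentation
persons_served = 1240            # volume of activity
total_response_hours = 3100.0    # summed time to respond to requests
responses = 1240
errors_found = 18                # total errors across audited transactions
audits = 12

# Level of efficiency: average time for responses
avg_response_time = total_response_hours / responses

# Measure of quality: average errors per audit
avg_errors_per_audit = errors_found / audits

print(f"Volume: {persons_served} persons served")
print(f"Efficiency: {avg_response_time:.2f} hours per response")
print(f"Quality: {avg_errors_per_audit:.2f} errors per audit")
```

Tracking these figures year over year gives a support unit the same kind of trend evidence that test scores give an academic program.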

22 Annual Summary of Findings  Be sure to report specific results for each listed objective/assessment, including numbers where appropriate.  Provide a brief summary, including the numbers or specific examples of qualitative data found.  Maintain all supporting data and analyses for review, if requested.

23 Show How You Used Results to Improve Your Effectiveness and Teaching-Learning Interventions  Be sure to record program enhancements made as a result of findings.  Provide a brief summary of ways in which the results affected your decision making, curriculum, teaching, or other support behavior.  Include specific examples, such as a particular change made.

24 This is where we “Use Data to Close the Loop”  Data need to be evaluated by all.  Data need to be quasi-public.  Review on an annual or more frequent basis.  Serious time needs to be spent considering the findings—as with research.  Curricula are jointly owned; courses are parts of curricula. Individual instructional freedom is tempered by consensus.

25 OA Due Dates: Communicated on Faculty Study Day 2003  Unit OA Plans for are due to supervisors and the VP (without results) by.  OA Plans for, with findings and how they are being used to improve program effectiveness, are due; and for units with OA data not available earlier (e.g., Major Field Test results).

26 YOUR QUESTIONS? COMMENTS?