A Comprehensive Assessment Approach for the School of Advanced Studies
Rob Olding, Ph.D., Associate Dean for Assessment, University of Phoenix: School of Advanced Studies

Learning Assessment for Doctoral-Level Education is …?
- Relatively new in current conceptualization
- Relatively abstract (compared to Associate and Bachelor levels)
- Higher thresholds for expectations of existing skills
- Research and writing intensive

A Multiple-Method Approach
Assessment Strategy: somewhat in the form of an "Instrumental Case Study" approach, consisting of "data vectors" from various perspectives:
- Rubric-based assessment grounded in IRMA mapping of the curriculum
- External evaluation sources (e.g., accreditor reports, Council of Graduate Schools, external consultants)
- Appreciative Inquiry approach to assessment

Triangulation of Assessment Data Sources (basic model)
[Diagram: SAS Practitioner Doctorate Learning Assessment data sources (Rubric-based measurement of Student Learning Outcomes, External 3rd-Party Data/Reports, and Appreciative Inquiry (AI)-Based Assessment) integrated through Deliberative Analysis]

Rubric-Based Measurement: Student Learning Outcomes
- Critical to the assessment process is the development of analytic rubrics to measure performance on signature assignments as described.
- As documented in multiple sources, analytic rubrics have been shown to provide effective feedback even at the advanced education level (Groggins Selke, 2013).
- Analytic rubrics focus on:
  - Signature assignments in content and research courses
  - Key progression assignments, such as the dissertation concept paper and proposal
  - Doctoral dissertation assessment via multiple reviewers as the "capstone" measure of doctoral success
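For illustration only, here is a minimal sketch (in Python) of how analytic-rubric ratings on a signature assignment might be recorded and averaged across reviewers. The criterion names and the four-point scale are hypothetical stand-ins, not the actual IRMA-mapped rubrics used by the School of Advanced Studies.

```python
# Sketch of analytic rubric scoring for a signature assignment.
# Criteria and the 1-4 scale are hypothetical examples only.

from statistics import mean

# Each criterion is rated 1 (emerging) through 4 (exemplary).
RUBRIC_CRITERIA = ["argument", "use_of_evidence", "research_design", "scholarly_writing"]

def score_assignment(ratings: dict) -> dict:
    """Summarize one reviewer's ratings of a single signature assignment."""
    missing = [c for c in RUBRIC_CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"Missing criteria: {missing}")
    return {
        "by_criterion": ratings,
        "overall": mean(ratings[c] for c in RUBRIC_CRITERIA),
    }

def aggregate_reviewers(reviews: list) -> dict:
    """Average multiple reviewers' ratings per criterion (e.g., a dissertation panel)."""
    return {c: mean(r[c] for r in reviews) for c in RUBRIC_CRITERIA}

if __name__ == "__main__":
    panel = [
        {"argument": 3, "use_of_evidence": 4, "research_design": 3, "scholarly_writing": 4},
        {"argument": 4, "use_of_evidence": 3, "research_design": 3, "scholarly_writing": 4},
    ]
    print(score_assignment(panel[0]))
    print(aggregate_reviewers(panel))
```

In practice, per-criterion averages of this kind would feed the rubric-based "data vector" of the triangulation model described later.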

External 3rd-Party Data (Reports)
- Leveraging the data from these different sources to form a clear analytical picture of how our learning outcomes and programs are perceived from external views.
- External consultant reports as engaged by the University of Phoenix School of Advanced Studies (e.g., Council of Graduate Schools)
- Survey data (e.g., SEOCS and alumni surveys)
- Accreditor reports (e.g., Higher Learning Commission)

Appreciative Inquiry Approach to Learning Assessment
- This approach is argued to be atypical in the assessment of learning, but it is very appropriate for the doctoral level, which, compared to lower-level degrees, can be diverse, abstract, and dynamic in its outcomes.
- As a method, it is known to be "transformational" and "engaging," which is important for both faculty and students.
- Positive in focus: "What were the most important learning outcomes you found in this course?"
- Tied directly to a process of envisioning what we do best, what our dream outcomes would be, and how we might move to attain that level.
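As a rough illustration of how responses to the positive-focus prompt might be captured for later analysis, the sketch below tallies recurring themes in open-ended answers. The keyword-to-theme mapping and the sample responses are hypothetical; they are not part of the School's actual procedure.

```python
# Sketch of capturing Appreciative Inquiry responses and tallying themes.
# The theme keywords and sample responses are hypothetical illustrations.

from collections import Counter

PROMPT = "What were the most important learning outcomes you found in this course?"

# Hypothetical keyword-to-theme mapping for a rough first pass.
THEMES = {
    "literature": "synthesis of literature",
    "method": "research methods",
    "writing": "scholarly writing",
    "practice": "application to practice",
}

def tally_themes(responses: list) -> Counter:
    """Count how often each theme appears across open-ended responses."""
    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for keyword, theme in THEMES.items():
            if keyword in lowered:
                counts[theme] += 1
    return counts

if __name__ == "__main__":
    sample = [
        "I learned to critique the literature and improve my writing.",
        "The methods assignments helped me connect theory to practice.",
    ]
    print(PROMPT)
    print(tally_themes(sample))
```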

Appreciative Inquiry Steps: 4-D Model (adapted from Watkins and Mohr, 2001)
- Discovery: "Best of what is?" (Appreciating)
- Dream: "What might be?" (Envisioning Results)
- Design: "What should be the ideal?"
- Destiny: "How to empower, learn, improve?" (Sustaining Change)

Integration of Multiple Data Sources Using Diverse Methods
- The three approaches to assessment data provide a diverse set of perspectives.
- These perspectives can be left unreconciled, or…
- Integrated to provide a new level of perspective that informs far beyond any one source on its own.
- TRIANGULATION

Advantages:
- Includes diverse perspectives in the discussion
- Provides a basis for meaningful engagement for administration, faculty, and students
- Avoids an over-reliance on a single-paradigm approach that limits perceptions

Means of Integration:
- The triangulation process requires a means of bringing together diverse data sources.
- An ideal approach would extend the process to multiple "stakeholders," including students, faculty, and administration.
- Such a method would work to ensure "fair consideration" of each data source in relation to optimum learning outcomes.

The Deliberation Model
An inclusive approach that involves all stakeholders and represents each of the data sources in the process.

Deliberation: A Means of Integrating Diverse Perspectives
- Deliberation is neither a casual discussion nor a formal or informal debate. The discussion should focus first on each source individually, then on the sources in combination, and from a "best solution" view.
- Maximum engagement: can be used in face-to-face, online synchronous, and asynchronous formats
- Allows for carefully weighing diverse views and bringing to light points not previously considered and explored
- Allows for exploration of facts in relation to the strategic positions represented by each data source

Interpretation and Integration
[Diagram: External Reports (3rd party), Rubric-Based Assessment, and Appreciative Inquiry Results converging on Interpretation and Integration]
Each perspective is sequentially discussed in terms of meaning and both positive and negative implications. Then the perspectives are compared and discussed in relation to their integration with each other. Commonalities are of particular note in TRIANGULATING meaning.
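A minimal sketch of the triangulation idea: findings from the three sources are lined up so that commonalities surface for deliberation. The example findings and the simple set-intersection matching are hypothetical, not the School's actual procedure.

```python
# Sketch of surfacing commonalities across the three assessment data sources
# as candidates for triangulated meaning. Example findings are hypothetical.

from itertools import combinations

findings = {
    "rubric_assessment": {"research design strength", "weak scholarly writing"},
    "external_reports": {"weak scholarly writing", "strong practitioner focus"},
    "appreciative_inquiry": {"strong practitioner focus", "research design strength"},
}

def commonalities(sources: dict) -> dict:
    """Findings shared by each pair of sources: candidates for triangulated meaning."""
    return {
        (a, b): sources[a] & sources[b]
        for a, b in combinations(sources, 2)
        if sources[a] & sources[b]
    }

if __name__ == "__main__":
    for pair, shared in commonalities(findings).items():
        print(f"{pair[0]} + {pair[1]}: {sorted(shared)}")
```

The output of such a pass would only seed the deliberation session; interpretation of each overlap would still rest with the stakeholders in the room.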

Insights and Outcomes:
The strategy is to:
- Maximize engagement of stakeholders
- Optimize the data sources by ensuring that diverse methods and perspectives are included in consideration
- Develop an integrated set of insights that can inform strategy and contribute meaningfully to continuous improvement
- Move assessment and evaluation to a new level for doctoral education

Thank You! Questions????

REFERENCES
Borkowski, Nancy A. (2006). Changing our thinking about assessment at the doctoral level. In Maki, Peggy L., & Borkowski, Nancy A. (Eds.), The assessment of doctoral education: Emerging criteria and new models for improving outcomes. Sterling, VA: Stylus Publishing.
Bushe, Gervase R., & Kassam, Aniq F. (2005). When is appreciative inquiry transformative? A meta-case analysis. The Journal of Applied Behavioral Science, 41(2).
Groggins Selke, Mary J. (2013). Rubric assessment goes to college: Objective, comprehensive evaluation of student work. Lanham, MD: Rowman & Littlefield Education.