Assessment of General Education

Assessment of General Education Tips and Techniques Developed by OCU Office of Assessment Coordinator: Dr. Jo Lynn Autry Digranes

Assessment of Student Learning The systematic collection of information about student learning, using the time, knowledge, expertise, and resources available, in order to inform decisions about how to improve learning. Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

HLC Statement on Student Learning, Assessment, and Accreditation Fundamental Questions for Conversations on Student Learning Six fundamental questions serve as prompts for conversations about student learning and the role of assessment in affirming and improving that learning: How are your stated student learning outcomes appropriate to your mission, programs, degrees, and students? What evidence do you have that students achieve your stated learning outcomes? In what ways do you analyze and use evidence of student learning? How do you ensure shared responsibility for student learning and for assessment of student learning? How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning? In what ways do you inform the public and other stakeholders about what students are learning, and how well? Higher Learning Commission. (2007). Statement on Student Learning, Assessment and Accreditation. HLC Website: http://ncahlc.org/Information-for-Institutions/publications.html

Setting Goals The institution’s statements of learning outcomes clearly articulate what students should be able to do, achieve, demonstrate, or know upon the completion of each undergraduate degree. The outcomes reflect appropriate higher education goals and are stated in a way that allows levels of achievement to be assessed against an externally informed or benchmarked level of achievement or assessed and compared with those of similar institutions. New Leadership Alliance for Student Learning and Accountability. (2012). Committing to quality: Guidelines for assessment and accountability in higher education. Washington, D.C.: New Leadership Alliance for Student Learning and Accountability.

Setting Goals Institutional practices, such as program review, are in place to ensure that curricular and co-curricular goals are aligned with intended learning outcomes. The institution and its major academic and co-curricular programs can identify places in the curriculum or co-curriculum where students encounter or are expected or required to achieve the stated outcomes. Learning outcome statements are presented in prominent locations and in ways that are easily understood by interested audiences. New Leadership Alliance for Student Learning and Accountability. (2012). Committing to quality: Guidelines for assessment and accountability in higher education. Washington, D.C.: New Leadership Alliance for Student Learning and Accountability.

Evidence of Gathering Student Data Policies and procedures are in place that describe when, how, and how frequently learning outcomes will be assessed. Assessment processes are ongoing, sustainable, and integrated into the work of faculty, administrators, and staff. Evidence includes results that can be assessed against an externally informed or benchmarked level of achievement or compared with those of other institutions and programs. New Leadership Alliance for Student Learning and Accountability. (2012). Committing to quality: Guidelines for assessment and accountability in higher education. Washington, D.C.: New Leadership Alliance for Student Learning and Accountability.

Evidence of Gathering Student Data Evidence also includes assessments of levels of engagement in academically challenging work and active learning practices. Results can be used to examine differences in performance among significant subgroups of students, such as minority group, first-generation, and non-traditional-age students. New Leadership Alliance for Student Learning and Accountability. (2012). Committing to quality: Guidelines for assessment and accountability in higher education. Washington, D.C.: New Leadership Alliance for Student Learning and Accountability.

General Education Assessment The process of assessing general education is not a means to its own end; rather, it is a way to engage systematically in daily critical inquiry about what works well and what needs to be improved. Maki, P.L. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus.

Assessment Steps Develop learner outcomes. Check for alignment between the curriculum and the outcomes. Develop an assessment plan. Collect assessment data. Use results to improve the program. Routinely examine the assessment process. Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Steps in Developing and Sustaining Assessment of General Education Determine the purpose of general education. Articulate goals and outcomes for general education. Design or refine the delivery of learning outcomes. Align learning outcomes to the delivery of the general education (e.g., connect the learning to teaching and course and curriculum design). Bresciani, M. J. (2007). Assessing student learning in general education: Good practice case studies. Bolton, MA: Anker Publishing Company, Inc.

Steps in Developing and Sustaining Assessment of General Education Align education outcomes with the discipline outcomes, cocurricular outcomes, and overarching student learning principles, as applicable. Identify the means to evaluate learning outcomes (both direct and indirect methods) and do so in a manner that will help inform decisions or recommendations to improve learning. Bresciani, M. J. (2007). Assessing student learning in general education: Good practice case studies. Bolton, MA: Anker Publishing Company, Inc.

Steps in Developing and Sustaining Assessment of General Education Gather, analyze, and interpret results in a manner that will lead to decisions and recommendations for improvement at all levels. Discuss the recommendations for improving student learning in a collegial manner and with regard to the resources that will be needed to improve student learning. Bresciani, M. J. (2007). Assessing student learning in general education: Good practice case studies. Bolton, MA: Anker Publishing Company, Inc.

Steps in Developing and Sustaining Assessment of General Education Report the results in a manner that will invite feedback from those involved in the learning process, including students and co-curricular professionals. Align the evaluation processes of general education with those of the disciplines and professional accreditors, where appropriate. Bresciani, M. J. (2007). Assessing student learning in general education: Good practice case studies. Bolton, MA: Anker Publishing Company, Inc.

Steps in Developing and Sustaining Assessment of General Education Build consensus at each step. Explore additional opportunities to collaborate for improving student learning in general education. Have conversations about removing barriers to improving student learning. Bresciani, M. J. (2007). Assessing student learning in general education: Good practice case studies. Bolton, MA: Anker Publishing Company, Inc.

General Education Models A distribution model that guarantees breadth in the undergraduate curriculum A specially created set of core courses that integrate the liberal arts A specially created set of courses that emphasize processes and individual student growth Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Examples of Goal Levels Institutional: Students will communicate effectively orally and in writing. General Education Curriculum: Students will write essays in which they select and defend a position on a debatable issue, analyze a text, propose research, or define a problem and suggest solutions. Composition Course: Students will write a 5- to 7-page argumentative essay in which they select and defend a position on a debatable issue, support their position with evidence from their readings, and address counterarguments. Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct and Indirect Assessment Direct assessment involves an analysis of products or behaviors that demonstrate the extent of students’ mastery of learning outcomes. Indirect assessment involves people’s opinions, and these opinions can richly supplement what is learned in direct assessment studies. Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct Assessment Examples Standardized tests Locally developed tests Embedded assignments and activities Portfolios Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

More Direct Assessment Examples Final Projects – such as senior thesis, undergraduate research project, senior art show or music recital Capstone Experiences – such as student teaching, internship, cooperative educational experience Middaugh, M.F. (2010). Planning and assessment in higher education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct Assessments Strengths and Limitations Standardized Tests Strengths Professionally developed Established validity and reliability Typically provide multiple comparison groups Limitations May not align with institution’s learning outcomes or curriculum May only test at limited depth of processing Students may not be motivated Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct Assessments Strengths and Limitations Locally Developed Tests Strengths Align precisely with campus general education outcomes Can include authentic assessment, performance assessment, and questions that reveal depth of understanding Limitations Unknown validity and reliability Lack norm groups for comparisons Faculty time to develop and to score Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct Assessments Strengths and Limitations Embedded Assessment Strengths May be based on a variety of learning activities, such as homework, oral presentations, writing assignments Both teachers and students are fully engaged Data collection is unobtrusive and requires little or no additional faculty or student time Limitations Embedding same assessment in multiple courses requires coordination and faculty agreement Validity and reliability not known Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct Assessments Strengths and Limitations Portfolios (Showcase and Developmental) Strengths Require students to take responsibility for their work and to reflect upon it Developmental portfolios can be integrated into advising Limitations Effort in preparing portfolio assignments Effort and resources needed to store portfolios Effort in assessing portfolios Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct Assessments Strengths and Limitations Final Projects Strengths Good for measuring mastery of general education skills Good for assessing content and skills within a specific discipline Used extensively in fine arts disciplines Limitations Can be time intensive Middaugh, M.F. (2010). Planning and assessment in higher education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Direct Assessments Strengths and Limitations Capstone Experiences Strengths Commonly utilized in professional disciplines Help address employer criticisms that graduates lack the requisite skills and knowledge for jobs Feedback from experience sites can be very helpful for curriculum review Limitations Appropriate placements can be difficult to locate Ensuring quality feedback from placement sites may be difficult Middaugh, M.F. (2010). Planning and assessment in higher education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Indirect Assessment Examples Surveys Interviews Focus Groups Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Indirect Assessments Strengths and Limitations Surveys Strengths Can reach many respondents asynchronously and at a distance Can complement results from direct assessments Many standardized surveys, such as NSSE, have established validity and reliability and allow comparison to norm groups Often easy to score Limitations May have low response rates and low respondent motivation What people say may be inconsistent with what they actually do or know. Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Indirect Assessment Strengths and Limitations Interviews Strengths Flexible in format Can include both closed-ended and open-ended questions May provide more detail Limitations Scheduling and time involved Requires interpersonal skill and an unbiased stance from the interviewer Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Indirect Assessment Strengths and Limitations Focus Groups Strengths May uncover more in-depth responses Allow checking for consensus Limitations Best for addressing a few issues Requires interpersonal skills and an unbiased facilitator Recruiting and scheduling can be difficult Themes may be readily visible, but responses may require further skilled interpretation Allen, M. J. (2006). Assessing general education programs. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Building Assessment on the Grading Process Grading is a “direct” measure of learning because there is direct evaluation of student work. Grading is already embedded within the culture of higher education. The grade alone is not sufficient; assessment must also examine the student performance and the explicit criteria used to evaluate it. Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Building Assessment on the Grading Process A direct measure requires: A student performance such as an exam or project A set of criteria by which to evaluate the performance Analysis and interpretation of the results A feedback loop into department, gen ed, and/or institutional decision-making processes Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

Making the Grading Process Useful for Program Assessment Ensure that the classroom exam or assignment actually measures the learning goals. State explicitly in writing the criteria for evaluating student work in sufficient detail to identify students’ strengths and weaknesses. Develop systematic ways of feeding information about student strengths and weaknesses back to decision makers at the departmental, general education, and institutional levels and using that information for programmatic improvement. Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.
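The feedback step above can be sketched in code. A minimal illustration in Python, using hypothetical criteria and scores, of aggregating embedded-assignment rubric results by criterion so that program-level strengths and weaknesses surface for decision makers:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical rubric scores gathered from embedded assignments
# across course sections. Each record: (criterion, score on a 1-5 scale).
scores = [
    ("thesis", 4), ("thesis", 2), ("thesis", 3),
    ("evidence", 5), ("evidence", 4), ("evidence", 4),
    ("counterarguments", 2), ("counterarguments", 1), ("counterarguments", 3),
]

def summarize_by_criterion(records):
    """Average score per criterion, sorted weakest first,
    so committees can see where to focus improvement."""
    by_criterion = defaultdict(list)
    for criterion, score in records:
        by_criterion[criterion].append(score)
    return sorted(
        ((c, round(mean(v), 2)) for c, v in by_criterion.items()),
        key=lambda pair: pair[1],
    )

for criterion, avg in summarize_by_criterion(scores):
    print(f"{criterion}: {avg}")
```

Sorting weakest first makes the discussion agenda obvious: with these invented data, “counterarguments” would be flagged for attention before “thesis” or “evidence.”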

The Rubric A rubric articulates in writing the various criteria and standards that a faculty member uses to evaluate student work. Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

The Rubric A rubric can be constructed by: The faculty member A group of faculty (or teaching assistants) all teaching the same course A department or general education committee that is establishing criteria for evaluating samples of student work or that wants all faculty teaching a certain course to use the same rubric. Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

The Rubric The rubric can also be constructed by other groups, such as: College faculty/departments working together within a local area or nationally Content experts Combinations of the above, such as the Association of American Colleges and Universities VALUE Rubrics project http://www.aacu.org/value

The Rubric Steps in Construction Choose a test or assignment that tests what you want to evaluate. Make clear your objectives for the assignment. Collect any grading criteria you have handed out to students in the past as well as sample student papers with your comments. They assist with the following steps. Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.

The Rubric Steps in Construction Identify the “traits” that will count in the evaluation. These are nouns or noun phrases without any implication of judgment: for example, “thesis,” “eye contact with client,” “costume design,” or “control of variables.”

The Rubric Steps in Construction For each trait, construct a scale describing each level of student performance from the least skillful to the most skillful. You may use three, four, or five levels, depending on the need for finer distinctions. The scales use descriptive statements. An example: “A thesis that receives a score of ‘5’ is limited enough for the writer to support within the scope of the essay and is clear to the reader; it intelligently enters the dialogue of the discipline as reflected in the student’s sources, and it does so at a level that shows synthesis and original thought; it neither exactly repeats any of the student’s sources nor states the obvious.” Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.
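The trait-and-scale structure described above maps naturally onto a small data structure. A sketch in Python with illustrative (not prescriptive) traits and descriptors; the fallback rule for levels without their own descriptor is an assumption made for the example:

```python
# Illustrative rubric: each trait maps levels (1 = least skillful,
# 5 = most skillful) to short descriptive statements.
rubric = {
    "thesis": {
        5: "Limited in scope, clear, and enters the disciplinary dialogue with original thought.",
        3: "Clear but broad; restates sources with little synthesis.",
        1: "Missing, or states the obvious.",
    },
    "use of sources": {
        5: "Sources are synthesized and cited accurately.",
        3: "Sources are summarized but not connected to the argument.",
        1: "Sources are absent or misused.",
    },
}

def describe(trait, level):
    """Return the descriptor at the closest defined level at or below `level`."""
    levels = rubric[trait]
    defined = max(l for l in levels if l <= level)
    return levels[defined]

print(describe("thesis", 4))  # falls back to the level-3 descriptor
```

Writing the scale down in this form forces each level to have an explicit, comparable descriptor, which is exactly what makes a rubric usable by more than one rater.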

The Rubric Steps in Construction Try out the scale with actual student work. Revise the scale as needed. Have a colleague in your discipline use your scale to evaluate actual student work. Revise the scale as needed. Walvoord, B. E. (2004). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass, A Wiley Imprint.
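When a colleague scores the same student work with your scale, the level of agreement between the two raters indicates whether the descriptors are clear enough. A sketch, with hypothetical scores, of two common checks: simple percent agreement and Cohen's kappa, which corrects for agreement expected by chance:

```python
def percent_agreement(rater_a, rater_b):
    """Share of papers on which two raters assigned the same rubric level."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    levels = set(rater_a) | set(rater_b)
    # Expected agreement if each rater assigned levels at their observed rates.
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in levels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores from two raters on eight papers (1-5 scale).
a = [5, 4, 4, 3, 2, 4, 5, 3]
b = [5, 4, 3, 3, 2, 4, 4, 3]
print(round(percent_agreement(a, b), 2))  # 0.75
print(round(cohens_kappa(a, b), 2))       # 0.65
```

Low agreement usually means the level descriptors overlap or are ambiguous, which is the cue to revise the scale before using it for program assessment.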

AAC&U VALUE Rubrics The Valid Assessment of Learning in Undergraduate Education (VALUE) project began in 2007, initiated by the Association of American Colleges and Universities (AAC&U). The project focused on the development of rubrics, with an initial release in 2009. The rubrics were designed to link to the fifteen “Essential Learning Outcomes” of the AAC&U Liberal Education and America’s Promise (LEAP) initiative.

AAC&U Essential Learning Outcomes Integrative and applied learning Intercultural knowledge and competence Oral communication Problem solving Quantitative literacy Reading Teamwork Written communication Civic engagement Creative thinking Critical thinking Ethical reasoning Foundations and skills for lifelong learning Information literacy Inquiry and analysis

AAC&U VALUE Rubrics Developed by teams of faculty and other academic and student affairs professionals from all sectors of higher education across the United States, the VALUE rubrics were initially tested at over 100 colleges and universities. Since being made public in 2010, they have been downloaded and used at over 4,000 distinct institutions in the U.S., Australia, Japan, Hong Kong, Dubai, and Korea.

Examples [Slides of sample VALUE rubrics; images not reproduced in the transcript.]

Timing Assessment Ideally, assessment is built into strategic planning for an institution or department and is a component of any new program as it is being conceived. Because effective assessment requires the use of multiple methods, it is not usually resource-efficient to implement every method right away or every year. A comprehensive assessment plan will include a schedule for implementing each data-gathering method at least once over a period of three to five years. Banta, T.W., Jones, E.A., & Black, K.E. (2009). Designing effective assessment: Principles and profiles of good practice. San Francisco, CA: Jossey-Bass, A Wiley Imprint.
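The three-to-five-year rotation can be sketched as a simple round-robin assignment of data-gathering methods to years; the method names and dates below are hypothetical:

```python
from itertools import cycle

def build_schedule(methods, start_year, cycle_length):
    """Assign each data-gathering method to one year of the assessment
    cycle, rotating so every method runs at least once per cycle."""
    years = cycle(range(start_year, start_year + cycle_length))
    return {method: next(years) for method in methods}

# Hypothetical plan: five methods spread over a three-year cycle.
methods = ["standardized test", "portfolio review", "embedded assignments",
           "alumni survey", "focus groups"]
for method, year in build_schedule(methods, 2025, 3).items():
    print(f"{year}: {method}")
```

Spreading the methods out this way keeps any single year's assessment workload manageable while still covering every method within the cycle.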

Questions?