Undergraduate Scholarship and other High Impact Practices: Assessing the Outcomes, Not the Content. Norman Jones Director of General Education & Curricular Integration Utah State University

Assessment of teaching in 2030 will probably not look like it does now. Delivering “content” will not be our job; our job will be to prepare students for the human jobs that machines cannot do, jobs requiring personal interaction and critical thinking.

The Future of Higher Ed

“Degrees and other postsecondary credentials can’t simply be defined by the amount of time a student spends in classrooms or labs. Rather, degrees must represent well-defined and transparent learning outcomes. In short, students should get credit for what they know and what they can do. And all learning should count ― no matter how, when or where it was obtained.”
Jamie Merisotis, President, Lumina Foundation

The traditional assessment method measures progress incrementally, using quizzes and tests, comparing individual performance against the average of group performance over a fixed period of time calibrated in units of seat time.

- “Current assessment practice, for the most part, rests on faculty-established goals, developed independently at each institution, for what graduates should know and be able to do.”
- “Whether or not graduates attain these goals is then investigated on average by using various methods to examine the performance of representative samples of students.”
Peter T. Ewell, “The Lumina Degree Qualifications Profile (DQP): Implications for Assessment.”

These methods do not ask about developing individual proficiencies, or mastery, or the possibility of learning that occurs outside the classroom.

We Must Teach Toward Proficiency: “a set of demonstrations of knowledge, understanding and skill that satisfy the levels of mastery sufficient to justify the award of an academic degree.” Degree Qualifications Profile 2.0

In order to assess for proficiency we have to rethink what we are asking students to learn. Is it content, or the application of content?

- “Assignments or examination questions designed to determine proficiency in particular DQP competencies, consequently, must require students to generate a product of some kind—a research paper, an oral presentation, a dance performance, a translation of a text from one language to another, an engineering design.”
- “Merely identifying a ‘correct’ answer from a set of posed alternatives is not a production task. Because the assessments associated with DQP competencies require students to directly demonstrate mastery, the assessment really is the competency from an operational standpoint.”
Ewell, “DQP…Assessment,” 12.

WE MUST
- Decrease emphasis on content: students have libraries in their pockets.
- Create opportunities for practice, linking content to a need to know.
- Emphasize acquisition, deployment, and communication of knowledge through experience.

High Impact Practices develop students’ proficiencies more than their book learning.

Pedagogical High Impact Practices that Reinforce Proficiencies:
- First-Year Seminars and Experiences
- Undergraduate Research
- Writing-Intensive Courses
- Capstone Courses and Projects

Institutionally Organized Experiences that Reinforce Proficiencies:
- Common Intellectual Experiences
- Learning Communities
- Diversity/Global Learning
- Internships
- Service Learning

HIP Assessments
- “Process” assessments are based on course design.
- “Outcomes” assessments are based on demonstrated proficiency.

- In “Process” assessments, we identify the target outcomes and design ways for students to prepare and practice for them.
- In “Outcomes” assessments, students produce evidence of proficiency.

At USU, we built Gen Ed template rubrics that identify the proficiencies we expect a course to deliver WITHOUT PRECISELY DEFINING CONTENT.

Breadth Life Science Rubric (Knowledge outcomes)

Each outcome is described for the student who attains proficiency, who is approaching proficiency, and who lacks proficiency.

Outcome: Understand how the enterprise of science works (i.e., erecting testable hypotheses, refining hypotheses, reproducible results, etc.).
- Attains proficiency: Can apply the basic structure and methodology of scientific enterprise.
- Approaching proficiency: Can articulate the basic structure and methodology of scientific enterprise.
- Lacks proficiency: Is unable to articulate the basic structure and methodology of scientific enterprise.

Outcome: Learn the key laws, concepts and processes that govern biological systems.
- Attains proficiency: Knows the key laws and concepts, and is able to apply them to novel problems.
- Approaching proficiency: Knows the key laws and concepts and can articulate them.
- Lacks proficiency: Does not know the key laws and concepts beyond memorization.

Outcome: Utilize quantitative methods to collect, analyze and interpret scientific information.
- Attains proficiency: Can interpret, read, understand and explain a graph, table, or quantitative series of data, and apply that understanding to a problem.
- Approaching proficiency: Can interpret, read and understand a graph, table, or quantitative data.
- Lacks proficiency: Is not able to interpret, read or understand a graph, table, or quantitative series of data.

Outcome: Evaluate the credibility of various sources of information about science-related issues.
- Attains proficiency: Can assess the credibility of sources of scientific information, and critique a source as it applies to a scientific issue.
- Approaching proficiency: Can assess credible sources of scientific information, and can articulate why they are credible.
- Lacks proficiency: Cannot assess credible sources of scientific information, or is unable to determine the credibility of sources.

Outcome: Use written or visual communication to demonstrate knowledge of scientific findings.
- Attains proficiency: Can write and/or illustrate knowledge of a scientific idea or concept clearly, comprehensively, and concisely.
- Approaching proficiency: Can write and/or illustrate knowledge of a scientific idea or concept.
- Lacks proficiency: Is unable to convey knowledge.

Outcome: Examine the relationship of the science learned to societal issues (such as sustainability, etc.).
- Attains proficiency: Can apply science concepts and societal issues to the greater question of the course.
- Approaching proficiency: Can articulate the relationship between science concepts and societal issues.
- Lacks proficiency: Is unable to recognize the links between social issues and scientific findings.

These template rubrics require that achieving proficiency in the outcomes be intentionally designed into the course. Using active verbs, we demand demonstrations of what students know, understand, and are able to do, stated generically rather than in terms of specific content.

The template rubrics establish the hierarchies of proficiency in the subject, and give form to the grading rubrics used to assess the demonstrations of competence.
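
To make this concrete, here is a minimal sketch of what a template rubric looks like as data: each outcome maps proficiency levels to observable demonstrations. The class and function names, and the shortened descriptors, are illustrative assumptions for this sketch, not USU’s actual assessment tooling.

```python
# Minimal sketch (illustrative only, not USU's actual tooling): a template
# rubric as data, with each outcome mapping proficiency levels to the
# demonstrations that count as evidence.
from dataclasses import dataclass


@dataclass
class Outcome:
    name: str
    descriptors: dict  # proficiency level -> observable demonstration


life_science_template = [
    Outcome(
        name="Utilize quantitative methods",
        descriptors={
            "attains proficiency": "Explains a graph, table, or data series and applies it to a problem.",
            "approaching proficiency": "Reads and understands a graph, table, or data series.",
            "lacks proficiency": "Cannot interpret a graph, table, or data series.",
        },
    ),
    Outcome(
        name="Evaluate credibility of sources",
        descriptors={
            "attains proficiency": "Assesses sources and critiques them as applied to a scientific issue.",
            "approaching proficiency": "Identifies credible sources and articulates why they are credible.",
            "lacks proficiency": "Cannot determine the credibility of sources.",
        },
    ),
]


def attains_all(ratings):
    """True only if every outcome in the template is rated 'attains proficiency'."""
    return all(ratings.get(o.name) == "attains proficiency" for o in life_science_template)


# Example ratings an instructor might record for one student's demonstrations.
print(attains_all({
    "Utilize quantitative methods": "attains proficiency",
    "Evaluate credibility of sources": "approaching proficiency",
}))  # -> False
```

The point of the sketch is that the rubric fixes the kind of demonstration required, not the content the course must cover.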

HOW DO YOU ASSESS AN EXPERIENCE THAT HAS A PROFICIENCY OUTCOME?

We have to think about the “take away” from the experience. If we focus on outcomes, there is no precise question like “Who is buried in Grant’s tomb?” There are no points to be lost for incorrect answers.

We want to understand the student’s engagement with practice. Has he or she demonstrated proficiencies in the discipline through use of the tools of the discipline?

We have to think about the demonstration of proficiencies within the outcomes, both in courses and in degrees. They build together, reaching their apogee in the capstone project.

Building Pathways to Increasing Proficiency

As we refine our templates and enact them in course rubrics, we can begin to map the way student practice builds proficiency over the curriculum. That means we can envision appropriate pathways to degrees.
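
Below is a minimal sketch, with invented course names and levels, of how such a curriculum map could be represented so that gaps in a pathway become visible. It is an assumption-laden illustration, not USU’s actual mapping tool.

```python
# Hypothetical curriculum map (illustrative only): for each course in a pathway,
# the outcomes it practices and the level of proficiency it is designed to reach.
# Levels: 1 = introduced, 2 = practiced, 3 = demonstrated at capstone level.
CURRICULUM_MAP = {
    "Breadth Life Science": {"quantitative methods": 1, "source credibility": 1},
    "Methods Course": {"framing questions": 2, "primary sources": 2},
    "Senior Capstone": {"framing questions": 3, "primary sources": 3, "source evaluation": 3},
}


def pathway_gaps(courses, required_outcomes, target_level=3):
    """Return the required outcomes that never reach the target level in this pathway."""
    reached = {}
    for course in courses:
        for outcome, level in CURRICULUM_MAP.get(course, {}).items():
            reached[outcome] = max(reached.get(outcome, 0), level)
    return [o for o in required_outcomes if reached.get(o, 0) < target_level]


# Example: "quantitative methods" is introduced but never built to the target level.
print(pathway_gaps(
    ["Breadth Life Science", "Methods Course", "Senior Capstone"],
    ["framing questions", "primary sources", "quantitative methods"],
))  # -> ['quantitative methods']
```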

USU, HIST 4990: Senior Capstone Common Rubric, used by all capstones in the History degree to assess proficiencies.

Each learning outcome is rated at five levels: excellent mastery, good mastery, some mastery, minimal mastery, no mastery.

Outcome: Student frames historical questions in a thoughtful, critical manner.
- Excellent mastery: The paper addresses a significant historical question that is clearly stated. The question’s significance is satisfactorily demonstrated; the student is conscious of the role of periodization in forming the question; the question is of manageable scope and logically formulated.
- Good mastery: The paper addresses a significant historical question that is clearly stated. The student makes an effort to demonstrate significance and to employ periodization. The question is of manageable scope, posed with minimal logical flaws in question framing.
- Some mastery: The paper addresses a historical question that can be identified with some difficulty. Significance of the question is unclear; minimal grasp of periodization; serious logical lapses in question framing.
- Minimal mastery: Significance of the question is not demonstrated; the question is of inappropriate scope or illogically presented; no grasp of periodization.
- No mastery: No identifiable historical question.

Outcome: Student employs a range of primary sources appropriate to the informing thesis of the paper.
- Excellent mastery: Makes thorough use of all relevant online and print databases to identify primary source literature; all available primary sources identified. All sources in the bibliography are thoroughly used in the text.
- Good mastery: Makes good use of relevant online and print databases; some gaps in the primary source base. A few sources in the bibliography are not fully used.
- Some mastery: Makes some use of online or print databases; significant gaps in the source base; paper based on only a few of the cited sources.
- Minimal mastery: No evidence of using databases to establish a source base; source base very limited. Major sources unknown or not employed. Little evidence that the author has used works listed in the bibliography.
- No mastery: No evidence of using databases; sources entirely insufficient and inappropriate to the paper topic.

Outcome: Student evaluates and analyzes primary sources.
- Excellent mastery: Shows thorough awareness of the origins, authors, and contexts of all primary sources; consciously employs verification strategies as needed.
- Good mastery: Shows some awareness of the contexts of primary sources; employs some verification strategies.
- Some mastery: Offers partial evaluation of primary sources; spotty verification.
- Minimal mastery: Offers little to no evaluation of primary sources; no verification.
- No mastery: Is not aware of the need to evaluate or verify sources.

Through assessing the seven degree outcomes, we are asking whether students have sufficient mastery to be granted a degree. The senior thesis is the artifact used as the demonstration.
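
As a hedged sketch of how per-outcome ratings from such a rubric might be rolled up into a degree-level judgment: the outcome names beyond the three rubric rows shown above, and the threshold rule itself, are invented for illustration, not USU’s actual seven outcomes or policy.

```python
# Hypothetical roll-up (not USU's actual policy): translate the common rubric's
# mastery levels into numbers and ask whether the capstone artifact shows
# sufficient mastery across the degree outcomes. Several outcome names below
# are invented placeholders.
MASTERY = {"no": 0, "minimal": 1, "some": 2, "good": 3, "excellent": 4}


def sufficient_mastery(ratings, minimum="good", allowed_below=1):
    """Sufficient if at most `allowed_below` outcomes fall below the minimum level."""
    floor = MASTERY[minimum]
    below = sum(1 for level in ratings.values() if MASTERY[level] < floor)
    return below <= allowed_below


thesis_ratings = {
    "frames historical questions": "excellent",
    "employs appropriate primary sources": "good",
    "evaluates and analyzes sources": "some",
    "constructs a coherent argument": "good",
    "engages the historiography": "good",
    "writes clearly": "excellent",
    "documents sources correctly": "good",
}
print(sufficient_mastery(thesis_ratings))  # -> True (one outcome below "good" is allowed)
```

However the threshold is set, the judgment rests on demonstrated mastery in the artifact, not on seat time or test averages.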

We are “preparing students to tackle nonstandard, unscripted problems and questions. Unscripted problems, by definition, are those where ‘right answers’ are not known and where the nature of the problem itself is likely uncertain at best, and often actively contested.” Carol Geary Schneider, quoted in Ewell, “DQP…Assessment,” 25.

- Our only real assessment audience for improvement is ourselves. Assessment reassures us that we can honestly say we are educating our students.
- Is a fancy metric needed? No. Proof of proficiency is.

- Process assessment + proof of outcomes performance = evidence of proficiency.
- An intentional curriculum, taught by intentional faculty to intentional learners.
- Through it, every student’s proficiency is established.