Assessment Planning for EOF Programs Susan DeMatteo June 13, 2013

Assessment Planning
– Program Effectiveness
– Student Success
– Student Learning

Assessment Planning
– CAS Standards for EOF
– Making Connections!
– Building a Presence!

Assessment Planning
– Step by Step Process
– Where are you now?
– Where do you want to be?

Assessment Planning
– Student Learning Outcomes
– Curriculum Mapping
– Long-term Planning

Where to start?
– Review the CAS standards for TRIO-EOP (EOF)
– What is already occurring organically?
  – Take inventory
  – Use a rubric
– Identify gaps in your assessment

Building a Document Roadmap
– Makes planning manageable
– Breaks the plan down into separate components for work distribution
– Aligns to multiple external reports
– Links to tangible evidence

Document Roadmap Example: Middle States Standard 14 Document Roadmap

Roadmap Template
X. Standard Category
X.x.x Standard Criterion
Short narrative explaining how the standard has been (or will be) met (1-5 sentences).
Evidence Links:
– Reference or attach supporting documents that provide evidence that you have met (or plan to meet) the standard (e.g., web links, "See Appendix X", etc.).
Action Items | Responsible Party | Timeline
Articulate what needs to be done to continue to address this criterion. | Specify who is responsible for getting it done. | Provide a specific time for completion.

Roadmap Example
1. Mission: [The mission statement] is consistent with that of the institution.
The institutional and EOF mission statements were reviewed in June 2013 and were determined by the self-study team to be strongly aligned.
Evidence Links:
– http://abc-college.edu/about-us/institutional-mission.html
Action Items | Responsible Party | Timeline
Review institutional and EOF mission statements every 5 years to ensure continued alignment. | EOF Director | June 2018

Action Plan
– Lists ALL action items from the document roadmap.
– Organized in chronological order.

Action Plan
– Establishes a central source for coordination
– Provides for distribution of work
– Makes roles and responsibilities explicit to all
– Ensures tasks are accomplished within a given time frame
– Keeps ongoing activities on the radar

Action Plan Example
Action Items | Responsible Party | Timeline
Distribute self-reflection survey to all students who have completed the First Year Experience (FYE) program. | Director of Institutional Research | May
Compile and distribute FYE self-reflection survey results to the FYE Coordinator. | Director of Institutional Research | June
Use self-reflection survey results to create strategies for improvement. | FYE Coordinator | July
Review institutional and EOF mission statements every 5 years to ensure continued alignment. | EOF Director | June 2018
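The action plan is essentially a small table of dated tasks, so it can also be kept as structured data. A minimal Python sketch (task names, owners, and dates are illustrative, not drawn from an actual EOF plan) of keeping action items in chronological order and flagging common problems such as missing timelines and vague responsible parties:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ActionItem:
    task: str
    responsible_party: str
    due: Optional[date]  # None means the timeline is still missing

# Illustrative entries only; a real plan would list every item from the document roadmap.
plan = [
    ActionItem("Review institutional and EOF mission statements for alignment",
               "EOF Director", date(2018, 6, 1)),
    ActionItem("Distribute FYE self-reflection survey", "TBD", None),
]

# Organize in chronological order; undated items sort last so they stand out.
plan.sort(key=lambda item: (item.due is None, item.due or date.max))

for item in plan:
    if item.due is None:
        print(f"Missing timeline: {item.task}")
    if item.responsible_party in ("", "TBD"):
        print(f"Vague responsible party: {item.task}")
```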

Action Plan
– Shared
– Communicated
– Coordinated
– Updated/Revised
– Integrated with Document Roadmap

Action Plan: Challenges
– Missing Timelines
– Vague Responsible Parties (see Counseling Department Assessment Plan.xlsx)
– Lack of Coordination (Attention to Detail)
– Getting Lost in the Shuffle (Other Departments)

Assessment Planning
– Student Learning Outcomes
– Curriculum Mapping
– Long-term Planning

5 Steps for Assessing Student Learning
1. Student Learning Outcomes
2. Assessment Instruments
3. Data Collection & Analysis
4. Instructional Changes
5. Post-Change Data Collection & Analysis: “Closing the Loop”

Distinguishing between Student Success Outcomes and Learning Outcomes
Success Outcomes:
– Reflect the success of students.
– Include such concepts as retention, completion rates, or student satisfaction.
Examples:
– 75% of students who are interested in a four-year degree program will be accepted as transfer students.
– 70% of students in the course will receive a grade of “C” or better.
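Success outcomes like these reduce to simple proportions computed from program records. A minimal sketch, using invented grades, of checking the “70% will receive a C or better” target:

```python
# Hypothetical final grades for one course section.
grades = ["A", "B+", "C", "C-", "D", "B", "F", "A-", "C+", "B-"]

c_or_better = {"A", "A-", "B+", "B", "B-", "C+", "C"}
rate = sum(grade in c_or_better for grade in grades) / len(grades)

target = 0.70
status = "met" if rate >= target else "not met"
print(f"C-or-better rate: {rate:.0%} (target {target:.0%}, {status})")
```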

1. Student Learning Outcomes: Distinguishing between Student Learning Outcomes and Success Outcomes
Learning Outcomes:
– Are the skills and abilities that students should have acquired by the end of a course or program.
– Identify what a student is able to do with the content.
– Begin with an action verb.
– Are measurable.

First Year Experience Learning Outcomes
Upon completion of this program, students will be able to:
– Distinguish between high school and college expectations.
– Develop a plan of educational, transfer, and career goals.
– Weigh career assessment information and apply results to an educational decision-making process.

CAS Learning and Developmental Outcomes (see References)

2. Assessment Instruments
– Measure the skills or abilities students will have when they complete a course or program.
– Lead to good evidence of student learning.
– Provide multiple approaches to evaluation.
– Provide information to improve a course or program.

Direct and Indirect Methods for Assessing Student Learning
Direct Methods:
– Provide evidence in the form of student products or performances.
– Demonstrate that learning has occurred relating to a specific content area or skill.
Examples:
– Rubrics or rating scales
– Student self-reflection
– Ratings/comments from internships
– Portfolios
– “Minute papers”

Direct and Indirect Methods for Assessing Student Learning
Indirect Methods:
– Support findings from direct measures.
– Reveal characteristics associated with student learning, but only imply that learning has occurred.
Examples:
– Completion rates
– Graduation rates
– Department or program review data
– Number of student hours spent on homework

Strategies for Choosing Your Assessment Tools
Strategy #1: The Rubric
– A criterion-based rating scale that can be used to evaluate student performance in almost any area.
– Portrays what knowledge, skills, and behaviors are indicative of various levels of learning or mastery.

Foundations and Skills for Lifelong Learning VALUE Rubric
Criteria: Curiosity, Initiative, Independence, Transfer, Reflection
Rating scale: Excellent (4), Proficient (3), Emerging (2), Poor (1)
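In code, a rubric like this is just a criterion-by-level scoring grid. A minimal sketch, with the criteria and scale taken from the slide and the individual ratings invented, of averaging rubric scores across student artifacts:

```python
CRITERIA = ["Curiosity", "Initiative", "Independence", "Transfer", "Reflection"]
LEVELS = {4: "Excellent", 3: "Proficient", 2: "Emerging", 1: "Poor"}

# Hypothetical ratings: one criterion -> level dict per scored student artifact.
ratings = [
    {"Curiosity": 3, "Initiative": 2, "Independence": 3, "Transfer": 2, "Reflection": 4},
    {"Curiosity": 4, "Initiative": 3, "Independence": 2, "Transfer": 3, "Reflection": 3},
]

for criterion in CRITERIA:
    average = sum(r[criterion] for r in ratings) / len(ratings)
    print(f"{criterion}: average {average:.1f} ({LEVELS[round(average)]})")
```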

Strategies for Choosing Your Assessment Tools
Strategy #2: Student Self-reflection
– Asks students to reflect on what and how they have learned.
– Gives insight into the learning process.

Examples of Student Self-reflection Questions (Middle States, 2007, Student Learning Assessment: Options & Resources)
1. Describe something major that you have learned about yourself in this program.
2. What strategies did you use to learn the material in this course/program? Which were most effective? Why?
3. What one question about this course/program is uppermost on your mind?

Strategies for Choosing Your Assessment Tools
Strategy #3: Internship Ratings/Comments
– The on-site supervisor rates the student on essential knowledge, skills, and attitudes.
– Supervisor comments provide insights into overall strengths and weaknesses of a program.

Please refer to Middle States (2007), Student Learning Assessment: Options & Resources, p. 44, and the attached Figure 7 - Service Learning Rating Scale.pdf.

3. Collecting & Interpreting Your Data
“...the focus should be on direct measures of student learning. Knowing how students perform in the aggregate on structured examinations, written papers, laboratory exercises, and so on within a given course [or program] provides essential information on the extent to which stated learning outcomes are being realized.” (Middaugh, 2010, p. 102)

Student Privacy
– Be compliant with student privacy laws and regulations: omit any identifying student information such as name, address, Social Security number, etc.
– For more information on student privacy procedures, contact your College Registrar.
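One concrete way to follow this guidance before assessment data leaves your office is to strip identifying fields from each record. A minimal sketch, with invented field names and values:

```python
IDENTIFYING_FIELDS = {"name", "address", "ssn", "student_id"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {key: value for key, value in record.items() if key not in IDENTIFYING_FIELDS}

record = {"name": "Jane Doe", "ssn": "000-00-0000", "course": "FYE 101", "rubric_total": 14}
print(deidentify(record))  # {'course': 'FYE 101', 'rubric_total': 14}
```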

Reliable Results
Trend
– Collect results over time to improve the reliability of the results.
– Particularly useful for small student populations (e.g., a course that offers one section of 25 students per term).
Aggregate
– Combine data from multiple collections of results.
– This yields more information to use in making instructional decisions.
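Trending and aggregating both start from the same per-term counts. A minimal sketch, with invented numbers for a 25-student section, of looking at each term's rate and then pooling terms into one more reliable estimate:

```python
# Hypothetical results per term: (students assessed, students meeting the outcome).
terms = {"Fall 2011": (25, 17), "Spring 2012": (24, 19), "Fall 2012": (25, 16)}

# Trend: per-term rates are noisy for small sections, but show direction over time.
for term, (assessed, met) in terms.items():
    print(f"{term}: {met}/{assessed} = {met / assessed:.0%}")

# Aggregate: pool the counts across terms before drawing instructional conclusions.
total_assessed = sum(assessed for assessed, _ in terms.values())
total_met = sum(met for _, met in terms.values())
print(f"Aggregate: {total_met}/{total_assessed} = {total_met / total_assessed:.0%}")
```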

Reliable Results
Samples
– Have representative student samples:
  – Collect student work from multiple sections.
  – Include all possible variables to give you a complete picture: day/time distributions, full- and part-time faculty, delivery locations (Lincroft, Higher Education Centers, etc.), and delivery methods (face-to-face, distance learning, ITV, etc.).
Consistency
– Be consistent in using tools for pre-assessment and post-assessment.
  – Results from the pre-assessment, or Level 3 measure (e.g., test questions, rubric, etc.), provide the baseline/benchmark.
  – Results from the post-assessment, or Level 5 measure, can then be compared to the previous results and yield accurate information that can be used to further improve teaching and learning.
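Because the same instrument is used before and after the change, the comparison itself is simple. A minimal sketch, with invented rubric scores, of comparing the Level 3 baseline to the Level 5 post-change results:

```python
# Scores from the same rubric, collected before and after the instructional change.
pre_scores = [2, 3, 2, 1, 3, 2, 2, 3]   # Level 3 measure: baseline/benchmark
post_scores = [3, 3, 2, 3, 4, 3, 2, 3]  # Level 5 measure: after the change

def mean(scores):
    return sum(scores) / len(scores)

baseline, follow_up = mean(pre_scores), mean(post_scores)
print(f"Baseline {baseline:.2f} -> post-change {follow_up:.2f} "
      f"(change {follow_up - baseline:+.2f} on a 4-point rubric)")
```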

4. Instructional Changes
– The fourth step of the five-level cycle of ongoing assessment of student learning.
– Changes to curriculum and instruction are made based on data.
– The results of data analyses are used to inform decisions.

Successful Level 4 Changes Require:
Analysis
– Sharing and discussion of results
– Analyzing results
– Identifying areas of strength and weakness
Actions
– Decision-making based on those results
– Creating a strategy for improving student learning
“It is pointless simply ‘to do assessment’; the results of assessment activities should come full circle to have a direct impact on teaching and learning...” (Middle States, 2007, p. 59)

Successful Level 4 Changes Require:
Attention to Detail
– Implementing changes
– Flexible planning and processes
Empowerment
– Who are the individuals empowered to make changes?

Successful Level 4 Changes Require: Attention to Detail, Empowerment, Advocacy

Level 4 Examples: Please see the relevant pages of Student Learning Assessment: Options and Resources (Middle States, 2007) and the attached Figure 15 - Level 4 Changes.pdf.

Step 5: Post-Change Data Collection & Analysis: “Closing the Loop”
“Were the changes or decisions I made effective in improving student learning?”

References
Rhodes, Terrel, ed. Assessing Outcomes and Improving Achievement: Tips and Tools for Using Rubrics [AAC&U VALUE Rubrics]. Washington, DC: Association of American Colleges and Universities.
Council for the Advancement of Standards in Higher Education. (2012). CAS Learning and Development Outcomes. Washington, DC: Author.
Council for the Advancement of Standards in Higher Education. (2012). CAS Professional Standards for Higher Education (8th ed.) and CAS Self-Assessment Guide for TRIO and Other Educational Opportunity Programs. Washington, DC: Author.

References (continued)
Middaugh, Michael F. (2010). Planning and Assessment in Higher Education: Demonstrating Institutional Effectiveness. San Francisco, CA: Jossey-Bass.
New Jersey Commission on Higher Education. (1997). Opportunity for a New Millennium: EOF Planning Report. Retrieved online.
Middle States Commission on Higher Education. (2007). Student Learning Assessment: Options and Resources (2nd ed.). Philadelphia, PA: Author.
Walvoord, B. E., & Anderson, V. J. (1998). Effective Grading. San Francisco, CA: Jossey-Bass.