How to Use Data to Improve Student Learning: Training Conducted at Campus-Based SLO Summit, Spring 2014

Presentation transcript:

How to Use Data to Improve Student Learning
Training Conducted at Campus-Based SLO Summit, Spring 2014

Participants' Learning Outcomes
At the end of this training, participants will be able to:
- Interpret SLO assessment data;
- Make recommendations based on the interpretations.

We Have Collected SLO Assessment Data. Now What?
Once we have collected data, we tend to:
1. Make no recommendations;
2. Make recommendations that are too broad to be implemented;
3. Make recommendations but fail to implement them.
For example: "We have met the outcome, so no recommendations for improvement at this time."

Using assessment data involves:
- Interpreting the Data
- Making Recommendations (future tense)
- Implementing Recommendations
- Reporting on Implementation (past tense)
Data are meant to be used!
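For programs that track their assessment work electronically, one informal way to keep all four steps together is a simple record per outcome per cycle. The sketch below is a minimal illustration in Python; the field names are assumptions for illustration, not part of any prescribed reporting format, and the idea is simply that the loop counts as closed only once recommendations exist and an implementation report has been written.

from dataclasses import dataclass, field

@dataclass
class AssessmentCycle:
    outcome: str                     # the SLO/PLO being assessed
    target: str                      # e.g. "70% of enrolled students will pass the final exam"
    actual_result: str               # what the collected data showed
    interpretation: str = ""         # why the target was or was not met
    recommendations: list = field(default_factory=list)  # written in future tense
    implementation_report: str = ""  # written in past tense, the following cycle

    def loop_closed(self) -> bool:
        # Data are "used" only when recommendations were made AND reported on.
        return bool(self.recommendations) and bool(self.implementation_report)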

Interpreting Data
1. Explain what factors contributed to the results.
2. Address why the target was or was not met.

Example 1
PLO: Identify and describe the use of various ceramic materials, production methods, and firing processes. (Art)
Target Outcome: 70% of students enrolled will pass the final exam for the course.
Actual Results: 85% of the students in the course passed the final exam, exceeding the target.
Interpretation of Data: Further analysis indicated that 15% of the students (16 students) did not meet the outcome, and of these 16 students, 10 did not describe the firing processes adequately. This raised questions about the teaching and assessment methods: Is a verbal explanation of the firing processes sufficient? What is the best way to test students' mastery of the firing processes?
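For readers who want the arithmetic behind this interpretation spelled out, here is a minimal sketch. The cohort size of 107 is an assumption inferred from "15% of the students (16 students)"; the counts are illustrative, not the actual course data.

# Hypothetical counts reconstructed from Example 1; only the percentages come from the slide.
enrolled = 107        # assumed: 16 students is roughly 15% of 107
passed_final = 91     # roughly 85% of 107
target_rate = 0.70    # "70% of students enrolled will pass the final exam"

actual_rate = passed_final / enrolled
print(f"Pass rate: {actual_rate:.0%} (target: {target_rate:.0%})")
print("Target met" if actual_rate >= target_rate else "Target not met")

# The drill-down that drives the interpretation: of the students who missed
# the outcome, how many fell short on one specific sub-skill (firing processes)?
missed_outcome = enrolled - passed_final   # 16 students
weak_on_firing = 10                        # from the instructors' review of exam responses
print(f"{weak_on_firing} of {missed_outcome} unsuccessful students "
      f"did not describe the firing processes adequately")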

Making Recommendations Based on Interpretation
Interpretation of Data: Further analysis indicated that 15% of the students (16 students) did not meet the outcome, and of these 16 students, 10 did not describe the firing processes adequately. This raised questions about the teaching and assessment methods: Is a verbal explanation of the firing processes sufficient? What is the best way to test students' mastery of the firing processes?
Recommendations:
1. Use visual aids to explain the firing processes;
2. Use video to demonstrate the firing processes;
3. Test students' mastery of the firing processes by having students produce and display ceramic artwork that applies those processes.

Interpret the Data: Example 2
PLO: Gather information and formulate conclusions regarding the client's needs and priorities to develop a client-centered intervention plan. (Occupational Therapy Assistant)
Target Outcome: 80% of students should score a 3, and 100% should score a 2 or above (on a scale of 1-3).
Actual Results: 54% of the students scored a 3 (Excellent) and 100% of the students scored a 2 (Satisfactory) or above. The target was partially met.
Interpretation of Data: Faculty discussed the results and found that students who did not score a 3 failed to demonstrate critical thinking skills when designing the intervention plan, which prompted the need to implement critical thinking activities in OTHA2302.
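As a rough illustration of how a two-part target like this can be checked, here is a minimal sketch with made-up rubric scores; the real class size and score distribution are not given on the slide, so the numbers below are chosen only to mirror the reported percentages.

# Hypothetical rubric scores on a 1-3 scale; chosen so that about 54% score a 3
# and 100% score a 2 or above, mirroring the actual results in Example 2.
scores = [3, 3, 3, 3, 3, 3, 3, 2, 2, 2, 2, 2, 2]

pct_excellent    = sum(1 for s in scores if s == 3) / len(scores)
pct_satisfactory = sum(1 for s in scores if s >= 2) / len(scores)
target_excellent, target_satisfactory = 0.80, 1.00

print(f"Scored 3: {pct_excellent:.0%} (target {target_excellent:.0%})")
print(f"Scored 2 or above: {pct_satisfactory:.0%} (target {target_satisfactory:.0%})")

if pct_excellent >= target_excellent and pct_satisfactory >= target_satisfactory:
    print("Target met")
elif pct_satisfactory >= target_satisfactory:
    print("Target partially met")   # Example 2: 54% at 3, 100% at 2 or above
else:
    print("Target not met")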

Making Recommendations Based on Data
Interpretation of Data: Faculty discussed the results and found that students who did not score a 3 failed to demonstrate critical thinking skills when designing the intervention plan, which prompted the need to implement critical thinking activities in OTHA2302.
Recommendations: Faculty will design critical thinking activities for students and incorporate these activities in OTHA2302.

Types of Recommendations

Instructional Strategies:
(1) Adjust course delivery to teach the course using a topics-based approach and make it more engaging by incorporating active learning strategies and case studies to increase retention of concepts.
(2) We would like to incorporate more hands-on activities in lab to reinforce learning.

Curriculum Changes:
(1) Add more relevance and pertinence to the biology non-majors' curriculum so that non-majors can use the basic concepts in biology to better understand social issues related to the environment and public health.
(2) We would like to add a research paper to the course, showing uses of the information in real life.
(3) Require a uniform amount of time to teach the learning outcome (e.g., 45 minutes).

Types of Recommendations (continued)

Student Support:
(1) Continue to advise and motivate students: encourage attendance and completion of coursework.
(2) Offer tutoring during office hours.

Teacher Development:
(1) We would like to provide professional development for faculty on collaborative learning.
(2) Implement a faculty mentoring system.

Assessment Method:
(1) Using one capstone project to evaluate all PLOs may not be adequate. We will add simulation questions that mimic the state exams to assess each PLO.
(2) We will revise the grading rubric to target more discrete skills so that we can identify which areas are most challenging for students.

Implement Recommendations
- Recommendations made based on data interpretation need to be implemented during the subsequent assessment cycle.
- Ways to implement recommendations include the following:
  A. Make the recommendation a departmental requirement;
  B. Form a taskforce to complete the recommended project;
  C. Implement the recommendation in the classroom;
  D. Other activities.

Report on Implementation
- Report on the implementation by specifying what recommendation was implemented, in what way, and by when;
- Use past tense.

Example 1 (Departmental Activity and Chair's Requirement): The recommendation, "Provide professional development to adjunct faculty with regard to the course content standards set by the department and require them to abide by the standards," was implemented by the chair, who organized the professional development event and also communicated the requirements to the adjunct faculty in Fall.

Report on Implementation
Example 2 (Faculty Classroom Activity): The recommendation, "Faculty will design student critical thinking activities and incorporate these activities in OTHA2302," was implemented by the faculty, who designed and implemented the critical thinking activities in OTHA2302 in Fall 2013.

Report on Implementation
Example 3 (Taskforce Activity): A taskforce of full-time and part-time faculty representatives was formed to address the recommendation, "Revise the grading rubric to target more discrete skills so that we can identify which area is most challenging to students." The revised rubric was adopted in Fall. Using the revised rubric, the faculty identified the "translation process" as the most challenging area for students, so instruction was adjusted to focus more on teaching the "translation process."

Participants' Activity
Based on the following data interpretation scenario, please make recommendations and report on the implementation.
Scenario 1: The accounting faculty at a campus found that some aspects of the accounting cycle proved difficult for students. Faculty believed that a lack of critical thinking skills might have contributed to the difficulty in understanding accounting cycle concepts.
Recommendation:
Report on Implementation (at the end of the subsequent year):

Participants' Activity
Based on the following data interpretation scenario, please make recommendations and report on the implementation.
Scenario 2: The math faculty found that the students who failed the exam were online students who had not completed all the practice exercises.
Recommendation:
Report on Implementation (at the end of the subsequent year):

Thank you!