Assessment of Learning: An Interview Analysis

Elena Marcum and Gary Bowling
CUR-528 Assessment of Learning
February 11, 2015
Dr. Teresa Lao

Overview

- Interview questions
- Interviewed 4 educators
- Compare and contrast the purpose, development, implementation, and analysis of assessments and evaluations in their learning environments

Facilitators & Educators (Date & Mode of Interview)

- Joseph Boisley, Instructional System Specialist/Curriculum Developer
  210-808-2009, joseph.boisley.civ@mail.mil
- Christopher Melver, Faculty, University of Phoenix, San Antonio Campus
  210-524-2110, Christopher.Melver@phoenix.edu
- Cindi Bluhm, Faculty, Northwest Vista College
  210-486-4284, cbluhm@Alamo.edu
- Ripsime Bledsoe, Academic Program Specialist, Northwest Vista College
  210-486-4221, rbledsoe6@Alamo.edu

1. What is the purpose of an instructor assessment?

Summary of responses:
1. An instructor assessment is used to determine if the recommended curriculum is being covered.
2. Instructor assessments serve as a tool for determining whether learners understand the material.
3. To assess the environment of the class.
4. To ensure departmental standards are met.

JB: An instructor assessment is used to determine whether the recommended curriculum is being covered. It is also used to determine whether sufficient information is provided to meet the desired objective outcome.

CM: Instructor assessments serve as a tool for determining whether learners understand the material. It is important in both traditional and non-traditional learning environments to measure learning outcomes. The instructor assessment can also serve as a benchmark for the instructor to determine what adjustments might need to be made.

CB: To view the environment of the class: teacher and student interaction, and how students interact with each other. Is the instructor prepared for class? Are students engaged? Is the material appropriate?

RB: To assess whether departmental standards, such as required topics, content, and instructional methods, have been met in the course.

2. How do you gather information on external evaluations?

Summary of responses:
1. External evaluations are surveys sent to the Military Services after service members return to their duty stations and perform duties related to the training we provided.
2. Data from online forums and activity; the end-of-course survey is another resource.
3. College/District Institutional Research.
4. External surveys sent to the local business council to help develop a peer-mentor program.

JB: External evaluations are surveys sent to the Military Services after the service member returns to their duty station and performs duties related to the training we have provided. This is used to collect trend data and make necessary adjustments to the curriculum.

CM: For external evaluations, I use data from online forums and activity. There are also peer evaluations that occasionally take place, which provide me valuable feedback. The end-of-course survey is another resource as well. The more information I can gather, the better, so I can continue to improve.

CB: The college and district Institutional Research (IR) offices administer many outside surveys and studies each year and compile the results.

RB: We sent surveys to our local business partners when we were developing our peer-mentor program. The surveys asked questions such as what traits they see in successful mentors.

3. What do you consider when developing a student survey?

Summary of responses:
1. Student surveys are designed using a Likert scale (for ease) to capture data on instructors, objectives, equipment, training environment, hours, references, and assessments.
2. Surveys must be developed to obtain the data the facilitator requires.
3. They identify the good and the bad about a course.
4. Departmental goals and desired outcomes.

JB: Since our curriculum is student-centered, we attempt to get the students' perspective on all facets of training. Student surveys are designed using a Likert scale (for ease) to capture data on instructors, objectives, equipment, training environment, hours, references, and assessments.

CM: In developing a student survey, I first need to determine the desired goal behind the survey. Surveys must be developed to obtain the data the facilitator requires.

CB: Ask what they liked and did not like, the type of course (was it required?), and what changes they would make.

RB: I would consider the departmental goals and outcomes that we want taught in every class.
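As an illustration of the Likert-scale approach JB describes, here is a minimal Python sketch, not part of the interviews, that summarizes 1-5 ratings per survey item. The item wording and ratings are hypothetical.

    # Minimal sketch: summarizing Likert-scale (1 = strongly disagree,
    # 5 = strongly agree) student survey responses per item.
    # Item wording and ratings below are hypothetical.
    from collections import Counter
    from statistics import mean

    responses = {
        "Instructor was prepared":           [5, 4, 4, 5, 3, 4],
        "Objectives were clearly stated":    [4, 4, 3, 5, 4, 2],
        "Training environment was adequate": [3, 2, 4, 3, 3, 4],
    }

    for item, ratings in responses.items():
        dist = dict(sorted(Counter(ratings).items()))  # rating -> count
        print(f"{item}: mean={mean(ratings):.2f}, distribution={dist}")

Reporting the full distribution alongside the mean keeps polarized responses (many 1s and 5s) from hiding behind an average.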

4. What information do you gain from test analysis?

Summary of responses:
1. Test analysis provides insight on curriculum content, instructor performance, and proficiency levels, as well as question stem, distractor, and correct-choice design.
2. The information allows the educator to determine if desired learning goals are being met.
3. Reliability and validity of test questions.
4. Information on whether the assessment instrument does what it is supposed to do.

JB: Test analysis provides insight on curriculum content, instructor performance, and proficiency levels, as well as question stem, distractor, and correct-choice design.

CM: The information gained from test analysis can vary depending on the course. The main type of information is whether learners are passing the test. While a test is only one form of assessment, it can serve as a great benchmark for both learners and educators alike. The information allows the educator to determine if desired learning goals are being met.

CB: A test analysis can be used to ensure reliability (consistency) of the questions. Additionally, it verifies validity, that is, that each question measures what it is supposed to measure.

RB: It provides information on whether the assessment instrument does what it is supposed to do. Is it measuring what it is intended to measure?
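To make the item-level analysis JB and CB describe concrete, here is a minimal Python sketch, illustrative only, that computes classical item difficulty and a simple top-versus-bottom discrimination index. The 0/1 score matrix is hypothetical.

    # Minimal sketch of a classical item analysis: item difficulty
    # (proportion answering correctly) and a discrimination index that
    # compares top and bottom scorers. The score matrix is hypothetical.
    scores = [  # rows = students, columns = items (1 = correct, 0 = not)
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [1, 1, 1, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0],
        [0, 0, 1, 1],
    ]

    totals = [sum(row) for row in scores]
    order = sorted(range(len(scores)), key=lambda i: totals[i])
    half = len(scores) // 2
    low, high = order[:half], order[-half:]  # bottom and top scorers

    for item in range(len(scores[0])):
        difficulty = sum(row[item] for row in scores) / len(scores)
        # Discrimination: how much better the top group does on this item;
        # values near 0 (or negative) flag stems/distractors worth revising.
        disc = (sum(scores[i][item] for i in high)
                - sum(scores[i][item] for i in low)) / half
        print(f"Item {item + 1}: difficulty={difficulty:.2f}, "
              f"discrimination={disc:+.2f}")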

5. What assessment tool do you use the most, and why?

Summary of responses:
1. Student surveys, because our training environment is student-centered.
2. Performance-based assessments.
3. Common writing assignments.
4. Survey questions, observations, and student feedback.

JB: Student surveys are used the most because our training environment is student-centered. They provide us with a large population of opinions and provide eyes on without a formal evaluation of instructors.

CM: Performance-based assessments are my favorite. It is important to see if learners really understand and know the material they were taught. Assessments are frequently used to determine if the desired learning outcomes are being achieved.

CB: Common writing assignments. I use these for two reasons. First, these writing assignments show higher orders of learning, such as synthesis and analysis. Second, the director of our discipline has prescribed them.

RB: Survey questions, observations, and student feedback.

6. What are the most common write-ups during instructor evaluation?

Summary of responses:
1. The most grossly overlooked component of instruction is transitions.
2. Ensuring that the instructor is connecting with learners.
3. Organization and preparedness.
4. Organization and over-lecturing.

JB: The most grossly overlooked component of instruction is transitions. Instructors seem not to understand the importance of tying previous material to new material. Another area is questioning. Instructors tend to focus on providing a lecture instead of engaging students in the learning process. The problem is that pedagogy (child learning) is quite different from andragogy (adult learning), in that adult students have a stake in and connection to the learning experience.

CM: I would say the most common write-ups concern whether the instructor is connecting with learners. Understanding the types of learners is necessary to engage and teach them effectively. It is important that learners understand the objectives being taught. When teachers improve, learning among the learners should also improve.

CB: Organization and preparedness. Interaction with students (including classroom management). Is the instructor's material appropriate for the course?

RB: Lack of interaction with students; too much lecturing; group work with no organization.

7. How do you use the data in test analysis to make corrections in the learning environment?

Summary of responses:
1. Identify key facts; break down the data.
2. The data obtained from tests should serve as a benchmark for making adjustments in the learning environment.
3. Revising formative and summative assessments.
4. Focusing on methods of instruction, e.g., lecture or group work.

JB: Identify key facts: identify key facts in an array of data; recognize when pertinent facts are incorrect, missing, or require supplementation or verification; distinguish information that is not pertinent to a decision or solution. Break down data: break down data into component parts to understand the nature and relationship of the parts; recognize underlying principles, patterns, or themes in an array of related information, and determine whether additional information would be useful or necessary.

CM: The data obtained from tests should serve as a benchmark for making adjustments in the learning environment. It is important to find ways to engage and provoke learning among all learners. Every class is unique in the backgrounds of its learners, so there is no cookie-cutter approach to learning.

CB: The data gathered are valuable in revising in-course assessments. Additionally, the analysis may also result in curriculum or instructional revisions.

RB: It allows me to verify that the method of instruction is appropriate, or the best method, for the material.
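As one example of turning test data into the kind of revision benchmark CM and CB describe, here is a minimal Python sketch that computes Cronbach's alpha, a standard internal-consistency statistic, before deciding whether an assessment needs revision. The 0/1 item scores are hypothetical.

    # Minimal sketch: Cronbach's alpha as an internal-consistency
    # benchmark when deciding whether an assessment needs revision.
    # The 0/1 item scores below are hypothetical.
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    def cronbach_alpha(items):
        """items: one list of scores per test item, same students in order."""
        k = len(items)
        n = len(items[0])
        total_per_student = [sum(item[j] for item in items) for j in range(n)]
        return (k / (k - 1)) * (1 - sum(variance(i) for i in items)
                                / variance(total_per_student))

    items = [               # rows = items, columns = students
        [1, 1, 0, 1, 1, 0],
        [1, 0, 0, 1, 1, 0],
        [1, 1, 0, 1, 0, 0],
    ]
    print(f"alpha = {cronbach_alpha(items):.2f}")
    # Values below roughly 0.7 are commonly read as a cue to revisit items.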

8. What is the purpose of a curriculum validation?

Summary of responses:
1. A curriculum validation simply "proofs" the curriculum.
2. Curriculum validation is a way to periodically ensure that the curriculum is relevant and effective.
3. It confirms that the curriculum meets local and state requirements.
4. It ensures the agreed-upon (or directed) learning outcomes are being met.

JB: A curriculum validation simply "proofs" the curriculum. When curriculum is designed, we are only assuming it will work. Validation allows us to make corrections to all course curriculum and provides a report that is not opinion-based.

CM: Curriculum validation is a way to periodically ensure that the curriculum is relevant and effective. The curriculum should align with the desired outcomes so that learners can apply their learning; validation checks the fit between theory and practice in the individual courses and the overall curriculum. Preparing adult learners to apply and utilize the knowledge they obtain should always be the ultimate goal.

CB: It confirms that the curriculum meets the requirements of the course description, either in the Alamo Colleges course inventory or, more importantly, the course description and learning outcomes established by the Texas Higher Education Coordinating Board.

RB: Our broad learning outcomes come from the state. It is important that our local learning outcomes support the state's. A validation ensures that both our outcomes and the state's are being satisfied.

9. What controls do you put in place during implementation to ensure success of new training?

Summary of responses:
1. Develop an implementation plan so all instructors know what to monitor and what information to collect.
2. Do temperature checks along the way to ensure that things are working as planned.
3. Sound learning outcomes and an assessment plan.
4. Identify key points during the semester and focus assessment on those points.

JB: Ensure all curriculum material is being used. Develop an implementation plan so all instructors are aware of what to monitor and what information to collect.

CM: When it comes to implementation, I do temperature checks along the way to ensure that things are working as planned. For new training to succeed, it is important to analyze things along the way and make adjustments as necessary.

CB: The key to successful training starts with developing sound learning outcomes and creating instructional methods and materials that support those outcomes. Second, a valid assessment plan is paramount to the success of any program.

RB: We identify key points during the semester and focus on those. One of the key points is when the students complete their educational goal plan. We (the staff) will assist the instructors if requested.

10. What factors do you use when choosing participants for a needs assessment?

Summary of responses:
1. Get a complete perspective from people doing the job at every level: journeyman, technician, and leader.
2. Randomly select participants to get unbiased results from the assessment.
3. Do they have a vested interest? For students, are they a representative sample?
4. Random selection, with a population large enough to yield a good sample.

JB: We try to get a complete perspective from people doing the job at every level: journeyman, technician, and leader.

CM: I like to randomly select participants for needs assessments in order to get unbiased results.

CB: Participants should be selected who have a vested interest in the program, such as faculty, staff, administrators, and curriculum developers. Students who are selected should be a representative and valid sample of the overall student population.

RB: Random selection of students within the overall population. Work with the IR department to identify numbers and the process.
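To illustrate the random-but-representative selection the interviewees describe, here is a minimal Python sketch of a stratified random draw, so that each level JB mentions is represented. The strata, member names, and sample sizes are hypothetical.

    # Minimal sketch: a stratified random draw so each job level is
    # represented, echoing "journeyman, technician, and leader".
    # Strata, member names, and sizes below are hypothetical.
    import random

    population = {
        "journeyman": [f"journeyman_{i}" for i in range(60)],
        "technician": [f"technician_{i}" for i in range(30)],
        "leader":     [f"leader_{i}" for i in range(10)],
    }

    random.seed(7)  # fixed seed so the draw is repeatable
    sample = {
        level: random.sample(members, k=max(1, len(members) // 10))
        for level, members in population.items()
    }
    print(sample)  # roughly 10% from each stratum, at least one per level

Stratifying before sampling keeps small groups (here, leaders) from being missed entirely, which a simple random draw over the whole population could easily do.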

Conclusion

- Interview questions
- Interviewed 4 educators
- Compared and contrasted the purpose, development, implementation, and analysis of assessments and evaluations in their learning environments