1 Core Assessment: Challenges Facing New Programs
TAIR Professional Development Workshop, September 2004
Dr. Dena Pastor, Center for Assessment and Research Studies
2 What is Assessment?
► Conceptualize the core curriculum: What is our intention? How will it be structured?
► Create learning goals and objectives for students
► Decide upon assessment methods
► Collect and analyze data
► Interpret the data: what does this mean?
► Use what was learned from the assessment process to make improvements to the core curriculum
3 Core Curriculum Assessment in Virginia
► 1987: The State Council of Higher Education in Virginia (SCHEV) mandated that public institutions define learning objectives and create an assessment scheme
► A 1999 SCHEV report notes: “Many institutions struggled in their initial efforts to respond.”
4 Why is Assessment of the Core Curriculum So Challenging?
► There are many questions that an institution must answer: Why assess? What to assess? When to assess? Who to assess? How to assess?
► “How to assess?” breaks down further: 1. Stakes of assessment 2. How to collect the assessment data 3. Assessment method 4. Reporting of results 5. Who’s going to oversee core curriculum assessment?
5 Why assess?
► “The state told me to.”
► “To improve the quality of my core curriculum program.”
It is a good idea to keep in mind that the state may want such information for the purposes of resource allocation, but it is a better idea to keep in mind that assessment can bring about improvement in the curriculum, which in turn may lead to a higher-quality education for students.
6 Assessment Tone
► Assessment is often met with resistance by faculty and/or administrators: make them want to do it, to overcome the feeling of “I have to do it,” and create an “assessment culture” at the university with a positive attitude
► Remind them that they have more control over the assessment process than teachers/administrators in K-12, and that they can use the assessment process to provide themselves with the kind of information they will find useful
7 Institutions’ Orientations to Student Assessment
► Proactive: Institutions engaging in student assessment prior to a state mandate
► Retain a sense of control over the implementation and utilization of assessment
► Internal improvement remains the primary goal of assessment; external accountability is secondary
Based on 1997 information from the National Center for Postsecondary Improvement
8 Institutions’ Orientations to Student Assessment
► Responsive: Institutions engaging in student assessment as a result of a state mandate
► Decisions regarding implementation and utilization of assessment are shared by internal and external constituents
► Internal improvement and external accountability are the dual goals of assessment
Based on 1997 information from the National Center for Postsecondary Improvement
9 Institutions’ Orientations to Student Assessment
► Reactive: Institutions engaging in minimal student assessment as a result of a state mandate
► External accountability remains the primary goal of assessment; internal improvement is secondary or non-existent
► Institutions’ orientation may be predictive of their support for assessment and their utilization of assessment information
Based on 1997 information from the National Center for Postsecondary Improvement
10 State-Level Student Assessment Initiatives
► Research says that state-level initiatives that promote greater institutional support for and engagement in student assessment are those that:
emphasize institutional improvement as their primary purpose, not external accountability
set broad guidelines for student assessment, but leave the specifics up to the institution
Based on 1997 information from the National Center for Postsecondary Improvement
11 What to assess?
► What to assess should be directly tied to the learning objectives for the core curriculum
► Learning objectives should answer the question: What should a student know and be able to do as a result of completing their core curriculum coursework?
► Measuring the number of faculty with Ph.D.s, the faculty-to-student ratio, the number of students completing the course, etc. will not help you understand whether students are meeting the learning objectives; you need to measure the skills, knowledge, behavior, or values of the students
12 When to assess?
► Preferable to measure a single student on multiple occasions using the same assessment method (more is better!)
► Preferable to have a pre-measure of knowledge, skills, and behavior to serve as a baseline: What do the students know coming in? What kind and amount of knowledge do they gain?
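A minimal sketch of how pre/post gains might be summarized, assuming hypothetical matched pre- and post-test scores (the values are illustrative, not from the presentation):

```python
# Minimal sketch (hypothetical data): summarizing pre/post scores for the same
# students measured twice with the same assessment method.
from statistics import mean, stdev

# Matched pre- and post-test scores for the same students (illustrative values)
pre_scores  = [42, 55, 61, 48, 70, 53, 66, 59]
post_scores = [50, 62, 64, 55, 78, 60, 71, 67]

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

print(f"Baseline (pre) mean: {mean(pre_scores):.1f} (SD {stdev(pre_scores):.1f})")
print(f"Post mean:           {mean(post_scores):.1f} (SD {stdev(post_scores):.1f})")
print(f"Average gain:        {mean(gains):.1f}")
```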
13 Who to assess?
► The unit of analysis should be congruent with the purpose of assessment: Are you using assessment to hold students accountable for knowledge/skills? (Assess the student.) Are you using assessment to evaluate a curriculum? (Aggregate student data.)
► Data are collected at the student level, but results are often only reported at the program level
► If it is helpful, results might be reported by course, semester, or subgroup (e.g., males vs. females)
► Not advisable to report results by faculty member
14 Who to assess?
► Collecting the data: preferable to assess all students; if that is not feasible, assess a random sample (the more the better)
► Could ask for volunteers (good luck!), but your sample will be biased
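A minimal sketch of drawing a simple random sample when assessing every student is not feasible; the roster and sample size below are hypothetical:

```python
# Minimal sketch (hypothetical roster): simple random sampling of students to assess.
import random

roster = [f"student_{i:04d}" for i in range(1, 2001)]  # hypothetical roster of 2,000 students

random.seed(2004)          # fixed seed so the sample can be reproduced and documented
sample_size = 400          # "the more the better" -- as large as resources allow
sample = random.sample(roster, sample_size)

print(f"Sampled {len(sample)} of {len(roster)} students, e.g. {sample[:3]}")
```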
15 How to assess?
1. Should stakes be associated with a student’s performance on the assessment?
2. How, logistically, are we going to collect the assessment data?
3. What kind of method are we going to use for assessment?
4. How should the results of assessment be reported, and to whom should they be reported?
5. Who’s going to oversee core curriculum assessment?
16 High Stakes for the Student: must pass in order to graduate, register, fulfill course requirements, etc.
PROS
► Students will be more likely to take the assessment seriously, to prepare, and to put forth maximal effort.
CONS
► Security of the assessment materials is now of concern (may need more proctors, multiple test forms, secure databases).
► How to handle students who don’t pass? Should multiple attempts be allowed?
► Legal concerns: What about students, or parents of students, who want to contest a score? Are the instruments of sufficient quality to bear the weight of the consequences?
17 Stakes & Examinee Motivation (from high stakes / high examinee motivation down to no stakes / questionable examinee motivation)
► Requiring students to pass in order to graduate, register, or fulfill course requirements
► Reporting students’ assessment scores on their transcripts
► Reporting assessment scores to the students themselves via e-mail/e-campus
► Not reporting the assessment scores to the students at all
18 Collecting the Data
► Course-Embedded Assessment: e.g., a common final across sections, a common portfolio of students’ work
► Computer-Based Test: e.g., a test administered in a computer lab or over the web; students have a window of time in which to take the exam
► Assessment Days: particular days set aside for assessment purposes
19 Method of Assessment
► Commercially-available test (standardized test)
Do a backwards translation to ensure that the content of the test matches up well with the learning objectives
Could be costly
► Faculty-created test (locally-developed)
Time-consuming and challenging for faculty
May better ensure a match between test content and learning objectives
Faculty may feel “ownership” over the test and may be more committed to revisions and enthusiastic about results
Can use the test-development process as a professional development opportunity for faculty (perhaps even pay them or give them release time to participate)
If the test is high-stakes for students, multiple forms may be necessary, requiring more item-writing from faculty
20 Arguments for Locally-Developed Instruments
► Research supports the use of locally-developed instruments over state-mandated standardized tests: they are more helpful for institutional improvement and increase institutional support for assessment and institutional ownership over assessment
► Problem: How to compare institutions to one another? But is that the goal? It may be better to compare each institution to its own objectives, previous performance, or standards
21 Setting Standards
► May want to use standard-setting procedures to establish “cut-offs” for proficiency on the test
► Could be that students are gaining knowledge/skills over time, but are they gaining enough?
[Chart: student scores over time compared against the standard set by faculty]
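The slide does not name a specific standard-setting procedure; one common choice is the Angoff method, sketched below with hypothetical faculty ratings:

```python
# Minimal sketch of an Angoff-style standard-setting procedure (one common way to
# establish a cut-off; the presentation does not prescribe a method). Each faculty
# judge estimates the probability that a minimally proficient student would answer
# each item correctly; the cut score is the sum of the average item ratings.
# All ratings below are hypothetical.
from statistics import mean

# ratings[judge][item] = estimated probability of a correct answer
ratings = [
    [0.70, 0.55, 0.80, 0.60, 0.45],   # judge 1
    [0.65, 0.60, 0.85, 0.55, 0.50],   # judge 2
    [0.75, 0.50, 0.80, 0.65, 0.40],   # judge 3
]

item_means = [mean(judge[i] for judge in ratings) for i in range(len(ratings[0]))]
cut_score = sum(item_means)   # expected raw score of a minimally proficient student

print(f"Cut score on this 5-item test: {cut_score:.2f} items correct")
```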
22 Method of Assessment
► Performance Assessment: a student portfolio of work (e.g., a collection of essays) rated by faculty according to a rubric, or an actual performance by the student (e.g., a speech) rated by faculty according to a rubric
► Pros & Cons: may be more authentic, but with a large number of students it may be costly in terms of faculty time/resources
► Perhaps use a random sample
► Preferable to use multiple methods
► To supplement a performance assessment or test: surveys of faculty/students, focus groups; these may be best when wanting to measure attitudes, values, and behavior, not gains in knowledge or skills
23 Student Surveys
► May ask students if they feel that their core curriculum courses helped them master the learning goals and objectives (they may feel that they did, but wouldn’t you rather know if they did?)
► Or may have students self-reflect on their learning and development; this may be a very useful exercise that really involves the student in assessment, but self-reflections are NOT hard, unbiased data
► Surveys are best used as supplementary material
24 In Defense of Multiple-Choice Tests
► Can create multiple-choice items that are cognitively complex (challenging, but do-able!)
► Easy to score
► Can use item analysis to improve items
► Most practical with a large number of examinees
► Direct measure of student learning
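A minimal sketch of the kind of classical item analysis the slide mentions, computing item difficulty and discrimination from a hypothetical 0/1 response matrix:

```python
# Minimal sketch of a classical item analysis for multiple-choice data:
# item difficulty (proportion correct) and item discrimination (point-biserial
# correlation between the item and the total score). The responses are hypothetical.
from statistics import mean, pstdev

# responses[student][item]: 1 = correct, 0 = incorrect
responses = [
    [1, 1, 0, 1, 0],
    [1, 0, 0, 1, 1],
    [0, 1, 1, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
]
totals = [sum(row) for row in responses]

def point_biserial(item_scores, total_scores):
    """Correlation between a 0/1 item and the total score."""
    mi, mt = mean(item_scores), mean(total_scores)
    si, st = pstdev(item_scores), pstdev(total_scores)
    cov = mean((x - mi) * (y - mt) for x, y in zip(item_scores, total_scores))
    return cov / (si * st) if si and st else float("nan")

for j in range(len(responses[0])):
    item = [row[j] for row in responses]
    difficulty = mean(item)                      # proportion answering correctly
    discrimination = point_biserial(item, totals)
    print(f"Item {j + 1}: difficulty = {difficulty:.2f}, discrimination = {discrimination:.2f}")
```

Items with very low or negative discrimination are natural candidates for the revision process the slide alludes to.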
25 Example item: Discriminate between association and causation, and identify the types of evidence used to establish causation. For the following pairs of events, determine whether: a. one event caused the other; b. the two events are related, but no causation can be determined; c. the two events are unrelated. 1. A man goes to use his telephone, only to find it dead, and: the lights flicker and go out.
26 Reporting Results
► Results should be reported and discussed by all stakeholders (e.g., faculty, students, administration, state higher education council)
► Keep it simple: descriptive statistics/tables/graphs over inferential statistics
► Keep in mind the reliability/validity of the assessment tool, its match to the learning objectives, the timing of the assessment, and the characteristics of the sample (generalizability and size)
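A minimal sketch of the simple descriptive reporting the slide recommends (counts, means, and standard deviations by subgroup rather than inferential tests); the scores and subgroup labels are hypothetical:

```python
# Minimal sketch (hypothetical data): descriptive reporting of assessment scores
# by subgroup -- the kind of simple table recommended over inferential statistics.
from collections import defaultdict
from statistics import mean, stdev

records = [
    ("first-year", 58), ("first-year", 63), ("first-year", 55), ("first-year", 61),
    ("senior",     72), ("senior",     69), ("senior",     75), ("senior",     70),
]

by_group = defaultdict(list)
for group, score in records:
    by_group[group].append(score)

print(f"{'Group':<12}{'N':>4}{'Mean':>8}{'SD':>8}")
for group, scores in by_group.items():
    print(f"{group:<12}{len(scores):>4}{mean(scores):>8.1f}{stdev(scores):>8.1f}")
```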
27 Link Results to Improvement (example)
Assessment technique or process: Writing portfolios scored
Findings/results: Many portfolios contained materials that faculty did not think were appropriate for the course; assessment data indicated that such portfolios were rated more harshly
Use made of findings: Instructors of the course met to discuss which assignments were and were not appropriate for the course; ideas were shared and guidelines for assignments were developed
Evidence of effect of improvement: Writing portfolio scores increased over the next three years, and fewer portfolios included inappropriate materials
28 Ways to Use Results
1. Continuous improvement
2. Program review: self-studies and peer review
3. Planning and budgeting: resource allocation, needs assessments
4. Teaching and learning: changing teaching methods, faculty distribution, curriculum, etc.
5. Improving assessment: revise the process for the future
29 VA Assessment Reporting Guidelines (any scientist would require the same)
► Presented in quantitative and non-anecdotal form
► Evidence of reliability
► Evidence of validity
► Description of the data collection context (sampling procedures, etc.)
► Evidence of the use of assessment results
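The guidelines ask for evidence of reliability; one common form is internal consistency (Cronbach's alpha), sketched below from a hypothetical item-by-student response matrix:

```python
# Minimal sketch (hypothetical data): Cronbach's alpha as one common piece of
# reliability evidence, computed from a 0/1 item-by-student score matrix.
from statistics import pvariance

responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 1, 1, 1, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
]

k = len(responses[0])                                  # number of items
item_variances = [pvariance([row[j] for row in responses]) for j in range(k)]
total_variance = pvariance([sum(row) for row in responses])

alpha = (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
print(f"Cronbach's alpha for this {k}-item form: {alpha:.2f}")
```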
30 Who’s going to oversee core curriculum assessment?
Responsibility of some or all of the following:
► Core curriculum committee, with assessment sub-committees
► Separate committees for different areas, consisting of faculty from across the different departments that may be teaching in that area
► Academic Affairs
► Student Affairs
► Institutional Research
► Separate Office of Assessment (at the very least, a Director or Coordinator of Assessment)
31 State Resources for Institutions’ Assessment Efforts
► Budget lines to fund assessment activities
► Permission to charge student fees for assessment activities
► Offering grants or technical assistance
► If the state offers money for assessment, take it!
32 Institutions’ Assessment Support Strategies
► Have an explicit and visible student assessment plan
► Use a series of incremental planning steps
► Examine assessment practices at other institutions
► Pilot assessment strategies
► Encourage broad participation in the planning process, particularly by faculty and administrators
Based on 1997 information from the National Center for Postsecondary Improvement