The Development of a Comprehensive Assessment Plan: One Campus’ Experience Bruce White ISECON 2007.


Feedback / accountability
1. "For society to work […] we must be accountable for what we do and what we say." (Betty Dowdell)
2. "No person can succeed unless he or she is held accountable." (Grant Wiggins)
3. "Feedback is the breakfast of champions." (Ken Blanchard)
4. "You need a culture of assessment, not a climate." (Gloria Rogers)

Overview
- Are we teaching what we say we are?
- Are students learning?
- How can we be more effective in our instruction?
So:
- What do we want students to learn?
- Why do we want them to learn it?
- How can we help them to learn it?
- How do we know what they have learned?

Furthermore...
Stakeholders want to see whether we are accomplishing our educational goals. Possible stakeholders:
- Students
- Parents
- Employers
- Board of Regents / state agencies
- Accrediting groups (AACSB, ABET, etc.)
- Faculty
- Alumni

Our campus program
The Information Systems Management program at Quinnipiac University in Hamden, Connecticut began its journey toward a comprehensive assessment program. Prior 'assessment' was informal:
- So, ISM faculty: how do you think we are doing?
- So, Advisory Board: what advice do you have for us?
- So, IS education community: what should we teach (e.g., the IS 2002 model curriculum)?
- So, employers: what do our students need to know (or know better)?
- Etc.

Our desired outcomes:
- Analysis and design of information systems that meet enterprise needs.
- Use of and experience with multiple design methodologies.
- Experience in the use of multiple programming languages.
- Development of hardware, software, and networking skills.
- Understanding of data management.
- Understanding of the role of IS in organizations.

Possible assessment methods
Direct Assessment Methods:
- Simulations
- Behavioral observations
- Performance appraisals
- Locally developed exams
- External examiner
- Portfolios / e-portfolios
- Oral exams
- Standardized exams
(Source: Gloria Rogers, ABET Community Matters, 8-06)

Indirect Assessment Methods:
- Exit and other interviews
- Archival data
- Focus groups
- Written or electronic surveys / questionnaires
- Senior exit surveys
- Alumni surveys
- Employer surveys
Other factors:
- IS model curriculum
- Advisory board
- Alumni surveys

Foundation of Our Assessment Program
- We got interested in the CCER IS Assessment test early on.
- It is a direct assessment test based on the IS 2002 model curriculum.
- It has been thoroughly tested and analyzed, and shown to be valid and reliable.
- Test scores are reported in 37 different areas, relevant to our learning outcomes.
- The test questions are written at the higher levels of Bloom's taxonomy, using scenarios.

More on our assessment process
- We also use a senior exit survey (an indirect measure).
- An advisory board gives input.
- Informal controls:
  - Campus decisions (such as the number of credits allowed, changes in general education)
  - Model curriculum changes
  - Employer input
  - Conferences / technologies

Specific Learning Skills
Skill Set 3.0: Strategic Org. Systems Development (Avg)
3.1 Organizational Systems Development
- Strategic Utilization of Information Technology
- IS Planning
- IT and Org. Systems
- Information Systems Analysis & Design
- Decision Making
- Systems Concepts, Use of IT, Customer Service
- Systems Theory and Quality Concepts

(continued)
3.2 Project Management (Avg)
- Team Leading, Project Goal Setting
- Monitor and Direct Resources and Activities
- Coordinate Life Cycle Scheduling and Planning
- Apply Concepts of Continuous Improvement
- Project Schedule and Tracking

Overall Analysis
Area (Avg):
- Hardware and Software
- Modern Programming Language
- Data Management
- Networking and Telecommunications
- Analysis and Design
- Role of IS in Organizations
- Number taking test
- Overall

Senior Exit Survey
Learning Objective (2006 scores / 2007 scores):
- Systems Analysis (including project management)
- Alternative Design Methodologies
- Programming Languages
- Hardware and Software
- Networking
- Data Management 4.2
- IS in Organizations
- Ethics in IS / IT
- Global aspects of IS / IT

Next Step – setting metrics
- So... we have a solid direct measurement.
- And... we have a good indirect measure.
- Now what?
- We are working on setting metrics (especially for our direct measurement, the CCER IS Assessment test).
- From ABET: "Every student does not have to achieve the desired outcomes, but targets must be defined."

Setting expectations
The faculty have considered the goals of the program and feel our program emphasizes systems analysis, the role of IS in organizations, and data management.

Area                  Expectation
Systems Analysis      80% will score 50 or higher
Programming           60% will score 45 or higher
Data Mgmt             70% will score 50 or higher
Role of IS in Org.    80% will score 50 or higher
Networking            70% will score 45 or higher
Hardware / Software   70% will score 45 or higher
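To make the targets concrete, here is a minimal sketch (not from the original slides) of how expectations like these could be checked each year, assuming per-area score lists parsed from the CCER results; the cohort scores below are hypothetical, and only the thresholds and target percentages come from the table above.

# Sketch: check "X% of students will score Y or higher" per area.
from typing import Dict, List, Tuple

# (minimum score, required fraction of students), copied from the table above
EXPECTATIONS: Dict[str, Tuple[int, float]] = {
    "Systems Analysis":    (50, 0.80),
    "Programming":         (45, 0.60),
    "Data Mgmt":           (50, 0.70),
    "Role of IS in Org.":  (50, 0.80),
    "Networking":          (45, 0.70),
    "Hardware / Software": (45, 0.70),
}

def meets_expectation(scores: List[int], min_score: int, target: float) -> bool:
    """True if the required fraction of students scored at or above min_score."""
    if not scores:
        return False
    reached = sum(1 for s in scores if s >= min_score)
    return reached / len(scores) >= target

# Hypothetical cohort scores keyed by area (in practice, parsed from the
# CCER score report for the graduating class).
cohort: Dict[str, List[int]] = {
    "Systems Analysis": [62, 55, 48, 71, 53],
    "Programming":      [44, 51, 47, 39, 60],
}

for area, (min_score, target) in EXPECTATIONS.items():
    status = meets_expectation(cohort.get(area, []), min_score, target)
    print(f"{area}: expectation {'met' if status else 'NOT met'}")

Run over each graduating cohort, a check like this flags, area by area, whether the stated expectation was met, which feeds directly into the analysis step described on the next slide.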

Now what
- Next year, as students take the CCER IS Assessment test and as we get feedback from our senior survey, we will analyze the data to see whether our outcomes have been reached.
- If they have:
- If they haven't:
  - We analyze why not. Was it poor instruction? Poor students? A poor textbook? Overly optimistic expectations?
  - We change, in an effort to 'constantly and continually improve' our program!