Computer Science Department Middle States Assessment
- Computer Science has four programs (minor, bachelor's, master's, and doctorate) and therefore four assessment plans in place, one for each program
- Assessment of some learning outcomes in each program is scheduled every year
- Some assessments in each program were scheduled to be performed based on classes from Spring 2006
- Most learning outcomes are assessed on a three-year rotation, but the more statistical ones are assessed yearly

Spring 2006 Assessments Scheduled & Performed
- Four assessments for the undergraduate program:
  - Programming Skills
  - Mathematical and Analytical Reasoning
  - Project Management and Large-Scale Programming Skills
  - Research, Writing, and Presentation Skills
- Four assessments for the graduate program:
  - Project Development
  - Peer-Reviewed Publication at 3 Years
  - Peer-Reviewed Publication at Graduation
  - Presentation at a Conference at Graduation
- A small committee was created by the department chair to perform each of these scheduled assessments

Process for Each Assessment (April – December 2006)
- Discussed the list of scheduled assessments for the current semester
- Created a committee for each needed assessment; each committee was asked to report back at the beginning of the following fall semester
- Set a chair for each committee
- Contacted each committee with the assessment description
- Informed each committee about methods of assessment
- Followed up with each committee to give additional guidance and answer questions
- Reports filed and consolidated

Considerations when Selecting Committees
- Faculty members not directly associated with that semester of the course, but somehow connected to the course in general:
  - Previously taught that course
  - Taught a similar course at a different level
  - Teaches the course that follows it in the sequence
- Mix of faculty members from different backgrounds, with teams maximizing these differences
- Maximize involvement of the faculty members of the department

Undergraduate Program Assessment: Programming Skills
Committee: Chau-Wen Tseng and Nelson Padua-Perez
- Used projects from CMSC 131 (Computer Science I); this assessment will rotate through the introductory programming sequence in subsequent years
- Looked at two projects, "Company Database" and "Shape Decorator"
- Looked at the project descriptions, six student implementations, and supporting course materials
- Determined that students are able to proficiently use the Java constructs required for projects of moderate size ( lines of code)
- Suggestions for course improvement: the projects should deemphasize string input/output and its formatting details, and projects should be more open-ended
- Suggestions for assessment improvement: a larger sampling of student projects and more specific criteria for what is needed would give more feedback on course content

Undergraduate Program Assessment: Mathematical and Analytical Reasoning
Committee: Bill Gasarch and Evan Golub
- Used the final exam from CMSC 250 (Discrete Structures)
- Looked at one final exam question whose content is very important for subsequent courses
- Reviewed 20 exam papers chosen at random in such a way as to represent the proportionate number of students who received A's, B's, and C's as final course grades
- Created their own grading criteria, separate from those used by the instructional staff to grade the question
- Determined that 15 papers were Excellent or Very Good on this one question, 1 was Moderate, and 4 were Poor; 75% were at least Very Good
- Suggestions for course improvement: none given
- Suggestions for assessment improvement: a larger sampling of questions (two questions different in nature instead of one) and inclusion of students who did not successfully complete the course
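As an aside (not part of the committee's report), the grade-proportional sampling described above can be sketched in a few lines. This is a minimal illustration only, assuming a hypothetical roster of exam papers keyed by final course grade; none of the names below come from the committee's actual tooling:

```java
import java.util.*;

// Minimal sketch of grade-proportional random sampling, in the spirit of
// the committee's selection of 20 exams reflecting the A/B/C grade mix.
// The roster structure and class name are hypothetical.
public class ProportionalSample {
    public static List<String> draw(Map<String, List<String>> papersByGrade, int target) {
        int classSize = papersByGrade.values().stream().mapToInt(List::size).sum();
        List<String> sample = new ArrayList<>();
        Random rng = new Random();
        for (List<String> pool : papersByGrade.values()) {
            // Give each grade band a share of the sample proportional to its size.
            int share = Math.round((float) target * pool.size() / classSize);
            List<String> shuffled = new ArrayList<>(pool);
            Collections.shuffle(shuffled, rng);
            sample.addAll(shuffled.subList(0, Math.min(share, shuffled.size())));
        }
        return sample;
    }
}
```

Because the per-band shares are rounded, they may not sum exactly to the target, so a sample drawn this way can be a paper short or long and may need a small hand adjustment.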

Undergraduate Program Assessment: Project Management and Large-Scale Programming Skills
Committee: Pete Keleher and Udaya Shankar
- Used a project from CMSC 412 (Operating Systems)
- Looked at one stage of development of a multi-part project
- Reviewed the project description and three student implementations
- Used the criteria of clear and well-documented code, well-designed functions, and evidence of good debugging practice
- Determined that two of the three implementations did well on all three criteria; the third was not well documented and showed less sophisticated debugging techniques
- Suggestions for course improvement: none given
- Suggestions for assessment improvement: a larger sampling of students, possibly with more specific criteria, since the student implementations are so large

Undergraduate Program Assessment: Research, Writing, and Presentation Skills
Committee: Bill Gasarch and Don Perlis
- Used papers submitted for the CMSC Honors Program
- Evaluated six papers submitted for Spring 2006 graduation
- Used the criteria of originality, significance, and presentation
- Created a 0-3 scale for each criterion, graded independently, and added their scores; they then derived a threshold for "excellent" of at least one 5 and one 4, with possibly one 3, across the three criteria
- Determined that all projects met the criterion of excellent on this scale
- Suggestions for course improvement: none given
- Suggestions for assessment improvement: possibly extend this assessment to the writing and research of non-honors students, to determine the learning outcome for a larger population
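To make the decision rule concrete, here is a minimal sketch, under the assumption that each criterion's score is the two graders' independent 0-3 marks added together (so 0-6 per criterion); the class and method names are hypothetical, not from the committee's report:

```java
import java.util.Arrays;

// Illustrative sketch of the honors-paper rule described above.
// Each entry in `scores` is one criterion's combined score (two graders'
// 0-3 marks added, so 0-6): originality, significance, presentation.
public class HonorsRubric {
    static boolean isExcellent(int[] scores) {
        int[] sorted = scores.clone();
        Arrays.sort(sorted); // ascending: [lowest, middle, highest]
        // "At least one 5, one 4, with possibly one 3" reads as:
        // highest >= 5, middle >= 4, lowest >= 3.
        return sorted[2] >= 5 && sorted[1] >= 4 && sorted[0] >= 3;
    }

    public static void main(String[] args) {
        System.out.println(isExcellent(new int[]{5, 4, 3})); // true
        System.out.println(isExcellent(new int[]{5, 3, 3})); // false
    }
}
```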

Graduate Program Assessment: Project Development
Committee: James Reggia
- Used a required project assigned in CMSC 726 (Machine Learning)
- Reviewed the project description and all student implementations submitted that semester
- The project was implemented individually or in teams of two; there were 13 submissions representing the 20 students in the class
- The project required a proposal, a hypothesis, and an application that tested the hypothesis
- Used the criteria of originality, content, implementation effort, and report quality
- Determined that expectations for project development on these criteria were exceeded, and that students also gained valuable research experience
- Suggestions for course improvement: none given
- Suggestions for assessment improvement: none noted

Graduate Program Assessment: Peer-Reviewed Publication at 3 Years
Committee: Michael Hicks, Neil Spring, and Jan Plane
- Used the database collected from the graduate review day held each April
- There were 29 third-year students still active in the program in April 2006; 20 of them had at least one reviewed publication since entering Maryland, a rate of 69%
- The original assessment proposed finding what percentage had submitted an article for review, rather than how many had been accepted, but there was no way to collect that data directly
- Suggestions for assessment improvement: modify the assessment criterion to something more easily measured, such as the percentage who have published in a peer-reviewed venue; the goal of 75% is probably too high for students just completing their third year if the goal is publication rather than submission
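The reported rate is a simple proportion checked against the plan's goal; a minimal sketch of that check (the 20-of-29 count follows from the slide's 69% figure, and the class and method names are hypothetical):

```java
// Minimal sketch (illustration only): each outcome above is a simple
// proportion compared against the assessment plan's goal.
public class OutcomeCheck {
    static boolean meetsGoal(int met, int cohort, double goalPct) {
        double pct = 100.0 * met / cohort;
        System.out.printf("%d of %d = %.0f%% (goal %.0f%%)%n", met, cohort, pct, goalPct);
        return pct >= goalPct;
    }

    public static void main(String[] args) {
        meetsGoal(20, 29, 75.0); // 3rd-year publication: ~69%, below the 75% goal
    }
}
```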

Graduate Program Assessment: Peer-Reviewed Publication at Graduation
Committee: Samir Khuller, Heather Murray, and Jan Plane
- Used data collected in a survey, during exit interviews, and from student web pages
- There were a total of 34 Ph.D. graduates from Summer 2005 through Spring 2006; 26 of them had one or more peer-reviewed publications, a rate of 76%
- Suggestions for assessment improvement: the data collection method used this year was not the most accurate, since none of the methods of discovery were required; the proposal is to add a question to the application for graduation specifically asking students to report refereed publications, which should be more accurate since that form is required shortly before graduation

Graduate Program Assessment: Presentation at a Conference before Graduation
Committee: Samir Khuller, Heather Murray, and Jan Plane
- Used data collected in a survey, during exit interviews, and from student web pages
- Of the same 34 Ph.D. graduates from Summer 2005 through Spring 2006, 28 had presented at one or more conferences, a rate of 82%
- Suggestions for assessment improvement: as above, the data collection method was not the most accurate, since none of the methods of discovery were required; the proposal is to add a question to the application for graduation specifically asking students to report conference presentations, which should be more accurate since that form is required shortly before graduation

Lessons Learned about the Assessment Process Itself
Many lessons were learned that will modify how future assessments are conducted:
- Give more guidance to faculty selected for the committees:
  - Results were qualitative rather than quantitative, making them difficult to compare to goals
  - Make sure there is a large enough sample size, even if the number of criteria has to be reduced to make that practical
  - Most committees reported at length on what they did but more briefly on the details of their assessment
- Use more realistic evaluation methods:
  - Better wording of the learning outcomes
  - Clearer specification of the assessment measures
  - More specific criteria