Technology-Mediated Assessment Jack McGourty, Columbia University John Merrill, Ohio State University Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh Gateway Engineering Education Coalition

Technology-Mediated Assessment Introduction Your Expectations Applications Drexel and Columbia's Course Evaluation Ohio State's Activities Team Evaluator Your Experiences Enablers and Barriers (Break-out Groups) Conclusions

Introduction Reasons for On-Line Assessment Common Applications Design and Development Things to Think About

Reasons for On-Line Assessment Customized development Targeted communication Ease of distribution/no boundaries Automatic data collection and analyses Real time response monitoring Timely feedback

Common Applications Attitude Surveys Multisource assessment and feedback Course evaluations Portfolios Technology-mediated interviews Tests

Design and Development Item/Question development Adaptive testing/expert systems Multimedia tutorials Dialogue boxes Reporting wizards

Things to Think About Confidentiality/Privacy Response rates Reliability/Validity Ease of use Administrators, end users System growth Can it easily be upgraded? Adding modules System flexibility Survey/test construction Data flexibility Item databases Reporting wizards Data storage Platforms Specific vs. combination Reporting Various levels Dissemination mechanisms Real time vs. delayed

Technology in Education Dr. John Merrill The Ohio State University Introduction To Engineering Program Technology Enabled Assessment The Wave of The Future

Objectives Explanation of web-based assessment tools Uses of assessment tools Virtual run-through of student actions Lessons learned Q&A

Web-Based Assessment Tools Course Sorcerer (through WebCT) Online Journal Entries Course Evaluations Team Evaluator Peer Evaluations

WebCT WebCT is a commercial web-based tool used for course management. IE Program uses/capabilities: Electronic grade book, chat rooms, bulletin boards, calendars Provides links to Course Material Course Sorcerer Team Evaluations (Team Evaluator)

Course Sorcerer A simple, web-based evaluation tool created by Scott Cantor at University Technology Services Technical Specifications: Written in Cold Fusion Runs on Windows NT with a Netscape Enterprise Web Server Uses an MS SQL Server database with 15 tables Server Machine: PII-450 w/ 512M of RAM Accesses Sybase running on Solaris 2.6 as a warehouse for roster data. Used for Journal Entries & Course Evaluations
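As a rough sketch of how a survey back end like this might store data, the fragment below builds a minimal response schema in SQLite (standing in for the MS SQL Server described above). The table and column names are illustrative assumptions, not Course Sorcerer's actual 15-table schema:

```python
import sqlite3

# Minimal stand-in for a survey-response store; schema names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE survey   (survey_id INTEGER PRIMARY KEY, title TEXT,
                       opens TEXT, closes TEXT);
CREATE TABLE question (question_id INTEGER PRIMARY KEY,
                       survey_id INTEGER REFERENCES survey, text TEXT);
-- No student id column: journal entries are submitted anonymously.
CREATE TABLE response (response_id INTEGER PRIMARY KEY,
                       question_id INTEGER REFERENCES question,
                       submitted_at TEXT, answer TEXT);
""")
conn.execute("INSERT INTO survey VALUES (1, 'Week 2 Journal', '2001-01-08', '2001-01-12')")
conn.execute("INSERT INTO question VALUES (1, 1, 'What topics were unclear this week?')")
conn.execute("INSERT INTO response VALUES (NULL, 1, '2001-01-09', 'Isometric figures')")
count, = conn.execute("SELECT COUNT(*) FROM response").fetchone()
print(count)  # 1
```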

Team Evaluator (Peer Evaluation) Used by team members to provide confidential assessment System Requirements: Operating System: Windows 2000 with ActivePerl or UNIX with Perl or higher Perl Modules: CGI, DBI (plus SQL drivers), POSIX SQL Server: MySQL 3.23 or higher Web Server: IIS (Windows) or Apache 1.3 (UNIX) CPU: Pentium II 400 or better recommended Memory: 128 MB or higher recommended Disk Space: 100 MB for adequate database space
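The core aggregation a peer-evaluation tool like Team Evaluator performs can be sketched in a few lines. The rating structure and scale below are assumptions for illustration, not the tool's actual data model:

```python
# Average the confidential ratings each member receives from teammates.
def peer_averages(ratings):
    """ratings: {rater: {ratee: score}} -> mean score received per member,
    with self-ratings excluded so the assessment stays peer-only."""
    totals, counts = {}, {}
    for rater, row in ratings.items():
        for ratee, score in row.items():
            if ratee == rater:
                continue  # skip self-ratings
            totals[ratee] = totals.get(ratee, 0) + score
            counts[ratee] = counts.get(ratee, 0) + 1
    return {member: totals[member] / counts[member] for member in totals}

team = {
    "ann": {"bob": 4, "cid": 5},
    "bob": {"ann": 5, "cid": 4},
    "cid": {"ann": 4, "bob": 4},
}
print(peer_averages(team))  # {'bob': 4.0, 'cid': 4.5, 'ann': 4.5}
```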

Journal Entries Students complete journal entries online every two weeks. Submissions are anonymous. All entries are read and summarized by a staff member and shared with the instructional team. Instructional team members share the summaries with their classes.

Course Evaluations Students in 181 & 182 complete online course evaluations at the end of each quarter. Questions designed to evaluate courses based on items a-k of Criterion 3, Program Outcomes & Assessment, in the ABET Engineering Criteria, 2000.
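One way to aggregate such evaluations against Criterion 3 is to tag each item with an a-k outcome and average ratings per outcome. The item texts and scores below are invented for illustration; only the a-k labels come from the criteria:

```python
from statistics import mean

# Hypothetical evaluation items, each tagged with an ABET EC2000 outcome.
items = [
    {"outcome": "a", "text": "The course applied math/science knowledge", "scores": [4, 5, 4]},
    {"outcome": "d", "text": "Labs built multidisciplinary teamwork",     "scores": [5, 4, 4]},
    {"outcome": "d", "text": "Team project roles were clear",             "scores": [3, 4, 4]},
]

# Pool all ratings per outcome, then report a mean for each.
by_outcome = {}
for item in items:
    by_outcome.setdefault(item["outcome"], []).extend(item["scores"])

summary = {k: round(mean(v), 2) for k, v in sorted(by_outcome.items())}
print(summary)  # {'a': 4.33, 'd': 4.0}
```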

Short Term Uses Journal Entries & Course Evaluations Address immediate student concerns/questions about class, labs, or projects. Inquire about student problems with specific topics and labs. Discover general information from students regarding interests, influences, and attitudes.

Example Addressing Immediate Student Concerns How are the figures supposed to be done? Strictly isometric or just drawn so you can see everything? What pieces need to be labeled? What are we doing in labs 6 & 7? I know it says in the syllabus that we are incorporating the sorting mechanism, but is that going to take two weeks?

Long-Term Uses Journal Entries & Course Evaluations Improve program content Improve course materials Modify teaching styles Evaluate course based on ABET criteria

Example Improving Course Content Positive: I... - Gained knowledge about circuits in general - Learned how to read schematics - Learned how to use breadboards - Further developed team working skills Negative: - The circuits did not work the first time. - Time ran short for both labs, but we did finish each circuit.

How It Works Start: WebCT site:

Completion Tracking

Lessons Learned Journal Entries & Course Evaluations Students are more likely to complete if given credit. Students are extremely responsive to the anonymity of the online survey. Students respond positively when asked for suggestions/solutions to problems in the class.

Web Enhanced Course Evaluation at Columbia University Jack McGourty Columbia University

Overview A little history How does course assessment fit into the big picture? Why use web technology? How is it being done? Does it work?

History Columbia's Fu Foundation School of Engineering and Applied Science began using the web for course assessment about four years ago, starting with a student-administered web site for results Designed and developed a state-of-the-art system using student teams Now building on the current infrastructure to include on-line tutorials and increased flexibility for administration

Student Web Site Search by course or faculty Current and past results No comments

The Big Picture Why are we assessing courses and programs? Continuous improvement of the education process What are we doing right, and what can we do better? Integral part of our ABET EC2000 compliance Develop a process Collect and evaluate data Close the loop Document/Archive results Course evaluation is one of several outcome assessment measures, such as senior exit surveys, enrolled student surveys, and alumni surveys

How WCES Fits in

Using Technology Pro Students have the time to consider their responses Timely feedback Responses are easily analyzed, archived and distributed Less paper Lower cost/efficient administration Con You lose the captive audience You can't guarantee a diversity of opinions Motivated/Non-motivated Like course/Dislike course Not necessarily less effort

Course Assessment Details 10 Core Items Course Quality Instructor Quality Relevant ABET EC2000 Items Pre-selected by faculty member Customized questions for specific course objectives
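A questionnaire built from these three pools could be assembled as sketched below; all question text and pool contents are hypothetical placeholders, not the actual core items:

```python
# Fixed core items (10 in practice; two shown here for brevity).
CORE_ITEMS = ["Overall course quality?", "Overall instructor quality?"]

# Pool of EC2000-linked items the faculty member pre-selects from.
EC2000_POOL = {
    "c": "This course improved my ability to design a system.",
    "g": "This course improved my ability to communicate effectively.",
}

def build_questionnaire(selected_ec2000, custom):
    """Core items + faculty-selected EC2000 items + course-specific questions."""
    return CORE_ITEMS + [EC2000_POOL[k] for k in selected_ec2000] + custom

q = build_questionnaire(["g"], ["Was the design project workload reasonable?"])
print(len(q))  # 4
```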

Selecting EC2000 Questions

Monitoring Faculty Usage One of our culture change metrics is the percentage of faculty who are capitalizing on the system and adding custom and EC2000 questions. Currently around 15%.

Course Evaluation Results Web page access Current term's assessment Limited time window Limited access Secure site Previous terms' results Open access to numerical results; not comments Results Individual faculty Aggregate Data – Department Chairs

Reporting

Promoting Responses Student-driven results website Multiple targeted emails to students and faculty from the Dean Announcements in classes Posters all over the school Random prize drawing

Closing the Loop

Does it Work? Student response rates have steadily increased over the past two years from 72% to 85% More detail in student-written comments in course assessments Data is available that we have never had before Faculty use of ABET EC2000 and customized question features increasing but still limited (15%)

Cross Institutional Assessment with a Customized Web-Based Survey System Mary Besterfield-Sacre & Larry Shuman University of Pittsburgh This work is sponsored by two grants: one from the Engineering Information Foundation, EiF 98-01, Perception versus Performance: The Effects of Gender and Ethnicity Across Engineering Programs, and one from the National Science Foundation, Action Agenda - EEC , Engineering Education: Assessment Methodologies and Curricula Innovations

Why a Web-Based Survey System for Assessment? Need for a mechanism to routinely Elicit student self-assessments and evaluations Facilitate both tracking and benchmarking Most engineering schools lack sufficient resources to conduct requisite program assessments Expertise Time Funds Triangulation of multiple measures

Pitt On-line Student Survey System (Pitt-OS 3 ) Allows multiple engineering schools to conduct routine program evaluations using EC 2000 related web-based survey instruments. Assess and track students at appropriate points in their academic careers via questionnaires Survey students throughout their undergraduate career Freshman Pre and Post Sophomore Junior Senior Alumni Freshman orientation expanded to include Math Placement Examinations Mathematics Inventory Self-Assessment

Student-Focused Model [diagram relating EC outcomes to knowledge-based competence (application area, synthesizing multiple areas), confidence (can take on complexity, accept ambiguity, welcome environment), preparation (comfort, opportunity and application, work experience), and attitudes and valuing]

System-Focused Model

Pitt OS 3 Conduct routine program evaluation via surveys through the web Data collection Report generation (under development) Web versus paper surveys Pros Administration ease Minimize obtrusiveness Data is cleaner Cons Lower response than paper-pencil surveys User/Technical issues

Pitt OS 3 System Components

Pitt OS 3 Local Administrator Individual at the school where the surveys are being conducted Responsible for administering the surveys through a web interface Controls the appearance of the survey Selects school colors Uploads school emblem/logo Selects survey beginning and ending dates Composes initial and reminder letter(s) to students Cuts and pastes user login names and email addresses Manages surveys in progress Extends surveys beyond original dates
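The letter-composition step amounts to a simple mail merge over the roster; the template wording and roster fields below are assumptions for illustration:

```python
# Hypothetical reminder-letter template; {survey}, {closes}, {login} are merged in.
TEMPLATE = ("Dear student,\n"
            "Your {survey} survey closes on {closes}. "
            "Log in as '{login}' at the survey site.\n")

# Roster rows as (login name, email address) pairs.
roster = [("jsmith", "jsmith@example.edu"), ("mlee", "mlee@example.edu")]

def build_messages(roster, survey, closes):
    """Produce one (recipient, body) pair per student on the roster."""
    return [(email, TEMPLATE.format(survey=survey, closes=closes, login=login))
            for login, email in roster]

msgs = build_messages(roster, "Freshman Post", "2001-04-20")
print(len(msgs))  # 2
```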

Pitt OS 3 Local Administrator

Pitt OS 3 Student Java Applet running on a web browser One question per screen minimizes scroll bar confusion Once a student submits the questionnaire, results are compressed and sent to the OS 3 server Results are stored and the student's password is invalidated Confirmation screen thanks the student for taking the survey Can accommodate users who do not have accounts
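A minimal sketch of that submit path, assuming JSON-encoded answers, zlib compression, and a one-time-password table — all implementation details not specified in the slides:

```python
import json
import zlib

passwords = {"jsmith": "s3cret"}   # one-time credentials, hypothetical store
results = {}                       # login -> compressed answer payload

def submit(login, password, answers):
    """Accept one submission per login, then invalidate the password."""
    if passwords.get(login) != password:
        return False
    payload = zlib.compress(json.dumps(answers).encode())  # applet-side compression
    results[login] = payload                               # server-side storage
    del passwords[login]                                   # password cannot be reused
    return True

ok = submit("jsmith", "s3cret", {"q1": "agree", "q2": 4})
again = submit("jsmith", "s3cret", {"q1": "agree"})        # rejected: password gone
print(ok, again)  # True False
```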

Pitt OS 3 Sample Student

Pitt OS 3 Student Welcome

Pitt OS 3 Student Instructions

Pitt OS 3 Questionnaire

Pitt OS 3 How it Works Every day OS 3 summarizes all active surveys for each school Summary reports on the number of students who have and have not taken the survey Specific students can also be viewed from the local administrator's account Upon completion of the survey dates, email addresses are stripped from the system Only login names remain with the results The only time the OS 3 system has student email addresses is while the local administrator is receiving daily updates about their active surveys
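In outline, the nightly summary and the end-of-survey email stripping might look like the sketch below; the data structures and function names are assumptions:

```python
def daily_summary(roster, responded):
    """Count who has and has not taken the survey (the nightly report)."""
    done = [login for login in roster if login in responded]
    pending = [login for login in roster if login not in responded]
    return {"responded": len(done), "pending": len(pending)}

def close_survey(contact_info):
    """At the end of the survey window, drop email addresses and keep
    only login names alongside the stored results."""
    return set(contact_info)   # contact_info: {login: email} -> {login}

roster = ["ann", "bob", "cid"]
print(daily_summary(roster, {"ann"}))            # {'responded': 1, 'pending': 2}
print(close_survey({"ann": "ann@example.edu"}))  # {'ann'}
```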

Pitt OS 3 Sample Daily Report

Pitt OS 3 Evaluation of the System Piloted on five schools Multiple surveys concurrently at each school Multiple schools at one time Response rates vary ( % on average) Example University of Pittsburgh - April 2001 One initial email with two reminder emails over 2.5 weeks Responses: Freshmen 70%, Sophomores 48%, Juniors 44% Varied by department Some usernames had +

Pitt OS 3 System Trace of One School Freshman Post Survey Survey available for two weeks with one reminder message 57% overall response rate Increased server traffic 2 to 24 hours after each email Design concerns 63% of students had to log in more than once Multiple logins due to case-sensitive passwords 14% never finished - browser problems or didn't want to finish 10% gave up - just didn't complete login

Pitt OS 3 Issues to Consider Consent for Human Subjects Discuss with the institution's Institutional Review Board Surveys often exempt Java Applets not supported by very old browsers HTML as alternate Firewalls established by other organizations