Multiple Vantage Points for Employment-Related Feedback
Curriculum 21 / SUCCEED: Southeastern University and College Coalition for Engineering Education
Share the Future IV Conference, March 18, 2003
Joseph Hoey and Jack Marr, Georgia Tech SUCCEED

Workshop Coverage
 Process for integrating information from multiple sources
 Role of employer feedback in overall assessment process
 Longitudinal strategy to measure value added
 Methodological considerations in using assessment data
 Interpreting assessment data
 Making it work at Southern University

Preliminary Questions
 Going into this workshop, what are your concerns about employer feedback?
 How would you use feedback data? What process would you personally find most useful?

Relating College and the World of Work
 How can the skills relevant to the world of work be assessed against those our students are gaining through their programs of study?
 What do our students do that relates to employment?
— Internships?
— Major-related activities?

Relating College and the World of Work
 How can we connect academic evaluation with employment-related evaluation?
— What are the similarities?
— How can these be tied together to seek some sort of consistency?
 Can indirect information and information derived from students or alumni play a role?
 Can general education assessment be connected to the world of work?

Georgia Tech Employer Feedback Model

Multiple Sources of Career-Related Performance Evaluation
 Co-op Employers
 Recruiters
 Alumni/Alumnae
 Employers of Alumni/Alumnae

Co-op Employers
 All students are evaluated every term
 Can potentially evaluate “value added”
 Shows “what’s happening now”

Recruiters
 First interface with post-graduation employment
 Substantial filter
 Limited information and contact
 Biased sample: students and recruiters

Alumni(ae)
 Another filter
 In the workplace or graduate school
 “Real world” perspective
 Very large sample needed to yield department-level data
 Undeliverable addresses are a problem
 Sampling, non-response bias

Employers of Alumni/Alumnae
 Another filter
 More opportunity to evaluate
 Clearer perspective on what’s important
 Greater investment
 Other biases, e.g., sampling is based on the employee’s permission

Methodological Issues
 Survey methods
 Sampling
 Response rate
 Sources of bias
 Consistency
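The response-rate and bias items above are easy to state and easy to underestimate. Below is a minimal sketch, with entirely invented numbers (frame size, rating scale, response model), of how non-response bias can distort a survey estimate even when the response rate looks respectable; it is an illustration, not anything from the Georgia Tech process.

```python
# Minimal sketch: non-response bias with invented data.
# Hypothetical frame of 1,000 alumni rating "preparation" on a 1-5 scale.
import random

random.seed(42)

frame = [min(5.0, max(1.0, random.gauss(3.5, 0.8))) for _ in range(1000)]

# Assume satisfied alumni respond more often: response probability
# rises with the rating itself (0.3 at a rating of 1, 0.7 at 5).
respondents = [r for r in frame if random.random() < 0.2 + 0.1 * r]

response_rate = len(respondents) / len(frame)
true_mean = sum(frame) / len(frame)
observed_mean = sum(respondents) / len(respondents)

print(f"Response rate: {response_rate:.1%}")
print(f"True mean:     {true_mean:.2f}")
print(f"Observed mean: {observed_mean:.2f} (inflated by self-selection)")
```

The point is only that a decent response rate does not by itself guarantee an unbiased estimate; when the decision to respond correlates with the attitude being measured, the observed mean drifts.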

“Triangulation”
[Diagram: evidence converging from four sources: Co-op, Recruiter, Employer, Alumni(ae)]

“Triangulation” (at best)
[Diagram: the same four sources, with only approximate convergence]

Case Study: Comparing and Seeking Continuity in Results
 Subject: Computer Information Systems at Very Humid University (VHU)
 Just completed first round of assessment studies
 Now looking at the data to figure out what they have, what it means, and how they might want to rethink their assessment process to improve its usefulness
 You are there as consultants

Case Study: Comparing and Seeking Continuity in Results
 How would you use the data you have to assess the student outcomes stated?
 To what extent can you use the data you have to assess the outcomes listed? Do you see any problems?
 What changes to assessment methods do you recommend for Computer Info Systems at VHU?
 Do you have any other recommendations?

Employer Feedback: Development of Process
 Reworked survey of recruiters to include items relevant to Criteria 2000
 Reworked evaluation instrument completed by employers (supervisors) of co-op students to include items relevant to Criteria 2000
 Created Alumni and Employer instruments

Process Logistics
 Project funded by SUCCEED
 Negotiations with process owners
 Recruiter and co-op data collected and entered by office of origin; analyzed by Office of Assessment
 Alumni and Employer surveys collected by Office of Assessment

Findings: Co-op
 All ratings moderately high or better, with some variability in spring 2001
 Lifelong learning and technical skills rated highest
 Written and oral communication skills rated lowest

Findings: Co-op
 Disaggregated co-op employer evaluations by department and by student class level within department
 Breakdown allows clear demonstration of student knowledge and skill gain through the undergraduate experience
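The deck does not show how the disaggregation was implemented; as an illustration only, here is a minimal pandas sketch with a hypothetical file name and invented column names (department, class_level, rating).

```python
# Minimal sketch of disaggregating co-op evaluations by department
# and by student class level within department. The file and column
# names are hypothetical.
import pandas as pd

evals = pd.read_csv("coop_evaluations.csv")  # one row per employer evaluation

# Mean rating and evaluation count per department.
by_dept = evals.groupby("department")["rating"].agg(["mean", "count"])

# Mean rating by class level within each department; a rising row
# suggests knowledge and skill gain through the program.
by_level = (
    evals.groupby(["department", "class_level"])["rating"]
    .mean()
    .unstack("class_level")
)

print(by_dept)
print(by_level.round(2))
```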

Findings: Recruiters
 Importance: highest ratings on teamwork, problem solving, ability to apply knowledge, communication
 Preparation: highest ratings on using necessary techniques and skills for practice, problem solving
 Largest “performance gaps” over time: teamwork and communication skills

Results: Bachelor’s Alumni(ae) Survey
 Alumni were asked to rate a set of skills, abilities, and attributes generally expected of a Georgia Tech graduate: first rating the importance of each item relative to their personal employment experience since graduation, and then rating how well their education had prepared them on each item.

Alumni Results: Importance vs. Preparation
 There were six specific skill areas with a greater-than-0.50 difference between the mean rating for importance and the mean rating for preparation: the ability to…
— communicate orally,
— communicate in writing,
— function on teams,
— use computing technology in communications,
— engage in lifelong learning / self-critique, and
— exercise leadership skills.
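The gap computation itself is simple arithmetic: subtract the mean preparation rating from the mean importance rating for each item and flag differences above 0.50. A minimal sketch follows, with a hypothetical input file and invented column names (importance_<skill>, preparation_<skill>).

```python
# Minimal sketch of the importance-vs-preparation gap analysis.
# File and column names are hypothetical: one row per alumni
# response, with paired ratings for each skill item.
import pandas as pd

survey = pd.read_csv("alumni_survey.csv")

skills = ["communicate_orally", "communicate_in_writing", "function_on_teams"]

gaps = {
    skill: survey[f"importance_{skill}"].mean()
    - survey[f"preparation_{skill}"].mean()
    for skill in skills
}

# Flag items where the gap exceeds the 0.50 threshold used above.
flagged = {s: round(g, 2) for s, g in gaps.items() if g > 0.50}
print("Gaps > 0.50:", flagged)
```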

Hands-On Activity
 Divide into small groups.
 Discuss: How could you structure and unify employer feedback at Southern U.? Each group should put together ideas on:
— what information is needed
— what methods would be appropriate to use
— who would have ownership/need to be involved
— who would collect, enter, analyze, report data
— how best to communicate and use results

Hands-On Activity
 Presentation and Discussion (10 minutes): small groups present summarized ideas; larger group discusses.

What Have Previous Participants Thought?*
 What information is needed
 What methods would be appropriate to use
 Who would have ownership/need to be involved
 Who would collect, enter, analyze, report data
 How best to communicate and use results

* ASEE 2001, Albuquerque

What Information Is Needed
 Criterion 3 (a)-(k) knowledge, skills, and abilities: importance and preparation
 How do our students compare with others?
 How do our students compare by levels?
— Degree level
— Field level
 Are we preparing students appropriately?
— What is important
— Where are we succeeding
— Where are we falling short
 What is our minimum acceptable performance?
— Measure the “low end” of graduates

What Methods Would Be Appropriate to Use
 Surveys
 Interviews
— Telephone
— Personal
 Advisory committees

Who Would Have Ownership/Need to Be Involved
 Students
 Faculty
 Recruiters/employers
 Employees (grad/co-op)
 Parents
 Institution

Who Would Collect, Enter, Analyze, Report Data
 Involve a team (students, faculty, alumni)
 Collect and analyze (survey specialists)
 Report (faculty, editors, specialists)
 Distribute based on type
— Co-op/Career Planning and Placement
— Dean’s office
— Computer Services
— Institutional Research
— Individual departments

How Best to Communicate and Use Results
 Share survey data results and summary information with
— Students
— Faculty
— Industry Advisory Boards
— Employers
— Alumni
 Format data results and summary information with
— Statistical analyses
— Bar graphs
— Trends
— Tracking, year-by-year and by class
 Disaggregate to department level
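As one illustration of the formats listed above, here is a minimal matplotlib sketch of year-by-year tracking as a grouped bar chart; all numbers are invented and the skill area is arbitrary.

```python
# Minimal sketch: grouped bar chart tracking mean importance vs.
# preparation ratings by survey year. All data are invented.
import matplotlib.pyplot as plt

years = ["1999", "2000", "2001"]
importance = [4.3, 4.4, 4.5]   # hypothetical mean importance ratings
preparation = [3.8, 3.9, 3.9]  # hypothetical mean preparation ratings

x = range(len(years))
width = 0.35

fig, ax = plt.subplots()
ax.bar([i - width / 2 for i in x], importance, width, label="Importance")
ax.bar([i + width / 2 for i in x], preparation, width, label="Preparation")
ax.set_xticks(list(x))
ax.set_xticklabels(years)
ax.set_ylabel("Mean rating (1-5 scale)")
ax.set_title("Communication skills: importance vs. preparation by year")
ax.legend()
plt.show()
```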