Data Management for Large STEP Projects Michigan State University & Lansing Community College NSF STEP PI Meeting March 15, 2012 Workshop Session I-08.


Data Management for Large STEP Projects
Michigan State University & Lansing Community College
NSF STEP PI Meeting, March 15, 2012, Workshop Session I-08
Mark Urban-Lurain, MSU; Daina Briedis, MSU; Renée DeGraaf, LCC; Ruth Heckman, LCC; Colleen McDonough, MSU; Jon Sticklen, MSU; Claudia Vergara, MSU; Tom Wolff, MSU

Acknowledgements This material is based upon work supported by the National Science Foundation under award (DUE). Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation (NSF).

Workshop Structure
- Goal of the workshop is for participants to help identify some common challenges and successful solutions that can be shared across the STEP community
- Working in groups
  - Identify characteristics of your project
  - Reporting results for synthesis
- Summary provided to all participants after workshop

Who are we? EEES: Engaging Early Engineering Students
- Targeted disciplines: All departments in College of Engineering
- Targeted numbers of students: All incoming engineering students (~1000 first-year students/year)
- How long project in place: 4 years
- Types of interventions:
  - Peer Assisted Learning
  - Connector Faculty
  - Dx-Driven Early Intervention
  - Cross Course Linkages

Who are you? At your tables…
- Your name and institution
- Quickly tell everyone about your project
  - Targeted discipline(s)
  - Targeted numbers of students
  - What year of your STEP project are you in?
  - Types of intervention(s)
- Why are you attending the workshop?
  - Each table will report one reason for attending the workshop

Why are You Attending This Workshop?

Part 1: Data Collection
- Why are we collecting data?
- Research/evaluation challenges?
- What types of analyses?

Part 1 EEES: Why are we collecting data?
- STEP Project Evaluation
  - Which students / faculty participate in which interventions?
  - What are the barriers/inducements to participation?
  - How can each intervention be improved?
- Research
  - What are the changes in retention as a result of EEES?
  - What are the impacts of each of the components of EEES on retention?
  - Which groups of students are impacted by which components of EEES?
  - What are the interactions among the EEES components for different groups of students?

Part 1 EEES: Analyzing Data
- Research/evaluation challenges
  - Interactions among different types of at-risk students, different interventions, and final outcomes
  - The program is college-wide, so it is not feasible to compare contemporaneous students who are and are not participating in the interventions
  - Students elect to participate in parts of the program (or not), and interactions may vary by student
  - Traditional statistical techniques are difficult to use
- Types of analyses we are performing: Structural Equation Modeling
- Software we are using: AMOS
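
The deck names AMOS for the structural equation modeling. As a rough, open-source illustration of the same kind of analysis, here is a minimal sketch using the Python semopy package; the model specification, the file name, and every variable name are hypothetical placeholders, not the actual EEES model.

```python
# Minimal SEM sketch with semopy (the project itself used AMOS).
# All file and variable names below are hypothetical placeholders.
import pandas as pd
import semopy

data = pd.read_csv("merged_student_data.csv")  # hypothetical merged, anonymized table

# Measurement model: a latent "engagement" factor from three survey items.
# Structural model: intervention dosage predicts engagement and retention.
model_desc = """
engagement =~ survey_item1 + survey_item2 + survey_item3
engagement ~ pal_sessions + connector_meetings
retained ~ engagement + act_math + pal_sessions
"""

model = semopy.Model(model_desc)
model.fit(data)                   # estimate parameters from the data frame
print(model.inspect())            # loadings, path coefficients, standard errors
print(semopy.calc_stats(model))   # fit indices such as CFI and RMSEA
```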

Part 1: Data Collection
- Why are you collecting data?
  - STEP Project Evaluation: What are your evaluation questions/metrics?
  - Research: What are your research questions?
- Analyzing Data
  - What are your research/evaluation challenges?
  - What types of analyses are you performing?
  - What software are you using?
- Identify most common themes to report

Part 1: Report Out

Part 2: Types of Data
- What are the sources?
  - What data is from students?
  - What data is from faculty?
  - What data is about the course/instruction/intervention overall?
  - Other data?
- Type of data: quantitative, qualitative
- Form of the data: electronic, paper
- Who enters / transcribes the data? Participants directly (e.g., online), researchers

Part 2: EEES Data
- Student data
  - Demographic data: class standing, gender, ethnicity, ACT scores (from Registrar's office)
  - Assignment, exam and course grades in the core pre-engineering courses (from faculty)
  - Grades in other courses (from Registrar's office)
  - Math diagnostic exam scores (from Math department)
  - Number of PAL sessions attended (from student sign-in sheets)
  - Number of meetings with Connector Faculty (from surveys of students and faculty)
  - Student perceptions of the various components of the EEES program (from surveys and interviews)
  - Student engagement and desire to persist in engineering (from surveys and interviews)
- Outcome data (from Registrar's office and College)
  - Students who graduate with an engineering degree
  - Students who are academically qualified for engineering but elect to leave for other majors
  - Students who apply but are not admitted due to insufficient grades
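
Because these sources arrive as separate extracts keyed on the same student identifier, a common pattern is to merge them into a single analysis table. A minimal pandas sketch, assuming hypothetical file and column names and an already anonymized ID column; this is the kind of merged table the SEM sketch above would read.

```python
# Sketch of merging registrar, attendance, survey, and outcome extracts on a
# common (already anonymized) student ID. File and column names are hypothetical.
import pandas as pd

registrar = pd.read_csv("registrar_demographics.csv")  # id, gender, ethnicity, act_math, ...
pal       = pd.read_csv("pal_signin_counts.csv")       # id, pal_sessions
connector = pd.read_csv("connector_meetings.csv")      # id, connector_meetings
survey    = pd.read_csv("engagement_survey_fall.csv")  # id, survey_item1, survey_item2, ...
outcomes  = pd.read_csv("college_outcomes.csv")        # id, retained

merged = (
    registrar
    .merge(pal, on="id", how="left")        # keep students who never attended PAL
    .merge(connector, on="id", how="left")
    .merge(survey, on="id", how="left")     # keep students who skipped the survey
    .merge(outcomes, on="id", how="left")
)
# Students with no sign-in or meeting records attended zero sessions/meetings.
merged[["pal_sessions", "connector_meetings"]] = (
    merged[["pal_sessions", "connector_meetings"]].fillna(0)
)
merged.to_csv("merged_student_data.csv", index=False)
```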

Part 2: Metadata What descriptors are you collecting for your data? Any “standards” (e.g., ABET, etc.)

Part 2: Identifiable Data
- Why do you want identifiers?
  - Allows merging/tracking for given individuals across data sets: pre-test / post-test, across courses, analysis across different types of data
- Which of your data has identifiers?
- Anonymizing data
  - Protects identity
  - Needs to be consistent: the same identifier gets the same code across data sets
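
The practical requirement is that the identifier-to-code mapping be deterministic, so the same student receives the same code in every data set and records can still be linked after identifiers are removed. A minimal sketch using a salted hash; the EEES project used SHA-1 hashing of student identifiers (see Part 3), and the salt handling shown here is one option, not their exact procedure.

```python
# Deterministic pseudonymization: the same student ID always maps to the same
# code, so anonymized data sets can still be merged. The salt must stay constant
# (and secret) across all data sets for the codes to line up.
import hashlib

SALT = b"project-secret-salt"  # hypothetical; store securely, never alongside the data

def pseudonymize(student_id: str) -> str:
    """Return a stable, non-reversible code for a student identifier."""
    digest = hashlib.sha256(SALT + student_id.encode("utf-8")).hexdigest()
    return digest[:12]  # shortened code, still effectively unique for a cohort

# The same ID appearing in two different data sets gets the same code:
assert pseudonymize("A12345678") == pseudonymize("A12345678")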

Part 2: Your Data
Identify the data you are collecting:
- What are the sources?
  - What data is from students?
  - What data is from faculty?
  - What data is about the course/instruction/intervention overall?
  - Other data?
- Type of data: quantitative, qualitative
- Form of the data: electronic, paper
- Who enters / transcribes the data? Participants directly (e.g., online), researchers
- What metadata?
- Identifiers: what are they? How are they anonymized/protected?

Part 2: Report Out

Part 3: Managing EEES Data
- Strategies for managing data
  - One researcher coordinates data collection
  - Master list of data to be collected: source, timetable, responsible person
- What software are you using?
  - Quantitative data in spreadsheets
  - Paper scanned to PDF
  - Audio recordings of interviews as MP3
  - Merging data sets in SPSS Modeler
- Where are you storing the data (local computers, cloud, etc.)?
  - Networked file storage on MSU campus
- How do you control access?
  - Restricted access to data manager
  - Parceled out to individual research assistants for cleanup
- Metadata
  - Data dictionary describing each set of data
  - File naming conventions
  - Directory structures for incoming "raw" data and cleaned-up data ready for analysis
- Anonymization of identified data
  - SHA-1 hashing of student identifiers
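
To make the naming-convention and raw-versus-clean bullets concrete, here is a small sketch of what such a layout and check might look like. The directory names, the convention, and the pattern are hypothetical, since the actual EEES conventions are not shown in the slides.

```python
# Hypothetical sketch of a raw-vs-clean directory layout and a file-naming check.
import re
from pathlib import Path

RAW = Path("data/raw")      # incoming files exactly as received from each source
CLEAN = Path("data/clean")  # cleaned-up files ready for analysis

# Example convention: <source>_<instrument>_<term>.<ext>, e.g. registrar_grades_fs11.csv
NAME_RE = re.compile(r"^[a-z]+_[a-z0-9]+_(fs|ss|us)\d{2}\.(csv|pdf|mp3)$")

def misnamed_files(directory: Path = RAW) -> list[str]:
    """Return names of files in a directory that violate the naming convention."""
    return [p.name for p in sorted(directory.iterdir())
            if p.is_file() and not NAME_RE.fullmatch(p.name)]

if __name__ == "__main__":
    for d in (RAW, CLEAN):
        d.mkdir(parents=True, exist_ok=True)  # create the layout if it does not exist
    print(misnamed_files(RAW))
```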

Part 3: Managing Your Data
- What are your strategies for managing data?
- What software are you using?
- Where are you storing the data (local computers, cloud, etc.)?
- How do you control access?
- Metadata
- Anonymization of identified data

Part 3: Report Out

Challenge: Managing Diverse Datasets
- Need to support any type of assessment
- Independent of any LMS
- Data granularity to the item level
- Variety of metadata standards
- Merging/tracking data across courses/programs within the institution
- Supporting research across institutions

Learning Outcomes Assessment Data (LOAD) Store Proposal
- Developed prototype data store (NSF)
- Store student assessment data at the item level
  - Individual student responses on each assessment item
- Merge data from multiple sources
  - Across courses
  - Institutional data
- Educational Metadata Standards (EdML)
  - Built on existing standards (Dublin Core, IMS Question & Test Interoperability specification)
- Exam parsing
  - Extract individual exam items from PDF and store
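
To make the item-level granularity concrete, here is a minimal relational sketch of the kind of schema such a store implies; the tables and columns are hypothetical and are not the actual LOAD design.

```python
# Minimal sketch of an item-level assessment store (hypothetical schema): each
# student's response to each item is one row, with metadata attached at the
# assessment and item level.
import sqlite3

conn = sqlite3.connect("load_sketch.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS assessment (
    assessment_id INTEGER PRIMARY KEY,
    course        TEXT,   -- course and term the exam was given in
    title         TEXT,
    metadata      TEXT    -- Dublin Core / QTI-style descriptors, e.g. as JSON
);
CREATE TABLE IF NOT EXISTS item (
    item_id       INTEGER PRIMARY KEY,
    assessment_id INTEGER REFERENCES assessment(assessment_id),
    prompt        TEXT,   -- item text extracted from the exam PDF
    max_score     REAL
);
CREATE TABLE IF NOT EXISTS response (
    student_code  TEXT,   -- anonymized student identifier
    item_id       INTEGER REFERENCES item(item_id),
    score         REAL,
    PRIMARY KEY (student_code, item_id)
);
""")
conn.close()
```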

LOAD Data Upload

LOAD Data Query

Thanks for Your Participation
Questions?
Mark Urban-Lurain
Claudia Vergara