UNC TLT Conference March 2006 How Technology Impacts Student Learning: An Assessment Workshop Joni Spurlin, Sharon Pitt, Allen Dupont, Donna Petherbridge.


UNC TLT Conference March 2006 How Technology Impacts Student Learning: An Assessment Workshop Joni Spurlin, Sharon Pitt, Allen Dupont, Donna Petherbridge NC State University

UNC TLT Conference March 2006 Session Outcomes
By the end of the session, participants will be able to develop or improve their assessment efforts by:
– Applying ideas from the workshop to define their assessment question and identify potential assessment methods for their institution
– Learning from what NC State has learned
– Utilizing resources from this workshop

UNC TLT Conference March 2006 Session Overview
LITRE as a Quality Enhancement Plan
Defining the Assessment Question
– LITRE Plan Goals & our revisions
– Framework
– Exercise 1
Defining Assessment Methods
– Assessment Methods at NC State
– Exercise 2
Wrap-up

UNC TLT Conference March 2006 LITRE Planning
LITRE as an outgrowth of accreditation review
– Plan for transformative, institutional improvement
– Crucial to enhancing educational quality
– Directly related to student learning
– Based on comprehensive analysis of institutional effectiveness

UNC TLT Conference March 2006 LITRE Planning
Initial Benchmarking
– Defining the “learning with technology” environment
– Focus Groups
– Critical Infrastructure Needs
– 2003 Faculty Survey

UNC TLT Conference March 2006 LITRE Faculty Survey (2003)
Why: Inform the recommendations of LITRE and provide a baseline for future LITRE efforts.
Who and What: Faculty were surveyed about their experiences with computer-based instructional and learning aids. 1,790 faculty were invited to participate in the survey; 983 did, a response rate of 55%.
Indicator: Respondents were asked what would make it easier to use the technologies that they did use in their courses. “If they were available and supported in the classrooms in which I typically teach” was chosen most often, 37% of the time.
See the survey report and instrument for details.
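The response-rate figure follows directly from the counts on this slide; a minimal sketch of the arithmetic (in Python), using only the numbers reported above:

```python
# Response-rate arithmetic for the 2003 LITRE faculty survey,
# using only the counts reported on this slide.
invited = 1790      # faculty invited to participate
responded = 983     # faculty who completed the survey

response_rate = responded / invited
print(f"Response rate: {response_rate:.1%}")   # ~54.9%, reported as 55%

# The most frequently chosen facilitator ("available and supported in the
# classrooms in which I typically teach") was selected 37% of the time.
classroom_support_share = 0.37
print(f"Classroom availability/support: {classroom_support_share:.0%} of respondents")
```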

UNC TLT Conference March 2006 The Essence of the LITRE Plan
Scholarly inquiry focused on enhancing the technology-rich learning environment
An investigative process through which new approaches to student learning, using technology, are proposed, vetted, empirically evaluated, and, if the evaluation results so indicate, deployed and routinely assessed
Evidence would be collected and analyzed to inform future projects

UNC TLT Conference March 2006 Critical Infrastructure Needs
Classroom Improvement
Faculty Computing
File Space Quota
Software Licensing
Learning Management Systems
Digital Asset Management
Student E-Portfolios
Technology Support for Students
Faculty Innovation Grants
Information Exchange
Accessibility and Universal Design
Wireless Data Connectivity and Mobile Computing/Communication Systems
Advanced Remote Access Services

UNC TLT Conference March 2006 First Wave Initiatives
Classroom and Laboratory Improvements
– University-wide Classroom Improvement Plan
– Student Group Collaboration (FlySpace)
– ClassTech projects
– SCALE-UP Classroom
Faculty Innovation Grants
– LITRE grants

UNC TLT Conference March 2006 Assessment Cycle
Step 1 – Develop/refine your assessment question
Step 2 – Define assessment methods that will gather evidence
Step 3 – Implement assessment methods, gather evidence, analyze data
Step 4 – Interpret the results in terms of your assessment question
Step 5 – Make decisions based on results and interpretations

UNC TLT Conference March 2006 Step 1 – Develop the Question
“Assessment is the systematic collection, review and use of information about educational programs undertaken for the purpose of improving student learning and development” (Palomba, C. A., & Banta, T. W., Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education, 1999)
Evaluation is “…a broader concept than assessment as it deals with all aspects of a program including resources, staffing, organization, operations, and efficiency” (SAUM website)
What do you want to know?
– Student learning
– Effective use of resources
Why do you want to know that? What are the driving forces?
Level of analysis?
– Improve your course, program, or institution?
– Give information to the Provost for educational improvements?
– To better invest resources?
– Accreditation?

UNC TLT Conference March 2006 LITRE Assessment
Overarching goals
Focused on assessment related to student learning
Assessment methods should be conducted by the PIs of the initiatives and grants
Collaboration – support through the LITRE Assessment Committee

UNC TLT Conference March 2006 Collaboration – LITRE Assessment Committee
Faculty
Assessment Professionals
Computer/Information Technology Professionals
Instructional Designers
Available to help with the assessment design of individual faculty proposals

UNC TLT Conference March 2006 LITRE Goals
Improve student learning
Systematically investigate the effectiveness of technology-based innovations in learning and teaching
Use results to scale our successes, shape future investigations, and inform campus decision making

UNC TLT Conference March 2006 Student Learning Assessed: Four Dimensions Defined
Problem Solving
Empirical Inquiry
Research from Sources
Performance in the Discipline
See LITRE Plan, Appendix A, for definitions.

UNC TLT Conference March 2006 Lessons Learned—Assessment
Obtain baseline data – it helped define our Goals and Initiatives
Develop an Assessment Committee that actively engages the community in discussions
Ask, and ask again: What do we want to learn from assessment? What will it tell us?

UNC TLT Conference March 2006 Lessons Learned: Our Next Steps
Modifications:
– Revisit overarching questions: What pedagogical issues are we trying to solve? How can technology help address challenges?
Framework:
– Developing a framework to have a common understanding of the issues involved (handout)
– Resource: FAQ on the “technology and assessment” website

UNC TLT Conference March 2006 Exercise 1 – 15 minutes
Given that “student learning” is the focus of your question, develop a set of possible questions that could be addressed at the “institutional” level. Examples:
– Learner characteristics: How do we best meet multiple learning styles to affect students’ ability to be engaged with the material via technology?
– Functional use of technology: How does applying “real world” simulations/interactions improve learning? Where is face-to-face really better? Where is virtual better? How are they different?
– Environment: How does giving students different levels of feedback on their work via technology affect how well they think critically?
– Learning space: How do we improve learning in large-enrollment classes?

UNC TLT Conference March 2006 Group Discussion What were your questions? What issues did this activity raise?

UNC TLT Conference March 2006 Assessment Cycle
Step 1 – Develop/refine your assessment question
Step 2 – Define assessment methods that will gather evidence
Step 3 – Implement assessment methods, gather evidence, analyze data
Step 4 – Interpret the results in terms of your assessment question
Step 5 – Make decisions based on results and interpretations

UNC TLT Conference March 2006 Step 2 – Define Methods
Consider your assessment question
Direct measures of student learning
– e.g., use of student work in portfolios, tests, projects
Indirect measures
– e.g., surveys of students, faculty, alumni, employers (see handout)
Methods: plans should include who does what, when, and how, and to whom the information is reported
What are others doing?
– Resource: annotated bibliography

UNC TLT Conference March 2006 Levels of Assessment
Level 0 – Technology Use/Attitudes: How do you map and then assess technology use? What methods can be used, and what challenges are faced, when assessing technology use?
Level 1 – Course Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the course level?
Level 2 – Program Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the program level?
Level 3 – Institutional Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the institutional level?

UNC TLT Conference March 2006 Levels of Assessment
Part of assessment (level 0) is understanding:
– What are we doing with the technology?
– What are people’s attitudes toward the technology?

UNC TLT Conference March 2006 Technology Use/Attitudes
Training
– Who are you training, and on what?
– FY 2004/05: unique participants in regularly scheduled workshops included 87 faculty, 120 staff, and 77 graduate students, with 1271 enrollments. Over 400 participants were served in custom trainings, and 41 attended the Summer Institute.
– July 1, 2005 – Feb. 28, 2006: unique participants so far include 162 faculty, 103 staff, and 97 grad students
– See the workshop schedule and details online

UNC TLT Conference March 2006 Technology Use/Attitudes
Support
– What support are you providing?
– In FY 2005, email, phone, and in-person help calls and consultations were documented, an overall increase of 32% (from an overall total of 2150 documented calls in FY 2004). A total of 2349 Remedy calls and 487 Instructional House Calls were reported during FY 2005.
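A minimal sketch (Python) of the increase arithmetic, assuming the FY 2005 total is simply the sum of the Remedy calls and Instructional House Calls reported on this slide:

```python
# Support-call growth, FY 2004 -> FY 2005, using only the figures on this slide.
# Assumption: the FY 2005 total is the sum of Remedy calls and Instructional House Calls.
fy2004_total = 2150
remedy_fy2005 = 2349
house_calls_fy2005 = 487

fy2005_total = remedy_fy2005 + house_calls_fy2005          # 2836
increase = (fy2005_total - fy2004_total) / fy2004_total    # ~0.319
print(f"FY 2005 documented contacts: {fy2005_total}")
print(f"Increase over FY 2004: {increase:.0%}")            # ~32%, matching the slide
```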

UNC TLT Conference March 2006 Technology Use/Attitudes
Usage
– How do you look at usage? Look at: How many LMS sections? Tool usage? Usage statistics (by department/college)?
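One way to turn raw LMS activity data into the kinds of counts this slide asks about; an illustrative Python sketch in which the record fields (course_section, college, tool) are invented stand-ins for whatever an actual LMS export provides:

```python
# Illustrative aggregation of hypothetical LMS activity records into usage counts.
# Field names (course_section, college, tool) are assumptions, not an actual export format.
from collections import Counter

activity_log = [
    {"course_section": "ENG-331-001", "college": "CHASS",       "tool": "discussion"},
    {"course_section": "ENG-331-001", "college": "CHASS",       "tool": "gradebook"},
    {"course_section": "E-115-002",   "college": "Engineering", "tool": "quiz"},
    {"course_section": "E-115-002",   "college": "Engineering", "tool": "quiz"},
]

sections_in_use = {rec["course_section"] for rec in activity_log}
tool_usage = Counter(rec["tool"] for rec in activity_log)
usage_by_college = Counter(rec["college"] for rec in activity_log)

print(f"Active LMS sections: {len(sections_in_use)}")
print(f"Tool usage: {dict(tool_usage)}")
print(f"Events by college: {dict(usage_by_college)}")
```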

UNC TLT Conference March 2006 Technology Use/Attitudes
What do faculty, staff & students think about the technology?
– Faculty focus groups, student surveys, faculty and staff surveys

UNC TLT Conference March 2006 Levels of Assessment
Level 0 – Technology Use/Attitudes: How do you map and then assess technology use? What methods can be used, and what challenges are faced, when assessing technology use?
Level 1 – Course Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the course level?
Level 2 – Program Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the program level?
Level 3 – Institutional Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the institutional level?

UNC TLT Conference March 2006 Assessment of Individual Faculty Grants – Mostly Course Level (1)
Each Faculty Grant MUST include assessment activities
– “How will I know if I accomplished my goals?”
– “How does the technology and pedagogy affect student learning?”
The PI does the assessment work, but LITRE Assessment Committee members are available to consult with PIs

UNC TLT Conference March 2006 Course Level Assessment (1): ENG 33(x) Technical Writing
Objective: Improved writing on cover letters and resumes
Pedagogy: Used web modules for students to complete before class – devoted more class time to assisting students with writing
Assessment method: Had graduate students review samples from students in the enhanced section and in the “regular” section
Results: Students in the enhanced section performed better on the writing tasks

UNC TLT Conference March 2006 Levels of Assessment
Level 0 – Technology Use/Attitudes: How do you map and then assess technology use? What methods can be used, and what challenges are faced, when assessing technology use?
Level 1 – Course Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the course level?
Level 2 – Program Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the program level?
Level 3 – Institutional Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the institutional level?

UNC TLT Conference March 2006 Program Level Assessment
College of Engineering “Mobile Computing Pilot Program” (Laptop Initiative)
Student learning objective: improved problem solving by students in “laptop sections”
Other objectives: assess faculty workload, technical challenges, and student and faculty satisfaction with “laptop sections”
Courses involved: Intro to Engineering, Calculus sequence, Computer Models in Bio Engineering, Chem E Design I, others

UNC TLT Conference March 2006 Program Level Assessment (cont.)
Pedagogy: “Laptop sections” were set up so that students brought their laptops to class and used them on exercises or activities; regular sections were lectures with separate labs (at a different time)
Methods of assessment: Direct methods used rubrics to score student work on appropriate facets of their work (complexity of programming solutions, grasp of graphical concepts, etc.); indirect methods included surveys
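A minimal sketch (Python) of the kind of direct-method comparison this slide describes; the rubric scores below are invented placeholders, not the pilot program's actual data, and the real analysis would use the project's own scored work and whatever statistical test its evaluators chose:

```python
# Illustrative comparison of rubric scores between "laptop" and regular sections.
# The scores are invented placeholders, not the Mobile Computing Pilot's actual data.
from statistics import mean, stdev

laptop_scores  = [3.8, 4.0, 3.5, 4.2, 3.9, 4.1]   # rubric scores (e.g., 1-5 scale), laptop section
regular_scores = [3.2, 3.6, 3.0, 3.4, 3.5, 3.1]   # rubric scores, regular section

diff = mean(laptop_scores) - mean(regular_scores)
print(f"Laptop section mean:  {mean(laptop_scores):.2f} (sd {stdev(laptop_scores):.2f})")
print(f"Regular section mean: {mean(regular_scores):.2f} (sd {stdev(regular_scores):.2f})")
print(f"Mean difference: {diff:.2f} rubric points")
# A significance test (e.g., a two-sample t-test) would follow before claiming that
# students in laptop sections "significantly outperformed" those in regular sections.
```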

UNC TLT Conference March 2006 Program Level Assessment (cont.)
Results: Students in the “laptop sections” significantly outperformed students in regular sections on many aspects of course work (including visualization of content, programming, and graphics)
Faculty members indicated a need for more training and more mentoring by other faculty members who had previously taught a “laptop section”
The detailed report can be found at …

UNC TLT Conference March 2006 Levels of Assessment
Level 0 – Technology Use/Attitudes: How do you map and then assess technology use? What methods can be used, and what challenges are faced, when assessing technology use?
Level 1 – Course Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the course level?
Level 2 – Program Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the program level?
Level 3 – Institutional Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the institutional level?

UNC TLT Conference March 2006 Assessment Methods for LITRE Goals
Faculty Survey
– Was used in developing LITRE
– Will be conducted periodically to look for improvements and other issues
– Will include faculty perceptions of how student learning improved in the 4 dimensions
Student Surveys
– Student perceptions on the 4 dimensions of student learning
Alumni Survey
Meta-synthesis of LITRE Projects (LITRE-Like Projects) & Grants
– Lessons learned
– Improvement of student learning

UNC TLT Conference March 2006 LITRE Results Related to Student Learning
Senior student survey: the highest increase was seen in the use of computerized exams:
– USE: 23% in 03/04 increased to 30% in 04/05
– LEARNED BETTER: 18% in 03/04 increased to 28% in 04/05
From LITRE projects: only a few results related to student learning so far, because of:
– Time needed for infrastructure
– Time for faculty to incorporate technology into coursework
– Time to develop DIRECT assessment methods of student learning (not relying on just indirect methods such as surveys)
The majority of faculty felt that the pace, variety, and depth of their course had increased and that students were more involved in learning.
Need to modify the question to address the interaction of technology as a tool, faculty pedagogy, and student use of the technology on student learning.

UNC TLT Conference March 2006 Lessons Learned—Assessment Methods
Overestimate time and resources
Get faculty involved in assessment
Provide support and training for assessment
– Helps improve understanding of assessment
Just Do IT!

UNC TLT Conference March 2006 Exercise 2: Assessment Method – 15 minutes
Take one of your questions, and develop an assessment method at each level.
Level 0 – Technology Use/Attitudes: How do you map and then assess technology use? What methods can be used, and what challenges are faced, when assessing technology use?
Level 1 – Course Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the course level?
Level 2 – Program Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the program level?
Level 3 – Institutional Level: What methods can be used, and what challenges are faced, when assessing technology use/outcomes/impact at the institutional level?

UNC TLT Conference March 2006 Group Discussion What were your methods? What were some of the challenges?

UNC TLT Conference March 2006 Wrap-Up
Vision: What is your overall question?
Vision: Pedagogically driven – technology is only one component of a complex framework
Collaboration: Assessment is collaborative
Collaboration: Harvesting and meta-synthesis across multiple levels for institutional decision-making
Leadership: 1-2 champions of the process
Diverse: Assessment is as diverse as our institutions, but should be systematic, systemic, and sustainable
Evolutionary: Assessment is an iterative process

UNC TLT Conference March 2006 Resources
Quality Enhancement Plan for Learning in a Technology-Rich Environment at NC State:
LITRE Goals and Assessment Plan:
LITRE Faculty Survey Report:
Classroom NC State:

UNC TLT Conference March 2006 Resources
Update on Infrastructure Critical Needs Documentation: Annual Reports
Strategic LMS Implementation Committee/DELTA website: ees/delta/slic/slic_committees/Assessment/
Resources on assessment of technology related to student learning:
Internet Resources on Higher Education Outcomes Assessment:

UNC TLT Conference March 2006 Contact Information
Joni E. Spurlin, Ph.D., University Director of Assessment, University Planning and Analysis
Sharon P. Pitt, Associate Vice Provost, Learning Technologies
Allen Dupont, Ph.D., Director of Assessment, Division of Undergraduate Academic Programs
Donna Petherbridge, Director, Instructional Services, Learning Technologies