Evaluating Educational Technology Initiatives: How Do You Know It’s Working?

Introduction
- Learn about – and get the tools for – conducting a successful evaluation
- Set appropriate expectations for instructional technology evaluation
- Reflect on the value that such a process can bring to your district's efforts

- Sun Associates
  - www.sun-associates.com
  - Who we are
  - What we do
  - What's on your USB drive
  - www.sun-associates.com/saevalws
- You?

Framing the Task for Program Evaluation
- Deconstructing "working"
  - Defined by the goals of your project
  - Defined by intent
- Evaluation = a structured process for examining the relationship between intent/actions and outcomes
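To make that relationship between intent and outcomes concrete, here is a minimal sketch (with entirely hypothetical goal, activity, and indicator text) of how a project's intent can be written down explicitly enough to evaluate against: a plan is only evaluable once it names the indicators that define "working."

```python
# Illustrative only: a tiny structure capturing a project's intent.
# The goal, activities, and indicators below are hypothetical examples,
# not content from any actual district plan.
project_intent = {
    "goal": "Students use technology to create and communicate",
    "activities": ["1:1 device rollout", "teacher PD on project-based learning"],
    "indicators": [
        "Students regularly produce multimedia work products",
        "Teachers assign technology-rich projects at least monthly",
    ],
}

def is_evaluable(plan):
    """Intent can only be examined against outcomes if concrete indicators exist."""
    return bool(plan.get("indicators"))

print(is_evaluable(project_intent))                # True: indicators are named
print(is_evaluable({"goal": "Improve learning"}))  # False: nothing to measure against
```

The point of the sketch is simply that evaluation presupposes stated intent: without indicators, there is no defined relationship between actions and outcomes to examine.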

- Very focused projects
- Whole-district technology plans
- The basic process is the same
- Today, we're focusing on a whole-district technology plan, a.k.a. a technology audit

The Sun Associates Process
- An "open-source" approach to technology program evaluation
  - You can modify components of the process as necessary or desired
- A process for examining outcomes as measured against intent
  - The process allows you to reflect on your intent and the anticipated outcomes related to that intent

Sun Associates’ Process

- Mixed methods
  - Qualitative and quantitative
- Stakeholder-based
  - Focused on unique needs/situations
- Open source
  - Allows for incorporation of existing measures/metrics
- Generative of reflection
  - Excellent for ongoing improvement, formative work, and planning
- Produces "answers" for those who need them
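As a rough illustration of what "mixed methods" means in practice, the sketch below (with made-up data, not the workshop's actual instruments) rolls up one indicator from two strands: a quantitative survey average and a tally of themes coded from qualitative interviews.

```python
# Illustrative sketch of a mixed-methods roll-up for a single indicator.
# All data below is invented for the example.
from collections import Counter
from statistics import mean

# Quantitative strand: Likert-style (1-5) survey responses to one question.
survey_responses = [4, 5, 3, 4, 5, 2, 4]

# Qualitative strand: themes coded from interview and focus-group excerpts.
coded_themes = ["access", "support", "access", "time", "access"]

quantitative = round(mean(survey_responses), 2)     # average rating
qualitative = Counter(coded_themes).most_common(2)  # dominant themes

print(f"Mean rating: {quantitative}")
print(f"Top themes: {qualitative}")
```

Neither strand alone tells the story: the number says how much, while the themes suggest why, which is what makes the combination useful for formative work.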

Working an Example Together
- Take a look at the tools…
  - Indicator/question matrix
  - Universe of data
- Finding references in the data
  - Quantitative and qualitative
- What do we think the data says?
  - That's a finding!
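The review step above can be sketched in code. This is not Sun Associates' actual tooling; it is a hypothetical miniature of the idea: an indicator/question matrix, a "universe of data," and a pass that collects the excerpts referencing each question so the team can read them and state a finding.

```python
# Illustrative sketch (hypothetical indicator, questions, keywords, and data).
# One indicator, with its evaluation questions and the keywords that flag
# a data excerpt as relevant to each question.
matrix = {
    "Indicator 1: Teachers integrate technology into daily instruction": {
        "How often do teachers use technology in lessons?": ["daily", "weekly"],
        "Do teachers feel supported in using technology?": ["support", "training", "PD"],
    }
}

# The "universe of data": survey comments, interview excerpts, observations.
universe_of_data = [
    "I use the interactive board daily in math.",
    "More PD would help me plan technology-rich lessons.",
    "Our coach offers great support for new tools.",
]

def find_references(matrix, data):
    """For each question, collect the data excerpts that reference it."""
    findings = {}
    for indicator, questions in matrix.items():
        findings[indicator] = {
            q: [d for d in data if any(k.lower() in d.lower() for k in kws)]
            for q, kws in questions.items()
        }
    return findings

findings = find_references(matrix, universe_of_data)
for indicator, qs in findings.items():
    for question, evidence in qs.items():
        print(f"{question} -> {len(evidence)} reference(s)")
```

In the real process the "matching" is human judgment, not keyword search; the sketch only shows the shape of the work: question by question, gather the evidence, then decide what the data says.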

Now You Do It
- Four teams
  - Teams will model the data review and analysis process for a single indicator
  - Today, two teams per indicator
  - In "real life," your committee would need to do this for all four indicators
- Review your team's indicator, questions, and data
- Develop basic findings
- 40 minutes!

Pair-Share
- Find the other team that worked on your indicator
- 15 minutes to pair-share what you found
  - What are the findings you developed for your indicator?

Expanding the Process
- Modifying the indicators
  - Connecting the indicators to your specific project outcomes
- Using different/additional questions
  - Sun Associates' Question Bank
- Weaving in existing tools and data

Wrap-up
- For additional assistance…
  - Sun Associates' evaluation support website
  - What's coming there next
- Jeff Sun
  - ext. 204