
1 How do we know it works? Evaluating Learning Technology Projects EDUCAUSE Learning Initiative Seminar Clare van den Blink cv36@cornell.edu

2 Seminar Goals Seminar participants will be able to…
- Select project goals that can be evaluated.
- Identify relevant indicators to evaluate project goals.
- Identify data collection methods.
- Assess considerations for developing evaluation activities.
- Create an evaluation plan for an academic technology project that is tied to their project assumptions and strategies.

3 Please indicate the type of projects you’re interested in evaluating.

4 What are the key challenges in fully evaluating your projects?

5 A Framework
1. Project Goals
2. Focus of Evaluation
3. Evaluation Design
   - Overview
   - Indicators: measures of “success”
   - Data collection: methods, population, procedures
4. Timeline
5. Data Analysis
6. Reporting Findings

6 Introduction How this process was developed… “To evaluate the effectiveness of the technology enhancement and its impact on student learning” … within the constraints of staff, time, and budget limitations.

7–10 Importance-Complexity [Diagram, built up across four slides: a matrix plotting Staff Effort against Complexity of Evaluation, each axis running Low to High. Three example projects sit at increasing complexity: a small project with a technology intervention (evaluated with an interview and a survey); an LMS pilot project informing service decisions (interviews, surveys, tech review, usability testing); and quasi-experimental research on a technology intervention (interviews, surveys, observations, a control group).]

11 To inform the evaluation: Is the project based on prior research? What led to the development of the project? Literature review: what will inform the evaluation? What assumptions underlie the strategies & technologies selected?

12–14 Selecting Goals… Since not all project goals can be evaluated within the timeframe of the project, or may be difficult to measure: How can you SELECT goals that can be evaluated within the scope of the project? How would you PRIORITIZE the goals that are the most critical to evaluate?

15 Sample Goals: Review examples of goals and identify goals that could be evaluated within the project constraints.
EXAMPLE #1
Instructional Goals:
1.) Encourage active participation and critical thinking skills by using video clips of mainstream movies to initiate class discussions.
2.) Encourage student involvement and active learning by creating a mechanism for students to record interviews in the field (part of a class assignment).
3.) Create a repository of student-collected audio interviews for ongoing use in the curriculum. Audio clips will be used to illustrate the diversity of public education experiences.
Other Project Goals:
4.) Develop workflow and documentation for student recording of audio interviews and video clip processing.
5.) Choose, create, and provide an archiving mechanism for cataloguing clips.

16 Sample Goals: Review examples of goals and identify goals that can be evaluated within the project constraints.
EXAMPLE #2
Instructional Goals:
A.) Students will be able to practice application of fluid therapy, under various conditions, employing a unique computer-based simulation.
B.) Students will be able to interpret symptoms presented in a sick dog, select an appropriate treatment, administer fluids, monitor the patient's reaction, and modify the treatment plan accordingly.
C.) Case simulation will enable students to experience clinical variability in a manner similar to hands-on practice.
Other Project Goals:
D.) Simplify the creation of a set of teaching models, or prototypes, that are the basis of the cases.
E.) Create a method for generating unique computer-based cases that build from the prototypes.
F.) Provide a method for saving case data for comparison.

17

18 Developing the evaluation plan

19–23 Evaluation Process [Diagram, built up across five slides: Identify Project Goals → Focus of the Evaluation (what goals will be evaluated?) → Indicators (what type of data to collect?) → Methods (how will data be collected?) → Population. Method examples shown around the diagram: survey, observation, interviews, focus groups, data analysis, case study.]

24–26 Process: Select Goals… Focus
- What goals will be the FOCUS of the evaluation? What is feasible to evaluate in the project's timeframe?
- Identify what INDICATORS can be used to collect data, both indirect and direct measures.
Project Goals:
1. Use a Personal Response System (polling) with questions to encourage critical thinking & student engagement.
2. Use PowerPoint presentations with interactive lecture.
3. Implement use of a Tablet PC for annotating presentations for visual and interactive lectures.
Focus of the Evaluation: Use a Personal Response System (polling) with questions to encourage critical thinking & active participation (student engagement).
Evaluation Process
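One direct, low-effort INDICATOR for the polling goal is participation itself. A minimal sketch of computing a per-session response rate, assuming a hypothetical export of clicker counts (the session names and numbers are illustrative, not from the seminar):

```python
# Hypothetical clicker (Personal Response System) counts per class session.
sessions = {
    "Sept 12": {"enrolled": 120, "responses": 103},
    "Sept 19": {"enrolled": 120, "responses": 96},
}

# Response rate is a direct measure of active participation;
# pair it with survey items for the indirect (perception) side.
for date, s in sessions.items():
    rate = s["responses"] / s["enrolled"]
    print(f"{date}: {rate:.0%} of enrolled students responded")
```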

27 Evaluation Focus: Example 1
Formative: Since the project is assisting with the development of online modules, a formative evaluation of the modules will be conducted to look at the interface design, navigation, usability, organization and presentation of content, and the usefulness for student learning. The key focus will be on the “functionality” of the module.
- Interface / Navigation / Design (not important in phase I)
- Technology performance: test across browsers, OS, distance, etc.
- Organization / presentation of content
- Use of images and illustrations
- Learning objectives**
Summative: (part of the overall program evaluation) Since the project is assisting with the development of web-based modules, the summative evaluation will examine the effect on student perception of learning from the implementation of instructional technology in the course. Measures of success may include the student perception of greater ease in learning difficult concepts, and positive feedback about new modules.
Evaluation Process

28 Select Methods Develop data collection METHODS for the indicators, such as surveys, interviews, observations, etc. What method(s) would you select to evaluate the focus area, the “functionality” of the online module?
1. Surveys
2. Interviews
3. Observation
4. Log analysis
5. All of the above
6. Other – post in chat
Evaluation Process
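If log analysis is among the methods chosen, even a short script can turn raw usage logs into functionality indicators. A minimal sketch, assuming a hypothetical LMS log export with student_id, page, and timestamp columns (actual export formats vary by system):

```python
import csv
from collections import Counter

# Count how often each student opened the online module,
# using a hypothetical export: student_id,page,timestamp
visits = Counter()
with open("module_access_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["page"].startswith("module"):
            visits[row["student_id"]] += 1

# Two simple usage indicators: reach and intensity.
print(f"Students who used the module: {len(visits)}")
if visits:
    print(f"Average visits per student: {sum(visits.values()) / len(visits):.1f}")
```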

29 Select Methods Develop data collection METHODS for the indicators, such as surveys, interviews, observations, etc. What method(s) would you select to evaluate how well the online module met the “learning objectives”?
1. Surveys
2. Interviews
3. Observation
4. Log analysis
5. All of the above
6. Other – post in chat
Evaluation Process

30

31 Other considerations
- Identify the population from which the data will be collected. Can you contact this group?
- Identify other data sources, such as logs, documents, etc.
- Are there considerations for human subjects research & informed consent on your campus? For example, when using grades as data, what permissions are necessary?
Evaluation Process

32 Timelines: How much time do you have? Need? [Timeline diagram: Fall: project development and project evaluation planning. Aug–Dec (fall semester): project implementation and implementation of the project evaluation. Dec–Jan: project transition & closeout, evaluation data analysis & reports.]
Evaluation Process

33–34 Evaluation Timeline Develop an initial timeline & staffing effort.
Develop evaluation plan: September 1
Create an observation protocol: September 15
Observe EDU 271 during two class sessions: Sept–October
Complete the student survey (instrument): October 1, 2005
Create an interview protocol: October 15
Administer student survey: Mid-November
Conduct student interviews: End of November
Conduct data analysis: December–January
Complete evaluation report: February 1
Where reality meets ideal evaluation methods…
Evaluation Process

35

36 Implementing the Plan & Reporting Results

37 Implementing Methods
Surveys:
- Identify or develop questions.
- Do survey questions map to indicators?
- Survey distribution & associated permissions.
Interviews:
- Develop interview questions & protocols.
- Schedule and conduct interviews.
Resources about quantitative and qualitative methods can guide development and implementation of methods and data analysis.
Evaluation Process
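One way to keep the “do survey questions map to indicators?” check honest is to record the mapping explicitly and test it. A minimal sketch with hypothetical indicator and question names (not from the seminar's instruments):

```python
# Hypothetical indicators and their survey-question coverage;
# every indicator should be measured by at least one question.
indicators = {"ease_of_navigation", "content_organization", "perceived_learning"}

question_map = {
    "Q1: The module was easy to navigate.": "ease_of_navigation",
    "Q2: Content was presented in a logical order.": "content_organization",
    "Q3: The module helped me understand difficult concepts.": "perceived_learning",
}

missing = indicators - set(question_map.values())
if missing:
    print("Indicators with no survey question:", sorted(missing))
else:
    print("Every indicator is covered by at least one question.")
```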

38 Analysis & Reporting DATA ANALYSIS What type of analysis will be completed?
Quantitative: survey analysis. Example: “Overall, I am satisfied with the use of instructional technology in this course.” Mean = 1.2
1 Strongly Agree: 82%
2 Agree: 18%
3 Neutral: 0
4 Disagree: 0
5 Strongly Disagree: 0
Qualitative: interview analysis based on interview protocols. “I interviewed Prof. X about her experience with the email simulation…”
Evaluation Process
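The quantitative example above can be reproduced in a few lines. A sketch assuming the usual Likert coding of 1 (Strongly Agree) through 5 (Strongly Disagree), with made-up raw responses consistent with the percentages on the slide:

```python
from collections import Counter

labels = {1: "Strongly Agree", 2: "Agree", 3: "Neutral",
          4: "Disagree", 5: "Strongly Disagree"}

# Hypothetical responses matching the 82% / 18% split (n = 50).
responses = [1] * 41 + [2] * 9

counts = Counter(responses)
n = len(responses)
print(f"Mean = {sum(responses) / n:.1f}")  # -> Mean = 1.2
for code, label in labels.items():
    print(f"{code} {label}: {counts[code] / n:.0%}")
```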

39 Analysis & Reporting Evaluation Process

40

41 Constraints

42 Staffing How can all of this planning be completed within limited staff hours, while maintaining the INTEGRITY of the evaluation process? Evaluation Considerations & Tools

43 Do you think it is feasible to re-train staff for evaluation?

44 Staffing How can all of this planning be completed within limited staff hours, while maintaining the INTEGRITY of the evaluation process?
- How can staff be trained in this process without having a deep evaluation background?
- What staff skills might be adapted?
- Other campus resources?
Evaluation Considerations & Tools

45 What type of existing skills could be adapted for evaluation?

46 Supporting Tools
- Have an overall evaluation plan template that can be adapted to other projects.
- Informed consent templates.
- Use common interview/observation protocols.
- Develop question banks for survey questions, e.g. for common project types:
  - Use of video in course presentations and lectures
  - Use of online instructional tutorials
  - Use of presentations in lecture
Evaluation Considerations & Tools
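A question bank can start as something as plain as a dictionary keyed by the project types listed above, so each new evaluation reuses vetted wording instead of drafting from scratch. A minimal sketch; the keys and question texts are illustrative assumptions:

```python
# Hypothetical reusable survey question bank, keyed by project type.
QUESTION_BANK = {
    "video_in_lectures": [
        "The video clips helped me engage with the lecture material.",
        "Discussions around the video clips deepened my understanding.",
    ],
    "online_tutorials": [
        "The online tutorial was easy to navigate.",
        "The tutorial helped me learn difficult concepts.",
    ],
    "presentations_in_lecture": [
        "The annotated presentations made lecture content clearer.",
    ],
}

def build_survey(project_type: str) -> list[str]:
    """Return the reusable questions for a given project type."""
    return QUESTION_BANK.get(project_type, [])

print(build_survey("online_tutorials"))
```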

47–48 Question banks [screenshots of sample survey question banks]
Evaluation Considerations & Tools

49 Summary

50 Consider…  How can this METHODOLOGY be applied to your projects and institution?  How VIABLE is this as an evaluation methodology for your projects? When does a more ROBUST process need to be put in place?

51–52 A Framework
1. Project Goals
2. Focus of Evaluation
3. Evaluation Design
   - Overview
   - Indicators: measures of “success”
   - Data collection: methods, population, procedures
4. Timeline: what, when, how
5. Data Analysis
6. Reporting Findings
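The six framework parts map naturally onto the kind of reusable plan template slide 46 recommends. A sketch of one possible structure; the field names and sample values are assumptions, not the seminar's actual template:

```python
from dataclasses import dataclass

# Hypothetical plan template mirroring the six framework parts above.
@dataclass
class EvaluationPlan:
    project_goals: list[str]
    evaluation_focus: list[str]      # subset of goals actually evaluated
    indicators: list[str]            # measures of "success"
    data_collection: dict[str, str]  # method -> population & procedure
    timeline: dict[str, str]         # task -> target date
    analysis_and_reporting: str = "TBD"

plan = EvaluationPlan(
    project_goals=["Use polling to encourage critical thinking & engagement"],
    evaluation_focus=["Student engagement with polling questions"],
    indicators=["Per-session response rate", "Survey items on engagement"],
    data_collection={"survey": "all enrolled students, end of semester"},
    timeline={"Develop evaluation plan": "September 1"},
)
print(plan.evaluation_focus)
```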

53 Questions? cv36@cornell.edu

