Data Collection Techniques For Technology Evaluation and Planning (TA109)


1 Data Collection Techniques For Technology Evaluation and Planning (TA109)

2
- Jeff Sun (jsun@sun-associates.com)
- Octavio Munist (omunist@sun-associates.com)
- www.sun-associates.com
- www.sun-associates.com/necc2006

3 Objectives
- To understand the role of data collection in an overall evaluation process
- To review various data collection tools and strategies
- Others?

4 Why Evaluate?
- To fulfill program requirements
  - NCLB, and hence Title IID, carries evaluation requirements
  - Most other state and federal proposals require an evaluation component
    - Not simply a statement that "we will evaluate," but actual information on who will evaluate, the evaluation questions, and the methodologies
- Project sustainability
- Generation of new and improved project ideas

5 By Definition, Evaluation…
- Is both formative and summative
- Helps clarify project goals, processes, and products
- Should be tied to indicators of success written for your project's goals
- Is not a "test" or simply a checklist of completed activities
- Qualitatively, are you achieving your goals?
- What adjustments can be made to your project to realize greater success?

6 A Three-Phase Evaluation Process
- Evaluation Questions
  - Tied to original project goals
  - Indicator rubrics
  - Allow for authentic, qualitative, and holistic evaluation
- Data Collection
  - Tied to indicators in the rubrics
- Scoring and Reporting
  - Role of the evaluation committee
(pg. 5 in workbook)

7 Who Evaluates?
- Committee of stakeholders
- Outside facilitator?
- Data collection specialists?
- Task checklist (page 6)

8 Data Collection vs. Evaluation
- Evaluation is more than data collection
- Evaluation is about…
  - Creating questions
  - Creating indicators
  - Collecting data
  - Analyzing and using data
- Data collection occurs within the context of a broader evaluation effort

9 An Iterative Process
- Evaluation breaks your vision down into increasingly observable and measurable pieces.

10 Goals Lead to Questions
- What do you want to see happen?
  - These are your goals
  - Rephrase goals into questions
- Achieving these goals requires a process that can be measured through a formative evaluation

11 …And Then to Indicators
- What is it that you want to measure?
  - What are the conditions of success, and to what degree are those conditions being met?
  - By what criteria should performance be judged?
  - Where should we look, and what should we look for, to judge performance success?
  - What does the range in the quality of performance look like?
  - How should different levels of quality be described and distinguished from each other?

12
- Indicators should reflect your project's unique goals and aspirations
  - Rooted in the proposed work
  - Indicators must reflect your own environment: what constitutes success for you might not for someone else
  - Indicators need to be highly descriptive and can include both qualitative and quantitative measures
- You collect data on your indicators

13 Evidence?
- Classroom observation, interviews, and work-product review
  - What are teachers doing on a day-to-day basis to address student needs?
- Focus groups and surveys
  - Measuring teacher satisfaction
- Triangulation with data from administrators and staff
  - Do other groups confirm that teachers are being served?

14 Data Collection
- Review existing data
  - Current technology plan
  - Curriculum
  - District/school improvement plans
  - Others?

15 Tools and Techniques
- Surveys
- Interviews
- Observations
- Artifact Analysis

16 Surveys
- Online vs. paper
  - Is there sufficient connectivity?
  - Often works best if everyone completes the instrument at the same time
  - The same goes for paper surveys
- Online surveys provide immediate data
  - Results arrive as spreadsheets that can be exported to a variety of programs for analysis
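Because online survey results typically export as spreadsheet/CSV files, tallying responses takes only a few lines of standard-library Python. A minimal sketch; the column names in the sample export below are hypothetical, not from any particular survey tool:

```python
import csv
import io
from collections import Counter

def tally(csv_text: str, question: str) -> Counter:
    """Count the answers given to one survey question in exported CSV data."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row[question] for row in reader if row.get(question))

# Hypothetical export: one row per respondent, one column per question.
sample = """respondent,school,uses_tech_daily
1,East,yes
2,East,no
3,West,yes
"""
```

For example, `tally(sample, "uses_tech_daily")` yields `Counter({'yes': 2, 'no': 1})`, the kind of quick breakdown you can bring straight to an evaluation committee.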

17 Surveys
- Online instruments
  - VIVED
  - Profiler
  - LoTi
  - Zoomerang
  - SurveyMonkey.com

18 Make Your Own!
- www.sun-associates.com/surveyws/surveys.html
- www.sun-associates.com/necc2006/datacoll/necc06samp.html
- Based on a CGI script on your web server
- Outputs to a text file readable by Excel
- Works with yes/no, choose-from-a-list, and free-text input (no branching)
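The shape of such a script is straightforward. Below is a minimal sketch of the kind of CGI handler the slide describes, appending each submission to a tab-delimited text file that Excel can open; the field names and output path are illustrative assumptions, not the presenters' actual script:

```python
#!/usr/bin/env python3
"""Sketch of a CGI survey handler: read a URL-encoded form POST and
append one tab-delimited row per submission to a text file."""
import os
import sys
import urllib.parse

RESULTS_FILE = "survey_results.txt"  # hypothetical output path
# Hypothetical survey fields; one column per field in the output file.
FIELDS = ["school", "grade", "uses_tech_daily", "comments"]

def format_row(query: str) -> str:
    """Turn a URL-encoded form body into one tab-delimited line."""
    data = urllib.parse.parse_qs(query)
    # Missing fields become empty cells; embedded tabs are flattened.
    return "\t".join(
        data.get(f, [""])[0].replace("\t", " ") for f in FIELDS
    )

if __name__ == "__main__":
    length = int(os.environ.get("CONTENT_LENGTH") or 0)
    row = format_row(sys.stdin.read(length))
    with open(RESULTS_FILE, "a", encoding="utf-8") as out:
        out.write(row + "\n")
    print("Content-Type: text/plain\r\n\r\nThank you. Responses recorded.")
```

One row per respondent and tab delimiters keep the Excel import trivial, which matches the "outputs to a text file" design above.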

19 Survey Considerations
- There is a genuine science to survey creation
- Issues of validity in both questions and instruments
- Sample size
  - Attaining a sufficiently large sample to test validity
  - Piloting
- These are all reasons to use surveys that have already been validated
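To make the sample-size point concrete: Cochran's formula (a standard statistical result, not from these slides) estimates how many respondents you need to measure a proportion at a given confidence level and margin of error, with a finite-population correction for a district of known size:

```python
import math

def cochran_sample_size(population: int, margin: float = 0.05,
                        z: float = 1.96, p: float = 0.5) -> int:
    """Sample size needed to estimate a proportion within `margin` at the
    confidence implied by `z` (1.96 ~ 95%), corrected for a finite
    population. p = 0.5 is the most conservative assumption."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)       # finite-population correction
    return math.ceil(n)
```

For a district of 400 teachers, `cochran_sample_size(400)` gives 197 — roughly half the staff — which is why "nearly 100% response" campaigns matter more in small districts than generic sample-size rules of thumb suggest.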

20 Survey Tips
- Keep them short (under 10 minutes)
- Avoid long checklists
- Allow for text comments
- Support anonymity
  - But allow for categorical identification: school, job function, grade, etc.

21
- Coordinate and support survey administration
  - Avoid the "mailbox stuffer"
  - Work with building leaders
  - Provide clear deadlines

22 Three Big Points
- Surveys alone mean nothing
  - Triangulate!
- A 100% response rate is virtually impossible
  - On the other hand, nearly 100% is very possible if you follow our tips!
- Share the data
  - No one wants to fill in forms for no purpose

23 Interviews
- Serve to back up and triangulate survey data
- Less anonymous than surveys
  - A mixed blessing…
- Allow for immediate follow-up on interesting findings

24 Interviewing Tips
- Keep groups as homogeneous as feasible
  - By grade, job function, etc.
- Be attentive to power structures
  - Don't mix principals with teachers, tech coordinators with teachers, or central office staff with principals

25
- Use outside interviewers
  - People will explain things to us (because they have to!)
  - We avoid the power-structure issues
  - We've done this before
- Structure and focus the interviews
  - Use a well-thought-out, well-designed protocol
  - Only diverge after you've covered the basic question

26 Three Big Points
- Create protocols after you've seen survey data
- Homogeneity and power
- Use outsiders to conduct your interviews

27 Observations
- The third leg of your data triangle
  - Surveys - Interviews - Observations
- Familiar yet different
  - You've done this before…but not quite
- Progressively less "objective" than surveys and interviews

28 Observation Tips
- Ensure that teachers understand the point and focus of the observations
  - You're evaluating a project, not individuals!
- Sample
  - You can't "see" everything
  - So think about your sample
- You can learn as much from an empty classroom as from an active one
  - Look at the physical arrangement of the room
  - Student materials
  - How is this room being used?

29
- Outside observers are necessary unless you simply want to confirm what you already know
- Avoid turning observations into a "technology showcase"
  - Showcases have their place, mostly for accumulating and reviewing "artifacts"
  - But the point of observations is to take a snapshot of the typical school and teacher

30 Three Big Points
- Observe the place as well as the people
- Observations are not intended to record the ideal…rather, the typical
- Use outside observers

31 Artifact Analysis
- Reviewing "stuff"
  - Lesson plans
  - Teacher materials
  - Student work
- Create an artifact rubric
  - Not the same as your project-evaluation indicator rubric
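An artifact rubric can be as simple as named criteria with ordered performance levels. The sketch below is a hypothetical illustration of that structure; the criteria and level names are invented for the example, not the presenters' rubric:

```python
# Hypothetical artifact rubric: each criterion maps to ordered levels,
# lowest to highest (level 1 through level 4).
RUBRIC = {
    "alignment_to_goals": ["none", "partial", "substantial", "full"],
    "student_tech_use":   ["none", "passive", "guided", "independent"],
}

def score_artifact(ratings: dict) -> float:
    """Average level (1-4) across all rated rubric criteria."""
    levels = [RUBRIC[crit].index(level) + 1 for crit, level in ratings.items()]
    return sum(levels) / len(levels)
```

For instance, a lesson plan rated "substantial" alignment and "guided" student use scores (3 + 3) / 2 = 3.0. Keeping the levels descriptive (rather than bare numbers) preserves the qualitative character the slides emphasize while still letting you aggregate across many artifacts.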

32 Data Dissemination
- Compile the report
- Determine how to share the report
  - School committee presentation
  - Press releases
  - Community meetings

33 10 Tips for Data Collection
- Ground your data collection within a larger evaluation framework
  - Know why you're asking what you're asking
- Not all data is quantitative
- Cast a wide net
  - It's all about stakeholders
- Dig deep
  - Try to collect the data that can't easily be observed or counted

34
- Use confirming sources
  - Triangulate! Surveys alone do nothing.
- Have multiple writers
  - Stakeholders and different perspectives
- Think before you collect
  - Choose questions carefully, with regard to what you really expect to find

35
- Set (reasonable) expectations for participation
  - Time and effort
- Forget about mailbox surveys
  - They usually waste more time than they're worth
- Report back
  - Don't be a data-collection black hole!

36 More Information
- jsun@sun-associates.com
  - 978-251-1600 ext. 204
- omunist@sun-associates.com
- www.sun-associates.com/evaluation
- www.sun-associates.com/necc2006
- www.edtechevaluation.com


