
Gwynn Mettetal

• Discuss different ways to assess outcomes
• Help you decide which methods would be best for your project

• Who are you?
• What sort of Vision 2020 project are you planning? (the two-sentence version)

• Assessment: evidence that your project is making a difference
• You MUST assess the effectiveness of your Vision 2020 grant to get continued funding!
• Many assessment strategies are possible
• The right choice depends on your goals and your situation

• Quantitative (numbers)
◦ Grades, attendance, ratings on a scale, retention rate
• Qualitative (words)
◦ Interviews, essays, open-ended survey questions
• Both are fine, just different

• Existing data (easiest; it is already there)
◦ Student records
◦ Archival data
◦ Student work in the course
• Conventional sources (easy, but you must generate the data)
◦ Behavioral data: journals, library usage
◦ Perceptual data: surveys, focus groups, interviews
• Inventive sources (difficult)
◦ Products or performances

• Must treat students respectfully
• Must protect privacy
• Must "do no harm"
• Collecting new data (not coursework) from your own students?
◦ Have someone else collect it and hold it until grades are in
◦ You can't force students to participate
◦ Don't take up too much instruction time
• Institutional Review Board (IRB)
◦ Required if you plan to publish

• Add power: compare groups! (see the sketch after this list)
◦ Before and after
◦ Different course units
◦ This semester and last
◦ Two sections taught with different methods
◦ Your class to that of another instructor
• Be realistic: start small
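A minimal sketch of a before-and-after comparison in Python, assuming scipy is installed; the scores below are made-up placeholder values, not real data.

```python
# Before/after comparison for the same students (paired t-test).
# All scores here are fabricated placeholders for illustration only.
from scipy import stats

before = [62, 70, 55, 68, 74, 60, 66, 71]  # quiz scores at start of unit
after  = [75, 78, 64, 80, 79, 70, 73, 82]  # quiz scores at end of unit

# Paired t-test, because each student appears in both lists.
t_stat, p_value = stats.ttest_rel(after, before)

mean_gain = sum(a - b for a, b in zip(after, before)) / len(before)
print(f"mean gain: {mean_gain:.1f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

For two different sections (independent groups rather than the same students twice), `stats.ttest_ind` would be the analogous call.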

• Validity: does your evidence (data) mean what you think it means?
• Example: do test scores = deep learning?
◦ What if they reflect only rote memory?
◦ What if students cheated?

• Reliability: would you get the same evidence if you collected it again, or was this just a fluke?
• Example: do test scores = deep learning?
◦ What if you gave the test again next week and the scores were very different?

• In general, it is hard to have both
• Real life is messy (valid, but not as reliable)
• Experiments are controlled (reliable, but not as valid)
• The solution is...

• Get several different types of data
• Different sources
◦ Instructors, students, advisors, records
• Different methods
◦ Surveys, observations, student work samples
• Different times
◦ Start and end of semester, two different classes, two different semesters

Example:
• Course evaluations
• Final project rubric
• Comparison to last semester's class

What data could YOU collect?

• Qualitative analyses: look for themes in words and behaviors
• Theme 1: Students understood more abstract concepts after group discussion. (Follow with quotes from student exams and other evidence.)
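A minimal sketch of tallying theme codes in Python, assuming you have already read each response and tagged it with theme labels by hand; the students and theme names below are hypothetical.

```python
# Count how often each hand-assigned theme appears across responses.
# Student IDs and theme labels are hypothetical examples.
from collections import Counter

coded_responses = [
    ("S1", ["grasped abstract concepts", "discussion helped"]),
    ("S2", ["discussion helped"]),
    ("S3", ["grasped abstract concepts", "wanted more examples"]),
    ("S4", ["discussion helped", "wanted more examples"]),
]

theme_counts = Counter(t for _, themes in coded_responses for t in themes)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} of {len(coded_responses)} students")
```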

• Quantitative analyses: simple graphs and tables
• Simple statistics: means, correlations, t-tests (see the sketch below)
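A minimal sketch of those simple statistics using only Python's standard library (`statistics.correlation` requires Python 3.10 or later); all numbers are invented for illustration.

```python
# Means for two sections, plus a correlation, using invented scores.
import statistics

section_a = [78, 85, 70, 90, 82, 76]
section_b = [72, 80, 68, 75, 79, 71]

print(f"Section A mean: {statistics.mean(section_a):.1f}")
print(f"Section B mean: {statistics.mean(section_b):.1f}")

# Pearson correlation between attendance counts and Section A scores.
attendance = [12, 14, 9, 15, 13, 10]
print(f"attendance/score correlation: "
      f"{statistics.correlation(attendance, section_a):.2f}")
```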

Focus on practical significance more than on statistical significance: with enough students, even a tiny, unimportant difference can come out statistically significant.
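One common way to judge practical significance is an effect size such as Cohen's d; here is a minimal sketch using the same invented section scores as above.

```python
# Cohen's d: standardized mean difference between two independent groups.
# Rough conventions: ~0.2 small, ~0.5 medium, ~0.8 large effect.
import statistics

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * statistics.variance(group1) +
                  (n2 - 1) * statistics.variance(group2)) / (n1 + n2 - 2)
    return (statistics.mean(group1) - statistics.mean(group2)) / pooled_var ** 0.5

section_a = [78, 85, 70, 90, 82, 76]
section_b = [72, 80, 68, 75, 79, 71]
print(f"Cohen's d: {cohens_d(section_a, section_b):.2f}")
```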

What would convince YOU?

• If the evidence was good, keep your old strategy
• If the evidence was weak, tinker to improve your strategy
• Plan to assess again after working with a new group of students
• You will need to show how you used your data to get continued Vision 2020 funding!