1 Identifying Instruction-Related Research Issues Deborah Lines Andersen School of Information Science and Policy University at Albany June 26, 2004.

2 A Research Story The research problem Methodology decisions Respondent choices The “so what?” Data collection Data analysis Results in action

3 Transformation Turning an idea into a workable research process Using standard research methods Relying upon others for feedback and guidance (and creating “buy in”) Piloting before conducting research Assessing the costs and benefits up front

4 An Idea We want to find out if we are offering the right courses in the library. Should we be offering different courses? Which ones? How often and when? Should we use budget dollars to develop new curricula?

5 Asking the Right Questions Of ourselves Of our staff Of our patrons Of our administrators Of our collections and services Of our educational materials Of other units in our organization

6 Asking the Right Questions How will this research be used? –Formative evaluation? –Summative evaluation? –Change internal to the organization? –Dissemination external to the organization? –Allocation of existing resources? –Justification for new resources?

7 What Do We Want to Assess?

8 What Do You Want to Assess? Create a one-sentence statement of the problem that you are addressing Create a second sentence describing why this is an important problem Write a third sentence explaining how doing this research will help your organization, your unit, and/or your users

9 An Idea (Again) We want to find out if we are offering the right courses in the library. Should we be offering different courses? Which ones? How often and when? Should we use budget dollars to develop new curricula?

10 The Transformation The problem: The university library needs a systematic review of its course offerings (low attendance, poor post-class evaluations). The importance: With finite staff and funding, the library wants to offer the best courses with its existing resources. The benefit(s): Appropriate courses will attract more students and faculty, making more patrons better users of the library's services.

11 Who Has the Information You Need for Assessment?

12 Who Has the Information You Need for Assessment? Find the right respondent pool Worry about generalizability and statistical significance (qualitative vs quantitative) Look at (some or all?) –Learners/patrons –Teachers/librarians –Organizations/administrators
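The trade-off between pool size and generalizability can be made concrete. A minimal Python sketch of the standard sample-size formula for estimating a proportion (95% confidence, ±5% margin, worst-case p = 0.5, with finite-population correction; the 10,000-patron pool is an illustrative assumption, not a figure from the talk):

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Required sample size for estimating a proportion,
    with finite-population correction for a known pool size."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # corrected for the pool

# e.g. a patron pool of 10,000 at ±5% and 95% confidence
print(sample_size(10_000))  # 370
```

Note how weakly the answer depends on the pool: a pool of a billion still needs only 385 respondents at the same precision.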

13 How Will You Conduct This Research? (Methodology)

14 How Will You Conduct This Research? (Methodology) Assess the best way (time, labor, and cost) to collect the necessary data Consider the “usual” methods: –surveys –interviews –focus groups –secondary data analysis

15 How Will You Conduct This Research? (Methodology) Consider partnerships –Other librarians (internal and external) –Seminars to discuss research old and new –Graduate students as staff –PhD students as researchers –Faculty and other staff (IT, research agencies) –Schools of library and information science

16 A Methodology Statement In order to assess the appropriateness of the library's course offerings, a paper survey will be developed, piloted, administered to 400 undergraduates in their required English classes, and analyzed between September 2004 and April 2005 to determine the effectiveness of present classes and the need for new and/or revised course offerings.
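As a rough check on the 400-respondent figure above, treating the survey as a simple random sample (an assumption; sampling whole English classes is really cluster sampling, so the true margin is somewhat wider) gives the worst-case margin of error:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case (p = 0.5) margin of error for a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# The 400-student survey above, if treated as a simple random sample:
print(f"±{margin_of_error(400):.1%}")  # ±4.9%
```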

17 What Questions Should You Ask?

18 What Questions Should You Ask? (Think “Parsimony”) Use existing data collection documents Pilot test new materials Think about analysis issues (charts, stats) Check for bias and double-barreled (“doubled-up”) questions Think about length of individual data collection instrument (response rates) Check logical order of questions
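Thinking about analysis up front can be as simple as sketching the tabulation before fielding the instrument. A minimal Python sketch, using invented responses to a single hypothetical Likert item:

```python
from collections import Counter

# Hypothetical responses to one Likert item ("The class met my needs"),
# coded 1 (strongly disagree) .. 5 (strongly agree).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5, 1, 4]

counts = Counter(responses)
n = len(responses)
for code in range(1, 6):
    pct = 100 * counts[code] / n
    print(f"{code}: {counts[code]:2d} ({pct:4.1f}%)")
print("mean:", round(sum(responses) / n, 2))
```

If the tabulation is awkward to write down, the question probably needs rewording before it goes into the survey.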

19 Does This Research Meet Organizational Standards?

20 Does This Research Meet Organizational Standards? Research review boards and procedures Human subjects review Research ethics Research involving minors Confidentiality and anonymity Access to and disposal of data Classes on human subjects and ethics

21 How Long Will It Take to Do the Research?

22 How Long Will It Take to Do the Research? Creating committees Creating methodological protocols Securing organizational approval Securing funding (internal or external) Contacting respondents Waiting for respondents Coding, analyzing, and reporting findings

23 A Technology Note on Data Analysis Quantitative Analysis –By hand –Excel –SPSS … Qualitative Analysis –By hand –ATLAS.ti; NUD*IST; …
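"By hand" need not mean pencil and paper: for a small evaluation, Python's standard statistics module covers the descriptive pass before reaching for Excel or SPSS. The course names and scores below are invented for illustration:

```python
import statistics

# Hypothetical post-class evaluation scores (1-5) for two course offerings.
intro_search = [3, 4, 4, 5, 2, 4, 3, 5]
citation_mgmt = [2, 3, 2, 4, 3, 2, 3, 3]

for name, scores in [("intro_search", intro_search),
                     ("citation_mgmt", citation_mgmt)]:
    print(name, "mean:", round(statistics.mean(scores), 2),
          "stdev:", round(statistics.stdev(scores), 2))
```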

24 How Much Will It Cost? Who Will Pay for This? (Grants???)

25 How Much Will It Cost? Who Will Pay for This? (Grants???) Budget for research Assess direct (and indirect) costs Use existing staff and funds Seek grants inside or outside your organization Use existing data Pay attention to expensive data
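A direct-versus-indirect cost estimate can be sketched in a few lines. The line items and the 25% indirect (overhead) rate below are invented for illustration; actual rates vary by institution:

```python
# Hypothetical research budget: direct costs plus an assumed
# 25% indirect (overhead) rate.
direct = {
    "survey printing": 300.00,
    "student assistant (100 hrs)": 1500.00,
    "analysis software license": 250.00,
}
direct_total = sum(direct.values())
indirect = direct_total * 0.25
print(f"direct: ${direct_total:,.2f}  indirect: ${indirect:,.2f}  "
      f"total: ${direct_total + indirect:,.2f}")
```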

26 Research Project Action Plans: Practical Next Steps 1 Identify your group’s facilitator Assign a timekeeper (30 minutes, plus a 15-minute break) Do a round-robin reading of the group’s problem statements Ask questions about each other’s projects Rewrite/edit problem statements; then write importance and benefit statements

27 Research Project Action Plans: Practical Next Steps 2 Identify appropriate subjects and stakeholders Identify appropriate methodology or methodologies Consider the best time frame for the research List possible publication venues Create your action plan

28 Get Started!