CHAPTER 16 ASSESSMENT OF THE PROGRAM

Educational Assessment
- Assessment and evaluation are an integral part of any educational program.
- This is true of the School Public Relations Program.
- However, the public relations program, like most work in the social sciences, is difficult to evaluate, largely because of the many variables that affect any socially based program.

Myths About Measurement
- Dissemination is communication
- Effort can be equated with results
- Samples are representative
- Increased knowledge means more favorable attitudes

The Four Purposes of Evaluation are:
1. To improve, add, or drop existing public relations activities
2. To determine if the public relations program is achieving its intended results
3. To determine if the results were worth the time and money spent
4. To bring greater visibility to the accomplishments of the public relations program

Closed Systems of Evaluation
Closed systems of evaluation look at a single element of the program and usually include:
- Pretest -- an evaluation of what the audience knows before the activity
- Posttest -- an evaluation of what the audience knows after the activity (a simple worked example appears after this slide)
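The pretest/posttest comparison comes down to before-and-after arithmetic on the same audience. The sketch below is illustrative only and is not drawn from the chapter; the scores are hypothetical percent-correct results, and in practice a statistical test would be used to judge whether the gain is meaningful rather than just reporting the difference in means.

```python
# Minimal sketch of tallying a closed-system pretest/posttest comparison.
# Scores are hypothetical percent-correct results for the same respondents
# before and after a public relations activity.

pretest_scores = [42, 55, 61, 48, 70, 53]   # before the activity
posttest_scores = [68, 72, 75, 66, 81, 70]  # after the activity

def mean(values):
    """Arithmetic mean of a list of scores."""
    return sum(values) / len(values)

pre_mean = mean(pretest_scores)
post_mean = mean(posttest_scores)
gain = post_mean - pre_mean

print(f"Pretest mean:  {pre_mean:.1f}")
print(f"Posttest mean: {post_mean:.1f}")
print(f"Average gain:  {gain:.1f} points")
```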

An Open System of Evaluation
An open system includes all the elements of the program. Examples are:
- unintended audiences
- how well the school system is administered
- the effectiveness of the organization of the school system
- union activities
- administration perceptions, etc.

Jacobson’s Seven Steps in Systemic Evaluation
1. Select the rationale
2. Specify the objectives
3. Develop measures
4. Administer the measures and collect the data
5. Analyze the data
6. Report the results
7. Apply the results to decisions

Swinehart lists types of evaluations and then gives dimensions for each. His types are:
- Appraisal and description by persons involved in the program
- Count of activities
- Outside expert appraisal of activities
- Volunteered reaction of audiences
- Solicited reaction from a sample audience
- Reactions of actual or potential audiences through small-scale studies
- Controlled field experiments or similar studies to assess the actual impact of programs

Many Types of Evaluation Instruments Have Been Developed
Most evaluation procedures involve rating scales and checklists. These have been developed for use at both the building level and the school district level.

Bortner Has Developed a 168-Point Checklist Using Seven Categories
- Program organization
- School staff
- Students
- Parents
- Community
- One-way communications (printed and nonprinted)
- School plant
(A portion of the checklist appears on page 306, figure 16.2)

Some Conventional Methods of Program Evaluation Include:
- Observations -- careful and unbiased review of program effects
- Records -- recorded evidence such as compliments, criticisms, and complaints
- Telephone surveys -- a randomly selected survey can be used to check one aspect of the program or the general feeling within the community

- The panel -- this method gets feedback, but with the same limitations as using the panel for input in the community survey
- Questionnaires -- these are widely used because they are easy to prepare, distribute, and tally; the same limitations exist here as in community surveys
- Checklists -- these are a series of multiple-choice questions with 3 to 10 possible answers; getting checklists returned presents the same problems as a questionnaire

- Rating scales -- similar to a checklist, but they provide a numeric quantification for the evaluation; problems in getting returns are the same as with the previous methods
- Opinion polls -- direct interviews with stratified sampling are one of the best methods; to register the effect of the program, polls must be conducted periodically, and they can be too expensive for some systems (a proportionate stratified allocation is sketched after this slide)
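The sketch below illustrates, under assumed numbers, what the stratified sampling behind such an opinion poll could look like: the community is divided into groups (strata), and interviews are allocated to each group in proportion to its share of the population. The strata names and counts are hypothetical, not taken from the chapter.

```python
# Minimal sketch of proportionate stratified allocation for an opinion poll.
# Stratum sizes are hypothetical counts of community groups.
import random

strata = {"parents": 3200, "non-parent residents": 5400, "staff": 400}
sample_size = 300

population = sum(strata.values())
random.seed(1)  # reproducible example

for group, size in strata.items():
    # Allocate interviews in proportion to the stratum's share of the population
    n = round(sample_size * size / population)
    roster = [f"{group}-{i}" for i in range(size)]  # stand-in list of members
    drawn = random.sample(roster, n)                # simple random draw within the stratum
    print(f"{group}: interview {n} of {size}, e.g. {drawn[:3]}")
```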

The Communications Audit
Every school public relations program should undergo an audit every three to five years. Areas examined for possible improvement could include:
- Short- and long-term goals
- Priority of those goals
- Themes or issues to be emphasized
- Priority list of publics
- Community pulse issues
- Communication methods that are working
- New communication methods wanted
- A measuring stick for future evaluation

Major Topics Included in a System Audit:
- Communications philosophy
- Community demographics
- Objectives and goals
- Organization and staffing for public relations
- Existing PR programs
- Attitudes toward present PR
- Needs and expectations