
1 Science in General

2
- Definition: a systematically organized body of knowledge
- Assumptions:
  1. there is order in nature
  2. every event has an explanation
  3. we will never know everything
- Definitions of important terms
- Variable: any trait or characteristic that can take on a range of values

3 Science (continued)
- Hypothesis: a question or statement about the relationship between two or more variables (e.g., is there a relationship between the number of police on the streets and the crime rate?)
- Independent variable (IV): the variable thought to have an effect
- Dependent variable (DV): the variable that is affected

4 Science (continued)
- Theory: an explanation that systematically organizes observations and hypotheses
- Basic vs. applied research: basic answers "why" questions; applied solves problems
- Cross-sectional vs. longitudinal research
- Experimental vs. ex post facto research

5 Some notes about research
- Much research does not "pan out"
- Some research gets results "accidentally"
- Research that seems trivial sometimes turns out to be important (the Golden Fleece award is sometimes undeserved)
- We cannot assume that "common sense" is correct
- Research studies patterns, not individuals

6 Errors in observation
- Inaccurate observation (measure and record)
- Overgeneralization (use a sufficient number of subjects; replicate studies)
- Selective observation (sufficient number)
- Illogical reasoning, such as the gambler's fallacy and ex post facto reasoning (logic)
- Ego involvement, premature closure, reductionism

7 Research methods
- Experiments (manipulation and control)
- Surveys (written and interviews)
- Field or observational research
- Record or archival research (content analysis, secondary analysis)
- Case study
- Evaluation research

8 Theories
- Importance of theories: they drive research
- Criteria for a good theory:
  1. consistent with known facts
  2. internally consistent, not contradictory
  3. parsimonious
  4. subject to empirical investigation
  5. able to predict

9 Theory building
- Deductive reasoning: start with an explanation, derive hypotheses, and test them (e.g., family instability as a result of social upheaval)
- Inductive reasoning: gather information and then develop a theory (e.g., Durkheim on crime and suicide)
- Relationship between research and theory

10 Examples of research studies
- Hirschi and social control theory
- Policewomen on patrol
- Kansas City Patrol Experiment
- Group therapy in California prisons

11 Relationships vs. Causation
- To be a cause, one variable must be necessary and sufficient to affect another variable
- Something may be necessary but not sufficient (intelligence and good grades)
- Something could be sufficient but not necessary (isolation in early life and mental retardation)

12 Three criteria to establish a cause
1. The cause must precede the effect
2. The two variables must be empirically correlated (as one changes, the other changes in a systematic fashion)
3. The relationship must not be explained away by a third variable (storks and babies, polio and pavements, large family size and delinquency)
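The third-variable criterion can be illustrated with a small simulation. This is a sketch with invented data, not from the slides: two variables (labeled after the storks-and-babies example) are both driven by a common "rural-ness" factor, so they correlate strongly even though neither causes the other.

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# Hypothetical confounder: how rural a district is drives both
# stork sightings and birth rates; there is no causal link between them.
rural = [random.random() for _ in range(1000)]
storks = [r + random.gauss(0, 0.2) for r in rural]
births = [r + random.gauss(0, 0.2) for r in rural]

print(round(pearson(storks, births), 2))  # strong positive correlation, yet no causation
```

The correlation criterion (2) is satisfied here, but criterion (3) fails: controlling for the third variable would make the stork-birth relationship disappear.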

13 Relationship: child abuse and delinquency
- Two methods of study
- Retrospective:
  - Private residential treatment center: 66% abused
  - Runaway shelter, Ohio: 75% abused
  - Juvenile delinquents: 40% abused, neglected, or abandoned

14 Relationship (continued)
- Prospective:
  - 5,000 children referred for abuse were followed: after 5 years, 14% had been adjudicated delinquent; after 10 years, 32%
  - A New York study found that 50% of reported families had at least one child taken to court as a delinquent

15 Conclusions
- Not cause and effect
- Need for a base rate of comparison: how many children are abused, and how many go to juvenile court
- There would appear to be a relationship
- Abused children are at greater risk (higher probability)
- Other variables may explain the relationship
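The base-rate point can be made concrete with a relative-risk calculation. The counts below are invented for illustration only (they are not from the studies cited on the previous slides): compare the delinquency rate among abused children to the rate in a non-abused comparison group.

```python
def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Ratio of the outcome rate in the exposed group to the rate in the
    unexposed (base-rate) comparison group."""
    return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

# Hypothetical counts: 140 of 1,000 abused children adjudicated delinquent,
# versus 50 of 1,000 non-abused children (the base rate).
rr = relative_risk(140, 1000, 50, 1000)  # roughly 2.8x the base-rate risk
```

Without the unexposed denominator (the base rate), the 14% figure by itself says nothing about whether abuse raises the risk; the ratio is what licenses the "greater risk" conclusion.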

16 Purposes of research
- Exploration: satisfy curiosity, test the feasibility of a study, develop methods
- Description (census, polls)
- Explanation
- Units of analysis: the units observed and described to create summary descriptions of all units and to explain differences among them

17 Units of analysis
- Individuals
- Groups (e.g., families, gangs)
- Organizations (police departments)
- Social artifacts (traffic accidents, court cases, prison riots)

18 Steps in designing research
- Choosing a research problem
- Reviewing the literature: abstracts and journals, books, collected readings, computer searches (NCJRS), CD-ROMs, and the internet
- Conceptualizing variables, hypotheses, and questions

19 Steps in research (continued)
- Selecting how to measure variables (operationalization)
- Selecting subjects for the study: population and sample
- Method: making observations and measurements
- Data processing and analysis

20 Steps in research (continued)
- Interpreting the results and their applications

21 Research articles
- Who does research?
- The process for getting into print
- Abstract
- Introduction: the problem and literature review
- Method: description of subjects, instruments, procedure, and data analysis used
- Results: descriptive and inferential statistics

22 Research articles (continued)
- Tables and graphs in the results section
- Discussion: interpretation of results, cautionary notes, directions for future research (the next questions)

23 Research proposal
- Abstract
- Introduction (introduction of the topic, literature review, statement of what this study would do)
- Method
  - Subjects: how many? What are their characteristics? How will they be selected?
  - Instruments: what questions will be asked?

24 Research proposal (continued)
- Procedure: how will the study be carried out?
- Schedule: list each step, estimate how much time each would take (sometimes steps can be done simultaneously), and indicate the total length of the project
- Budget: list all items, the cost of each, and the total cost

25 Research proposal (continued)
- Indicate supplies, travel, personnel costs, etc., and justify them
- Bibliography (based on the introduction: every reference in the introduction should have an entry in the bibliography, and vice versa)
- Appendices

26 Layout of research proposal
  Abstract
    xxxxx xxxxx xxxxx (filler text showing where the body goes)
  Introduction (a few paragraphs)

27 Layout (continued)
  Method
    Subjects
      xxxxx xxxxx xxxxx
    Instruments
      xxxxx xxxxx xxxxx

28 Layout (continued)
  Procedure
    xxxxx xxxxx xxxxx
  Schedule
    xxxxx xxxxx xxxxx (a table is also useful)

29 Layout (continued)
  Budget (again, a table may be used)
    Item    Cost

30 Layout (continued)
  Bibliography
    Author, year of publication, title, journal, volume number, pages.
    Author, year of publication, title, city of publisher, publisher.

31 Measurement
- Concepts (hypothetical constructs): theoretical ideas based on observation but which cannot be observed directly
  - aggressiveness
  - intelligence
  - prejudice

32 Measurement (continued)
- Difficulty of measuring hypothetical constructs
- The LaPiere study
- Interchangeability of indicators: if several different indicators follow the same pattern, they are measuring the same concept
- Definition of a concept: dictionary

33 Operational definition
- A definition that describes how a concept will be measured (e.g., intelligence will be measured by scores on the Stanford-Binet and the WAIS-R)
- Considerations for operational definitions: reliability, validity, norms, precision
- Reliability: consistency of measurement (different from accuracy)

34 Assessing reliability
- Test-retest: scores should not change much over a short period of time
- Split-half: divide the test into two parts; for the same individual, scores should be the same on one part as on the other
- Reliability is affected by (1) the reliability of observers and (2) poor questions
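A minimal sketch of the split-half procedure, using made-up item scores (the data and the odd/even split rule are illustrative assumptions, not from the slides): correlate the odd-item and even-item totals across subjects, then apply the Spearman-Brown correction to estimate reliability for the full-length test.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(responses):
    """responses: one list of item scores per subject.
    Correlate odd-item and even-item half-totals, then apply the
    Spearman-Brown correction for the full test length."""
    odd = [sum(r[0::2]) for r in responses]
    even = [sum(r[1::2]) for r in responses]
    r_half = pearson(odd, even)
    return 2 * r_half / (1 + r_half)

# Hypothetical 10-item test (0/1 item scores) taken by five subjects:
data = [
    [1, 1, 1, 1, 1, 1, 1, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 1, 0, 1, 0],
    [0, 0, 1, 0, 0, 1, 0, 0, 0, 1],
    [1, 1, 1, 0, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
]
reliability = split_half_reliability(data)  # high: the two halves agree
```

The correction step matters because each half is only half as long as the real test, and shorter tests are less reliable.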

35 Validity
- Does the test measure what you want it to measure?
- Four types of validity: face, criterion (predictive), content, construct
- Face validity: does it appear to measure what you want it to? Do the questions appear relevant?

36 Validity (continued)
- Criterion or predictive validity: does the measurement predict something we would like to predict?
- Examples: ACT scores and success in college (GPA); screening tests and future job performance
- Determined by applying the measure and then determining how well it would have predicted
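The "how well it would have predicted" step is usually quantified as a validity coefficient: the correlation between the measure and the criterion. A sketch with invented ACT and GPA values (illustrative only):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: ACT score at admission vs. college GPA four years later.
act = [20, 25, 30, 22, 28, 33, 18, 27]
gpa = [2.4, 2.9, 3.5, 2.6, 3.1, 3.8, 2.2, 3.0]

validity = pearson(act, gpa)  # near 1.0 here; real validity coefficients are lower
```

A coefficient near zero would mean the screening test tells us nothing about later performance; the closer to 1.0, the better the predictive validity.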

37 Validity (continued)
- Content validity: the degree to which a measure covers the range of meanings in the concept
- Examples: achievement test, senior test, attitude test, personality trait
- Construct validity: based on the way a measure relates to other variables within a system of theoretical relationships (Hirschi)

38 Other considerations
- A measure can be reliable but not valid; it cannot be valid unless it is reasonably reliable
- Norms: measures that provide a basis for comparison
- Precision: fineness of distinction in measuring. How precise? In the social sciences, we are not very precise
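Norms turn a raw score into a comparison: the common mechanism is a z-score against the norm group's mean and standard deviation. A sketch with an invented norm group (the scores and the normality assumption are illustrative):

```python
from statistics import NormalDist, mean, stdev

def z_score(raw, norm_scores):
    """Standardize a raw score against a norm group's mean and SD."""
    return (raw - mean(norm_scores)) / stdev(norm_scores)

# Hypothetical norm-group scores for some test:
norms = [50, 55, 60, 45, 52, 58, 49, 51]

z = z_score(62, norms)                  # how many SDs above the norm mean
percentile = NormalDist().cdf(z) * 100  # approximate, assuming normal scores
```

A raw score of 62 means little on its own; against this norm group it is nearly two standard deviations above the mean, i.e. in the top few percent.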

