
1 Auditing Section Doctoral Consortium 2006, Auditing Section Midyear Conference, January 2006. Linda McDaniel, University of Kentucky. Experimental Research in Assurance & Auditing

2 Acknowledgements: Many colleagues, but particularly Jane Kennedy, Bill Kinney, Laureen Maines, and Mark Peecher

3 Experimental Research Session: Objectives. Review strengths and weaknesses of experimental research. Discuss how to capitalize on these strengths (i.e., summarize elements of good experimental design). Illustrate with an example arising from SOX regulations.

4 Experimental Research is … systematic, controlled, empirical, critical investigation of phenomena guided by theory and hypotheses about the presumed relations among such phenomena (Kerlinger). It is characterized by active manipulation of variables of interest to generate new data and the random assignment of participants to specified conditions.

5 Comparative Advantages: Ability to test causal relations, not just associations. Manipulate variables of interest. Internal validity: control/randomize effects of other variables; disentangle variables confounded in natural settings.

6 Comparative Advantages: Timeliness — no need to wait on the real world to create data (see McDaniel and Hand, CAR, 1996); ex ante research is possible. Conditions that do not exist in natural settings can be created in the lab (Gaynor, McDaniel, and Neal, TAR, forthcoming; Hirst and Hopkins, JAR, 1998).

7 Examination of sub-judgments (determinants of decisions) and processes (Kadous, Kennedy, and Peecher, TAR, 2003; Hoffman, Joe, and Moser, AOS, 2003; Maines and McDaniel, TAR, 2000). Thus, experiments can answer how, when, and why important features of the accounting process and environment influence behavior as well as decisions.

8 Relative Disadvantages: External validity — the task is an abstraction from the real world; variables are manipulated at discrete levels; participants may not be representative. Small sample size — limited access to participants for real-world, complex auditing/accounting issues. Reduced ability to replicate — no second chances (without significant costs).

9 Designing an Experiment. After the researcher identifies an interesting, relevant question that calls for an experiment (i.e., post Kinney 3 paragraphs), he/she must develop an effective (good) research design, i.e., one that supports causal inferences. Use theory to guide predictions: "It is the theory that decides what can be observed." (Albert Einstein). Minimize threats to construct, internal, and statistical validity.

10 Libby et al. (2002) Predictive Validity Framework [diagram]: Conceptual level: Concept X (independent) is linked to Concept Y (dependent) by (1) theory. The links from Concept X and Concept Y down to their operational definitions represent (2) and (3) construct validity. The link from operational X to operational Y concerns (4) statistical validity. (5) Internal validity concerns the Vs and Zs — prior-influence and contemporaneous factors (alternative explanations).

11 Designing an Experiment. What variables will you manipulate? Number and levels of independent variables. Interactions? (Libby et al. (2002) Predictive Validity Framework)
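To make the "number and levels" choice concrete, here is a brief sketch (not from the original slides; the factor names, levels, and cell means are all hypothetical) of how crossing two two-level manipulated factors yields a 2x2 factorial design, and how an interaction shows up in the cell means:

```python
# Hypothetical 2x2 factorial design: two manipulated factors, two levels each.
from itertools import product

nas_type = ["risk_mgmt", "hr_services"]      # factor A: type of NAS (assumed levels)
disclosure = ["public", "private"]           # factor B: disclosure requirement (assumed levels)
cells = list(product(nas_type, disclosure))  # 2 x 2 = 4 experimental cells

# Illustrative, made-up cell means for the dependent variable
# (e.g., likelihood of recommending joint provision, 0-100 scale).
means = {
    ("risk_mgmt", "public"): 55, ("risk_mgmt", "private"): 75,
    ("hr_services", "public"): 40, ("hr_services", "private"): 45,
}

# Interaction question: does the disclosure effect differ across NAS types?
effect_risk = means[("risk_mgmt", "private")] - means[("risk_mgmt", "public")]    # 20
effect_hr = means[("hr_services", "private")] - means[("hr_services", "public")]  # 5
interaction = effect_risk - effect_hr  # nonzero difference-in-differences -> interaction
print(cells)
print(effect_risk, effect_hr, interaction)
```

With these made-up numbers the disclosure effect is larger for risk-management services than for HR services, so the two factors interact — exactly the kind of pattern a fully crossed design can detect and a single-factor design cannot.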

12 Designing an Experiment. What variables will you control? (internal validity) Account for Vs and Zs (see Kinney, TAR, 1986) by: random assignment of participants; holding variables constant by design/selection (within-participant design; match on Vs); measuring covariates and statistically removing their effects (e.g., covariate analysis, regression).
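The first of those controls — random assignment — can be sketched in a few lines. This is an illustration only (the helper name and participant labels are made up, not part of the slides); it deals shuffled participants out round-robin so each condition gets equal n:

```python
import random

def randomly_assign(participants, conditions, seed=42):
    """Randomly assign participants to experimental conditions.

    Shuffle with a fixed seed (for a reproducible sketch), then deal
    participants out round-robin so each condition gets (nearly) equal n.
    """
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    assignment = {c: [] for c in conditions}
    for i, p in enumerate(shuffled):
        assignment[conditions[i % len(conditions)]].append(p)
    return assignment

# Hypothetical use with the four cells of a 2x2 design:
conditions = [
    ("risk_mgmt", "public"), ("risk_mgmt", "private"),
    ("hr_services", "public"), ("hr_services", "private"),
]
participants = [f"director_{i}" for i in range(40)]
groups = randomly_assign(participants, conditions)
print({c: len(g) for c, g in groups.items()})  # 10 per cell
```

Because assignment is random, any V or Z a participant brings along (experience, prior beliefs) is expected to be distributed evenly across conditions rather than confounded with the manipulation.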

13 Other Necessary Design Choices: Professional participants? Incentives? Within- versus between-participants design? See Libby, Bloomfield, and Nelson (AOS, 2002) for a discussion of each.

14 Professional Participants? Theory should dictate this choice (Libby and Kinney, TAR, 2000; Maines and McDaniel, TAR, 2000; see also Libby and Luft, AOS, 1993). Professionals are a limited resource (Libby, Bloomfield & Nelson, AOS, 2002). Professionals exhibit stronger selection bias relative to non-professional groups (Peecher & Solomon, IJA, 2001).

15 Incentives? Why? …no skin in the game… When? See Camerer & Hogarth (JRU, 1999).

16 Within- versus Between-Participants? Enhanced statistical power, as participants serve as their own control (see Schepanski, Tubbs, and Grimlund, JAL, 1992). Increased salience of treatment effects. Vulnerability to carry-over effects. Requires proper statistical analysis.
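The power advantage of the within-participants design can be shown with the textbook variance formula for a difference of correlated measurements: when each participant provides both observations, the correlated portion of the error variance cancels out of the treatment contrast. A minimal sketch (the specific sd, n, and correlation values below are assumed for illustration, not taken from the slides):

```python
import math

def between_se(sd, n):
    """Standard error of the difference in means with two independent
    groups of n participants each (between-participants design)."""
    return math.sqrt(2 * sd**2 / n)

def within_se(sd, n, rho):
    """Standard error of the mean within-participant difference when the
    two measurements correlate at rho: var(d) = 2*sd^2*(1 - rho), so the
    shared participant-level variance cancels."""
    return math.sqrt(2 * sd**2 * (1 - rho) / n)

sd, n = 10.0, 30  # assumed response sd and per-condition sample size
for rho in (0.0, 0.5, 0.8):
    print(f"rho={rho}: between SE={between_se(sd, n):.2f}, "
          f"within SE={within_se(sd, n, rho):.2f}")
```

At rho = 0 the two designs are equivalent; the more strongly a participant's two responses correlate, the smaller the within-participants standard error — which is the "participants serve as their own control" point above, and also why the paired analysis (not an independent-groups test) must be used.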

17 Turning Observations into a Researchable Question: Do the new SOX regulations related to NAS result in improved audit quality? Why is this interesting or important? How can we examine it?

18 Real-World Problem and Regulatory Actions: After corporate abuses, the SEC seeks to ban all auditor-provided NAS over concerns about auditor independence, audit quality, and investor confidence. Conceding that certain NAS improve audit quality, the SEC limits the NAS auditors can provide to clients but requires ACs to pre-approve services after considering auditor independence and audit quality. Registrants must disclose AC pre-approvals and fees paid to the auditor (by category).

19 An Example: The Effects of Joint Provision and Disclosures of Non-audit Services (NAS) on Audit Committee (AC) Decisions and Investor Preferences

20 Theory: The pre-approval process makes ACs directly accountable to third parties for auditor independence and audit quality. Disclosures (of pre-approvals and audit fees) make ACs publicly accountable to investors for perceived independence. Anecdotal reports suggest ACs are avoiding allowable NAS.

21 Predictions / Research Hypotheses: ACs will be more likely to recommend joint provision when the NAS improves audit quality (AQ), and less likely to recommend joint provision when public disclosures are required. The disclosure effect holds even when ACs believe joint provision improves AQ.

22 Predictive Validity Framework applied to the example: Conceptual independent (X): accountability (NAS/AQ relation and required public disclosures). Conceptual dependent (Y): ACs' pre-approval decisions. Operational independent: type of NAS and type of company. Operational dependent: joint provision recommendation. Vs and Zs: experience with NAS approval; beliefs about effects of NAS on auditor independence and synergies with the audit; audit experience, etc.

23 Operational Independent Variables. Type of Service (effect of NAS on audit quality): Risk Management Services — joint provision improves audit quality; Human Resource Services — joint provision has no effect on audit quality.

24 Measured Independent Variable: belief about the NAS and audit quality relation. See Libby et al. (AOS, 2002) for when this approach is justified and implications for interpretation.

25 Operational Independent Variables. Type of Company (disclosure requirement): Publicly-traded company — required to make public disclosures; privately-held company — not required to make public disclosures.

26 Operational Dependent Variable and Controls. Measured dependent variable: joint provision recommendation; reasons for and against firm selection. Manipulation checks / control variables: audit quality manipulation check; NAS quality by provider; effect of joint provision on auditor objectivity; demographic information.

27 Participants & Other Choices: Participants were corporate directors attending a KPMG Audit Committee Institute Roundtable. No monetary incentives. Between-participants design.

28 Some Lessons Learned: 1. Work on projects that really interest you and for which you have a comparative advantage! 2. Good experimental papers require a lot of up-front time and effort — this pays off! 3. Always: write the Kinney 3 paragraphs and have others review them; prepare Libby boxes; pilot test (as many times as necessary); share with colleagues often throughout the process.

