CMP3265 – Professional Issues and Research Methods
Research Proposals: Aims and Objectives, Method, Evaluation
- Yesterday – aims and objectives: clear, timely, significant, original, feasible
- This morning – Research Methods, Research Evaluation

Research Method - characteristics

The method should encompass:
- What areas of research are involved and how this research builds upon them
- How the research is to be conducted
- A plan for the research project that will achieve the objectives
- How the project is to be managed

Questions to ask of a chosen method:
- Will the method deliver the aims/objectives?
- Is the method appropriate for the type of research?
- Can the results produced by the method be reproduced?

Back to Yesterday's Example

"Research Hypothesis: Object-oriented database technology leads to better quality software than RDB technology"

METHOD - how the research is to be conducted:
1. Assemble development team T
2. Identify an application A
3. Apply T to A using OODB technology
4. Apply T to A using RDB technology
5. Use the results to 'prove' the hypothesis

Comment on the method.

A better method?

"Research Hypothesis: Object-oriented database technology leads to better quality software than RDB technology"

METHOD:
1. Identify a set of software developers T
2. Randomly separate T into development teams T1 and T2
3. Identify an application A
4. Apply T1 to A using OODB technology
5. Measure product quality by recording metrics, e.g. development time, bug rate, LOC, code complexity
6. Apply T2 to A using RDB technology
7. Measure product quality by recording metrics, e.g. development time, bug rate, LOC, code complexity

Comment on the method.
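As a rough illustration of steps 2, 5 and 7 of this design, the minimal Python sketch below shows how the random split into teams T1 and T2 and the recording of quality metrics might be organised. The developer names, the metric fields and the record_metrics helper are hypothetical placeholders, not part of any real study; real values would come from build logs, an issue tracker and static analysis.

```python
import random

# Hypothetical pool of developers; in a real study this would be the set T.
developers = ["dev_{}".format(i) for i in range(20)]

def random_split(pool):
    """Step 2: shuffle the pool and split it into two equally sized teams."""
    shuffled = list(pool)
    random.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

def record_metrics(team, technology):
    """Steps 5 and 7 (placeholder): record quality metrics once the team has
    built application A with the given technology."""
    return {
        "team": team,
        "technology": technology,
        "development_time_days": None,   # to be measured
        "bug_rate_per_kloc": None,       # to be measured
        "lines_of_code": None,           # to be measured
        "code_complexity": None,         # to be measured
    }

team_1, team_2 = random_split(developers)
results = [record_metrics(team_1, "OODB"), record_metrics(team_2, "RDB")]
print(results)
```

The random assignment is the only part doing methodological work here: it spreads developer skill across the two treatments rather than leaving it as an uncontrolled factor.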

A better method? BUT:
- Splitting one pool of developers randomly could still introduce bias (depending, for example, on which team ends up with the most experienced developers)
- Using a single application A is not good enough
- The choice of the _actual_ technologies used will influence the result

Example 2

"Hypothesis: Algorithm A is faster at solving problems from population X than Algorithm B"

METHOD - how the research is to be conducted:
1. Implement algorithm A
2. Implement algorithm B
3. Generate a random sample from X
4. Apply A to each member of the sample – record CPU time
5. Apply B to each member of the sample – record CPU time
6. Use the results to 'prove' the hypothesis

Comment on the method.
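A minimal sketch of this experimental procedure is given below, in Python. algorithm_a, algorithm_b and sample_from_x are hypothetical stand-ins for the real algorithms and problem generator; only the sampling and CPU-timing scaffolding is the point.

```python
import random
import time

# Hypothetical stand-ins for the two algorithms; replace with the real
# implementations of A and B.
def algorithm_a(problem):
    return sorted(problem)                # placeholder work

def algorithm_b(problem):
    return sorted(problem, reverse=True)  # placeholder work

def sample_from_x(n_problems, problem_size):
    """Generate a random sample of problem instances drawn from population X."""
    return [[random.random() for _ in range(problem_size)]
            for _ in range(n_problems)]

def cpu_times(algorithm, problems):
    """Record CPU time (not wall-clock time) of the algorithm on each problem."""
    times = []
    for problem in problems:
        start = time.process_time()
        algorithm(problem)
        times.append(time.process_time() - start)
    return times

sample = sample_from_x(n_problems=30, problem_size=10_000)
times_a = cpu_times(algorithm_a, sample)
times_b = cpu_times(algorithm_b, sample)
print("mean CPU time  A: {:.4f}s  B: {:.4f}s".format(
    sum(times_a) / len(times_a), sum(times_b) / len(times_b)))
```

Even this sketch exposes questions the method leaves open: how large the sample should be, whether the implementations are equally well tuned, and whether the hardware and workload are controlled.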

Research Methods – Sampling

Research methods often involve SAMPLING:
- Choose a representative / random sample S from population X
- Experiment and obtain results on sample S
- Make claims about S
- GENERALISE those claims to the whole population X
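The sketch below illustrates the sample-then-generalise step. The population, sample size and 95% confidence interval are purely illustrative; the point is that a claim about X made from S should be reported with an explicit measure of uncertainty.

```python
import math
import random
import statistics

# A purely synthetic population X; in practice you never see X in full,
# which is exactly why sampling is needed.
population_x = [random.gauss(100, 15) for _ in range(100_000)]

# Choose a random sample S from X.
sample_s = random.sample(population_x, k=200)

# Results / claims about S ...
sample_mean = statistics.mean(sample_s)
sample_sd = statistics.stdev(sample_s)

# ... generalised to X, with an approximate 95% confidence interval that
# makes the uncertainty of the generalisation explicit.
margin = 1.96 * sample_sd / math.sqrt(len(sample_s))
print("estimated population mean: {:.1f} +/- {:.1f}".format(sample_mean, margin))
```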

Sampling - Pitfalls

Consider the following samples. Is it safe to generalise?
1. What is being tested: a software method. Population: programmers. Sample: a set of IT students.
2. What is being tested: an educational software method. Population: IT students. Sample: a random set of Huddersfield IT students.
3. Re-examine Example 2 above.

Samples must be representative of the larger population.

Research Methods – cause and effect

Research methods often involve trying to prove causality, where a feature causes a particular effect:
- Does the use of one particular method improve a process compared to some other?
- Does the use of an enhanced algorithm improve its quality (e.g. speed)?
- Does a course of training / learning improve the effectiveness of developers?

The effect of 'Extraneous Variables'

Research often falls down because of extraneous variables in cause and effect. Is it the new method / algorithm that causes the improvement, or some other factor? For example, it may be the skill of the team itself, rather than the method, that produces an observed improvement.

In computing, research experiments tend to be VERY complex – it is important to remove any extraneous variables that may produce side-effects.

Research Evaluation

Typically there are several general ways to evaluate research results (plus combinations of these):
1. Literature comparison – show superiority over existing / past work by comparing with written accounts of other work
2. Empirical – run experiments and take measurements
3. Rational – prove or demonstrate properties (e.g. prove an algorithm's computational complexity)

Research Evaluation - Measurements

Comment on these fictional claims:
- ".. research shows that Pentium processors can function in environments at twice the normal ambient temperature"
- ".. research shows that XP is twice as reliable as W95"
- ".. research shows that the XP OS has twice as many bugs as Linux"

Scientists must be very careful with statistics and measurements.
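One reason such claims need care is that "twice as reliable" hides questions of variability, sample size and significance. The sketch below shows one hedged way to back up a comparative claim, assuming SciPy is available; the uptime figures are invented purely for illustration and stand in for repeated, controlled measurements.

```python
from scipy import stats  # assumes SciPy is installed

# Invented, purely illustrative reliability measurements: hours of uptime
# between failures for two operating systems under identical workloads.
uptime_os_a = [412, 388, 455, 401, 397, 420, 433, 390]
uptime_os_b = [205, 230, 198, 221, 215, 240, 190, 210]

mean_a = sum(uptime_os_a) / len(uptime_os_a)
mean_b = sum(uptime_os_b) / len(uptime_os_b)

# A headline ratio on its own is not enough: report the variability and a
# significance test alongside it (Welch's t-test here, assuming independent runs).
t_stat, p_value = stats.ttest_ind(uptime_os_a, uptime_os_b, equal_var=False)
print("mean uptime ratio A/B: {:.2f}  t = {:.2f}  p = {:.4f}".format(
    mean_a / mean_b, t_stat, p_value))
```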

Example of "Good" research – best research paper handout

Using FORM rather than CONTENT, analyse and evaluate the computing research paper given out. Consider:
- What type of research is it – ground-breaking, a new or improved method, a new algorithm, a new theory, etc.?
- Are its aims clear, significant, timely, original?
- Does it outline what areas of research are involved and how this research builds upon them?
- The method that was used to conduct the research
- How the research was evaluated:
  - Literature comparison?
  - Empirical?
  - Rational?

Example of "Good" research – best research paper handout

- What type of research is it? It is ground-breaking in that it combines two areas into one for a new framework (for something…!)
- The objective seems fairly clear, and it is written as if it is significant, timely and original (you would need to be a subject expert to decide on that one)
- What kind of method and evaluation does it use?
1. Outlines the theory about 'problems' and 'solutions' that share a common formulation
2. Outlines a 'discovered' algorithm (LDFS) that is general enough to solve the scope of problems
3. Evaluates the new algorithm:
  - rationally (by proving propositions)
  - empirically, by applying it to a subset of the formulation (MDPs) with comparison to previous state-of-the-art algorithms
  - comparisons with the literature are also made throughout the paper

UK Research Assessment Exercise

This is a huge evaluation of research to determine the 'value' of research groups in UK universities. Each subject has to present evidence:
- For each submitted academic: 4 pieces of work (publications), plus a statement of their 'impact'
- 'Peer esteem' – measured by factors such as conference/journal organisation and management
- Number of PhD awards per year
- Amount of research income per year

Portfolio Exercise

1. Take any piece of published research (this can include staff research disseminated in last term's seminar series) and analyse and evaluate it with respect to:
- What type of research is it – ground-breaking, a new or improved method, a new algorithm, a new theory, etc.?
- Are its aims clear, significant, timely, original?
- What kind of method does it use?
- How is it evaluated:
  - Literature comparison?
  - Empirical?
  - Rational?

Comment on its overall quality (and include a copy of the publication, or a link to it, in your portfolio).