The YES Program Technical Writing Guidelines & Tips. Michael Georgiopoulos.

12/03/09 YES Program: Paper Outline
 Abstract
 Introduction
 Literature Review
 Main Part of the Paper
   Algorithm Description
   Algorithm Justification/Analysis
   Experimental Design
   Databases Used
   Experimental Results and Observations
 Summary/Conclusions
 References
 Other Parts (Acknowledgments to NSF, others)

Abstract
 In the abstract you write, in a few sentences, what you have done, why you have done it, and how well you have done it, and you explain very briefly how your work compares to the state of the art.
 Example of an Abstract (GFAM paper): "This paper focuses on the evolution of Fuzzy ARTMAP neural network classifiers, using genetic algorithms, with the objective of improving generalization performance (classification accuracy of the ART network on unseen test data) and alleviating the ART category proliferation problem (the problem of creating more ART network categories than necessary to solve a classification problem). We refer to the resulting architecture as GFAM. We demonstrate through extensive experimentation that GFAM exhibits good generalization and is of small size (creates few ART categories), while consuming reasonable computational effort, in a number of classification problems. In some cases, GFAM produces the optimal classifier. Furthermore, we compare the performance of GFAM with other competitive ARTMAP classifiers that have appeared in the literature and addressed the category proliferation problem in ART. We illustrate that GFAM produces improved results over these architectures, as well as other competitive classifiers."

Introduction/Motivation
 In the introduction you discuss in more detail what you have done, why you have done it, and how well you have done it.
 In the introduction you place your work in the context of previous work that has been conducted in the field.
 In the introduction you should show sufficient knowledge of the previous literature that has tackled a problem similar to the one that you are tackling, and you should also explain why your approach should be preferred.
 The introduction should end with a paragraph that explains the organization of the paper by section number, followed by one sentence describing what each of these sections contains.

Example of Introduction Pieces: 1st Paragraph (Why ART?)
 The Adaptive Resonance Theory (ART) was developed by Grossberg (1976). One of the most celebrated ART architectures is Fuzzy ARTMAP (FAM) (Carpenter et al., 1992), which has been successfully used in the literature for solving a variety of classification problems. Some of the advantages that FAM possesses are that it can solve arbitrarily complex classification problems, it converges quickly to a solution (within a few presentations of the list of input/output patterns belonging to the training set), it has the ability to recognize novelty in the input patterns presented to it, it can operate in an on-line fashion (new input/output patterns can be learned by the system without re-training with the old input/output patterns), and it produces answers that can be explained with relative ease.

Example of Introduction Pieces: 2nd Paragraph (What Is the Problem?)
 One of the limitations of FAM that has been repeatedly reported in the literature is the category proliferation problem. This refers to the creation of a relatively large number of categories to represent the training data. Categories are the hidden nodes (or units) in a FAM neural network; each node is mapped to a specific class. The creation of a large number of categories means poor compression of the training data. Quite often the category proliferation problem observed in FAM is connected with the issue of over-training. Over-training happens when FAM tries to learn the training data perfectly, at the expense of degraded generalization performance (i.e., classification accuracy on unseen data) and also at the expense of creating many categories to represent the training data (leading to the category proliferation problem). It has also been related to several limitations of FAM, such as the representational inefficiency of the hyperbox categories or the excessive triggering of the match tracking mechanism due to the existence of noise.

Example of Introduction Pieces: 3rd Paragraph (What Have We Done?)
 In this paper, we propose the use of genetic algorithms (Goldberg, 1989) to solve the category proliferation problem, while improving the generalization performance of FAM. We refer to the resulting architecture as GFAM. In our work here, we use GAs to evolve simultaneously the weights, as well as the topology, of the FAM neural networks. We start with a population of trained FAMs, whose number of nodes in the hidden layer and the values of the interconnection weights converging to these nodes are fully determined (at the beginning of the evolution) by ART's training rules. To this initial population of FAM networks, GA operators are applied to modify these trained FAM architectures (i.e., the number of nodes in the hidden layer and the values of the interconnection weights) in a way that encourages better generalization and smaller-size architectures.
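The evolutionary loop described in this paragraph can be sketched roughly as follows. This is a minimal illustration only: the network representation (a plain list of category weight vectors), the fitness weighting, and the operator probabilities are hypothetical placeholders, not the actual GFAM implementation.

```python
import random

# Hypothetical sketch of evolving a population of trained networks.
# Each "network" is simply a list of categories; each category is a list
# of weights. All names and constants are illustrative.

def fitness(network, accuracy_fn):
    # Reward high accuracy, penalize size (many categories).
    return accuracy_fn(network) - 0.01 * len(network)

def evolve(population, accuracy_fn, generations=100,
           p_delete=0.1, p_mutate=0.2):
    for _ in range(generations):
        scored = sorted(population, key=lambda n: fitness(n, accuracy_fn),
                        reverse=True)
        survivors = scored[:len(scored) // 2]            # selection
        offspring = []
        for net in survivors:
            child = [list(cat) for cat in net]           # copy categories
            if len(child) > 1 and random.random() < p_delete:
                child.pop(random.randrange(len(child)))  # delete a category
            if random.random() < p_mutate:
                cat = random.choice(child)
                i = random.randrange(len(cat))
                cat[i] += random.gauss(0, 0.05)          # perturb one weight
            offspring.append(child)
        population = survivors + offspring
    # Return the highest-fitness network of the last generation.
    return max(population, key=lambda n: fitness(n, accuracy_fn))
```

The key point the paragraph makes is that evolution starts from already-trained networks and that the operators change both topology (category deletion) and weights (mutation), which the sketch mirrors.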

Example of Introduction Pieces: 4th Paragraph (How Well Have We Done?)
 Our results show that GFAM performed well on a number of classification problems, and on a few of them it performed optimally. Furthermore, GFAM networks were found to be superior to a number of other ART networks (ssFAM, ssEAM, ssGAM, safe micro-ARTMAP) that have been introduced into the literature to address the category proliferation problem in ART. More specifically, GFAM gave better generalization performance and a network of smaller than or equal size (in almost all problems tested), compared to these other ART networks, while requiring reduced computational effort to achieve these advantages. In some instances the difference in classification performance between GFAM and these other ART networks was quite significant (as high as 10%). Also, in some instances the ratio of the number of categories created by these other ART networks to the number of categories created by GFAM was large (as high as 5).

Example of Introduction Pieces: 5th Paragraph (Organization of the Paper)
 The organization of this paper is as follows: In Section 2 we present a literature review relevant to some of the issues addressed in this paper. In Section 3 we emphasize some of the basics of the Fuzzy ARTMAP architecture. In Section 4 we describe all the necessary elements pertinent to the evolution of the Fuzzy ARTMAP architecture. In Section 5, we describe the experiments and the datasets used to assess the performance of GFAM, we assess the performance of GFAM, and we offer performance comparisons between GFAM and other ART architectures that were proposed as solutions to the category proliferation problem in FAM. Also, in Section 5 we compare GFAM with other non-ART-based classifiers (although the comparison is not comprehensive). In Section 6, we summarize our contributions and findings, and we provide directions for future research.

Literature Review Section
 It is a good idea to have a separate section, immediately after the introduction, that discusses the literature review in more detail. The literature review needs to be:
 Thorough,
 Relevant, and
 Related to the problem that you are addressing (for example, in the outlier detection strategies by Gramajo and Fox we should refer to the A-priori algorithm paper, where the idea of frequent itemsets was introduced).

Main Paper (Algorithm Description)
 In this part of the paper you describe the algorithm that you have invented (such as an efficient FIM algorithm for outlier detection problems) and you provide:
 Pseudo-code,
 A step-by-step description (with explanations), and
 An example (explaining how your algorithm operates).

Main Paper (Algorithm Explanations)
 In this part of the paper you explain how and why you came up with this algorithm and you emphasize some of the good properties of the algorithm… Words, words, words…

Main Paper (Experimental Design)
 In this part of the paper you describe how you are going to conduct your experiments to compare the merits of your algorithm with the merits of other techniques that have appeared in the literature and address similar problems.
 If your algorithm is a classifier, your typical measures of merit are:
 Generalization performance,
 Size of your classifier, and
 Time complexity to design your classifier.
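The three measures of merit listed above can be gathered in one evaluation routine. The sketch below assumes a generic classifier object with `fit`/`predict` methods and an optional `n_categories` size attribute; these names are illustrative conventions, not anything prescribed by the slides.

```python
import time

# Illustrative sketch: compute generalization performance, classifier size,
# and design-time for a generic classifier. The fit/predict interface and
# the n_categories attribute are assumptions for this example.

def evaluate(classifier, X_train, y_train, X_test, y_test):
    start = time.perf_counter()
    classifier.fit(X_train, y_train)
    design_time = time.perf_counter() - start         # time-complexity proxy

    predictions = classifier.predict(X_test)
    correct = sum(p == y for p, y in zip(predictions, y_test))
    generalization = correct / len(y_test)            # accuracy on unseen data

    size = getattr(classifier, "n_categories", None)  # model size, if exposed
    return generalization, size, design_time
```

Reporting all three together makes the trade-off explicit: a classifier may gain accuracy at the cost of size or training time, which is exactly the comparison the experimental design should expose.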

Main Paper (Experimental Design): An Example
 We conducted a number of experiments to assess the performance of the genetically engineered Fuzzy ARTMAP (GFAM) architecture. There were two objectives for this experimentation.
 The first objective is to find good (default) values for the ranges of two of the GA parameters: the probability of deleting a category and the probability of mutating a category. The default values were identified by conducting experiments with 19 databases. This effort is described in detail in Section 5.2.
 The second objective is to compare the GFAM performance (for the default parameter values) to that of popular ART architectures that have been proposed in the literature with the intent of addressing the category proliferation problem in FAM, such as ssFAM, ssEAM, ssGAM, and micro-ARTMAP.

Main Paper (Databases)
 In this part you describe the databases that you use to compare your algorithm with other techniques. In choosing the databases for your experiments you should consider databases that:
 Have different numbers of data points,
 Have different dimensionality for the input patterns,
 Have different numbers of labels for the output patterns, and
 Correspond to classification problems of varying degrees of difficulty.

Main Paper (Databases): An Example
 We experimented with both artificial and real databases. Table 1 shows the specifics of these databases.
 Gaussian Databases (Database Index 1-12): These are artificial databases, where we created 2-dimensional data sets, Gaussianly distributed, belonging to 2-class, 4-class, and 6-class problems. In each one of these databases, we varied the amount of overlap of data belonging to different classes. In particular, we considered 5%, 15%, 25%, and 40% overlap. Note that 5% overlap means the optimal Bayesian classifier would have a 5% misclassification rate on the Gaussianly distributed data. There are a total of 3×4 = 12 Gaussian databases. We name the databases "G#c-##", where the first number is the number of classes and the second number is the class overlap. For example, G2c-05 means the Gaussian database is a 2-class, 5% overlap database.
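Artificial databases of the kind described above can be generated with a few lines of code. The sketch below produces 2-D Gaussian class data; the class centers and spread are arbitrary placeholders, and unlike the actual databases it does not calibrate the overlap to a target Bayes error rate.

```python
import math
import random

# Minimal sketch of a 2-D Gaussian multi-class data set. Centers are placed
# on a unit circle so neighboring classes partially overlap; the "spread"
# parameter loosely controls how much. Illustrative only.

def gaussian_dataset(n_classes=2, points_per_class=100, spread=0.5):
    data = []
    for label in range(n_classes):
        cx = math.cos(2 * math.pi * label / n_classes)   # class center x
        cy = math.sin(2 * math.pi * label / n_classes)   # class center y
        for _ in range(points_per_class):
            x = random.gauss(cx, spread)
            y = random.gauss(cy, spread)
            data.append(((x, y), label))
    return data
```

A controlled synthetic data set like this is useful precisely because the difficulty of the classification problem (the class overlap) is a knob you turn, rather than an unknown property of real data.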

Main Paper (Experimental Results)
 This part is one of the most important parts of the paper. Your objective here is to present your results with tables (as needed) and figures (as needed), and to make appropriate observations about how well your algorithm performs.
 It is important in this part to compare your algorithm with other similar techniques that have appeared in the literature and to provide enough convincing evidence that your algorithm works better (in some aspects) than the other techniques.
 It is always advantageous that your comparisons rely on your own implementations of all the other algorithms that you compare your new algorithm with (quite often this is not feasible).

Main Paper (Experimental Results): An Example
 See the GFAM paper.

Main Paper (Summary/Conclusions)
 In the Summary/Conclusions you emphasize what you have done and you provide the major observations from your work, as well as the contributions of your work.
 Occasionally you provide directions for future research that you, or others, might pursue.

Main Paper (Summary/Conclusions): An Example
 In this paper, we have introduced yet another method of solving the category proliferation problem in ART. This method relies on evolving a population of trained Fuzzy ARTMAP (FAM) neural networks. We refer to the resulting architecture as GFAM.
 We have experimented with a number of databases that helped us identify good default parameter settings for the evolution of FAM. We defined a fitness function that gave emphasis to the creation of small-size FAM networks that exhibited good generalization. In the evolution of trained FAM networks, we used a unique (and needed) operator: the delete category operator. This operator allowed us to evolve toward FAM networks of smaller size. The network identified at the end of the evolutionary process (i.e., in the last generation) was the FAM network that attained the highest fitness value. Our method for creating GFAM resulted in a FAM network that performed well on a number of classification problems, and on a few of them it performed optimally.

Main Paper (Summary/Conclusions): An Example (Continued)
 GFAM was found to be superior to a number of other ART networks (ssFAM, ssEAM, ssGAM, safe micro-ARTMAP) that have been introduced into the literature to address the category proliferation problem in ART. More specifically, GFAM gave better generalization performance (in almost all problems tested) and a network of smaller than or equal size (in almost all problems), compared to these other ART networks, while requiring reduced computational effort to achieve these advantages.
 In particular, in some instances the difference in classification performance between GFAM and these other ART networks was quite significant (as high as 10%). Furthermore, in some instances the ratio of the number of categories created by these other ART networks to the number of categories created by GFAM was large (as high as 5). Finally, some comparisons were also drawn between GFAM and dFAM, FasART, and dFasART, among other classifiers, which led us to the conclusion that GFAM achieves good classification accuracy by creating an ART network whose size compares very favorably with the sizes of the other classifiers.
 Obviously, the introduced method of evolving trained FAMs can be extended to other ART architectures, such as EAM and GAM, amongst others, without any significant changes in the approach followed.

References
 This part is obvious, in the sense that you will be providing a list of all the references that are related to your work, and all the references that influenced your work.
 The reference list has to be complete, and each paper needs to be appropriately cited according to the standards of the publication to which your contribution is submitted.

Appendices
 You may choose to have one or more appendices, if your paper requires it. In an appendix you put information that is important but that you decided not to put in the main part of the paper, so as to avoid disturbing the flow of the paper.
 For example, an appendix could be:
 A glossary of terms,
 A proof of a theorem that is lengthy, or
 Pieces of code that you want the reader to know of.

Acknowledgments
 Here you acknowledge the sponsors of your work (e.g., the YES Program).
 You also acknowledge other people who helped you produce this paper but are not included in the author list.

Questions?