HyperNetworks (Engin Deniz Usta, 1819630)


Outline
- Introduction
- Background
  - Evolving Neural Networks: NEAT and HyperNEAT
  - Pattern Producing: CPPN and DPPN
  - Compressed Weight Search
- General Approach of HyperNetworks
  - Static vs Dynamic
- Implementation
- Experiments
- Conclusion

Introduction
A review of "HyperNetworks" (Ha et al.), ICLR 2017.

NEAT: NeuroEvolution of Augmenting Topologies
Source: Evolving Neural Networks through Augmenting Topologies (Stanley & Miikkulainen, 2002)

NEAT: NeuroEvolution of Augmenting Topologies
Evolves both the topologies and the weights of neural networks with a genetic algorithm:
- Selection by fitness
- Mutation
- Crossover
Source: Efficient Evolution of Neural Network Topologies (Stanley, 2002)
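As a toy illustration of the selection/mutation/crossover loop behind NEAT (this is a generic genetic algorithm sketch, not NEAT's actual genome encoding; all names and parameters are made up):

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=50, seed=0):
    """Minimal genetic algorithm: fitness-based selection, single-point
    crossover, and Gaussian point mutation."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population (elitism).
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)   # single-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(genome_len)        # point mutation on one gene
            child[i] += rng.gauss(0, 0.1)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness: maximize the sum of the genes.
best = evolve(fitness=sum)
```

NEAT adds to this skeleton a genome that encodes nodes and connections, historical innovation numbers to align genomes during crossover, and speciation to protect new topologies.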

Compositional Pattern Producing Networks
Inputs:
- Coordinates (x, y)
- Distance to origin (d)
- Bias
Output:
- Intensity of the pixel (grayscale)
Source: Compositional Pattern Producing Networks: A Novel Abstraction of Development (Stanley, 2007)
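A minimal sketch of the idea, with a hand-wired network standing in for an evolved CPPN (the particular functions and constants are arbitrary choices for illustration):

```python
import math

def cppn(x, y):
    """Hand-wired CPPN: composes periodic and symmetric activation
    functions over the inputs (x, y, distance-to-origin d, bias) and
    returns a grayscale pixel intensity."""
    d = math.sqrt(x * x + y * y)
    bias = 1.0
    h1 = math.sin(5.0 * x)                 # periodic -> repetition along x
    h2 = math.exp(-d * d)                  # Gaussian of d -> radial structure
    out = math.tanh(h1 + h2 + 0.5 * bias)  # squash to (-1, 1)
    return (out + 1.0) / 2.0               # intensity in [0, 1]

# Render a small image by querying the CPPN at every pixel coordinate.
size = 8
image = [[cppn(2 * c / (size - 1) - 1, 2 * r / (size - 1) - 1)
          for c in range(size)] for r in range(size)]
```

Because the composed functions have built-in regularities (periodicity, symmetry), the produced patterns inherit them: here, for example, the image is symmetric in y since y only enters through d.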

Compositional Pattern Producing Networks Source: Compositional Pattern Producing Networks: A Novel Abstraction of Development (Stanley, 2007)

HyperNEAT: Hypercube-based NEAT
- The CPPN is evolved by NEAT
- The substrate encodes the weight matrix: querying the CPPN at every pair of substrate coordinates yields the connection weights
Source: An Empirical Analysis of HyperNEAT (van den Berg, 2013)
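A sketch of the substrate idea, with a fixed toy function standing in for the evolved CPPN (the function itself and the 3x3 grid are illustrative assumptions):

```python
import math

def cppn(x1, y1, x2, y2):
    """Toy stand-in for an evolved CPPN: maps a pair of substrate
    coordinates (source node, target node) to a connection weight."""
    return math.tanh(math.sin(3 * (x1 - x2)) + math.cos(3 * (y1 + y2)))

# Substrate: nodes laid out on a 2D grid (here a 3x3 layer).
coords = [(i / 2, j / 2) for i in range(3) for j in range(3)]

# The weight matrix is obtained by querying the CPPN once per
# (source, target) coordinate pair.
weights = [[cppn(x1, y1, x2, y2) for (x2, y2) in coords]
           for (x1, y1) in coords]
```

Since the weights are a function of node geometry rather than stored individually, the same small CPPN can generate arbitrarily large, regularly structured weight matrices.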

Differentiable Pattern Producing Networks
- Evolution shapes the network topology
- The weights are learned by gradient descent
Source: Convolution by Evolution (Fernando et al., 2016)
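The evolve-the-structure / learn-the-weights split can be sketched with a deliberately tiny toy (a single scalar weight fit to y = 2x; the outer loop mutates candidates as a stand-in for topology mutations, which is a simplification of the DPPN setup):

```python
import random

def train_weights(w, lr=0.1, steps=50):
    """Inner loop (the differentiable part): gradient descent on the
    weights. Toy loss: (w - 2)^2, i.e. fitting y = 2x at x = 1."""
    for _ in range(steps):
        grad = 2 * (w - 2.0)
        w -= lr * grad
    return w

def evolve_then_learn(pop_size=6, generations=5, seed=1):
    """Outer loop (the evolutionary part): propose candidates by mutation,
    train each by gradient descent, select on post-training loss."""
    rng = random.Random(seed)
    pop = [rng.uniform(-5, 5) for _ in range(pop_size)]
    best = pop[0]
    for _ in range(generations):
        trained = [train_weights(w) for w in pop]
        best = min(trained, key=lambda w: (w - 2.0) ** 2)
        pop = [best + rng.gauss(0, 0.5) for _ in range(pop_size)]
    return best
```

The key point carried over from DPPNs: fitness is evaluated *after* the inner gradient-descent phase, so evolution only has to search over structure, not over weight values.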

Compressed Weight Search
- Genes: Fourier-series coefficients
- Weight matrix: inverse Fourier-type transform of the genes
- Discrete Cosine Transform, just like in JPEG compression
Source: Evolving Neural Networks in Compressed Weight Space (Koutník et al., 2010)
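A minimal sketch of decoding a weight vector from a handful of DCT coefficients (the inverse DCT-II is written out explicitly with NumPy; the coefficient values are arbitrary):

```python
import numpy as np

def idct_weights(coeffs, n):
    """Expand a short genome of low-frequency DCT coefficients into a
    length-n weight vector via an (unnormalized) inverse DCT-II:
    w_t = sum_k c_k * cos(pi * k * (2t + 1) / (2n))."""
    k = np.arange(len(coeffs))[:, None]   # frequency index (genome)
    t = np.arange(n)[None, :]             # weight index (phenotype)
    basis = np.cos(np.pi * k * (2 * t + 1) / (2 * n))
    return coeffs @ basis

# 3 genome coefficients expand into 16 smoothly varying weights.
genes = np.array([0.5, -0.3, 0.1])
w = idct_weights(genes, 16)
```

Evolution then searches over the few coefficients instead of the full weight vector, which biases the search toward smooth, low-frequency weight patterns.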

Neural Networks

Static HyperNetworks

Dynamic HyperNetworks

Implementation - Static
- Reduces the number of parameters by factorization
- Tradeoff between accuracy and parameter count
Source: Hypernetworks (Mike Brodie, 2016)
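A NumPy sketch of the factorized generation step, in the spirit of static hypernetworks: each conv layer stores only a small embedding, and a shared two-step projection expands it into the full kernel (dimensions and variable names here are illustrative assumptions, not the paper's exact configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypernetwork parameters, shared across all generated layers.
d = 4                      # size of each layer's embedding z (assumed)
f_in, f_out, k = 3, 8, 3   # kernel shape of the target conv layer
W1 = rng.normal(size=(f_in * d, d))       # z -> one d-vector per input channel
W2 = rng.normal(size=(f_out * k * k, d))  # d-vector -> f_out x k x k slice

def generate_kernel(z):
    """Two-step factorized generation: a d-dimensional embedding z is
    expanded into a full f_in x f_out x k x k conv kernel, so per-layer
    storage drops from f_in*f_out*k*k weights to d numbers."""
    a = (W1 @ z).reshape(f_in, d)               # chunk per input channel
    return (a @ W2.T).reshape(f_in, f_out, k, k)

z_layer = rng.normal(size=d)   # the only per-layer parameters
kernel = generate_kernel(z_layer)
```

This is where the accuracy-vs-parameters tradeoff comes from: the embedding size d caps how much layer-specific information the generated kernel can carry.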

Implementation - Static

Implementation - Dynamic

Implementation - Dynamic

Implementation - Dynamic
- Generating a full weight matrix at every time step causes memory problems
- Solution: linear scaling of the weight matrix
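The memory-saving trick can be sketched as follows: instead of emitting a full n_out x n_in matrix per time step, the hypernetwork emits only a length-n_out vector that rescales the rows of a shared matrix (shapes and names below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, n_z = 8, 8, 4

W = rng.normal(size=(n_out, n_in))   # shared base weight matrix
W_z = rng.normal(size=(n_out, n_z))  # hyper-state -> per-row scaling vector

def dynamic_matmul(x, z):
    """Memory-friendly dynamic weights: the hypernetwork output z yields a
    scaling vector d(z) of length n_out, and each row of the shared W is
    rescaled by it -- O(n_out) generated values per step instead of
    O(n_out * n_in)."""
    d = W_z @ z
    return (d[:, None] * W) @ x      # equivalent to diag(d) @ W @ x

x = rng.normal(size=n_in)
z = rng.normal(size=n_z)
y = dynamic_matmul(x, z)
```

The per-row scaling is mathematically a diagonal matrix applied to W, but it never materializes a fresh matrix per step, which is what makes the dynamic (recurrent) case tractable.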

Experiments - Static

Experiments - Dynamic
- Bits per character: similar to average cross entropy
- Lower is better
Source: Generating Sequences With Recurrent Neural Networks (Graves, 2013)
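The relation between the two metrics is just a change of logarithm base: bits per character is the average cross entropy measured in bits rather than nats.

```python
import math

def bits_per_character(nats_per_char):
    """Convert an average cross entropy in nats/character to
    bits/character: BPC = H / ln(2). Lower is better."""
    return nats_per_char / math.log(2)
```

So a model with an average cross entropy of ln(2) nats per character scores exactly 1.0 BPC.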

Conclusion
- For CNNs, the parameter count can be reduced drastically at some cost in accuracy.
- For RNNs, "generating the generative model" is very powerful and beats previous records on character-level prediction datasets.