Presentation transcript:

NETWORK SONGS !! created by Carina Curto & Katherine Morrison, January 2016

Input: a simple directed graph G satisfying two rules: 1. G is an oriented graph (no bidirectional connections), and 2. every node (neuron) of G has at least one outgoing edge.

Process: Use the graph to create a neural network with threshold-linear dynamics (next slide). Next, choose an initial condition and compute the solution to the network equations. The solution is a set of firing rates, one per neuron, as a function of time. Finally, associate a piano key to each neuron and use the neuron's firing rate to modulate the amplitude of that key's frequency. Superimpose the amplitude-modulated frequencies for all neurons to obtain a single acoustic signal.

Output: the resulting acoustic signal is the network's song!
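The sonification step of the Process (firing rate → amplitude-modulated piano-key frequency → summed acoustic signal) can be sketched in a few lines. This is not the authors' code: the pentatonic scale, duration, sample rate, interpolation, and peak normalization are illustrative assumptions, and the firing rates are assumed to come from a separate simulation of the network equations on the next slide.

```python
# Minimal sonification sketch for the "network song" pipeline described above.
# Assumptions (not from the slides): firing rates sampled on a uniform time
# grid, one pentatonic piano-key frequency per neuron, peak-normalized output.
import numpy as np


def network_song(rates, freqs_hz, duration_s=8.0, sample_rate=44100):
    """Turn firing rates into a single audio signal.

    rates    : array of shape (n_neurons, n_timepoints), each neuron's firing
               rate as a function of simulation time.
    freqs_hz : one piano-key frequency per neuron.
    Each neuron contributes a sine wave at its frequency whose amplitude is
    modulated by its firing rate; the contributions are summed into one signal.
    """
    rates = np.asarray(rates, dtype=float)
    n_neurons, _ = rates.shape
    t = np.linspace(0.0, duration_s, int(duration_s * sample_rate), endpoint=False)

    # Resample each neuron's firing rate onto the audio time grid.
    sim_t = np.linspace(0.0, duration_s, rates.shape[1])
    song = np.zeros_like(t)
    for i in range(n_neurons):
        envelope = np.interp(t, sim_t, rates[i])
        song += envelope * np.sin(2.0 * np.pi * freqs_hz[i] * t)

    peak = np.max(np.abs(song))
    return song / peak if peak > 0 else song


# Toy usage with made-up firing rates for 5 neurons and a pentatonic scale
# (C4, D4, E4, G4, A4); real rates would come from the network simulation.
# The result could be written to disk with scipy.io.wavfile.write.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy_rates = np.abs(rng.standard_normal((5, 200)))
    pentatonic = [261.63, 293.66, 329.63, 392.00, 440.00]
    signal = network_song(toy_rates, pentatonic)
    print(signal.shape, signal.min(), signal.max())
```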

The neural network. This slide gives the threshold-linear network dynamics, the parameter constraints, and the threshold nonlinearity as equations. The model is a network of excitatory and inhibitory cells in which the graph G specifies the excitatory interactions, and the graph-based connectivity matrix is built directly from G.
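The equations referenced on this slide appear as images in the original presentation and are not part of the transcript. As a hedged reconstruction, the standard combinatorial threshold-linear network (CTLN) formulation used by Curto and Morrison in their published work is shown below; the slide's exact notation and parameter names may differ.

```latex
% Threshold-linear network dynamics (standard CTLN form; assumes amsmath)
\[
  \frac{dx_i}{dt} = -x_i + \Big[ \sum_{j=1}^{n} W_{ij}\, x_j + \theta \Big]_+ ,
  \qquad i = 1, \dots, n,
\]
% where [y]_+ = max(y, 0) is the threshold nonlinearity and theta > 0 is a
% constant external drive.

% Graph-based connectivity matrix
\[
  W_{ij} =
  \begin{cases}
    0 & \text{if } i = j, \\
    -1 + \varepsilon & \text{if } j \to i \text{ in } G, \\
    -1 - \delta & \text{if } j \not\to i \text{ in } G,
  \end{cases}
\]
% with parameter constraints theta > 0, delta > 0, and
% 0 < epsilon < delta / (delta + 1).
\[
  \theta > 0, \qquad \delta > 0, \qquad 0 < \varepsilon < \frac{\delta}{\delta + 1}.
\]
```

In this formulation every off-diagonal entry of W is negative, so each neuron effectively inhibits every other neuron; edges of G simply make the inhibition slightly weaker (−1 + ε) than non-edges (−1 − δ).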

song 1: penta. The sequence of notes and the rhythm are emergent properties of the network dynamics. Listen to the song!

song 2: skipping. The only difference between this network and the previous one is the graph. Listen to the song!

song 3: whistle. Can you hear how this one takes longer to settle into the repeating pattern? Listen to the song!

song 4: arhythmia. Does the song for this network ever perfectly repeat? Listen to the song!