Speech Generation Using a Neural Network

NetTalk Project: Speech Generation Using a Neural Network
Michael J Euhardy

The Speech Generation Idea
Input: a specific letter whose sound is to be generated, plus the three letters on each side of it, for a total of seven input letters.
Output: the sound that should be generated for that letter, given the surrounding letters.
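As a rough illustration (not from the original slides; the padding character and function name are assumptions), extracting the seven-letter window around letter i of a word might look like this in Matlab:

function w = contextWindow(word, i)
    % Seven-letter window centered on word(i), padded with '-' at the
    % word boundaries (the padding character is an assumption).
    padded = ['---', lower(word), '---'];
    w = padded(i : i + 6);   % three letters left, the target, three right
end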

The Strategy
26 possible letters, 7 input positions.
Map each letter in each position to a unique input unit: 7 * 26 = 182 total inputs.
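A minimal sketch of that mapping (assuming lowercase a-z only, with one active unit per position; a non-letter such as padding simply leaves all of its position's units off):

function x = encodeWindow(window)
    % One-hot encode a 7-character window into the 182-element input
    % vector: position p and letter l map to input index (p-1)*26 + l.
    x = zeros(182, 1);
    for p = 1:7
        l = double(window(p)) - double('a') + 1;   % letter index 1..26
        if l >= 1 && l <= 26                       % skip padding/punctuation
            x((p-1)*26 + l) = 1;
        end
    end
end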

The Strategy (continued)
57 possible sounds can be generated.
Map each sound to one of 57 output labels.

The Resulting ANN
A fully connected single-layer perceptron with 182 inputs and 57 outputs.
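The slides do not include code, but a single-layer perceptron of this shape could be trained with a simple delta rule along these lines (a sketch; the learning rate, batch update, stopping rule, and function name are all assumptions):

function W = trainPerceptron(X, T, lr, targetAcc, maxEpochs)
    % X: N x 182 matrix of encoded windows; T: N x 57 one-hot sound labels.
    [N, nIn] = size(X);
    nOut = size(T, 2);
    Xb = [X, ones(N, 1)];                  % append a bias input
    W  = zeros(nIn + 1, nOut);             % one weight column per sound
    for epoch = 1:maxEpochs
        Y = double(Xb * W > 0);            % threshold unit outputs
        W = W + lr * (Xb' * (T - Y));      % batch delta-rule update
        [~, pred]  = max(Xb * W, [], 2);   % winning output per example
        [~, truth] = max(T, [], 2);
        if mean(pred == truth) >= targetAcc
            break;                         % e.g. stop at 80% classification
        end
    end
end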

The Findings
The trained network performs very well, and the larger the training set and the longer it is trained, the better it performs.
Training can be an extremely long process if a high classification rate is desired and the training set is large.

Problems
Time
Space

Time
You can't rush training the network. Even on a dual Pentium III-733 with 512 MB of RAM, training on any significantly sized data set took a very long time, and just converting the characters in the data file into the input and label matrices took hours.

Space
20000 words of data at roughly 7 letters each gives about 140000 training examples: a 140000 x 239 matrix (182 inputs plus 57 labels per row).
Stored in double precision in Matlab, that is a lot of memory.
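A back-of-the-envelope check of that figure (the 239 columns being the 182 inputs plus 57 labels):

rows  = 20000 * 7;          % roughly 140,000 letter windows
cols  = 182 + 57;           % 239 values per example
bytes = rows * cols * 8;    % 8 bytes per double-precision value
fprintf('%.0f MB\n', bytes / 2^20);   % about 255 MB, half of the 512 MB of RAM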

Workarounds
Use a smaller data set: only 1000 words.
Lower the training standard: only train to 80% classification.

Next Time
Use C++: Matlab is far too slow and far too memory intensive.
Start earlier: it's a long process.
Try a multi-layer perceptron.

Conclusion
I give up! I don't know how Microsoft's Narrator does it, but I bet it doesn't do it this way.