ANN Tutorial: Introductory Experiments on Speech Signal Processing (語音訊號處理之初步實驗), NTU Speech Lab. Advisor (指導教授): 李琳山; TA (助教): 熊信寬
Speech Recognition: spoken input ㄊㄞ ㄨㄢ ㄉㄚ ㄒㄩ ㄝ → Acoustic Model → Language Model → text output 台 灣 大 學
Our Task: How do we build this machine? A Classifier that maps speech to the classes ㄅ ㄆ ㄇ ㄈ ㄉ ㄊ ㄋ ㄌ ㄍ ㄎ ㄏ ㄐ ㄑ ㄒ ㄓ ㄔ ㄕ ㄖ ㄗ ㄘ ㄙ ㄧ ㄨ ㄩ ㄚ ㄛ ㄜ ㄝ ㄞ ㄟ ㄠ ㄡ ㄢ ㄣ ㄤ ㄥ ㄦ
“Field of study that gives computers the ability to learn without being explicitly programmed”
Machine Learning
Instead of writing a program by hand for each specific task, we collect lots of examples that specify the correct output for a given input. A machine learning algorithm then takes these examples and produces a program that does the job. The program produced by the learning algorithm may look very different from a typical hand-written program. It may contain millions of numbers. If we do it right, the program works for new cases as well as the ones we trained it on. If the data changes, the program can change too by training on the new data. Massive amounts of computation are now cheaper than paying someone to write a task-specific program.
Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera
The Classification Task
A Classifier maps Features (e.g. hair length, make-up) to Classes (Male, Female, Others). For example, suppose we want to classify the people in this room by gender.
2D Feature Space [figure: Male and Female examples plotted against two features such as hair length, make-up, or voice pitch]
We need some type of non-linear function!
Multi-D Feature Space
Neurons Each neuron receives inputs from other neurons
The effect of each input line on the neuron is controlled by a synaptic weight. The weights can be positive or negative. The synaptic weights adapt so that the whole network learns to perform useful computations: recognizing objects, understanding language, making plans, controlling the body.
Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera
Artificial Neural Network
Feed Forward Net: each neuron computes y = f(b + Σ_i w_i x_i), where the x_i are the inputs, w_i is the weight on the i-th input connection, b is the bias, and f is the activation function (i indexes over input connections). Stacking many simple non-linearities gives a complex non-linearity.
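As a minimal sketch of this computation in Matlab (the numbers below are made up, not from the slides), one neuron's forward pass could look like:

x = [0.5; -1.2; 3.0; 0.7];   % x1..x4: inputs from other neurons
w = [0.1; 0.4; -0.3; 0.8];   % w1..w4: weights on each input connection
b = 0.2;                     % bias
z = w' * x + b;              % weighted sum of inputs plus bias
y = 1 / (1 + exp(-z));       % sigmoid activation -> neuron output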
Activation Function: the Sigmoid Function, a.k.a. the Logistic Function: σ(z) = 1 / (1 + e^(-z)).
How Do Neural Nets Learn?
Intuition: 1. Start with random weights. 2. Compare the outputs of the net (y_j) to the targets (t_j). 3. Adjust the weights so that the outputs move closer to the targets.
Gradient Descent [figure: error surface plotted over the weights w1, w2; following the negative gradient of the error gives the updated weights]
Each weight is updated as w ← w − η ∂E/∂w, where η is the learning rate.
Reference: Machine Learning course by Andrew Ng, Coursera
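As a toy illustration invented for this sketch (not part of the demo), gradient descent on a single sigmoid neuron with squared error could be written in Matlab as:

X   = [0 0 1 1; 0 1 0 1];    % four 2-dim training inputs (one per column)
T   = [0 0 0 1];             % targets (here: logical AND)
w   = 0.1 * randn(2, 1);     % start with small random weights
b   = 0;
eta = 0.5;                   % learning rate
for epoch = 1:1000
    for n = 1:size(X, 2)
        x = X(:, n);  t = T(n);
        y = 1 / (1 + exp(-(w' * x + b)));   % forward pass
        delta = (y - t) * y * (1 - y);      % dE/dz for squared error + sigmoid
        w = w - eta * delta * x;            % move the weights downhill on the error surface
        b = b - eta * delta;
    end
end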
Back Propagation [figure: the error at the output y_j of unit j is propagated back through its total input z_j to the outputs y_i of the units in the layer below]
Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera
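A minimal sketch of back propagation for one hidden layer of sigmoid units with squared error (the sizes and data below are made up, not the demo's):

sigm = @(z) 1 ./ (1 + exp(-z));
x  = rand(117, 1);                  % one spliced input vector
t  = zeros(4, 1);  t(2) = 1;        % one-hot target
W1 = 0.01 * randn(20, 117);  b1 = zeros(20, 1);
W2 = 0.01 * randn(4, 20);    b2 = zeros(4, 1);
eta = 0.1;
h = sigm(W1 * x + b1);              % forward pass: hidden outputs y_i
y = sigm(W2 * h + b2);              % forward pass: outputs y_j
dz2 = (y - t) .* y .* (1 - y);      % dE/dz_j at the output layer
dz1 = (W2' * dz2) .* h .* (1 - h);  % dE/dz_i, propagated back through the weights
W2 = W2 - eta * dz2 * h';  b2 = b2 - eta * dz2;   % gradient descent update
W1 = W1 - eta * dz1 * x';  b1 = b1 - eta * dz1;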
Overfitting Which model do you trust?
The complicated model fits the data better, but it is not economical. A model is convincing when it fits a lot of data surprisingly well, not just the training data. [figure: data points with output y plotted against input x, and a test input for which the two models predict very different output values]
Reference: Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera
Training and Testing
Divide the dataset into a training set and a validation set.
Use the training set for training and the validation set as an indicator of overfitting.
Use another independent test set to evaluate performance.
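One possible way to make such a split by hand (a sketch; it assumes Feature and Label store one example per column, and the toolkit's train() can also split for you):

N   = size(Feature, 2);
idx = randperm(N);                   % shuffle the examples
nTr = round(0.7 * N);                % e.g. 70% training
nVa = round(0.15 * N);               % 15% validation, the rest for testing
trainX = Feature(:, idx(1:nTr));           trainY = Label(:, idx(1:nTr));
valX   = Feature(:, idx(nTr+1:nTr+nVa));   valY   = Label(:, idx(nTr+1:nTr+nVa));
testX  = Feature(:, idx(nTr+nVa+1:end));   testY  = Label(:, idx(nTr+nVa+1:end));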
Speech Recognition (revisited): spoken input ㄊㄞ ㄨㄢ ㄉㄚ ㄒㄩ ㄝ → Acoustic Model → Language Model → text output 台 灣 大 學
Our Task: classify speech into the classes ㄅ ㄆ ㄇ ㄈ ㄉ ㄊ ㄋ ㄌ ㄍ ㄎ ㄏ ㄐ ㄑ ㄒ ㄓ ㄔ ㄕ ㄖ ㄗ ㄘ ㄙ ㄧ ㄨ ㄩ ㄚ ㄛ ㄜ ㄝ ㄞ ㄟ ㄠ ㄡ ㄢ ㄣ ㄤ ㄥ ㄦ
Example --- Inputs
MFCC: a 13-dim vector per frame. We can 'splice' the 4 left and 4 right neighbouring MFCC frames onto each frame to form a 13 x 9 = 117 dim vector.
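A sketch of how this splicing could be done (it assumes mfcc is a 13 x T matrix holding one 13-dim vector per frame; boundary frames are padded by repeating the first/last frame):

nFrames = size(mfcc, 2);
context = 4;                                     % 4 frames on each side
spliced = zeros(13 * (2*context + 1), nFrames);  % 117 x nFrames
for t = 1:nFrames
    idx = max(t-context, 1) : min(t+context, nFrames);      % clamp at the edges
    block = [repmat(mfcc(:,1),   1, context - (t - idx(1))), ...
             mfcc(:, idx), ...
             repmat(mfcc(:,end), 1, context - (idx(end) - t))];
    spliced(:, t) = block(:);                    % stack the 9 frames into one 117-dim vector
end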
Example --- Output
Right-context-dependent initial/final phones: 'a', 'ai', 'an', 'ang', 'au', 'b_a', 'b_e', 'b_ee', 'b_i' …
[figure: each 117-dim spliced MFCC vector is mapped to a target vector with a 1 at the position of its class, e.g. 'a', 'ai', 'an', …]
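A sketch of how such target vectors could be built (classNames and frameLabels below are invented placeholders; the real class list has 147 entries):

classNames  = {'a', 'ai', 'an', 'ang', 'au', 'b_a', 'b_e', 'b_ee', 'b_i'};  % ...
frameLabels = {'ai', 'a', 'b_a'};                  % class of each training frame
[~, classIdx] = ismember(frameLabels, classNames); % class index of each frame
Label = zeros(numel(classNames), numel(frameLabels));
Label(sub2ind(size(Label), classIdx, 1:numel(frameLabels))) = 1;   % one 1 per column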
Matlab NN Toolkit --- Net
net = newff(Feature,Label,10);
% given Feature and Label, this command generates an NN with 10 hidden units
net = newff(Feature,Label,[20 10],{'logsig' 'tansig'});
% this command generates an NN with two layers: one with 20 units and sigmoid as
% activation function, one with 10 units and tanh as activation function
Matlab NN Toolkit --- Train
[net,tr] = train(net,Feature,Label);
% this command trains the network, dividing your dataset into training, validation, and test sets
Matlab NN Toolkit --- Test
out = sim(net,testInputs);
% this command runs the network you have trained on a test data set to obtain its outputs
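To turn these outputs into an accuracy figure like the demo reports, one possible sketch (testLabels is assumed to hold the one-hot targets of the test set):

[~, predicted] = max(out, [], 1);          % index of the largest output per frame
[~, reference] = max(testLabels, [], 1);   % index of the 1 in each one-hot target
nCorrect = sum(predicted == reference);
fprintf('correct: %d out of %d, accuracy = %.4f\n', ...
        nCorrect, numel(reference), nCorrect / numel(reference));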
Matlab NN Toolkit --- NNTool
nntool
% this command calls a GUI for NN visualization
Demo Program Location: speech.ee.ntu.edu.tw/~poetek/nnkit
Copy the folder to your computer, and run the file 'run.m' for a quick demo.
DEMO
Demo Program --- Results
Example (1): Classify the inputs into 4 classes: {'a', 'e', 'i', 'u'}
correct: 1139 out of 1205
accuracy = 0.9452
Demo Program --- Results(2)
Example (2): Classify the inputs into the classes {'a', 'e', 'i', 'u', 'b_a', 'p_a', 'm_a', 'f_a'}
correct: 1293 out of 1446
accuracy = 0.8942
Demo Program --- Results(3)
Example (3): Classify the inputs into all 147 classes
correct: 92 out of 506
accuracy = 0.1818
Assignments
Please run examples 1~3 with the demo program; record your performance, run time, and your observations.
Please play around with different settings to achieve better accuracy.
Some parameters you can change (see the sketch below):
Net structure
Training algorithm
Input features
net.trainParam (please type 'help newff' for more info)
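As a sketch of how such settings might be changed (the field and function names below follow the toolkit documentation, e.g. 'help newff' and 'help train'; check your Matlab version):

net = newff(Feature, Label, [40 20], {'logsig' 'logsig'});  % different net structure
net.trainFcn            = 'trainscg';   % different training algorithm (scaled conjugate gradient)
net.trainParam.epochs   = 300;          % train for more epochs
net.trainParam.max_fail = 10;           % more patience before validation stopping
[net, tr] = train(net, Feature, Label);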
References
Neural Networks for Machine Learning course by Geoffrey Hinton, Coursera
Machine Learning course by Andrew Ng, Coursera
Matlab NN toolkit documentation
Thank You! Feel free to ask questions!