Sigmoid and logistic regression
One-hot encoding
- One-hot: encode n states using n flip-flops
- Assign a single "1" to each state; all other flip-flop outputs are "0"
- Example: 0001, 0010, 0100, 1000
- Propagate a single "1" from one flip-flop to the next
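The encoding above can be sketched in software as well. A minimal sketch, assuming a hypothetical `one_hot` helper (not from the slides):

```python
def one_hot(index, n):
    """Return a list of n bits with a single 1 at position `index`."""
    vec = [0] * n
    vec[index] = 1
    return vec

# The 4-state example from the slide: states 0..3 over 4 positions
codes = [one_hot(i, 4) for i in range(4)]
print(codes)
```

Each state maps to a vector that is all zeros except for a single 1, matching the 0001/0010/0100/1000 pattern on the slide.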
Multilayer Neural Network for Classification
Softmax
One hot encoding and softmax function
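As an illustration of how softmax turns raw network outputs into a probability distribution that can be compared against a one-hot target, here is a minimal sketch (the numerically stable max-subtraction trick is an implementation detail, not from the slides):

```python
import math

def softmax(logits):
    """Numerically stable softmax: subtract the max before exponentiating."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for a 3-class problem
probs = softmax([2.0, 1.0, 0.1])
print(probs)        # non-negative values that sum to 1
print(sum(probs))
```

The largest logit gets the largest probability, and the outputs always sum to 1, which is what makes them comparable to a one-hot target vector.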
Error representation methods
- Classification error
- Mean squared error (MSE)
- Average cross entropy error (ACE error)
Example case
Classification error
Classification error = 1/3
Mean squared error
Mean squared error = (1/n) Σ_i (t_i - o_i)^2, where t_i is the target and o_i the network output
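The two error measures can be computed on a hypothetical 3-example batch (the slide's original example values are not given here, so these numbers are illustrative only):

```python
def classification_error(targets, outputs):
    """Fraction of examples whose argmax prediction misses the one-hot target."""
    wrong = sum(1 for t, o in zip(targets, outputs)
                if t.index(max(t)) != o.index(max(o)))
    return wrong / len(targets)

def mse(targets, outputs):
    """Sum of squared differences per example, averaged over the batch."""
    total = sum((ti - oi) ** 2
                for t, o in zip(targets, outputs)
                for ti, oi in zip(t, o))
    return total / len(targets)

# Hypothetical batch: one of three predictions has the wrong argmax.
targets = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
outputs = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.6, 0.3, 0.1]]
print(classification_error(targets, outputs))  # 1/3, as on the slide
print(mse(targets, outputs))
```

Note that classification error only looks at the argmax, so it cannot distinguish a confident wrong answer from a barely wrong one; MSE penalizes the full output vector.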
Cross entropy
The cross entropy for two distributions p and q over the same discrete probability space is defined as follows:
H(p, q) = - Σ_x p(x) log q(x)
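The definition above translates directly into code; this sketch skips terms where p(x) = 0, since they contribute nothing to the sum:

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) * log(q(x)); terms with p(x) = 0 contribute 0."""
    return -sum(px * math.log(qx) for px, qx in zip(p, q) if px > 0)

# With a one-hot p, H(p, q) reduces to -log of the probability
# that q assigns to the true class.
print(cross_entropy([1, 0, 0], [0.7, 0.2, 0.1]))
```
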
Average Cross Entropy (ACE) error
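ACE is simply the cross entropy from the previous slide averaged over the examples in a batch. A minimal sketch, assuming one-hot targets (the helper name `ace` is hypothetical):

```python
import math

def ace(targets, outputs):
    """Average cross entropy over a batch of (one-hot target, prediction) pairs."""
    def ce(p, q):
        return -sum(px * math.log(qx) for px, qx in zip(p, q) if px > 0)
    return sum(ce(t, o) for t, o in zip(targets, outputs)) / len(targets)

# Hypothetical 2-example batch: each prediction gives 0.5 to the true class.
print(ace([[1, 0], [0, 1]], [[0.5, 0.5], [0.5, 0.5]]))  # log(2) per example
```
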
MSE vs. ACE
- ACE converges faster
- Which one learns better depends on the case
- ACE is commonly used for classification, MSE for regression
Rectified Linear Unit
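The rectified linear unit is the simplest of the activations mentioned here: it passes positive inputs through unchanged and clamps negatives to zero. A one-line sketch:

```python
def relu(x):
    """ReLU(x) = max(0, x)."""
    return max(0.0, x)

print([relu(x) for x in [-2.0, -0.5, 0.0, 1.5]])
```

Unlike the sigmoid, ReLU does not saturate for large positive inputs, which is one reason it became a common choice for hidden layers.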