Hierarchical Deep Convolutional Neural Network
Justin Essert
Motivation
- Convolutional Neural Networks
  - Widespread standard for image recognition
  - Can take weeks to train
  - Struggle on sparse data
Dataset – CIFAR-100
- 50,000 32x32 training images
- Low resolution
- Fewer samples per category than many other datasets
- State-of-the-art: 30-40% error rates
- 100 categories (fine), grouped into 20 super-categories (coarse)

  Superclasses            Classes
  Aquatic Mammals         Beaver, Dolphin, Otter, Seal, Whale
  Flowers                 Orchids, Poppies, Roses, Sunflowers, Tulips
  Household Furniture     Bed, Chair, Couch, Table, Wardrobe
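The fine/coarse label split can be loaded directly; a minimal sketch, assuming TensorFlow/Keras is available (variable names are illustrative):

```python
from tensorflow.keras.datasets import cifar100

# 100 fine-grained class labels (beaver, orchid, wardrobe, ...)
(x_train, y_fine), (x_test, y_fine_test) = cifar100.load_data(label_mode="fine")
# 20 coarse superclass labels (aquatic mammals, flowers, household furniture, ...)
(_, y_coarse), (_, y_coarse_test) = cifar100.load_data(label_mode="coarse")

print(x_train.shape)                         # (50000, 32, 32, 3): 50,000 32x32 images
print(y_fine.max() + 1, y_coarse.max() + 1)  # 100 fine classes, 20 superclasses
```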
Convolutional Neural Networks – VGG16
- Pre-trained on ImageNet (transfer learning)
- Simonyan, Karen & Zisserman, Andrew. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv:1409.1556, 2015.
- Photo credit: Heuritech Le Blog
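One way to set up the transfer-learning baseline is to reuse the ImageNet-pre-trained VGG16 convolutional base and attach a new classifier head. A minimal sketch, assuming TensorFlow/Keras; the frozen base and the 512-unit dense head are illustrative choices, not the exact configuration used here:

```python
import tensorflow as tf

# VGG16 convolutional base pre-trained on ImageNet; the original classifier head is dropped.
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(32, 32, 3))
base.trainable = False  # freeze the pre-trained filters for the initial training phase

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dense(100, activation="softmax"),  # 100 fine CIFAR-100 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```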
Hierarchical Deep CNN
- Initial network is trained on the entire dataset
- Fine -> coarse prediction
- 20 'fine' networks (one for each superclass)
- Final prediction = weighted sum of the fine networks
- Yan, Zhicheng et al. HD-CNN: Hierarchical Deep Convolutional Neural Network for Large Scale Visual Recognition. arXiv:1410.0736, 2014.
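The combination step can be written as a probabilistic average: each fine network's output, viewed as a distribution over the full fine label set, is weighted by the coarse classifier's probability for the corresponding superclass. A minimal sketch with a hypothetical `hdcnn_predict` helper (illustrative, not the paper's code):

```python
import numpy as np

def hdcnn_predict(coarse_probs, fine_probs_per_superclass):
    """Weighted sum of fine-network predictions.

    coarse_probs: array of shape (num_superclasses,) from the coarse classifier.
    fine_probs_per_superclass: one array of shape (num_fine_classes,) per superclass,
        each a distribution over the full fine label set.
    """
    final = np.zeros_like(fine_probs_per_superclass[0], dtype=float)
    for weight, fine_probs in zip(coarse_probs, fine_probs_per_superclass):
        final += weight * fine_probs  # fine network weighted by its superclass probability
    return final
```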
Hierarchical Deep CNN (Example)
Dataset:

  Superclasses            Classes
  Aquatic Mammals         Beaver, Otter
  Household Furniture     Bed, Wardrobe

So let's look at the output of our trained network when we pass in a picture of a wardrobe!
Hierarchical Deep CNN (Example)

Coarse Classifier:
  Labels                  Probability
  Aquatic Mammals         0.10
  Household Furniture     0.90

1st Fine Classifier:
  Labels                  Probability
  Beaver                  0.50
  Otter                   0.50

2nd Fine Classifier:
  Labels                  Probability
  Bed                     0.00
  Wardrobe                1.00
Hierarchical Deep CNN (Example)

Coarse Classifier:
  Labels                  Probability
  Aquatic Mammals         0.10
  Household Furniture     0.90

1st Fine Classifier:
  Labels                  Probability
  Beaver                  0.50
  Otter                   0.50

2nd Fine Classifier:
  Labels                  Probability
  Bed                     0.00
  Wardrobe                1.00

Final Classification:
  Labels                  Probability
  Beaver                  0.10 * 0.50 = 0.05
  Otter                   0.10 * 0.50 = 0.05
  Bed                     0.90 * 0.00 = 0.00
  Wardrobe                0.90 * 1.00 = 0.90
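Plugging the numbers above into the weighted sum reproduces the final classification; a quick check with NumPy, assuming the Otter probability is the remaining 0.50 and labels are ordered Beaver, Otter, Bed, Wardrobe:

```python
import numpy as np

coarse = np.array([0.10, 0.90])              # Aquatic Mammals, Household Furniture
fine = [np.array([0.50, 0.50, 0.00, 0.00]),  # 1st fine classifier (aquatic mammals)
        np.array([0.00, 0.00, 0.00, 1.00])]  # 2nd fine classifier (household furniture)

final = sum(w * p for w, p in zip(coarse, fine))
print(final)  # [0.05 0.05 0.   0.9 ] -> Wardrobe wins with probability 0.90
```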
Questions?