Automated Recipe Completion using Multi-Label Neural Networks

Automated Recipe Completion using Multi-Label Neural Networks
Alexander Politowicz
12/12/2018

Executive Summary
- Goal: given a partial list of ingredients, predict the missing ones
- Status: completed! The model attempts to predict the missing ingredients
- Unfortunately, accuracy is not great; there is room for improvement
- Example ingredients from the slide graphic: Sesame Oil, Sichuan peppercorns, Cherry tomatoes, Paprika

Approaches and Data Analysis
- Initial problem: variable-length input and output, which is difficult for most machine learning (ML) models
- Simplification: use ML to predict the cuisine, then extract the best-matching ingredients for that cuisine
- Data:
  - Inputs: TF-IDF vectorized partial list of ingredients
  - Outputs: cuisine -> missing ingredients
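The slides do not show code for the TF-IDF step. A minimal sketch of how the partial ingredient lists could be vectorized, assuming scikit-learn and a hypothetical toy corpus (the real dataset and ingredient names are not given in the slides):

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical recipes: each recipe's ingredient list joined into one string.
recipes = [
    "soy_sauce sesame_oil ginger garlic",
    "tomato basil olive_oil garlic",
    "tortilla beans cumin paprika",
]

# Treat each whitespace-separated ingredient token as a term,
# so TF-IDF weights rare ingredients more heavily than common ones.
vectorizer = TfidfVectorizer(token_pattern=r"\S+")
X = vectorizer.fit_transform(recipes)

print(X.shape)  # (3, 11): 3 recipes, 11 distinct ingredients
```

Joining multi-word ingredients with underscores (e.g. `soy_sauce`) keeps each ingredient as a single token; any whole-ingredient tokenization scheme would work equally well here.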

Methodology
Neural network:
- Input: TF-IDF vectorized list of ingredients
- Hidden layer: 300 nodes
- Hidden layer: 100 nodes
- Output: number of missing ingredients
Post-processing:
- Given the predicted cuisine, retrieve its most common ingredients
- Recommend by density of main ingredients (e.g. chicken legs, carrots) and spices (e.g. salt, vegetable oil)
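The slides name the layer sizes (300 and 100 hidden nodes) but no framework. A sketch of the cuisine-prediction network plus the post-processing lookup, assuming scikit-learn and hypothetical toy data; the real model, data, and recommendation weighting are not specified in the slides:

```python
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier

# Hypothetical training data: (ingredient list, cuisine) pairs.
recipes = [
    (["soy_sauce", "sesame_oil", "ginger", "rice"], "chinese"),
    (["soy_sauce", "ginger", "garlic", "rice"], "chinese"),
    (["tomato", "basil", "olive_oil", "pasta"], "italian"),
    (["tomato", "garlic", "olive_oil", "pasta"], "italian"),
]
texts = [" ".join(ings) for ings, _ in recipes]
cuisines = [c for _, c in recipes]

vectorizer = TfidfVectorizer(token_pattern=r"\S+")
X = vectorizer.fit_transform(texts)

# Two hidden layers of 300 and 100 nodes, matching the slide.
clf = MLPClassifier(hidden_layer_sizes=(300, 100), max_iter=2000, random_state=0)
clf.fit(X, cuisines)

# Post-processing: count how often each ingredient appears per cuisine.
by_cuisine = {}
for ings, c in recipes:
    by_cuisine.setdefault(c, Counter()).update(ings)

def complete(partial):
    """Predict the cuisine, then recommend its most common missing ingredients."""
    cuisine = clf.predict(vectorizer.transform([" ".join(partial)]))[0]
    ranked = [i for i, _ in by_cuisine[cuisine].most_common() if i not in partial]
    return cuisine, ranked[:2]

print(complete(["soy_sauce", "ginger"]))
```

The slide's "density" ranking of main ingredients versus spices is simplified here to a raw frequency count; a weighted count split across the two ingredient categories would follow the same structure.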

Results
- Neural network performance: 71.6%
- Algorithm performance: 62.4%

Discussion
- The neural network likely overfits
- The post-processing technique is not powerful enough to catch unique cases
Future improvements:
- Look into a better neural network structure and dropout to combat overfitting
- Investigate better methods of encoding ingredients (or similar data) for prediction
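Dropout, suggested above as a remedy for overfitting, randomly zeroes hidden activations during training so the network cannot rely on any single unit. A minimal inverted-dropout sketch in NumPy (the 0.5 rate is illustrative, not from the slides):

```python
import numpy as np

def dropout(activations, rate=0.5, rng=None):
    """Inverted dropout: zero each unit with probability `rate` during
    training and rescale survivors so the expected activation is unchanged."""
    if rng is None:
        rng = np.random.default_rng(0)
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

h = np.ones((4, 100))        # a batch of hidden-layer activations
d = dropout(h, rate=0.5)     # roughly half the units zeroed, rest scaled by 2
```

At inference time the function is simply skipped; the inverted scaling during training is what keeps the expected activation magnitude consistent between the two phases.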