ACRES 2018 Research Project

ACRES 2018 Research Project Muawiz Chaudhary, Jialin Liu, Paul Sinz, Yue Qi, Matthew Hirn

What is electron density? Electron density is a measure of the probability of finding an electron at a specific location. Knowing the electron density gives us information about the bonding of atoms. Current methods for calculating electron density are computationally expensive. We wish to calculate electron density using machine learning (ML) methods.

Background Brockherde and others obtained the electron density by learning a mapping from the potential, represented as Gaussians, to the electron density. Brockherde used a kernel ridge regression model, with a basis specified for representing the density.
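A minimal sketch of this kind of kernel ridge regression setup, assuming scikit-learn's KernelRidge; the feature matrix X (flattened Gaussian potentials) and target matrix Y (density coefficients in a fixed basis) are random placeholders, not Brockherde's actual data or settings.

```python
# Kernel ridge regression baseline sketch: map Gaussian-potential
# features to density coefficients in a fixed basis. All array shapes
# and hyperparameters here are illustrative assumptions.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(625, 50))   # flattened Gaussian potentials, one row per configuration
Y = rng.normal(size=(625, 30))   # density coefficients in the chosen basis

model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1e-2)
model.fit(X, Y)                  # learns potential -> density-coefficient mapping
Y_pred = model.predict(X[:5])    # predicted basis coefficients for 5 configurations
```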

Improvement We wish to improve upon the results of Brockherde. Our input is a 3-channel image, where each channel contains the electron density of the corresponding atom modeled as a Gaussian. Labels are the electron densities that our model should learn; predictions are the electron densities the model outputs. We use a U-Net architecture to predict the electron density of a Lithium-Oxygen-Lithium system.
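A hedged sketch of a small 3D U-Net in PyTorch, since the densities live on a volumetric grid; the channel counts, depth, and grid size below are illustrative assumptions, not the exact architecture used in the project.

```python
# Tiny 3D U-Net sketch: one encoder level, a bottleneck, and one
# decoder level with a skip connection. Sizes are placeholders.
import torch
import torch.nn as nn

class TinyUNet3D(nn.Module):
    def __init__(self, in_ch=3, out_ch=1, base=16):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv3d(in_ch, base, 3, padding=1), nn.ReLU(),
            nn.Conv3d(base, base, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool3d(2)
        self.mid = nn.Sequential(
            nn.Conv3d(base, 2 * base, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose3d(2 * base, base, 2, stride=2)
        self.dec = nn.Sequential(
            nn.Conv3d(2 * base, base, 3, padding=1), nn.ReLU(),
            nn.Conv3d(base, out_ch, 3, padding=1))

    def forward(self, x):
        e = self.enc(x)                             # encoder features
        m = self.mid(self.down(e))                  # bottleneck at half resolution
        u = self.up(m)                              # upsample back to input resolution
        return self.dec(torch.cat([u, e], dim=1))   # skip connection, then decode

net = TinyUNet3D()
x = torch.randn(2, 3, 16, 16, 16)   # batch of 3-channel Gaussian inputs
print(net(x).shape)                 # torch.Size([2, 1, 16, 16, 16])
```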

Generating the data set Using VASP, we varied the position of each Lithium atom and calculated the electron density for 625 different configurations of a Lithium-Oxygen-Lithium system; these are our labels. Given the positions of the individual atoms and their core charges, we modeled the electron density as Gaussians; these are our inputs.
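A sketch of how such Gaussian inputs can be built on a cubic grid; the grid extent, Gaussian width, atom coordinates, and core-charge values are all illustrative placeholders, not the project's actual parameters.

```python
# Build one Gaussian channel per atom, scaled by its core charge,
# and stack them into a 3-channel input volume. All numbers are
# assumed for illustration.
import numpy as np

def gaussian_channel(center, charge, grid, width=0.5):
    """One channel: charge-weighted Gaussian centered on a single atom."""
    d2 = np.sum((grid - center) ** 2, axis=-1)
    return charge * np.exp(-d2 / (2.0 * width ** 2))

n = 32
axis = np.linspace(0.0, 10.0, n)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)

# Li-O-Li system: one channel per atom (positions and charges assumed).
atoms = [((2.0, 5.0, 5.0), 1.0),   # Li
         ((5.0, 5.0, 5.0), 6.0),   # O
         ((8.0, 5.0, 5.0), 1.0)]   # Li
inputs = np.stack([gaussian_channel(np.array(p), q, grid)
                   for p, q in atoms])   # shape (3, 32, 32, 32)
```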

Train We split our data set into 4 splits, where each split contains a train set and a test set. We feed our inputs into a mixer, whose output is then fed into the U-Net itself. Each prediction is compared against the corresponding label using the L2-norm loss function, and the weights are adjusted accordingly.
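A hedged sketch of one training step, reusing the TinyUNet3D sketch above; the 1x1 convolutional mixer, learning rate, and the random tensors standing in for one data split are assumptions, not the project's actual pipeline.

```python
# One training step: mixer -> U-Net -> L2-norm loss -> weight update.
# Mixer design and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

mixer = nn.Conv3d(3, 3, kernel_size=1)   # mixes the three atom channels
net = TinyUNet3D()                       # the sketch defined earlier
opt = torch.optim.Adam(list(mixer.parameters()) + list(net.parameters()),
                       lr=1e-3)

def train_step(x, y):
    opt.zero_grad()
    pred = net(mixer(x))          # inputs pass through the mixer, then the U-Net
    loss = torch.norm(pred - y)   # L2-norm loss against the label
    loss.backward()               # gradients for the weight update
    opt.step()
    return loss.item()

x = torch.randn(4, 3, 16, 16, 16)   # Gaussian inputs (stand-in for a train split)
y = torch.randn(4, 1, 16, 16, 16)   # density labels (stand-in)
print(train_step(x, y))
```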

Results Our model generates predictions that are very close to our labels. The outputs of our filters are interesting: they suggest that our model learns the underlying chemistry of the system. We also noticed that using no dropout performs better, which is consistent with previous findings.

Future We wish to continue the present research. Our current focus is hyperparameter tuning, and we want to use tools from optimal transport. A future goal is to see whether our models can generalize their predictions to larger systems after having trained on smaller systems of atoms.