Physics-guided machine learning for milling stability: A numerical study of model training and updating


Jiang Jiang and Tony L. Schmitz
Mechanical Engineering and Engineering Science, University of North Carolina at Charlotte, Charlotte, NC, USA

Introduction

Physics-based model descriptions:
- receptance coupling substructure analysis (RCSA) is used to predict the tool point receptance
- a mechanistic force model is used to relate the mean cutting force to the commanded chip area
- frequency-domain analysis is used to predict the stability limit, with the first two models as inputs

Physics-based model uncertainty:
- deterministic models include uncertainty; for example, the actual extension length of the endmill from the holder results in uncertainty in the tool point receptance, which in turn leads to uncertainty in the stability limit
- if a test is performed to determine the actual stability behavior of a spindle speed-axial depth combination, there is no straightforward mapping between this result and the model input parameters

New approach:
- use the (uncertain) physics-based stability model to train a machine learning (ML) algorithm
- the ML model is defined in the desired test domain of spindle speed-axial depth of cut
- updating can then proceed directly by collecting stability results and modifying the dataset used to re-train the ML stability model.
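This training step can be sketched in Python with scikit-learn: generate stability labels over a spindle speed-depth of cut grid from a physics-based stability limit, then fit a classifier in that domain. The `stability_limit` function below is a toy stand-in for the study's RCSA/force-model/frequency-domain chain, and all ranges and parameter values are illustrative, not taken from the study.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

# Grid over the test domain: 101 spindle speeds x 20 depths of cut
# (the ranges here are illustrative, not the study's actual domain).
speeds = np.linspace(5000.0, 15000.0, 101)   # rpm
depths = np.linspace(0.5, 10.0, 20)          # mm
S, D = np.meshgrid(speeds, depths)
X = np.column_stack([S.ravel(), D.ravel()])

# Toy stand-in for the physics-based stability limit b_lim(omega); the study
# computes this limit from RCSA, a mechanistic force model, and a
# frequency-domain stability analysis.
def stability_limit(omega_rpm):
    return 4.0 + 3.0 * np.sin(omega_rpm / 1200.0) ** 2   # mm, illustrative

y = (X[:, 1] <= stability_limit(X[:, 0])).astype(int)    # 1 = stable, 0 = chatter

# Normalize the features and hold out 30% for testing, as in the study.
X_norm = MinMaxScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(
    X_norm, y, test_size=0.3, random_state=0)

# Train one of the three classifiers; updating after test cuts would modify
# the labels y at the tested grid points and call fit() again.
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"replication accuracy on the held-out set: {clf.score(X_te, y_te):.3f}")
```

Swapping in `KNeighborsClassifier` or `SVC` reproduces the other two algorithms considered; the re-training-on-updated-labels loop is the same in each case.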
Leverage Industry 4.0:
- data collected during and after machining is used to achieve process improvement
- each part becomes an experiment, and the uncertainty in the operating parameters can be reduced over time

Numerical case study

Physics-based models with intentionally inserted error:

Define the force model using the best information available (includes errors):
- modeled force data: 700 N/mm^2 specific cutting force, 68 deg force angle
- true force data: 800 N/mm^2 specific cutting force, 60 deg force angle

Define the tool-holder-spindle-machine using the best information available (includes errors):
- modeled tool: 12 mm diameter, 50 mm extension length from the holder, 4 teeth
- true tool: 53 mm extension length from the holder

(Figure: tool point frequency response function for the incorrect extension length; stability maps with chatter/stable regions.)

GOAL: by updating the training dataset, transform the stability limit from the map with input errors to the true stability limit.

Generate the original dataset:
- from the incorrect stability model, generate a dataset with the shape 20 (depth of cut, mm) x 101 (spindle speed, rpm)
- this provides the base for updating and online learning; the features were normalized
- the dataset was divided into training and test sets with a 70:30 ratio

Stage 1: Replicate the stability boundary
- use three machine learning algorithms to replicate the stability boundary
- determine the accuracy from the test set (compared to the stability limit with input error)

  Algorithm   Accuracy
  AdaBoost    97.03%
  KNN         96.20%
  SVM         91.09%

Stage 2: Update the stability boundary with test cuts

Updating Strategy 1 - frequency analysis ('smart update'):
- leverages physical knowledge to generate updating data points
- select the largest stable depth-spindle speed combination as the first cut and use a MATLAB simulation to determine the process signals
- determine whether the cut is stable or unstable, as well as the chatter frequency if it is unstable
- use the equation Omega = 60*f_c/((j + 1)*N_t) to find the spindle speed of the next cut, where f_c is the chatter frequency (Hz), N_t is the number of teeth, and j is the lobe index
- repeat

Updating Strategy 2 - grid ('lazy update'):
- selects a grid of equally spaced points over the full domain; the number of points was varied from 50 to 410
- if the cut is unstable, then all points with the same spindle speed but a higher depth of cut are updated as unstable
- if the cut is stable, then all points with the same spindle speed but a lower depth of cut are updated as stable

Updating Strategy 3 - climbing ('efficient update'):
- uses fewer tests, 101 to 183
- check the stability of all points at 5 mm depth
- for stable points, test at 7 mm depth (same spindle speeds)
- for stable points, test at 10 mm depth (same spindle speeds)
- requires fewer points and yields a better result; captures the stability limit over the full spindle speed range

Physics-guided machine learning model:
- the physics-based model y = f(x) relates the input x to the simulated output ys
- measured data ym is collected for the same input x
- the physics-guided machine learning model is Y = f(x, ys, ym), with output Y

Example (Strategy 1):
- Test cut 1 with Omega = 15600 rpm, b = 20 mm
- use the MATLAB simulation to determine the process signals; the cut is unstable
- choose the spindle speed for Test cut 2: Omega = 3046.7(60)/((j + 1)(4)) = 15233 rpm for j = 2; keep b = 20 mm

  No.  Spindle speed (rpm)  Depth of cut (mm)  Status
  1    15600                20
  2    15233
  3                         15
  4
  5    15000
  6

  (cells are left blank where the values did not survive in the transcript)

Results: accuracy of the updated models on the test set, compared to the true stability limit (blank cells did not survive in the transcript):

  Test               AdaBoost  KNN     SVM
  Original dataset   81.19%    80.03%  84.82%
  Grid, n = 50       81.85%    80.20%  85.48%
  Grid, n = 110      82.01%    80.86%  86.14%
  Grid, n = 210      82.84%    82.18%  87.13%
  Grid, n = 410
  Climbing, n = 101  86.96%    87.79%
  Climbing, n = 154  90.92%    89.60%  89.27%
  Climbing, n = 183  92.24%    91.25%

Conclusion
- the study demonstrates the feasibility of physics-guided machine learning for milling stability
- machine learning models were trained to predict cutting stability for the test dataset (with input errors)
- the stability limit was then updated with error-free data, and the results were compared to the observed/true stability limit
- the accuracy of two updating strategies (grid and climbing) with three ML algorithms was evaluated
- the climbing method for updating had the highest accuracy: AdaBoost provided an accuracy of 92.24% (the model is correct for 92.24% of the test dataset compared to the true stability limit)
- future research will consider other machine learning algorithms, such as neural networks; a combination of synthetic and measured data will be used in future studies

Machine learning:
- defined by Tom Mitchell: a machine learns with respect to a particular task T, performance metric P, and type of experience E if the system reliably improves its performance P at task T following experience E
- closely related to computer science, statistics, psychology, and neuroscience
- encompasses data mining, knowledge discovery, algorithm development, and problem solving

K-nearest neighbors (KNN):
- makes decisions by referring to the K data points closest to the selected data point
- does not make assumptions about the distribution of the dataset
- does not produce a generalized rule over the dataset

Support vector machine (SVM):
- creates a hyperplane to separate the data points into two classes
- the hyperplane is chosen by maximizing the distance between it and the closest data points
- maps the data points to a higher dimension, if required, so that a hyperplane can still divide them

AdaBoost:
- an ensemble algorithm that groups weak classifiers into a combined classifier
- with each round of training, the examples that were misclassified in the previous round receive more weight, which forces the algorithm to focus on its mistakes
- the base classifiers cast weighted votes to generate the final result
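The chatter-frequency-based spindle speed selection used by Updating Strategy 1 can be checked with a short sketch (the function name is ours; the numerical values are from the Test cut 1 example):

```python
# After an unstable cut with chatter frequency f_c (Hz), the next candidate
# spindle speed aligns the tooth-passing frequency with f_c / (j + 1):
# Omega = 60 * f_c / ((j + 1) * N_t) rpm for an N_t-tooth cutter.
def next_spindle_speed(f_c_hz, n_teeth, j):
    """Return the next spindle speed (rpm) for chatter frequency f_c_hz
    and stability lobe index j."""
    return 60.0 * f_c_hz / ((j + 1) * n_teeth)

# Example values: f_c = 3046.7 Hz, 4-tooth endmill, j = 2.
omega = next_spindle_speed(3046.7, 4, 2)
print(int(omega))  # 15233, matching Test cut 2 in the example
```

Each unstable cut thus directly proposes the next test point near the lobe peak, which is why this strategy needs fewer cuts than blanket grid updating.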