Comparison of PLS regression and Artificial Neural Network for the processing of the Electronic Tongue data from fermentation growth media monitoring

Comparison of PLS regression and Artificial Neural Network for the processing of the Electronic Tongue data from fermentation growth media monitoring
Alisa Rudnitskaya 1, Andrey Legin 1, Dmitri Kirsanov 1, Boris Seleznev 1, Kim Esbensen 2, John Mortensen 3, Lars Houmøller 2, Yuri Vlasov 1
1 Laboratory of Chemical Sensors, Chemistry Department, St. Petersburg University, Russia; 2 Aalborg University Esbjerg, Denmark; 3 Department of Life Science and Chemistry, Roskilde University Centre, Denmark

WCS-4, February 15-18, 2005, Moscow (Chernogolovka), Russia. A. Rudnitskaya et al., St. Petersburg University

Industrial use of filamentous fungi batch fermentation
Fungi: Aspergillus, Penicillium, etc.
Products: citric acid, food stuffs, enzymes, pharmaceuticals, food additives

Purpose of the study
Development of a rapid analytical methodology for following up batch fermentation processes and for quantitative analysis of broths:
- Evaluation of the Electronic Tongue (ET) for following up batch fermentation processes and for quantitative analysis of broths, using an Aspergillus niger batch culture medium as the example
- Application and comparison of different chemometric techniques for multivariate calibration of the ET

Experimental set-up
Samples:
1. Solutions simulating growth media of real fermentation processes involving Aspergillus niger
2. The same solutions with 10 mM of sodium azide added
Background: 0.5 g/L KCl, 1.5 g/L KH2PO4, 0.5 g/L MgSO4, 1 mL/L of Vishniac trace element solution, pH 6
[Table in the original slide lists, per sample: time (h), citrate, pyruvate, oxalate, glucose, glycerol, mannitol, erythritol, NH4Cl.]

Measurements
- ET comprising 10 potentiometric chemical sensors with polymeric membranes
- Direct and fast (a few minutes) measurements
- No sample preparation
Data processing
- Data splitting into calibration, monitoring and test sets (D-optimal design)
- Multivariate calibration: PLS regression; feed-forward neural network
Software used: The Unscrambler v. 7.8 (CAMO AS, Norway); NeuroSolutions (NeuroDimensions Inc., USA)

Determination of ammonium, oxalate and citrate content, and of the time elapsed from media inoculation, in the model growth media using the ET
Calibration of the ET by PLS regression for each component separately; results for the test set.
[Table: average relative error of prediction, %, for ammonium, oxalate, citrate and time, for samples with and without sodium azide; the numeric values did not survive the transcript.]

Non-linearity of the sensors' responses
Calibration of the ET w.r.t. ammonium concentration using PLS regression (plot in the original slide).

Response of the NH4-sensitive electrode to NH4+ in the growth medium
Detection limits to NH4+ (pNH4): discrete electrode vs. sensor array (values shown graphically in the original slide).
Nikolski equation:
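The equation itself was an image and did not survive the transcript; the standard Nikolsky-Eisenman form for the potential of an ion-selective electrode, which is presumably what the slide showed, is:

```latex
E = E_i^{0} + \frac{RT}{z_i F}
    \ln\!\left( a_i + \sum_{j \neq i} K_{ij}\, a_j^{\,z_i / z_j} \right)
```

where E is the electrode potential, E_i^0 the standard potential, a_i the activity of the primary ion (here NH4+), a_j the activities of interfering ions, K_ij the selectivity coefficients, z_i and z_j the ionic charges, and R, T, F the gas constant, temperature and Faraday constant. The summation term is what degrades the detection limit of the discrete electrode in a complex growth medium.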

Non-linear calibration methods
- Non-linear regression
- Artificial neural networks
Advantages: flexibility; noise tolerance
Drawbacks: prone to overfitting

Feed-forward neural network
Architecture: input layer (x1, x2, x3), hidden layer, output layer (ŷ); weights w^s_ij, net input I, transfer function f(I) (hyperbolic tangent).
Input function: I^s_j = Σ_i x^(s-1)_i · w^s_ij
Forward pass: ŷ computed layer by layer; error E = |y - ŷ|
Learning by error back-propagation:
- Local error: e_j = -∂E/∂I_j
  - for the output layer: e_o = f'(I_o)(y - ŷ)
  - for hidden layers: e^s_j = f'(I^s_j) Σ_k e^(s+1)_k w^(s+1)_kj
- Weight update: Δw^s_ij = -η(∂E/∂w^s_ij) = η e^s_j x^(s-1)_i
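The forward pass, local errors and weight updates above can be sketched in a few lines of NumPy; the network size, data and learning rate here are illustrative stand-ins, not those of the study (which used NeuroSolutions):

```python
# Hedged sketch of the slide's back-propagation scheme: a 5-3-1
# feed-forward network, tanh hidden layer, linear output, batch
# gradient descent. Symbols follow the slide: I net input, f transfer
# function, e local error, eta learning rate.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 5))           # 40 samples, 5 inputs
y = np.tanh(X.sum(axis=1, keepdims=True))      # synthetic target

W1 = rng.normal(scale=0.5, size=(5, 3))        # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(3, 1))        # hidden -> output weights
eta = 0.05

mse0 = float(np.mean((y - np.tanh(X @ W1) @ W2) ** 2))  # error before training

for epoch in range(500):
    # forward pass
    H = np.tanh(X @ W1)                        # hidden activations f(I)
    Yhat = H @ W2                              # linear output neuron
    # backward pass: local errors e_j = -dE/dI_j
    e_out = y - Yhat                           # f'(I) = 1 for a linear output
    e_hid = (1 - H**2) * (e_out @ W2.T)        # tanh'(I) = 1 - f(I)^2
    # weight updates: dw = eta * e_j * x_i (averaged over the batch)
    W2 += eta * H.T @ e_out / len(X)
    W1 += eta * X.T @ e_hid / len(X)

mse = float(np.mean((y - np.tanh(X @ W1) @ W2) ** 2))   # error after training
```

In practice (next slide) training is stopped when the error on a separate monitoring set starts rising, not after a fixed number of epochs.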

Neural network validation
Evolution of the training and monitoring errors during ANN training (plot in the original slide); calibration of the ET w.r.t. oxalate concentration.

Data splitting into calibration, monitoring and test sets using D-optimal design
Basic idea of D-optimal design: finding a design matrix that maximizes the determinant D of the information matrix of the selected samples, i.e. finding a set of samples that are maximally independent of each other.
Ideal distribution: if the calibration set contains n samples, the monitoring and test sets should each contain between n/2 and n samples.
In this case: calibration set, 22 samples; monitoring set, 11 samples; test set, 21 samples.
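The selection idea above can be sketched as a naive greedy search; this is a stand-in for the exchange algorithms (e.g. Fedorov-type) that real D-optimal software uses, and the data are synthetic:

```python
# Hedged sketch of D-optimal subset selection: greedily add the sample
# that maximises the determinant of the information matrix of the
# subset chosen so far, yielding maximally independent samples.
import numpy as np

def d_optimal_subset(X, n_pick):
    chosen, remaining = [], list(range(len(X)))
    for _ in range(n_pick):
        best, best_logdet = None, -np.inf
        for i in remaining:
            S = X[chosen + [i]]
            # Gram matrix while the subset has fewer rows than columns,
            # information matrix S.T @ S once it is tall enough
            G = S @ S.T if len(S) <= X.shape[1] else S.T @ S
            sign, logdet = np.linalg.slogdet(G)
            score = logdet if sign > 0 else -np.inf
            if score > best_logdet:
                best, best_logdet = i, score
        chosen.append(best)
        remaining.remove(best)
    return chosen

rng = np.random.default_rng(2)
X = rng.normal(size=(54, 3))           # 54 candidate samples, 3 variables
cal = d_optimal_subset(X, 22)          # calibration set of 22 samples
```

The remaining samples would then be split between the monitoring and test sets, e.g. by repeating the same selection on the leftover rows.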

Optimization of the neural network architecture
Aim: minimization of the prediction error AND of the number of network parameters (weights), i.e. of hidden and input neurons.
Optimized ANN (input x hidden x output neurons) for calibration w.r.t. the content of:
- Ammonium: 5 x 2 x 1
- Oxalate: 5 x 3 x 1
- Citrate: 7 x 2 x 1
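The search over architectures can be sketched as a simple loop over hidden-layer sizes, scoring each candidate on the monitoring set; scikit-learn's MLPRegressor stands in for NeuroSolutions here, and the data and sizes are synthetic assumptions:

```python
# Hedged sketch of the architecture optimization: fit a small
# feed-forward network for each hidden-layer size and keep the one
# with the lowest monitoring-set error.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(33, 5))
y = np.tanh(X @ rng.normal(size=5))    # synthetic non-linear target
X_cal, y_cal = X[:22], y[:22]          # calibration set
X_mon, y_mon = X[22:], y[22:]          # monitoring set

errors = {}
for n_hidden in (1, 2, 3, 5):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation="tanh",
                       max_iter=2000, random_state=0)
    net.fit(X_cal, y_cal)
    errors[n_hidden] = float(np.mean((net.predict(X_mon) - y_mon) ** 2))

best = min(errors, key=errors.get)     # smallest monitoring error
```

When two sizes give similar monitoring errors, the slide's criterion favours the smaller network, since fewer weights mean less risk of overfitting.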

Determination of ammonium, oxalate and citrate content, and of the time elapsed from media inoculation, in the model growth media using the ET

Average relative error of prediction, %:

Calibration method   Samples                 Ammonium   Oxalate   Citrate   Time
ANN                  without sodium azide    6          6         8         2
ANN                  with sodium azide       7          7         7         2
PLS                  without / with sodium azide: values garbled in the transcript

PCA score plot of the ET measurements in growth media with and without sodium azide added (plot in the original slide).

Non-linearity of the sensors' responses
Calibration of the ET w.r.t. ammonium concentration using ANN (plot in the original slide).

WCS-4, February 15—18, 2005, Moscow (Chernogolovka), Russia A. Rudnitskaya et al St. Petersburg University 17 Conclusions An ET system comprising a sensor array based on ten PVC- plasticized cross-sensitive potentiometric chemical sensors was successfully applied to simultaneous determination of ammonium, oxalate and citrate content in simulated fermentation media closely resembling real-world samples typical of a process involving Aspergillus niger. Feed-forward neural network was found to be superior to PLS regression for the ET data fitting due to better consideration of non- linearity of the sensor potentials/concentration relationship particularly at low concentration levels. The average prediction errors for key metabolites’ concentrations in the given ranges was about 6-8% when using a feed-forward artificial neural network for ET calibration. Content of three key components of the growth media can be measured by ET in the presence of 10 mM sodium azide, which is commonly used to suppress microbial activity after sampling. ET was demonstrated to be promising for monitoring fermentation processes.