Published by Charlene Carr. Modified over 9 years ago.
The Generalised Mapping Regressor (GMR) neural network for inverse discontinuous problems
Student: Chuan LU
Promotor: Prof. Sabine Van Huffel
Daily Supervisor: Dr. Giansalvo Cirrincione
Mapping Approximation Problem
Feedforward neural networks are universal approximators of nonlinear continuous functions (many-to-one, one-to-one), but:
- they don't yield multiple solutions
- they don't yield infinite solutions
- they don't approximate mapping discontinuities
Inverse and Discontinuous Problems
- Mapping: multi-valued, with a complex structure
- The least-squares approach (sum-of-squares error function) used by feedforward neural networks converges to the conditional average of the target data: a poor representation of a multi-valued mapping
- Mappings with discontinuities
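A toy illustration (my own, not from the slides) of why the sum-of-squares optimum degenerates on a multi-valued inverse: for the inverse of y = x², both +√t and -√t solve input t, and the least-squares-optimal prediction is their conditional mean, which is not a solution at all.

```python
# Sketch: least squares on a two-branch inverse problem collapses to the
# conditional average of the targets (here ~0), which lies between the
# branches and solves nothing.

def conditional_mean(pairs, t, tol=1e-9):
    """Average of all targets whose input equals t (the least-squares optimum)."""
    xs = [x for (inp, x) in pairs if abs(inp - t) < tol]
    return sum(xs) / len(xs)

# Training pairs for the inverse problem: (input t = x^2, target x).
xs = [i / 10 for i in range(-10, 11)]
pairs = [(x * x, x) for x in xs]

pred = conditional_mean(pairs, 0.25)   # true solutions: -0.5 and +0.5
print(pred)  # 0.0 -- midway between the two branches, not a valid inverse
```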
Mixture-of-Experts (ME)
[figure: a gating network arbitrates between Network 1, Network 2 and Network 3, between input and output]
- It partitions the solution between several networks.
- It uses a separate network to determine the parameters of each kernel, with a further network to determine the mixing coefficients.
- Winner-take-all (Jacobs and Jordan); kernel blending (Bishop's ME extension)
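A minimal sketch of the two combination rules named on the slide, with hand-set toy experts and gating scores (the real ME trains all of these): kernel blending takes a softmax-weighted sum of expert outputs, while winner-take-all lets only the top-gated expert answer.

```python
# Toy mixture-of-experts forward pass (parameters are hand-set for
# illustration, not trained as in the actual ME architecture).
import math

def softmax(zs):
    m = max(zs)
    es = [math.exp(z - m) for z in zs]
    s = sum(es)
    return [e / s for e in es]

# Three toy experts, each a linear model "specialised" to one region.
experts = [lambda x: -x, lambda x: 0.5 * x, lambda x: x + 2.0]

def gate(x):
    # Gating scores favour expert 0 for x < 0 and expert 2 for large x.
    return softmax([-5.0 * x, 1.0, 5.0 * (x - 1.0)])

def mixture(x):
    # Kernel blending: convex combination of the expert outputs.
    g = gate(x)
    return sum(gi * e(x) for gi, e in zip(g, experts))

def winner_take_all(x):
    # Hard selection: only the expert with the largest gate responds.
    g = gate(x)
    i = max(range(len(g)), key=lambda j: g[j])
    return experts[i](x)
```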
Example #1: ME vs. MLP (figure)
Example #2: ME vs. MLP (figure)
Example #3: ME vs. MLP (figure)
Example #4: ME vs. MLP (figure)
Generalised Mapping Regressor (GMR) (G. Cirrincione and M. Cirrincione, 1998)
Characteristics:
- approximates every kind of function or relation
- input: a collection of components of x and y; output: an estimation of the remaining components
- outputs all solutions, mapping branches and equilevel hypersurfaces
GMR Basic Ideas
- Z (augmented) space: unsupervised learning; clusters correspond to mapping branches
- coarse-to-fine learning: incremental, competitive, based on mapping recovery (curse of dimensionality)
- topological neuron linking: distance and direction
- linking tracking: branches and contours
- open architecture: function approximation, pattern recognition
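A small sketch of the augmented-space idea as I read it (the z-vector layout is an assumption): stacking input and output into z = [x, y] turns the branches of a multi-valued mapping, which overlap in the input coordinate, into well-separated clusters.

```python
# Sketch: the two branches of the inverse of y = x^2 coincide in the
# input coordinate but become disjoint clusters in the augmented Z space.

def augmented(x, y):
    return (x, y)  # z-vector: input and output stacked together

branch_pos = [augmented(x * x, x) for x in [i / 10 for i in range(1, 11)]]
branch_neg = [augmented(x * x, x) for x in [-i / 10 for i in range(1, 11)]]

# In the input coordinate alone the branches are indistinguishable ...
assert {p[0] for p in branch_pos} == {p[0] for p in branch_neg}
# ... but as z-vectors the two clusters share no point.
assert not set(branch_pos) & set(branch_neg)
```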
GMR: Four Phases
[figure: Training Set → Learning (pool of neurons) → Linking (links) → Merging (object 1, object 2, object 3; objects merged) → Recalling (input → branch 1, branch 2)]
EXIN Segmentation Neural Network (EXIN SNN) (G. Cirrincione, 1998)
- clustering with a vigilance threshold in the input/weight space
[figure: an input x falling outside every vigilance region spawns a new neuron, w4 = x4]
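A generic incremental, vigilance-thresholded clusterer in the spirit of the slide; this is a sketch only, not the published EXIN SNN update rules: when the nearest neuron is farther than the vigilance threshold, a new neuron is created at the input, otherwise the winner moves toward it.

```python
# Sketch of vigilance-based incremental clustering (EXIN-SNN-like;
# the actual EXIN learning law is not reproduced here).
import math

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def vigilance_cluster(points, rho, lr=0.2):
    weights = []
    for x in points:
        if not weights:
            weights.append(list(x))            # first neuron: w = x
            continue
        i = min(range(len(weights)), key=lambda j: dist(weights[j], x))
        if dist(weights[i], x) > rho:
            weights.append(list(x))            # outside vigilance: new neuron
        else:
            weights[i] = [w + lr * (u - w) for w, u in zip(weights[i], x)]
    return weights

pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
ws = vigilance_cluster(pts, rho=1.0)
print(len(ws))  # 2 -- one neuron per cluster
```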
GMR Learning: coarse quantization
- EXIN SNN in the Z (augmented) space with a high vigilance threshold (say 1)
- one neuron per branch (object)
GMR Learning: production phase
- Voronoi sets in the Z (augmented) space
- domain setting
GMR Learning: fine quantization
- secondary EXIN SNNs in the Z (augmented) space, one per training subset (TS#1 ... TS#5), with a lower vigilance threshold than at the coarse level
- other levels are possible
GMR Coarse-to-Fine Learning (example)
[figure: object neurons, their Voronoi sets, and the fine VQ neurons inside each object]
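The coarse-to-fine sequence above can be sketched with a two-level quantiser (my assumed reading, not the exact GMR procedure): a coarse pass with a large vigilance threshold assigns points to objects, then each object's Voronoi set is re-quantised with a smaller threshold.

```python
# Two-level coarse-to-fine vector quantisation sketch.
import math

def quantise(points, rho):
    """Greedy vigilance quantiser: returns centres and each point's label."""
    centres, labels = [], []
    for p in points:
        best, bd = None, None
        for i, c in enumerate(centres):
            d = math.dist(c, p)
            if bd is None or d < bd:
                best, bd = i, d
        if bd is None or bd > rho:
            centres.append(p)              # outside vigilance: new centre
            labels.append(len(centres) - 1)
        else:
            labels.append(best)
    return centres, labels

# Two well-separated "objects", ten samples each.
pts = [(x / 10, 0.0) for x in range(10)] + [(x / 10, 5.0) for x in range(10)]

coarse_c, coarse_l = quantise(pts, rho=2.0)      # one neuron per object
fine = {}
for obj in range(len(coarse_c)):
    subset = [p for p, l in zip(pts, coarse_l) if l == obj]
    fine[obj], _ = quantise(subset, rho=0.25)    # finer neurons inside it

print(len(coarse_c), [len(v) for v in fine.values()])
```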
GMR Linking, Task 1:
- Voronoi set: setup of the neuron radius r_i (domain variable); the radius around neuron i is asymmetric
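One plausible concretisation of the radius setup (an assumption; the slide does not give the formula): take each neuron's radius as the distance to the farthest training point in its Voronoi set, so the domain covers exactly the data it won.

```python
# Sketch: neuron radii from Voronoi sets (farthest won point).
import math

def voronoi_radii(weights, points):
    radii = [0.0] * len(weights)
    for p in points:
        i = min(range(len(weights)), key=lambda j: math.dist(weights[j], p))
        radii[i] = max(radii[i], math.dist(weights[i], p))
    return radii

ws = [(0.0, 0.0), (10.0, 0.0)]
pts = [(1.0, 0.0), (0.0, 2.0), (9.0, 0.0)]
print(voronoi_radii(ws, pts))  # [2.0, 1.0]
```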
GMR Linking, Task 2:
For one TS presentation z_i in the weight space (neurons w1 ... w5 at distances d1 ... d5):
- find the linking candidates (k-nn, via a branch-and-bound search technique)
- distance test
- direction test
- create a link, or strengthen an existing one
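A hypothetical reading of the two tests (the exact GMR criteria are not on the slide): for a sample z, take the winner and its k nearest neurons; link winner to candidate when the candidate is close enough relative to the winner distance d1 (distance test) and lies roughly beyond z along the winner-to-z direction (direction test, via a dot product).

```python
# Sketch of the linking-candidate step with distance and direction tests
# (test criteria are assumptions, not the published GMR rules).
import math

def link_candidates(z, weights, k=3, dist_factor=2.0):
    order = sorted(range(len(weights)), key=lambda j: math.dist(weights[j], z))
    winner, candidates = order[0], order[1:k + 1]
    d1 = math.dist(weights[winner], z)
    links = set()
    for c in candidates:
        if math.dist(weights[c], z) > dist_factor * d1:
            continue                                   # distance test failed
        v1 = [z[i] - weights[winner][i] for i in range(len(z))]
        v2 = [weights[c][i] - z[i] for i in range(len(z))]
        if sum(a * b for a, b in zip(v1, v2)) > 0:     # direction test
            links.add((winner, c))                     # create / strengthen
    return links

ws = [(0.0, 0.0), (2.0, 0.0), (0.0, 10.0)]
print(link_candidates((1.0, 0.0), ws))  # {(0, 1)}
```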
Branch-and-Bound Accelerated Linking
- neuron tree constructed during the learning phase (multilevel EXIN SNN learning)
- methods in the linking-candidate step (k-nearest-neighbours computation):
- threshold BnB: keep candidates closer than a predefined linking factor times d1
- k-BnB: k predefined
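The pruning idea behind branch-and-bound nearest-neighbour search can be sketched with a generic ball-type bound (the slides' tree comes from multilevel EXIN SNN learning; the grouping below is an assumption): a whole group of neurons is skipped when the distance to its centre minus its radius already exceeds the best distance found.

```python
# Sketch: branch-and-bound nearest neighbour over grouped neurons.
# Bound: for any member m of a group, dist(q, m) >= dist(q, centre) - radius.
import math

def bnb_nearest(q, groups):
    """groups: list of (centre, radius, members); returns (nearest, distance)."""
    best, best_d = None, math.inf
    for centre, radius, members in sorted(
            groups, key=lambda g: math.dist(q, g[0])):
        if math.dist(q, centre) - radius >= best_d:
            continue                      # prune: no member can beat best_d
        for m in members:
            d = math.dist(q, m)
            if d < best_d:
                best, best_d = m, d
    return best, best_d

groups = [((0.0, 0.0), 1.0, [(0.0, 0.0), (1.0, 0.0)]),
          ((10.0, 0.0), 1.0, [(10.0, 0.0), (9.0, 0.0)])]
print(bnb_nearest((8.0, 0.0), groups))  # ((9.0, 0.0), 1.0)
```

Visiting groups in order of centre distance makes a good bound available early, so distant groups are pruned without scanning their members.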
GMR Linking: branch and bound in linking, experimental results: 83%
Branch and Bound (cont.)
Apply branch and bound in the learning phase (labelling):
- tree construction: k-means, EXIN SNN
- experimental results (in the 3-D example): 50% of the labelling flops are saved
GMR Linking Example
[figure: links between neurons]
GMR Merging Example (figure)
GMR Recalling Example
- level one neurons: the input falls within their domain
- level two neurons: only the connected ones
- level zero neurons: isolated (noise)
[figure: level 1 and level 2 neurons on branch 1 and branch 2]
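The three recall levels can be sketched as follows (data structures are assumptions; the slides' recall also tracks branch labels): level one for neurons whose domain contains the input, level two for neurons merely linked to a level-one neuron, level zero for the rest.

```python
# Sketch: assigning the three recall levels from domains and links.
import math

def recall_levels(x, weights, radii, links):
    level = [0] * len(weights)
    for i, (w, r) in enumerate(zip(weights, radii)):
        if math.dist(x, w) <= r:
            level[i] = 1                  # input within the neuron's domain
    for a, b in links:
        if level[a] == 1 and level[b] == 0:
            level[b] = 2                  # connected to a level-one neuron
        if level[b] == 1 and level[a] == 0:
            level[a] = 2
    return level

ws = [(0.0,), (1.0,), (5.0,)]
radii = [0.5, 0.5, 0.5]
links = [(0, 1)]
print(recall_levels((0.2,), ws, radii, links))  # [1, 2, 0]
```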
Experiments: spiral of Archimedes, ρ = aθ (a = 1)
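The slide's training curve ρ = aθ with a = 1 can be sampled in Cartesian coordinates as follows (the sample count and number of turns are my choices):

```python
# Generating Archimedean-spiral data, rho = a * theta (a = 1).
import math

def archimedes_spiral(a=1.0, turns=2.0, n=200):
    pts = []
    for k in range(n):
        theta = turns * 2.0 * math.pi * k / (n - 1)
        rho = a * theta                   # the spiral equation
        pts.append((rho * math.cos(theta), rho * math.sin(theta)))
    return pts

pts = archimedes_spiral()
```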
Experiments: sparse regions
- further normalizing + higher mapping resolution
Experiments: noisy data
Experiments (figure)
GMR mapping of 8 spheres in a 3-D scene; contours: links among level one neurons
Conclusions
GMR is able to:
- solve inverse discontinuous problems
- approximate every kind of mapping
- yield all the solutions and the corresponding branches
GMR can be accelerated by applying tree-search techniques.
GMR needs:
- interpolation techniques
- kernels or projection techniques for high-dimensional data
- adaptive parameters
Thank you! (xiè xiè)
GMR Recall (step 1)
[figure: neurons w1 ... w8 in the weight space, each initialised with level l_i = 0 and branch b_i = 0; the input falls within radius r1 of neuron 1, which becomes l1 = 1, b1 = 1; linking tracking marks the connected neuron 3 as l3 = 2, b3 = 1]
- linking tracking
- restricted distance
- level one test
- a connected neuron at level zero becomes level two and takes the branch of the winner
GMR Recall (step 2)
[figure: a second winner, neuron 2, within radius r2 becomes l2 = 1; the branch cross with the already-labelled branch resolves b2 = 2 to b2 = 1]
- level one test
- linking tracking
- branch cross
GMR Recall (step 3)
[figure: labelling continues until completion of the candidates, with clipping (e.g. l6 = 2, b6 = 4 resolved to l6 = 1, b6 = 4); two branches result]
- level one neurons: the input falls within their domain
- level two neurons: only the connected ones
- level zero neurons: isolated (noise)
GMR Recall (step 4)
- Output = the weight complements of the level one neurons
- Output interpolation
[figure: final labels of the level one neurons on the two branches]
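The output step can be sketched as follows (the augmented-weight layout is an assumption): each neuron's weight lives in Z = (input, output) space, so recall returns the complementary output components of every level-one neuron, one value per solution branch.

```python
# Sketch: output = weight complements of the level-one neurons.

def outputs_from_level_one(weights, levels, n_in):
    """weights: z-space vectors; levels: recall labels; n_in: input dims."""
    return [w[n_in:] for w, l in zip(weights, levels) if l == 1]

# Two level-one neurons on the inverse of y = x^2 near input x = 0.25:
ws = [(0.25, 0.5), (0.25, -0.5), (0.9, 0.95)]
levels = [1, 1, 0]
print(outputs_from_level_one(ws, levels, n_in=1))  # [(0.5,), (-0.5,)]
```

Both branches are returned, which is exactly what the least-squares feedforward fit from the earlier slides cannot do.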