1
The intelligence of the Social Network: first achievements on FIS
A peculiar scientific problem
B. Apolloni and team: B. Apolloni, S. Bassis, G.L. Galliani, L. Ferrari, M. Gioia
Pisa, 10/12/2014
2
Lead philosophy
I don't know how to optimally operate my appliances, hence I ask the network.
The network learns to optimally satisfy the user request on the basis of the informative triplet.
The social feature of the network lies in the members' contribution in terms of profiles (of themselves and of their appliances) and logs of the above triplets.
3
Recipe production is an adaptation process
Start from a base recipe
Improve it on the basis of
◦ the feedback, and
◦ operational rules
4
A peculiar cognitive problem
Recipes are sequences of parameter/value pairs.
Tasks and evaluations are sets of both crisp and fuzzy variables.
Fuzzy quantifiers do not refer to a specific metric space.
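A minimal sketch of how such data could be represented; all names and values below are illustrative assumptions, not taken from the project.

```python
# Recipe: an ordered sequence of parameter/value pairs driving the appliance.
recipe = [
    ("rising_time_min", 90),      # crisp, numeric
    ("baking_temp_C", 220),       # crisp, numeric
    ("kneading_speed", 2),        # crisp, discrete
]

# Task: what the user asks for, mixing crisp and fuzzy variables.
task = {
    "loaf_weight_g": 750,                # crisp
    "crustiness": ("fuzzy", "very"),     # fuzzy quantifier, no metric space
    "humidity": ("fuzzy", "normal"),
}

# Evaluation: the user's feedback on the result, again crisp and fuzzy.
evaluation = {
    "crustiness": ("fuzzy", "less"),     # "less crusty than I wanted"
    "overall": -2,                       # Likert-style score in [-5, 5]
}
```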
5
The overall procedure in three steps
TASK → Mining → candidate RECIPES → Fuzzy System Inference → fine-tuned RECIPE → Reinforcement learning → improved RECIPE
6
The scientific challenge
Consider Horn clauses such as
◦ If less crusty and soggy then increase rising time
◦ If very crusty and crisp then decrease rising time
involving:
◦ crisp variables: rising time
◦ fuzzy variables: crustiness, humidity
with fuzzy quantifiers: less, normal, very
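A sketch of how such rules could be encoded. The hedge operators below (very = concentration, less = dilation) are standard fuzzy-logic choices and an assumption for illustration, not necessarily the ones adopted by the project.

```python
import math

def very(mu):      # concentration hedge: sharpens the membership degree
    return mu ** 2

def less(mu):      # dilation hedge: softens the membership degree
    return math.sqrt(mu)

def normal(mu):    # identity hedge
    return mu

# Each rule: hedged fuzzy antecedents on the bread attributes and a crisp
# action on the rising time (minutes), as in the two clauses above.
rules = [
    {"antecedent": [("crusty", less), ("soggy", normal)], "action": +10},
    {"antecedent": [("crusty", very), ("crisp", normal)], "action": -10},
]
```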
7
Fuzzy Inference System
[Figure: the generic fuzzy rule system]
[Figure: the Sugeno variant]
8
Hence we must infer x as well
[Figure: Sugeno network on the input x = (x1, x2), with crustiness and humidity as inputs and rising time as output. The fuzzy sets A, B give the rule weights w_ij through the product T-norm; the normalized weights w̄_i combine the Sugeno functions f_i into the output f = w̄1 f1(x, α) + w̄2 f2(x, α).]
With the further complication…
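A minimal sketch of the Sugeno combination in the figure: membership degrees, product T-norm rule weights, normalization, and the weighted sum of the consequents. Membership shapes, rule pairing and parameter values are illustrative assumptions, not the project's actual settings.

```python
import numpy as np

def gauss(x, mean, std):
    """Membership degree of x in a Gaussian fuzzy set."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2)

def sugeno_output(x, alpha):
    """x = (crustiness, humidity); returns the rising-time setting f."""
    x1, x2 = x

    # Two fuzzy sets per input, as in the figure.
    a = [gauss(x1, 0.3, 0.2), gauss(x1, 0.7, 0.2)]   # A1, A2 on crustiness
    b = [gauss(x2, 0.3, 0.2), gauss(x2, 0.7, 0.2)]   # B1, B2 on humidity

    # Rule weights via the product T-norm (here one rule per (A_i, B_i) pair).
    w = np.array([a[0] * b[0], a[1] * b[1]])
    w_bar = w / w.sum()                               # normalized weights

    # Linear Sugeno consequents f_i(x, alpha) = alpha_i0 + alpha_i1*x1 + alpha_i2*x2.
    f = np.array([alpha[i, 0] + alpha[i, 1] * x1 + alpha[i, 2] * x2
                  for i in range(2)])

    return float(np.dot(w_bar, f))                    # f = w̄1 f1 + w̄2 f2

# Example: rising time (minutes) for a given crustiness/humidity pair.
alpha = np.array([[60.0, 30.0, -20.0],
                  [90.0, -10.0, 15.0]])
print(sugeno_output((0.4, 0.6), alpha))
```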
9
Rather we must infer from the evaluation g induced by f
[Figure: the same network, with fuzzy sets A1, A2 on crustiness and B1, B2 on humidity, weights w_ij, normalized weights w̄_i, Sugeno functions f_i, and output f = w̄1 f1(x, α) + w̄2 f2(x, α).]
Output f: the rising-time setting
Evaluation g: the judgment proposed on crustiness and humidity, e.g. "It tastes somewhat crusty for my teeth", expressed on a Likert scale from -5 (too soft) to +5 (too crusty)
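A sketch of the observable signal: the system never sees the "right" rising time, only the user's Likert judgment g on the resulting bread. The mapping below is a toy stand-in for the user, an assumption made purely for illustration.

```python
def user_judgment(crustiness, desired=0.5, scale=10.0):
    """Toy evaluation g: signed Likert score in [-5, 5].
    Negative = too soft, positive = too crusty, 0 = as desired."""
    score = scale * (crustiness - desired)
    return max(-5.0, min(5.0, score))

# The error the system can actually minimize is built on g, e.g. E = g^2.
g = user_judgment(crustiness=0.72)
E = g ** 2
```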
10
A mixture of identification and control
Let's recall distal learning by Rumelhart and Jordan:
learn to compute the control u once you have learnt the PLANT model for whatever u.
[Figure: the controller network NNc produces u, the identification network NNi models the PLANT, and the plant output o is compared with the desired output od.]
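A compact sketch of the distal-learning idea, with both the controller NNc and the plant model NNi reduced to one-parameter-pair linear maps so the gradients stay readable; the plant below is a toy stand-in, not the appliance.

```python
import numpy as np

rng = np.random.default_rng(0)

def plant(u):                      # unknown real plant: o = PLANT(u)
    return 2.0 * u + 0.5

# --- Identification phase: learn NNi, a model o ≈ a*u + b of the plant ---
a, b = rng.normal(), rng.normal()
for _ in range(200):
    u = rng.uniform(-1, 1)
    o = plant(u)
    err = (a * u + b) - o
    a -= 0.1 * err * u             # gradient step on the squared model error
    b -= 0.1 * err

# --- Control phase: learn NNc, u = c*od + d, by backpropagating the output
# error through the frozen plant model NNi (never through the real plant) ---
c, d = rng.normal(), rng.normal()
for _ in range(200):
    od = rng.uniform(0, 2)         # desired output
    u = c * od + d                 # controller proposal
    err = (a * u + b) - od         # predicted response vs. desired output
    # chain rule: dE/dc = err * a * od, dE/dd = err * a
    c -= 0.05 * err * a * od
    d -= 0.05 * err * a

print("control for od = 1.0 gives o =", plant(c * 1.0 + d))
```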
11
Computational issues
Output f: rising time → positive continuous
Evaluation g: judgment → Likert scale from -5 to +5 (too low / optimal / too high)
Parameters θ:
◦ of a membership function
   vertices a_hc, b_hc, c_hc of a triangular mf (e.g. for High Crustiness)
   mean and std of an asymmetric Gaussian mf
   …
◦ of the Sugeno function, usually linear in the active variables, e.g. f1(x, α) = α0 + α1 x1 + α2 log(x2)
◦ the input position within the membership function as well, in order to identify the input coordinates
Error E: e.g. g²
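A sketch of the two membership-function parameterizations and of the slide's example consequent; the parameter values are illustrative.

```python
import numpy as np

def triangular_mf(x, a, b, c):
    """Triangular membership with vertices a < b < c (e.g. a_hc, b_hc, c_hc)."""
    left = (x - a) / (b - a)
    right = (c - x) / (c - b)
    return float(np.clip(min(left, right), 0.0, 1.0))

def asym_gaussian_mf(x, mean, std_left, std_right):
    """Asymmetric Gaussian membership: different spread on each side of the mean."""
    std = std_left if x < mean else std_right
    return float(np.exp(-0.5 * ((x - mean) / std) ** 2))

def sugeno_f1(x, alpha):
    """The slide's example consequent: f1(x, α) = α0 + α1*x1 + α2*log(x2)."""
    x1, x2 = x
    return alpha[0] + alpha[1] * x1 + alpha[2] * np.log(x2)

def error(g):
    """Error on a Likert judgment g, as on the slide: E = g^2."""
    return g ** 2
```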
12
Simply a richer derivative chain
Legend: output f: rising time; evaluation g: judgment; parameters θ; error E: e.g. g² or (y − f)²
Identification phase: canonical learning update rule
Control phase: analytic way, numeric way, canonical learning update rule
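A sketch of the two ways of getting dE/dθ in the control phase, assuming a differentiable stand-in for the evaluation g(f); in practice g comes from the user, so its dependence on f must be modelled or estimated. The toy functions below are illustrative only.

```python
def numeric_grad(E, theta, eps=1e-5):
    """Finite-difference gradient of E(θ): the 'numeric way'."""
    grad = []
    for i in range(len(theta)):
        t_plus = list(theta);  t_plus[i] += eps
        t_minus = list(theta); t_minus[i] -= eps
        grad.append((E(t_plus) - E(t_minus)) / (2 * eps))
    return grad

# 'Analytic way': when f(θ) and g(f) are known in closed form, chain the
# derivatives, e.g. with E = g^2:  dE/dθ = 2 * g(f(θ)) * g'(f(θ)) * f'(θ).

f = lambda th: th[0] + 2.0 * th[1]          # output as a function of θ
g = lambda fv: 0.5 * (fv - 3.0)             # judgment induced by f
E = lambda th: g(f(th)) ** 2                # error on the judgment

theta = [1.0, 0.5]
print(numeric_grad(E, theta))                               # numeric way
print([2 * g(f(theta)) * 0.5 * d for d in (1.0, 2.0)])      # analytic check
```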
13
On-line learning
1. Prepare a new bread
2. Taste it
3. Evaluate the bread
4. On the basis of the evaluation, adjust the bread machine parameters through the neuro-fuzzy system
[Figure: the neuro-fuzzy system maps the evaluation g(t) of the current recipe f(t) into the next recipe f(t+1), which in turn produces the next evaluation g(t+1).]
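A sketch of the on-line loop wiring the pieces above together. bake() and taste_and_judge() are toy stand-ins for the appliance and the user (assumptions for illustration); the update is a plain gradient-style step on E = g² via a numeric derivative.

```python
def bake(rising_time):
    """Toy appliance: longer rising -> crustier bread (purely illustrative)."""
    return 0.01 * rising_time

def taste_and_judge(crustiness, desired=0.6):
    """Toy user: Likert judgment in [-5, 5], 0 means 'just right'."""
    return max(-5.0, min(5.0, 10.0 * (crustiness - desired)))

def online_learning(rising_time=30.0, n_rounds=20, lr=10.0, eps=1e-3):
    for t in range(n_rounds):
        g = taste_and_judge(bake(rising_time))      # prepare, taste, evaluate
        if g == 0.0:                                # judged optimal: stop
            break
        # numeric dE/dθ with E = g^2 and θ = rising time
        g_plus = taste_and_judge(bake(rising_time + eps))
        dE = ((g_plus ** 2) - (g ** 2)) / eps
        rising_time -= lr * dE                      # adjust the machine parameter
    return rising_time

print(online_learning())   # approaches 60 minutes in this toy setting
```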
14
Two kinds of back-propagated signals
1. Task-related signals
   a. user judgment, on-line
   b. target appliance parameter, off-line mode
2. Empirical evidence-related signals
[Figure: Low, Medium and High Crustiness membership functions.]
The double life of the g_i's: fuzzy sets as for the antecedents, Likert metrics as for the consequent
15
Early numerical results: a case study
Membership function and coordinates inference
[Figures: input trajectory (green arrow) and true input (red points); the coordinate tracks; true membership functions vs. inferred membership functions; error course]
Error computed on an arbitrary target with non-linear Sugeno functions
16
Early experiments: a closely related case
Main features:
◦ 10 parameters to beautify the face
◦ 4 evaluation criteria (from -5 to +5)
◦ neither analytical nor monotone relations between parameters and evaluation
http://37.187.78.130/facedeform/
17
The identification phase
Relating the four judgments to two parameters
18
Training and generalization problems
No training from judgments if no on-line learning.
No on-line learning if the training algorithm is not efficient.
19
Say bye bye Bruno