1
Dissuasive Methods Against Cheaters in Distributed Systems. Kévin Huguenin. Ph.D. defense, December 10th, 2010.
2
Distributed Systems and Models: computers (Von Neumann machines) connected by lossy links (cable, fiber, wireless), each running an event-driven protocol, e.g., upon receive(x): y = x + y; send y; nodes are subject to crashes and bugs.
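The node model sketched on this slide (an event handler that updates local state and forwards it over lossy links) can be illustrated as follows; this is a minimal Python sketch, and the Node class and network interface are assumptions, not part of the thesis:

```python
class Node:
    """A machine running an event-driven protocol over unreliable links
    (cable, fiber, wireless), itself subject to crashes and bugs."""

    def __init__(self, node_id, network):
        self.id = node_id
        self.network = network   # hypothetical transport; messages may be lost
        self.y = 0               # local state

    def upon_receive(self, x):
        # the handler shown on the slide: upon receive(x): y = x + y; send y
        self.y = x + self.y
        self.network.send(self.id, self.y)
```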
3
Fault Models and Approaches, from bad to very bad: crashes and losses (hardware), Byzantine faults, dishonest users (rational people); each model comes with its pros and cons.
4
Approaches, by example: masking (more roads), preventing (speed governors), dissuading (speed traps & fines).
5
Dissuasive Approach, How To: detection, then punishment.
6
Outline: Human nature ◦Collaborative dissemination; Social nature ◦Computation in Social Networks
7
Collaborative dissemination
8
Collaborative Dissemination, Principle: epidemic dissemination.
9
Collaborative Dissemination, Attacks and Dissuasion. Each period a node proposes the chunk ids it holds, e.g., 12, 22, 31, to its guests (4 in the example; cheap, like free SMS); guests request the ids they miss, e.g., 12, 22 (cheap); the node then sends the requested chunks (costly, like MMS); the guests receive 12, 22, 31. Possible attacks: propose less, invite fewer guests, send less, bias the selection of guests.
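A minimal sketch of the three-phase exchange the slide depicts (propose chunk identifiers, request the missing ones, serve the payloads), assuming a pull-based gossip with a fixed fanout; the class and function names and the fanout value are illustrative assumptions:

```python
import random

FANOUT = 4  # number of partners ("guests") contacted per gossip period (illustrative)

class Peer:
    def __init__(self):
        self.chunks = {}                                # chunk_id -> payload

    def on_propose(self, chunk_ids):
        # request only the chunks this peer is missing (cheap, like an SMS)
        return [c for c in chunk_ids if c not in self.chunks]

    def on_serve(self, chunk_id, payload):
        self.chunks[chunk_id] = payload                 # costly transfer, like an MMS

def gossip_period(peer, directory):
    """One period: propose to FANOUT random partners, then serve what they request."""
    for partner in random.sample(directory, FANOUT):
        wanted = partner.on_propose(list(peer.chunks))  # e.g. propose [12, 22, 31]
        for chunk_id in wanted:                         # e.g. request [12, 22]
            partner.on_serve(chunk_id, peer.chunks[chunk_id])

# toy usage: disseminate chunk 31 from one peer to a directory of peers
peers = [Peer() for _ in range(8)]
peers[0].chunks[31] = b"payload"
gossip_period(peers[0], peers[1:])
```

The attacks listed on the slide map directly onto this loop: a freerider can propose fewer identifiers, contact fewer partners, serve fewer of the requested chunks, or bias the random partner selection.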
10
Collaborative Dissemination, Challenges and Solution: verifications and decision. Nodes keep a log of their exchanges and cross-check each other's claims (A asks C: did B contact you, and did B send what was asked?); the answers feed a per-node score, and a node whose score falls below a threshold is punished. This covers the attacks above: proposing less, inviting fewer guests, sending less, biasing the selection.
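The verification-and-decision step can be illustrated with the following sketch: a node's claim ("I sent chunks 12 and 22 to C") is cross-checked against the witness's log, and repeated negative checks drive a score below a punishment threshold. The scoring rule, the threshold, and the data layout are illustrative assumptions, not the exact mechanism of the thesis:

```python
PUNISH_THRESHOLD = 0   # assumption: punish once the score falls below this

def cross_check(witness_log, accused_id, claimed_chunks):
    """Ask the third party C: did the accused contact you, and did it send
    what you asked for? (the dialogue shown on the slide)"""
    served = witness_log.get(accused_id)
    if served is None:
        return False                              # "he never contacted me"
    return set(claimed_chunks) <= set(served)     # everything claimed was really served

def update_scores(scores, verifications):
    """verifications: iterable of (accused_id, witness_log, claimed_chunks)."""
    for accused_id, witness_log, claimed in verifications:
        ok = cross_check(witness_log, accused_id, claimed)
        scores[accused_id] = scores.get(accused_id, 0) + (1 if ok else -1)
    return {nid for nid, score in scores.items() if score < PUNISH_THRESHOLD}

# example: A verifies B's claim that it sent chunks 12 and 22 to C
punished = update_scores({}, [("B", {"B": [12, 22]}, [12, 22])])
assert punished == set()   # the claim checks out, so B is not punished
```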
11
Social networks 11
12
Computation in Social Networks, e.g., polling: "Should partners be invited?" Reactions: "Yes! But it sounds cheesy…", "No! What if my partner were to learn about it?", "No way! I have to prevent this from happening", "But what if people find out?"
13
Computation in Social Networks. A new model of entities ◦reputation ◦privacy; and of computation ◦a set of entities ◦input values ◦compute f(v1, …, vn)?
14
The S³ Problem, Definition: Scalable and Secure distributed computations in Social networks.
15
The S³ Problem, Definition: candidate. An S³ candidate is a quadruple (V, F, d, f), where V is an arbitrary (input) set, (F, d) is a metric space, and f is a symmetric function from input tuples over V to F.
16
The S³ Problem, Definition: scalability. √-Scalability: the per-node message, spatial, and computational complexities are in O(√n · polylog n).
17
The S³ Problem, Definition: accuracy. √-Accuracy: the distance d between the output of every non-faulty node and the exact value f(v1, …, vn) is bounded (the relative error remains small), where v1, …, vn are the nodes' inputs.
18
The S³ Problem, Definition: privacy (probabilistic anonymity). For any trace generated from a non-trivial configuration, for any coalition of faulty nodes, and for any non-faulty node, there exists another trace, generated from a configuration in which that node's input is swapped with another node's, that the coalition cannot distinguish from the first.
19
The S³ Problem, Definition: privacy. Privacy as probabilistic anonymity: ◦trivial input configurations are discarded ◦(strong): trivial = the inputs can be inferred from the output alone ◦(weak): trivial = all inputs are equal
20
The S³ Problem, Definition: faults. Model of faulty nodes: ◦they deviate from the protocol BUT ◦they never behave in such a way that their misbehavior is detected with certainty
21
Solving (√,√,weak)-S³, Architecture: √n groups of size √n arranged in a ring.
22
Solving (√,√,weak)-S³, Demo: polling (animation of individual +1 shares being counted within groups and the partial tallies aggregated along the ring).
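The polling demo can be approximated by the following toy Python sketch, under the common structure of such schemes: each node encodes its ±1 vote as a small set of shares that sum to the vote, shares are tallied within groups, and the partial tallies are aggregated along the ring of groups. The number of shares (2k+1), the group layout, and the function names are assumptions made for illustration:

```python
import random

def shares(vote, k=1):
    """Encode a vote in {-1, +1} as 2k+1 shares summing to the vote:
    k+1 copies of the vote and k copies of its opposite (k is illustrative)."""
    s = [vote] * (k + 1) + [-vote] * k
    random.shuffle(s)
    return s

def poll(groups, votes, k=1):
    """groups: lists of node ids arranged in a ring; votes: node id -> +/-1."""
    tallies = []
    for g in range(len(groups)):
        senders = groups[(g - 1) % len(groups)]          # previous group on the ring
        received = [sh for p in senders for sh in shares(votes[p], k)]
        tallies.append(sum(received))                    # local (group-level) count
    return sum(tallies)                                  # aggregation along the ring

# toy run: 3 groups of 3 nodes
groups = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
votes = {i: random.choice([-1, 1]) for i in range(9)}
assert poll(groups, votes) == sum(votes.values())        # exact tally with no cheater
```

In an actual deployment each share would be sent to a different member of the receiving group, so that no single node sees a whole vote; here the shares are lumped together only to keep the tally arithmetic visible.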
23
Solving (√,√,weak)-S³, Theorem: the protocol S³-computes aggregation functions for inputs taken from a bounded set.
24
Solving (√,√,weak)-S³, Proof: scalability. Messages and memory per node are in O(√n · polylog n).
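A back-of-the-envelope illustration of the scalability argument, with an assumed network size: if a node only ever interacts with the members of a constant number of groups of size about √n, its message and memory costs grow like √n rather than n:

```python
import math

n = 10_000                         # illustrative network size
group_size = round(math.sqrt(n))   # about sqrt(n) = 100 nodes per group
# a node exchanging messages only with a constant number of such groups
# handles on the order of 100 messages and log entries, not 10,000
print(group_size)                  # 100
```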
25
Solving (√,√,weak)-S³, Proof: accuracy. Attack: voting (a faulty node sends biased +1 shares to inflate the tally).
26
Solving (√,√,weak)-S³, Proof: accuracy. Attack: counting (a faulty node misreports the partial tally it forwards).
27
Solving (√,√,weak)-S³, Proof: accuracy. Attack: token corruption (a faulty node corrupts the aggregation token passed along the ring).
28
Solving (√,√,weak)-S³, Proof: accuracy. Impact of one faulty node: ◦voting: bounded ◦counting: bounded ◦aggregation along the ring: none. The resulting relative error is bounded.
29
Solving (√,√,weak)-S³, Proof: privacy. With high probability there is a group containing no faulty node; for two non-faulty nodes p and q in such a group, there exist (w.h.p.) two equivalent traces in which the inputs of p and q are swapped, with the output unchanged.
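A toy check of the swap argument, reusing the illustrative share encoding from the polling sketch above: in a group with no faulty node, swapping the votes of two non-faulty nodes leaves both the multiset of shares leaving the group and the group's tally unchanged, so an outside coalition sees the same thing in either case. This only illustrates the idea; the actual proof reasons, with high probability, about which nodes receive which shares:

```python
from collections import Counter

def shares(vote, k=1):
    return [vote] * (k + 1) + [-vote] * k        # illustrative +1/-1 share encoding

def outside_view(group_votes):
    """What a coalition outside the group observes in this toy model:
    the multiset of shares leaving the group and the group's contribution."""
    outgoing = [sh for v in group_votes for sh in shares(v)]
    return Counter(outgoing), sum(outgoing)

# p votes +1 and q votes -1; swapping their inputs changes nothing observable
assert outside_view([+1, -1, +1]) == outside_view([-1, +1, +1])
```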
30
Generalization: the protocol can compute the multi-set of inputs; it can compute any regular function with a fixed input set.
31
Conclusion & Perspectives: user-centric models, practical solutions; boundaries; massively multiplayer online games.