Comment Spam Identification Eric Cheng & Eric Steinlauf
What is comment spam?
Total spam: 1,226,026,178. Total ham: 62,723,306. 95% are spam! Source: http://akismet.com/stats/, retrieved 4/22/2007
Countermeasures
Blacklisting 5yx.org 9kx.com aakl.com aaql.com aazl.com abcwaynet.com abgv.com abjg.com ablazeglass.com abseilextreme.net actionbenevole.com acvt.com adbx.com adhouseaz.com advantechmicro.com aeur.com aeza.com agentcom.com ailh.org akbu.com alaskafibre.com alkm.com alqx.com alumcasting-eng-inc.co! americanasb.com amwayau.com amwaynz.com amwaysa.com amysudderjoy.com anfb.com anlusa.net aobr.com aoeb.com apoctech.com apqf.com areagent.com artstonehalloweencostumes.com globalplasticscrap.com gowest-veritas.com greenlightgo.org hadjimitsis.com healthcarefx.com herctrade.com hobbyhighway.com hominginc.com hongkongdivas.com hpspyacademy.com hzlr.com idlemindsonline.com internetmarketingserve.com jesh.org jfcp.com jfss.com jittersjapan.com jkjf.com jkmrw.com jknr.com jksp.com jkys.com jtjk.com justfareed.com justyourbag.com kimsanghee.org kiosksusa.com knivesnstuff.com knoxvillevideo.com ksj! kwscuolashop.com lancashiremcs.com lnjk.com localmediaaccess.com lrgww.com marketing-in-china.com rockymountainair.org rstechresources.com samsung-integer.com sandiegonhs.org screwpile.org scvend.org sell-in-china.com sensationalwraps.com sevierdesign.com starbikeshop.com struthersinc.com swarangeet.com thecorporategroup.net thehawleyco.com thehumancrystal.com thinkaids.org thisandthatgiftshop.net thomsungroup.com ti0.org timeby.net tradewindswf.com tradingb2c.com turkeycogroup.net vassagospalace.com vyoung.net web-toggery.com webedgewars.com webshoponsalead.com webtoggery.com willman-paris.com worldwidegoans.com
Captchas "Completely Automated Public Turing test to tell Computers and Humans Apart"
Other ad-hoc/weak methods: authentication/registration, comment throttling, disallowing links in comments, moderation
Our Approach – Naïve Bayes: statistical, adaptive, automatic, scalable and extensible; works well for spam
Naïve Bayes
P(A|B) ∙ P(B) = P(B|A) ∙ P(A) = P(A∩B)
P(A|B) ∙ P(B) = P(B|A) ∙ P(A)
P(A|B) = P(B|A) ∙ P(A) / P(B)
P(spam|comment) = P(comment|spam) ∙ P(spam) / P(comment)
P(spam|comment) = P(w₁|spam) ∙ P(w₂|spam) ∙ … ∙ P(wₙ|spam) ∙ P(spam) / P(comment)   (naïve independence assumption)
P(w₁|spam): probability of w₁ occurring given a spam comment
P(w₁|spam) = 1 − (1 − x/y)ⁿ
Probability of w₁ occurring given a spam comment, where x is the number of times w₁ appears in all spam messages, y is the total number of words in all spam messages, and n is the length of the given comment.
Corpus (spam): “Texas casino”, “Online Texas hold’em”
Incoming comment: “Texas gambling site”
P(Texas|spam) = 1 − (1 − 2/5)³ = 0.784
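A minimal Python sketch of this estimate (the function and variable names are illustrative, not from the original implementation):

```python
def word_given_spam(word, spam_corpus, n):
    """P(word|spam) = 1 - (1 - x/y)**n for a comment of length n."""
    all_words = [w for msg in spam_corpus for w in msg.split()]
    x = all_words.count(word)   # occurrences of the word across all spam
    y = len(all_words)          # total number of words across all spam
    return 1 - (1 - x / y) ** n

corpus = ["Texas casino", "Online Texas hold'em"]   # y = 5, "Texas" appears twice
p = word_given_spam("Texas", corpus, n=3)           # comment "Texas gambling site"
# p = 1 - (1 - 2/5)**3 ≈ 0.784
```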
P(spam|comment) = P(w₁|spam) ∙ P(w₂|spam) ∙ … ∙ P(wₙ|spam) ∙ P(spam) / P(comment)
P(ham|comment) = P(w₁|ham) ∙ P(w₂|ham) ∙ … ∙ P(wₙ|ham) ∙ P(ham) / P(comment)
P(wᵢ|spam): probability of wᵢ occurring given a spam comment
P(spam): probability of any comment being spam
P(comment): ??????
P(spam|comment) ∝ P(w₁|spam) ∙ P(w₂|spam) ∙ … ∙ P(wₙ|spam) ∙ P(spam)
P(ham|comment) ∝ P(w₁|ham) ∙ P(w₂|ham) ∙ … ∙ P(wₙ|ham) ∙ P(ham)
(the unknown P(comment) divides both sides equally, so it drops out of the comparison)
log(P(spam|comment)) = log(P(w₁|spam)) + log(P(w₂|spam)) + … + log(P(wₙ|spam)) + log(P(spam)) − log(P(comment))
log(P(ham|comment)) = log(P(w₁|ham)) + log(P(w₂|ham)) + … + log(P(wₙ|ham)) + log(P(ham)) − log(P(comment))
(the shared − log(P(comment)) term cancels in any comparison, so we drop it)
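Summing logs instead of multiplying raw probabilities also avoids floating-point underflow for long comments; a quick sketch:

```python
import math

probs = [1e-5] * 100   # per-word probabilities for a 100-word comment

product = 1.0
for p in probs:
    product *= p       # underflows to exactly 0.0 well before 100 factors

log_score = sum(math.log(p) for p in probs)   # ≈ -1151.3, perfectly representable
```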
Fact: P(spam|comment) = 1 − P(ham|comment)
Abuse of notation: P(s) = P(spam|comment), P(h) = P(ham|comment)
P(s) = 1 − P(h)
m = log(P(s)) − log(P(h)) = log(P(s)/P(h))
eᵐ = e^log(P(s)/P(h)) = P(s)/P(h)
eᵐ ∙ P(h) = P(s) = 1 − P(h)
(eᵐ + 1) ∙ P(h) = 1
P(h) = 1/(eᵐ + 1)
P(s) = 1 − P(h)
In full notation:
m = log(P(spam|comment)) − log(P(ham|comment))
P(ham|comment) = 1/(eᵐ + 1)
P(spam|comment) = 1 − P(ham|comment)
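When an actual probability is needed, the margin m converts back as derived above; a sketch (the function name is ours):

```python
import math

def spam_probability(log_spam_score, log_ham_score):
    """P(spam|comment) recovered from the log-score margin m."""
    m = log_spam_score - log_ham_score
    p_ham = 1.0 / (math.exp(m) + 1.0)   # P(h) = 1/(e^m + 1)
    return 1.0 - p_ham                  # P(s) = 1 - P(h)

spam_probability(-10.0, -10.0)       # equal scores -> 0.5
spam_probability(math.log(9), 0.0)   # margin of log 9 -> 0.9
```

Note that math.exp(m) overflows for very large margins, which is one more reason to just compare the two log scores in practice.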
In practice, we just compare log(P(spam|comment)) and log(P(ham|comment)) directly.
Implementation
Corpus
A collection of 50 blog pages with 1024 comments, manually tagged as spam/non-spam; 67% are spam. Provided by the Informatics Institute at the University of Amsterdam.
Blocking Blog Spam with Language Model Disagreement, G. Mishne, D. Carmel, and R. Lempel. In: AIRWeb '05 – First International Workshop on Adversarial Information Retrieval on the Web, at the 14th International World Wide Web Conference (WWW2005), 2005.
Most popular spam words casino betting texas biz holdem poker pills pokerabc teen online bowl gambling sonneries blackjack pharmacy
“Clean” words edu projects week etc went inbox bit someone bike already selling making squad left important pimps
Implementation: corpus parsing and processing; Naïve Bayes algorithm; randomly select 70% of comments for training, 30% for testing; stand-alone web service; written entirely in Python
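The random 70/30 split might look like the following sketch (names and the seed are illustrative, not from the original code):

```python
import random

def split_corpus(tagged_comments, train_fraction=0.7, seed=0):
    """Shuffle tagged comments and split them into training and test sets."""
    shuffled = list(tagged_comments)
    random.Random(seed).shuffle(shuffled)   # seeded for reproducibility
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# 1024 toy tagged comments, mirroring the corpus size above
tagged = [("buy texas holdem online", "spam"), ("nice post, thanks", "ham")] * 512
train, test = split_corpus(tagged)   # 716 for training, 308 for testing
```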
It’s showtime!
Configurations: separator used to tokenize the comment; inclusion of words from the header; classifying based only on the most significant words; double-counting non-spam comments; including the article body as a non-spam example; boosting
Minimum Error Configuration Separator: [^a-z<>]+ Header: Both Significant words: All Double count: No Include body: No Boosting: No
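A sketch of tokenizing with that separator (the regex is as given above; the function name is ours):

```python
import re

SEPARATOR = re.compile(r"[^a-z<>]+")   # the minimum-error separator above

def tokenize(comment):
    """Lowercase the comment, then split on runs of separator characters."""
    return [tok for tok in SEPARATOR.split(comment.lower()) if tok]

tokenize("Texas Hold'em -- play <a href=...> NOW!")
# ['texas', 'hold', 'em', 'play', '<a', 'href', '>', 'now']
```

Keeping < and > out of the separator class preserves HTML fragments like '<a' as tokens, so the classifier can learn from markup that spammers use for links.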
Varying Configuration Parameters
Boosting
Naïve Bayes is applied repeatedly to the data, producing a weighted-majority model.

bayes_models = []
weights = [1.0] * len(examples)   # start with uniform example weights
for i in range(M):
    model = naive_bayes(examples, weights)
    error = compute_error(model, examples)
    weights = adjust_weights(examples, weights, error)
    bayes_models.append((model, error))
    if error == 0:
        break
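At prediction time, the boosted models could be combined into a weighted-majority vote; one common choice (our illustration, not necessarily the original scheme) weights each model by log((1 − error)/error), assuming 0 < error < 0.5:

```python
import math

def weighted_majority(models, comment):
    """Vote across boosted models; each model maps a comment to 'spam' or 'ham'."""
    votes = {"spam": 0.0, "ham": 0.0}
    for model, error in models:                 # assumes 0 < error < 0.5
        alpha = math.log((1 - error) / error)   # lower-error models vote louder
        votes[model(comment)] += alpha
    return max(votes, key=votes.get)

# Two disagreeing weak models: the lower-error one carries the vote.
models = [(lambda c: "spam", 0.1), (lambda c: "ham", 0.4)]
weighted_majority(models, "texas holdem online")   # 'spam'
```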
Boosting
Future work (or what we did not do)
Data Processing: follow links in comments and include words from the target web page; more sophisticated tokenization and URL handling (e.g., handling $100,000...); word stemming
Features: ability to incorporate incoming comments into the corpus; ability to mark a comment as spam/non-spam; assign more weight to page content; adjust the probability table based on page content, providing content-sensitive filtering
Comments? No spam, please.