BannedContentDiscussion

Last major edit (later minor edits)

Summary: Some more documentation of the tools at our disposal

Changed:

< We're using a Captcha to protect this wiki: [[Oddmuse:QuestionAsker Extension]]. In addition to that, we support a BannedContent page that prevents pages from being saved if they contain text matching one of the regular expressions.
< ## Old
< The rest of this page contains outdated information.
< Related pages:
< * UnbannedContentProposals to remove junk from the list
< * BannedContentMeasurements for some (too few!) numbers
< Discussion:
< * BannedContentExpiration for automatic expiration
< ## Anti-Spam Network
< If you use our format, you can join the network.
< Basically we all copy our BannedContent from
< [http://www.emacswiki.org/cgi-bin/wiki/BannedContent EmacsWiki BannedContent].
< ## Format
< See MeatBall:SharedAntiSpam for the format. Basically # introduces a comment,
< everything else is a regular expression that matches URLs.
< ## Implementation
< The script that copies regular expressions from A to B, but leaves the
< comments at B intact and adds timestamps for the new regular expressions
< imported:
< * http://www.emacswiki.org/scripts/merge-banned-lists
< It uses wikiput, which is available here (for Oddmuse wikis):
< * http://www.oddmuse.org/cgi-bin/oddmuse/wikiput
< References:
< * MeatBall:PeerToPeerBanList
< ## Joining
< If you want to use this list, just go ahead. It might be easier to parse if you get the plain text:
< * http://www.emacswiki.org/cgi-bin/community/raw/BannedContent
< ## Cases
< [[Banned Content Reduces Pagerank Case]] -- a web hosting provider wants to get a client off the list...
< ## Discussion
< [new:MattisManzel:2004-07-09 10:56 UTC]
< Yes. I appreciate thinking forward in this direction. But I guess you know how supertouchy this is.
< See SpamIsInformation.
< [new:RichardParker:2004-12-04 13:22 UTC]
< I am greatly concerned by the inclusiveness of some of the regular expressions used to ban domains. For example, if '''sex.com''' were banned like most of the banned domains, it probably would appear as '''sex\.com''' on the list. However, such an expression would match content like '''sussex.com''', '''sex.comprehensive-doctors-advice.org''' or '''sussex.company-maps.co.uk'''.
< I highly recommend that when banning domains the regular expressions should look something like '''"\b([\w\-.]+\.)?example\.com\b"''' instead of '''example\.com'''.
< -- RichardParker
< [new:BayleShanks:2004-12-05 21:38 UTC]
< The script "merge-list" no longer works from my PC. Error msg is "http://www.emacswiki.org/cgi-bin/wiki/raw/BannedContent: 500 Internal Server Error". "wget http://www.emacswiki.org/cgi-bin/wiki/raw/BannedContent" also fails. But my Firefox browser can load the same page fine. Any ideas? Could user-agent 'Mozilla/5.0
' possibly be blocked??
< [new:AlexSchroeder:2004-12-05 21:59 UTC]
< My wikis are located on a server that has been having Apache 2 trouble. From time to time it will go down. The machine is on the net, you can send packets to port 80, but the web server never responds, resulting in a time-out. Maybe that explains it? Note that I fixed the wikiput script to only put the page if there is actually any content. My cronjob used to not test for that, and it erased the target page whenever it got an error from the source page. :/
< [new:Kunda:2005-01-05 10:27 UTC]
< Greets, I contribute to wiki.s23.org. I was just wondering: what is the overhead of loading all the items on the BannedContent page into a wiki engine? Also, I'm confused as to why there are no numeric IPs on the list. If you reply, please add a quick follow-up link directed back here on [http://wiki.s23.org/wiki.pl?InterNal#BigBan S23-InterNal]
< Cheers --Kunda
< [new:AlexSchroeder:2005-01-09 12:57 UTC]
< I have no numbers on the loading of all items. Two things: I suspect that loading the file eats about as much memory as the file has bytes, and it is only loaded when a page is edited. That doesn't happen too often.
< As for IPs, note that this is banned /content/. There is also a BannedHosts page. The BannedContent applies to the /content/ of pages, not the host/IP of the author.
< [new]
< DOWN WITH CENSORSHIP!... WHAT RIGHT DO YOU THINK YOU HAVE TO SET UP A BLACKLIST OF SITES?... BASED ON WHAT CRITERIA?... AND WHY NOT A BOOK BURNING WHILE YOU'RE AT IT?... THIS IS NAZISM!...
< [new:AlexSchroederPoke:2005-02-03 15:13 UTC]
< Idiot. :)
< [new:BayleShanks:2005-02-03 01:44 UTC]
< Some people have blacklists larger than the limit for wiki page sizes.
< I suggest modifying OddMuse to merge the lists on all pages matching this regexp:
< <pre>
< BannedContent\d*
< </pre>

< that is, if BannedContent isn't big enough for you, you could continue the list on page BannedContent2, or BannedContent1234, etc.
< [new:AlexSchroeder:2005-02-03 15:14 UTC]
< Hm, yes...
< [new:CharlesNepote:2005-02-12 21:36 UTC]
< Thinking about another format.
< Many WikiEngines do not use Perl and won't be able to implement such a format. This format is also very limited, with no possibility of extensions. For example, on my wiki talking about sex, sex.com isn't spam... Why not use an agnostic format with more properties and extensibility? I have started to make some proposals for an RDF vocabulary (it's in French but the vocabulary is in English):
< * http://www.wikini.net/wakka.php?wiki=VocabulaireRDFAntiSpamDiscussions
< [new:]
< Hm. I don't think people would like to put this much effort into classifying spam. As for your s<nowiki>e</nowiki>x.com example, just add it to your local exception list... (I don't speak French, so I can't comment on the actual contents of your page) -- JorgenSchäfer:2005-02-12 21:43 UTC
< ----
< The guys at chongqed.org are also maintaining a blacklist: http://blacklist.chongqed.org/ This is based on the [http://chongqed.org/submit.html spam submissions] they get, which are checked manually to ensure only real spammers are listed. There are a few wikis which take that as their primary blacklist to sync with. Also, the dokuwiki software includes it by default. It could be they grabbed regular expressions from here to start the list; certainly it might make sense to organise some more inter-operation in this way. I think automated updating of blacklists can be extremely effective: a spammer attacks one wiki, then finds themselves immediately shut out of many wikis. But it will only help if propagating the regexps is something which happens automatically.
< I like CharlesNepote's idea (although I also don't understand French). At the moment you've started forming a little trusted network, but if we can keep track of who (which wiki) added a regexp, this might allow a kind of semi-trusted network to form, in which regexps propagate from the wiki which suffered the first attack to all other wikis automatically. I also think it would be good to keep track of ''when'' a regexp was added, and possibly when it last caught spam. This is useful to identify ones which we don't really need any more. This kind of information does not necessarily require a great deal of human classification effort. On the downside it will require considerable implementation effort, which makes it unlikely to be adopted across many different wiki softwares. See also [http://wiki.chongqed.org//ContentBanning discussion here]. -- [http://wiki.chongqed.org//Halz halz@chongqed]
< [new:AlexSchroeder:2005-08-26 06:15 UTC]
< I've seen spam containing only links to sites with domain names provided by dynamic DNS services or free hosting sites. This is problematic because the URL blacklist works for economic reasons: registering new domains costs (a little bit of) money. Either our blacklists will start blocking these, or we'll have to find another method of banning content.
< [new:SamRose:2007-09-14 20:53 UTC]
< I've been interested in wikis.onestepback.org/Ruse/page/show/FrequentlyAskedQuestions and their idea of a "TarPit". They claim it is working well on an open wiki. (Their wiki engine is a Ruby clone of UseMod.)

to

> We're using a Captcha to protect this wiki: [[Oddmuse:QuestionAsker Extension]].
> The BannedContent page prevents pages from being saved if they contain URLs matching one of the regular expressions. You can do this manually, or you can write the regular expression right after you rolled back a page, if you are an admin. There, you'll see the URLs that got rolled back and based on them you can probably write a good regular expression.
> The BannedRegexps page is harsher: the regular expressions it lists are entirely forbidden, no matter where they occur on the page (not just URLs).
> The BannedHosts page lists regular expressions matching IP numbers that are banned from editing. This is mostly just useful if people keep using their IP numbers. Sometimes we ban entire ranges. You can do this manually (using whois on the IP number), or you can use the proposals of the [[Oddmuse:Ban Contributors Extension]] extension.
> Note that all the banning actions only prevent page editing. Page reading is not affected.


We’re trying to fight WikiSpam, here.

We’re using a Captcha to protect this wiki: Oddmuse:QuestionAsker Extension asks editors for a password and records the fact that you answered correctly in a cookie. Since we’re fighting semi-automated spambots, the password doesn’t have to be hard to find, just not immediately obvious. On 2021-07-22 the name of the cookie was changed, so you must all answer the question again; the password itself was also changed and is on the page that’s mentioned but not linked by the prompt (but linked from this page). If you’re having difficulties, send me an email (finding the address also implies a sequence of links to follow, starting at Alex Schroeder).

The BannedContent page prevents pages from being saved if they contain URLs matching one of the regular expressions. You can do this manually, or you can write the regular expression right after you rolled back a page, if you are an admin. There, you’ll see the URLs that got rolled back and based on them you can probably write a good regular expression.
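
To make the mechanism concrete, here is a minimal sketch of how such a check might work. It is not the actual Oddmuse code (which is written in Perl); it is only a Python illustration, and the list entries and domain names in it are made up.

<pre>
import re

# A BannedContent-style list: '#' introduces a comment, every other
# non-empty line is a regular expression that matches URLs.
# These entries are made-up examples.
BANNED_CONTENT = r"""
# casino spam
\bonline-casino-payouts\.example\b
# pharmacy spam, any subdomain
\b([\w-]+\.)?cheap-pills\.example\b
"""

URL_RE = re.compile(r'https?://[^\s\]>"\']+')

def load_patterns(text):
    """Compile every non-comment, non-empty line as a regular expression."""
    patterns = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        patterns.append(re.compile(line, re.IGNORECASE))
    return patterns

def banned_url(page_text, patterns):
    """Return the first URL in the page that matches a banned pattern, or None."""
    for url in URL_RE.findall(page_text):
        for pattern in patterns:
            if pattern.search(url):
                return url
    return None

patterns = load_patterns(BANNED_CONTENT)
page = "Buy now at http://www.cheap-pills.example/offer !"
hit = banned_url(page, patterns)
if hit:
    print("Save rejected, banned URL:", hit)
</pre>

The second pattern also shows the word-boundary style recommended in the old discussion above: it catches the domain and its subdomains without also matching unrelated domains that merely contain the same letters.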

The BannedRegexps page is harsher: the regular expressions it lists are entirely forbidden, no matter where they occur on the page (not just URLs).
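
Again purely as an illustration, under the same assumptions as the sketch above: the same kind of list, but the patterns are applied to the whole submitted text rather than to the URLs extracted from it. The example patterns are hypothetical.

<pre>
import re

# A BannedRegexps-style list: matched against the whole page text,
# not just against URLs. Hypothetical example entries.
BANNED_REGEXPS = r"""
# BBCode link markup never belongs on this wiki
\[url=
# a phrase that only ever shows up in spam
\bbuy (cheap|discount) pills\b
"""

def banned_text(page_text, raw_list):
    """Return the first pattern that matches anywhere in the page, or None."""
    for line in raw_list.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        if re.search(line, page_text, re.IGNORECASE):
            return line
    return None

print(banned_text("Please [url=http://spam.example]click[/url]", BANNED_REGEXPS))
# prints: \[url=
</pre>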

The BannedHosts page lists regular expressions matching IP numbers that are banned from editing. This is mostly just useful if people keep using their IP numbers. Sometimes we ban entire ranges. You can do this manually (using whois on the IP number), or you can use the proposals of the Oddmuse:Ban Contributors Extension extension.
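
One last sketch under the same assumptions: BannedHosts entries are regular expressions matched against the IP number of the person trying to edit, so a whole range can be banned with a simple prefix pattern. The addresses below are documentation examples, not real banned hosts.

<pre>
import re

# A BannedHosts-style list: regular expressions matched against the
# editor's IP number. Example addresses only.
BANNED_HOSTS = r"""
# a single host
^192\.0\.2\.55$
# a whole range: everything in 198.51.100.*
^198\.51\.100\.
"""

def host_banned(ip, raw_list):
    """Return True if the IP number matches any non-comment pattern."""
    for line in raw_list.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        if re.search(line, ip):
            return True
    return False

print(host_banned("198.51.100.23", BANNED_HOSTS))  # True
print(host_banned("203.0.113.9", BANNED_HOSTS))    # False
</pre>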

Note that all the banning actions only prevent page editing. Page reading is not affected.

(CommunityWikiFooter)

Languages: