HighTrafficWiki

What design features would be needed for a really high-traffic wiki, with a level of traffic and frequency of posting comparable to SlashDot?


Handling a high frequency of posting

On most wikis, if people tried to post to a single page as fast as people post to a single story on SlashDot, almost everyone would get EditConflicts? and would never be able to complete their posts.

MergingAutomatically? could solve this problem.
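
To make the idea concrete, here is a minimal sketch of what MergingAutomatically? might look like, assuming the engine remembers the base revision each editor started from. It shells out to GNU diff3, which merges cleanly when the two edits touch different parts of the page; the function and variable names are made up for the example.

    import os
    import subprocess
    import tempfile

    def merge_edits(base_text, saved_text, submitted_text):
        """Three-way merge of two concurrent edits against their common base.

        Returns (merged_text, clean); clean is False if the edits overlap
        and conflict markers were inserted into the result.
        """
        paths = []
        try:
            # diff3 -m expects: MYFILE OLDFILE YOURFILE
            for text in (submitted_text, base_text, saved_text):
                fd, path = tempfile.mkstemp()
                with os.fdopen(fd, "w") as f:
                    f.write(text)
                paths.append(path)
            # Exit status 0 means a clean merge, 1 means conflicts were marked.
            result = subprocess.run(["diff3", "-m"] + paths,
                                    capture_output=True, text=True)
            return result.stdout, result.returncode == 0
        finally:
            for path in paths:
                os.remove(path)

    # Usage: instead of rejecting the save with an EditConflict, try to merge first;
    # only fall back to showing the conflict to the editor when clean is False.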

A “My Watchlist” feature can help people keep up to date on pages they find interesting, without having to follow the firehose of the complete RecentChanges.
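
To illustrate, a watchlist is essentially a filter over the RecentChanges stream; the change-record fields below are hypothetical.

    def watchlist_changes(recent_changes, watched_pages):
        """Return only the changes to pages the user has put on their watchlist.

        recent_changes: iterable of dicts such as {"page": ..., "author": ..., "time": ...}
        watched_pages: set of page names the user watches.
        """
        return [change for change in recent_changes
                if change["page"] in watched_pages]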

Wikipedia is already the #11 most frequently visited site, far more popular than Slashdot at #296 (according to Alexa.com). What are good technical features of Wikipedia? What features of Wikipedia would people do differently if they had to do it over again?


Handling large amounts of posting may be an advantage technically, but it doesn’t work out socially. People can’t keep up with that many comments. When a topic gets busy, a new comment has often already been posted before someone finishes writing their own, and they ought to take it into account. Unfortunately, they usually don’t, so you end up with redundancy even within a single topic.


it doesn’t work out socially.

(sarcasm)

Clearly, high-volume web sites such as SlashDot, Wikipedia, Google, MySpace, YouTube, C2, etc. can’t possibly work. Mr. Taco and the sysops of those other sites should realize this and stop accepting submissions from mere laypeople.

It should be obvious to the most casual observer that only authorized, accredited experts should be allowed to post anything to public parts of the Internet. (/sarcasm)

My goal for DistributingWiki / DistributedEditing is to make a wiki that still works great even when hit by the sorts of little glitches that knock ordinary websites offline. Although not one of my primary goals, I suspect that a wiki distributed over many servers will be able to handle more traffic than any one server could handle alone.

Here’s some info I’ve collected on improving wiki performance:

Web Proxy

Squid cache

Wikipedia has been dealing with this problem via Wikipedia:Squid_cache, which at least distributes the serving end of things.

Grid Server

Media Temple gridserver

GridServer supposedly can handle high traffic much better than a traditional VPS. Example of an OddMuse wiki running on GridServer: http://s4560.gridserver.com/cgi-bin/wiki2/FrontPage

Improvements to traditional single servers

Lighthttpd ("lighty") and fastcgi

(for Perl and Python based wiki engines)

Some wiki admins have reported better results using "lighty" instead of ApacheWebServer?. "lighty" ships with FastCGI support built in.

The main advantage of using "lighty" & FastCGI is that the wiki script stays resident as a long-running process, instead of being launched anew for every request as with plain CGI (a minimal FastCGI sketch follows at the end of this section).

Sheep Art is reported to run on "lighty" & FastCGI. Benchmark comparisons with OBM Wiki, which currently runs as traditional CGI under the Apache2 web server, show that Sheep Art serves repeated requests over time almost twice as fast as OBM Wiki. This is probably due in large part to "lighty" & FastCGI, since the tests compared "lighty" & FastCGI against CGI & Apache2 on both a very old, low-memory machine and a more modern one.

There is also mod_fastcgi for Apache.
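
For a Python-based wiki engine, the switch from plain CGI to FastCGI can be as small as wrapping the engine's WSGI application with a FastCGI server such as flup and pointing "lighty" (or Apache's mod_fastcgi) at the resulting socket. A minimal sketch, assuming a hypothetical module mywiki that exposes a WSGI callable named wiki_app:

    #!/usr/bin/env python
    # wiki.fcgi: keep the wiki engine resident instead of re-launching it per request.
    from flup.server.fcgi import WSGIServer

    from mywiki import wiki_app  # hypothetical: the wiki engine's WSGI application

    if __name__ == "__main__":
        # The web server ("lighty" or Apache with mod_fastcgi) talks to this
        # long-running process over a Unix socket.
        WSGIServer(wiki_app, bindAddress="/tmp/wiki-fastcgi.sock").run()

The saving comes from not forking a fresh interpreter and re-parsing the wiki engine on every hit, which is where most of the per-request cost of CGI goes.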

Distributing Wiki

I’ve listed some ideas on 2007-02-08.

Ultra Monkey?

At this time, I don’t know of any successful implementations of distributing a wiki across servers. [http://www.ultramonkey.org/ UltraMonkey?] may hold some possibilities, but something still needs to synchronize the application across servers.
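
To make the synchronization problem concrete, here is a sketch (not a working implementation) of the simplest push-style approach, where each node forwards a saved revision to its peers over HTTP. The peer list and the /replicate endpoint are hypothetical.

    import json
    import urllib.request

    PEERS = ["http://wiki-2.example.org", "http://wiki-3.example.org"]  # hypothetical peer nodes

    def push_revision(page, text, revision):
        """After saving locally, offer the new revision to every peer node.

        Peers would need MergingAutomatically?-style logic when two nodes
        accept conflicting edits at once; that is the hard, unsolved part.
        """
        body = json.dumps({"page": page, "text": text,
                           "revision": revision}).encode("utf-8")
        for peer in PEERS:
            request = urllib.request.Request(
                peer + "/replicate",  # hypothetical endpoint
                data=body,
                headers={"Content-Type": "application/json"},
            )
            try:
                urllib.request.urlopen(request, timeout=5)
            except OSError:
                pass  # an unreachable peer catches up later, e.g. by polling RecentChanges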

GridOs Ideas

Emerging technology is addressing distribution by creating a GridOs?.

Examples:

This technology seems to be out of reach for us mere mortals at the moment, however. I have not found any OpenSource software projects that have come anywhere near this (yet).

See Also

CategoryWikiTechnology

Discussion

BayleShanks, I’m intrigued by the question, and I’m excited to see the discussion you all are having.

I am wondering, though: “What motivates this question?”

Only a tiny handful of wikis (likely not including WardsWiki, but almost certainly including the WikiPedia MediaWiki) have a need to answer this question, or are motivated to find an answer to it.

So, … I was wondering, … BayleShanks

Do you have any particularly cool attendant news, perhaps, or ideas, that inspire this question? HmmmMMMM?!? ;) :D :) Do you?!?

A wiki engine built around pre-rendered pages can provide the full wiki experience using almost entirely static files.

Writing a whole engine with these ideas in mind could provide an extremely effective solution, including static RecentChanges and history files. Editing is, obviously, a much greater problem: locks, conflict resolution, keeping track of the most recent version, the processing power needed for saving pages, not to mention anti-spam filters; it all complicates things somewhat.
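
As a rough sketch of the read side only: the engine could write a rendered HTML copy of a page every time it is saved, so ordinary page views never touch the wiki script at all, and only edits (with their locks, merging, and spam filtering) go through the engine. The paths, the store object, and the render_html helper below are hypothetical.

    import os

    from mywiki.markup import render_html  # hypothetical: wiki markup -> full HTML page

    STATIC_ROOT = "/var/www/static"  # served directly by the web server

    def save_page(name, wiki_text, store):
        """Save the source text, then refresh the static copy used for reads."""
        store.write(name, wiki_text)  # normal wiki storage and history
        html = render_html(name, wiki_text)
        tmp_path = os.path.join(STATIC_ROOT, name + ".html.tmp")
        final_path = os.path.join(STATIC_ROOT, name + ".html")
        with open(tmp_path, "w") as f:
            f.write(html)
        # Atomic swap, so readers never see a half-written page.
        os.replace(tmp_path, final_path)
        # RecentChanges and page histories could be regenerated the same way.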

Good points, Radomir. There is already OddMuse:Static_Hybrid_Module, for instance.

I have experienced more edit conflicts in Community than in Wikipedia :)

Also, on the other issue, about “it doesn’t work out socially”: I don’t agree that it doesn’t work socially. I just think it works differently socially than it does in smaller groups (EcosystemOfNetworks).

“What features of Wikipedia would we do differently if we did it over again?” is a question I want to come back and answer after thinking about it some more: not just from a technical serving standpoint, but also from a wiki engine design perspective, a WikiPatterns perspective, and so on.
