DistributedEditing

Soon this page will be merged into DistributingWiki. This page is entirely about distributed wiki editing, not any other kind of distributed editing, right?

When you have a bunch of wikis sharing pages, you can edit a page X on any of them, and your change will propagate through the network until every instance of page X has been updated. Here, “network” means all the wikis sharing page X; it might be big or small.

This requires MergingAutomatically?, as you probably don’t want to use a two-phase commit mechanism to make sure that your change actually spreads to all wikis in the network. Using CVS or a similar version control system as your back-end is a good start. CommunityRepository proposes using the “darcs” distributed version control system.
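
To make “merging automatically” concrete: for the easy cases it amounts to a standard three-way merge against the common ancestor, which is essentially what a version-control back-end does for you. A minimal sketch, assuming GNU diff3 is installed; the file names are invented for illustration:

#!/usr/bin/perl
# Merge two divergent copies of a page against their common ancestor,
# the way a version-control back-end would.  File names are invented.
use strict;
use warnings;

my ($mine, $ancestor, $theirs) = ('PageX.mine', 'PageX.ancestor', 'PageX.theirs');
my $merged = `diff3 -m $mine $ancestor $theirs`;

if ($? == 0) {
    print $merged;    # clean merge -- this is the version to propagate
} else {
    warn "diff3 left conflict markers; page X needs a human\n";
}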

Note that in an InterWiki framework, the individual wikis need not necessarily share all their pages.

Why It Would Be Cool

What do we need a distributed wiki for?

Good question.

A few weeks ago, a server hosting some of my own (static) web pages went offline. Fortunately, I still have – on my local hard-drive – the master copy of all the files. But it’s a bit of a hassle periodically emailing the sysadmin, periodically checking to see if it’s still online, and then uploading the files when it comes back.

A few months ago, the server hosting one of my favorite wikis (the VisualLanguage wiki) went offline. It came back online a week or so later, but some of the pages were damaged, and a few of the pages (and illustrations) were lost.

About a year ago, a hard drive failed in a computer owned by a friend of mine. Even though most of the really critical information was backed up, it was a big hassle plugging in a fresh hard drive and restoring things to the point where he could recover his “bookmarks” listing his favorite web pages.

A few years ago, I got an entirely new desktop box. It was a big hassle moving all “my” files to the new box.

A few years ago, a hard drive failed in a computer owned by a different friend. All the information was lost.

Somehow I have it in my head that if this information were in a “distributed wiki”, things would have been a lot better in all these situations.

– (copied from DavidCary in discussion below)

Implementations

The approach I’ve taken (this time, I’ve done a couple of implementations) is to use a CVS server to sync the content on different servers – allowing editing of the same content on different machines.

The analogy I’d use: where a standalone wiki often acts in scenarios where people might use an email list, my approach to a distributed wiki could be used where people might normally use newsgroups.

In the past I’ve also set up read-only mirroring of content from one wiki into a local server using rsync, automatically changing the edit links to point back to the original wiki server. That way you get locality of content whilst keeping the community editing in one location. Again, it’s worth noting that this approach copies all content between servers in order to simplify the system.
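
A rough sketch of the link-rewriting step after an rsync pass (the mirror directory, the URLs, and the action=edit pattern are assumptions for illustration, not the actual setup described above):

#!/usr/bin/perl
# Rewrite edit links in the mirrored HTML so they point back to the
# wiki that owns the content.  Run after each rsync pass.
use strict;
use warnings;
use File::Find;

my $mirror_dir  = '/var/www/wiki-mirror';                     # hypothetical
my $local_base  = 'http://mirror.example.org/cgi-bin/wiki';   # hypothetical
my $remote_base = 'http://original.example.org/cgi-bin/wiki'; # hypothetical

find(sub {
    return unless -f && /\.html$/;
    open my $in, '<', $_ or return;
    my $html = do { local $/; <$in> };
    close $in;

    # edit links stay with the wiki where the community does its editing
    my $changed = $html =~ s{\Q$local_base\E(\?action=edit[^"']*)}{$remote_base$1}g;

    if ($changed) {
        open my $out, '>', $_ or return;
        print {$out} $html;
        close $out;
    }
}, $mirror_dir);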

The cultural change was interesting – adding documents to the wiki became an accepted manner of exchanging content since the content would be accessible quickly to all. – MichaelSparks?

Clever. Pages from the distant wiki are merely cached (read-only) on the local wiki. Rather than make the edit links point directly back to the original wiki server, perhaps it would be better to make them what appears to be a local edit URI, so that (if clicked) the local wiki (1) flushes that page from its cache, before (2) redirecting the user to the remote edit URI.

It is clear that this is merely a local cache and the distant wiki always holds the “master version” of a page. That avoids any forking. Does that avoid all the CopyrightTraps mentioned on DistributingWiki?

DavidCary
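
A minimal sketch of such a local “edit” action, written as an Oddmuse-style CGI script; the cache directory and both URLs are invented for illustration:

#!/usr/bin/perl
# Local edit URI for the caching scheme above:
# (1) flush the cached copy, (2) redirect to the remote edit URI.
use strict;
use warnings;
use CGI qw(param redirect);

my $cache_dir   = '/var/cache/wiki-mirror';                    # hypothetical
my $remote_wiki = 'http://original.example.org/cgi-bin/wiki';  # hypothetical

my $page = param('id') || 'HomePage';
$page =~ s/[^A-Za-z0-9_]//g;             # keep the cache file name safe

unlink "$cache_dir/$page.html";          # (1) drop the stale cached copy

# (2) hand the user over to the wiki holding the master version
print redirect("$remote_wiki?action=edit;id=$page");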

Discussion

See WikiFeatures:FailSafeWiki.

I’m thinking about such a system again. I think what I basically need is a way of synchronizing a wiki with a directory of plain text files. Once this is done, we only need two things:

  1. A cron job that updates the directory of plain text files from a version control repository (and the beautiful thing is that this will work with any backend)
  2. A way of synchronizing a wiki with the directory of flat files (Oddmuse would allow an easy way of moving the text around using wget/curl and the raw interface; see the sketch below)
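
A sketch of the second step, pulling flat files over the raw interface with LWP instead of wget/curl; the wiki URL is made up, and the action=index / raw=1 parameters should be double-checked against your Oddmuse installation:

#!/usr/bin/perl
# Pull every page of an Oddmuse wiki into a directory of flat text files.
use strict;
use warnings;
use LWP::Simple qw(get mirror);

my $wiki = 'http://www.example.org/cgi-bin/wiki';   # hypothetical
my $dir  = 'pages';
mkdir $dir unless -d $dir;

# the index action with raw=1 should return one page name per line
my $index = get("$wiki?action=index;raw=1")
    or die "could not fetch the page index\n";

for my $page (split /\n/, $index) {
    next unless $page =~ /\S/;
    # raw=1 yields the plain wiki text, which is what we want on disk
    mirror("$wiki?action=browse;id=$page;raw=1", "$dir/$page");
}

Pushing locally changed files back would go through the normal edit form in the same way; that is roughly what the mvs client below does for MediaWiki.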

I have a script that uploads a directory of text files to a wiki (but not the reverse; it doesn’t download changes from the wiki or do any merging). But mvs seems more advanced:

http://search.cpan.org/~markj/WWW-Mediawiki-Client-0.19/bin/mvs

But why duplicate the functionality of the version control software in three places? Just use some sort of decentralized version control software (like Arch, although maybe there’s others) as the wiki backend.

Well, in my case I want to plug the thing into Oddmuse, and I want this to be something you can decide to do without the founder of the Oddmuse wiki knowing about it. Truly decentralized in the sense that anybody can “hook-up” unless they are being banned for some reason. And I don’t want to switch Oddmuse itself to such a system because it adds installation requirements, and one of the Oddmuse design principles is that installation should be totally trivial.

Good points, I agree.

Designing a system so that the original site doesn’t have to support it turns out to be ugly – you basically have to either download every page in raw format, or extend the wiki engine so that you can at least request the HEAD for every page, giving you some info about the last-changed timestamp for that page only, etc. That would save some bandwidth, assuming not many pages have changed, but it would still require at least one request per page. The alternative is to design two commands – one to import the entire wiki, and one to update according to info provided on RecentChanges. This seems tricky as well.
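
For what it’s worth, the per-page HEAD variant is only a few lines with LWP – the catch is that it still costs one request per page, and it only helps if the wiki engine actually sends a meaningful Last-Modified header for browse URLs, which is exactly the part that may need server-side support. The URL below is hypothetical:

#!/usr/bin/perl
# One HEAD request per page: cheap on bandwidth, but one round trip per
# page, and useless unless the server sends Last-Modified.
use strict;
use warnings;
use LWP::UserAgent;

my $wiki = 'http://www.example.org/cgi-bin/wiki';   # hypothetical
my $ua   = LWP::UserAgent->new;

for my $page (@ARGV) {
    my $res = $ua->head("$wiki?id=$page");
    my $mod = $res->header('Last-Modified') || 'no Last-Modified header';
    printf "%-30s %s\n", $page, $mod;
}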

So now I’m thinking about writing the entire thing using a simple protocol: Get the list of all pages with timestamps from upstream, and figure out what to do. If we store the timestamp of the last sync operation, we can submit changes including the timestamp of our common ancestor. That would be great, because we get MergingAutomatically?. The drawback would be that you need a local Oddmuse wiki, too.

Alternatively, I could abandon the position that the original server doesn’t have to know about other people branching. If the server has to know whether other people want to branch or not, we can go all the way: I could write an extension that replaces the filesystem layer with a version control system. Back to what you suggested. Hm…

I think two commands, one to import the wiki, one to update based on RecentChanges, is the way to go. Because this depends only on primitives that practically every wiki supports. Therefore, the system could be adapted to use WikiGateway primitives for the actual communication with the upstream wiki, and then it will work not just for OddMuse but for whatever wiki engines are supported by WikiGateway.

I don’t quite understand the latter two paragraphs. Do you mean that you would add a new action to OddMuse, “give me a list of all pages, and for each page tell me its timestamp”?

And why would a local OddMuse be needed?

Also, if I understand correctly, then why is this less tricky than having an “import wiki” command and an “update based on RecentChanges” command? In both cases, you are identifying a list of pages which were changed since the last sync, merging the changes, and uploading the results. The only difference is whether you get the timestamps from RecentChanges or from the new “list pages and timestamps” command.

If you are worried about parsing OddMuse RecentChanges or ModWiki feed, Wiki::Gateway already does this:

$ perl -MWiki::Gateway
use Data::Dumper;
print Dumper(Wiki::Gateway::getRecentChanges('http://www.emacswiki.org/cw', 'oddmuse1', 'april 9, 2005'));
^D
$VAR1 = {
          'lastModified' => '2005-04-09T01:05:01+00:00',
          'importance' => 'major',
          'comment' => 'reply; i like "import wiki" and "update from RC"; but i\'m confused about your other idea',
          'version' => '21',
          'name' => 'DistributedEditing'
        };
$VAR2 = {
          'lastModified' => '2005-04-09T00:47:47+00:00',
          'importance' => 'major',
          'comment' => 'InterWikiWorkshopProposal',
          'version' => '1',
          'name' => '2005-04-08'
        };

(btw, this is the current CVS version of Wiki::Gateway, not released yet; but it’s pretty much ready to go, I could release it soon if needed)

Or maybe you meant the new action to be, “give me a list of all pages which have been modified since this time”; but this is what RecentChanges does, right?

The benefit of a separate action that gives me all the data per page is that I am no longer forced to remember the state of our interaction. Should I refresh? Or update from RC? If I’ve waited too long, I’ll have to refresh anyway… Too complicated.

You mean so that the client doesn’t have to remember the last time they synchronized?

Hm, given that I still need to remember the timestamp of the last change for every single page, I guess we could figure out what the last sync point was by just taking the latest of these timestamps… Heh, I think you’re right.

Yeah – and if you’re going to do that, you may as well just save the last-sync date. Because unless you want to upload every page every time, the client has to save some sort of state (besides the textual contents of the pages, I mean). So why not just save everything you need?

So it’s not that complicated:

  • On “checkout”, download all pages. Create a special file in which to save the current date.
  • User edits various pages over some time period
  • On “sync”:
    • look in your special file to see when the last-sync date was
    • Go through every page, and make a list of those pages which have been locally modified since the sync date (you can just use the last modified file attribute if each page is a text file)
    • Now ask the upstream wiki server for RecentChanges since that date
    • Take the intersection of these two lists; for each page in the intersection, merge conflicts, or flag as an edit conflict (and maybe copy the user’s version to a special folder, to make room for the updated upstream version)
    • Now, upload every locally changed file to the upstream wiki (except for the unresolvable edit conflicts)
    • Next, download every page that the wiki told you about in RecentChanges (to save time, you could skip d/ling those that you just uploaded)
    • Now you are re-sync’d (i.e. the text files in your local copy folder exactly match the wiki). Update the last-sync date in your special file.

In pseudo-code:

on checkout {

foreach page in allPages:
  download page

save current date to ".last-sync"

}

on sync {
#
# 1) create locallyModifiedList
#
lastSync = loadFromFile(".last-sync")
foreach page in localdir:
  if page.lastModified > lastSync:
    push page onto locallyModifiedList

#
# 2) get recentChangesList
#
recentChangesList = getRecentChangesSince(lastSync)


#
# 3) do merges
#
toBeMerged = intersection(locallyModifiedList, recentChangesList)

foreach page in toBeMerged:
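   # note: a clean three-way merge also needs the common ancestor --
   # e.g. a pristine copy of each page saved away at checkout/sync time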
   success = tryToMergePage(loadFromFile(page), getFromWiki(page))
   if not success:
     cp "page" "conflicts/page"
     push page onto editConflictsList


#
# 4) do uploads
#
toBeUploadedList = locallyModifiedList - editConflictsList

foreach page in toBeUploadedList:
  putPageToWiki(page)


#
# 5) do downloads
#
toBeDownloadedList = recentChangesList - toBeUploadedList

foreach page in toBeDownloadedList:
  download page

#
# 6) now we're in sync; save sync date
#
save current date to ".last-sync"
}

Just use some sort of decentralized version control software (like Arch, although maybe there’s others) as the wiki backend.

You make that sound so simple.

I would like to build a fault-tolerant wiki.

Currently I’m considering building it on top of "SVK - Distributed Version Control". Or would “Arch” be better?

I sometimes wonder about the need for distributed wikis, or distributed editing of a wiki: What do we need it for? You’re saying you’d like to build a “fault-tolerant” wiki. But: What kind of faults are you protecting yourself against? What kind of problems do you currently have that such a wiki would help you solve?

I guess that’s why I stopped working on distributed anything.

I can see the need for two things:

  1. Offline editing of pages and synchronizing when you’re back online. In this scenario, the offline user doesn’t need to have a complete wiki including a local web server. It would be nice, but basically a list of flat files is good enough. This is what my solution using SVN does.
  2. Distributing load amongst various servers. The only wiki that has earned the right to require this is WikiPedia, I think, and they have chosen a different route: They are hosted on server farms, with caching proxies, etc. All of it old-school technology for scaling websites and databases, nothing MediaWiki specific. That’s why I don’t think there really is a need to innovate in this area.

That is, unless there’s some kind of PeerToPeer vision I have been missing.

What do we need a distributed wiki for?

Good question.

A few weeks ago, a server hosting some of my own (static) web pages went offline. Fortunately, I still have – on my local hard-drive – the master copy of all the files. But it’s a bit of a hassle periodically emailing the sysadmin, periodically checking to see if it’s still online, and then uploading the files when it comes back.

A few months ago, the server hosting one of my favorite wikis (the VisualLanguage wiki) went offline. It came back online a week or so later, but some of the pages were damaged, and a few of the pages (and illustrations) were lost.

About a year ago, a hard drive failed in a computer owned by a friend of mine. Even though most of the really critical information was backed up, it was a big hassle plugging in a fresh hard drive and restoring things to the point where he could recover his “bookmarks” listing his favorite web pages.

A few years ago, I got an entirely new desktop box. It was a big hassle moving all “my” files to the new box.

A few years ago, a hard drive failed in a computer owned by a different friend. All the information was lost.

Somehow I have it in my head that if this information were in a “distributed wiki”, things would have been a lot better in all these situations.

Nice.

Alex?

;)

Translation Issues

I’m currently doing some translations into French, and I was just thinking about these synchronization problems, but at a human level, for the “end processing”. For example, when I’m done with a given page, it would be nice to put some sort of flag on it that points to the page of reference and records its current revision number. Then if someone comes and reads the translated page later on, she would be warned in case there is a more recent revision of that page of reference. It would be up to the reader to check what the differences really are, and whether there’s a need to edit the dependent page. I thought about “transcluding” at the end of the translated page the revision history filtered by date, and writing the revision number at the time of translation by hand. But I guess that would not be a very “nice” solution, I’m not sure that my user rights allow me to do it, and I don’t know anyway how – or whether – it’s possible to filter the history page.

Another example: I want to keep some information about WikiNodes up to date for translation purposes. I know that the page of reference is http://wikinodes.wiki.taoriver.net/moin.cgi/WikiNode. But, say, I’ve seen here another place where the subject is discussed, so I need to watch both. Then again, it would be nice to be alerted within my page of the modifications. I wonder if this could be a possible functionality? – SebastienSauteur

Such a system exists for Oddmuse wikis – but for local pages only. See Oddmuse:Translation Extension. Since it is an extension, I would have to install it. Right now only the Oddmuse manual uses it, e.g. Oddmuse:網站地圖 links back to Oddmuse:SiteMap.

Thanks, that’s exactly what I was thinking about :-) I’ll try it.


See also DistributedWiki.

CategoryInterWiki?
