PublicRefineryProcess

Historically, this page was about how documents on ThePublicWeb could conceivably go through a “Public Refinery Process.”

LionKimbro had some particular idea about how you would have ScratchWiki, where people sketch out ideas, and then a CommunalWiki, where the ideas are collected, organized, and safeguarded by a Community. And then, at the ultimate stage, you would have a ManagedWiki, or a ModeratedWiki?, where things are mostly static. These are for things that are well trusted, have TrustedLinkLanguage, and so on.

Now, this is a rather nice idea about how things could go, but things did not turn out quite like that. For a number of reasons, they just don’t work out that way.

But let us turn to the deeper questions.

And these are deep questions for the PublicWeb?, for the GlobalBrain, in the sense of ThePowerOfQuestions. Surely, these questions are connected with the ZeitGeist.

Some other questions:

Processes that Refine Documents

What are some processes that refine documents?

MobileContent - This idea refers to how CopyLeft documents can go from place to place to place over the Internet, being adapted and refined at each step.

ScratchToCommunalToManaged - This idea refers to intentionally making ScratchWiki, CommunalWiki, and ManagedWiki side by side, chaining documents from one end (scratch) to the other (managed).

Wiki & the Public Refinery Process

It seems natural that wiki would be the starting point for most documents in a “public refinery process.”

That’s because WikiIsDocumentBased, and because it is SocialSoftware.

When we reach TheEndOfWiki?, where mass-editing is built into most mediums (TheMedium), it makes sense that documents will come out of everywhere. For now, though, it’s mostly just wiki.

Underlying Assumptions

Refinery means that you start with something rough first, and then turn it into something beautiful. You start with coal, and you come out with diamonds. You start with rough rocks, and out comes polished stone.

Polished stone doesn’t come out of thin air, and diamonds aren’t made without pre-existing coal.

People can and will improve documents.

However, we do not assume that there is a simple “ladder of quality,” that you can only ascend or descend. There are “side” paths, fitting documents for particular purposes, fitting documents for certain recipients. What one person finds clear, another finds completely unreadable.

Licensing

Document license affects the refinements that are possible.

One philosophy of licensing is to use a WikiLegal:SlopeOfCopyLeftRestrictions?. When documents are ugly and unrefined, you can just leave them in the PublicDomain. As they become higher quality, you may want to perform a WikiLegal:RestrictiveUpgrade?.

Some major points:

It seems to me that wiki with fewer DegreesOfEditorialControl should have less stringent licenses, closer to the PublicDomain. That is, a WikiLegal:SlopeOfCopyLeftRestrictions?.

As content is refined, apply stricter and stricter CopyLeft (WikiLegal:RestrictiveUpgrade?).

The basic reasoning is that nice work is more likely to be used without contributing back to the community. By performing a restrictive upgrade, you ensure that the work is used on community terms. The downside is the WikiLegal:CopyLeftInteroperabilityProblem? – you can’t use the text with other related texts that have a slightly different license.

See Also

CategoryReworking

Discussion

Evidence

Refinery – Each individual wiki page tends to start out as a default stub, then become a rough draft, then (hopefully) something more polished.

I’ve seen at least 3 ways of doing this; have you seen any others?

I suppose it is theoretically possible to

  • have 2 separate wikis. When an article gets “good enough” on the scratch wiki, move it to the serious wiki (quality wiki?).

But I haven’t seen that happen.

DavidCary

Beyond plain old reworking of wiki pages, WikiFeaturesWiki? may be evidence of a PublicRefineryProcess.

Then again, it could be interpreted as evidence to the contrary, as well..!

Experiments

We could look for Communal wiki (and start with our own), and attach to them a Scratch wiki. This plan is described in ParallelWiki.

Or, conversely, we could look for Scratch wiki, and attach Communal wiki to them.

You can always use my test wikis as a scratch-pad, but I don’t think that is what you intended. The scratch-pad wiki still has a topic and it is still linked to a MotherWiki, right?

Isn’t that the entire talk-pages idea in a different guise? The reason we don’t want those is that they make reworking harder. But then again, this wiki has accumulated so many ThreadMode features that it feels as if reworking is just as hard. ReworkingProblems are hard – period.

So why not use the MotherWiki as your scratch-pad?

Insufficient technology :) – ChrisPurcell

laugh
Exactly. ;) Insufficient technology.

No – a ScratchWiki doesn’t really need a MotherWiki. I mean, C2 is a ScratchWiki, as far as I can tell.

No, it’s not really talk-pages, because you don’t have to have a “real” page that you’re talking about.

It’s just… A place where the community norm is “anything goes.” People can build things like indexes and roadmaps and stuff into it, but it’s really quite a bit of a jungle. Just think: “C2.”

It’s not “all thread mode” – it can produce documents as well.

I am starting to believe that the proper way to make wiki is to start with a ScratchWiki, and then move up to several CommunalWiki, and then those spawn HardWiki. I want to write about this idea – ScratchWikiFirst – but I’ve got other things to read and catch up on here. Also, I have some other ideas I need to get out.

(See, this would be perfect for the CommunityWiki’s ScratchWiki. That way, I wouldn’t have to worry about cluttering RecentChanges, or niceness, or integration, or whatever. We could just write, and we could see what we liked and didn’t like, and then move what we liked into here: CommunityWiki. As it is, I feel a little guilty about writing so much here.)

The term "Parallel Wiki"

I feel that the term “Parallel” is not a good one to describe a Scratch wiki that is connected to a Communal wiki for purposes of refinery.

It seems to me that many things could be “Parallel” wiki: CommunityWiki and FermentWiki could be viewed as “Parallel.” (Wiki related by the same subject.) The page Meatball:ParallelWiki describes MetaWikiPedia as a parallel wiki to WikiPedia. (One wiki is SpaceForMetaDiscussion.) It could also be talking about different languages.

Then I suggest cutting room as the analogy to draw. That would make it a CuttingRoomWiki?

Update: 2008-02-15

Well hot damn, it seems like it is happening, sort of.

VeroPedia is basically what I was considering as a ManagedWiki. WikiPedia would then be something of a CommunalWiki. :)

re: ..."Curation"

It seems to me that one major difference between “Content” and “Curation” stems from the fact that a curator adds value in the form of “attribution” to a known source, by collecting material in a manner similar to that of an anthologist. This is especially valuable given the vast amount of information on the web that has “questionable” value. In effect, the value of the original content is enhanced by placing it in a more “defined” context that can be thought of as effectively adding “MetaData” to the content.

This parallels a real process in which one values information from known sources more than from unknown ones, simply because of a known “bias” (not necessarily a negative factor) that each source develops and displays over time.

Yes, in a time when information is plentiful, making the right choices and providing valuable recommendations is of growing importance. One negative aspect is how Steve Jobs called the Apple app store a “curated platform” [1] – and thereby justified limiting access to some features and a marketplace infrastructure. Not quite the same thing.

In the physical world, we can distinguish “curated places” from “non-curated places” through cultural conventions. Galleries and museums are curated; graffiti and ads are usually not.

Also note how curation is a positive selection process: picking things from a pool because of some perceived benefit. There are other forces such as pornography laws or self-imposed principles that function as a negative selection process: removing things from a pool (such as inappropriate ads) because of a perceived harm.

As far as I’m concerned, Apple is performing a negative selection process but tries to label it as a positive selection process using marketing speak.

I appreciate the points you’ve added, Alex.

In fact, I can visualize an implementation of your “negative & positive selection” process based on filtering content on the basis of Tags.
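A minimal sketch, in Python, of what such tag-based filtering might look like – the tag names, sample items, and the curate function are invented here for illustration, not taken from any actual wiki engine:

    # Hypothetical sketch of tag-based curation: positive selection keeps items
    # whose tags promise some benefit; negative selection drops items whose tags
    # signal harm.
    def curate(items, pick_tags, drop_tags):
        """Return items carrying at least one pick tag and no drop tag."""
        picked = []
        for item in items:
            tags = set(item["tags"])
            if tags & set(drop_tags):    # negative selection: remove on perceived harm
                continue
            if tags & set(pick_tags):    # positive selection: keep on perceived benefit
                picked.append(item)
        return picked

    pool = [
        {"title": "Refinery essay", "tags": ["document", "reviewed"]},
        {"title": "Scratch notes",  "tags": ["thread-mode"]},
        {"title": "Spam ad",        "tags": ["ad", "inappropriate"]},
    ]

    print(curate(pool, pick_tags=["reviewed"], drop_tags=["inappropriate"]))
    # -> [{'title': 'Refinery essay', 'tags': ['document', 'reviewed']}]

The same pool could be filtered with different pick and drop tags for different recipients, which fits the “side paths” point above.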

Additionally, it seems to me that “curation” depends somewhat on the judgment or opinion of the curator. If so, then it may well be necessary to capture this View using the inference engines that are the foundation of ExpertSystems?, as opposed to the more common procedural programming techniques.
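As a rough sketch of that distinction, and only as an assumption about how such a capture might look: the curator’s View could be recorded as declarative rules (data) that a small, generic engine applies, rather than being hard-coded procedurally. The rule and tag names below are hypothetical:

    # Hypothetical sketch: the curator's judgment captured as rules, not procedures.
    RULES = [
        {"if_tag": "reviewed",      "then": "promote"},  # this curator's positive bias
        {"if_tag": "inappropriate", "then": "reject"},   # this curator's negative bias
    ]

    def apply_rules(item, rules):
        """Return the first action whose condition matches the item's tags."""
        for rule in rules:
            if rule["if_tag"] in item["tags"]:
                return rule["then"]
        return "hold"  # the curator has recorded no opinion on such items

    print(apply_rules({"title": "Refinery essay", "tags": ["reviewed"]}, RULES))  # promote

Changing the curator, or the curator’s bias, then means swapping the rule set rather than rewriting the program.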
