[linux-audio-user] Wiki (Re: which graphics card?)

Juhana Sadeharju kouhia at nic.funet.fi
Tue Jun 27 16:51:41 EDT 2006

>From: Stephen Hassard <steve at hassard.net>
>Definitely take a look at mediawiki: http://www.mediawiki.org/
[ ... ]
>From: Lee Revell <rlrevell at joe-job.com>
>I like the idea of making a master Linux audio page at Wikipedia, and
>just put a brief description and links to all the community resources.
>I think Google weights wikipedia pages pretty well so it would be a nice
>place to point newbies.

Whatever wiki is used, please follow this guide:

1. Keep the current wiki pages separated from the edit pages, version
pages, etc. For example, the MediaWiki at "www.uesp.net/wiki/" has its
control pages at "www.uesp.net/w/". Similarly, Blender's MediaWiki at
"mediawiki.blender.org/index.php/Main_Page" has its control pages under
"mediawiki.blender.org/" as files such as
"index.php?title=Main_Page&action=edit".

What does this mean? That wget can be used to download only the
current wiki pages for offline reading, using the "-np" (no-parent)
option. Yes, people need offline copies.

Many wiki sites keep both the current wiki pages and the control pages
in the same directory. Then wget may download multiple GBs(!!) of
control pages instead of 200 MB of wiki pages. An example of a
wiki where everything is in the same directory:
 http://iua-share.upf.es/wikis/aes/  (Linux audio presentation at AES)
While the actual documentation has only a handful of pages, wget
downloaded 18850 files.
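As a sketch, a download restricted to the current pages could look like
this (using the uesp.net example above; the exact flags are a matter of
taste):

```shell
# Mirror only the pages below /wiki/.  The --no-parent ("-np")
# option stops wget from ascending out of /wiki/, so the control
# pages under the sibling /w/ directory are never fetched.
wget --recursive --no-parent --convert-links \
     http://www.uesp.net/wiki/
```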

2. If the pages are added to a larger wiki system such as Wikipedia,
put all of them in one subdirectory so that it alone can be downloaded
with wget.
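A hypothetical sketch of such a restricted download (the host name and
the "LinuxAudio/" subtree are invented for illustration):

```shell
# With all pages kept under one subtree, "-np" alone confines the
# recursion to it; "--include-directories" makes the restriction
# explicit.  Host and path are hypothetical.
wget --recursive --no-parent \
     --include-directories=/wiki/LinuxAudio \
     http://wiki.example.org/wiki/LinuxAudio/
```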

3. Check whether an offline copy of the pages can be generated.
Wikipedia provides downloadable dump files containing the whole wiki.
So does "wiki.beyondunreal.com" -- conveniently, because they also
forbid the use of downloaders such as wget.

As you may have guessed, I have downloaded plenty of wikis for offline
reading. The great advantage is that I get "meta" access to the pages
with "find", "grep" and other commands. The directory
"www.uesp.net/wiki/" mentioned above has 4825 pages -- information is
easily lost in such complexity. Many regular webpages benefit from
downloading as well: e.g., "find | grep zip" lists all zip files. No
time is wasted in finding them.
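The kind of "meta" access described above can be demonstrated on a toy
mirror (the file names here are invented stand-ins for a real download):

```shell
# Build a tiny stand-in for a downloaded wiki mirror.
mkdir -p mirror/wiki
touch mirror/wiki/Main_Page.html
touch mirror/wiki/samples.zip mirror/wiki/patches.zip

# List every zip file in the mirror, as with "find | grep zip":
find mirror -name '*.zip'

# Count the mirrored files:
find mirror -type f | wc -l
```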

Forums are another trap for offline readers. While mailman archives
are easy to download, because gzipped archives are created
automatically, forums have no such feature. Some forums offer a
low-fidelity version, but I found it better to write special download
software.
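Fetching a mailman archive can be sketched like this; the list URL is
hypothetical, and the usual pipermail layout (one gzipped text file per
month) is assumed:

```shell
# Grab only the monthly gzipped archives, skipping the HTML
# thread views.  "--accept" ("-A") filters by file name suffix.
wget --recursive --no-parent --accept '*.txt.gz' \
     http://lists.example.org/pipermail/example-list/
```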


More information about the Linux-audio-user mailing list