<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta content="text/html; charset=UTF-8" http-equiv="Content-Type">
</head>
<body bgcolor="#ffffff" text="#000000">
<br>
<blockquote cite="mid:2MA.1VZUg.2d8ut0h%7BEqp.1ICM8Z@seznam.cz"
type="cite">Hello Marco, ... I am involved in the translation into
Czech, and I intend to use the same principles for assembling one
portable manual (unfortunately I am only about 10% of the way
through the translation... because of time, time, time).</blockquote>
<br>
Provided the translated pages are there, the same PDF could easily
be produced from the Czech-language version of the wiki manual...<br>
<br>
<blockquote cite="mid:2MA.1VZUg.2d8ut0h%7BEqp.1ICM8Z@seznam.cz"
type="cite">
<div>My first idea was to build a local wiki on my notebook. It is
technically possible: you can set up a MediaWiki server, Apache,
MySQL, Naiad Scheme, some scripts... but it is a rather
complicated way, and as somebody said below, "mirroring the
wiki server is not the answer".</div>
</blockquote>
It could help: you could mirror the wiki. It would be one big
initial download, and then only a few changes each day... if you
need an offline wiki, that is what I would do. (Hmm, I could do the
same for the PDF manual conversion... it would be faster, and
lighter on the Blender wiki server, than downloading every single
page once a month...)<br>
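A minimal sketch of such a mirror, assuming wget is available (the wiki URL and the local path here are placeholders, not necessarily the real ones):

```shell
# One-shot plus incremental mirror of the wiki manual (URL and path
# are assumptions). --mirror implies recursion and timestamping, so
# the first run downloads everything and later runs re-fetch only
# pages whose server timestamp has changed.
WIKI_URL="http://wiki.blender.org/index.php/Doc:Manual"
MIRROR_DIR="$HOME/blender-wiki-mirror"

wget --mirror --convert-links --page-requisites --no-parent \
     --directory-prefix="$MIRROR_DIR" "$WIKI_URL"
```

Run it from cron once a day and the daily traffic stays small.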
<br>
<blockquote cite="mid:2MA.1VZUg.2d8ut0h%7BEqp.1ICM8Z@seznam.cz"
type="cite">
<div>I asked about the possibility of installing some PDF-generator
scripts on the MediaWiki server, without getting an answer. Then I
tried to set it up locally, and I have to say it is not a simple
method. It is very user friendly (you can click the pages you want
in the PDF, sort them, and so on), but the engine (a MediaWiki
module based on additional server-side scripts) drains the wiki
server's resources... So I decided to go with the method of a
simple file and conversion.</div>
</blockquote>
<br>
For other reasons I learned the MediaWiki way of creating books,
i.e. collections of wiki pages, and then converting them to PDF,
ODT, or other formats... It works, but it requires an mwserve server
(<a class="moz-txt-link-freetext" href="http://mwlib.readthedocs.org/en/latest/index.html">http://mwlib.readthedocs.org/en/latest/index.html</a>), and I find it
really heavy for the number of pages and images in the Blender
manual... It's quite good for relatively small documents.<br>
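For completeness, the command-line side of that toolchain looks roughly like this. This is a sketch, assuming mwlib and its ReportLab writer are installed; the base URL and page titles are placeholders:

```shell
# Sketch of the mwlib route (assumes `pip install mwlib mwlib.rl`).
# mw-zip fetches the listed pages into one portable collection file;
# mw-render converts that collection to PDF with the ReportLab writer.
mw-zip -c http://wiki.blender.org/ -o manual.zip "Doc:Manual" "Doc:Tutorials"
mw-render -c manual.zip -o manual.pdf -w rl
```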
<br>
<blockquote cite="mid:2MA.1VZUg.2d8ut0h%7BEqp.1ICM8Z@seznam.cz"
type="cite">
<div>So please could you explain how you got clean pages from the
wiki manual? Did you use some wget or html-to-pdf commands on your
local machine? I think the best way is a very simple method that
gets the texts, pictures, and links into one local directory. No
menus, no special effects, no dynamic content, and so on. <br>
</div>
</blockquote>
Yep, sort of: see my previous post here...<br>
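For the archives, that kind of "simple file and conversion" pipeline can be sketched like this. The printable=yes parameter is standard MediaWiki; the URL, the output glob, and the htmldoc step are my assumptions, not necessarily what was actually used:

```shell
# 1. Fetch a clean page plus its images into one local directory;
#    ?printable=yes asks MediaWiki for a version without menus.
wget --page-requisites --convert-links --no-parent \
     --directory-prefix=manual \
     "http://wiki.blender.org/index.php?title=Doc:Manual&printable=yes"

# 2. Convert the local HTML to PDF; htmldoc is one option,
#    wkhtmltopdf or a LibreOffice batch run would work too.
#    (The input glob is a placeholder for wherever wget saved the page.)
htmldoc --webpage -f manual.pdf manual/wiki.blender.org/*.html
```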
<br>
Marco
<BR>
<BLOCKQUOTE> -------------------
[Pursuant to and for the purposes of the privacy protection law
(L. 196/2003), this e-mail is intended solely for the persons named
above, and the information it contains is to be considered strictly
confidential. It is forbidden to read, copy, use, or disseminate
the contents of this e-mail without authorization. If you have
received this message in error, please return it to the sender.
Thank you]
</BLOCKQUOTE>
</body>
</html>