[Members] wiki.xmpp.org data recovery
Guus der Kinderen
guus.der.kinderen at gmail.com
Thu Jun 22 08:06:05 UTC 2017
Oh, that's actually handy. I'm not much of a bash scripter, but by
combining xidel (to select the part of the HTML that is the article
content) and pandoc (for conversion to the Mediawiki format), I'm getting
something that is pretty close. Example:
$ xidel --html Edwin_Mons_Application_2011.html --css "#mw-content-text" |
pandoc --from html --to mediawiki
Can someone improve on that?
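One possible improvement is wrapping the same pipeline in a small batch loop so every cached page gets converted in one go. This is only a sketch: the `#mw-content-text` selector and the two tools are the ones from the command above, while the file layout (a directory of cached `*.html` pages in the current directory, with `xidel` and `pandoc` on the PATH) is my assumption:

```shell
#!/usr/bin/env bash
# Sketch: batch-convert cached wiki pages to MediaWiki markup.
# Assumes xidel and pandoc are installed and the cached pages are
# *.html files in the current directory.
shopt -s nullglob          # an empty directory gives an empty loop, not a literal "*.html"
for f in *.html; do
  out="${f%.html}.wiki"    # e.g. Edwin_Mons_Application_2011.html -> Edwin_Mons_Application_2011.wiki
  xidel --html "$f" --css "#mw-content-text" \
    | pandoc --from html --to mediawiki > "$out" \
    && echo "converted: $f -> $out"
done
```

The `nullglob` guard keeps the loop from choking when no `.html` files are present, and `${f%.html}.wiki` is plain parameter expansion, so no extra tooling is needed for the renaming.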
On 21 June 2017 at 22:01, Goffi <goffi at goffi.org> wrote:
> On Wednesday, 21 June 2017 at 19:10:41 CEST, Arc Riley wrote:
> > After a great deal of frustration, our wiki is back online as of this
> > morning. Make sure to buy Kev (and the rest of the team) a beer for their
> > many hours of work on this.
> > Unfortunately, the data of the previous wiki was not being backed up. Much
> > of the wiki was backed up by Google and archive.org, but in HTML format:
> > http://ayena.de/files/wiki.xmpp.org/web/ (Google cache, collected by
> > https://web.archive.org/ (wayback machine)
> > As a group we can make light work of this if everyone re-adds the page(s)
> > that are important to them. Please coordinate on xsf at muc.xmpp.org to
> > avoid edit collisions.
> > Since all account information has been lost, please send your name,
> > address, and preferred nickname to Alex or Kev, who will be happy to
> > create an account for you.
> Hi, thank you for this work.
> It's a bit dirty, but Pandoc can convert XHTML to mediawiki format; it
> could save a lot of work, I think.