Export all XWiki pages for offline access (disaster recovery)

Hi,

I use XWiki as a knowledge base.

Being able to access it is very important for us.

But it’s possible that the server or the database goes offline (for any reason), and we would still need to access the knowledge base.

I’m looking for a solution to export all XWiki pages to a safe location (one that stays accessible if we have a major disaster in a data center).

How can I export all XWiki content so that it can be accessed offline, in degraded mode?

Thanks for your experience and feedback,

Matt

There are lots of solutions:

  • Export to HTML
  • Use the Replication Extension to replicate an instance located in one data center to another
  • Use a distributed database to do the same (replicate data in different parts of the world)

I’m sure there are other options but these are the first ones that come to my mind.

Thanks

Note that the Replication app is not really designed to replicate an entire instance, but rather a set of spaces.

Ok thx for the info. What if you replicate all the top-level spaces of your instance? Won’t that provide a “backup” solution for your instance if the replica is in a different data center?

Thx

It only replicates wiki pages (and a few extras like likes, but not ratings in general), so I would really not advertise it as a backup solution.

I would not recommend replicating the XWiki space, since it contains stuff that is pretty dangerous to replicate, among other things that can’t be replicated at all without breaking the instance (each instance needs its own URL setup and, more importantly, its own Replication setup).

ok thx, I see. So it’s a bit similar to the limitations from https://www.xwiki.org/xwiki/bin/view/Documentation/AdminGuide/Backup#HUsingtheXWikiExportfeature

As a degraded mode, it should be fine though.

I see some limitations mentioned at https://extensions.xwiki.org/xwiki/bin/view/Extension/Replication%20Application/#HLimitations but it doesn’t seem complete, is it?

I indicated a more generic limitation; there is no point listing everything it does not replicate by default (there is not much chance such a list would remain up to date).

ok I must have missed it, I only saw Notifications and Subwiki on the doc page.

You did not, I just added that :slight_smile:

Hi all,

Thanks for your replies.

→ I don’t want to install an XWiki replication - the VM is still protected by a GroupProtection managed by VirtualSolution.

→ I already have a MySQL replica of the MySQL instance hosting the xwiki schema.

I’m looking for a solution where I can extract all the articles and copy them to a computer.

Yes, that’s a solution.
Is it possible to write a script that exports all XWiki pages?

Thanks for your help,
Matthieu

First, you should know about Loading...

Re the command line, you can do it, see https://www.xwiki.org/xwiki/bin/view/Documentation/UserGuide/Features/Exports#HAdvanced-1

Hi,

With this URL I’m able to export all pages as HTML.

http://XXXXXXXXX:8080/bin/export/Space/Page?format=html&pages=%25.%25
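If you want to script this rather than paste the URL into a browser, here is a minimal sketch (same placeholder host as above) that builds the export-all URL, letting the standard library handle the percent-encoding of the `%.%` wildcard:

```python
from urllib.parse import urlencode

# Build the "export everything" URL. The host/port are the placeholders
# from the URL above -- replace them with your own instance.
base = "http://XXXXXXXXX:8080/bin/export/Space/Page"

# urlencode() turns the wildcard pattern "%.%" (every page of every
# space) into its percent-encoded form %25.%25 automatically.
query = urlencode({"format": "html", "pages": "%.%"})
url = f"{base}?{query}"
print(url)

# Fetching this URL (e.g. with urllib.request.urlopen) downloads the
# HTML export, which the wiki delivers as a single archive.
```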

But is it possible to export only one page?
Example: this page reference — [xwiki:Bases de données.ORACLE.Erreurs connues.Fix broken oracle job.WebHome]
What is the URL to export this page as HTML?

Thanks for your help,

Matt

The pages parameter is a reference, see https://www.xwiki.org/xwiki/bin/view/Documentation/UserGuide/Features/Exports#HAdvanced-1 which contains the syntax and examples.

To find the reference of a page, please see https://www.xwiki.org/xwiki/bin/view/FAQ/How%20can%20I%20get%20the%20reference%20of%20a%20page

Hi,
I’m able to export a single page to HTML:
http://wikisrv:8081/bin/export/Space/Page?format=html&pages=xwiki:Bases%20de%20données.ORACLE.Erreurs%20connues.Fix%20broken%20%20oracle%20job.WebHome
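For the record, the percent-encoding in such a URL can be generated instead of typed by hand. A small sketch, using the host from the URL above (note that `quote()` also encodes the `xwiki:` colon, which the server decodes identically):

```python
from urllib.parse import quote

# Percent-encode a single page reference for the pages parameter.
# quote() encodes spaces and accented characters (as UTF-8); with
# safe="" it encodes ":" as well, which is also accepted.
base = "http://wikisrv:8081/bin/export/Space/Page"
ref = "xwiki:Bases de données.ORACLE.Erreurs connues.Fix broken oracle job.WebHome"
url = f"{base}?format=html&pages={quote(ref, safe='')}"
print(url)
```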

Do you have some examples of exporting a page with curl or wget, with authentication?

Thanks for your help,
Matt
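A minimal sketch of an authenticated export, assuming the instance accepts HTTP Basic authentication (on the command line, `curl -u user:password -o page.html "<url>"` or `wget --user=... --password=... "<url>"` would do the equivalent). The username, password, and page reference below are placeholders:

```python
import base64
import urllib.request
from urllib.parse import quote

# Placeholder credentials and host -- replace with your own.
user, password = "exportbot", "secret"
base = "http://wikisrv:8081/bin/export/Space/Page"
ref = "xwiki:Bases de données.ORACLE.Erreurs connues.Fix broken oracle job.WebHome"
url = f"{base}?format=html&pages={quote(ref, safe='')}"

# Build the HTTP Basic Authorization header by hand so credentials are
# sent on the first request (no 401 challenge round-trip needed).
token = base64.b64encode(f"{user}:{password}".encode()).decode()
req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
print(req.get_header("Authorization"))

# Uncomment once the host is reachable:
# with urllib.request.urlopen(req) as resp, open("page.html", "wb") as out:
#     out.write(resp.read())
```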