XWiki for ELN (3)

We need to estimate how much information we can store in one XWiki instance (we could spin up new VMs and split our experiments over several XWiki instances, but we would prefer to avoid this). We are currently thinking about a semi-automatic mechanism for adding photos of experimental setups, so there might be many pages (for individual experiments) with attached images. Larger files, e.g. primary data from microscopy, will only be included as “thumbnail” images with meta information.

We want to base our own hashing, timestamping and some experiment-specific processing for quality assurance/validation on exporting XAR files to a remote system. What kind of limitations should we expect, in particular for the size of individual pages (with embedded attachments)?

The number of concurrent users at our institute will be well below 50 most of the time; we are more worried about robustly handling larger individual pages with many attachments. Processing will always be asynchronous on a separate machine, so performance is less critical than reliability. Are there known problems/limitations with larger installations and the REST interface accessing page content? Thanks!

No known problems.

XWiki scales very well with a large number of pages (there is no hard limit on the number of pages).

There are limitations on page content size. For example, for MySQL/MariaDB it’s 200K characters of text content, see https://github.com/xwiki/xwiki-platform/blob/405d210dfad79f723617ba727fbcc5f0f6d2fbca/xwiki-platform-core/xwiki-platform-oldcore/src/main/resources/xwiki.hbm.xml#L60
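If you push content over REST, you can guard against that limit on the client side before saving. A minimal sketch in Python (the base URL and credentials are placeholders; the page resource accepts text/plain for the raw content):

```python
import requests

XWIKI = "https://wiki.example.org/xwiki"  # assumed base URL
MAX_CONTENT = 200_000  # MySQL/MariaDB text-column limit from xwiki.hbm.xml

def save_page(space: str, page: str, content: str, auth: tuple[str, str]) -> None:
    """PUT page content via the XWiki REST API, refusing oversized bodies."""
    if len(content) > MAX_CONTENT:
        raise ValueError(f"content is {len(content)} chars, exceeds {MAX_CONTENT}")
    url = f"{XWIKI}/rest/wikis/xwiki/spaces/{space}/pages/{page}"
    resp = requests.put(url, data=content.encode("utf-8"),
                        headers={"Content-Type": "text/plain"}, auth=auth)
    resp.raise_for_status()
```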

The more content there is to render, the longer it’ll take, of course. See also the Performance page on xwiki.org.

Pages are cached, so if you want good performance you need enough memory to keep them in the cache.

There are no limitations on the number or size of attachments (disk space is the practical limit on size).
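For example, uploading a photo is a single PUT to the attachment resource of a page. A minimal sketch (base URL, page names and credentials are placeholders):

```python
import mimetypes
import os
import requests

XWIKI = "https://wiki.example.org/xwiki"  # assumed base URL

def upload_photo(space: str, page: str, path: str, auth: tuple[str, str]) -> None:
    """Upload an image file as a page attachment via the XWiki REST API."""
    name = os.path.basename(path)
    ctype = mimetypes.guess_type(name)[0] or "application/octet-stream"
    url = f"{XWIKI}/rest/wikis/xwiki/spaces/{space}/pages/{page}/attachments/{name}"
    with open(path, "rb") as fh:
        resp = requests.put(url, data=fh, headers={"Content-Type": ctype}, auth=auth)
    resp.raise_for_status()

# e.g. upload_photo("Experiments", "Exp00042", "setup.jpg", ("user", "pass"))
```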

50 concurrent users is pretty large but with the right setup, that should work fine.

One thing to note: try to keep the number of XObjects per page reasonably small. If a page has 10K XObjects, XWiki will need to load them all before it can render the page, and that takes time. It’s better to have more pages and fewer XObjects per page (for example, better to have 10K pages with one XObject each than one page with 10K XObjects), as in the sketch below.
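To illustrate the “more pages, fewer XObjects” pattern over REST, here is a sketch; the class name ELN.ExperimentClass and its result property are made-up examples:

```python
import requests

XWIKI = "https://wiki.example.org/xwiki"  # assumed base URL
AUTH = ("admin", "password")              # placeholder credentials

def create_experiment_page(i: int, result: str) -> None:
    """One page per experiment, each holding a single XObject."""
    page = f"Experiment{i:05d}"
    base = f"{XWIKI}/rest/wikis/xwiki/spaces/Experiments/pages/{page}"
    # Create the page itself (content is sent as text/plain).
    requests.put(base, data=f"Experiment {i}".encode("utf-8"),
                 headers={"Content-Type": "text/plain"}, auth=AUTH).raise_for_status()
    # Attach exactly one XObject; class and property names are hypothetical.
    requests.post(f"{base}/objects",
                  data={"className": "ELN.ExperimentClass", "property#result": result},
                  auth=AUTH).raise_for_status()
```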

Hope it helps


This has helped indeed - many thanks! We can throw some infrastructure resources at hosting one or more XWiki instances, and I am no longer worried about fundamental performance problems. However, acceptance as a solution for electronic lab notebooks (ELN) will also depend on what kind of extra features (hashing, RFC 3161 time-stamping, validation, reporting) we can implement. This will be done asynchronously, and we already have working code for many pieces of the puzzle. Our main challenge is the communication between the remote machine running our ELN-specific code and the XWiki instance containing new or modified pages documenting experiments. We hope to do this with administrator privileges via the REST interface - or is there a better way?

As mentioned before, we can score with our scientists if we find a really easy way to include photos in pages. Is moving images within XWiki (from one page to another) easier than uploading images from a central file system? Thanks!
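To make our plan concrete: the remote machine would poll the wiki roughly as in the sketch below, assuming the standard REST resources under /rest/wikis/{wiki} (the modifications listing and its JSON field names are from memory and would need checking against the REST reference); submit_to_tsa() stands in for our existing RFC 3161 code.

```python
import hashlib
import requests

XWIKI = "https://wiki.example.org/xwiki"  # assumed base URL
AUTH = ("elnbot", "secret")               # placeholder admin credentials

def submit_to_tsa(digest: bytes) -> bytes:
    """Placeholder for our existing RFC 3161 time-stamping code."""
    raise NotImplementedError

def process_recent_modifications() -> None:
    # List recently modified pages; field names are assumptions to verify.
    mods = requests.get(f"{XWIKI}/rest/wikis/xwiki/modifications",
                        headers={"Accept": "application/json"}, auth=AUTH).json()
    for summary in mods.get("historySummaries", []):
        page_url = (f"{XWIKI}/rest/wikis/xwiki/spaces/{summary['space']}"
                    f"/pages/{summary['name']}")
        page = requests.get(page_url, headers={"Accept": "application/json"},
                            auth=AUTH).json()
        digest = hashlib.sha256(page["content"].encode("utf-8")).digest()
        token = submit_to_tsa(digest)  # hash, timestamp, then validate/report
        print(summary["name"], digest.hex(), len(token))
```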