Hi,
I am testing how long it takes to import big spaces from Confluence into XWiki (from 3 GB up to 12 GB or so; at the moment, 6 to 9 GB).
The server is Debian 11 with XWiki 16.6.0, with xwiki-jetty and xwiki-postgresql. Java is openjdk-17-jre-headless (17.0.11+9-1~deb11u1). This is a VM with over 50 GB of disk space available and 32 GB of RAM.
I imported a couple of spaces. With the first space, before starting the Nested Pages Migration, I stumbled for the first time upon this message:
Current conversion (Page URL)
------------------------------------------------------------
Failed to execute the \[velocity\] macro. Cause: \[Java heap space\]. Click on this message for details.
java.lang.OutOfMemoryError: Java heap space
Then I noticed in /var/lib/xwiki/data a file named java_pid3489.hprof, 1.4 GB in size.
Unsure what to do with it, I just moved it to /home/user (keeping ownership and permissions) and reloaded the page; the message was gone.
However, while the next space was about to finish importing, at the Nested Pages Migration stage, I got the same message again.
I tried to ignore it, but the first step of the Nested Pages Migration would not finish loading the space.
I would very much appreciate some advice. How can I fix this?
That file is a dump of the memory; it is generally the best way to debug why there is an OutOfMemoryError (but it contains the whole memory of your instance, so it might include sensitive information). The stack trace associated with the very first OutOfMemoryError you can find in your logs can sometimes give a clue about the cause (but honestly, it could just as well be a consequence).
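If you want to look inside that dump yourself, a tool like Eclipse Memory Analyzer (MAT) can open .hprof files and show you the biggest objects. A rough sketch of two ways to get a first impression (the ParseHeapDump.sh path depends on where you unpacked MAT, and 3489 is just the pid from your file name; adjust both to your setup):

```shell
# Generate MAT's "leak suspects" report from the dump without opening the GUI
# (ParseHeapDump.sh ships with an Eclipse MAT download)
./mat/ParseHeapDump.sh /home/user/java_pid3489.hprof org.eclipse.mat.api:suspects

# Or, on a live JVM, print a class histogram to see what is filling the heap
# (replace 3489 with the actual XWiki Java process id)
jcmd 3489 GC.class_histogram | head -n 20
```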
AFAIK the Confluence import itself is not supposed to take much RAM, but maybe something else is impacted by the number of pages you are adding to this instance.
My understanding is that the Nested Pages Migration is not needed at all anymore.
Once you get an OutOfMemoryError, pretty much anything can be broken for good, and you generally cannot really trust the instance anymore; the safest option is generally to restart.
Not much to do with the XWiki version. The support was made optional in Confluence extension version 9.35.0 and became the default behavior in 9.46.0, from what I see in the changelog.
This is wonderful news, thank you! This might indeed save time.
(About the memory matter: I have been instructed to use the /etc/environment file, following the examples listed in /usr/lib/xwiki-jetty/start_xwiki.sh.)
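For the record, here is roughly what that looks like in my case (example values, not a recommendation; the XWIKI_OPTS variable name comes from the examples in start_xwiki.sh, and the heap size would need tuning so PostgreSQL and the OS still have headroom on a 32 GB machine):

```shell
# /etc/environment — raise the JVM heap for XWiki and keep heap dumps
# going to the data directory on OutOfMemoryError (example settings)
XWIKI_OPTS="-Xmx8192m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/lib/xwiki/data"
```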
I thought you were using the Docker image, actually. Did you mean you used the Debian package xwiki-xjetty-pgsql (there are no packages named xwiki-jetty or xwiki-postgresql)?