XWiki OutOfMemoryError During Solr Indexing

Hi everyone,

We’re experiencing an issue where our XWiki instance periodically stops due to java.lang.OutOfMemoryError: Java heap space. Here are some log entries showing the problem:

{"log":"2025-01-09 14:51:16,502 [XWiki Solr index thread] WARN  o.a.p.h.c.Chunk - Command offset 34 past end of data at 26 \n","stream":"stdout","time":"2025-01-09T12:51:16.502332144Z"}
{"log":"2025-01-09 14:51:16,502 [XWiki Solr index thread] WARN  o.a.p.h.c.Chunk - Command offset 88 past end of data at 69 \n","stream":"stdout","time":"2025-01-09T12:51:16.502334977Z"}
{"log":"2025-01-09 14:51:16,502 [XWiki Solr index thread] WARN  o.a.p.h.c.Chunk - Command offset 88 past end of data at 69 \n","stream":"stdout","time":"2025-01-09T12:51:16.502338101Z"}
{"log":"2025-01-09 14:51:49,099 [https-openssl-nio-8443-exec-402 - https://xwiki.example.com/bin/loginsubmit/XWiki/XWikiLogin] WARN  nticationFailureLoggerListener - Authentication failure with login [user1] \n","stream":"stdout","time":"2025-01-09T12:51:49.100233819Z"}
{"log":"2025-01-09 14:51:54,594 [https-openssl-nio-8443-exec-434 - https://xwiki.example.com/bin/loginsubmit/XWiki/XWikiLogin] WARN  nticationFailureLoggerListener - Authentication failure with login [user2] \n","stream":"stdout","time":"2025-01-09T12:51:54.595530928Z"}
{"log":"Exception in thread \"https-openssl-nio-8443-exec-469\" java.lang.OutOfMemoryError: Java heap space\n","stream":"stderr","time":"2025-01-09T13:32:47.029003114Z"}
{"log":"Exception in thread \"https-openssl-nio-8443-exec-436 - https://xwiki.example.com/webjars/wiki%3Axwiki/xwiki-platform-ckeditor-webjar/15.10.7/skins/moono-lisa/images/arrow.png\" java.lang.OutOfMemoryError: Java heap space\n","stream":"stderr","time":"2025-01-09T13:38:52.122874536Z"}

Before the OutOfMemoryError occurs, there are thousands of “Command offset … past end of data” warnings from the XWiki Solr index thread. These warnings seem to point at corrupted or improperly formatted data being processed during indexing.

To address this, we’ve increased the Java heap size and checked data integrity, but the issue persists. Currently, our Java heap size is set to -Xms3G and -Xmx6G.
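For context, here is how we pass those flags today, as a minimal sketch of our docker-compose service (the service name and image tag are examples; we rely on Tomcat’s catalina.sh picking up the JAVA_OPTS environment variable):

services:
  xwiki:
    image: xwiki:lts
    environment:
      # catalina.sh appends JAVA_OPTS to the JVM command line at startup
      - JAVA_OPTS=-Xms3G -Xmx6G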

Has anyone experienced a similar issue, or does anyone have ideas about what might be causing this and how to resolve it? Additionally, could these OutOfMemoryError crashes be caused by the indexer trying to index large attachments, or files protected with AIP (Azure Information Protection)?

Any help or suggestions would be greatly appreciated.

It would be interesting to enable the JVM’s automatic heap dump on OutOfMemoryError. It’s generally the easiest (even if technical) way to find out what is taking more memory than it should.
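On HotSpot JVMs it’s just two extra flags next to the heap settings, something like this (the dump path is an example and must exist and be writable by the user running Tomcat):

JAVA_OPTS=-Xms3G -Xmx6G -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/usr/local/xwiki/data

The resulting .hprof file can then be opened in a tool like Eclipse Memory Analyzer (MAT) to see what dominates the heap.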

Maybe you should try wiping the search core data, as the corruption may be in the Solr index rather than in memory. It’s generally located in <xwiki data>/cache/solr/search_9 (stop XWiki and remove the entire search_9 folder); the index should be rebuilt on the next start.
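For a Docker setup the procedure would look roughly like this (the container name and host volume path are examples, adjust them to your deployment):

docker stop xwiki
rm -rf /path/to/xwiki-data-volume/cache/solr/search_9
docker start xwiki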

It could, yes. We have, for example, the bug XWIKI-19668.

Thank you for your answer. I enabled the automatic heap dump, but unfortunately no dump files were created. Additionally, I couldn’t find the /cache/solr/search_9 directory in my Docker container.

Given these circumstances, is increasing the XWiki heap size the only solution to address the XWIKI-19668 bug? Also, would using a remote Solr server improve XWiki performance?

Any further suggestions or insights would be greatly appreciated.

Thank you!

It’s supposed to be in the XWiki “permanent directory”.
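If you’re not sure where that directory ends up inside the container, you can look for the core by name, for example (the container name is an example):

docker exec xwiki find / -type d -name 'search_*' 2>/dev/null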

Increasing the heap can help with the OOM, but if the index is corrupted it’s not going to do any good.

Yes.
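For reference, the switch to a remote Solr instance happens in xwiki.properties. Treat this as a sketch only and check the commented defaults in your own xwiki.properties, since the exact keys have changed across versions (older versions used solr.remote.url pointing at the core itself):

solr.type=remote
solr.remote.baseURL=http://solr-host:8983/solr

Note that the remote Solr server also needs XWiki’s Solr configuration installed before it can serve the search cores.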