Logging of the Filter Streams Converter application: is there a copy within the server?

Hello,

I am currently measuring how long exporting/importing spaces from Confluence takes, and redoing a bunch of them in a test install running XWiki 16.6.0 (an XWiki server on a Debian install with the xwiki-xjetty-* packages).

While converting spaces exported from Confluence, I have seen in the job logs quite a few batches of warning lines starting with:

Could not find page (page xyz) in space (abc). . Links to this page may be broken. This may happen when importing a space that links to another space which is not present in this Confluence export, or the page is missing

I have tried to find a copy of the logs shown in the web backend, both in the /var/log directory and in the /var/lib/xwiki directory, but failed to do so.
Can it be found somewhere in the system?

Thank you for your help. :slight_smile:

The logs of all the serialized jobs (in some cases this serialization is disabled, but that's not the case here) are located in the permanent directory under jobs/status. The Filter Streams Converter application specifically stores them under jobs/status/filter/converter.
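On the Debian packages the permanent directory is usually /var/lib/xwiki/data, but treat that path as an assumption (check the environment.permanentDirectory property in xwiki.properties if your setup differs). A quick sketch to locate the serialized statuses:

```shell
# Permanent directory of an XWiki Debian install (assumption: the
# default /var/lib/xwiki/data; adjust to your own setup).
PERM_DIR=${PERM_DIR:-/var/lib/xwiki/data}
STATUS_DIR="$PERM_DIR/jobs/status/filter/converter"

# List the serialized Filter Streams Converter job statuses, newest first.
if [ -d "$STATUS_DIR" ]; then
  ls -lt "$STATUS_DIR"
else
  echo "No converter job status found under $STATUS_DIR"
fi
```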

Thank you. It is a pity, though, that there are no simple text *.log files under /var/log/xwiki.

I noticed a broken link in the first line of the page you pointed me to:
WEB-INF/classes/logback.xml

It’s now fixed, thanks.

Hello Thomas,
after you pointed me to the XML logs, I retrieved one of them. I noticed two things which unfortunately make them useless for checking error/warning messages:

  • In order to collect them, they need to be fetched individually after each space has been converted, otherwise they vanish when the next space conversion starts
  • worse, it is impossible to tell which pages, links and spaces are in error, because in these logs the names are replaced with this placeholder: {[]}

So I struggled, but found a way to copy the full content of the logs while the conversion is running, or just after: select the first line, then hold the left Shift key while pressing the End key. (It is not very handy, but it worked even for large spaces, though it took some time.)

Do you think there is any chance the XWiki dev team could have this web-only log automatically copied to /var/log/xwiki in a manageable way in the future?

It is possible, but you need to check the parameter values stored a few lines after the message pattern.
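For what it's worth, reassembling a readable line from the pattern and its parameters is mechanical. A minimal sketch, assuming the status XML keeps an SLF4J-style message pattern with {} placeholders and records the corresponding values separately (the exact XML layout is not reproduced here):

```shell
# Minimal sketch: rebuild a human-readable log line from an SLF4J-style
# message pattern and the recorded parameter values. (Assumption: the
# serialized status keeps the raw pattern, e.g.
# "Could not find page ({}) in space ({})", with the values stored nearby.)
render_log_line() {
  line=$1
  shift
  for arg in "$@"; do
    case $line in
      *'{}'*) ;;        # a placeholder remains, substitute it
      *) break ;;       # no placeholder left, ignore extra values
    esac
    prefix=${line%%'{}'*}   # text before the first {}
    suffix=${line#*'{}'}    # text after the first {}
    line="$prefix$arg$suffix"
  done
  printf '%s\n' "$line"
}

render_log_line 'Could not find page ({}) in space ({}).' 'xyz' 'abc'
# prints: Could not find page (xyz) in space (abc).
```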

Oh I see, thank you. However, it is still not very handy to have to retrieve them systematically after each import/conversion, and not be able to revisit them later once all conversions are done.

The location of the status and log files depends on the job identifier, and indeed the Filter Streams Converter generates the identifier based only on the type of input/output filters used (so it keeps overwriting them when you do the same kind of migration, but it also makes it easier to show the log of the latest conversion when you go back to the page).

I understand, but that still does not make them very handy to handle. Do you think the logs produced in the web page could be piped to plain-text logs in /var/log/xwiki (in a very traditional way), one per conversion, with its own date-time in the log file name?

While it feels obvious when you are searching for a log, it also risks flooding the filesystem with an infinite number of files which are never cleaned, so I'm not sure that's the safest default.

I can think of several possibilities which are not too hard technically:

  • add an option in the conversion form to include the datetime in the generated identifier
  • add an option in the conversion form to manually indicate an identifier (easier to know which is which, but you need to think about providing this identifier)
  • add an option in the conversion form to force the job log to also end up in the general log (which is also more human-readable, but it’s much harder to find a specific conversion log when there are 10 of them)

While it feels obvious when you are searching for a log, it also risks flooding the filesystem with an infinite number of files which are never cleaned

Taking care of that is the job of /etc/logrotate.conf and the additional files inside /etc/logrotate.d.
So ultimately a system administrator who knows the basics won't have any trouble handling it (man logrotate and du -csh /var/log are our friends).
Also, the system will never become so full that you cannot connect to it over ssh: there is a basic safeguard which, by default, keeps 5% of the storage reserved on system partitions to avoid this kind of hiccup.
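For example, if XWiki did write plain-text logs to /var/log/xwiki, a hypothetical drop-in file such as /etc/logrotate.d/xwiki (the path and the log names are assumptions, since these files do not exist today) could keep them in check with standard logrotate directives:

```
/var/log/xwiki/*.log {
    weekly
    rotate 8
    compress
    delaycompress
    missingok
    notifempty
}
```

This would keep eight weeks of compressed history per log file and silently skip files that are absent.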

I’m afraid you overestimate a bit the average level of people who install stuff on servers :slight_smile:

Anyway, I doubt something like this has a good chance of ending up on the org roadmap in the short term, given the list of things already planned (but it could be something you might be able to push for via XWiki SAS). But don't hesitate to create an issue on Loading... with everything you have in mind about how it should look, problems and solutions, etc. And of course, if you feel like working on this, I would be happy to help with pointers.

There are many kinds of logs under /var/log, so any of them could grow too large without the admin knowing. Whenever someone who has not yet been taught does not know why their partition is full, they will search for the cause and be smarter next time. Learning by doing, right?

Could this be added as an idea to a wish list?

That’s basically what an issue of type IMPROVEMENT, NEW FEATURE or IDEA is.

Hello,
I can create a ticket with the title Filter Streams Converter only logs in UI and the type Improvement. I just don't know which component to choose: Confluence XML was not found, neither was Filter Streams Converter, and Filter provides several choices; I am unsure which one to pick, if any.
Can you help me with that?

It does not really have much to do with Confluence XML if you want to improve how filter conversion works in general. So it would be the “Filter” component, and it probably requires some modification in the “Job” component.

Here it is Loading...
