Should we limit the PDF export size by default?

I think this is part of a more general question of whether we want to protect against OOM errors. We don’t enforce such limits in other places, like the display macro, which could easily be used to, e.g., include 100 other documents in a single document. The PDF export just makes this easier to trigger. In general, I think it would be a good idea to protect against OOM attacks.

Unfortunately, the current limit is hardly a predictor of the actual memory usage of the XDOM (e.g., embedding images as data URIs could easily increase the output size without needing much memory or causing problems in Chrome). Would it be possible to instead compute a rough estimate of the XDOM’s memory consumption based on the number of nodes, and maybe the number of parameters, in addition to the size of the HTML, and only stop once one of these values reaches a much higher limit like 100MB? If this sounds too complicated, I’m +1 for option 2, possibly with an even higher limit like 5MB.
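To illustrate what I mean by a rough estimate, here is a minimal sketch that walks the XDOM and counts nodes and parameters. It assumes the Block API from xwiki-rendering (getChildren() and getParameters()); the per-node and per-parameter weights are made-up placeholders, not measured values.

```java
import org.xwiki.rendering.block.Block;

public final class XDOMSizeEstimator
{
    // Made-up weights in bytes; real values would need to be measured.
    private static final long NODE_WEIGHT = 200;
    private static final long PARAMETER_WEIGHT = 100;

    private XDOMSizeEstimator()
    {
    }

    /**
     * Recursively estimates the memory used by an XDOM subtree from its
     * number of nodes and parameters.
     */
    public static long estimate(Block block)
    {
        long total = NODE_WEIGHT + (long) block.getParameters().size() * PARAMETER_WEIGHT;
        for (Block child : block.getChildren()) {
            total += estimate(child);
        }
        return total;
    }
}
```

The export would then abort only once this estimate or the accumulated HTML size crosses the (much higher) hard limit.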

Another proposal: don’t stop the PDF export when the size limit is only exceeded after processing the last document, i.e., abort only if there are more documents left to process. That way, a single document can always be exported regardless of its size (and the XDOM is already in memory at this point anyway, so aborting the export there doesn’t really help).
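A minimal sketch of that check, with hypothetical names rather than the actual exporter code: the limit is enforced only while there are still documents left to render, so a single (or last) oversized document still goes through.

```java
import java.util.List;

// Hypothetical sketch of the proposed behavior, not the real export code.
public abstract class SizeLimitedExporter
{
    protected abstract void renderDocument(String documentReference) throws Exception;

    protected abstract long getCurrentOutputSize();

    public void export(List<String> documentReferences, long sizeLimit) throws Exception
    {
        for (int i = 0; i < documentReferences.size(); i++) {
            renderDocument(documentReferences.get(i));

            // Only abort if there is still another document to render;
            // the last (or only) document is always exported in full.
            boolean hasMoreDocuments = i < documentReferences.size() - 1;
            if (hasMoreDocuments && getCurrentOutputSize() > sizeLimit) {
                throw new Exception(String.format(
                    "PDF export size limit exceeded after %d of %d documents.",
                    i + 1, documentReferences.size()));
            }
        }
    }
}
```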