Technology used vs performance impact

From a performance point of view, what are the known resource-hungry patterns on an XWiki farm?
Is it JavaScript extensions? Velocity code in pages used as backend services? Is it parsing that code, or registering it for use across the whole wiki? Or maybe global translations?
I mean, when I'm developing a simple app or customizing an XWiki farm, there are many ways to approach it, and I would like to choose the least resource-hungry option. Can you point me in the right direction regarding technology vs. performance impact?

Velocity execution is slow, so if you can (and if you care about performance) you should "move" the Velocity code to Java (a script service or a REST service). You lose some flexibility (the ability to modify / customize at runtime) but you gain on performance. A REST request is much faster than a request to a service implemented in a wiki page using Velocity.
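As a sketch of that move: in a real XWiki extension you would implement `org.xwiki.script.service.ScriptService` and annotate the class with `@Component`, `@Named` and `@Singleton` so it becomes callable from Velocity as `$services.<hint>`. The snippet below is a minimal stand-alone sketch: the `ScriptService` marker interface is faked locally so it compiles outside an XWiki instance, and the class name, hint and method are hypothetical.

```java
// Stand-in for org.xwiki.script.service.ScriptService so this sketch compiles
// outside an XWiki instance. In a real extension you would implement the real
// interface and annotate the class with @Component, @Named("greeting") and
// @Singleton to register it as a component.
interface ScriptService {
}

// Hypothetical service: logic that used to live as Velocity in a wiki page
// now runs as compiled Java, with no page parsing or rendering per call.
class GreetingScriptService implements ScriptService {
    public String greet(String user) {
        return "Hello, " + user + "!";
    }
}
```

With the real annotations in place, a Velocity page would call this as `$services.greeting.greet('Admin')` instead of re-parsing a `#macro`-heavy wiki page on every request.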

Regarding JavaScript, you should of course load the code on demand, when you need it, but it's less of a problem because the browser caches most requests for JavaScript code.

Global translations are not a problem AFAIK. Database queries could slow down your server if you’re missing the right indexes.
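To illustrate why a missing index hurts: without one, the database has to scan every row to answer a lookup, while an index lets it jump straight to the matching row. The toy in-memory sketch below (hypothetical names, a plain `Map` standing in for a B-tree index, not XWiki's actual storage) shows the two access patterns side by side.

```java
import java.util.List;
import java.util.Map;

class Doc {
    final String fullName;
    final String title;

    Doc(String fullName, String title) {
        this.fullName = fullName;
        this.title = title;
    }
}

class IndexDemo {
    // No index: the "engine" scans every row until it finds a match — O(n),
    // which is what happens on every query when the right index is missing.
    static String scanForTitle(List<Doc> docs, String fullName) {
        for (Doc d : docs) {
            if (d.fullName.equals(fullName)) {
                return d.title;
            }
        }
        return null;
    }

    // With an index: direct lookup — O(1) here, O(log n) for a B-tree.
    static String indexedTitle(Map<String, Doc> byName, String fullName) {
        Doc d = byName.get(fullName);
        return d == null ? null : d.title;
    }
}
```

Both lookups return the same result; only the cost differs, and the gap grows linearly with table size.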

Hope this helps,



One current performance issue is having a lot of objects in wiki pages. You can create as many pages as you wish, but try to limit the number of xobjects in them.


Thanks, that was very helpful!