Do we want to move to a Single Page Application model?

Hello all,

In the context of Cristal, we are using the Single Page Application (SPA)[1] pattern.

This means that we are handling the application routes 100% client-side (using vue-router).
Because the routes are decoupled from the backend (which only serves the displayed data), we defined the URL scheme from scratch.
Another benefit is that navigating from one page to another does not require a full-page reload: only the components specific to the new page are refreshed (e.g., the document tree is left untouched, while the page content and the breadcrumb are updated).
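To make the routing point concrete: client-side routing essentially means matching the browser URL against patterns and updating components without a server round-trip. vue-router does this for us; the framework-free sketch below only illustrates the idea (the "/view/:space/:page" pattern is a made-up example, not Cristal's actual URL scheme):

```typescript
// Match a path like "/view/Main/WebHome" against a pattern with
// ":param" placeholders, returning the extracted parameters or null.
function matchRoute(
  pattern: string,
  path: string
): Record<string, string> | null {
  const patternParts = pattern.split("/").filter(Boolean);
  const pathParts = path.split("/").filter(Boolean);
  if (patternParts.length !== pathParts.length) return null;
  const params: Record<string, string> = {};
  for (let i = 0; i < patternParts.length; i++) {
    if (patternParts[i].startsWith(":")) {
      // Placeholder segment: capture the value under the parameter name.
      params[patternParts[i].slice(1)] = decodeURIComponent(pathParts[i]);
    } else if (patternParts[i] !== pathParts[i]) {
      // Literal segment that does not match: this route does not apply.
      return null;
    }
  }
  return params;
}
```

For example, matchRoute("/view/:space/:page", "/view/Main/WebHome") yields { space: "Main", page: "WebHome" }, while a path that does not fit the pattern yields null; the router then renders the components registered for the matched route.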

In this post, I’d like to discuss the pros and cons of moving to an SPA model for XWiki.

Pros:

  • feeling of snappiness thanks to the absence of full-page reload when navigating
  • lower network use, again because of the decreased need for full-page loads

Cons:

  • difficult architectural questions:
    • rendering while preserving good search engine bot support is not easy, see Technical support for the rendering of modern front-end technologies
    • routes management (detailed post coming soon). Currently, XWiki’s routes are handled 100% server-side, which clashes with the usually 100% client-side routes of an SPA, so a more complex route architecture would be required to make it work

From a user experience perspective, I think going toward SPAs is interesting. But I’m afraid of the engineering cost that comes with this approach.
Again, I’m interested in your opinion on the matter.
Thanks!


  1. An SPA (single-page application) is a web app implementation that loads only a single web document, and then updates the body content of that single document via JavaScript APIs such as Fetch when different content is to be shown. (Definition from the MDN Web Docs Glossary.) ↩︎

I don’t think we should move to an SPA model outside of special use cases like a mobile application for XWiki or a desktop client (both could be interesting imho), or embedding a limited set of features into another application. To me, the main problems are backwards compatibility with existing code (including extensions) and the fact that I don’t believe the performance claims for non-trivial use cases. I also fear the added complexity and engineering effort required to go from a prototype SPA to a reliably working SPA.

To make this more concrete, some examples:

  • Consider UIX (UI extensions) that use server-side logic to decide whether and what to display - how would you make them compatible? Reloading just the content isn’t enough; for backwards compatibility, you would need to reload the full UI on every click.
  • There are style sheets that should only be used on certain pages. To keep them working, you would have to compare, on every navigation, the current list of style sheets to the list of style sheets of the new page to decide which ones to keep and which ones to unload. All of this is very error-prone and not necessarily faster.
  • Adding a second server between the browser and the backend, as suggested in the other thread, can’t make this faster either without a lot of engineering effort (and even then, it sounds very difficult to profile and optimize).
  • There are difficult topics like how to handle browser history, and how much to store in memory vs. reload from the server to make navigation fast without causing huge memory usage - all things that the browser solves for us with a traditional web application.
  • There is engineering required to properly handle network errors, session time-outs, … - we already see this today with things like saving, but with an SPA you have it on every single click. You’ll need to implement loading indicators, ways for users to cancel loads, retries for hanging requests, … - all things that the browser provides for free (and with a UI that everybody knows) when you use regular page loads. And how do you handle session re-validation that requires a redirect to an SSO service? Does it completely reload the application (invalidating all the benefits of the SPA)?
  • Modern front-end technologies have a very short lifetime: we tried using them in Live Data, and the technology Live Data is based on is no longer supported. I fear the same will happen if we transform XWiki into an SPA, except then it would concern not just a single component that we might replace without too much effort, but the whole UI. That means breaking changes and a lot of engineering effort every few years.
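To illustrate the style sheet point above: the diff itself is simple set arithmetic, but it would have to run correctly on every single navigation. A minimal sketch, assuming each page could report the URLs of the style sheets it needs (a hypothetical assumption; nothing in XWiki exposes such a list today):

```typescript
// Given the style sheets of the current page and of the page being
// navigated to, decide which <link> elements to remove and which to add.
function diffStyleSheets(
  current: string[],
  next: string[]
): { unload: string[]; load: string[] } {
  const currentSet = new Set(current);
  const nextSet = new Set(next);
  return {
    // Sheets only the current page uses must be unloaded.
    unload: current.filter((href) => !nextSet.has(href)),
    // Sheets only the new page uses must be loaded.
    load: next.filter((href) => !currentSet.has(href)),
  };
}
```

The hard part is not this diff but reliably producing the current and next lists for every skin- and extension-contributed sheet, which is exactly where the error-proneness comes from.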

To me, we should instead gradually introduce web components in XWiki where it makes sense, paying attention that the interfaces between them are based on web standards, so that we can easily replace the technology behind any of these components, and perform that replacement gradually without breaking everything.


I share Michael’s concerns regarding moving to an SPA model, but at the same time we have to admit that XWiki is not a pure MPA (multi-page application), because it already has a lot of actions that update the UI without reloading the web page:

  • document extra tabs
  • in-place edit mode
  • live data pagination, filters and sort

just to name a few. All of these change the UI state client-side, and this state needs to be reflected in the URL, otherwise it is lost when we reload or navigate away and back (and we can’t bookmark / share it). ATM each of the mentioned features uses its own custom and limited way of saving its state in the URL, sometimes conflicting with the others:

  • document extra tabs and in-place edit mode are using the URL fragment identifier
  • live data is modifying the URL query string (in the browser history)

So even if we don’t move to an SPA, I think we still need a client-side component that allows XWiki extensions / modules to:

  • save their UI state
  • react to URL changes that involve their UI state

It’s not a router, but it’s related.
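To sketch what such a component could look like: give each module a namespace in the query string, so states can coexist instead of conflicting. All names below, and the livedata example values, are hypothetical:

```typescript
// Serialize a module's UI state into the query string under a namespaced
// prefix (e.g. "livedata.page=2"), replacing only that module's keys and
// leaving every other module's state untouched.
function writeState(
  search: string,
  namespace: string,
  state: Record<string, string>
): string {
  const params = new URLSearchParams(search);
  // Drop this module's previous keys (snapshot keys before deleting).
  for (const key of [...params.keys()]) {
    if (key.startsWith(namespace + ".")) params.delete(key);
  }
  for (const [key, value] of Object.entries(state)) {
    params.set(`${namespace}.${key}`, value);
  }
  return params.toString();
}

// Read back only the keys belonging to the given module's namespace.
function readState(search: string, namespace: string): Record<string, string> {
  const params = new URLSearchParams(search);
  const state: Record<string, string> = {};
  for (const [key, value] of params) {
    if (key.startsWith(namespace + ".")) {
      state[key.slice(namespace.length + 1)] = value;
    }
  }
  return state;
}
```

In the browser, the resulting query string could then be applied with history.replaceState(), so the state survives reload and can be bookmarked / shared; reacting to URL changes would be a listener on top of the same namespacing scheme.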

Thanks,
Marius
