Multiple failures in Firefox

Due to the pending Manifest V3 changes coming to Chromium-based browsers, we are attempting to switch to Firefox. Unfortunately there are some bugs I’ve been unable to resolve when using XWiki in Firefox. XWiki works without error in both Chrome and Edge, but in Firefox I was getting the dreaded “Failed to lock page” error. Our XWiki installation runs in a Docker container behind an Nginx reverse proxy. We have a dozen services running in Docker containers behind this reverse proxy with no issues, but I made sure all our settings were in line with this documentation, and tried all the suggestions mentioned in this thread, all to no avail. Though not optimal, I eventually set edit.document.inPlaceEditing.enabled = false to work around this particular issue.

The next issue is that I am unable to create new pages in Firefox; I get the error “Error while transforming the page title to a correct page name: the exact page title will be used.”

This error appears immediately when anything is typed into the “Title” box, and it disables the “Create” button, making it impossible to create new pages in Firefox.

I believe both of the issues I’ve mentioned have the same underlying cause, though I don’t know what it is. When I open DevTools in both Chrome and Firefox and compare the “Network” tab entries for the same interaction in XWiki, Firefox eventually makes a network request that returns a 302 status, followed by another request that returns a 401 status. The exact same action in Chrome results in a single network request that returns a 200 status. It doesn’t matter what the interaction is; the general pattern is always the same.

Extending the example for which I provided a picture: as soon as a single letter is typed into the “Title” box, both Chrome and Firefox hit the same URL; however, Chrome issues an XHR request whereas Firefox issues an HTML (document) request. The URL:

https://xwiki.example.com/bin/get/Main/?xpage=entitynamevalidation_json&outputSyntax=plain&name=a&form_token=Zyx2KrFQRaYkyXxtWwhfnx

In Chrome the XHR GET request returns with a 200 status, the exact same request is repeated with the same result (unclear why, but not an issue), and no further requests are made.

In Firefox the HTML GET request returns with a 302 status, then a request to the XWiki login screen is made which returns with a 401 status, then both of those requests are repeated with the same result. The URL for the request that returns a 401 status:

https://xwiki.example.com/bin/login/XWiki/XWikiLogin;jsessionid=368FED311E78CAB481019B052CDB9C20?srid=bxaBWSzC&xredirect=%2Fbin%2Fget%2FMain%2F%3Fxpage%3Dentitynamevalidation_json%26outputSyntax%3Dplain%26name%3Da%26form_token%3DYyx2KrFQRaYkyXxtWwhfnw%26srid%3DbxaBWSzC
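For what it’s worth, the xredirect parameter in that login URL is just the original validation request, percent-encoded. A small Python sketch (standard library only; the jsessionid path parameter is dropped here for brevity) confirms that decoding it round-trips back to the entitynamevalidation_json request:

```python
from urllib.parse import urlsplit, parse_qs

# The login URL Firefox is bounced to (jsessionid elided for brevity)
login_url = ("https://xwiki.example.com/bin/login/XWiki/XWikiLogin"
             "?srid=bxaBWSzC&xredirect=%2Fbin%2Fget%2FMain%2F%3Fxpage%3D"
             "entitynamevalidation_json%26outputSyntax%3Dplain%26name%3Da"
             "%26form_token%3DYyx2KrFQRaYkyXxtWwhfnw%26srid%3DbxaBWSzC")

# Extract the xredirect query parameter; parse_qs percent-decodes it
xredirect = parse_qs(urlsplit(login_url).query)["xredirect"][0]
print(xredirect)
# -> /bin/get/Main/?xpage=entitynamevalidation_json&outputSyntax=plain&name=a&form_token=Yyx2KrFQRaYkyXxtWwhfnw&srid=bxaBWSzC
```

So the 401 really is the login screen trying to send the browser right back to the request that triggered the 302.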

The login page doesn’t render and I’m not redirected, but I can see that the XWiki login page is returned in the response to that request. Even after this happens, I’m still logged in to XWiki and can navigate to other pages and do other such things.

This isn’t an issue with one Firefox install; it affects all of them, including the mobile browser. I’ve disabled all Firefox security settings during testing just to see what could be causing this, but so far I’ve been unable to find the culprit. Any guidance would be appreciated.

Hello, since you mention Edge I assume you are using a Windows OS. Which version of Firefox do you have, and did you install it from the mozilla.org website?
Also, which version of XWiki is your server running?

XWiki 16.1.0

Windows 11 64 bit
Firefox 131, 132, and 135

Android 13
Firefox 131, 132, and 135

Have you tried refreshing Firefox 135? There is a menu entry for that purpose in Firefox’s Help section. You can also access it directly from the address bar via about:support, then, at the top right, “Refresh Firefox” (be aware this resets the browser almost completely; if you don’t want that, take care to back up your profile first).

Maybe other persons in the forum will have other ideas to suggest.

I haven’t tried the exact process you mentioned, but we are experiencing this problem on both new and existing systems. Brand-new installs of Windows 11 with fresh installs of Firefox (all versions we have tested) all exhibit the problem. Starting Firefox in private mode or safe mode, or disabling any of its security features, doesn’t seem to help.

Further testing has shown that XWiki also works fine in Safari, but still fails in Firefox on Mac.

Can you install a test version on a machine or in a virtual machine, using the latest stable or the latest LTS edition? I think the LTS would be 16.10.3. https://www.xwiki.org/xwiki/bin/view/Blog/Releases

(Also, perhaps you should re-check the reverse-proxy configuration.)

Not an apples-to-apples comparison, but I downloaded the “Demo with Standard Flavor Pre-installed” for both XWiki 16.1.0 and 17.0.0, ran them both on a Windows 11 machine, and browsed to them with Firefox on that same machine without any issues. Both demos use Jetty rather than Tomcat, and FWIW, I believe the interaction between Tomcat and Firefox is what is causing the issue. When I performed the same test described in my OP, the requests in Firefox were XHR, not HTML.

Pretty sure it is correct, but maybe I missed something. Here are the important bits:

Tomcat server.xml (“default” config except for parts shown)

<Server port="8005" shutdown="SHUTDOWN">

  ...

  <Service name="Catalina">
    <Connector port="8080" protocol="HTTP/1.1"
       connectionTimeout="20000"
       redirectPort="8443"
       maxParameterCount="1000"
       secure="true"
       scheme="https"
       />
       
    <Engine name="Catalina" defaultHost="localhost">
       <Valve className="org.apache.catalina.valves.RemoteIpValve"
         internalProxies="127\.0\.[0-1]\.1"
         remoteIpHeader="x-forwarded-for"
         requestAttributesEnabled="true"
         protocolHeader="x-forwarded-proto"
         protocolHeaderHttpsValue="https">
       </Valve>
       
       ...
       
     </Engine>
  </Service>
</Server>

XWiki xwiki.properties (“default” config except for parts shown)

edit.document.inPlaceEditing.enabled = false

XWiki xwiki.cfg (“default” config except for parts shown)

xwiki.home=https://xwiki.example.com/
xwiki.url.protocol=https

Nginx default.conf

resolver 127.0.0.11 ipv6=off;

map $request_uri $x_robots_tag {
    ~^.+$ "noindex, nofollow";
    default "";
}

server {
    listen 80;

    # Match requests with and without www, and with and without a subdomain
    # If a request has a subdomain, it will be stored in the $subdomain variable
    server_name ~^(www\.)?(?<subdomain>.+?)?\.?example\.com$;
    server_tokens off;

    # For services that write their own URLs and default them to being HTTP, request that they be HTTPS
    add_header Content-Security-Policy upgrade-insecure-requests;

    # Prevent search engine indexing
    add_header X-Robots-Tag $x_robots_tag;

    # Redirect all HTTP traffic to HTTPS
    location / {
        if ($subdomain) {
            return 301 "https://$subdomain.example.com$request_uri";
        }
        return 301 "https://example.com$request_uri";
    }
}

server {
    listen 443 ssl;
    http2 on;

    server_name ~^(www\.)?(?<subdomain>.+?)?\.?example\.com$;
    server_tokens off;

    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    # Disable TLS 1.0, TLS 1.1, and TLS 1.2 because they are old and insecure
    ssl_protocols TLSv1.3;

    # For services that write their own URLs and default them to being HTTP, request that they be HTTPS
    add_header Content-Security-Policy upgrade-insecure-requests;

    # Prevent search engine indexing
    add_header X-Robots-Tag $x_robots_tag;

    location / {
        set $proxy_pass "";
        
        ################################################################################################################ 
        # Nearly a dozen other services following the same pattern as
        # the XWiki config below are listed where this comment is
        ################################################################################################################
        
        set $xwiki_container "xwiki";
        set $xwiki_subdomain "xwiki";
        set $xwiki_port "8080";
        if ($subdomain = $xwiki_subdomain) {
            set $proxy_pass "http://$xwiki_container:$xwiki_port$request_uri";
        }

        if ($proxy_pass = "") {
            return 403;
        }

        # Proxy the request to the corresponding Docker container based on subdomain
        proxy_pass $proxy_pass;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Content-Security-Policy upgrade-insecure-requests;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
    }
}

I don’t have enough knowledge to give you insights on your configuration files, but other people on this forum can probably use them to help. And now that you have a test version that works, you might have a solution in sight after a few more tests.

We are using Cloudflare for our DNS, and I’ve discovered that the problems I’m experiencing with Firefox go away when I disable the Cloudflare proxy to my server. Why this is causing a problem with Firefox is still not clear. I’m not willing to go without the Cloudflare proxy, so still hunting for a solution.

Success! I’ve discovered that the problem has to do with how Cloudflare caches requests when the CF proxy is enabled. All of my XWiki, Tomcat, and Nginx config was correct. I don’t know what caused the CF cache to become corrupt, or why that corruption only affected Firefox, but I used the “Purge Everything” option and XWiki immediately started working as expected in Firefox. I even set edit.document.inPlaceEditing.enabled back to true, and in-place editing is working again.

I suspect that whatever corrupted the CF cache in the first place will happen again. If it does, I will either create a CF “Cache Rule” that disables caching for my XWiki subdomain, or add a Cache-Control header to the response in the Nginx config, something like add_header Cache-Control "private, no-store, no-cache, must-revalidate";
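For the Nginx option, a sketch of how that could slot into the existing config (untested; the placement inside the subdomain check mirrors the XWiki branch already shown above, and whether Cloudflare honors it for every request type is an assumption on my part):

```
# Sketch: tell Cloudflare (and browsers) not to cache XWiki responses.
# This would go in the HTTPS server block alongside the existing
# $xwiki_subdomain handling; add_header is permitted inside an
# "if" block within a location context.
if ($subdomain = $xwiki_subdomain) {
    add_header Cache-Control "private, no-store, no-cache, must-revalidate";
}
```

Note that add_header directives are inherited from the enclosing level only when the current level defines none, so the existing Content-Security-Policy and X-Robots-Tag headers may need to be repeated inside this block.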
