Should we use these security settings?

Hi developers,

Today, for the German OpenDesk research project, I was asked some questions about security-related HTTP headers and cookies, as that project has some requirements in this area.

  1. They mandate that the Content-Security-Policy HTTP header be set to a value “as restrictive as possible”. Right now we don’t seem to be setting this header. Should we set it and what value should we use?

  2. Another HTTP header that they’re asking us to set is X-Content-Type-Options (with value nosniff). Same question: Should we set it?

  3. They’re also asking that the following settings for cookies be set: httponly, samesite and secure. I started documenting the cookies we set and the settings we use for them at Security (XWiki.org), but I haven’t covered httponly & samesite there yet. However, from what I see, we set httponly in only 1 place. For secure, sometimes we set it (only when HTTPS is used, btw) and sometimes we don’t. I didn’t see any place where we use samesite though. So the question is the same: should we set these 3 attributes, with what values, and for all our cookies? The audit recommended a strict value for samesite.

Some answers from @MichaelHamann:

Re 1):

What do you mean by “do we need to”? This is a security feature that allows limiting, e.g., which scripts can be executed. I suggested using this in Dynamic Script Restrictions (Proposal.DynamicScriptRestrictions) - XWiki to restrict XSS, so I believe it would greatly improve the security of XWiki if we used it to restrict JavaScript execution. But it could be quite breaking, depending on how many inline scripts we have.
It’s also a nice way to prevent the execution of scripts and other dangerous content in SVG while allowing users to upload SVG.
I think at the moment, we don’t allow regular users to upload SVG that can be displayed in pages. If we allowed that, the one way I know about to make this secure is to use CSP headers.
Okay, actually I know about two ways: a) parse and clean the SVG and b) disallow dynamic content using a CSP.
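
To make option b) concrete, here is a minimal sketch (the class and method names are made up, this is not existing XWiki code) of a header that could be sent along with an SVG attachment so that it still renders but any embedded script stays inert:

```java
// Hypothetical helper, assuming it would be called wherever the attachment
// download response is produced. The CSP forbids scripts and external loads
// and sandboxes the document, so a displayed SVG cannot execute anything.
import javax.servlet.http.HttpServletResponse;

public class SvgAttachmentCsp {
    public static void secure(HttpServletResponse response) {
        response.setHeader("Content-Security-Policy", "default-src 'none'; sandbox");
    }
}
```
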
I mean our users expect that they can put onclick on any element in XWiki syntax (see this forum post), and any serious CSP would forbid this kind of inline script or require adding a hash of it to the CSP header.
See also the discussion in CSP: script-src - HTTP | MDN
Disallowing inline scripts is the primary security win that you get from a content security policy, as it would prevent most XSS attacks, but XWiki’s users expect this kind of inline script.
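
For illustration, this is roughly what allowing one specific inline script by hash involves (a sketch, not a proposal): the CSP has to carry the base64-encoded SHA-256 of the exact script text, and as far as I understand, on… attributes would additionally need the 'unsafe-hashes' keyword, so doing this for arbitrary user-written handlers isn’t practical.

```java
// Sketch: computing the token that would have to appear in
// "Content-Security-Policy: script-src 'sha256-...'" for one inline script.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class CspHash {
    public static String hashFor(String inlineScript) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
            .digest(inlineScript.getBytes(StandardCharsets.UTF_8));
        return "'sha256-" + Base64.getEncoder().encodeToString(digest) + "'";
    }

    public static void main(String[] args) throws Exception {
        // Any change to the script text invalidates the hash, so every
        // user-written onclick would need its own entry in the header.
        System.out.println(hashFor("alert('hi')"));
    }
}
```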

Also, I absolutely wouldn’t be surprised if we had extensions that load JavaScript from arbitrary domains, so we also couldn’t just disallow external scripts.

My opinion on CSP: we should implement Dynamic Script Restrictions (Proposal.DynamicScriptRestrictions) - XWiki and as part of that add a restrictive CSP that forbids inline scripts unless they are secured by a nonce but make this feature optional. So people who care about the security of their XWiki installation can enable this feature and people who don’t rely on XWiki being secure but instead want to have the scripting possibilities they always had can turn it off.
Note that my proposal wouldn’t disable inline scripts, it would just make them a bit harder to write, as every inline script tag would need to have a nonce that is unique per request.
And you wouldn’t be able to use on... attributes anymore.
So, all in all, what this is about is making XWiki syntax injection, which is currently exploitable both for remote code execution and for XSS, mostly harmless, which I think is important if you want to have a secure XWiki installation.
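
To illustrate the nonce part of the proposal, a minimal sketch (illustrative names, not the actual Dynamic Script Restrictions code): generate a random value per request, advertise it in the CSP header, and stamp it on every inline script tag we emit ourselves; anything without the nonce, including on… attributes, is blocked.

```java
// Sketch of a per-request CSP nonce; the returned value would have to be
// added as nonce="..." to every inline <script> tag rendered in the response.
import java.security.SecureRandom;
import java.util.Base64;
import javax.servlet.http.HttpServletResponse;

public class CspNonce {
    private static final SecureRandom RANDOM = new SecureRandom();

    public static String apply(HttpServletResponse response) {
        byte[] bytes = new byte[16];
        RANDOM.nextBytes(bytes);
        String nonce = Base64.getEncoder().encodeToString(bytes);
        // Same-origin script files stay allowed; inline scripts need the nonce.
        response.setHeader("Content-Security-Policy",
            "script-src 'self' 'nonce-" + nonce + "'");
        return nonce;
    }
}
```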

Re 2):

That sounds a lot less breaking, I would hope.
We would need to test it, but it could just work.
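
Something like the following filter (a sketch, not existing XWiki code) should be all that’s needed on our side; the header only tells browsers to trust the declared Content-Type instead of sniffing, so the main thing to test is that no feature relied on sniffing.

```java
// Sketch: a servlet filter that adds X-Content-Type-Options to every response.
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class NoSniffFilter implements Filter {
    @Override
    public void init(FilterConfig filterConfig) {
        // Nothing to configure.
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
        throws IOException, ServletException
    {
        // Instruct browsers to respect the declared Content-Type instead of sniffing it.
        ((HttpServletResponse) response).setHeader("X-Content-Type-Options", "nosniff");
        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
        // Nothing to clean up.
    }
}
```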

Re 3):

Do we really need Strict? From what I understand, Strict would basically mean that whenever you follow a link to the wiki, you’ll be logged out on the first request, which sounds very bad for usability.
I’m fully +1 for setting the SameSite attribute to Lax.
But still, we would need to check if this could negatively impact any authenticator, e.g., when that authenticator uses cross-site POST requests in the authentication flow.
Chrome uses Lax as the default value but doesn’t enforce it for “fresh” cookies, which is apparently there to avoid breaking authentication flows. If you set the SameSite attribute explicitly, you don’t get this exception (and there is no value that gives you this behavior).
An explanation of the difference between Lax and Strict: What is the difference between SameSite="Lax" and SameSite="Strict"? - Stack Overflow
For XWiki, it would make sense to set it to Strict to prevent CSRF, as most actions are allowed from GET requests (which is really a bad practice; we absolutely should stop that), but again, I fear we cannot, as this would break quite a lot of behavior that people expect.
For example, you couldn’t open links to the intranet from your webmail anymore; or rather, you would always get an access denied error because no cookies would be sent with that request.
I believe in an ideal world you would have two login cookies: one that allows read access, and another that allows everything. You would set SameSite to Lax for the one that allows read access and to Strict for the other. That way, links work but you still prevent CSRF.
Now I fear that this would be hard to introduce in XWiki at this point, and it’s also still not a full protection. It could still improve security, though.
And I guess it could still break links to the wiki.
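
For reference, a sketch of how we could set the attributes today (hypothetical names, not existing code): the javax.servlet Cookie API predates SameSite, so one option is to write the Set-Cookie header ourselves.

```java
// Sketch: setting HttpOnly, SameSite=Lax and (on HTTPS) Secure by writing
// the Set-Cookie header directly, since javax.servlet.http.Cookie has no
// SameSite setter. Cookie name and value are illustrative.
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class CookieWriter {
    public static void set(HttpServletRequest request, HttpServletResponse response,
        String name, String value)
    {
        StringBuilder cookie = new StringBuilder(name).append('=').append(value)
            .append("; Path=/; HttpOnly; SameSite=Lax");
        if (request.isSecure()) {
            // Only mark the cookie Secure when it is actually set over HTTPS.
            cookie.append("; Secure");
        }
        response.addHeader("Set-Cookie", cookie.toString());
    }
}
```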

We should put secure on all cookies that are set in HTTPS requests. I think there is no use case for mixed http/https setups anymore.
That’s why, afaik, it’s a best practice to always set it when you’re setting cookies in an HTTPS request, so that session and login cookies can’t be exposed by a man-in-the-middle attack that forces the request to go over HTTP.
Though HSTS has basically the same impact and should also be used.
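
A sketch of the HSTS part (again, illustrative code): once set on an HTTPS response, browsers refuse to downgrade later requests to HTTP for the given duration, which closes the man-in-the-middle window mentioned above.

```java
// Sketch: sending Strict-Transport-Security on HTTPS responses.
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class HstsHeader {
    public static void apply(HttpServletRequest request, HttpServletResponse response) {
        if (request.isSecure()) {
            // One year, including subdomains; only enable once the whole domain serves HTTPS.
            response.setHeader("Strict-Transport-Security",
                "max-age=31536000; includeSubDomains");
        }
    }
}
```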

WDYT?

What are our next steps? How do we move forward?

Thanks

We also need to discuss HTTP Strict Transport Security — Wikipedia

And we should also discuss whether we need an X-XSRF token in addition to our existing CSRF code.