Today, for the German OpenDesk research project, I was asked some questions about HTTP headers and cookies related to security, since that project has some requirements in this area.
They mandate that the `Content-Security-Policy` HTTP header be set to a value “as restrictive as possible”. Right now we don’t seem to be setting this header. Should we set it, and what value should we use?
Another HTTP header that they’re asking us to set is `X-Content-Type-Options` (with the value `nosniff`). Same question: should we set it?
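For reference, this header is a one-liner; `nosniff` is its only defined value:

```
X-Content-Type-Options: nosniff
```

It tells browsers not to MIME-sniff responses, so e.g. a plain-text or image attachment can’t be reinterpreted and executed as script or HTML.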
They’re also asking that the following settings for cookies be set: `httponly`, `samesite` and `secure`. I started researching the cookies we set and what settings we have for them at Security (XWiki.org), but I didn’t mention `samesite` there. However, from what I see, we set `httponly` in only one place. For `secure`, sometimes we set it (only when HTTPS is used, btw) and sometimes we don’t. I didn’t see any place where we use `samesite`, though. So the question is the same: should we set these 3 parameters, with what values, and for all our cookies? The audit recommended a
Some answers from @MichaelHamann:
It’s also a nice way to prevent the execution of scripts and other dangerous content in SVG while allowing users to upload SVG.
I think at the moment, we don’t allow regular users to upload SVG that can be displayed in pages. If we allowed that, the one way I know about to make this secure is to use CSP headers.
Okay, actually I know about two ways: a) parse and clean the SVG and b) disallow dynamic content using a CSP.
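As an illustration of option b), user-uploaded SVGs could be served with a restrictive CSP along these lines (the exact directive values are a sketch, not a tested recommendation for XWiki):

```
Content-Security-Policy: default-src 'none'; style-src 'unsafe-inline'; sandbox
```

This blocks script execution and loading of external resources inside the SVG while still allowing the inline styles that SVGs commonly use, without having to parse and clean the file itself.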
I mean, our users expect that they can put `onclick` on any element in XWiki syntax (see this forum post), and any serious CSP would forbid this kind of inline script or require adding a hash of this script to the CSP header.
See also the discussion in CSP: script-src - HTTP | MDN
Disallowing inline scripts is the primary security win that you get from a content security policy, as it would prevent most XSS attacks, but XWiki’s users expect this kind of inline script.
My opinion on CSP: we should implement Dynamic Script Restrictions (Proposal.DynamicScriptRestrictions) - XWiki and as part of that add a restrictive CSP that forbids inline scripts unless they are secured by a nonce but make this feature optional. So people who care about the security of their XWiki installation can enable this feature and people who don’t rely on XWiki being secure but instead want to have the scripting possibilities they always had can turn it off.
Note that my proposal wouldn’t disable inline scripts; it would just make them a bit harder to use, as every inline `script` tag will need to have a nonce that is unique per request.
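To illustrate the nonce mechanism (the nonce value below is a placeholder; a real one must be freshly generated random data per response):

```
Content-Security-Policy: script-src 'nonce-R4nd0mV4lu3'; object-src 'none'; base-uri 'none'

<script nonce="R4nd0mV4lu3">/* this script runs */</script>
<script>/* no nonce: blocked by the policy */</script>
```

Only `script` tags carrying the current response’s nonce execute; injected markup can’t know the nonce, so injected scripts are blocked.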
And you won’t be able to use
So, all in all, this is about making XWiki syntax injection (which is currently exploitable both for remote code execution and for XSS) mostly harmless, which I think is important if you want to have a secure XWiki installation.
That sounds a lot less breaking I would hope.
We would need to test this but this could just work.
Do we really need strict? From what I understand, strict would basically mean that whenever you follow a link to the wiki, you’ll be logged out on the first request which sounds very bad for usability.
I’m fully +1 for setting the
But still, we would need to check if this could negatively impact any authenticator, e.g., when that authenticator uses cross-site POST requests in the authentication flow.
Browsers that apply `Lax` as the default value don’t enforce it for “fresh” cookies, which is apparently meant to avoid breaking authentication flows. If you explicitly set the `SameSite` attribute, you don’t get this exception (and there is no value that gives you this behavior).
An explanation of the difference between `Lax` and `Strict`: What is the difference between SameSite="Lax" and SameSite="Strict"? - Stack Overflow
For XWiki, it would make sense to set it to `Strict` to prevent CSRF, as most actions are allowed from GET requests (which is really a bad practice; we absolutely should stop that). But again, I fear we cannot, as this would break quite a lot of behavior that people expect.
For example, you couldn’t open links to the intranet from your webmail anymore; or rather, you would always get “access denied”, as no cookies would be sent with that request.
I believe in an ideal world you would have two login cookies: one that allows read access, and another that allows everything. You would set `Lax` for the one that allows read access and `Strict` for the other. That way, links work but you still prevent CSRF.
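A minimal sketch of this hypothetical two-cookie scheme (the cookie names and the helper method are illustrative, not existing XWiki API; real values would come from the session layer):

```java
// Sketch: emit two Set-Cookie header values with different SameSite policies.
public class TwoCookieSketch {

    // Build a Set-Cookie header value with the given SameSite policy.
    static String setCookieHeader(String name, String value, String sameSite) {
        return name + "=" + value + "; Path=/; Secure; HttpOnly; SameSite=" + sameSite;
    }

    public static void main(String[] args) {
        // Read-only cookie: Lax, so it is sent on top-level navigations and
        // following a link from an external site still shows the page.
        System.out.println(setCookieHeader("readSession", "abc123", "Lax"));
        // Full-privilege cookie: Strict, so cross-site requests never carry it
        // and cannot trigger state-changing actions.
        System.out.println(setCookieHeader("writeSession", "def456", "Strict"));
    }
}
```

The server would then accept write actions only when the `Strict` cookie is present, while page views work with the `Lax` one alone.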
Now I fear that this is hard to introduce in XWiki now and it’s also still not a full protection. Could still improve security, though.
And I guess it could still break links to the wiki.
We should put `secure` on all cookies that are set in HTTPS requests. I think there is no use case for mixed HTTP/HTTPS setups anymore.
That’s why, afaik, it’s a best practice to always set this when you’re setting cookies in an HTTPS request, to prevent session and login cookies from being exposed by a man-in-the-middle attack that downgrades the request to HTTP.
Though HSTS has basically the same impact and should also be used.
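For completeness, a typical HSTS header looks like this (the `max-age` here is a common one-year value, shown only as an example):

```
Strict-Transport-Security: max-age=31536000; includeSubDomains
```

Once a browser has seen this header on an HTTPS response, it upgrades all future requests to the domain to HTTPS before they leave the machine, which closes the same downgrade window as the `secure` cookie flag.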
What are our next steps? How do we move forward?