SonarCloud Issues Review

Introduction

A while ago I started reviewing SonarCloud issues, mostly for xwiki-rendering and xwiki-commons, since xwiki-platform had not been analyzed for a long time due to a Java version incompatibility.

First, let me introduce the SonarSource tools. SonarSource provides SonarQube, SonarCloud and SonarLint, three tools aiming to ensure better code quality, code security and consistency across a codebase.

SonarLint is an IDE plugin: it accompanies developers while they code by raising warnings live, about bugs, errors and bad practices. Using it prevents most of the issues you, as a developer, could introduce in the code.

SonarQube, like SonarLint, relies on a set of rules for each language, designed to report as many issues as possible. Unlike SonarLint, however, SonarQube runs its analysis at “build time”. What “build time” means depends on the language, but the point is that the developer has to trigger the analysis, or include it in CI, and then retrieve the reports from their own SonarQube instance.
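For a Maven project like XWiki, triggering such an analysis typically looks like this (a minimal sketch, assuming the standard sonar-maven-plugin is available; the URL and token are placeholders):

```
mvn sonar:sonar -Dsonar.host.url=http://localhost:9000 -Dsonar.login=<your-token>
```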

Finally, SonarCloud is essentially SonarQube as a service. It reports the same issues as SonarQube but provides automatic analysis in place: it is easy to set up and, by default, runs as an automatic push/pull request workflow action.

Rules

SonarCloud reports issues as Bugs, Vulnerabilities and Code Smells, in addition to Security Hotspots, and it also displays an aggregation of the test coverage reported by other tools in a repository.

The interesting thing about SonarCloud, and the other SonarSource tools, is that the rules used to report issues are based on well-known security references such as the CVE® Program, the CWE™ list, the SANS Top 25 software weaknesses and the OWASP Top Ten application security risks. In addition, SonarCloud wraps reported issues in an in-house classification.

That said, the XWiki project has triggered a fair amount of issues over the years, so here are the most frequently triggered issue types, along with their relevance and their criticality.

Triggered OWASP Rules Ranking

  1. A4 - XML External Entities (XXE) - Mostly concerns xwiki-commons and XML parsers (see the sketch after this list).
  2. A3 - Sensitive Data Exposure - Mostly concerns loggers. These triggers are mostly seen in test code, so it is not a big deal. The point here is that, whenever possible, logging should go through a Logger; some stack trace printing remains in xwiki-platform.
  3. A1 - Injection - In fact, it is mostly about logging unchecked user-controlled data, which could cause problems when the logs are parsed.
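To illustrate the XXE point, here is a minimal sketch of a hardened DOM parser factory in Java (a hypothetical helper class, not XWiki code; the feature URLs are the standard Xerces ones):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public final class SafeXmlFactory
{
    private SafeXmlFactory()
    {
    }

    public static DocumentBuilderFactory create() throws ParserConfigurationException
    {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        // Disallowing DOCTYPE declarations entirely is the strongest XXE defense.
        factory.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);
        // Belt and braces: also disable external entity resolution and XInclude.
        factory.setFeature("http://xml.org/sax/features/external-general-entities", false);
        factory.setFeature("http://xml.org/sax/features/external-parameter-entities", false);
        factory.setXIncludeAware(false);
        factory.setExpandEntityReferences(false);
        return factory;
    }
}
```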

In xwiki-platform, most of the reported issues involve user-controlled but not necessarily checked data. xwiki-platform also has a lot of pending issues in old code, and these issues and code smells are not relevant, especially for some of that old code. Given that Sonar cannot analyze code based on Java 8, xwiki-platform has to be updated to benefit from the latest SonarCloud rules and get pertinent results.
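Coming back to the injection point above, here is a minimal sketch of defensive logging, assuming SLF4J (which XWiki uses): a Logger replaces printStackTrace(), messages are parameterized, and user-controlled text is stripped of line breaks so it cannot forge extra log entries:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoginAction
{
    private static final Logger LOGGER = LoggerFactory.getLogger(LoginAction.class);

    public void onFailure(String userName, Exception cause)
    {
        // Strip CR/LF from user-controlled data so it cannot inject fake log entries.
        String safeName = userName.replaceAll("[\r\n]", "_");

        // Parameterized logging through a Logger, instead of cause.printStackTrace().
        LOGGER.warn("Login failed for user [{}]", safeName, cause);
    }
}
```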

Triggered Rules Ranking (xwiki-platform is not included here)

  1. Null pointers should not be dereferenced/accessed
  2. Non-primitive fields should not be “volatile”
  3. Resources should be closed
  4. Only one method invocation is expected when testing checked exceptions
  5. XML parsers should not be vulnerable to XXE attacks
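The first three rules all have simple idiomatic fixes; a minimal sketch (hypothetical class, not taken from the XWiki code base):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Objects;
import java.util.concurrent.atomic.AtomicReference;

public class RuleExamples
{
    // Rule 2: "volatile" on a non-primitive only makes the reference visible,
    // not the object's state; an AtomicReference (or proper synchronization) is safer.
    private final AtomicReference<String> status = new AtomicReference<>("idle");

    // Rule 1: guard against null before dereferencing.
    public int length(String value)
    {
        return Objects.requireNonNull(value, "value must not be null").length();
    }

    // Rule 3: try-with-resources guarantees the reader is closed on every path.
    public String firstLine(Path file) throws IOException
    {
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            return reader.readLine();
        }
    }
}
```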

Most of these issues could be avoided at development time by using at least the SonarLint extension, which is available for almost every popular Java IDE.

CWE

  1. CWE-477 Use of Obsolete Function
  2. CWE-493 Critical Public Variable Without Final Modifier
  3. CWE-611 Improper Restriction of XML External Entity Reference
  4. CWE-827 Improper Control of Document Type Definition
  5. CWE-582 Array Declared Public, Final, and Static
  6. CWE-607 Public Static Final Field References Mutable Object
  7. CWE-754 Improper Check for Unusual or Exceptional Conditions
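CWE-582 and CWE-607 in particular have a straightforward idiomatic fix; a minimal sketch (hypothetical class, kept Java 8 compatible):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public final class Constants
{
    // CWE-582/607: a public static final array is still mutable, since callers
    // can overwrite its elements. Expose an unmodifiable view instead.
    public static final List<String> ROLES =
        Collections.unmodifiableList(Arrays.asList("admin", "user"));

    private Constants()
    {
    }
}
```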

Conclusion

I doubt most of these categories are really a big deal. A lot of them have been reported as blockers, but if the XML External Entities issues were fixed, few truly blocking issues would remain.
Most of the rules are set as blockers but often produce false positives, because the purpose of the code often implies not following them; particularly in xwiki-platform:

  • Lack of assertions in test cases
  • Clone method overriding
  • Add an end condition to this loop

If these rules were removed, most of the reported issues would disappear too, and this can be done given their irrelevance.
Also, a large share of the critical and blocker issues, if not removed, could be reclassified as code smells. If we do not want code smells either, their rules could be removed from the analysis profile.

However, given the number of issues reported about them, some work has to be done on user-controlled data and XML parsing to ensure security.

The most critical points in my opinion are:

  • Checking user-controlled data and related problems (XSS, injection, …)
  • XML parsing
  • RegEx evaluation (see the sketch below)
  • Thread-safe code

These topics should be covered by blocker rules, the others by a less important classification.
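On the RegEx point, a minimal sketch of what to watch for (hypothetical code; the vulnerable pattern is only shown in a comment):

```java
import java.util.regex.Pattern;

public class RegexSafety
{
    // A nested quantifier such as "(a+)+b" can backtrack exponentially on
    // crafted input (ReDoS). Prefer an equivalent unambiguous pattern.
    private static final Pattern SAFE = Pattern.compile("a+b");

    // When user-controlled text must appear inside a pattern, quote it so it
    // is matched literally instead of being interpreted as regex syntax.
    public static Pattern contains(String userInput)
    {
        return Pattern.compile(".*" + Pattern.quote(userInput) + ".*");
    }
}
```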

Note: SonarLint actually uses the analyses provided by SonarQube and brings them to your IDE.

Thanks @gcoquard for this thread.

Is it a proposal? It’s not very clear to me what you are proposing, could you rephrase it as a proposal maybe?

ATM we fail our CI whenever a blocker issue exists for new code, for the following rules: SonarCloud

See Start breaking the CI job when the SonarQube quality gate fails

Are you proposing to add or remove rules from this list?

Thanks

Hi,

Since I was considering both critical and blocker issues, my conclusions were more global.
So yes, I was proposing to remove some rules, but your comment made me reconsider the set of 41 rules for blocker issues, and there are actually no rules to remove right now.
The issues concerned might be a bit old, and I will keep an eye on future development and Sonar reports to decide whether they are really annoying or not.

However, I will look into creating Jira issues that wrap some of the Sonar ones, to achieve concrete results from this review.

Thanks for your reaction, I agree it needed a bit of clarification.