Gradle and SonarQube

If you are like me and like developing using a Test Driven Development (TDD) approach, then you need the ability to examine your code and test coverage. In a past article, I discussed how to use SonarQube to perform Python code inspection, which lets you see your test coverage. In this article, I’ll look at how to set up Gradle to talk to SonarQube.

The SonarRunner

SonarQube has a companion tool called the SonarRunner. The runner performs the actual analysis of your software, and then posts the results to the SonarQube database. One of the nice things about the SonarRunner is that you can add it to just about any project that can be analyzed by SonarQube – regardless of whatever other lifecycle or build tools you may use. All you need to do is add a sonar-project.properties file with the required information.

One problem, however, is that the SonarRunner is yet another tool needed during the build process. If you maintain a continuous integration environment where, say, your nightly builds include an inspection run, then you know the pain of having to maintain yet another standalone binary package. Luckily, Gradle has an incubating SonarRunner plugin that is stable enough to use.

Setting Up the Gradle Plugin

The SonarRunner plugin has been included in most recent distributions of Gradle. Applying the plugin requires one line:

apply plugin: 'sonar-runner'

In order to get the runner talking to your local installation of Sonar, you will also need a sonarRunner configuration section in your build.gradle file (note – if you have an older version of Gradle this may not work for you, since older versions used slightly different properties). A minimal set of configuration properties is as follows:

sonarRunner {
    sonarProperties {
        property "sonar.host.url", "http://10.0.0.60:9000"
        property "sonar.jdbc.url", "jdbc:postgresql://10.0.0.60:5432/sonar"
        property "sonar.jdbc.driverClassName", "org.postgresql.Driver"
        property "sonar.jdbc.username", "mySonarUsername"
        property "sonar.jdbc.password", "mySonarPassword"
        property "sonar.projectKey", "MyJavaProject"
        property "sonar.projectName", "My Java Project"
        property "sonar.projectVersion", android.defaultConfig.versionName
        property "sonar.language", "java"
        property "sonar.sources", "src/main"
        property "sonar.binaries", "build"
    }
}

Here are some of the settings in more detail:

  • sonar.host.url – this is the URL to your Sonar host. Note that if you are trying to use a host that is protected by SSL, you will need some additional configuration information.
  • sonar.jdbc.url – this is the JDBC (database) URL. In my case, I was using a PostgreSQL server, so I used the postgresql JDBC syntax. The 10.0.0.60 is my internal IP address, and 5432 is the port that the database listens on. Note that in the case of PostgreSQL, you need to explicitly set the addresses to listen on (listen_addresses) in your postgresql.conf file, otherwise it only accepts connections from localhost.
  • sonar.jdbc.driverClassName – the JDBC driver class to use for the database connection.
  • sonar.jdbc.username – this is the username used to access the database – not a username that you set up to access the front-end application (if you have authentication for SonarQube turned on).
  • sonar.jdbc.password – this is the password used to access the database – again, not a front-end application password.
  • sonar.projectKey – a unique name used to identify the project.
  • sonar.projectName – the name of the project, more readable for us humans.
  • sonar.projectVersion – the version of the project. In this instance, my build file also has an android configuration block describing my project, so instead of duplicating the version number, I simply reuse android.defaultConfig.versionName here.
  • sonar.language – tells Sonar what language you are analyzing.
  • sonar.sources – where the source files for the project are located. Make sure you keep this path separate from your test cases, otherwise you will have your tests included in the analysis.
  • sonar.binaries – where build artifacts are located.

Configuring JaCoCo

While the above configuration will get you up and running, you will be missing coverage information from your unit tests. To generate coverage reports, you will also need to apply the JaCoCo plugin:

apply plugin: 'jacoco'

Then, you need to set two more Sonar Runner properties as follows (add them below sonar.binaries):

property 'sonar.jacoco.reportPath', "${buildDir}/jacoco/testDebug.exec"
property 'sonar.junit.reportsPath', "${buildDir}/test-results"

You may have to adjust these depending on where JaCoCo is actually putting your results. Look for the .exec file and the test-results paths after you run a build.
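If you are not sure where JaCoCo writes its output, a small script can locate candidate report files under the build directory. This is just a convenience sketch – the build path and file suffix are assumptions you may need to adjust for your project layout:

```python
import os

def find_report_files(build_dir, suffix=".exec"):
    """Walk build_dir and collect any files ending with the given suffix."""
    matches = []
    for root, _dirs, files in os.walk(build_dir):
        for name in files:
            if name.endswith(suffix):
                matches.append(os.path.join(root, name))
    return sorted(matches)

# For example, after a build you might run:
#   find_report_files("build")           # locate the JaCoCo .exec file
#   find_report_files("build", ".xml")   # locate the XML test results
```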

All Together

Putting it all together:

apply plugin: 'sonar-runner'
apply plugin: 'jacoco'
 
sonarRunner {
    sonarProperties {
        property "sonar.host.url", "http://10.0.0.60:9000"
        property "sonar.jdbc.url", "jdbc:postgresql://10.0.0.60:5432/sonar"
        property "sonar.jdbc.driverClassName", "org.postgresql.Driver"
        property "sonar.jdbc.username", "mySonarUsername"
        property "sonar.jdbc.password", "mySonarPassword"
        property "sonar.projectKey", "MyJavaProject"
        property "sonar.projectName", "My Java Project"
        property "sonar.projectVersion", android.defaultConfig.versionName
        property "sonar.language", "java"
        property "sonar.sources", "src/main"
        property "sonar.binaries", "build"
        property 'sonar.jacoco.reportPath', "${buildDir}/jacoco/testDebug.exec"
        property 'sonar.junit.reportsPath', "${buildDir}/test-results"
    }
}

You can now verify that Gradle is configured correctly by looking at the available tasks:

./gradlew tasks

You should see a new task called sonarRunner:

sonarRunner - Analyzes project ':app' and its subprojects with Sonar Runner.

To generate reports, run the following:

./gradlew clean test sonarRunner

Gradle will clean the project, rebuild and run the unit tests. It will then perform a Sonar analysis, and post the results to your SonarQube instance. If everything is successful, you should see output similar to the following from Gradle:

17:30:40.937 INFO  - Sensor JaCoCoSensor...
17:30:40.965 INFO  - Analysing /home/thomas/AndroidStudioProjects/MyJavaProject/app/build/jacoco/testDebug.exec
17:30:41.598 INFO  - No information about coverage per test.
17:30:41.598 INFO  - Sensor JaCoCoSensor done: 661 ms
17:30:42.955 INFO  - Execute decorators...
17:30:45.010 INFO  - Store results in database
17:30:45.133 INFO  - ANALYSIS SUCCESSFUL, you can browse http://10.0.0.60:9000/dashboard/index/MyJavaProject
17:30:45.205 INFO  - Executing post-job class org.sonar.plugins.core.issue.notification.SendIssueNotificationsPostJob
17:30:45.210 INFO  - Executing post-job class org.sonar.plugins.core.batch.IndexProjectPostJob
17:30:45.266 INFO  - Executing post-job class org.sonar.plugins.dbcleaner.ProjectPurgePostJob
17:30:45.289 INFO  - -> Keep one snapshot per day between 2014-09-19 and 2014-10-16
17:30:45.291 INFO  - -> Keep one snapshot per week between 2013-10-18 and 2014-09-19
17:30:45.292 INFO  - -> Keep one snapshot per month between 2009-10-23 and 2013-10-18
17:30:45.294 INFO  - -> Delete data prior to: 2009-10-23
17:30:45.302 INFO  - -> Clean MyJavaProject [id=160]
17:30:45.309 INFO  - Clean snapshot 5368
 
BUILD SUCCESSFUL

And of course, browsing to your Sonar instance should reveal some good information about your project.

Summary

In this post, I discussed how to configure Gradle to run a Sonar analysis on your code, and post the results back to your SonarQube instance. In a future post, I’ll look at more of the SonarQube analysis results, and talk about common fixes for various problems.

Python Code Inspection with SonarQube

Last time, I wrote about setting up Travis CI to effortlessly perform continuous integration and testing. My next step was to determine what was actually being tested in my Python Chip 8 emulator, and improve upon areas that had insufficient tests to cover them. While code coverage isn’t the best metric for code quality, it does at least provide visibility on what you are not actually testing.

SonarQube

A good tool that allows you to inspect your code is SonarQube (previously just called Sonar). Available as a standalone server, SonarQube gets you up and analyzing code in minutes. By default, the out-of-the-box configuration provides an H2 on-disk database that isn’t rated for production, but doesn’t require any external dependencies (like PostgreSQL or MySQL). To get it up and running, simply unzip it and run it. On Linux this looks like:

unzip sonarqube-4.3.zip
./sonarqube-4.3/bin/linux-x86-64/sonar.sh start

This will start SonarQube on your local machine, with the server listening on port 9000. To get to it, simply browse to http://localhost:9000 in your web browser. When you first visit the link, there will be no projects loaded. To load your project into SonarQube, you will need one additional piece of software – the Sonar Runner.

Sonar Runner

The Sonar Runner is responsible for analyzing your project and posting the results to SonarQube. In order for the Sonar Runner to know what to do, each project needs a file called sonar-project.properties (unless you are using a Maven module – but that’s for a future post!). The properties file stores some pretty standard stuff:

sonar.projectKey=chip8:python
sonar.projectName=Chip8 Python
sonar.projectVersion=1.0
sonar.sources=chip8
sonar.tests=test
sonar.language=py
sonar.sourceEncoding=UTF-8
sonar.python.xunit.reportPath=nosetests.xml
sonar.python.coverage.reportPath=coverage.xml
sonar.python.coveragePlugin=cobertura

The sonar.project* properties should be fairly self-explanatory. The sonar.sources property tells the Sonar Runner where the source files for the project are located, and sonar.tests tells it where the unit tests are located. The last three lines tell the Sonar Runner where to find the coverage reports, and what format they are in. Note that the sonar.sources and sonar.tests properties need to point to different sub-directories. If you keep them in the same directory, you will get errors such as:

ERROR: Error during Sonar runner execution
ERROR: Unable to execute Sonar
ERROR: Caused by: File [relative=chip8/screen.py, abs=/export/disk1/emulators/python/chip8/chip8/screen.py] can't be indexed twice. Please check that inclusion/exclusion patterns produce disjoint sets for main and test files
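You can catch this before invoking the runner with a quick sanity check that the two directories are disjoint. The directory names below mirror the properties above, but any paths will work:

```python
import os

def paths_are_disjoint(sources_dir, tests_dir):
    """Return True if neither directory is the same as, or nested inside, the other."""
    src = os.path.abspath(sources_dir)
    tst = os.path.abspath(tests_dir)
    return not (src == tst
                or src.startswith(tst + os.sep)
                or tst.startswith(src + os.sep))

# paths_are_disjoint("chip8", "test")        -> True  (safe to index)
# paths_are_disjoint("chip8", "chip8/test")  -> False (files indexed twice)
```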

Nosetests and Coverage

Okay, so with the Sonar Runner configured, we need one more tool. The Sonar Runner by itself will not run the unit tests or gather coverage information. For that we use Nose, an advanced test runner, together with the coverage tool. Both can easily be installed with pip:

pip install nose
pip install coverage

Once installed, you use nosetests to run the unit tests and generate coverage information for the source code. The following line runs the test runner, gathers coverage information, and generates an XML report that the Sonar Runner will use:

nosetests --with-coverage --cover-package=chip8 --cover-branches --cover-xml

Note the --cover-package option. This restricts coverage measurement to the chip8 package – without it, every Python source file imported during the test run will be included in the coverage report.
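For reference, nose discovers tests by naming convention: files, classes, and methods whose names start with "test" are picked up automatically. A minimal, hypothetical test module living in the test directory might look like this (the real suite exercises the chip8 package instead):

```python
# test/test_screen.py -- a minimal, hypothetical unit test that nose discovers
# automatically because the file and method names start with "test".
import unittest

class TestScreen(unittest.TestCase):
    def test_screen_starts_blank(self):
        # Placeholder logic; a real test would import and exercise chip8.screen
        screen = [[0] * 64 for _ in range(32)]
        self.assertTrue(all(pixel == 0 for row in screen for pixel in row))
```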

Putting It All Together

With SonarQube, Sonar Runner, and Nose, you are now ready to start inspecting your code. A typical session would be to make some changes to a source file, then run the following:

nosetests --with-coverage --cover-package=chip8 --cover-branches --cover-xml
sed -i 's/filename="/filename=".\//g' coverage.xml
sonar-runner

The nosetests line we have seen before. But what’s with the sed command? As I discovered, it’s a work-around to making the Sonar Runner properly identify the file names in the coverage.xml file. Without it, the Sonar Runner will discard the coverage metrics for the files in the chip8 package (see here and here for more information and discussion). Finally, the sonar-runner command will execute the runner and post the results. Once again visit http://localhost:9000 to see changes to your project.
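If sed is not available (on Windows, for example), the same fix-up can be done with a couple of lines of Python. This sketch performs the identical transformation, prefixing each filename attribute in coverage.xml with ./ :

```python
def fix_coverage_paths(xml_text):
    """Prefix every filename="..." attribute with ./ so the Sonar Runner
    resolves the paths correctly (same effect as the sed one-liner)."""
    return xml_text.replace('filename="', 'filename="./')

# To apply it to coverage.xml in place:
#   with open("coverage.xml") as f:
#       text = f.read()
#   with open("coverage.xml", "w") as f:
#       f.write(fix_coverage_paths(text))
```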

Interpreting the Results

As I mentioned before, testing using coverage as the guiding principle isn’t the best way to ensure you’ve tested everything. For a good run-down of why this is the case, check out Ned Batchelder’s talk at PyCon 2009 called Coverage testing, the good and the bad.

As a blunt tool, SonarQube can at least tell you what you’re not testing. On the dashboard for the project, you can see metrics related to the unit test coverage and the unit test success:

[image: unit_test_coverage]

Clicking on the unit tests coverage report will display the coverage breakdown per module. Clicking on a module will provide coverage details for that file. For example, checking out the CPU code for the Python emulator, we see that the code coverage is 69.6%:

[image: coverage-general]

The green lines at the left hand side of the code listing represent lines of code that have been covered by running the unit test suite. Scrolling down a little in the code, the functions execute_instruction and execute_logical_instruction were not tested by the unit test suite. SonarQube nicely highlights these areas in pink so that we can quickly see what we’re not testing:

[image: missing-coverage]

Now it’s up to me to go back and write tests to ensure that those functions are working as expected. Then, I can re-run the test suite, and perform another SonarQube analysis on the code.

Conclusion

SonarQube and the Sonar Runner provide a simple and effective way to inspect what your unit tests are actually testing with only a few extra packages. This only scratches the surface of what SonarQube can actually do. In a future post, I will examine some of the other SonarQube metrics, and how they can help improve code quality.