
Open Source

Jenkins World 2016 Wrap-up - Ask the Experts & Demos

This is a guest post by Liam Newman, Technical Evangelist at CloudBees. As I mentioned in my previous post, Jenkins World brought together Jenkins users from organizations of all sizes. It also brought together Jenkins users of all skill levels, from beginners to experts (including JAM organizers, board members, and long-time contributors). A number of those experts also volunteered to staff the Open Source Hub’s "Ask the Experts" desk throughout the conference to answer Jenkins questions. This included, but was not limited to: Paul Allen, R Tyler Croy, James Dumay, Jesse Glick, Eddú Meléndez Gonzales, Jon Hermansen, Owen Mehegan, Oleg Nenashev, Liam Newman, Christopher Orr, Casey Vega, Mark Waite, Dean Yu, and Keith Zantow. I actually chose to spend the majority of...
Categories: Open Source

Jenkins World 2016 Wrap-up - Scaling

This is a guest post by Liam Newman, Technical Evangelist at CloudBees. One of the great features of Jenkins is how far it can scale, not only from a software perspective, but also from an organizational one. From a single Jenkins master with one or two agents to multiple masters with thousands of agents, and from a team of only a few people to a whole company with multiple disparate departments and organizations, you’ll find Jenkins in use. Like any software or organization, there are common challenges for increasing scale with Jenkins and some common best practices, but there are also some unique solutions. A big...
Categories: Open Source

Jenkins World 2016 Wrap-up - Pipeline

This is a guest post by Liam Newman, Technical Evangelist at CloudBees. As someone who has managed Jenkins for years and manually managed jobs, I think pipeline is fantastic. I spent much of the conference manning the Ask the Experts desk of the "Open Source Hub" and was glad to find I was not alone in that sentiment. The questions were not "Why should I use Pipeline?", but "How do I do this in Pipeline?" Everyone was interested in showing what they have been able to accomplish, learning about best practices, and seeing what new features were on the horizon. The sessions and demos on Pipeline that I saw were...
Categories: Open Source

NUnit-Summary Becoming an “Official” NUnit Application - Thu, 09/22/2016 - 23:39

NUnit-Summary is an “extra” that I’ve maintained personally for some time. It uses built-in or user-supplied transforms to produce summary reports based on the results of NUnit tests.

I have contributed it to the NUnit project and we’re working on updating it to recognize NUnit 3 test results. The program has never had a 1.0 release, but we expect to produce one soon.

This old post talks about the original nunit-summary program.

Categories: Open Source

An Engine Extension for Running Failed Tests – Part 1: Creating the Extension - Thu, 09/22/2016 - 20:47

In a recent online discussion, one of our users talked about needing to re-run the NUnit console runner, executing just the failed tests from the previous run. This isn’t a feature in NUnit but it could be useful to some people. So… can we do this by creating an Engine Extension? Let’s give it a try!

The NUnit Test Engine supports extensions. In this case, we’re talking about a Result Writer extension: one that takes the output of a test run from NUnit and creates an output file in a particular format. Here, we want the output to be a text file with each line holding the full name of a failed test case. Why that format? Because it’s exactly the format the console runner already recognizes for the --testlist option, so the file we create can serve as input to a subsequent test run.
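To make the format concrete, here is a hypothetical FailedTests.lst with two made-up test names, followed by the kind of command that would consume it:

```shell
# Hypothetical contents of FailedTests.lst after a run with two failures:
cat > FailedTests.lst <<'EOF'
MyTests.CalculatorTests.DivideByZero
MyTests.ParserTests.HandlesEmptyInput
EOF

# A later run can then be restricted to exactly those tests:
# nunit3-console mytests.dll --testlist=FailedTests.lst
```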

Information about how to write an extension can be found on the Writing Engine Extensions page of the NUnit documentation. Details of creating a ResultWriter extension can be found on the Result Writers page.

To get started, I created a new class library project called failed-tests-writer. I made sure that it targeted .NET 2.0, because that allows it to run under the widest range of runtime versions, and I added a package reference to the NUnit.Engine.Api package. That package will be published with the release of NUnit 3.5. Since that’s not out yet, I used the latest pre-release version from the NUnit project MyGet feed, adding the feed to my NuGet package sources.

Next, I created a class to implement the extension. I called it FailedTestsWriter. I added using statements for NUnit.Engine and NUnit.Engine.Extensibility and implemented the IResultWriter interface. I gave my class Extension and ExtensionProperty attributes. Here is what it looked like when I was done.

using System;
using System.IO;
using System.Text;
using System.Xml;
using NUnit.Engine;
using NUnit.Engine.Extensibility;

namespace EngineExtensions
{
    [Extension, ExtensionProperty("Format", "failedtests")]
    public class FailedTestsWriter : IResultWriter
    {
        public void CheckWritability(string outputPath)
        {
            // Throws if the output path is not writable; the empty using
            // block just ensures the writer is closed again.
            using (new StreamWriter(outputPath, false, Encoding.UTF8)) { }
        }

        public void WriteResultFile(XmlNode resultNode, string outputPath)
        {
            using (var writer = new StreamWriter(outputPath, false, Encoding.UTF8))
                WriteResultFile(resultNode, writer);
        }

        public void WriteResultFile(XmlNode resultNode, TextWriter writer)
        {
            // Write the full name of each failed test case, one per line.
            foreach (XmlNode node in resultNode.SelectNodes("//test-case[@result='Failed']"))
                writer.WriteLine(node.Attributes["fullname"].Value);
        }
    }
}

The Extension attribute marks the class as an extension. In this case, as in most cases, it’s not necessary to add any arguments: the Engine can deduce how the extension should be used from the fact that it implements IResultWriter.

As explained on the Result Writers page, this type of extension requires use of the ExtensionPropertyAttribute so that NUnit knows the name of the format it implements. In this case, I chose to use “failedtests” as the format name.

The CheckWritability method is required to throw an exception if the provided output path is not writable. We do that very simply by trying to create a StreamWriter. The empty using statement is merely an easy way to ensure that the writer is closed.

The main point of the extension is accomplished in the second WriteResultFile method. A foreach statement selects each failing test, which is then written to the output file.

Testing the Extension

That explains how to write the extension. In Part 2, I’ll explain how to deploy it. Meanwhile, I’ll tell you how I tested my extension in its own solution, using nunit3-console.

First, I installed the NUnit.ConsoleRunner package, version 3.4.1. Next, I created a fake package subdirectory in my packages folder.


Note that the new extension “package” directory name must start with “NUnit.Extension.” in order to trick the console-runner and engine into using it.
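My layout looked something like this (the extension directory and DLL names here are hypothetical; only the NUnit.Extension. prefix matters for discovery):

```
packages/
├── NUnit.ConsoleRunner.3.4.1/
│   └── tools/
│       └── nunit3-console.exe
└── NUnit.Extension.FailedTestsWriter/
    └── tools/
        └── failed-tests-writer.dll
```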

With this structure in place, I was able to run the console with the --list-extensions option to see that my extension was installed and I could use a command like

nunit3-console mytests.dll "--result:FailedTests.lst;format=failedtests"

to actually produce the required output.

Categories: Open Source

New Website

Watir - Web Application Testing in Ruby - Thu, 09/22/2016 - 16:43


Recently, we updated the Watir website to be hosted on GitHub Pages. The existing sites will no longer be maintained, but most if not all of the information currently on them has been transferred over to the new site. The old addresses now redirect, or soon will redirect, to the new one.

If you think something is missing or have any comments or feedback, make an issue on the site’s GitHub page or join us on our Slack channel.

Categories: Open Source

Back to Blogging! - Thu, 09/22/2016 - 02:50

My blog has been offline for a long time, as you can see. The last prior post was in 2009!

Recently, I found a backup copy of the old blog and was able to re-establish it. Watch for some new posts in the near future.

Categories: Open Source

Jenkins World 2016 Wrap-up - Introduction

This is a guest post by Liam Newman, Technical Evangelist at CloudBees. That’s a Wrap! Any way you look at it, last week’s Jenkins World Conference 2016 was a huge success. In 2011, a few hundred users gathered in San Francisco for the first "Jenkins User Conference". Over successive years, this grew into several yearly regional Jenkins user conferences. This year, over 1,300 people came from around the world to "Jenkins World 2016", the first global event for the Jenkins community. This year’s Jenkins World conference included: Keynote presentation by Jenkins creator, Kohsuke Kawaguchi, announcing a number of great new Jenkins project features, such as "Blue Ocean". More than 50...
Categories: Open Source

Jenkins Online Meetup report. Plugin Development - WebUI

On September 6th we had a Jenkins Online Meetup. This meetup was the second event in the series of Plugin Development meet ups. At this meetup we were talking about Jenkins Web UI development. Talks 1) Classic Jenkins UI framework - Daniel Beck In the first part of his talk, Daniel presented how Stapler, the web framework used in Jenkins, works, and how you can add to the set of URLs handled by Jenkins. In the second part he was talking about creating new views using Jelly and Groovy, and how to add new content to existing views. Keywords: Stapler, Jelly, Groovy-defined UIs 2) Developing modern Jenkins UIs with Javascript - Tom Fennelly Feel...
Categories: Open Source

Announcing the Blue Ocean beta, Declarative Pipeline and Pipeline Editor

At Jenkins World on Wednesday 14th of September, the Jenkins project was happy to introduce the beta release of Blue Ocean. Blue Ocean is the new user experience for Jenkins, built from the ground up to take advantage of Jenkins Pipeline. It is an entire rethink of the way that modern developers will use Jenkins. Blue Ocean is available today via the Jenkins Update Center for Jenkins users running 2.7.1 and above. Get the beta Just search for BlueOcean beta in the Update Center, install it, browse to the dashboard, and then click the Try BlueOcean UI button on the dashboard. What’s included? Back in April we open sourced...
Categories: Open Source

Take the 2016 Jenkins Survey!

This is a guest post by Brian Dawson on behalf of CloudBees, where he works as a DevOps Evangelist responsible for developing and sharing continuous delivery and DevOps best practices. He also serves as the CloudBees Product Marketing Manager for Jenkins. Once again it’s that time of year when CloudBees sponsors the Jenkins Community Survey to assist the community with gathering objective insights into how Jenkins is being used and what users would like to see in the Jenkins project. Your personal information (name, email address and company) will NOT be used by CloudBees for sales or marketing. As an added incentive to take the survey, CloudBees will enter participants into a...
Categories: Open Source

We Are Adjusting Rules Severities

Sonar - Thu, 09/08/2016 - 09:31

With the release of SonarQube 5.6, we introduced the SonarQube Quality Model, which pulls Bugs and Vulnerabilities out into separate categories to give them the prominence they deserve. Now we’re tackling the other half of the job: “sane-itizing” rule severities, because not every bug is Critical.

Before the SonarQube Quality Model, we had no way of bringing attention to bugs and security vulnerabilities except to give them high severity ratings. So all rules with a Blocker or Critical severity were related to reliability (bugs) or security (vulnerabilities), and vice versa as a tautology. That made sense before the SonarQube Quality Model, but it doesn’t now. Now, just being a Bug is enough to draw the right attention to an issue. Now, having every Bug or Vulnerability at the Blocker or Critical level is actually a distraction.

So we’re fixing it. We’ve reclassified the severity on every single rule specification in the RSpec repository. The changes to existing reliability/bug rules are reflected in version 4.2 of the Java plugin, and future releases of Java and other languages should reflect the rest of the necessary changes. In some cases, the changes are significant (perhaps even startling), so it makes sense to explain the thinking.

The first thing to know is that the reclassifications are done based on a truth table:

  Severity   Impact   Likelihood
  Blocker    high     high
  Critical   high     low
  Major      low      high
  Minor      low      low

For each rule, we first asked ourselves: What’s the worst thing that can reasonably happen as a result of an issue raised by this rule, factoring in Murphy’s Law without predicting Armageddon?

With the worst thing in mind, the rest is easy. For bugs we evaluate impact and likelihood with these questions:
Impact: Will the “worst thing” take down the application (either immediately or eventually), or corrupt stored data? If the answer is “yes”, impact is high.
Likelihood: What is the probability the worst will happen?

For vulnerabilities, the questions are:
Impact: Could the exploitation of the vulnerability result in significant damage to your assets or your users?
Likelihood: What is the probability a hacker will be able to exploit the issue?

And for code smells:
Impact: Could the code smell lead a maintainer to introduce a bug?
Likelihood: What is the probability the worst will happen?
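Read as code, the truth table amounts to a simple lookup. A minimal sketch (the function name and shape are mine, purely to illustrate the mapping):

```shell
# Map an impact/likelihood pair to a rule severity, per the truth table.
severity() {
  case "$1,$2" in
    high,high) echo "Blocker" ;;
    high,low)  echo "Critical" ;;
    low,high)  echo "Major" ;;
    low,low)   echo "Minor" ;;
  esac
}

severity high low   # prints "Critical"
```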

That’s it. Rule severities are now transparent and easy to understand. And as these changes roll out in new versions of the language plugins, severity inflation should quickly become a thing of the past!

Categories: Open Source

Continuous Delivery of Infrastructure with Jenkins

This is a guest post by Jenkins World speaker R Tyler Croy, infrastructure maintainer for the Jenkins project. I don’t think I have ever met a tools, infrastructure, or operations team that did not have a ton of work to do. The Jenkins project’s infrastructure "team" is no different; too much work, not enough time. In lieu of hiring more people, which isn’t always an option, I have found heavy automation and continuous delivery pipelines to be two solutions within reach of the over-worked infrastructure team. As a big believer in the concept of "Infrastructure as Code", I have been, slowly but surely, moving the project’s infrastructure from manual tasks to code,...
Categories: Open Source

Pipeline at Jenkins World 2016

This is a guest post by R. Tyler Croy, who is a long-time contributor to Jenkins and the primary contact for Jenkins project infrastructure. He is also a Jenkins Evangelist at CloudBees, Inc. I have been heavily using Jenkins Pipeline for just about every Jenkins-related project I have contributed to over the past year. Whether I am building and publishing Docker containers, testing infrastructure code or publishing this very web site, I have been adding a Jenkinsfile to nearly every Git repository I touch. Implementing Pipeline has been rewarding, but has not been without its own challenges. That’s why I’m excited to see lots of different Jenkins Pipeline related content in the agenda at Jenkins...
Categories: Open Source

The Tweets You Missed in August

Sonar - Tue, 09/06/2016 - 16:36

Here are the tweets you likely missed last month!

SonarQube Java 4.1 provides 7 new rules and improves try-catch handling.

— SonarQube (@SonarQube) August 4, 2016

SonarQube Swift Plugin 1.7 Released: native support of XCode 7+ coverage reports, … #swift

— SonarQube (@SonarQube) August 26, 2016

SonarLint for Eclipse 2.2 supports Python, custom rules and issue/file exclusions

— SonarLint (@SonarLint) August 1, 2016

SonarLint for IntelliJ 2.3 supports Python, custom rules and issue/file exclusions

— SonarLint (@SonarLint) August 5, 2016

SonarLint for Command Line 2.0 Released: with full support of the Connected Mode

— SonarLint (@SonarLint) August 17, 2016

Categories: Open Source

Introducing a New Way to Define Jenkins Pipelines

This is a guest post by Jenkins World speaker Andrew Bayer, Jenkins developer at CloudBees. Over the last couple years, Pipeline as code has very much become the future of Jenkins - in fact, at this point, I’d say it’s pretty well established as the present of Jenkins. But that doesn’t mean it’s done, let alone that it’s perfect. While many developers enjoy the power and control that they get from writing Pipelines using scripting, not everyone feels the same way. A lot of developers want to specify their build as configuration and get on with building software. Pipeline scripts haven’t been a good way to do that…​until now. With...
Categories: Open Source

SonarAnalyzer for C#: The Rule Engine You Want to Use

Sonar - Thu, 09/01/2016 - 14:49

If you’ve been following the releases of the Scanner for MSBuild and the C# plugin over the last two years, you must have noticed that we significantly improved our integration with the build tool and at the same time added a lot of new rules. Also, we introduced SonarLint for Visual Studio, a new tool to analyze code inside the IDE. With these steps completed we are deprecating the SonarQube ReSharper plugin to be able to provide a consistent, high-level experience among our tools.

In the last couple years we’ve worked in close collaboration with Microsoft to make our products fit easily into the .NET ecosystem. The goal of the collaboration was two-fold:

  • integrate the SonarQube Scanner for MSBuild seamlessly into the build process
  • develop the Connected Mode in SonarLint for Visual Studio to propagate analysis settings from SonarQube to Visual Studio.

The improvements to the SonarQube Scanner for MSBuild resulted in pre- and post-build command line steps that respectively download settings from, and upload analysis results to, your SonarQube server. In between these steps, your MSBuild build doesn’t need to be changed at all. And with the SonarLint Connected Mode, we achieved our main goal of showing exactly the same issues inside the IDE as you’d see on the SonarQube server.
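As a sketch, the wrapped build then looks something like this (the project key, name, and version are placeholders, and the runner executable name varies across scanner versions):

```shell
# Pre-build step: fetch quality profiles and settings from the SonarQube server
MSBuild.SonarQube.Runner.exe begin /k:"my-project" /n:"My Project" /v:"1.0"

# The normal, unmodified build
msbuild MySolution.sln

# Post-build step: upload the analysis results to the server
MSBuild.SonarQube.Runner.exe end
```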

From a technology perspective, both of these integration pieces are highly dependent on the new .NET compiler platform, Roslyn. Additionally, we’ve put a great deal of effort into implementing rules based on Roslyn. From SonarLint for Visual Studio version 1.0, which was released on July 20, 2015 with 76 rules, we’ve increased our C# offerings to 173 rules. Our C# rule engine, the SonarAnalyzer for C#, is the underlying rule engine in both SonarLint for Visual Studio and the C# plugin. So no matter where you’re running the analysis, you benefit from the new rules. Many of the rules might have already been familiar to you, because we prioritized the implementation of ReSharper-like rules. We went through all the C# warning rules that are enabled by default in ReSharper and in the end we found that more than 80% of them are now covered by the SonarAnalyzer for C#.

We even went a step further, and made the SonarQube Roslyn SDK to provide a way to integrate your Roslyn-based rules into the analysis process both inside the IDE and with the Scanner for MSBuild. However, we can’t provide the same consistent user experience with ReSharper because it’s not based on Roslyn. ReSharper analysis in the build process isn’t MSBuild-based; it requires a dedicated command line tool. And inside Visual Studio, ReSharper is a completely separate analysis tool, so there’s no way to make the Connected Mode support ReSharper settings. As a result, we decided to deprecate the ReSharper plugin and move it to the community maintained extensions.

To sum up, in order to best focus our efforts on valuable features and provide you with the best user experience, we decided to drop support for the ReSharper plugin. “Less is More” is our frequently repeated mantra at SonarSource. With this step, you’ll have fewer tools to worry about, and a more consistent experience across our products. Additionally, you’ll benefit from our quick release cycles, and get updates every month or so. Recently, we’ve focused our efforts on advanced bug detection rules. Did you know that our brand new symbolic execution engine found a NullReferenceException bug in Roslyn?

Categories: Open Source

Jenkins World Contributor Summit

At previous Jenkins User Conferences we have hosted "Contributor Summits" to gather developers and power-users in one room to discuss specific areas of Jenkins, such as Scalability, Pipeline, etc. As part of this year’s Jenkins World we’re hosting another Contributor Summit, to discuss: Blue Ocean, Pipeline and Storage Pluggability. Contributors to these three areas of the Jenkins ecosystem will be in attendance to present details of their design, requirements, and tentative roadmaps. After the presentations, the afternoon will be "unconference style" which is much more fluid to allow discussions, feedback, and brain-storming around the three focus areas. The program for the Jenkins World Contributor Summit includes: Updates from the various project officers. A discussion of the Blue...
Categories: Open Source

Scaling Jenkins at Jenkins World 2016

This is a guest post by R. Tyler Croy, who is a long-time contributor to Jenkins and the primary contact for Jenkins project infrastructure. He is also a Jenkins Evangelist at CloudBees, Inc. I find the topic of "scaling Jenkins" to be incredibly interesting because, more often than not, scaling Jenkins isn’t just about scaling a single instance but rather scaling an organization and its continuous delivery processes. In many cases when people talk about "scaling Jenkins" they’re talking about "Jenkins as a Service" or "Continuous Delivery as a Service" which introduces a much broader scope, and also more organization-specific requirements, to the problem. One of my favorite parts of a...
Categories: Open Source

Demos at Jenkins World 2016

At this year’s Jenkins World, our events officer Alyssa has been working to organize various activities in the "Open Source Hub" on the expo floor. Both days of the conference (Sept. 14th and 15th), during the break for lunch, there will be 15 minute demos by many of the experts helping to staff the Open Source Hub. Demo Schedule Wednesday, September 14th Time Session Details Presenter 12:15 - 12:30 Blue Ocean in Action Showcase of Blue Ocean and how it will make Jenkins a pleasure to use. Keith Zantow 12:30 - 12:45 Notifications with Jenkins Pipeline Sending information to Slack, HipChat, email and more from your Pipeline Liam Newman 12:45 - 13:00 Docker and Pipeline Learn how to use Docker inside of...
Categories: Open Source