SonarQube 5.2 in Screenshots

Sonar - Thu, 11/26/2015 - 15:14

The team is proud to announce the biggest release ever of the SonarQube server, version 5.2, which includes the second-most-anticipated feature ever: code scanners no longer access the database! In brief, this version features:

  • Scanners no longer access the database
  • Enhanced monitoring
  • Better issue management
  • Improved UI for global admin
  • Also worth noting

Scanners no longer access the database

In a significant, fundamental change, this version breaks the direct ties from the SonarQube Scanners (SonarQube Runner, Maven, Gradle, …) to the SonarQube database. From this version forward, it is no longer necessary to hand out your SonarQube database credentials to would-be analyzers, and if they’re still included in your analysis parameters, you’ll see warnings in the log:

Breaking the database connection means you’re now free to execute analysis from your CI services like travis-ci, appveyor, VSO Build, and so on without biting your nails over database security. Instead, scanners now submit analysis reports to the server, and the server processes them asynchronously. This means that analysis results are not available in the Web application right after the scanner has finished its execution; it can take some time, depending on the load on the server:

But it also means that it’s no longer required to have a fat network connection between the machines analysis runs on and the database. Now you can arrange those machines on your network based solely on your own criteria.
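Concretely, an analysis invocation now only needs to know where the server is. A hypothetical sonar-runner command line (the project key, name, and paths below are placeholders) might look like this, with the old sonar.jdbc.* properties simply dropped:

```shell
# Only the server URL is required; the sonar.jdbc.url / sonar.jdbc.username /
# sonar.jdbc.password properties are no longer needed, and only produce
# warnings if left in place. All values below are placeholders.
sonar-runner \
  -Dsonar.host.url=http://sonarqube.example.com:9000 \
  -Dsonar.projectKey=org.example:my-project \
  -Dsonar.projectName="My Project" \
  -Dsonar.projectVersion=1.0 \
  -Dsonar.sources=src
```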

As soon as an analysis report is sent to the server, the status of the report is displayed on the dashboard of the corresponding project:

Enhanced monitoring

Because more processing is done server-side, more information is available server-side to monitor and understand what’s going on in SonarQube. First, the former “Analysis Reports” page has been renamed “Background Tasks” and redesigned to offer far more features, including access to the analysis report processing logs:

The page is available at project administration level too:

Server logs are also now accessible from the UI, and it’s possible to dynamically change the server log level (it reverts automatically on restart):

Better issue management

Continuing the theme of more and better information, the reporting of issues has also improved in this version. First is the ability to have more precise issue highlighting, additional issue locations, and additional messages:

The additional highlights and messages are attached to the issues, so you have to select an issue to see its “extras”:

Of course, the platform just makes these things possible; the language plugins have to support them before you’ll see these effects. So far, you can see additional locations and messages in select rules in the Java plugin.

Another improvement is the ability to display issues by count or technical debt:

As well as a new entry page for issues with quick links to default and saved issue filters:

Speaking of filters, there’s a new issue filter widget with a wide variety of display options, so you can put the results of any search directly on your dashboard:

Wrapping up the topic of issues, we’ve improved notifications, with a new “My New Issues” notification that tells you only about what’s relevant to you, and we’ve added the ability to define a default issue assignee on a project. This account will be used for every new issue that SonarQube can’t assign automatically based on the SCM information.

Improved UI for global admin

A number of pages have been rewritten in this version for a more consistent user experience. The one available to everyone is the Quality Profiles page:

Beyond that, many administrative pages have been rewritten, including all the security pages:

As well as the Update Center:

And the Project Management page:

As a side-effect of these rewrites, web services are now available for all the types of data required to feed these pages. Check your server’s api_documentation for details, or use Nemo’s for a quick reference.

Also worth noting

As a side-effect of breaking the ties between analysis and the database, plugins that do data manipulation beyond simply gleaning raw numbers and issues directly from source files will probably need to be rewritten because the APIs have changed, and such processing must now be done server-side.

All design-related features were dropped in this version (see SONAR-6553 for details), including Package Tangle Index and related metrics.

Also gone in 5.2, but slated to reappear in 5.3, is cross-module/project duplication detection. Why? We simply ran out of time.

That’s All, Folks!

Time now to download the new version and try it out. But don’t forget to read the installation or upgrade guide.

Categories: Open Source

Analysis of Visual Studio Solutions with the SonarQube Scanner for MSBuild

Sonar - Thu, 11/19/2015 - 17:19

At the end of April 2015 during the Build Conference, Microsoft and SonarSource announced SonarQube integration with MSBuild and Team Build. Today, half a year later, we’re releasing the SonarQube Scanner for MSBuild 1.0.2. But what exactly is the SonarQube Scanner for MSBuild? Let’s find out!

The SonarQube Scanner for MSBuild is the tool of choice to perform SonarQube analysis of any Visual Studio solution and MSBuild project. From the command line, a project is analyzed in 3 simple steps:

  1. MSBuild.SonarQube.Runner.exe begin /key:project_key /name:project_name /version:project_version

  2. msbuild /t:rebuild

  3. MSBuild.SonarQube.Runner.exe end

The “begin” invocation sets up the SonarQube analysis. Mandatory analysis settings such as the SonarQube project key, name and version must be passed in, as well as any optional settings, such as paths to code coverage reports. During this phase, the scanner fetches the quality profile and settings to be used from the SonarQube server.

Then, you build your project as you would typically do. As the build happens, the SonarQube Scanner for MSBuild gathers the exact set of projects and source files being compiled and analyzes them.

Finally, during the “end” invocation, remaining analysis data, such as Git or TFVC SCM data, is gathered, and the overall results are sent to the SonarQube server.

Using the SonarQube Scanner for MSBuild from Team Foundation Server and Visual Studio Online is even easier: there is no need to install the scanner on build agents, and native build steps corresponding to the “begin” and “end” invocations are available out-of-the-box (see the complete Microsoft ALM Rangers documentation for details).

A similar experience is offered for Jenkins users as well since the Jenkins SonarQube plugin version 2.3.

Compared to analyzing Visual Studio solutions with the sonar-runner and the Visual Studio Bootstrapper plugin, this new SonarQube Scanner for MSBuild offers many advantages:

  1. Having a Visual Studio solution (*.sln) file is no longer a requirement, and customized *.csproj files are now supported! The analysis data is now extracted from MSBuild itself, instead of being retrieved by manually parsing *.sln and *.csproj files. If MSBuild understands it, the SonarQube Scanner for MSBuild will understand it!

  2. For .NET, analyzers can now run as part of the build with Roslyn, which not only speeds up the analysis but also yields better results; instead of analyzing files one by one in isolation, the MSBuild integration enables analyzers to understand the file dependencies. This translates into fewer false positives and more real issues.

  3. Enabling FxCop is now as simple as enabling its rules in the quality profile. There is no longer any need to manually set properties such as “sonar.visualstudio.outputPaths” or “sonar.cs.fxcop.assembly” for every project: All the settings are now deduced by MSBuild.

As a consequence, we are deprecating the use of sonar-runner and the Visual Studio Bootstrapper plugin to analyze Visual Studio solutions, and advise all users to migrate to the SonarQube Scanner for MSBuild instead. Before you begin your migration, here are a few things you need to be aware of:

  1. The analysis must be executed from a Windows machine, with the .NET Framework version 4.5.2+ installed, and the project must be built using MSBuild 12 or 14. Note that the project you analyze can itself target older versions of the .NET Framework, but the SonarQube Scanner for MSBuild itself requires at least version 4.5.2 to run.

  2. Obviously, you now need to be able to build the project you want to analyze!

  3. Most old analysis properties (such as “sonar.cs.fxcop.assembly”, “sonar.dotnet.version”) are no longer used and should be removed. The only ones still useful are the unit test results and code coverage report paths.

  4. The “” file is no longer used and should be deleted.
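To illustrate point 3 above, a hypothetical “begin” invocation that keeps only the still-useful settings might look like the following (the sonar.cs.* property names are the ones used by the C# plugin at the time, and the report paths are placeholders; verify both against your plugin version):

```
MSBuild.SonarQube.Runner.exe begin /key:project_key /name:project_name /version:1.0 ^
  /d:sonar.cs.vstest.reportsPaths=TestResults\*.trx ^
  /d:sonar.cs.vscoveragexml.reportsPaths=coverage.coveragexml
```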

Try it out for yourself and get started! Download the SonarQube Scanner for MSBuild, install it, and start to analyze your projects! If you are new to SonarQube, the end-to-end guide produced by the Microsoft ALM Rangers will take you through every step.


Celebrating Hacksgiving!

Next week in the US we have a national holiday where, generally speaking, lots of turkey gets converted into left-over turkey sandwiches. For many software developers the Thanksgiving holiday also represents a lull in project schedules, freeing up some time to hack on pet projects or even contribute to open source projects.

Taking a cue from the Adopt a Plugin program that Daniel wrote about earlier this month, we thought it would be fun to organize a "virtual hackathon" to coincide with that gap in project schedules. Thus Hacksgiving 2015 was created!

We'll be hosting Hacksgiving Nov 23rd and 24th from 7:00PST - 15:00PST (10:00EST - 18:00EST) and would love for you to join! (RSVP here)

You don't need to know Java to help! We will have documentation and design hacking going on as well.

We have a few goals for Hacksgiving:

  1. Introduce new contributors to the process of writing code and/or documentation (documentation hacking details here).
  2. Find new maintainers for some plugins which are up for adoption.
  3. Clean up or merge some existing plugins which need some care (listed here).

Sessions to note

Here are some of the scheduled sessions, hosted by members of the community, that may interest you:

Day One

  • 7:00PST/10:00EST (15-30min) - rtyler will host a welcome and introduction to contribution to the Jenkins project (walking through our contributors guide)
  • 10:00PST/13:00EST (60min) - schristou will host a workshop titled "Introduction to plugin development for Jenkins"

Day Two

  • 10:00PST/13:00EST (60min) - abayer will be hosting a "Plugin Developer Open Q&A" session, so bring your questions!

Hacksgiving is very unconference-structured right now, so if you're interested in hosting a session please let us know in the #jenkins-community channel or by signing up for a session on the schedule.

How to participate

Since this is a virtual hackathon, we'll be congregating and chatting in a couple of ways:

  • On the #jenkins IRC channel as per usual
  • We'll be hosting sessions and tutorials via Google Hangouts; see the "hacker hangout" section on the wiki page for up-to-date details
  • Via the #hacksgiving hashtag on Twitter

You can also RSVP on our meetup page!

We hope you can join in the festivities!


SonarQube Enters the Security Realm and Makes a Good First Showing

Sonar - Thu, 11/12/2015 - 16:45

For the last year, we’ve been quietly working to add security-related rules in SonarQube’s language plugins. At September’s SonarQube Geneva User Conference we stopped being quiet about it.

About a year ago, we realized that our tools were beginning to reach the maturity levels required to offer not just maintainability rules, but bug and security-related rules too, so we set our sights on providing an all-in-one tool and started an effort to specify and implement security-related rules in all languages. Java has gotten the furthest; it currently has nearly 50 security-related rules. Together, the other languages offer another 50 or so.

That may not sound like a lot, but I’m pleased with our progress, particularly when tested against the OWASP Benchmark project. If you’ve heard of OWASP before, it was probably in the context of the OWASP Top 10, but OWASP is an umbrella organization with multiple projects under it (kinda like the Apache Foundation). The Top 10 is OWASP’s flagship project, and the benchmark is an up-and-comer.

The benchmark offers ~2700 Java servlets that do and do not demonstrate vulnerabilities corresponding to 11 different CWE items. The CWE (Common Weakness Enumeration) contains about 1,000 items, and broadly describes patterns of insecure and weak code.

The guys behind the benchmark are testing all the tools they can get their hands on and publishing the results. For commercial tools, they’re only publishing an average score (because the tool licenses don’t allow them to publish individual, named scores). For open source tools, they’re naming names. :-)

When I prepared my slides for my “Security Rules in SonarQube” talk, the SonarQube Java Plugin arguably had the best score, finding 50% of the things we’re supposed to and only flagging 17% of the things we should have ignored, for an overall score of 33% (50-17 = 33). Compare that to the commercial average, which has a 53% True Positive Rate and a 28% False Positive Rate for a final score of 26%. Since then, a new version of Find Security Bugs has been released, and its spot on the graph has jumped some, but I’m still quite happy with our score, both in relative and absolute terms. Here’s the summary graph presented on the site:

Notice that the dots are positioned based on the True Positive Rate (y-axis) and the False Positive Rate (x-axis). Find Security Bugs is higher on the True Positive axis than SonarQube, which threw me for a minute, but it’s also further out on the False Positive axis. That’s why I graphed the tools’ overall scores:

Looked at this way, it’s probably quite clear why I’m still happy with the SonarQube Java scores. But I’ll give you some detail to show that it isn’t (merely) about one-upsmanship:

This graph shows the Java plugin’s performance on each of the 11 CWE code sets individually. I’ll start with the five 0/0 scores in the bottom-left. For B, E, G, and K we don’t yet have any rules implemented (they’re “coming soon”). So… yeah, we’re quite happy to score a 0 there. :-) For F, SQL Injection, we have a rule, but every example of the vulnerability in this benchmark slips through a hole in it. (That should be fixed soon.) On a previous version of the benchmark, we got a better score for SQL Injection, but with the newest iteration, the code has been pared from 21k files to 2.7k, and apparently all the ones we were finding got eliminated. That’s life.

For A and D, it’s interesting to note that while the dots are placed toward the upper-right of the graph, they have scores of -2% and 0% respectively. That’s because the false positives cancelled out the true positives in the scoring. Clearly, we’d rather see a lower false positive rate, but we knew we’d hit some FP’s when we decided to write security rules. And with a mindset that security-related issues require human verification, this isn’t so bad. After all, what’s worse: manually eliminating false positives, or missing a vulnerability because of a false negative?
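For concreteness, the benchmark's scoring is simple arithmetic: overall score equals the True Positive Rate minus the False Positive Rate, in percentage points. The first line below reproduces the 33% quoted earlier; the second uses made-up figures purely to illustrate how false positives outweighing true positives yields a negative score:

```shell
# Benchmark score = TPR - FPR (both in percentage points).
score() { echo $(( $1 - $2 )); }

echo "SonarQube Java overall: $(score 50 17)%"  # 50 - 17 = 33
echo "Hypothetical CWE item:  $(score 10 12)%"  # 10 - 12 = -2
```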

For ‘I’, we’ve got about the best score we can get. The cases we’re missing are designed to be picked up only by dynamic analysis. Find Security Bugs gets the same score on this one: 68%.

For the rest, C, H, and J, we’ve got perfect scores: a 100% True Positive Rate and a 0% False Positive Rate. Woo hoo!

Of course, saying we’ve got 100% on item C or 33% overall is only a reflection of how we’re doing on those particular examples. We do better on some vulnerabilities and less so on others. Over time, I’m sure the benchmark will grow to cover more CWE items and cover in more depth the items it already touches on. As it does, we’ll continue to test ourselves against it to see what we’ve missed and where our holes are. I’m sure our competitors will too, and we’ll all get gradually better. That’s good for everybody. But you won’t be surprised if I say we’ll stay on top of making sure SonarQube is always the best.


New Jenkins releases with important security fixes

We just released Jenkins 1.638 and 1.625.2 which contain important security fixes, including a fix for the zero-day vulnerability published on Friday. Please see the security advisory for more information.

Want to be kept up to date on Jenkins security releases, including advance notice on scheduled security updates? Subscribe to the jenkinsci-advisories mailing list!


Mitigating unauthenticated remote code execution 0-day in Jenkins CLI

Updated 2015-11-11 15:00 UTC: We have released Jenkins 1.638 and 1.625.2 which contain a fix for this vulnerability. See the security advisory for more information about these releases.

Updated 2015-11-06 03:55 UTC: Included an updated mitigation script which doesn't have a Jenkins boot race condition

Earlier today we received numerous reports about a previously undisclosed "zero day" critical remote code execution vulnerability and exploit in Jenkins core. Unfortunately, the vulnerability was not disclosed to us ahead of its publication, so we're still working on a more thorough fix. In the meantime, however, we wanted to inform you of the issue and provide a workaround which will help prevent this exploit from being used against public Jenkins installations. For future reference, this issue is being tracked privately as SECURITY-218 in our issue tracker.

The attack is mounted through the Jenkins CLI subsystem, so the work-around is to remove/disable the CLI support inside of the running Jenkins server.

Using the following Groovy script, you can disable the attack vector in your Jenkins installation by navigating to “Manage Jenkins” and then to “Script Console”, or just go to http://your-jenkins-installation/script. This only addresses the currently running Jenkins process; to make the workaround persist between restarts of the Jenkins server, add the script below to $JENKINS_HOME/init.groovy.d/cli-shutdown.groovy (create the directory and the file if necessary).

import jenkins.*;
import jenkins.model.*;
import hudson.model.*;

// disable CLI access over TCP listener (separate port)
def p = AgentProtocol.all()
p.each { x ->
  if (x.name && x.name.contains("CLI")) p.remove(x)
}

// disable CLI access over /cli URL
def removal = { lst ->
  lst.each { x -> if (x.getClass().name.contains("CLIAction")) lst.remove(x) }
}
def j = Jenkins.instance;
removal(j.getExtensionList(RootAction.class))
removal(j.actions)


The latest version of this script can be found in this GitHub repository.

As previously announced on the jenkinsci-advisories mailing list we’re preparing a security release for this upcoming Wednesday which will include patches for both the latest and LTS lines of Jenkins core. The Jenkins Security team is working to include a fix for this previously undisclosed exploit in or before this planned security release.

If you have questions about this exploit, join us in the #jenkins channel on Freenode or ask on the jenkinsci-users@ mailing list.

For security researchers and hobbyists, if you believe you have found a security vulnerability in Jenkins, we have some disclosure guidelines on this wiki page which will help us mitigate any discovered issues as quickly and safely as possible.

Be sure to subscribe to the jenkinsci-advisories mailing list; it's the fastest way to get updates from the Jenkins Security team.


October JAMs

It is great to see the pick-up of local activity through hosted JAMs. In October, the Jenkins community hosted the Atlanta JAM and the Bay Area JAM. Many thanks to our sponsors: Ericsson, CloudBees, Blazemeter, NetRoadShow.

Here's a summary of what was discussed:

  • Atlanta JAM - Jenkins workflow and Docker to reduce friction in DevOps efforts.
  • Bay Area JAM - Performance testing strategies, incorporating performance tests into Jenkins workflows, and the metrics that matter most for troubleshooting and diagnosing issues.

A look forward to November and December:

As usual, if you're interested in becoming an organizer or sponsor, here's how to get started. If you've heard of a great Jenkins talk out there, shoot us an email with the speaker info so we can invite him/her to our next meetups.


Selenium Conf 2016

Selenium - Fri, 11/06/2015 - 19:26

Interested in learning what’s in store for Se Conf 2016? Then be sure to read this write-up from the Conference Organizers.

Also, if you want to receive email notifications about the conference (e.g., when and where it will be, call for speakers, ticket sales, etc.) then go here and complete the sign-up form.


What JVM versions are running Jenkins?

Preceding some of last week's Jenkins 2.0 discussions, there had been some threads on whether we should move Jenkins to require Java 8. The introduction of Java 8 last year brought performance improvements and highly desirable API changes, which make developing Java-based applications (arguably) much easier than before. The release was followed earlier this year by the end-of-life announcement for Java 7; the writing is on the wall: upgrade to Java 8.

I wanted to answer the question "does it even make sense to force an upgrade to Java 8?" There are plenty of technical discussions that we can have in the community on whether or not this is the right approach, but my goal was to try and measure the current Jenkins install base for Java 8 preparedness.

While we do not currently (to my knowledge) collect Java runtime versions in our anonymous usage statistics, we do have access logs from our mirror redirector which might provide some insight.

With some access log data, I went through the millions of requests made to Jenkins infrastructure in 2015 and started filtering on the user agents which made those requests.

NOTE: This data is totally not scientific and is only meant to provide a coarse insight into what versions of Java access Jenkins web infrastructure.

When Jenkins hits the mirror network, it's not overriding the default user agent from the Java runtime, so many of the user agents for the HTTP request are something like Java/1.7.0_75. This indicates that the request came from a Java Runtime version 1.7.0 (update 75).

Looking at the major JVM versions making (non-unique) requests to Jenkins infrastructure we have:

  • 1.8.0: 21,278,960
  • 1.7.0: 27,340,214
  • 1.6.0: 4,148,833
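As a rough sketch of the filtering described above (the log lines below are fabricated; real mirror logs carry more fields), tallying major JVM versions from the user agents can be done with standard tools:

```shell
# Fabricated sample of mirror access-log lines; only the trailing
# "Java/<version>" user-agent field matters for the tally.
cat > access.log <<'EOF'
GET /war/latest/jenkins.war 200 "Java/1.7.0_75"
GET /war/latest/jenkins.war 200 "Java/1.8.0_60"
GET /war/latest/jenkins.war 200 "Java/1.7.0_80"
GET /war/latest/jenkins.war 200 "Java/1.6.0_45"
EOF

# Group requests by major version (1.6.0 / 1.7.0 / 1.8.0), most common first.
grep -o 'Java/1\.[0-9]\.0' access.log | sort | uniq -c | sort -rn
```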

This breaks down across various updates as well, which is also particularly interesting to me because many of these Java versions have long since had security advisories posted against them.

As I mentioned before, this is not a rigorous analysis of the access log data and is also not filtered by unique IP addresses. What I found most interesting though is that the Java 8 upgrade numbers are actually fairly strong, which I didn't expect. I expect that piece of the pie will continue to grow. Hopefully so much so that we're able to move over to Java 8 before the end of 2016!


Adopt a plugin!

With more than a thousand public plugins in the Jenkins community now, it should come as no surprise that some of them are no longer actively maintained. Plugin authors move on when they change jobs, or lose interest in the plugin, and that's fine. Plugins are hosted on the Jenkins project infrastructure after all, and when a maintainer moves on, others can continue their work.

The major problem of course is that it's often difficult to tell whether a plugin is still maintained (and there's just not a lot going on), or whether its maintainer has lost interest. Most plugins don't get released every few weeks, or even every few months, and still do their job just fine.

To connect plugins that aren't actively maintained with potential maintainers, we recently implemented the "Adopt-a-plugin" initiative: We built a list of plugins that are up for "adoption", and display a prominent message on the plugins' wiki pages. Anyone interested in taking over as a plugin maintainer can then contact us, and we'll set you up.

Are you interested in becoming a plugin maintainer? Maybe one of your favorite plugins isn't actively maintained right now. Check out the Adopt a Plugin wiki page for more details on this program, and a list of plugins that would benefit from your help.


Jenkins 2.0 Proposal: Improved "Out of the box" user experience

This week we have featured a number of proposals for what we would like to see in "Jenkins 2.0", the vision of which is to make Jenkins users more efficient, productive and happy. We started with some more internally facing changes and have slowly progressed from the "inside-out" to today's topic: improving the out of the box user experience. That is to say, the experience that a brand-new Jenkins user has when getting started with the server.

Just to recap, so far we've reviewed:

The subject of today's proposal is captured in JENKINS-31157, which, like yesterday's proposal, contains a few issues linked from it with more details.

At a high level, the problem to be solved is:

When a new user installs Jenkins, they are greeted with the main, empty, dashboard which suggests that they "create jobs." This makes no mention of plugins or the configuration options that are relevant to helping the user make Jenkins match their needs.

In past and current versions of Jenkins, if you know what you're looking for it's relatively easy to move around the interface. If you've never used Jenkins before, it can be very challenging to find your way around or even know what it is possible to do with Jenkins.

The proposed changes aim to address this initial confusion:

Instead of changing the post-install defaults, which may not properly represent the user's needs, the first-time user experience should help guide the user through configuration and plugin installation quickly so they can use Jenkins for their needs. Effectively it should be as easy as possible for a user to arrive at a good configuration for their usage.

Jenkins contributor Tom Fennelly, who has led this discussion on the mailing lists in the past, has posted a good prototype screencast of what some of this might entail:

Providing Feedback

We're asking you to read the issues linked from JENKINS-31157 and comment and vote on those issues accordingly.

If you have ever logged in to the issue tracker or the wiki, you have a "Jenkins user account" which means you'll be able to log into the issue tracker and vote for, or comment on the issue linked above.

(note: if you have forgotten your password, use the account app to reset it.)

We're going to review feedback, make any necessary adjustments and either approve or reject the proposal two weeks from today.

This concludes this week's blog series highlighting some of the Jenkins 2.0 proposals we felt were important to discuss with the broader Jenkins user audience. Many of these, and other minor proposals, can be found on the Jenkins 2.0 wiki page.


Jenkins 2.0 Proposal: UX Improvements (Part One)

We have been featuring a few proposals this week for what "Jenkins 2.0" is going to include. Today we'll be diving into the most noticeable changes being proposed for Jenkins 2.0: the User Experience (UX) improvements.

Thus far in this blog series we have reviewed proposals covering:

The UX improvements being proposed aren't necessarily as uniform as the proposals from earlier in the week, but they represent a large amount of prototype and exploratory work done by folks like Tom Fennelly, Gus Reiber, and a few others. Those following the dev list may have already seen some of these proposals in the "mega threads" we have previously had discussing potential UI/UX improvements.

The improvements proposed for 2.0 can be found under JENKINS-31156. The most promising proposal under this issue is to update the plugin manager experience.

Another very important proposal for 2.0 worth mentioning is the proposal to update the UI to work well on mobile devices.

Providing Feedback

We're asking you to read the issues linked from JENKINS-31156 and comment and vote on those issues accordingly.

If you have ever logged in to the issue tracker or the wiki, you have a "Jenkins user account" which means you'll be able to log into the issue tracker and vote for, or comment on the issue linked above.

(note: if you have forgotten your password, use the account app to reset it.)

We're going to review feedback, make any necessary adjustments and either approve or reject the proposal two weeks from today.

Stay tuned for tomorrow's post covering the remainder of the proposed user experience changes!


Jenkins 2.0 Proposal: Pipeline as Code front and center

We have been featuring a few proposals this week for what "Jenkins 2.0" is going to include. Today we're discussing my personal favorite, which I believe will have a tremendously positive impact for years to come (not to be too biased!): moving the "Pipeline as Code" support in Jenkins front and center.

Thus far in this blog series we have reviewed proposals covering:

Today's proposal comes from project founder Kohsuke Kawaguchi titled "Pipeline as code front and center" and represents perhaps the most important and dramatic shift we hope to make in Jenkins 2.0.

This functionality has existed through the workflow plugin, which we have discussed at various Jenkins events before, but if you're not aware of some of the power behind it, check out this presentation from Jesse Glick:

The proposal in JENKINS-31152 expands on the problem we aim to address:

The default interaction model with Jenkins has been very web UI driven, requiring users to manually create jobs, then manually fill in the details through a web browser. This requires large amounts of effort to create and manage jobs to test and build multiple projects and keeps the actual configuration of a job to build/test/deploy a project separate from the actual code being built/tested/deployed. This prevents users from applying their existing CI/CD best practices to the job configurations themselves.

To address this, Kohsuke is proposing that we:

Introduce a new subsystem in Jenkins that:

  • lets you design a whole pipeline, not just a single linear set of tasks
  • stores the said pipeline configuration as human-editable Jenkinsfile in your SCM
  • makes it automatic to set up new pipelines when Jenkinsfile is added
  • differentiates multiple branches in the same repository

This is the key new feature that positions Jenkins for continuous delivery use cases and other more complex automations of today.
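For the unfamiliar, a minimal Jenkinsfile under the workflow plugin of the time might look something like this (the repository layout and the Maven build command are assumptions for illustration, not part of the proposal):

```groovy
// Checked into the root of the repository as "Jenkinsfile"; Jenkins
// picks it up and sets up or updates the pipeline automatically.
node {
  // check out the revision of the branch that triggered this run
  checkout scm

  stage 'Build'
  sh 'mvn -B clean verify'

  stage 'Archive'
  archive 'target/*.jar'
}
```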

Kohsuke's proposal is largely about bringing together a lot of already existing pieces to provide a very compelling experience for new and existing users alike. I hope it is clear now why this proposal is so exciting to me.

Providing Feedback

We're asking you to read the proposal in JENKINS-31152, which itself has some additional tickets linked under it, and provide feedback if you have it.

If you have ever logged in to the issue tracker or the wiki, you have a "Jenkins user account" which means you'll be able to log into the issue tracker and vote for, or comment on the issue linked above.

(note: if you have forgotten your password, use the account app to reset it.)

We're going to review feedback, make any necessary adjustments and either approve or reject the proposal two weeks from today.

Stay tuned for a couple more posts covering proposals to improve the Jenkins interface and user experience!

Categories: Open Source

Jenkins 2.0 Proposal: Split Groovy out of "core"

As I mentioned in yesterday's post, there's been a lot of discussion recently about what "Jenkins 2.0" means. In a recent "Office Hours" session, Kohsuke Kawaguchi presented his vision for Jenkins 2.0; the slides can be found in this Google Presentation. Roughly paraphrasing Kohsuke's vision, 2.0 is primarily about making things better for the thousands of users out there.

This week, we'll be reviewing some key areas of the "Jenkins 2.0" proposal, asking you, the user community, to provide feedback on these proposals as we go from Jenkins internals to user interface.

Thus far we've covered:

Today's post involves a proposal originally from community member Jesse Glick who has proposed in JENKINS-29068 that Groovy be split out from the "core" Jenkins distribution. The linked issue expands on what the problem is here:

Currently Jenkins embeds a distribution of Groovy into "core" for a variety of scripting and management tasks. This version of Groovy is locked into core in such a way that users cannot upgrade Groovy independently from Jenkins itself. If the Jenkins-bundled version were upgraded to a different major version, it may break users' custom scripts as well as plugins that use Groovy due to API changes.

The proposal is relatively straightforward and affects the many different users and use-cases for the embedded Groovy scripting support in Jenkins:

For ease of maintenance and modularity it would be useful to split Jenkins' use of Groovy into a library plugin; different clients could request 1.x and 2.x simultaneously by using different versions of the library, etc.

Stuff in core using Groovy that would need to either be put in this library (if infrastructure for other features) or put in another plugin depending on it (if first-class features themselves):

I selected this proposal to feature on this blog because, despite its rather technical underpinnings, it will affect core developers, plugin developers, and power and casual users alike. I encourage everybody to read through the proposal and its potential impact on the issue tracker.

Providing Feedback

We're asking you to read the proposal in JENKINS-29068 and provide feedback if you have it.

If you have ever logged in to the issue tracker or the wiki, you have a "Jenkins user account" which means you'll be able to log into the issue tracker and vote for, or comment on the issue linked above.

(note: if you have forgotten your password, use the account app to reset it.)

We're going to review feedback, make any necessary adjustments and either approve or reject the proposal two weeks from today.

Stay tuned for the rest of the week as we keep with our theme of going "from the inside out" and help us make Jenkins 2.0 great!

Categories: Open Source

Jenkins 2.0 Proposal: Introduce a policy for API deprecation

Over the past few weeks there has been a vibrant discussion happening on the jenkinsci-dev@ mailing list as to what "Jenkins 2.0" means. While Jenkins does not currently adhere to semantic versioning, the change of a major version number does indicate a major milestone for the community.

Project founder Kohsuke Kawaguchi presented his vision for Jenkins 2.0 in an office hours session, the slides for which can be found in this Google Presentation. Roughly paraphrasing Kohsuke's vision, 2.0 is primarily about making things better for the thousands of users out there.

This week, we'll be reviewing some key areas of the "Jenkins 2.0" proposal, asking you, the user community, to provide feedback on these proposals as we go from Jenkins internals to user interface.

Today's post involves a proposal to introduce a policy for API deprecation from community members Oliver Gondža and Daniel Beck. Extensibility is the heart of Jenkins, but over the past ten years we've not had a proper API deprecation policy other than "try not to break plugins, ever."

Daniel, expanding more on the problem wrote:

We have no backward compatibility policy besides "compatibility matters". With 1000+ plugins and basically the entire core being available to plugins, a lot of difficult or impossible to remove cruft has accumulated over the last ten years. This limits both what can be changed in core, and makes documentation difficult to use for plugin developers.

The two have put together a detailed proposal under JENKINS-31035 which suggests we:

limit the availability in APIs (classes, methods, fields, …) provided by core to a number of releases. Depending on the feature, this can range from a few months, to a few years (e.g. two years being about 100 releases of Jenkins and eight LTS baselines).
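To make the idea concrete, here is a hedged sketch of what a deprecation under such a policy might look like in plain Java. The class, method names, and version numbers are purely illustrative assumptions, not actual Jenkins APIs:

```java
// Illustrative only: a core API method deprecated with an explicit
// removal schedule, as the proposed policy would require.
public class LegacyApi {

    /**
     * @deprecated since 1.625 (hypothetical); scheduled for removal
     *             after roughly two years (about 100 weekly releases
     *             and eight LTS baselines). Use {@link #run()} instead.
     */
    @Deprecated
    public String perform() {
        return run(); // keep old callers working until removal
    }

    /** Replacement API that callers should migrate to. */
    public String run() {
        return "done";
    }

    public static void main(String[] args) {
        LegacyApi api = new LegacyApi();
        // Old and new entry points behave identically during the
        // deprecation window.
        System.out.println(api.perform().equals(api.run()));
    }
}
```

The substance of the policy is the explicit timeline in the Javadoc: plugin authors would know exactly how long the old entry point will keep working, rather than relying on "compatibility matters" forever.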


I highly encourage you to read the entire proposal on the issue tracker, where we are trying to collect feedback/history.

Providing Feedback

We're asking you to read the proposal in JENKINS-31035 and provide feedback if you have it.

If you have ever logged in to the issue tracker or the wiki, you have a "Jenkins user account" which means you'll be able to log into the issue tracker and vote for, or comment on the issue linked above.

(note: if you have forgotten your password, use the account app to reset it.)

We're going to review feedback, make any necessary adjustments and either approve or reject the proposal two weeks from today.

Stay tuned, and help make Jenkins 2.0 great!

Categories: Open Source

SonarLint: Fixing Issues Before They Exist

Sonar - Thu, 10/22/2015 - 08:44

I’m very happy to announce the launch of a new product series at SonarSource: SonarLint, which will help you fix code quality issues before they even exist.

SonarLint represents a new approach to code quality: instant issue checking. It sits in the IDE and is totally developer-oriented. We’ve started with three variations: SonarLint for VisualStudio, SonarLint for Eclipse, and SonarLint for IntelliJ.

Version 1.x will be available for C# via SonarLint for VisualStudio, and for Java and PHP with both SonarLint for Eclipse and SonarLint for IntelliJ. So now you can start catching and fixing issues from your projects' first keystrokes.

Here’s a preview in VisualStudio:

And here’s a preview for Eclipse:

Later, we’ll add the ability to link SonarLint with a SonarQube instance.

This complete break from the approach of previous implementations is what prompted us to start over with a new brand. With SonarLint, it’s a new day in code quality.

Categories: Open Source

Upcoming in office hours: FOSDEM Planning Session

For the past several years we've been attending FOSDEM, a massive free and open source event in Brussels, Belgium. In preparation for this upcoming FOSDEM (2016) event, we will be hosting an open planning meeting via Google Hangouts during this week's "Office Hours."


  • Gauge who can participate and at what capacity.
  • Pre-FOSDEM Contributor Summit
  • After-hours meetup/happy hour
  • Plans for a Jenkins stand (assuming we're accepted)

The FOSDEM 2016 wiki page is where we will be recording plans, and tasks will be added to JIRA. If you cannot join the FOSDEM Office Hours Hangout directly, we will also be watching the #jenkins-community channel on the Freenode network.

Please join us on this Google Hangout at 11:00 a.m. PDT this Wednesday (Oct 21).

Categories: Open Source

Cooking Up JAMs

There have been some active discussions and planning around Jenkins Area Meetups (JAMs), specifically in the following cities:

I wanted to gauge Jenkins interest in these cities, so let us know if you live in one of these areas and whether you would be interested in becoming a member or getting involved in a JAM one way or another!

Of course, if the city you live in currently does not have a JAM and you're interested in paying it forward, here's how you can become a JAM organizer.

Categories: Open Source

Test Framework Feature Comparisons – What If We Cooperated? - Sun, 04/07/2013 - 03:14

Software projects often publish comparisons with other projects, with which they compete. These comparisons typically have a few characteristics in common:

  • They aim at highlighting reasons why one project is superior – that is, they are marketing material.
  • While they may be accurate when initially published, competitor information is rarely updated.
  • Pure factual information is mixed with opinion, sometimes in a way that doesn’t make clear which is which.
  • Competitors don’t get much say in what is said about their projects.
  • Users can’t be sure how much to trust such comparisons.

Of course, we’re used to it. We no longer expect the pure, unvarnished truth from software companies – no more than from drug companies, insurance companies, car salesmen or government agencies. We’re cynical.

But one might at least hope that open source projects might do better. It’s in all our interests, and in our users’ interests, to have accurate, up-to-date, unbiased feature comparisons.

So, what would such a comparison look like?

  • It should have accurate, up-to-date information about each project.
  • That information should be purely factual, to the extent possible. Where necessary, opinions can be expressed only if clearly identified as opinion by their content and placement.
  • Developers from each project should be responsible for updating their own features.
  • Developers from each project should be accountable for any misstatements that slip in.

I think this can work because most of us in the open source world are committed to… openness. We generally value accuracy and we try to separate fact from opinion. Of course, it’s always easy to confuse one’s own strongly held beliefs with fact, but in most groups where I participate, I see such situations dealt with quite easily and with civility. Open source folks are, in fact, generally quite civil.

So, to carry this out, I’m announcing the .NET Test Framework Feature Comparison project – ideas for better names and an acronym are welcome. I’ll provide at least a temporary home for it and set up an initial format for discussion. We’ll start with MbUnit and NUnit, but I’d like to add other frameworks to the mix as soon as volunteers are available. If you are part of a .NET test framework project and want to participate, please drop me a line.

Categories: Open Source

Software Testing Latest Training Courses for 2012

The Cohen Blog — PushToTest - Mon, 02/20/2012 - 05:34
Free Workshops, Webinars, Screencasts on Open Source Testing

Need to learn Selenium, soapUI or any of a dozen other Open Source Test (OST) tools? Join us for a free Webinar Workshop on OST. We just updated the calendar to include the following Workshops:
And if you are not available for the above Workshops, try watching a screencast recording.

Watch The Screencast

Categories: Companies, Open Source