

SonarLint for Visual Studio: Let’s Fix Some Real Issues in Code!

Sonar - Wed, 02/10/2016 - 10:38

As part of the development process of SonarLint for Visual Studio, we regularly check a couple of open source projects, such as Roslyn, to filter out false positives and to validate our rule implementations. In this post we’ll highlight a couple of issues found recently in the Roslyn project.

Short-circuit logic should be used to prevent null pointer dereferences in conditionals (S1697)

This rule recognizes a few very specific patterns in your code. We don’t expect any false positives from it, so whenever it reports an issue, we know that it found a bug. Check it out for yourself; here is the link to the problem line.

When body is null, the second part of the condition will still be evaluated and throw a NullReferenceException. You might think that the body of a method can’t be null, but even in syntactically correct code it is possible: method declarations in interfaces, abstract or partial methods, and expression-bodied methods or properties all have null bodies. So why hasn’t this bug shown up yet? Because this code is only called in one place, on a method declaration that has a body.
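To make the pattern concrete, here is a minimal, self-contained sketch in plain C# (the names IsEmpty and items are illustrative, not the actual Roslyn code): with &&, a null check on the left does not protect the dereference on the right, while || short-circuits before the dereference.

using System;
using System.Collections.Generic;

class ShortCircuitExample
{
    static bool IsEmptyBuggy(List<int> items)
    {
        // Noncompliant (S1697): when items is null the left operand is true,
        // so the right operand still runs and dereferences null.
        return items == null && items.Count == 0;
    }

    static bool IsEmpty(List<int> items)
    {
        // Compliant: '||' short-circuits, so items.Count is only read when items is not null.
        return items == null || items.Count == 0;
    }

    static void Main()
    {
        Console.WriteLine(IsEmpty(null));            // True, no exception
        Console.WriteLine(IsEmpty(new List<int>())); // True
        try { IsEmptyBuggy(null); }
        catch (NullReferenceException) { Console.WriteLine("IsEmptyBuggy threw"); }
    }
}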

The ternary operator should not return the same value regardless of the condition (S2758)

We’re not sure if this issue is a bug or just the result of some refactoring, but it is certainly confusing. Why would you check isStartToken if you don’t care about its content?
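A hypothetical snippet of the shape S2758 reports (isStartToken comes from the post; the values are made up): both branches of the conditional produce the same result, so the check is either dead weight or hides a copy-paste mistake.

using System;

class TernaryExample
{
    static void Main()
    {
        bool isStartToken = DateTime.Now.Second % 2 == 0;

        // Noncompliant (S2758): the result is 4 no matter what isStartToken is.
        int width = isStartToken ? 4 : 4;

        // Either drop the condition entirely, or make the branches differ:
        int fixedWidth = 4;
        int width2 = isStartToken ? 4 : 2;

        Console.WriteLine($"{width} {fixedWidth} {width2}");
    }
}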

"IDisposables" should be disposed (S2930)

Lately we’ve spent some effort on removing false positives from this rule. For example, we’re not reporting on MemoryStream uses anymore, even though it is an IDisposable. SonarLint only reports on resources that should really be closed, which gives us high confidence in this rule. Three issues ([1], [2], [3]) were found in the Roslyn project, where a FileStream, a TcpClient, and a TcpListener are not being disposed.
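An illustrative sketch (not the actual Roslyn code) of why the rule focuses on resources that hold OS handles, such as FileStream: without a using block, the handle is only released whenever the finalizer eventually runs.

using System;
using System.IO;

class DisposeExample
{
    // Noncompliant (S2930): the FileStream is never disposed, so the underlying
    // file handle stays open until the garbage collector finalizes the object.
    static long FileSizeLeaky(string path)
    {
        var stream = new FileStream(path, FileMode.Open);
        return stream.Length;
    }

    // Compliant: 'using' guarantees Dispose() runs, even if an exception is thrown.
    static long FileSize(string path)
    {
        using (var stream = new FileStream(path, FileMode.Open))
        {
            return stream.Length;
        }
    }

    static void Main()
    {
        File.WriteAllText("example.txt", "hello");   // create a file so the snippet is runnable
        Console.WriteLine(FileSize("example.txt"));
        Console.WriteLine(FileSizeLeaky("example.txt"));
    }
}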

Method overloads with default parameter values should not overlap (S3427)

Mixing method overloads and default parameter values can result in cases where the default parameter value can’t be used at all, or can only be used in conjunction with named arguments. These three cases ([1], [2], [3]) fall into the former category: the default parameter values can’t be used at all, so it is perfectly safe to remove them. In each case, whenever only the first two arguments are supplied, another constructor is called. Additionally, in this special case, if you call the method like IsEquivalentTo(node: myNode), then the default parameter value is used, but if you use IsEquivalentTo(myNode), then another overload is called. Confusing, isn’t it?
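The following hypothetical sketch mirrors that shape (the signatures are illustrative, not Roslyn’s actual IsEquivalentTo overloads). With a positional argument, overload resolution prefers the candidate that needs no default value substituted, so the default is unreachable; only a named argument that matches the second overload’s parameter name brings it back into play.

using System;

class OverloadExample
{
    static void IsEquivalentTo(object other)
    {
        Console.WriteLine("overload without the default parameter");
    }

    static void IsEquivalentTo(object node, bool topLevel = false)
    {
        Console.WriteLine($"overload with default parameter, topLevel = {topLevel}");
    }

    static void Main()
    {
        // Picks the first overload: a candidate that needs no default substituted wins,
        // so 'topLevel = false' can never be reached this way.
        IsEquivalentTo(new object());

        // Only the second overload has a parameter named 'node', so the named call
        // finally reaches the default parameter value.
        IsEquivalentTo(node: new object());
    }
}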

Flags enumerations should explicitly initialize all their members (S2345)

It is good practice to explicitly set a value for your [Flags] enums. It’s not strictly necessary, and your code might function correctly without it, but still, it’s better to be safe than sorry. If the enum has only three members, then the automatic 0, 1, 2 field initialization works correctly, but when you have more members, you most probably don’t want to use the default values. For example, here FromReferencedAssembly == FromSourceModule | FromAddedModule. Is this the desired setup? If so, why not add it explicitly to avoid confusion?
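A sketch of how the silent overlap can arise; the three member names come from the post, but the surrounding enum (including the assumed None member) is hypothetical, not the actual Roslyn type.

using System;

class FlagsExample
{
    [Flags]
    enum ImportOptions
    {
        // Implicit values are 0, 1, 2, 3, ... so the fourth member is 3,
        // which happens to equal FromSourceModule | FromAddedModule.
        None,
        FromSourceModule,
        FromAddedModule,
        FromReferencedAssembly
    }

    [Flags]
    enum ImportOptionsExplicit
    {
        None = 0,
        FromSourceModule = 1,
        FromAddedModule = 2,
        FromReferencedAssembly = 4   // a distinct flag; combine flags explicitly if overlap is intended
    }

    static void Main()
    {
        Console.WriteLine(ImportOptions.FromReferencedAssembly ==
            (ImportOptions.FromSourceModule | ImportOptions.FromAddedModule)); // True
        Console.WriteLine(ImportOptionsExplicit.FromReferencedAssembly ==
            (ImportOptionsExplicit.FromSourceModule | ImportOptionsExplicit.FromAddedModule)); // False
    }
}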

“async” methods should not return “void” (S3168)

As you probably know, async void methods should only be used in a very limited number of scenarios. The reason for this is that you can’t await async void method calls. Basically, these are fire-and-forget methods, such as event handlers. So what happens when a test method is marked async void? Well, it depends on your test execution framework. For example, NUnit 2.6.3 handles them correctly, but the newer NUnit 3.0 dropped support. Roslyn uses xUnit 2.1.0 at the moment, which does support running async void test methods, so there is no real issue with them right now. But changing the return type to Task would probably be advisable. To sum up, double-check your async void methods; they might or might not work as you expect. Here are two occurrences from Roslyn ([1], [2]).
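A small sketch of the difference (the test bodies are made up, and the [Fact] attributes are left as comments so the snippet stays dependency-free): an async void method gives the caller, including a test runner, nothing to await, so completion and exceptions can only be observed if the framework special-cases it, whereas an async Task method can simply be awaited.

using System;
using System.Threading.Tasks;

class AsyncVoidExample
{
    // [Fact]
    public static async void CheckSomething_AsyncVoid()
    {
        await Task.Delay(10);
        // An exception thrown here is posted to the synchronization context,
        // not returned to the caller.
    }

    // [Fact]
    public static async Task CheckSomething_AsyncTask()
    {
        await Task.Delay(10);
    }

    static async Task Main()
    {
        CheckSomething_AsyncVoid();        // fire and forget: nothing to await
        await CheckSomething_AsyncTask();  // completion and failures are observable
        Console.WriteLine("done");
    }
}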

Additionally, here are some other confusing pieces of code that are marked by SonarLint. Rule S2275 (Format strings should be passed the correct number of arguments) triggers on this call, where the formatting arguments 10 and 100 are not used, because there are no placeholders for them in the format string. Finally, here are three cases ([1], [2], [3]) where values are bitwise OR-ed (|) with 0 (Rule S2437).
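Two illustrative one-liners for these last rules (the values 10 and 100 echo the post; everything else is made up): a format string silently ignores extra arguments, and OR-ing with 0 is always a no-op.

using System;

class MiscExamples
{
    static void Main()
    {
        // S2275 sketch: only {0} exists, so the arguments 10 and 100 are silently ignored.
        Console.WriteLine(string.Format("Expected at most {0} items", 5, 10, 100));

        // S2437 sketch: OR-ing with 0 never changes the value, so either the constant
        // is wrong or the whole operation is redundant.
        const int NoOptions = 0;              // hypothetical constant
        int options = 0x4;
        int combined = options | NoOptions;   // identical to 'options'
        Console.WriteLine(combined);
    }
}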

We sincerely hope you already use SonarLint daily to catch issues early. If not, you can download SonarLint from the Visual Studio Extension Gallery or install it directly from Visual Studio (Tools/Extensions and Updates). SonarLint is free and already trusted by thousands of developers, so start using it today!

Categories: Open Source

Selenium Conf India 2016 Update

Selenium - Mon, 02/08/2016 - 19:47

Selenium Conf India is happening this June 24-26 in Bangalore, India.

Tickets, call for speakers, and sponsorship slots are now available!

http://2016.seleniumconf.in/

 


Categories: Open Source

SCaLE 14x Conference Report

Historically, January has always been a very big month for the Jenkins community. Between FOSDEM and the Southern California Linux Expo (also known as SCaLE), we seem to hand out more stickers during the last week in January than any other week of the year. This year’s SCaLE 14X conference finally outgrew the LAX Hilton in Los Angeles, where it had been hosted in years past, and moved over to the Pasadena Convention Center in Pasadena, California. While the organizers of the conference expanded their scope, so did the Jenkins project! In addition to our normal Jenkins stickers, we also had some special edition stickers with special logos to give away this...
Categories: Open Source

SonarQube 5.3 in Screenshots

Sonar - Thu, 01/28/2016 - 10:02

The team is proud to announce the release of 5.3, another paradigm-shifting version, with the addition of significant new features and the return of popular functionality that didn’t make it into 5.2:

  • New Project Space which puts the focus on the Quality Gate and the Leak Period
  • User tokens for authenticated analysis without passwords
  • New web services to facilitate a build breaker strategy
  • Cross-project duplication is back!

New Project Space which puts the focus on the Quality Gate and the Leak Period

The most striking change in this version is the replacement of the default project dashboard with a new, fixed Project space highlighting the top four data domains: technical debt, coverage, duplications, and structure (which includes both size and complexity):

Because managing technical debt introduced during the Leak Period is so crucial, this streamlined new project home page keeps the leak period (the first differential period, which is now overridable at the project level) in the forefront. Both current and differential values are shown both textually and graphically:

Each of the four domains offers a detailed sub-page, available either through the “more” links on the Project Space or the relevant project menu items:

Technical Debt:
Coverage:
Duplications:
Structure:

Each domain page offers the same combination of current values (in blue, with clickthroughs) and leak period changes (yellow background) found on the main page, along with detailed numeric and graphical presentations designed to help you quickly zero in on the worst offenders in your projects.

SonarSource feels so strongly about the value of the new Project Space and the domain pages that none of them are configurable. But your old dashboards are still available under the “Dashboards” menu item.

User tokens for authenticated analysis without passwords

In version 5.2, we cut the last ties between analysis and the database. Now an analysis report is submitted to the server and all database updates take place server-side. In 5.3 we take the next step down the road of enhanced analysis security with the introduction of authentication tokens.

Now an administrator can create authentication tokens for any user.

Tokens may be used for analysis and with web services. Simply pass the token as the login, and leave the password blank.

The list of user token names (but not values!) is easily visible, and existing tokens can be revoked at any time:

Users can’t generate their own tokens yet, but that’s coming soon.

New web services to facilitate a build breaker strategy

In the implementation of a Continuous Inspection strategy, many people use Continuous Integration servers, such as Jenkins, to execute their SonarQube scans, and want to mark a run as broken when it includes new code that fails the Quality Gate. Because of time constraints, the old hooks for that were removed in 5.2 and not replaced. In 5.3 we made it a priority to close this gap, so the functionality you need to implement a build breaker strategy is now available.

When the client-side scanner is done, it writes out a data file with the URL to call for the server-side processing status. Once the processing is successful, you can use the analysis id to get the quality gate status.
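As a rough illustration of that flow, here is a sketch of a .NET build step that reads the scanner’s report-task.txt, polls the task URL, and then asks for the quality gate status. The file path, the keys (ceTaskUrl, serverUrl), the api/qualitygates/project_status endpoint, and the naive JSON handling are assumptions based on my reading of SonarQube 5.3; verify them against your server’s web service documentation before relying on this.

using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class BuildBreaker
{
    static async Task<int> Main()
    {
        // 1. Read the data file the scanner wrote (key=value lines); the path is an
        //    assumption, adjust it to wherever your scanner writes report-task.txt.
        var reportTask = File.ReadAllLines(".sonar/report-task.txt")
                             .Select(l => l.Split(new[] { '=' }, 2))
                             .Where(p => p.Length == 2)
                             .ToDictionary(p => p[0], p => p[1]);

        using (var client = new HttpClient())
        {
            // 2. Poll the server-side processing status until it is no longer queued or running.
            string taskJson;
            do
            {
                await Task.Delay(TimeSpan.FromSeconds(5));
                taskJson = await client.GetStringAsync(reportTask["ceTaskUrl"]);
            } while (taskJson.Contains("\"PENDING\"") || taskJson.Contains("\"IN_PROGRESS\""));

            // 3. Use the analysis id to fetch the quality gate status (assumed endpoint).
            string analysisId = ExtractJsonString(taskJson, "analysisId");
            string gateJson = await client.GetStringAsync(
                reportTask["serverUrl"] + "/api/qualitygates/project_status?analysisId=" + analysisId);

            // 4. Break the build when the gate is not green.
            bool passed = ExtractJsonString(gateJson, "status") == "OK";
            Console.WriteLine("Quality gate: " + (passed ? "OK" : "FAILED"));
            return passed ? 0 : 1;
        }
    }

    // Crude "key":"value" lookup; a real build step would use a JSON parser.
    static string ExtractJsonString(string json, string key)
    {
        int start = json.IndexOf("\"" + key + "\":\"", StringComparison.Ordinal);
        if (start < 0) return null;
        start += key.Length + 4;
        int end = json.IndexOf('"', start);
        return json.Substring(start, end - start);
    }
}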

Cross-project duplication is back!

Also under the heading of returning favorites is cross-project duplication. The changes in 5.2 required serious API updates, which in turn required a rewrite of cross-project duplication detection – another priority in 5.3.

Notably, 5.3 only provides cross-project duplication detection, not the detection of duplications across modules within a project, which is planned for 5.4.

That’s All, Folks!

Time now to download the new version and try it out. But don’t forget to read the installation or upgrade guide first!

Categories: Open Source

Jenkins World 2016: Call For Papers Is Open!

This is a guest post by Alyssa Tong. Alyssa works for CloudBees, helping to organize Jenkins community events around the world. Planning is underway for Jenkins World, a major Jenkins event for developers, release engineers and others interested in automation. The conference will be held from September 13th to 15th in Santa Clara, California and is being organized and sponsored in part by CloudBees. Just like the "Jenkins User Conferences" before it, this year’s event will feature many experts from the Jenkins community who help make Jenkins the most popular open source automation server on the planet. We’ve found that we outgrew the popular multi-city one-day Jenkins User Conferences, so unlike previous...
Categories: Open Source

Office Hour: The State of JavaScript in Jenkins

Tom Fennelly will host tomorrow’s office hour on JavaScript in Jenkins. The intended audience for this presentation is core and plugin developers. In his own words: I believe strongly that we can make meaningful user experience improvements to Jenkins, but it will require having more weapons in our arsenal in terms of how we build plugins etc. This is what we’ll be talking about in this week’s office hour. It will be a developer-focused session where we’ll start off by talking a little about how UI development has traditionally been done in Jenkins, before moving on to some newer patterns and tools that...
Categories: Open Source

A beautiful Jenkins dashboard

This is a guest post by Julian Kleinhans, Software Architect at AOE, who is outlining some of the Jenkins dashboard work he’s done with dashing-js. Jenkins offers a handful of third party dashboards, but none of them are really beautiful and flexible enough from my point of view. For example, I could not find a solution which gives me the possibility to easily decide which data should be displayed in the widget and which not. It also doesn’t offer the possibility to add additional widgets to the dashboard which have nothing to do with Jenkins. So I came up with something interesting that includes Jenkins data. But I cannot do...
Categories: Open Source

Jenkins Code of Conduct

Over the past couple months, we have been working on a long overdue Code of Conduct for the Jenkins project (meeting minutes here and here). Following in the footsteps of other projects like the Apache Software Foundation, Go lang and countless others, we have adopted this code of conduct to help set guidelines for what behaviors are acceptable, and what behaviors are not, when acting within the Jenkins community or on behalf of the Jenkins project. I would like to extend our gratitude to the authors of the Contributor Covenant who provided us with a very good and mostly finished Code of Conduct template. We have adapted the covenant to meet the unique needs of...
Categories: Open Source

A new Jenkins website

When I first started working on the Jenkins website, then called by a different name, I selected Drupal, an extensible content management system, to get the job done. Like Jenkins itself, Drupal is easy to set up, plugins are simple to install, and content is authored in a web UI. For the past seven years Drupal has served us well, but it is time to move on to something better suited for our needs. The general requirements for something newer were:
  • Easy to edit and create content
  • Changes to content should be tracked and reviewable
  • Any Jenkins contributor should be able to participate
  • Support mixed content types
The consensus was that a statically-generated...
Categories: Open Source

Jenkins at SCaLE 14x

For the past few years, a couple members of the Jenkins project have made the trip to Los Angeles for the Southern California Linux Expo. Despite the name it’s a fairly broad open source technology conference and since it is hosted prior to FOSDEM, it’s also a good conference to get us in the open source mood after the holiday break. Unlike previous years, when SCaLE was hosted at the LAX Hilton, this year it has grown and moved to the Pasadena Convention Center. There, as with previous years, we’ll have a table in the expo hall with plenty of stickers and perhaps some other forms of swag available for...
Categories: Open Source

December JAM World Tour: Toulouse, France

On December 15, the Toulouse JAM was co-hosted with the Toulouse JUG and Toulouse DevOps. Indeed it made sense since Jenkins is written in Java, makes use of Groovy code in many places (system groovy script, job dsl, workflow...), and it also made sense to co-organize with the local DevOps community since Jenkins is also a great tool to enable Continuous Integration, Continuous Delivery and automation in general. There were 103 RSVPs, with 80 to 90 people in attendance.

There were 3 talks planned for the evening:

Note: the presentations have been recorded (in French). They are still being processed, and once they are posted we will update this blog.

Photos: https://goo.gl/photos/1Usd96trfreFnWrZ8

Categories: Open Source

Selenium Conf India — Save The Date!

Selenium - Mon, 12/21/2015 - 18:28

In our last update we mentioned there will be 2 Selenium Confs in 2016 — one in India, another somewhere else (TBD).

Well, we are pleased to announce the official dates and location for Selenium Conf India!

When: June 24th & 25th, 2016

Where: Bangalore, India (at The Chancery Pavilion Hotel)

Mark your calendars! We’ll have more details as they become available (e.g., call for speakers, ticket sales, etc.). To get the latest updates, be sure to sign up for the Selenium Conf mailing list.


Categories: Open Source


Test Framework Feature Comparisons – What If We Cooperated?

NUnit.org - Sun, 04/07/2013 - 03:14

Software projects often publish comparisons with other projects, with which they compete. These comparisons typically have a few characteristics in common:

  • They aim at highlighting reasons why one project is superior – that is, they are marketing material.
  • While they may be accurate when initially published, competitor information is rarely updated.
  • Pure factual information is mixed with opinion, sometimes in a way that doesn’t make clear which is which.
  • Competitors don’t get much say in what is said about their projects.
  • Users can’t be sure how much to trust such comparisons.

Of course, we’re used to it. We no longer expect the pure, unvarnished truth from software companies – no more than from drug companies, insurance companies, car salesmen or government agencies. We’re cynical.

But one might at least hope that open source projects would do better. It’s in all our interests, and in our users’ interests, to have accurate, up-to-date, unbiased feature comparisons.

So, what would such a comparison look like?

  • It should have accurate, up-to-date information about each project.
  • That information should be purely factual, to the extent possible. Where necessary, opinions can be expressed only if clearly identified as opinion by their content and placement.
  • Developers from each project should be responsible for updating their own features.
  • Developers from each project should be accountable for any misstatements that slip in.

I think this can work because most of us in the open source world are committed to… openness. We generally value accuracy and we try to separate fact from opinion. Of course, it’s always easy to confuse one’s own strongly held beliefs with fact, but in most groups where I participate, I see such situations dealt with quite easily and with civility. Open source folks are, in fact, generally quite civil.

So, to carry this out, I’m announcing the .NET Test Framework Feature Comparison project – ideas for better names and an acronym are welcome. I’ll provide at least a temporary home for it and set up an initial format for discussion. We’ll start with MbUnit and NUnit, but I’d like to add other frameworks to the mix as soon as volunteers are available. If you are part of a .NET test framework project and want to participate, please drop me a line.

Categories: Open Source

Software Testing Latest Training Courses for 2012

The Cohen Blog — PushToTest - Mon, 02/20/2012 - 05:34
Free Workshops, Webinars, Screencasts on Open Source Testing
Need to learn Selenium, soapUI, or any of a dozen other Open Source Test (OST) tools? Join us for a free Webinar Workshop on OST. We just updated the calendar to include the following Workshops:
And if you are not available for the above Workshops, try watching a screencast recording.

Watch The Screencast

Categories: Companies, Open Source

Selenium Tutorial For Beginners

The Cohen Blog — PushToTest - Thu, 02/02/2012 - 08:45
Selenium Tutorial for Beginners
Selenium is an open source technology for automating browser-based applications. Selenium is easy to get started with for simple functional testing of a Web application. I can usually take a beginner with some light testing experience and teach them Selenium in a 2-day course. A few years ago I wrote a fast and easy Building Selenium Tests For Web Applications tutorial for beginners.

Read the Selenium Tutorial For Beginners Tutorial

The Selenium Tutorial for Beginners has the following chapters:
  • Selenium Tutorial 1: Write Your First Functional Selenium Test
  • Selenium Tutorial 2: Write Your First Functional Selenium Test of an Ajax application
  • Selenium Tutorial 3: Choosing between Selenium 1 and Selenium 2
  • Selenium Tutorial 4: Install and Configure Selenium RC, Grid
  • Selenium Tutorial 5: Use Record/Playback Tools Instead of Writing Test Code
  • Selenium Tutorial 6: Repurpose Selenium Tests To Be Load and Performance Tests
  • Selenium Tutorial 7: Repurpose Selenium Tests To Be Production Service Monitors
  • Selenium Tutorial 8: Analyze the Selenium Test Logged Results To Identify Functional Issues and Performance Bottlenecks
  • Selenium Tutorial 9: Debugging Selenium Tests
  • Selenium Tutorial 10: Testing Flex/Flash Applications Using Selenium
  • Selenium Tutorial 11: Using Selenium In Agile Software Development Methodology
  • Selenium Tutorial 12: Run Selenium tests from HP Quality Center, HP Test Director, Hudson, Jenkins, Bamboo
  • Selenium Tutorial 13: Alternative To Selenium
A community of supporting open source projects - including my own PushToTest TestMaker - enables you to repurpose your Selenium tests as functional tests for smoke, regression, and integration testing, as load and performance tests, and as production service monitors. These techniques and tools make it easy to run Selenium tests from test management platforms, including HP Quality Center, HP Test Director, Zephyr, TestLink, and QMetry, and from automated Continuous Integration (CI) systems, including Hudson, Jenkins, Cruise Control, and Bamboo.

I wrote a Selenium tutorial for beginners to make it easy to get started and take advantage of the advanced topics. Download TestMaker Community to get the Selenium tutorial for beginners and immediately build and run your first Selenium tests. It is entirely open source and free!

Read the Selenium Tutorial For Beginners Tutorial

Categories: Companies, Open Source

5 Services To Improve SOA Software Development Life Cycle

The Cohen Blog — PushToTest - Fri, 01/27/2012 - 00:25
SOA Testing with Open Source Test Tools
PushToTest helps organizations with large-scale Service Oriented Architecture (SOA) applications achieve high-performance, functional service delivery. But it does not happen at the end of SOA application development. Success with SOA at Best Buy requires an Agile approach to software development and testing, on-site coaching, test management, and great SOA-oriented test tools.

Distributing the work of performance testing across an Agile epic, its stories, and sprints reduces the overall testing effort and informs the organization's business managers about the service's performance. The biggest problem I see is keeping the testing transparent so that anyone - tester, developer, IT Ops, business manager, architect - can follow a requirement down to the actual test results.

With the right tools, methodology, and coaching an organization gets the following:
  • Process identification and re-engineering for Test Driven Development (TDD)
  • Installation and configuration of a best-in-class SOA Test Orchestration Platform to enable rapid test development of re-usable test assets for functional testing, load and performance testing and production monitoring
  • Integration with the organization's systems, including test management (for example, Rally and HP QC) and service asset management (for example, HP Systinet)
  • Construction of the organization's end-to-end tests with a team of PushToTest Global Professional Services, using this system and training of the existing organization's testers, Subject Matter Experts, and Developers to build and operate tests
  • On-going technical support
Download the Free SOA Performance Kit
On-Site Coaching Leads To Certification
The key to high quality and reliable SOA service delivery is to practice an always-on management style. That requires on-site coaching. In a typical organization the coaches accomplish the following:
  • Test architects and test developers work with the existing Testing Team members. They bring expert knowledge of the test tools. Most important is their knowledge of how to go from concept to test coding/scripting
  • Technical coaching on test automation to ensure that team members follow defined management processes
Cumulatively this effort is referred to as "Certification". When the development team produces a quality product, as demonstrated by simple functional tests, the partner QA teams take these projects and employ "best practice" test automation techniques. The resulting automated tests integrate with the requirements system (for example, Rally), the continuous integration system, and the governance systems (for example, HP Systinet).
Agile, Test Management, and Roles in SOA
The Agile software development process normally focuses first on functional testing - smoke tests, regression tests, and integration tests. Applied to SOA service development, Agile deliverables support the overall vision and business model for the new software. At a minimum we should expect:
  1. Product Owner defines User Stories
  2. Test Developer defines Test Cases
  3. Product team translates Test Cases into soapUI, TestMaker Designer, and Java project implementations
  4. Test Developer wraps test cases into Test Scenarios and creates an easily accessible test record associated to the test management service
  5. Any team member can follow a User Story down into its associated tests. From there they can view past results or execute the tests again.
  6. As tests execute, the test management system creates "Test Execution Records" showing the test results
Click here to learn how PushToTest improves your SOA software development life cycle.


Download the Free SOA Performance Kit

Categories: Companies, Open Source

Application Performance Management and Software Testing Trends and Analysis

The Cohen Blog — PushToTest - Tue, 01/24/2012 - 16:25
18 Best Blogs On Software Testing
2011 began with some pretty basic questions for the software testing world:
  • To what extent will large organizations dump legacy test tools for open source test tools?
  • How big would the market for private cloud software platforms be?
  • Does mankind have the tools to make a reliable success of the complicated world we built?
  • How big of a market will SOA testing and development be?
  • What are the best ways to migrate from HP to Selenium?
Let me share the answers I found. Some come from my blog, others from friends and partner blogs. Here goes:

The Scalability Argument for Service Enabling Your Applications. I make the case for building, deploying, and testing SOA services effectively. I point out that the weakness of this approach comes at the tool and platform level: for example, 37% of an application's code exists simply to deploy your service.

How PushToTest Uses Agile Software Development Methodology To Build TestMaker. A conversation I had with Todd Bradfute, our lead sales engineer, on surfacing the results of using Agile methodology to build software applications.

"Selenium eclipsed HP’s QTP on job posting aggregation site Indeed.com to become the number one requisite job experience / skill for on-line posted automated QA jobs (2700+ vs ~2500 as of this writing,)" John Dunham, CEO at Sauce Labs, noted.

Run Private Clouds For Cost Savings and Control. Instead of running 400 Amazon EC2 machine instances, Plinga uses Eucalyptus to run its own cloud. Plinga needed the control, reliability, and cost-savings of running its own private cloud, Marten Mickos, CEO at Eucalyptus, reports in his blog.

How To Evaluate Highly Scalable SOA Component Architecture. I show how to evaluate highly scalable SOA component architecture. This is ideal for CIOs, CTOs, Development and Test Executives, and IT managers.

Planning A TestMaker Installation. TestMaker features test orchestration capabilities to run Selenium, Sahi, soapUI, and unit tests written in Java, Ruby, Python, PHP, and other languages in a Grid and Cloud environment. I write about the issues you may encounter installing the TestMaker platform.

Repurposing ThoughtWorks Twist Scripts As Load and Performance Tests. I really like ThoughtWorks Twist for building functional tests in an Agile process. This blog and screencast show how to rapidly find performance bottlenecks in your Web application using ThoughtWorks Twist with the PushToTest TestMaker Enterprise test automation framework.

4 Steps To Getting Started With The Open Source Test Engagement Model. I describe the problems you need to solve as a manager to get started with Open Source Testing in your organization.

Correlation Technology Finds The Root Cause Of Performance Bottlenecks. Use aspect-oriented programming (AOP) technology to surface memory leaks, thread deadlocks, and slow database queries in your Java Enterprise applications.

10 Agile Ways To Build and Test Rich Internet Applications (RIA). Shows how competing RIA technologies put the emphasis on test and deploy.

Oracle Forms Application Testing. Java Applet technology powers Oracle Forms and many Web applications. This blog shows how to install and use open source tools to test Oracle Forms applications.

Saving Your Organization From The Eventual Testing Meltdown of Using Record/Playback Solely. The Selenium project is caught between the world of proprietary test tool vendors and the software developer community. This blog talks about the tipping point.

Choosing Java Frameworks for Performance. A round-up of opinions on which technologies are best for building applications: lightweight and responsive, RIA, with high developer productivity.

Selenium 2: Using The API To Create Tests. A DZone Refcard we sponsored to explain how to build tests of Web applications using the new Selenium 2 APIs. For Selenium 1 I wrote another Refcard; click here.

Test Management Tools. A discussion I had with the Zephyr test management team on Agile testing.

Migrating From HP Mercury QTP To PushToTest TestMaker 6. HP QTP just can't deal with the thousands of new Web objects coming from Ajax-based applications. This blog and screencast show how to migrate.

10 Tutorials To Learn TestMaker 6. TestMaker 6 is the easier way to surface performance bottlenecks and functional issues in Web, Rich Internet Applications (RIA, using Ajax, Flex, Flash), Service Oriented Architecture (SOA), and Business Process Management (BPM) applications.

5 Easy Ways To Build Data-Driven Selenium, soapUI, Sahi Tests. This is an article on using the TestMaker Data Production Library (DPL) system as a simple and easy way to data-enable tests. A DPL does not require programming or scripting.

Open Source Testing (OST) Is The Solution To Modern Complexity. Thanks to management oversight, negligence, and greed, British Petroleum (BP) killed 11 people, injured 17 people, and dumped 4,900,000 barrels of oil into the Gulf of Mexico in 2010. David Brooks of the New York Times became an unlikely apologist for the disaster, citing the complexity of the oil drilling system.

Choosing automated software testing tools: Open source vs. proprietary. Colleen Fry's article from 2010 discusses how software testers decide which type of automated testing tool, or combination of open source and proprietary tools, best meets their needs. We came a long way in 2011 toward achieving these goals.

All of my blogs are found here.

Categories: Companies, Open Source

Free Webinar on Agile Web Performance Testing

The Cohen Blog — PushToTest - Tue, 01/10/2012 - 19:22
Free Open Source Agile Web Application Performance Testing Workshop
Your organization may have adopted Agile Software Development Methodology and forgotten about load and performance testing! In my experience this is pretty common. Between Scrum meetings, burn-down sessions, sprints, test-first, and user stories, many forms of testing - including load and performance testing, stress testing, and integration testing - can get lost. And it is normally not only your fault. Consider the following:
  • The legacy proprietary test tools - HP LoadRunner, HP QTP, IBM Rational Tester, Microsoft VSTS - are hugely expensive. Organizations can't afford to equip developers and testers with their own licensed copies. These tool licenses run contrary to Agile testing, where developers and testers work side by side, building and testing concurrently.

  • Many testers still cannot write test code. Agile developers write unit tests in high-level languages (Java, C#, PHP, Ruby). Testers need a code-less way to repurpose these tests into functional tests, load and performance tests, and production service monitors.

  • Business managers need a code-less way to define the software release requirements criteria. Agile developers see Test Management tools (like HP Quality Center QC) as a needless extra burden on their software development effort. Agile developers are hugely attracted to Continuous Integration (CI) tools like Hudson, Jenkins, Cruise Control, and Bamboo. Business managers need an integrated CI and test platform to define requirements and see how close their application is to 'shipping'.
Lucky for you there is a way to learn how to solve these problems and deliver Agile software development methodology benefits to your organization. The Agile Web Application Performance Testing Workshop is your place to learn the Agile Open Source Testing way to load and performance test your Web applications, Rich Internet Applications (RIA, using Ajax, Flex, Flash, Oracle Forms, Applets), and SOAP and REST Web services. This free Webinar delivers a testing methodology, tools, and best/worst practices to follow. Plus, you will see a demonstration of a dozen open source test tools all working together.

Registration is free! Click here to learn more and register now:

Register Now

Categories: Companies, Open Source

Free Help To Learn TestMaker, Selenium, Sahi, soapUI

The Cohen Blog — PushToTest - Fri, 01/06/2012 - 05:57
Help Is Here To Learn TestMaker, Selenium, Sahi, soapUI
Do you sometimes feel alone? Have you been trying any of the following:
  • Writing Load Test Scripts
  • Building Functional Tests for Smoke and Regression Testing
  • Trying to use Selenium IDE and needing a good tutorial
  • Configuring test management tools working with TestMaker, Sahi, and soapUI
  • Needing To Compare Selenium Vs HP QuickTest Pro (QTP)
  • Stuck While Doing Cloud Computing Testing
  • Need Help Getting Started with Load Testing Tools
If you feel stuck, need help, or would like to see how professional testers solve these situations, then please attend a free live weekly Webinar.

Register Now

Here Is What We Have For You
Bring your best questions, issues, and bug reports on installing, configuring, and using PushToTest TestMaker to our free weekly Workshop via live Webinar. PushToTest experts will be available to answer your questions.

Frank Cohen, CEO and Founder at PushToTest, and members of the PushToTest technical team will answer your questions, show you where to find solutions, and take your feedback for feature enhancements and bug reports.

Every Thursday at 1 pm Pacific time (GMT-8)
Registration Required

At the Webinar:
  1. Register for the Webinar in advance
  2. Log in to the Webinar at the given day and time
  3. Each person who logs in will have their turn to ask their question and hear our response
  4. You may optionally share your desktop so the organizers can see what is going wrong and offer a solution
  5. The organizers will hear as many questions as will fit in 1 hour. No guarantee that everyone will be served.
See how these tools were made to work together. Bring your best questions for an immediate answer!

Register Now

Categories: Companies, Open Source