
Feed aggregator

Stop hugging, start working … on excellence!

Some context: this blogpost is my topic for a new peer conference called “Board of Agile Testers (BAT)” on Saturday December 19 2014 in Hotel Bergse Bossen in Driebergen.

I love agile and I love hugging… For me an agile way of working is a, not THE, solution to many irritating problems I suffered from in the 90’s and 00’s. Of course people are the determining factor in software development. It is all about people developing (as in research and development) software for people. So people are mighty important! We need to empower people to do awesome work. People work better if they have fun and feel empowered.

Vineet Nayar says that people who want to excel need two important things: a challenge and passion. These factors resemble the ones described by Daniel Pink: autonomy makes room to excel, passion feeds mastery and a challenge gives purpose. I wrote an article about this subject for Agile Record called "Software development is all about people". I see agile teams focus on the people stuff: collaboration, working together, social skills… But why do they so often forget Mastery in testing?

Rapid Software Testing teaches serious testing skills, empowering testers with a martial-arts approach to testing: not by being nice and hugging others, but by teaching testers serious skills to talk about their work, say what they mean, and stand up for excellence. RST teaches that excellent testing starts with the skill set and the mindset of the individual tester. Other things might help, but excellence in testing must centre on the tester.

One of many examples is in the new "More Agile Testing" book by Lisa & Janet: in chapter 12, Exploratory Testing, there is a story by Lisa, "Lisa's story – Spread the testing love: group hugs!" My review comment was, and I quote: "I like the activity but do not like the name… I fear some people will not take it too serious… It might get considered too informal or childish. Consider a name like bug hunts."

Really? Hugs? The whole hugging ethos in agile makes me CRAZY. Again, I love hugging and in my twitter profile it says I am a people lover. But a fluffy approach to agile in general and testing in particular makes me want to SCREAM! It makes me mad! Stop diminishing skills. If people are doing good work, sure hug them, but if they don’t: give them some serious feedback! Work with them to get better and grow. Mentor them, coach them, teach them. But what if they do not improve? Or do not want to improve? Well… maybe then it is time to say goodbye? It is time to start working on some serious skills!

Testing is serious business, already suffering from misunderstanding and underestimation by many who think they can automate all testing and everybody can test. In agile we are all developers and t-shaped people will rule the world. In 15 years there will be only developers doing everything: writing documentation, coding and testing… Yeah right! I wish I could believe that. Testing is HARD and needs a lot of study. As long as I see a vast majority of people not willing to study testing, I know I will have a job as a testing expert for the rest of my life!

This blogpost reflects some “rough ideas”. After the peer conference I will update this post with the ideas discussed in the room.

Categories: Blogs

Oh, Kay!

Hiccupps - James Thomas - Sat, 12/20/2014 - 08:28


Phil Kay is a stand-up comedian known for his love of live work and improvisation. In his interview for The Comedian's Comedian recently he said some things that resonated with me.

When he's talking about the impression others may have that there are rules of improvisation, I'm thinking about testing:
"There's not a principle that I must avoid things I've done before ... There's plenty of room in the form for doing brand new things [but that's] not the aim, that I must do it brand new."

When he's talking about how he constantly watches for and collects data that he hopes will come in useful later, that will help him to make connections and that will keep his mojo working when he's not on stage, I'm thinking about testing:

"I write notes all the time ... anything interesting that comes to me ... but [the notes] are not the thing. The thing is the fact that I'm watching out for stuff ... like a boxer keeping loose ... on stage I hope they'll all come together."

When he's talking about how not being tied to a prescribed structure opens up possibilities, I'm thinking about testing:

"Allow the best to be a thing that could happen. If you're trying to enforce something, no best can ever happen."

And when he's talking about how it doesn't work sometimes, I'm still thinking about testing:

"The list of traumatic failure gigs is so long ... I accept the risk it'll go wrong."

Looking around for related material I found that James Lyndsay has a workshop on Improvising for Testers, and Damian Synadinos has one specifically on the links between improv comedy and testing, Improv(e) Your Testing! Tips and Tricks from Jester to Tester.
Image: https://flic.kr/p/hquBik
Categories: Blogs

AutoMapper 3.3 feature: open generics

Jimmy Bogard - Sat, 12/20/2014 - 01:10

One of the interesting features of AutoMapper 3.3 is the ability to map open generic types. Open generics are those that don’t supply type parameters, like:

var openType = typeof(IEnumerable<>);

AutoMapper had some limited support for certain built-in open generics, but only the collection types. This changed in version 3.3, where you can now map any sort of open generic type:

public class Source<T> {
    public T Value { get; set; }
}

public class Destination<T> {
    public T Value { get; set; }
}

// Create the mapping
Mapper.CreateMap(typeof(Source<>), typeof(Destination<>));

Instead of the normal generic CreateMap syntax, you need to use the overload that takes Type objects, because C# only accepts closed generic types as type parameters. This also means you can still use all the available configuration for member-specific mappings, but you can only reference members by string name instead of by expression. Not a limitation per se, just something to be aware of.
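As a rough sketch (the member configuration here is purely illustrative, not from the post), string-based member configuration on an open generic map looks like this:

// No closed type exists yet, so members are referenced by name rather than by expression.
Mapper.CreateMap(typeof(Source<>), typeof(Destination<>))
      .ForMember("Value", opt => opt.Ignore()); // e.g. skip this member during mapping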

To use the open generic mapping configuration, you can execute the mapping against a closed type:

var source = new Source<int> { Value = 10 };

var dest = Mapper.Map<Source<int>, Destination<int>>(source);

dest.Value.ShouldEqual(10);

Previously, I'd have to create maps for every closed type. With the 3.3 version, I can create a map for the open type, and AutoMapper can automatically figure out how to build a plan for the closed types from the open type configuration, including any customizations you've created.

This is something that's been asked for for a while, but only recently did I figure out a clean way of implementing it. Interestingly enough, this feature is going to pave the way for the programmatic, extensible conventions I'm targeting for 4.0.

Someday.


Categories: Blogs

Give Back to the Developers This Holiday Season: Test Open Source

uTest - Fri, 12/19/2014 - 22:42

We here at the uTest Blog have long been proponents of a harmonious tester-developer relationship. And according to a recent InfoWorld article, the best way to a dev's heart this holiday season may be through testing their open source code.

According to the piece, “for proprietary software, the only option is to suck it up and hope your vendor will fix problems with a future release. But with open source software,” testers can actively be part of the action and “make contributions that lead to them being more effective for less effort.” This is the key to the developers’ hearts because devs on open source projects don’t have the support teams of robust enterprise players — they’re going to rely a lot on testers taking action versus complaining.

Here are the main benefits of testing open source code this holiday season:

  • It’s good for your blood pressure!
  • It makes the experience better for users of the open source program worldwide
  • It’s better than complaining about crappy code (devs get this feedback enough and in the open source realm, can’t do much with time and resource constraints)
  • It turns user feedback into something constructive (a bug report) that can be quickly used for a fix

So enjoy the full read, and give back this holiday season by testing your favorite open source project. Your developer is guaranteed to love the gift.

Categories: Companies

Testing on the Toilet: Truth: a fluent assertion framework

Google Testing Blog - Fri, 12/19/2014 - 20:28
by Dori Reuveni and Kurt Alfred Kluever

This article was adapted from a Google Testing on the Toilet (TotT) episode. You can download a printer-friendly version of this TotT episode and post it in your office.


As engineers, we spend most of our time reading existing code, rather than writing new code. Therefore, we must make sure we always write clean, readable code. The same goes for our tests; we need a way to clearly express our test assertions.

Truth is an open source, fluent testing framework for Java designed to make your test assertions and failure messages more readable. The fluent API makes reading (and writing) test assertions much more natural, prose-like, and discoverable in your IDE via autocomplete. For example, compare how the following assertion reads with JUnit vs. Truth:
assertEquals("March", monthMap.get(3));          // JUnit
assertThat(monthMap).containsEntry(3, "March"); // Truth
Both statements are asserting the same thing, but the assertion written with Truth can be easily read from left to right, while the JUnit example requires "mental backtracking".

Another benefit of Truth over JUnit is the addition of useful default failure messages. For example:
ImmutableSet<String> colors = ImmutableSet.of("red", "green", "blue", "yellow");
assertTrue(colors.contains("orange")); // JUnit
assertThat(colors).contains("orange"); // Truth
In this example, both assertions will fail, but JUnit will not provide a useful failure message. However, Truth will provide a clear and concise failure message:

AssertionError: <[red, green, blue, yellow]> should have contained <orange>

Truth already supports specialized assertions for most of the common JDK types (Objects, primitives, arrays, Strings, Classes, Comparables, Iterables, Collections, Lists, Sets, Maps, etc.), as well as some Guava types (Optionals). Additional support for other popular types is planned as well (Throwables, Iterators, Multimaps, UnsignedIntegers, UnsignedLongs, etc.).
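As a quick illustration (these specific assertions are my own examples, not from the episode), the specialized subjects read naturally across those types:

assertThat(someList).containsExactly("red", "green", "blue").inOrder(); // Iterable/List support
assertThat(someMap).containsKey("march");                               // Map support
assertThat(someString).startsWith("test");                              // String support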

Truth is also user-extensible: you can easily write a Truth subject to make fluent assertions about your own custom types. By creating your own custom subject, both your assertion API and your failure messages can be domain-specific.

Truth's goal is not to replace JUnit assertions, but to improve the readability of complex assertions and their failure messages. JUnit assertions and Truth assertions can (and often do) live side by side in tests.

To get started with Truth, check out http://google.github.io/truth/

Categories: Blogs

Microsoft Virtual Academy Links for 2014

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24

So I thought that going through a few Microsoft Virtual Academy links could help some of you.

Here are the links I think deserve at least a click. If you find them interesting, let me know!

Categories: Blogs

Temporarily ignore SSL certificate problem in Git under Windows

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24

So I've encountered the following issue:

fatal: unable to access 'https://myurl/myproject.git/': SSL certificate problem: unable to get local issuer certificate

Basically, we're working on a local Git Stash project and the certificates changed. While they were working to fix the issues, we had to keep working.

So I know that the server is not compromised (I talked to IT). How do I say "ignore it please"?

Temporary solution

This is because you know they are going to fix it.

PowerShell code:

$env:GIT_SSL_NO_VERIFY = "true"

CMD code:

SET GIT_SSL_NO_VERIFY=true

This will get you up and running as long as you don't close the command window. This variable will be reset to nothing as soon as you close it.
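As a side note (my own addition, not part of the original post), the same temporary override can be scoped to a single command with git's per-invocation configuration, so nothing stays set in your shell at all:

git -c http.sslVerify=false pull

Permanent solution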

Fix your certificates. Oh… you mean it’s self signed and you will forever use that one? Install it on all machines.

Seriously. I won't show you how to permanently ignore certificates. Fix your certificate situation, because trusting ALL certificates without caring whether they are valid is just plain dangerous.

Fix it.

NOW.

Categories: Blogs

The Yoda Condition

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24

So this will be a short post. I would like to introduce a word in my vocabulary and yours too if it didn't already exist.

First I would like to credit Nathan Smith for teaching me that word this morning. First, the tweet:

Chuckling at "disallowYodaConditions" in JSCS… https://t.co/unhgFdMCrh — Awesome way of describing it. pic.twitter.com/KDPxpdB3UE

— Nathan Smith (@nathansmith) November 12, 2014

So... this made me chuckle.

What is the Yoda Condition?

The Yoda Condition can be summarized as "inverting the operands compared in a conditional".

Let's say I have this code:

string sky = "blue";
if (sky == "blue") {
    // do something
}

It can be read easily as "If the sky is blue". Now let's put some Yoda into it!

Our code becomes :

string sky = "blue";
if ("blue" == sky) {
    // do something
}

Now our code reads as "If blue is the sky". And that's why we call it a Yoda condition.

Why would I do that?

First, if you're missing an "=" in your code, it will fail at compile time, since you can't assign to a string literal. It can also avoid certain null reference errors.
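To make the reasoning concrete, here is my own illustration in JavaScript, where an assignment inside a condition is perfectly legal:

if (sky = "blue") {   // typo: assigns "blue" to sky, and the branch always runs
}

if ("blue" = sky) {   // typo caught immediately: invalid assignment target
}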

What's the cost of doing this then?

Besides getting on the nerves of all the programmers on your team? You reduce the readability of your code by a huge factor.

Each developer on your team will hit a snag on every if, since they will have to learn how to speak "Yoda" with your code.

So what should I do?

Avoid it. At all costs. Readability is the most important thing in your code. To be honest, you're not going to be the only guy/girl maintaining that app for years to come. Make it easy for the maintainer and remove that Yoda talk.

The problem this kind of code solves isn't worth the readability you lose.

Categories: Blogs

Do you have your own Batman Utility Belt?

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24
Just like most of us on any project, you (yes you!) as a developer must have done the same thing over and over again. I'm not talking about coding a controller or accessing the database.

Let's check out some concrete examples shall we?

  • Have you ever set up HTTP caching properly, created a class for it in your project and called it done?
  • What about creating a proper Web.config to configure static asset caching?
  • And what about creating a MediaTypeFormatter for handling CSV or some other custom type?
  • What about that BaseController that you rebuild from project to project?
  • And those extension methods that you use ALL the time but rebuild for each project...

If you answered yes to any of those questions... you are at great risk of having to code those again.

Hell... maybe someone already built them out there. But more often than not, they will be packed with other classes that you are not using. However, most of those projects are open source and will allow you to build your own Batman utility belt!

So once you see that you do something often, start building your utility belt! Grab those open source classes left and right (make sure to follow the licenses!) and start building your own class library.

NuGet

Once you have a good collection that is properly separated into its own project and you feel ready to kick some monkey ass, the only way to go is to use NuGet to pack it together!

Check out the reference to make sure that you do things properly.
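As a minimal sketch (the package id and file name are made up for the example), the packaging step boils down to a small .nuspec and one command:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyCompany.UtilityBelt</id>
    <version>1.0.0</version>
    <authors>Me</authors>
    <description>Reusable helpers: HTTP caching setup, media type formatters, base controllers, extension methods.</description>
  </metadata>
</package>

nuget pack MyCompany.UtilityBelt.nuspec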

NuGet - Publishing

OK, you've got a steamy hot new NuGet package that you are ready to use? You can push it to the main repository if your intention is to share it with the world.

If you are not quite ready yet, there are multiple ways to use a NuGet package internally in your company. The easiest? Just create a share on a server and add it to your package sources! As simple as that!
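For example (the share path here is made up), registering that share as a source from the NuGet command line could look like this:

nuget sources add -Name "Internal" -Source \\myserver\packages

Anyone on the team who adds that source can then install your package like any other.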

Now just make sure to increment your version number on each release by using the SemVer convention.

Reap the profit

OK, no... not really. You probably won't be making money anytime soon with this library. At least not real money. Where you will gain, however, is when you are asked to do one of those boring tasks yet again in another project or at another client.

The only thing you'll do is import your magic package, use it, and boom. That task they planned would take a whole day? It got finished in minutes.

As you build up your toolkit, more and more tasks will become easier to accomplish.

The only thing left to consider is what NOT to put in your toolkit.

Last minute warning

If you have an employer, make sure that your contract allows you to reuse code. Some contracts allow you to do that, but double check with your employer.

If you are a company, make sure not to bill your client for the time spent building your tools, or they might have the right to claim them as their own since you billed for them.

In case of doubt, double check with a lawyer!

Categories: Blogs

Software Developer Computer Minimum Requirements October 2014

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24

I know that Scott Hanselman and Jeff Atwood have already done something similar.

Today, I'm bringing you the minimum specs that are required to do software development on a Windows Machine.

P.S.: If you are building your own desktop, I recommend PCPartPicker.

Processor

Recommendation:

Intel: Intel Core i7-4790K

AMD: AMD FX-9590

Unless you use a lot of software that supports multi-threading, a simple 4 core here will work out for most needs.

Memory

Recommendation:

Minimum 8GB. 16GB is better.

My minimum requirement here is 8GB. I run a database engine and Visual Studio. SQL Server can easily take 2GB with some big queries. If you have extensions installed for Visual Studio, it will quickly rise to 1GB of usage per instance, and finally... Chrome. With multiple extensions and multiple pages running... you will quickly reach 4GB.

So get 8GB as the bare minimum. If you are running Virtual Machines, get 16GB. It won't be too much. There's no such thing as too much RAM when doing software development.

Hard drive

Recommendation:

512 GB SSD drive

I can't recommend an SSD enough. Most tools that you use on a development machine require a lot of I/O, especially random reads. When a compiler starts and retrieves all your source code to compile, it needs to read all those files. The same goes for tooling like ReSharper or CodeRush. I/O speed is crucial. This requirement is even more important on a laptop. Traditionally, PC makers put a 5400 RPM HDD in a laptop to reduce power usage. However, a 5400 RPM drive will be felt everywhere while doing development.

Get an SSD.

If you need bigger storage (terabytes), you can always get a second hard drive of the HDD type instead. Slower, but capacities are also higher. On most laptops, you will need external storage for this hard drive, so make sure it is USB 3 compatible.

Graphic Card

Unless you do graphic rendering or work with graphic tools that require a beast of a card... this is where you will spend the least amount of money.

Make sure the card has enough outputs for your number of monitors and that it can drive the right resolution/refresh rate.

Monitors

My minimum requirement nowadays is 22 inches. 4K is nice but is not part of the "minimum" requirement. I enjoy a 1920x1080 resolution. If you are buying them for someone else, make sure they can be rotated. Some developers like to have a vertical screen when reading code.

To Laptop or not to Laptop

Some companies go laptop for everyone. Personally, if the development machine never needs to be taken out of the building, you can go desktop. You will save a bit on all the required accessories (docking port, wireless mouse, extra charger, etc.).

My personal scenario takes me to clients all over the city as well as doing presentations left and right. Laptop it is for me.

Categories: Blogs

SVG are now supported everywhere, or almost

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24

I remember that when I wanted to draw some graphs on a web page, I would normally have 2 solutions

Solution 1 was to have an IMG tag that linked to a server component that would render an image based on some data. Solution 2 was to do Adobe Flash or maybe even some Silverlight.

Problem with Solution 1

The main problem is that it is not interactive. You have an image and there is no way to do drilldown or do anything with it. So unless your content was simple and didn't need any kind of interaction or simply was headed for printing... this solution just wouldn't do.

Problem with Solution 2

While you now get all the interactivity and the beauty of a nice Flash animation... you lose the benefits of the first solution too. You can't print it if you need to, and on top of that, it requires a plugin.

For OSX back in 2009, plugins were the leading cause of browser crashes, and there is nothing that stops us from believing that similar things aren't true for other browsers.

The second problem is security. A plugin is just another attack vector on your browser, and requiring a plugin to display nice graphs seems a bit extreme.

The Solution

The solution is relatively simple. We need a system that allows us to draw lines, curves and whatnot based on coordinates that we provide it.

That system should of course support colors, fonts and all the basic HTML features that we know now (including events).

Then came SVG

SVG has been the main specification for drawing anything vector-related in a browser since 1999. Even though the specification started at the same time as IE5, it wasn't supported in Internet Explorer until IE9 (12 years later).

Support for SVG is now in all major browsers, from Internet Explorer to Firefox, and even on your phone.

Chances are that every computer you are using today can render SVG inside your browser.
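For illustration (my own minimal example, not from the original post), this is all it takes to put a styled, interactive vector shape on a page today:

<svg width="220" height="120">
  <!-- a shape styled like any other element, reacting to a normal DOM event -->
  <circle cx="60" cy="60" r="40" fill="steelblue" onclick="alert('clicked!')" />
  <text x="110" y="65" font-family="sans-serif">Hello SVG</text>
</svg>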

So what?

SVG is, as a general rule, underused, or thought of as something only artists do, or as too complicated.

My recommendation is to start cracking today on libraries that leverage SVG. By leveraging them, you are setting yourself apart from others and can start offering real business value to your clients right now that others won't be able to.

SVG has been available on all browsers for a while now. It's time we start using it.

Browsers that do not support SVG
  • Internet Explorer 8 and lower
  • Old Android devices (2.3 and lower); partial support in 3 to 4.3
References, libraries and others
Categories: Blogs

Microsoft, Open Source and The Big Ship

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24


I would like to note that this post uses only publicly available information and is not based on my status as a Microsoft MVP. I did not interview anyone at Microsoft for these answers. I did not receive any privileged information for writing this post. All the information I am using, and the insights drawn from it, are based on publicly available information.

When it happened

I'm not sure exactly when this change toward open source happened. Microsoft is a big ship. Once you start steering, it takes a while before you can feel the boat turn. I think it happened around 2008 when they started including jQuery in the default templates. It was the first swing of the wheel. Back then, you could have confused it for just another side project. Today, I think it was a sign of change.

Before this subtle change, we had things like Microsoft Ajax, the Ajax Control Toolkit and so many other reinventions from Microsoft. The same comment came back every time:

Why aren't you using <INSERT FRAMEWORK HERE> instead of reinventing the wheel?

Open source in the Microsoft world

Over 10 years ago, Microsoft wasn't doing open source. In fact, nothing I remember was open sourced. Free? Yes. Open source? No. The mindset of those days has changed.

The Changes

Initiatives like NuGet, integrating jQuery into the Visual Studio templates, the multiple GitHub accounts, and even going as far as replacing the default JSON serializer with JSON.NET instead of writing its own, are all proof that Microsoft has changed and is continuing to change.

It's important to take into account that this is not just lip service here. We're talking real time and money investment to publish tools, languages and frameworks into the open. Projects like Katana and Entity Framework are even open to contribution by anyone.

Not to mention that Roslyn (the new C#/VB.NET compiler) as well as the F# compiler are now open source.

This is huge and people should know.

Where is it going today

I'm not sure where it's going today. Like I said, it's a big ship. From what I see, Microsoft is going 120% on Azure. Of course, Windows and Office are still there, but... we already see that it's not an Open Source vs. Windows war anymore. The focus has changed.

Open source is being used to enrich Microsoft's environment now. Tools like SideWaffle are being created by Microsoft employees like Sayed Hashimi and Mads Kristensen.

When I see a guy like Satya Nadella (CEO) talk about open source, I think it is inspiring. Microsoft is going open source internally then encouraging all employees to participate in open source projects.

Microsoft has gone through a culture change, and it's still happening today.

Comparing Microsoft circa 2001 to Microsoft 2014.

If you have been in the field for at least 10 years, you will remember that way back then, Microsoft didn't do open source. At all.

Compare it to what you've read about Microsoft now. It's been years of change since then and it's only the beginning. Back then, I wouldn't have believed anyone telling me that Microsoft would invest in Open Source.

Today? I'm grinning so much that my teeth are dry.

Categories: Blogs

List of d3.js library for charting, graphs and maps

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24

So I've been trying different kinds of libraries that are based on d3.js. Most of them are awesome and... I know I'm going to forget some of them. So I decided to build a list and try to arrange them by category.

Charts
  • DimpleJS – Easy API, lots of different types of graphs, easy to use
  • C3.js – Closer to the data than Dimple but also a bit more powerful
  • NVD3.js – Similar to Dimple, requires a CSS file for proper usage
  • Epoch – Seems to be more focused on real-time graphs
  • Dygraphs – Focused on huge datasets
  • Rickshaw – Lots of easy charts to choose from. Used by Shutterstock
Graphs

Since I haven’t had the chance to try them out, I won’t be able to provide more detailed comments about them. If you want me to update my post, hit me up on Twitter @MaximRouiller.

Data Visualization Editor
  • Raw – Focus on bringing data from spreadsheets online by simply copy/pasting it.
  • Tributary – Not simply focused on graphics, allows you to edit numbers, colors and such with a user friendly interface.
Geographical maps
  • DataMaps – Not a library per se but a set of examples that you can copy/paste and edit to match what you want.
Categories: Blogs

How to display a country map with SVG and D3js

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24

I've been babbling recently with charts, and most of that was with DimpleJS.

However, behind DimpleJS is d3.js, which is an amazing tool for drawing anything in SVG.

So to babble some more, I've decided to do something simple: draw Canada.

The Data

I've taken the data from this repository that contains every line that forms our Maple Syrup Country. Ours is called "CAN.geo.json". This is a GeoJSON file, which lets you easily parse geolocation data without a hitch.

The Code
var svg = d3.select("#chartContainer")
    .append("svg")
    .attr("style", "solid 1px black")
    .attr("width", "100%")
    .attr("height", "350px");

var projection = d3.geo.mercator().center([45, 55]);
var path = d3.geo.path().projection(projection);

var g = svg.append("g");
d3.json("/data/CAN.geo.json", function (error, json) {
    g.selectAll("path")
           .data(json.features)
           .enter()
           .append("path")
           .attr("d", path)
           .style("fill", "red");
});
The Result

(the rendered SVG map of Canada, filled in red)

Conclusion

Of course this is not something very amazing. It’s only a shape. This could be the building block necessary to create the next eCommerce world-wide sales revenue report.

Who knows… it’s just an idea.

Categories: Blogs

Animating your charts with Storyboard charts from DimpleJS and d3js

Decaying Code - Maxime Rouiller - Fri, 12/19/2014 - 20:24


Storyboards are charts/graphs that tell a story.

To have a graph, you need a timeline. Whether it's days, weeks, months or years... you need a timeline of what happens. Then to have a chart, you need two axes: one that tells one version of the story, the other that relates to it. Then you move things forward in time and you move the data points. For each of those points, you also need to be able to label it.

So let’s make a list of what we need.

  1. Data on a timeline
  2. One numerical value
  3. Another numerical value that correlates to the first in some way
  4. A label to identify each point on the graph

I've taken the time to think about it, and there's one type of data that's easy to come up with (I'm just writing a technical blog post after all).

Introducing the DataSet

I've taken the GDP and population per country for the last 30 years from World Economics and merged them into one single file.

Note: World Economics is very keen to share data with you in formats that are more readable than what is on their website. Contact them through their Twitter account if you need their data!

Sounds simple, but it took me over an hour to actually merge all that data. So contact them to get a proper format that is more developer friendly.

Here's the final result:

(animated bubble chart: GDP vs. population per country, playing through the years)

So this is the result I have.

The Code

That's the most bonkers thing ever: once you have the data properly set up, this doesn't require much code. Here's the code to generate the same graph on your end:

$.ajax("/GDP.csv", {
    success: function (data) {
        var csv = d3.csv.parse(data);

        var post3 = function () {
            var svg = dimple.newSvg("#storyboardGraph", 800, 600);
            var chart = new dimple.chart(svg, csv);

            csv = dimple.filterData(csv, "Year", ["2000", "2001", "2002", "2003",
                "2004", "2005", "2006", "2007", "2008", "2009", "2010", "2011",
                "2012", "2013", ]);
            
            var frame = 2000;
            chart.addMeasureAxis("x", "GDP");
            chart.addMeasureAxis("y", "Population");
            chart.addSeries(["Country"], dimple.plot.bubble);
            var story = chart.setStoryboard("Year");
            story.frameDuration = frame;
            story.addOrderRule("Date");
            chart.draw();
        };
        post3();
    }
});
Conclusion

Stop using weird graphing libraries that will cost you an arm and a leg. Your browser (both desktop and mobile) can handle this kind of technology. Start using it now.

See DimpleJS for more examples and fun scenario to work with. Don’t forget to also follow John Kiernander on Twitter.

As usual, the source is available on Github.

Enjoy!

Categories: Blogs

Mobile Testing Talk: What’s New With Appium [RECAP]

Sauce Labs - Fri, 12/19/2014 - 18:00


Thanks for joining us last week for our latest webinar, Mobile Testing Talk: What’s New With Appium, which featured Jonathan Lipps, the Chief Architect for the Appium Project.

To recap, Jonathan gave listeners a tour of Appium version 1.3.x, including the stability improvements and features the team has added since the Appium 1.0 release back in May of 2014. He also touched on the following:

  •  Appium 1.3.x release features and improvements
  •  New platforms
  •  Better hybrid support
  •  Examples of maturing Appium clients & more

We had great feedback on the content and an influx of interesting questions that got addressed in the Q&A at the end.

Missed the webinar, need to hear it again, or want to share it with your team? Check out the slides below and listen to the recording here:

What’s New With Appium? From 1.0 to Now from Sauce Labs

Want to try Sauce Labs? You can sign up for a free trial anytime on our website, or reply to this email and we’ll be happy to put you in touch with someone for a customized plan.

Look out for our next webinar invitation coming out soon!

Categories: Companies

Fresh UrbanCode Updates for zOS and iSeries

IBM UrbanCode - Release And Deploy - Fri, 12/19/2014 - 17:47

This month has seen a slew of updates for the big enterprise systems. The 6.1.1 Release introduced proper support for iSeries, including support for the Integrated File System.

We have also seen a bunch of content for z/OS recently. If you missed it, the z agents and initial batch of integrations came out in June. But over the last week or so, three new plugins dropped. These plugins will help mainframe customers using UrbanCode Deploy phase out their home grown scripts over time.


Support for the Information Management System (IMS) on z/OS enables running type 1 and type 2 commands on the platform.

The DB2 integration is focused on binding applications or packages.

Finally, CICS Transaction Server. Initially released as a beta, this integration makes it easy to deploy CICS applications and bundles. We are looking for a lot of feedback on this one in particular. Let us know what would be useful to add! We know there's a ton of CICS out there.


Categories: Companies

The 10 Hottest Devices for Mobile App Testing (This Holiday Season)

uTest - Fri, 12/19/2014 - 16:30

The stockings were hung by the chimney with care…full of the latest testing devices.

It’s been a while since we last updated the testing and development world on the most popular devices amongst our community of 150,000+ testers. But we thought — what better time than the holidays to get your favorite tester a gift?

Testers within our community often want to know on which devices they should be testing. Concurrently, developers also want to know where their babies should be given the most love. Based on customer and tester data from our platform, here are the 10 most popular mobile devices on which Applause customers’ apps were tested in the past 90 days:

  1. iPhone 5
  2. iPhone 6
  3. iPhone 4s
  4. iPhone 5s
  5. Galaxy S4
  6. Galaxy S3
  7. iPad 2 Wi-Fi
  8. iPad Mini Wi-Fi
  9. iPhone 4
  10. iPad Air Wi-Fi

As you can see, the iOS experience continually dominates our customers’ worries (especially the new iPhone 6), but some of our honorable mentions are in the Android realm. These include the Galaxy S5, Nexus 5, and Nexus 7. As an added bonus, the Nexus 6 has also surfaced as a device that many customers are starting to seek in terms of coverage for their apps.

This of course will always fluctuate as soon as the next big device comes out (will wearables change the mobile scene next year?), which is why we plan to update you every quarter on the hottest devices. But on the flip side, as the iPhone 4 and 5 (released several years ago now) can attest, it's always a safe bet that as long as a device still has widespread adoption "in the wild," there's a need for you to test your apps on it.

Happy Holidays, and enjoy your new mobile device when testing this season.

Categories: Companies

Sharing Code Coverage Results With NCover Code Central

NCover - Code Coverage for .NET Developers - Fri, 12/19/2014 - 16:17

share_code_coverage_results_twitterOne of the keys to successful team-based code coverage, is making sure that code coverage results are transparent and actively shared across the entire team. Organizations that share results team-wide are able to encourage rapid feedback and are generally more effective in guiding the future allocation of resources than those that do not. With the latest version of NCover Code Central, sharing code coverage results is easier than ever.

Sharing Code Coverage Results

We will discuss two new ways that can be used independently, or together, to share code coverage with individuals involved in your project.  They are (1) the updated self-contained HTML summary report and (2) public browsing.

Self-Contained HTML Code Coverage Report

With NCover Code Central, you can produce either summary or detailed HTML code coverage reports.  These reports can be created from the command line or the NCover Code Central GUI.

The Summary Report

The summary report includes the metrics from the specified execution along with the corresponding trend graphs displayed in NCover Code Central for that specific project. Here is an example summary report for a sample project.

(screenshot: example HTML summary report)

This HTML file can be uploaded, emailed or shared with any individuals who need access to the report.  The report also includes links back to NCover Code Central to review the project in greater detail.

The report can be generated from the NCover Code Central GUI by selecting the report icon within a specific execution.

(screenshot: the report icon within an execution in the Code Central GUI)

The summary report can also be generated from the command line using the ncover report command with a usage similar to this:

ncover report --project="ProjectName" --execution="Test1" --file="MyPath" --detail="summary"

The report shows the coverage results of the most recent execution by default, if no execution is specified by the user.

The Detailed Report

New to Version 5 of NCover Code Central, the HTML code coverage report can now include detailed information including Types/Classes and Methods and also supports the ability to filter the generated report.  The detailed information is contained with the HTML and expands and collapses based on user interaction.  Here is an example of the HTML Code Coverage Report including detail:

(screenshot: example HTML report with expandable detail)

This report provides the dual benefit of (1) still supporting a direct link back to the project in NCover Code Central but (2) also providing much greater detail embedded within the report.  This means users without an Internet connection and/or access to the NCover Code Central server can still see much more coverage data than ever.

Determining the type of detail and which filters, if any, you would like to apply is handled through the NCover GUI via the report icon.

(screenshot: detail and filter options for the report in the GUI)

The detailed report can also be run via the command line using the following options:

Options:
 --project=VALUE NCover project name. *REQUIRED*
 --execution[=VALUE] Specify execution by Caption, Build Id, or Date.
 --file[=VALUE] File name for export.
 --filter[=VALUE] Filter name or id.
 --detail[=VALUE] Report detail (CCSVR): summary, class, method
 --timeout[=VALUE] Timeout request after number of seconds
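For example, a detailed, filtered report could be produced with a command along these lines (the project, execution and filter names are placeholders):

ncover report --project="ProjectName" --execution="Test1" --file="coverage-detail.html" --detail="method" --filter="MyFilter"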

As a self-contained HTML report, both the summary and the detailed report can be kept as a build artifact, shared via email or as a static web page.

Public Browsing

NCover Code Central has always provided the ability to manage access to NCover projects via the Code Central admin page. By adding users, admins could share project results to any authorized user with access to the NCover Code Central server.

With the latest version of NCover Code Central, admins can also provide “public” access on a project-by-project basis, allowing any individual with access to the NCover Code Central server to view results through the NCover GUI, but without requiring login credentials.

(screenshot: the Browse Coverage setting in the project settings)

From the project settings for any project from the NCover GUI, any admin can choose to set Browse Coverage to “No login required” and immediately provide “public” access to the coverage results.  The default state for all projects is “Code Central login required” and public browsing can be revoked at any time.

By combining the HTML Code Coverage Report with the option to browse coverage data without a Code Central login, teams can quickly share results and improve the transparency of code coverage results, all with less management overhead than ever.

The post Sharing Code Coverage Results With NCover Code Central appeared first on NCover.

Categories: Companies

Merry Christmas! BugBuster v3.0.5 executes test scenarios 30% faster now!

BugBuster - Fri, 12/19/2014 - 16:09

We've just released a new version of BugBuster which improves the execution time of test scenarios by 30%!

This release also bundles the following changes:

  • Improved action execution speed (~500ms faster per recorded action).
  • Support for CSS transition to determine the stability of the current screen
  • Better handling of carousels leading to a faster stabilization of the screen
  • Fixed the generated assertions for is disabled and is enabled in BugBuster Test Recorder

 

So please head over to the login page and see these performance improvements for yourself!

 

Enjoy the xmas break and accept our best wishes for the upcoming year!

The post Merry Christmas! BugBuster v3.0.5 executes test scenarios 30% faster now! appeared first on BugBuster.

Categories: Companies
