
Feed aggregator

Recap: Best Practices in Mobile CI [WEBINAR]

Sauce Labs - Wed, 04/29/2015 - 19:38

Thanks to those of you who joined us for our last webinar, Best Practices in Mobile Continuous Integration, with Kevin Rohling. The webinar covered topics like:

  • What makes mobile CI so different
  • Best ways to use emulators and simulators in testing
  • Suggestions for CI tools and mobile testing frameworks

Missed the presentation, want to hear it again, or share with a colleague?

Listen to the recording HERE and view the slides below.

Best Practices in Mobile CI (webinar) from Sauce Labs

For this webinar, we took a poll of our listeners about mobile CI, and we thought we’d share the results. 198 people answered the following questions:

What CI tools do you use for mobile testing?
a. I don’t use CI – 18%
b. Jenkins – 27%
c. Travis – 2%
d. Bamboo – 6%
e. Ship.io – 0%
f. Other – 6%

What % of your mobile functional testing is automated vs. manual?
a. All manual – 15%
b. 1 – 25% automated – 25%
c. 26 – 50% automated – 10%
d. 51 – 75% automated – 7%
e. 76% – 100% automated – 1%

What % of your mobile tests are on emulators vs. real devices?
a. 100% emulators / no real devices – 10%
b. Up to 75% emulators / up to 25% real devices – 12%
c. Up to 50% emulators / up to 50% real devices – 9%
d. Up to 25% emulators / up to 75% real devices – 18%

Mobile definitely has a long way to go to catch up to web app testing. We’ll be sure to keep sharing tips and tricks for optimizing your flow. Happy testing!

Categories: Companies

Top 10 Paid Software Testing Projects at uTest: Week of April 27

uTest - Wed, 04/29/2015 - 14:30

The Apple Watch has recently landed in the hands of thousands of users around the globe (including many of our own uTesters!). Because of this, it’s no surprise that the hot wearable is one of the most sought-after devices by our global customer base for testing coverage on their apps. For the week of April […]

The post Top 10 Paid Software Testing Projects at uTest: Week of April 27 appeared first on Software Testing Blog.

Categories: Companies

Infographic: Announcing 250 Million Tests Run on Sauce Labs!

Sauce Labs - Wed, 04/29/2015 - 13:00

It’s true! We’re celebrating the fact that more than 250 million tests have been run on our platform! It’s crazy to think that we announced just over 100 million tests at the end of February, 2014. That’s an increase of 150% in just 14 months.

This time we thought we’d take a look at how our ecosystem has been growing as well, including our work with Appium, a cross-platform mobile test automation framework sponsored by Sauce Labs and a thriving community of open source developers.

Check out more Sauce-y stats in our celebratory infographic below. (Click to enlarge.)

[Infographic: 250 million tests and counting]

Categories: Companies

JavaScript Learning - Resource List

Yet another bloody blog - Mark Crowther - Wed, 04/29/2015 - 11:08
I've had this list of sites sitting on my system for some time. As I often get asked about good sites for learning either Ruby or JavaScript, I thought I'd share the list here. Feel free to suggest additions and I'll update the list. When I get five minutes I'll add a permalink to my website too.

A single resource will rarely teach you all you need to know, or explain it in just the right way for your learning style or current understanding. Hack away at one of the sites, then switch to another to both reinforce what you've already covered and learn new things.

Remember: study for 40 minutes per day for 30 days.


Tools to Code with
http://www.aptana.com/products/studio3.html

http://www.sublimetext.com/3


Online Reference Books
http://eloquentjavascript.net/

http://addyosmani.com/resources/essentialjsdesignpatterns/book/

http://www.javascriptenlightenment.com/


JavaScript Tutorials
https://developer.mozilla.org/en-US/docs/Web/JavaScript
https://developer.mozilla.org/en-US/docs/Web/JavaScript/A_re-introduction_to_JavaScript

http://torusoft.com/blog/5-days-of-code-curriculum-day-1

http://www.w3schools.com/js/

https://www.codementor.io/learn-javascript-online

https://ilovecoding.org/courses/learn-javascript-in-14-days/

http://javascript.info/

http://htmldog.com/guides/javascript/

http://www.freecodecamp.com/

http://www.codeavengers.com/

http://www.codecademy.com/en/tracks/javascript-combined

http://channel9.msdn.com/Series/JavaScript-Fundamentals-Development-for-Absolute-Beginners

https://www.khanacademy.org/computing/computer-programming

https://docs.webplatform.org/wiki/Beginners
https://docs.webplatform.org/wiki/javascript

http://davidwalsh.name/tutorials/javascript


JavaScript Components
https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API

http://www.webrtc.org/


JavaScript Frameworks / Libraries
https://jquery.com/

https://angularjs.org/

https://facebook.github.io/react/

https://nodejs.org/


Management
http://gruntjs.com/

https://www.npmjs.com/package/npm


JavaScript Browser Test Automation
http://nightwatchjs.org/

https://code.google.com/p/selenium/wiki/WebDriverJs

https://www.npmjs.com/package/selenium-webdriver

http://jasmine.github.io/


JavaScript Game Engine / Framework
http://www.babylonjs.com/ 

https://playcanvas.com/

http://www.limejs.com/ 

https://developer.mozilla.org/en-US/docs/Web/API/Path2D

https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API/A_basic_ray-caster

https://developer.mozilla.org/en-US/demos/ 


Game Making Tutorials
http://code.tutsplus.com/tutorials/build-your-first-game-with-html5--net-20786

http://www.html5gamedevelopment.com/html5-game-tutorials

http://blog.sklambert.com/

http://spyrestudios.com/30-tutorials-for-html5-browser-games/

https://developer.mozilla.org/en-US/docs/Web/WebGL/Getting_started_with_WebGL



Categories: Blogs

4 Reasons Bugs Are Missed

PractiTest - Wed, 04/29/2015 - 10:09
The following is a guest blog post by Cullyn Thomson from Tellurium. You can follow Cullyn on Twitter at @CullynT and Tellurium at @te52app, and we also suggest you visit the Test Talk blog. You can also read more about Tellurium – a "Plain English" automated testing tool – on their site, http://www.te52.com.

You’re a discerning, thorough, and creative software tester, right? You bet! So how is it that bugs still manage to make it to production? Here are 4 reasons why a bug may be missed – and what you can do about it:

 

1. Testing didn’t hit the right combination of factors to trigger it.

Sometimes triggering a bug takes the perfect storm of the right (or wrong?) web browser, browser version, OS, screen dimensions, device… Because testing can never cover everything, it’s possible that you’ll never hit that specific bug-triggering combination. When that happens, a bug may slip through to production and stay hidden until a user discovers it “in the wild.”

What you can do about it: Whenever possible and practical, test your application under several different combinations of conditions. Pay particular attention to the combination most commonly used by your customers.
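To make the "right combination of factors" point concrete, here is a minimal, hypothetical sketch in JavaScript (Jasmine-style). The configuration list and the checkoutWorksOn() helper are illustrative placeholders, not something from the original post; the idea is simply to generate one spec per combination so gaps in coverage are at least explicit.

// Hypothetical sketch: one Jasmine spec per browser/OS/viewport combination.
// The config list and checkoutWorksOn() are placeholders, not a real API.
var configs = [
  { browser: 'chrome',  os: 'Windows 7',  width: 1366, height: 768 },
  { browser: 'firefox', os: 'OS X 10.10', width: 1280, height: 800 },
  { browser: 'safari',  os: 'iOS 8.3',    width: 375,  height: 667 }
];

// Stub so the sketch runs on its own; a real version would drive the app
// (for example via WebDriver) under the given configuration.
function checkoutWorksOn(cfg) {
  return Promise.resolve(true);
}

describe('checkout flow', function () {
  configs.forEach(function (cfg) {
    it('completes on ' + cfg.browser + ' / ' + cfg.os, function (done) {
      checkoutWorksOn(cfg).then(function (ok) {
        expect(ok).toBe(true);
        done();
      });
    });
  });
});

Even a small matrix like this makes it obvious which combinations you are (and are not) covering, which is exactly where this kind of bug hides.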

 

2. It’s been there for a long time and has been forgotten.

Ah, the backlog! It’s supposed to allow us to keep track of all of the bug fix tickets so we can prioritize them and work them into the sprint or project plan. Although we have good intentions when we create a follow-up issue to resolve a bug, unfortunately sometimes the backlog is like the Island of Misfit Toys. These tickets are created, dropped in the backlog, and forgotten – and thus, their bugs are forgotten too.

What you can do about it: Whether your team is Agile or not-so-Agile, make sure that someone treks through the backlog from time to time to make sure that the bugs filed there aren’t forgotten. You may even discover some that don’t exist anymore and can be closed!

 

3. Someone noticed it but didn’t speak up.

With the tension between testers and developers that plagues some companies, it’s not always easy to point out a bug. In fact, it can be downright intimidating. This is even more true when the tester is new to the testing craft or new to the team, or when the developer is a highly respected senior programmer. If a tester notices a bug – or even just a possible bug – and doesn’t mention it, the chances of getting that issue resolved are pretty slim indeed.

 

What you can do about it: Foster an open, respectful, and approachable team culture. Make sure that all testers, regardless of experience level, know that they don’t need to be afraid to ask questions and point out things that don’t seem quite right.
4. Another bug obscured it.

Few things are as effective at hiding a bug as another bug that prevents you from triggering it or reaching it in the first place. Say you’re testing a new feature. Things seem to be going well, but suddenly you reach a point where further testing isn’t even an option because of a bug. For example, maybe you need to click a link to actually use that new feature, but the link isn’t there at all. You’ll have no way of knowing that there’s a problem with how that new feature works on a mobile device if you can’t even get the new feature to appear.

What you can do about it: Be sure to take thorough testing notes on what you did and didn’t cover in any given round of testing – you’ll help both yourself and the dev. Once that failed ticket is fixed and testable, start from the beginning and test it in full.

What do you think are some of the common reasons bugs make it to production? Share your thoughts in the comments below.
Categories: Companies

Ranorex 5.3.2 Released

Ranorex - Wed, 04/29/2015 - 09:00
Ranorex 5.3.2 has been released and is now available for download.

General changes/Features
  • Added support for iOS 8.3
  • Added support for Firefox 38
  • Added support for Chromium web browser
  • Added QtItem capability providing common attributes for Qt list/tree items and cells
Please check out the release notes for more details about the changes in this release.

Download the latest version of Ranorex.
(You can find a direct download link for the latest Ranorex version on the Ranorex Studio start page.) 

Categories: Companies

Announcing SonarQube integration with MSBuild and Team Build

Sonar - Wed, 04/29/2015 - 00:20

This is a cross-post of Microsoft ALM web site.

Technical debt is the set of problems in a development effort that make forward progress on customer value inefficient. Technical debt saps productivity by making code hard to understand, fragile, difficult to validate, and creates unplanned work that blocks progress. Technical debt is insidious. It starts small and grows over time through rushed changes, lack of context and lack of discipline. Organizations often find that more than 50% of their capacity is sapped by technical debt.

SonarQube is an open source platform that is the de facto solution for understanding and managing technical debt.

Customers have been telling us and SonarSource, the company behind SonarQube, that SonarQube’s analysis of .Net apps and its integration with Microsoft build technologies need to be considerably improved.

Over the past few months we have been collaborating with our friends from SonarSource and are pleased to make available a set of integration components that allow you to configure a Team Foundation Server (TFS) Build to connect to a SonarQube server and send the following data, which is gathered during a build under the governance of quality profiles and gates defined on the SonarQube server.

  • results of .Net and JavaScript code analysis
  • code clone analysis
  • code coverage data from tests
  • metrics for .Net and JavaScript

We have initially targeted TFS 2013 and above, so customers can try out these bits immediately with code and build definitions that they already have. We have tried using the above bits with builds in Visual Studio Online (VSO), using an on-premises build agent, but we have uncovered a bug around the discovery of code coverage data which we are working on resolving. When this is fixed we’ll send out an update on this blog. We are also working on integration with the next generation of build in VSO and TFS.

In addition, SonarSource have produced a set of .Net rules, written using the new Roslyn-based code analysis framework, and published them in two forms: a NuGet package and a VSIX. With this set of rules, the analysis that is done as part of the build can also be done live inside Visual Studio 2015, exploiting the new Visual Studio 2015 code analysis experience.

The source code for the above has been made available at https://github.com/SonarSource.

We are also grateful to our ever-supportive ALM Rangers who have, in parallel, written a SonarQube Installation Guide, which explains how to set up a production ready SonarQube installation to be used in conjunction with Team Foundation Server 2013 to analyse .Net apps. This includes reference to the new integration components mentioned above.

This is only the start of our collaboration. We have lots of exciting ideas on our backlog, so watch this space.

As always, we’d appreciate your feedback on how you find the experience and ideas about how it could be improved to help you and your teams deliver higher quality and easier to maintain software more efficiently.

Categories: Open Source

Book Review: Software Architecture for Developers

thekua.com@work - Tue, 04/28/2015 - 22:48

Simon Brown’s book, Software Architecture for Developers, has been on my reading list for some time. I am aware of the talks Brown gives at conferences, and of his very good workshop on drawing more effective diagrams as a communication mechanism from developers to other groups, but I wasn’t quite sure what his book was going to cover.

Software Architecture for Developers

This weekend, whilst travelling, I had a bit of airport time to plough through his book.

What I enjoyed about the book
Architecture is a touchy subject, and Brown doesn’t have any problems raising it as a contentious topic, particularly in the agile community, where it doesn’t have an explicit practice. Some XP books explain the role, but mantras like “Big Design Up Front” and “Last Responsible Moment” are often (wrongly) interpreted as “do no architecture.” What I liked about Brown’s approach is his recognition of the Goldilocks approach – not too little and not too much – where he provides both points of view and some concrete practices.

Brown covers important topics like quality attributes (cross-functional requirements) and what the role of an architect is (and that it is just a role, not necessarily a person). I am biased on this topic, but I enjoyed Brown’s perspective on whether or not architects should code, and it aligns well with my own point of view that for a Tech Lead (or Architect) to make effective decisions, they need to have empathy and understand (live, breathe and sometimes burn for) the decisions they make.

I appreciated the way that Brown puts “Constraints” and “Principles” as key factors that aren’t necessarily represented in the codebase and are unlikely to be easily discoverable for new people. Both are things that I have done when leading software teams and are things I would repeat because I find it helps people navigate and contribute to the codebase.

What I found slightly strange about the book

I believe the book is really strong, but there were a few sections that seemed slightly out of place, or not yet completely finished. One was the section arguing that “Sharepoint projects need architecture too”, which I don’t necessarily disagree with, but which could easily be extended to “any software product extended to build an application needs architecture too” (cue s/Sharepoint/CMS/g or other examples).

Conclusion

Software Architecture for Developers is a very accessible, relevant and useful book that I do not have any problems recommending for people looking at how to effectively implement Software Architecture in today’s environment.

Categories: Blogs

Spirent Launches DevOps Solution for Continuous Testing

Software Testing Magazine - Tue, 04/28/2015 - 19:12
Spirent Communications has launched Spirent CLEAR DevOps, a solution to help organizations accelerate product development and deployment through continuous testing. The solution addresses the DevOps challenge of incorporating the testing within the continuous integration and deployment cycles. Spirent CLEAR DevOps integrates Continuous Integration (CI), Continuous Deployment (CD) and Continuous Change Management (CCM) with Spirent’s automation and orchestration tools for Continuous Testing (CT). “As technology providers migrate to software-based networking and virtualize more of their network elements, traditional approaches to product development and testing are inefficient,” said Patrick Johnson, general manager of ...
Categories: Communities

Damn Bugs and Overlook Merge into Lean Testing

Software Testing Magazine - Tue, 04/28/2015 - 18:48
It’s a bit of an understatement to say that we’ve been shaking things up at Damn Bugs recently. Today, the bug tracker formerly known as Damn Bugs and the test case management software formerly known as Overlook have merged into Lean Testing, the complete test management solution. Lean Testing brings together powerful yet simple tools to help you ensure that your software products are of the highest quality. * Test plan creation: create your own test plans, or use iOS and Android readiness templates to ensure that your products are thoroughly tested ...
Categories: Communities

Things to Remember When Writing Tests

Software Testing Magazine - Tue, 04/28/2015 - 16:18
Writing tests should be part of the normal routine of every software developer. That is, however, not always the case. In his book “Bad Tests, Good Tests”, Tomek Kaczanowski provides some interesting tips to improve your test writing activity. * Doing things the right way takes more time than doing them any old way. But this extra effort usually pays off in the long term. * Hard to write a test? Maybe the production code is of low quality? …or maybe you should consider writing tests before production code? * Writing code ...
Categories: Communities

Identify Bad Service Oriented Architectures Through Metrics

There are many advantages to breaking an application into smaller services. When APIs and interfaces are well defined, it allows more independent development on separate code bases, keeping the risk that a single code change breaks the whole app low. It allows for more flexible and scalable deployments when done right and it is […]

The post Identify Bad Service Oriented Architectures Through Metrics appeared first on Dynatrace APM Blog.

Categories: Companies

Apple or Google: Who Will Reign Supreme With Mobile Payments?

uTest - Tue, 04/28/2015 - 15:00

Google and Apple are seasoned adversaries at this point, with each company constantly threatening to move in on each other’s territory and steal market share. For example, Google has gotten involved with device manufacturing (see: Google Nexus Tablets), an area which has historically been Apple’s bread and butter, and Apple has engineered a Maps application, […]

The post Apple or Google: Who Will Reign Supreme With Mobile Payments? appeared first on Software Testing Blog.

Categories: Companies

Congreso del Comité Español de Empresas de Pruebas Software (SSTQB), Madrid, Spain, May 8 2015

Software Testing Magazine - Tue, 04/28/2015 - 13:47
The Congreso del Comité Español de Empresas de Pruebas Software (SSTQB) is a one day-conference on software testing organized in Madrid by the Spanish Software Testing Qualifications Board. This event aims to explain and discuss the importance of software testing and its definition. Presentations will be made by Spanish and international software testing experts. This is a free event. All talks will be in Spanish. Web site: http://www.sstqb.es/congreso.html Location for Congreso del Comité Español de Empresas de Pruebas Software (SSTQB): NH Collection Eurobuilding, C/ Padre Damián 23, 28036 Madrid, Spain
Categories: Communities

New Guide Provides 6 Best Practices for Requirements Reuse

The Seapine View - Tue, 04/28/2015 - 04:30

To accelerate time to market and cut costs, many product development teams can take requirements written for similar projects and reuse them for a new project. Not sure how to get started with requirements reuse? Our newest guide, 6 Best Practices for Requirements Reuse, can help!

This guide provides an overview of the following key reuse practices:

  1. Document the requirements
  2. Tune up existing requirements
  3. Begin with the end in mind
  4. Avoid excessive granularity
  5. Develop a pattern
  6. Link the dependencies

These best practices can help your team achieve time- and cost-savings goals without sacrificing product quality. The tool you use to write and manage requirements can help too. For example, TestTrack includes handy features to help you reuse requirements. We hope you’ll find this free guide helpful and learn some new ways to get the most out of your requirements.

Get the Guide

The post New Guide Provides 6 Best Practices for Requirements Reuse appeared first on Blog.

Categories: Companies

Translating Web App Functional Testing To Mobile

Sauce Labs - Mon, 04/27/2015 - 21:20

This is a guest post by Greg Sypolt, a Senior Engineer at Gannett Digital and automated testing expert.

Technological advancements and the explosion of devices across multiple platforms mean hardware and software developers have a much more difficult job. They have to keep up with the demand to develop and roll out new products. One of the most significant issues is accounting for the differences in system response when handling mobile traffic rather than traditional web traffic.

Applications must be tested to make certain they run responsively on key platforms and across numerous networks. Effective functional testing eases the pressure on device manufacturers while allowing application developers the time to collect applicable metrics that improve product quality.

A New Set of Challenges

Testing mobile applications is completely different from, and significantly more complicated than, testing traditional desktop and web applications. Mobile devices don’t just emulate the desktop environment; they have their own set of requirements. Mobile app testing is far trickier because of the following key aspects.

Device Variation:

Mobile applications run on devices that have different:

  • Operating Systems (OS) such as iOS, Android, Windows and BlackBerry;
  • Versions of those OS; and
  • Manufacturers such as Apple, Samsung, Nokia, Motorola and LG.

When an application needs to run on multiple OSes, devices, and varied screen sizes, QA teams are faced with the challenge of ensuring the application functions in every environment. The graphic below, showing the iOS support matrix alone, illustrates the complexity of even a single series of devices and a single OS type.


source: iossupportmatrix.com

Availability of Mobile Testing Tools:

Desktop and web application-based testing tools cannot be used for testing mobile applications, so a new testing framework is required. Some of the frameworks currently available for writing test scripts are:

[Image: frameworks currently available for writing mobile test scripts]

Network Challenges:

You must be able to effectively emulate various bandwidth rates, because your end users will be operating on a variety of networks and bandwidths. You also must be able to test geographically distributed loads accurately to reflect real-world traffic.

Screen Size and Densities:

With the diversity of screen sizes and densities available today, you must also be able to test your mobile app on different screen configurations so that it:

  • Fits on small screens;
  • Takes advantage of the additional space on larger screens while still looking good on smaller screens. For example, the difference in screen size between iPhone 4 and iPhone 6+;
  • Is optimized for both landscape and portrait orientations.

Peripheral Mode:

When testing on mobile, there are physical elements that also need to be considered, unlike with web applications. You may need to consider both wireless and wired peripheral-mode testing:

  • Wireless to the device, such as near field communication (NFC), Bluetooth or a stylus; and
  • Wired, either internal to the device, such as a headphone jack, or external, such as a credit card reader or bar code scanner.

Native Applications and Browser-Based Applications:

These are the variables that come with a mix of native and browser-based applications. Native applications reside on the user’s device and communicate over HTTP(S). Browser-based applications use a modified version of a browser to access applications online. As many companies use both types of application to offer solutions to their customers, you must support testing of all types of mobile applications.
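As a rough, hypothetical illustration of how that split shows up in test automation (the capability names follow the common Appium/WebDriver convention; the app path is a placeholder), a native session is pointed at the app binary while a mobile-web session is pointed at the device browser:

// Hypothetical Appium-style capability sets; the app path is a placeholder.
var nativeCaps = {
  platformName: 'iOS',
  platformVersion: '8.3',
  deviceName: 'iPhone Simulator',
  app: '/path/to/MyApp.app'        // native: install and drive the app binary
};

var mobileWebCaps = {
  platformName: 'iOS',
  platformVersion: '8.3',
  deviceName: 'iPhone Simulator',
  browserName: 'Safari'            // mobile web: drive the site in the device browser
};

Much of the rest of the session setup can often be shared between the two, which is one reason supporting both types from a single framework is attractive.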

Taking Challenges on “Strategy First”

What do these testing challenges mean for web developers and site owners? Primarily, for every web application designed, you must also address a strategy to test the product in the mobile space. Building an app with all the features and functionality needed by the client and user is important. Having a rigorous mobile testing plan in place before the mobile app is deployed is even more crucial to its success.

Mobile applications are becoming more and more sophisticated, significantly increasing the requirement for functional testing. To tackle this, organizations that require app testing are always exploring alternatives to traditional manual testing.

Mobile automation testing is a highly effective approach to mobile app QA. It provides significant business returns when executed using the right tools and infrastructure, while factoring in cross-platform challenges. From sites to web applications to native mobile applications, test automation tools bring full-featured functional testing to mobile platforms.

Strategy for Mobile Test Automation

Automated testing can sometimes be a black hole. The best thing to do is to automate something and measure it against a precise objective within a realistic timeframe. Knowing your hard or soft Return on Investment (ROI) goals helps as well.


Mobile testing requires a balanced approach, and the key is understanding your company’s mobile strategy.

Target Device Selection:

You simply cannot test applications on every device that exists. However, cloud-based device emulation tools allow you to increase that breadth more and more. The best approach is to analyze the market and choose a representative set of devices that reduces the effort of executing multiple test cases. A few factors to consider are OS version, screen resolution and form factor (smartphones and tablets), while also ensuring the multi-device and multi-platform compatibility of the app.

Emulators:

Emulators mimic the software and hardware environments found on actual devices and provide excellent options, such as the ability to bypass the network and use a real-world environment via modem, where actual users run and interact with those applications on their devices. You will still need to test on physical devices at some point, but for most applications this can be done more ad hoc. Find the right mix of emulators and physical devices to get the best results!

Tool Selection Criteria:

Test automation tools create a framework to systemize the testing of native and web mobile apps across platforms. For instance, Appium is one of the few frameworks that supports cross-platform scripts in multiple scripting languages. Collaboration between iOS and Android developers is crucial to ensuring that every element that looks the same has the same accessibility label, which makes the app more testable.
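As a minimal sketch of why shared accessibility labels matter (assuming the wd JavaScript client and a local Appium server on its default port; the app path and the 'loginButton' label are hypothetical):

// Minimal, hypothetical sketch: a shared accessibility label lets the same
// locator work against both the iOS and Android builds of the app.
// The app path and the 'loginButton' label are placeholders.
var wd = require('wd');

var driver = wd.promiseChainRemote('localhost', 4723);

driver
  .init({
    platformName: 'iOS',
    platformVersion: '8.3',
    deviceName: 'iPhone Simulator',
    app: '/path/to/MyApp.app'
  })
  // The same call would work against the Android build,
  // provided it exposes the same accessibility label.
  .elementByAccessibilityId('loginButton')
  .click()
  .fin(function () { return driver.quit(); })
  .done();

If the two platform teams diverge on labels, cross-platform scripts quickly degenerate into per-platform locator tables, which is why that collaboration matters.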

An important step in this process is creating a list of requirements to review when choosing a tool for evaluation. Some questions to ask as you determine your requirements are:

  • Am I looking for a Behavior Driven Development (BDD) framework?
  • Do I need to support native, hybrid, mobile first (responsive web design) or mobile web?
  • Do I need to support diverse mobile platforms such as Android, iOS or Windows?
  • Am I looking to test locally or in the cloud to reduce the cost of ownership?
  • Am I looking to test against emulators or real devices?
  • Do I need a framework that supports cross-platform scripting?
  • Do I need a framework that offers an easy interface for auto-generating test scripts – for instance, so scripts can be created without any programming or scripting language knowledge?
  • Am I practicing continuous integration and need a tool that integrates into the broader environment seamlessly?
  • Am I looking for a framework that will attract existing developers to contribute to my automation goals?

Take a step back and consider your resources. Does your QA team have sufficient programming knowledge for automation development? For automation, you must have people with some programming knowledge. If not, do they have the technical capabilities to easily adapt to the new technologies?

Answers to these questions will help guide your team to pick the best framework, and also to understand the team’s capability to execute on testing.

Automation Environment:

Automation environment and setup depends on the approach to testing. Automation testing approaches are either cloud-based or local.

Cloud-based testing provides web-based automation platforms that can be accessed from anywhere in the world with a good internet connection. It is one standard way to achieve automation of native and hybrid apps, and the “automation-from-anywhere” capability is a big advantage. Most cloud solutions let you run scripts from your own test framework.

Local testing involves setting up tools in your own test environment and leveraging either emulators/simulators or physical devices, automating tests with popular open-source tools such as Appium, Espresso, or KIF. The additional considerations with on-premises testing are the breadth you can cover and the time it takes to set up these environments.
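The practical difference is mostly where the WebDriver endpoint lives. A hypothetical sketch using the selenium-webdriver Node.js package (the server URLs, credentials and capabilities below are placeholders):

// Hypothetical sketch: the same test code can target a local server or a
// cloud grid; only the server URL (and any vendor credentials) changes.
var webdriver = require('selenium-webdriver');

function buildDriver(serverUrl, capabilities) {
  return new webdriver.Builder()
    .usingServer(serverUrl)
    .withCapabilities(capabilities)
    .build();
}

// Local Appium/Selenium server running on this machine (placeholder URL).
var localDriver = buildDriver('http://localhost:4723/wd/hub', {
  platformName: 'Android',
  deviceName: 'Android Emulator'
});

// Hosted (cloud) grid; the hostname and credentials are placeholders.
var cloudDriver = buildDriver('http://USERNAME:ACCESS_KEY@hub.example.com/wd/hub', {
  platformName: 'iOS',
  deviceName: 'iPhone Simulator'
});

Keeping the endpoint and capabilities in configuration rather than in the tests makes it easy to move between the two approaches as your needs change.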

Mobile Testing, The Practice

Though the techniques and tools used to automate mobile application testing are complex, we can learn a lot from the days of client/server desktop applications. A few extra mechanisms are required for mobile automation, however:

  • Use a mobile test framework such as Espresso or KIF; this makes it easy for developers and testers to write scripts in the native programming language and opens the door for pair programming;
  • Identify the requirements and categorize them based on the mobile application type – native, hybrid or mobile web;
  • Identify the scope and the devices that meet the requirements;
  • Identify the automation tool that is the best fit, filtering candidates with a Proof of Concept (POC) that proves the automation can produce real results;
  • Design the framework architecture based on the initial requirements.

The test automation strategy runs in parallel with the framework design, which details the technical scope, the test environment, and how scripts will be run on emulators or physical devices with the appropriate automation approach.

Mobile Device Matrix:

In addition to the testing mechanisms, you have to consider the matrix of device types, device OSes, and device browsers to test against. This matrix is significantly larger than in web application testing because each device type has different screen resolution possibilities and different features. For example, the thumbprint authentication available on some iPhones is not available on others, yet applications can leverage this feature. Templates for building a mobile automation device matrix should account for the following (a minimal sketch of such a matrix follows the list):

  • Operating Systems – customizations, missing libraries, driver issues;
  • Screen Size – rendering issues, usability, missing layouts;
  • Pixel Density – density independence, missing layouts;
  • Aspect Ratio – X,Y calculations, overlapping panels, display issues;
  • System on a Chip (SoC) – hardware performance, instruction set, battery signal;
  • Carrier – network protocol, speed, responsiveness, packet loss.
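As an illustrative sketch only (the devices listed are placeholders, not a recommended matrix), the matrix can be kept as plain data and expanded into the capability sets the test runner iterates over:

// Illustrative only: a small device matrix kept as data, expanded into one
// capability set per device/browser pair for the test runner to iterate over.
var deviceMatrix = [
  { deviceName: 'iPhone 5s',     platformName: 'iOS',     platformVersion: '8.3', browsers: ['Safari'] },
  { deviceName: 'iPhone 6 Plus', platformName: 'iOS',     platformVersion: '8.3', browsers: ['Safari'] },
  { deviceName: 'Nexus 5',       platformName: 'Android', platformVersion: '5.0', browsers: ['Chrome'] }
];

function expandMatrix(matrix) {
  var targets = [];
  matrix.forEach(function (device) {
    device.browsers.forEach(function (browser) {
      targets.push({
        platformName: device.platformName,
        platformVersion: device.platformVersion,
        deviceName: device.deviceName,
        browserName: browser
      });
    });
  });
  return targets;
}

// Each entry becomes the capability set for one run of the suite.
console.log(expandMatrix(deviceMatrix));

Versioning this data alongside the tests keeps the matrix visible and reviewable as devices are added or retired.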

Conclusion

To build a successful mobile test automation strategy, all stakeholders must understand the business value of automation. Further, all teams, including development, need to buy in to the process. Picking the right automation tool and building the right testing environment are typically the most difficult challenges when implementing a successful test automation process. However, you do need to focus on the long-term success of your product, so it’s helpful to address these important questions when starting the process:

  • What are your automation objectives?
  • Will your current development processes need to change?
  • What test environment support do you need?
  • What skills will you need?

Process and Organization + Environment + Technical + Resources + Scope and Roadmap = Test Automation Strategy.

Mobile testing tools are relatively new, and developing an automation strategy specific to your organization ensures that you will get the most business value out of your automation tool. A crucial piece of laying the groundwork for mobile app testing is building a well-balanced testing portfolio that includes automated unit testing, integration testing, WebView testing, automated UI testing and exploratory manual app testing of the UI. Appium shows great promise in this area.

Making the leap into mobile testing today is not easy. The good news is that there are many tools being developed to support mobile testing efforts, but the biggest part of the challenge is integrating your testing into your existing environment. The important thing to know is that just because it is hard does not mean you should reduce your efforts compared to your web application. That would result in a disconnect in software quality, and could ultimately destroy your mobile application strategy from within.

By Greg Sypolt

Greg Sypolt is a Senior Engineer at Gannett Digital with 10 years of focus on project quality, results, and customer satisfaction while serving in multiple leadership roles. He is an expert in all areas of testing – including functional, regression, API, and acceptance testing – in both the automated and manual realm. Greg is experienced with test strategy planning, including scope, estimates, and scheduling. Greg also assumed the role of Automation Architect while helping convert a large scale, global test team from a manual to an automated focus.

Categories: Companies

Pragmatic Unit Testing in C++

Testing TV - Mon, 04/27/2015 - 17:58
Successful adoption of unit testing goes beyond picking a framework: the effectiveness of unit testing is dependent on run-time analysis, static analysis, and other tools to make up the “iron triangle” necessary to get profitable increases in feature velocity and MTBF in the field. We’ll cover where to start in a legacy codebase to get the […]
Categories: Blogs

Is Groovy Better for Testing than Java?

Software Testing Magazine - Mon, 04/27/2015 - 17:06
Two years ago, we introduced Spock tests into the MongoDB Java driver. The decision could be considered controversial – the project used no external dependencies in production code, and was 100% Java. But there was a back door… with Gradle as the build system, there was a tiny excuse to use Groovy in the project, provided it was not in the production code. That was all the excuse we needed to start using Spock for unit and later, integration tests. Groovy has a lot of advantages as a testing language, and ...
Categories: Communities

.NET Community Shout Out

NCover - Code Coverage for .NET Developers - Mon, 04/27/2015 - 12:36

Our .NET community is not one to sit still (and we love you all for that!). Today we wanted to celebrate two very busy .NET community members who keep us on our toes with all the new skills they learn and teach us. Thank you for all you do.

Roberto Freato

Roberto Freato combines his two passions – computer science and working independently – as a freelance IT consultant and trainer. Whether teaching, writing or speaking, Roberto shares his affinity for software architecture and development, prototyping, analysis, training, and improving the customer relationship.

Roberto has been a Windows Azure MVP since 2012 and is the co-author of Microsoft Azure Development Cookbook. He maintains additional certifications with Sun, EXIN, Apple, Cisco, and IBM. Roberto’s other interests include Windows Phone, cloud computing, mobile, ASP.NET/IIS, developer security and .NET (over 40 and counting!).

Keep up with Roberto on his website.

Vidya Vrat Agarwal

As a .NET purist, blogger, community speaker, and author, Vidya Vrat Agarwal works as a .NET consultant. With over 14 years of experience, Vidya loves contributing to the .NET community through his consulting work, his blog, and as a .NET MVP.

Vidya has also been recognized as a C# Corner MVP and lifetime member of the Computer Society of India (CSI). Outside of .NET, his specialties include architecting and building solutions for Win Forms, ASP .NET, MVC, SQL Server/BI, WCF, SOA, Windows Azure, SDL, MSF, Agile-scrum, TOGAF, Big Data, and Hadoop.

Stay connected with Vidya on Twitter at @dotnetauthor.

 

The post .NET Community Shout Out appeared first on NCover.

Categories: Companies

Overcoming test automation challenges

Agile Testing with Lisa Crispin - Sun, 04/26/2015 - 21:17

Janet Gregory and I enjoyed participating in the Quality in Agile conference in Vancouver April 20-21. We paired on a keynote: “Do testers need to code… to be useful?” Our opinion in a nutshell: testers need technical awareness to collaborate effectively with all their team members, but our software delivery teams should already have expert coders!

Even if they don’t write code, testers need to participate in automating regression tests and other useful automation, in collaboration with programmers, business stakeholders and others. Janet and I facilitated an all-day workshop on advanced topics in agile testing, with a focus on automation.

Challenges around automation

After some introductory slides, the 15 workshop participants self-organized into three smaller groups, choosing to sit with people that had similar goals for the day, or who had experience related to their goals. Each person listed their team’s impediments to automation, one per sticky note, and we grouped these on a big wall chart.

Automation Challenges

Next, everyone dot voted on the topics they wanted to tackle during the workshop. The top three vote-getters were:

  • Culture and responsibility – whose job is it to automate?
  • Lack of time for automation activities
  • Things that make tests hard to automate, such as complexity

(Note, you can find higher resolution photos of all the session wall charts, including those not on this post.)

Formulating the problem and brainstorming ideas to overcome it

Mind map and problem statement for cultural challenges

Each group was tasked with writing their own problem statement for the culture and responsibility topic. It is challenging to write a good problem statement! You can see an example at the bottom of the mind map at left. Once the problem was defined, everyone picked up a Sharpie and each team mind mapped on their big piece of easel pad paper.

One group focused on a lack of shared vision and investment in automation at the company level. Another saw a lack of education on both sides.

Third group’s mind map is on the right – my individual pic of it was blurred

I thought it was interesting that the third group exploring culture and responsibility mentioned doing social activities together, and honing soft and tech skills including being respectful of each other.

For topic #2, after each group wrote their problem statement around the lack of time for automation, we tried a different brainstorming technique: Brainwriting. Each person wrote their ideas for dealing with complexity and other things that make automation difficult on a plain piece of paper. Every three minutes, they passed their paper to the group member to their right. They read what was written already, then wrote more ideas. This continued until each person within the group had written on each paper. Most people agreed that reading other peoples’ ideas jogged new ones for themselves. This technique lets people who might not be comfortable coming forward to draw on a mind map or say their ideas aloud contribute equally.

Example problem statement

Ideas for dealing with a brittle app

Ideas for good code design and for collaboration

For topic #3 (sample problem statement to the left), we did “brainwriting with a twist”, an idea of Janet’s. Each team started by drawing, mindmapping or brainwriting ideas on a big flip chart page. After 10 minutes, each group moved to the next group’s flip chart, read the problem statement and ideas, and added their own. Some specific ways to design better automation code came out of this, as well as ideas for better tester-coder collaboration and ways to make these problems more visible.

Designing experiments

Example experiments

Of course, there is more to problem solving than brainstorming ideas. Janet presented a model of Esther Derby’s (left): define the problem and desired outcome, understand the context and requirements around potential solutions, design experiments, try them and evaluate the results. Each team spent time coming up with experiments they will try when back with their own teams. We hope that participants will report back to us on how their experiments went!

Got automation challenges – or any challenges related to quality and testing, for that matter? Get your team together, try out some brainstorming techniques, make it comfortable and safe for each person to contribute their ideas. Identify the biggest problem, brainstorm a couple of experiments to try to make that problem smaller, and use your retrospectives to evaluate the results. Keep experimenting, inspecting and adapting. Over time, your problems will be smaller and your successes bigger. But remember to celebrate even the small successes!

The post Overcoming test automation challenges appeared first on Agile Testing with Lisa Crispin.

Categories: Blogs
