
Meet the uTesters: Iwona Pekala

Wed, 11/26/2014 - 23:24

Iwona Pekala is a gold rated full-time tester on paid projects at uTest, and a uTester for over 3 years. Iwona is also currently serving as a uTest Forums moderator for the second consecutive quarter. She is a fan of computers and technology, and lives in Kraków, Poland.

Be sure to follow Iwona’s profile on uTest so you can stay up to date with her activity in the community!

uTest: Android or iOS?

Iwona: Android. I can customize it in more ways when compared to iOS. Additionally, apps have more abilities, there is a lot of hardware to choose from, and it takes less time to accomplish basic tasks like selecting text or tapping small buttons.

uTest: What drew you into testing initially? What’s kept you at it?

Iwona: I became a tester accidentally. I was looking for a summer internship for computer science students (I was thinking about becoming a programmer), and the first offer I got was for a tester role. I planned to change it, and after some time I did transition to a developer role. It was uTest that kept me a tester, particularly the flexibility of the work and the variety of projects.

uTest: Which areas do you want to improve in as a tester? Which areas of testing do you want to explore?

Iwona: I need to be more patient and pay closer attention to detail. When it comes to hard skills, I would like to gain experience in security, usability, and automation testing.

uTest: QA professional or tester?

Iwona: I describe myself as a tester, but those are just words, so it doesn’t really matter what you call that role as long as you know what its responsibilities are.

uTest: What’s one trait or quality you seek in a fellow software testing colleague?

Iwona: Flexibility and the skill of coping with grey areas. As a tester, you need to accommodate changing situations, and you hit grey areas on a daily basis. It’s important to use common sense, but still stay in scope.

You can also check out all of the past entries in our Meet the uTesters series.

Categories: Companies

New uTest Platform Features Emphasize Quality

Tue, 11/25/2014 - 16:56

Last week, uTest launched two new Platform features for uTesters on paid projects that move the needle in our continuous pursuit of quality (plus a very useful change to existing tester dashboard functionality). Here’s a recap of what is included in the latest uTest Platform release.

Bug Report Integrity

Most testers understand that the role of a bug report is to provide information. However, a “good” or valuable bug report takes that a step further and provides useful, actionable information in an efficient way. As such, in addition to approving tester issues, Test Team Leads (TTLs) and Project Managers (PMs) now have the ability to rate the integrity of a tester’s bug report by setting it to High, Unrated, or Low. By default, all bugs are set to Unrated.


The Bug Report Integrity feature will reward testers who meet a high report integrity standard by providing a positive rating impact to the quality sub-rating. Conversely, we will also seek to educate testers who may be missing the mark by negating any positive impact that may have occurred based on the value of the bug itself.
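As a rough illustration of the policy described above, the three integrity levels might map onto a quality sub-rating impact like this. This is a hypothetical sketch only: uTest does not publish its actual formula, and the point values and the `rating_impact` helper are invented for illustration.

```python
# Hypothetical model of the described policy (not uTest's real formula):
# High integrity rewards the tester with a positive rating boost, Low
# negates the positive impact the bug's value would otherwise contribute,
# and Unrated leaves the bug's own value as the impact.

def rating_impact(bug_value_points: float, integrity: str) -> float:
    """Return the quality sub-rating impact for one approved bug (illustrative)."""
    if integrity == "High":
        return bug_value_points * 1.5   # reward: boost beyond the bug's own value
    if integrity == "Low":
        return 0.0                      # negate any positive impact from the bug
    if integrity == "Unrated":
        return bug_value_points         # default: the bug's value stands as-is
    raise ValueError(f"unknown integrity rating: {integrity}")
```

Under this model, a High rating on a 10-point bug yields more credit than the same bug left Unrated, while a Low rating wipes out the credit entirely.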

For more information, please review the Bug Report Integrity uTest University course.

Tester Scorecard

When navigating into a test cycle, you will see a new tab called “Tester Scorecard.” Clicking this tab will bring up a ranked list of testers based on their bug submissions and the final decisions on these bugs — i.e. approvals and rejections.

Points are awarded according to the explanation at the top of the Scorecard and result in a score used to rank testers by performance. The table can be sorted by any of its columns. If two testers have identical scores (i.e. the same number of bugs approved at the same value tiers), the tester who started reporting bugs first ranks higher.
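A minimal sketch of those ranking rules, with purely hypothetical point values (the real tiers are the ones explained at the top of the Scorecard itself):

```python
# Illustrative sketch of the Scorecard ranking as described: score testers
# by the final decisions on their bugs, then rank by score with ties broken
# in favor of whoever started reporting bugs first. Point values are assumed.
from dataclasses import dataclass

POINTS = {"approved": 10, "rejected": -2}  # hypothetical values

@dataclass
class Tester:
    name: str
    decisions: list          # final decision on each submitted bug
    first_report_order: int  # 1 = first tester to report a bug in the cycle

def score(tester: Tester) -> int:
    return sum(POINTS[d] for d in tester.decisions)

def ranked(testers):
    # Higher score first; on a tie, the earlier reporter ranks higher.
    return sorted(testers, key=lambda t: (-score(t), t.first_report_order))
```

With two testers tied on identical approvals, `ranked` places the one who reported first at the top, matching the tie-break rule above.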

Our hope is that this Scorecard will spark some additional competition among top performers and will also be useful for PMs and TTLs to choose testers for participation bonuses. Of course, it is still at the discretion of the TTL or PM to decide who won any bug battles or is eligible for any bonus payments.

Note: Scores indicated on the scorecard do not impact the tester’s rating.

Score Card

Feature Change: Payout Card

Additionally, there was an improvement to existing functionality within the tester dashboard. Pending payouts are now included so that testers can easily see how much they have earned:

Payout Card

If you like what you see, feel free to leave your comments below, or share your ideas on these and other recent platform updates by visiting the uTest Forums. We’d love to hear your suggestions, and frequently share this valuable feedback with our development team for future platform iterations!


Meet the uTesters: David Oreol

Mon, 11/24/2014 - 16:15

David Oreol has been a uTester since the very beginning, and is a full-time Test Team Lead Premier and Gold-rated tester on paid projects at uTest. Before joining the community, David earned a B.S. in Computer Science from California State University, Fresno and worked in IT and as a software engineer.

Be sure to follow David’s profile on uTest so you can stay up to date with his activity in the community!

uTest: Android or iOS?

David: For work, both. I like testing on both environments, but for personal use, it is iOS and Mac all the way. I like the ease of use and integration between the mobile and desktop platforms. I don’t like having to constantly tweak my phone or computer to get it to work. I used to be a die-hard Windows fan, but I switched to Mac a few years ago and haven’t looked back.

uTest: What drew you into testing initially? What’s kept you at it? 

David: I’ve always been one to sign up for beta testing of apps I use, so it was a natural fit. I have a degree in Software Engineering as well, so that certainly helps out. What’s kept me going is the variety of products. I’ve tested everything from hardware devices to websites to Mac and PC apps to iOS and Android apps. Many of the products I have gotten to test weren’t available to the public yet. Seeing something that I tested out in the wild is a big thrill for me, even if I can’t tell anyone that I worked on it.

uTest: What’s your go-to gadget?

David: For work, my new iPhone 6 Plus. I’m finding some interesting bugs with it since it has the larger screen and the new wider landscape layout. For relaxing, I love my Kindle Paperwhite. The e-ink screen is so much easier on my eyes than a traditional backlit screen. I think that everyone that reads a lot should own an e-ink reader.

uTest: What is the one tool you use as a tester that you couldn’t live without?

David: My 27” iMac. The large screen really helps with big spreadsheets for work. Additionally, OS X has built-in virtual desktops that are super easy to use. I normally run 7 desktops with different browsers and tools on each one. It’s almost like having multiple monitors, but without taking up all my desk space.

uTest: What keeps you busy outside testing?

David: Lately, I’ve been running and walking a lot. I enjoy the time away from the computer. Otherwise, I spend most of my time with my wife and playing with our ferrets. We also really enjoy hiking and tent camping.

You can also check out all of the past entries in our Meet the uTesters series.


Why Are Testers Uninterested in Upgrading Their Skill Sets?

Fri, 11/21/2014 - 22:47

“The only type of testing that I can do is manual testing.”
“Test automation is very important, but I am too busy now to learn something new.”
“Test automation is useful, but I will learn it when I will need it.”
“I am interested in test automation, but I don’t know any programming and it will take a long time to learn it.”
“I want to learn test automation, but my employer does not have any training programs.”

Have you ever heard any of these stories? I have, and not only once, but many times, about test automation, load testing, and web service testing.

Most of the testers I know say in one way or another that they would like to learn more about their profession, but “not now, maybe later: when conditions are better, when they need the new skills in their job, when their employer pays for their training, when someone trains them for free, when they are less busy.” The list goes on.

People sometimes say the same things about fitness: I will do it tomorrow, when I have more time, when I need it, and so on. I have certainly done this many times as well.

But why exactly are testers not interested in learning new skills? Actually, to take things a bit further, why are testers the least interested in upgrading their skills out of all people that work in IT? Can it be because testing is seen as an easy job that anyone can do? Or because there is still no formal education track for testing? Or because some testers could not do other IT jobs well and needed a way out? Because of complacency? Maybe because of affluence and a high standard of living? Or possibly because of the illusion that things that they did yesterday will be there for them forever?

Who knows. I certainly don’t. But it is something that I see a lot. And recently, I asked other people what they think about it: Are testers the IT people least interested in learning new things?

One of the people I asked is a development director for a large development company with thousands of developers and hundreds of testers. He hires lots of testers all the time. The other three I talked with are IT recruiters who know the IT market very well. They all agreed with my observation. And none of them had better answers than me.

What do you think?

Alex Siminiuc is a uTest Community member and Gold-rated tester and Test Team Lead on paid projects at uTest. He has been testing software applications since 2005…and enjoys it a lot. He lives in Vancouver, BC, and blogs occasionally.


Google Test Automation Conference: Video From Days 1 & 2

Fri, 11/21/2014 - 00:24

The Google Test Automation Conference (GTAC) is an annual test automation conference hosted by Google, bringing together engineers to discuss advances in test automation and the test engineering computer science field.

GTAC 2014 was held just a few weeks ago at Google’s Kirkland office (Washington State, US), and we’re happy to present video of talks and topics from both days of the conference.

If 15-plus hours of video below just isn’t enough, be sure to also check out all of our Automation courses available at uTest University today.


Testing Tool Showdown: liteCam HD vs. Mobizen

Wed, 11/19/2014 - 23:36

Clear, to-the-point bug reports that are backed up with solid evidence are a must for testers when it comes to communicating with developers and getting to the root cause of issues quickly.

And that evidence comes in the form of attachments, which add to a bug report by offering proof of the bug’s existence, enabling the customer or developer to reproduce and quickly rectify the issue at hand.

But with all of the options out there, we wanted to single out a couple that could get testers started, so we turned to two popular screen recording tools from our uTest Tool Reviews: liteCam and Mobizen.


liteCam has a four-star average review from our uTesters, and while a couple of testers appreciated that “it packs all the features they need in a single UI that greatly improves their video recording workflow,” performance issues with frequent crashes marred the experience for one tester. Also in liteCam’s favor: it comes in both a Free edition (videos are watermarked) and a Paid edition.


Mobizen is also a popular screen recording tool amongst our tester base, with an identical four-star average review. Testers have called out its high frame rate, ease of use and installation, and great support on tablets. Another key standout of this particular tool is that it is 100% free.

Which of these screen recording tools gives you the most bang for your buck when it comes to bug report documentation? Be sure to leave your feedback in the Tool Reviews section of uTest or in the comments below.

If you end up choosing one of these options, also be sure to check out our recent uTest University courses on how to set up liteCam HD or Mobizen for screen recording.


Latest Testing in the Pub Podcast Takes on Testing Weekend Europe

Tue, 11/18/2014 - 20:00

Steve Janaway and team are back for more pub pints over software testing discussion in the latest Testing in the Pub podcast.

In Episode 13, UK-based software testers Amy Phillips and Neil Studd talk Weekend Testing Europe. Weekend Testing Europe is the European chapter of Weekend Testing and was just relaunched in 2014 by Amy and Neil.

Weekend Testing is a program that aims to facilitate peer-to-peer learning through monthly Skype testing sessions. If you’ll also recall, uTest contributor Michael Larsen is a founding member of the Americas chapter of the program.

Be sure to check out the podcast to learn more about the monthly sessions. If you’re from Europe and interested in participating thereafter, you can send an email and/or ping Weekend Testing Europe on Skype (ID: europetesters). You can also follow them on Twitter @europetesters.

The latest edition of the podcast is available right here for download and streaming, and is also available on YouTube and iTunes. Be sure to check out the entire back catalog of the series as well, and Stephen’s recent interview with uTest.


Join uTest for a #uTestChat Twitter Chat on Friday

Tue, 11/18/2014 - 16:15

This Friday, November 21st, we are excited for you to join @uTest on Twitter for #uTestChat starting at 1:00 p.m. EST. It’s time to huddle around the virtual water cooler and connect with your fellow software testers as we chat about all things software testing.

Have a question about furthering your career or breaking into a new testing type? How to write a great bug report? What’s the best testing tool for the job? Bring your thoughts and opinions, and get ready to spend some time connecting with the testing community.

What is a Twitter chat?

A Twitter chat (or tweet chat) is a live, real-time conversation between a group of people on Twitter. The group follows one hashtag (#uTestChat) and your moderators from the Community Management team (Linda and Ryan) will keep the discussion moving.

Why have #uTestChat?

The goal of #uTestChat is to connect and engage the vibrant and active testing community on Twitter.

How do I participate?

Sign up for a Twitter account if you don’t have one already. Without an account, you can still follow along with the #uTestChat hashtag, but you won’t be able to participate in the conversation (and what fun would that be?).

Once you have your own account, follow a few accounts like @uTest and @Applause so that you get the hang of how Twitter is used. Then show up online at 1:00 p.m. EST on Friday and get ready to chat!


Meet the uTesters: Michael Solomon

Mon, 11/17/2014 - 16:30

Michael Solomon is a Silver-rated tester on paid projects at uTest, hailing from the United States (New York). When he’s not testing software, Michael works as a freelance sound man in TV. You can visit some of his work over at his website.

Be sure to follow Michael’s profile on uTest so you can stay up to date with his activity in the community!

uTest: Android or iOS?

Michael: iOS! I have had an iPhone since the first one came out, and I think I have owned every model since. I do have a Samsung Galaxy S4 for testing purposes. While the Android platform has become easier for me to understand, I most definitely prefer iOS and its ability to sync seamlessly with my MacBook Air, Calendars, iMessage, etc.

uTest: What drew you into testing initially? What’s kept you at it?

Michael: When I came upon the site, I didn’t have formal testing experience. Turns out that having an inquisitive mind and a good head on your shoulders can be just as valuable as having formal technical training.

What’s kept me at it? The fact that it turns out I’m fairly good at it! I’ve always been a stickler for grammar, so I think that helps massively in my bug reports. I’m constantly going back into my bug reports to tweak a word here and there, which probably goes unnoticed by the Test Team Lead (TTL), Project Manager (PM), and client.

uTest: What’s your go-to gadget?

Michael: My iPhone 6 for sure! I love it. It’s a beautiful phone. Console logs are a breeze with Xcode, videos look beautiful, and in conjunction with HandBrake, you can attach some very high-resolution videos to your bug reports that measure up at just around three megs.

uTest: What’s one trait or quality you seek in a fellow software testing colleague?

Michael: Clear bug reports. Being part of a cycle at uTest involves going through the previous bugs that have been raised to make sure you’re not posting a duplicate. A good bug title goes a long way in terms of making my job easier when it comes to figuring out if I am posting a bug that has already been posted.

uTest: What keeps you busy outside testing?

Michael: Outside of testing, I stay busy with my day job, which is recording dialogue for many different types of TV programs. You can find my audio in various programs ranging from Impractical Jokers to the unveiling of the Lord & Taylor Holiday Windows with Nick Jonas, and from the first episode of Cake Boss to the Banksy Does New York documentary on HBO/HBO GO.


Selenium: 10 Years Later and Still Going Strong

Fri, 11/14/2014 - 18:43

In the field of testing technologies, it isn’t very often that we see a tool survive and grow richer over more than a decade. Selenium recently completed 10 years, and this article takes a look at the ecosystem that has grown around it.

Agile and Selenium

The agile manifesto has been around longer than Selenium, and more and more teams are looking to agile software development to reduce their feedback cycles and practice continuous delivery. One of the practices that teams need to do well when working the agile way is test automation.

Test automation may seem easy — but in order for it to be truly effective, the team needs to have a lot of discipline in defining their process, choosing the right tools and technologies to give that quick feedback by running various types of tests (smoke, regression, etc.), and also allow the test automation to evolve and scale.

That said, even today, completing test automation in the same iteration along with development is a huge challenge for most teams. These challenges get aggravated and more apparent when test automation uses tools and technologies that are difficult to keep in sync with the rapidly changing product.

It was 2004 when Jason Huggins, while testing an internal application at ThoughtWorks, created a JavaScriptTestRunner that changed the way browser automation (browser-based testing) is done. This evolved into “Selenium,” which was subsequently made open source.

Where is Selenium today?

Selenium has evolved and adapted to the changing test environment, and here’s a quick glance at where it currently stands in the industry:

  • Today, it is extremely rare to find someone involved in browser-based testing who does not know about, or has not heard of, Selenium
  • There are many other frameworks implemented in various languages that build on Selenium and offer more capabilities for test automation
  • There are various innovative companies that have built products on top of Selenium to offer more value to clients – e.g. Sauce Labs, BrowserStack
  • Service organizations across the world have their top layers of management talking about and selling Selenium-based offerings to prospective clients
  • There are numerous organizations that have sprouted over the years, selling Selenium-automation-based testing services
  • Selenium is one of the top-searched skill sets in job profiles (related to test automation)

The Selenium team is working hard and fast to come up with more capabilities, features, and also standardize browser and mobile-browser automation support.

Why is Selenium so popular?

So how is it that Selenium survived for a decade and is getting stronger and more popular? Well – there are many reasons for this:

  • The most important reason – it simply “works.” It does what the APIs claim to do.
  • It is open source – so one can dig deep to understand how it works, and if it needs to change, the power is in the hands of each individual in the community. You don’t need to be in the ‘elite-class’ to have access to the core workings of the same (by elite-class, I mean that you probably need special license/hardware to use the tool, and because it is expensive to do so, organizations/teams would probably have only a few of these available for some team members).
  • The community support is awesome. If the Selenium core contributors cannot answer in time, there is a vast amount of expertise in the community – it’s quite possible someone has encountered something similar, and ideas, suggestions, and workarounds come in quickly. Fixes also land very quickly.
  • It is free – so other than having good programming language expertise, there is no other ‘tool-cost’ involved
  • Given the growth in adoption of agile, Selenium-based automation fulfills all the requirements of teams needing to do test automation, if done right!
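To make the “it simply works” point above concrete, here is a rough sketch of a basic WebDriver check using Selenium’s Python bindings (one of several language bindings). This is an illustration, not code from the article; it assumes the third-party `selenium` package plus a local Firefox and geckodriver, and the import is deferred into the function so the sketch loads even where Selenium is not installed.

```python
def page_title(url: str) -> str:
    """Open a browser, load `url`, and return the page title.

    Requires the third-party `selenium` package and a local
    Firefox/geckodriver; the import is deferred so merely loading
    this sketch does not need them.
    """
    from selenium import webdriver  # pip install selenium

    driver = webdriver.Firefox()
    try:
        driver.get(url)
        return driver.title
    finally:
        driver.quit()  # always close the browser, even on failure
```

A test built on this helper would simply assert on the returned title — the API does what it claims, which is exactly the experience the first bullet describes.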
What’s the roadmap for an agile tester?

The future of testing is very bright. We have a plethora of devices and technology advancements happening every minute. It can seem scary (and for good reason) to those who are not on the path of learning and adopting what science and technology hold for us.

Here are some thoughts of what you can do to get on that path:

  • The role of QA has become much more complex and requires a mindset shift as well as technical disciplines
  • Get involved and contribute towards product architecture – this will help understand what needs to be tested, what needs to be automated at which layer of the test pyramid, and why
  • Understand that test automation is a form of development. Get proficient at developer practices, and apply your testing hat to become great at test automation
  • While websites were the challenge to tackle 10 years ago, QAs must now understand how to automate testing not only for websites, but also for mobile and interactive applications
  • Behavior-Driven Testing (BDT) is one of the ways you can identify and build a good regression suite and avoid common anti-patterns

This article is an excerpt from my contribution to an eBook titled Perspectives on Agile Testing. Read the full article, and many others on testing in the past decade, in the eBook, which is available for download.

Anand Bagmar is a Test Practice Lead at ThoughtWorks, a software design, creation and delivery company credited with creating and open-sourcing what is now the de facto standard for cross-platform/cross-browser web-app functional testing: Selenium. You can find out more about Anand on his profile, his blog, or follow him on Twitter @BagmarAnand.

Be sure to also check out our 8-part uTest University course series on the basics of Selenium.


Safety Language in Software Testing: Why It’s Not OK to Deal in Absolutes

Thu, 11/13/2014 - 21:38

Of course this has been tested. This is definitely working as it should be.

How many times has a tester or developer uttered these words to only have them come back and haunt them? Or worse, lose credibility? As a tester, it seems like a no-brainer to use CYA language in your everyday work. Heck, one just has to look to prolific software tester James Bach’s recent talk at CAST to figure that out (“I’m reluctant to say ‘pass.’ I’d rather say, I don’t see a problem as far as I can tell”).

But is “safety language,” such as ‘it seems to be’ versus ‘it is,’ something that should be a part of every tester’s skillset? Gold-rated tester on paid projects and uTest Forums Moderator Milos Dedijer seems to think so. It was a discussion topic that recently cropped up in the uTest Forums:

Some time ago, I had an argument with my team lead about my use of safety language. I tend to use it in any argument, and I believe that it’s a good practice. I don’t use it in my factual reports, but I do use it frequently in my descriptive reports. For example, if I say that a set of steps has been executed I don’t use safety language to report results, but if I say that a certain feature has been tested I use safety language almost all of the time. Using safety language to preserve uncertainty appears to be one of the skills a tester must have.

And he was not alone:

I’ll tell you the most important reason to use safety language. When you are definitive as in “This feature has been fully tested” or “I’m positive this is the behavior” then you better be right or you will lose credibility. I never say I am 100% certain of something unless I can prove it or I’ve just looked at it.

It was also a fair assumption that this tester may have felt the same way:

If you read my writings you may notice that I generally use safety language and rarely use absolutes. I feel that in most cases this is a reasonable approach for testers.

What do you think? Is safety language a must for testers and an underrated trait? Feel free to leave your comments below or join in on the ongoing discussion over at the uTest Forums.


Applause Announces the Ovation Awards: Vote for Your Favorite Apps

Wed, 11/12/2014 - 19:00

As testers working with hundreds of apps each year, you probably already have a good idea which ones stand out in the pack. Now’s your chance to let that be known to the world.

360-degree App Quality company Applause is excited to announce The Ovation Awards, the only app awards that measure what brands & developers truly seek: the love and loyalty of users and experts.

We’re looking to you not only as testers, but as app users, to vote for your favorite apps from a list of 200 finalists across 10 categories (and across both iOS and Android). We have a panel of expert judges who will be poring over your selections and making their decisions. Here’s the timeline of the awards:

  • Public voting: Nov 12 – Dec 14 – Vote for your favorite apps – vote for just one, or vote for 20 (one per category per OS) from our pool of 200 finalists. This is a big part of what our panel of expert judges will consider.
  • Judging – Our panel includes accomplished mobile engineers, journalists, CEOs and others who understand apps inside and out. Oh, and that means testers, too. You may recognize long-time uTesters Lena Houser, Allyson Burk and Michael Larsen who are also on our esteemed panel! The judges will look at YOUR votes –  as well as the analytics used by our in-house team of data scientists to help decide the 200 finalists – in order to choose the winners across 10 categories and the overall grand prize winner for each operating system.
  • Winners: Announced January 14, 2015 – The winner for each category + OS will be announced, as will the grand-prize, overall winners for iOS and Android.

Let your voice ring loud and clear. Be sure to vote today for your favorite apps in the Ovation Awards!


Authors in Testing Q&A With Mobile Tester Daniel Knott

Wed, 11/12/2014 - 17:05

Daniel Knott has been in software development and testing since 2008, working for companies including IBM, Accenture, XING and AOE. He is currently a Software Test Manager at AOE GmbH in Germany where he is responsible for test management and automation in mobile and Web projects. He is also a frequent speaker at various Agile conferences and now a published author. You can find him over at his blog or on Twitter @dnlkntt.

In this uTest interview, Daniel explains the biggest mobile testing pain points that come up in his user groups, and gives us a preview of his recently released book, Hands-On Mobile App Testing. At the conclusion of the interview, you’ll also receive a link to an exclusive discount for the purchase of the book.

uTest: You’ve been involved in software testing in general, but what specifically drew you into mobile testing?

Daniel Knott: Back in 2011 when I was working at XING AG in Hamburg as a software tester for web applications, I had the chance to switch to the XING mobile team to establish the QA processes. Working on this team was a great experience. I had the chance to build up a test automation framework for Android and iOS from scratch and establish a mobile testing process. I was also free to try several things out to find the right tools and workflow for my company and the development environment. This time and experience was just awesome and convinced me to focus on the mobile world.

uTest: What’s your go-to mobile device?

DK: My current device is the LG Nexus 5. I really love the device and its features. I also appreciate the raw Android experience and that the latest updates always arrive on my device (I am really looking forward to Android Lollipop). Before the Nexus 5, I had another Nexus device and several iOS devices. At home, I use an Android tablet with a bigger screen to browse the web.

uTest: You host two software testing user groups in Germany. What is the most common challenge or pain point that comes up amongst mobile testers in these sessions?

DK: Well, the first group I organize is the Software Test User Group Rhein Main (STUGRM), which focuses on general software testing topics. The second group, the Rhein Main Mobile Quality Crew, is a brand-new user group holding its kickoff event in November. However, we have already had some talks and meetings about mobile testing at STUGRM, and the biggest pain points were always around mobile test automation. So many people are looking for the “right” mobile test automation solution. Currently, a single solution that works across all the different mobile platforms doesn’t exist.

I always recommend starting simple and using some of the great open source mobile test automation tools that are available for almost every platform. Try the tools with your app and see how they integrate into your company’s development environment. Every environment is different, and not every tool fits every environment. This is an important point that people involved with mobile testing must know. There is no need to buy expensive frameworks to cover mobile test automation.

Besides test automation, the other big pain point from my point of view is the missing mobile mindset. Mobile testing is different from normal software testing of web or desktop applications. It requires lots of manual testing on real devices and testing in the wild, in the real-world environment where customers will use the app. For example, it is not enough to test a mobile app only in the office over a Wi-Fi network; you need to test on real data networks to see the real behavior of your app.

uTest: You’re a published author now with your new book on mobile testing. Were there any authors in testing that were your muse for this book?

DK: Well, not really. My muses were the readers of my blog. When I started the blog in 2011, I never expected that so many people would read my posts about mobile testing. I got so much great feedback and ideas from my readers, and this convinced me to start writing a book about mobile testing.

However, there are a few people who inspired me. For example, there are Lisa Crispin and Janet Gregory with their books about agile testing. Their books are just great and cover so many important points in the agile testing world that can also be applied to mobile testing. Then there is Elisabeth Hendrickson with her book Explore It!, which is also a must-read for all software testers. Other than that, there are so many great people in the software testing industry who inspire me every day when I read their tweets and blogs.

uTest: One of the chapters in your new book details important skills mobile testers should have. Beyond good communication skills and a natural curiosity, which you mention any tester should have, which traits most separate a mobile tester from other testers?

DK: The biggest difference is in-the-wild testing, and the ability to be on the move while you test mobile applications. Software testers who test web or desktop applications don’t need to walk through cities or the countryside while testing. This is completely different for mobile testers, who must be on the move and cover different real-world usage scenarios. For example, this includes testing apps on different carrier data networks and using all the different sensors a smartphone offers.

A uTest-exclusive 20% discount is also available valid through December 31, 2014, for Daniel Knott’s new book, Hands-On Mobile App Testing.

Categories: Companies

uTest Partner BlazeMeter Hosts Performance Testing with JMeter Webinar

Tue, 11/11/2014 - 21:17

JMeter is the leading open source load and performance testing tool, and cloud-based performance testing provider BlazeMeter will be hosting a live webinar next week giving testers and developers everything they need to run performance testing with the popular tool.

In this webinar on Wednesday, November 19, at 1pm Eastern Time, BlazeMeter’s Ophir Prusak will cover all of the vast capabilities and lesser known limitations of the popular open source load tool. The session will consist of three parts:

  • How to run performance testing with JMeter. Learn best practices, tips, and what you can and can’t do with JMeter.
  • Why it’s worth using BlazeMeter with JMeter. Learn the benefits and additional features you can get by running performance tests through BlazeMeter.
  • Live Q&A. Ask Ophir your questions about JMeter or BlazeMeter.

BlazeMeter, a proud uTest partner, provides next-generation, cloud-based performance testing solutions that are 100% Apache JMeter™-compatible, and was founded with the goal of making it easy for everyone to run sophisticated, large-scale performance and load tests quickly, easily and affordably.

You can register right now for the live webinar next week.

Additionally, for more information on the popular open source load/performance testing solution, be sure to also check out the Tool Reviews of BlazeMeter and JMeter from the Tool Reviews section of our site.

Categories: Companies

Top Tweets from Better Software and Agile Development Conference East

Tue, 11/11/2014 - 18:11

As I wipe the frost off of my car and see my breath in the cold New England air, I’m especially envious of the folks down in Orlando this week for the Better Software and Agile Development East Conferences.

Testing and development professionals who registered for one of the two shows running concurrently this week received access to both the Better Software and Agile Development conferences, held at the Hilton Orlando Lake Buena Vista in Orlando, Fla., November 9-14.

The Agile Development Conference has been recognized for bringing together prominent thought leaders in the agile universe, while the Better Software Conference encompasses the entire software development life cycle, and true to its name, covers learning how to build better software. All week, both conferences have boasted keynotes featuring recognized thought leaders, in-depth tutorials and an expo floor with the latest software development and testing solutions.

For those unable to make it down to sunny Orlando this week for the 2-for-1 conference, here are some of the top tweets from the shows using the hashtag #BSCADC.

"Don't make yourself valuable to your company. Make your self valuable to your industry." @docjamessw #csp #pmiagile #bscadc

— Liza Wood (@brightcdns) November 10, 2014

"Believe: Lack of belief in yourself is career kryptonite." @docjamesw #csp #pmiagile #bscadc

— Liza Wood (@brightcdns) November 10, 2014

Mapping is sparking rich conversations as evidenced by much pointing at #bscadc

— David Hussman (@davidhussman) November 10, 2014

"What drives you crazy & pisses you off is your passion." @docjamesw #BSCADC #inspired

— Magda Lena (@lutoborska) November 10, 2014

Do you have a culture of heroism, blame or neglect? These are NOT the cultures of high performing #Agile teams. #pmiagile #bscadc

— Liza Wood (@brightcdns) November 10, 2014

#BSCADC quote "if it's not scripted, then you're not lazy enough."

— Christopher R Harvey (@thechrisharvey) November 10, 2014

If we have no way to tell when we're done…then why are we starting? #BSCADC #softwaretesting #softwareqa #testing

— Chris Okelberry (@theokester) November 10, 2014

In testing, focus on Intent over Implementation. #BSCADC #softwaretesting #softwareqa #testing

— Chris Okelberry (@theokester) November 10, 2014

"Stop trying being innovative unless you just insist on being a failure (…) be useful instead." @docjamesw #derivation #BSCADC

— Magda Lena (@lutoborska) November 10, 2014

You cannot always choose your influencers, but you can choose how you’re influenced by them. @docjamesw #csp #pmiagile #bscadc

— Liza Wood (@brightcdns) November 10, 2014

Categories: Companies

Meet the uTesters: Mikko Salamaki

Mon, 11/10/2014 - 16:45

Mikko Salamaki hails from Finland and is a Gold-rated tester and Test Team Lead on paid projects at uTest. He currently works as a Testing Specialist in his day job, and has been involved in several roles in software testing over the past 15 years, including beginning as a test designer. Mikko has also been a uTester for over 3 1/2 years.

Be sure to also follow Mikko’s profile on uTest as well so you can stay up to date with his activity in the community!

uTest: What drew you into testing initially? What’s kept you at it?

Mikko: Well, it was a bit of an accident, really, as my studies would have been more suitable for network router configuration or such roles. However, I got a summer job at a small mobile phone developer as a tester, and when I graduated, got a permanent job there, and realized that it was actually something I really liked to do. I still love to help companies increase their software quality, which is why I tend to shift place every few years.

uTest: What’s your go-to gadget?

Mikko: I’d say the number one gadget is my Lumia phone which is always somewhere close by, but I usually add Samsung Galaxy Note 10″ to the mix for heavier stuff (like uTest TTL work) if I’m not at home.

uTest: What’s your favorite part of being in the uTest Community?

Mikko: The diversity of the people and projects. You never know what’s waiting for you the next day or even the next hour.

uTest: What keeps you busy outside testing?

Mikko: Family. I’ve got two kids, who have all kinds of hobbies that require transportation help (usually from me). In addition to that, I’m a bit of a sports fanatic. Especially when the sun is shining, I tend to hop on my bicycle and ride a few dozen kilometers around the countryside, or grab my skis and do a round or two on the cross-country tracks in winter.

uTest: QA professional or tester?

Mikko: I would say I’m a bit of both. I love helping teams and companies find better ways to increase the quality of their software, but I equally love finding new bugs when testing myself.

Categories: Companies

State of the (u)Testers: Software Testing Careers Survey

Fri, 11/07/2014 - 22:38

What got your peers into testing? What are their biggest pain points on their testing teams? Are they seeing any career advancement from their software testing certifications?

uTest has never conducted a study of its community members who have testing as their full-time careers…that is, until now. By launching our State of the uTesters software testing careers survey, we hope to give our testers a better picture of their peers’ testing careers — what motivates them, their testing aspirations and some of the biggest pain points in their organizations.

But the data will only be as good as the participation, so send in your responses today if you’re interested in what your testing peers have to say. We will also be selecting a couple of random participants from all entries to receive a uTest t-shirt. The survey should take just 10-15 minutes and will be open for submissions until Monday, November 24. We will publish the results of this study here on the uTest Blog in December.


Categories: Companies

uTest Platform Feature Announcement: Duplicate Bug Warning

Thu, 11/06/2014 - 22:36

In the spirit of continuous improvement, we here in uTest Community Management wanted to share with you another great feature brought to you by our fantastic development team.

Today, we rolled out a Duplicate Bug Warning feature. Now, when a tester on a paid project begins to file a bug, the platform automatically searches for keywords within the cycle’s existing Issue Reports and prompts the tester with a list of possible duplicates for review.


This is a huge step forward for those who use the tester platform. This feature will help prevent our testing community from filing duplicate bugs, which create noise for the customer and negatively impact a tester’s rating.
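The keyword matching behind a feature like this can be sketched in a few lines. The example below is purely illustrative, a guess at the general technique (keyword extraction plus overlap scoring), not uTest’s actual implementation; the stopword list, threshold, and Jaccard similarity are all assumptions for the sake of the sketch.

```python
# Illustrative sketch of keyword-based duplicate detection
# (a guess at the general technique, not uTest's actual code).
import re

# A tiny stopword list for the sketch; a real system would use a larger one.
STOPWORDS = {"the", "a", "an", "on", "in", "of", "to", "is", "when", "and"}

def keywords(title):
    """Lowercase the title, split on non-alphanumerics, drop stopwords."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return {w for w in words if w not in STOPWORDS}

def possible_duplicates(new_title, existing_titles, threshold=0.3):
    """Rank existing bug titles by keyword overlap (Jaccard similarity)."""
    new_kw = keywords(new_title)
    scored = []
    for title in existing_titles:
        kw = keywords(title)
        union = new_kw | kw
        score = len(new_kw & kw) / len(union) if union else 0.0
        if score >= threshold:
            scored.append((score, title))
    # Highest-scoring candidates first.
    return [t for _, t in sorted(scored, reverse=True)]

existing = [
    "App crashes when rotating device on login screen",
    "Typo in settings menu",
    "Login screen crash on device rotation",
]
print(possible_duplicates("Crash on login screen after rotating the device", existing))
```

Running this flags both crash reports as likely duplicates while ignoring the unrelated typo report, which is the kind of short candidate list a tester would be prompted to review before filing.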

If you like what you see, feel free to drop a note on the forums or in the comments section below. We regularly pass these words of encouragement along to our developers!

Categories: Companies

Top Tweets from STPCon Fall 2014

Thu, 11/06/2014 - 16:30

The Fall 2014 edition of STPCon (Software Test Professionals Conference) is winding down after three days of keynotes, sessions and workshops on topics including agile testing, performance testing, test automation, mobile application testing, and test team leadership and management.

Speakers this week have included Keith Klain, Mike Lyles, Mark Tomlinson, Carlene Wesemeyer, JeanAnn Harrison and Applause’s own Chris Munroe.

Don’t worry if you weren’t at the show or didn’t get a chance to stop by and see us at the Applause/uTest booth (while it saddens us, we won’t hold it against you…for too long, anyways). We’ve got you covered with some of the most inspiring, quotable, memorable — and just flat out funny — tweets from the show that were tagged with #STPcon:

What we don't pay attention to, we don't see! Practice #ExploratoryTesting and focus on the system not test steps @justjoehere #STPCon

— Alessandra Moreira (@testchick) November 5, 2014

"Take charge of your career and future by taking charge of your own learning". @testchick #stpcon

— Teri Charles (@booksrg8) November 5, 2014

"With dev & testing: Competition is out…collaboration is in" says @smita_qazone at #stpcon @SoftwareTestPro

— Mike Lyles (@mikelyles) November 5, 2014

I don't always make a 109 slide presentation. But when I do, it's for #STPCon

— Dave Haeffner (@TourDeDave) November 4, 2014

A few blogs to follow and read. @testchick #stpcon

— Teri Charles (@booksrg8) November 5, 2014

An inspiring mission and values is vital for people to get excited about living a culture of quality @Jason_Lauer #stpcon

— John Ruberto (@JohnRuberto) November 4, 2014

Keynote @KeithKlain "Testers, you are not your ideas" perspective when testers are talking to high level management. #STPCon

— Jean Ann Harrison (@JA_Harrison) November 5, 2014

"Establish credibility with people who matter" says @KeithKlain in #keynote at #stpcon @SoftwareTestPro #talkingToCIO

— Mike Lyles (@mikelyles) November 5, 2014

Trust @KeithKlain to always say it as it is! We need to learn how to talk about #testing. Great keynote at #stpcon!

— Alessandra Moreira (@testchick) November 5, 2014

"Our job isn't over when the product goes live it's when it's retired." ~ @bradjohnsonsv #STPCon #softwaretesting

— Jay Philips (@jayphilips) November 5, 2014

"Don't let the existing culture dictate your approach; disarm it!" ~ @Jason_Lauer #STPCon #leadership #softwaretesting

— Jay Philips (@jayphilips) November 4, 2014

The barrier to entry in testing is often low, so becoming a tester is easy. Being great at it not so much #STPCon #RaiseTheBar #testing

— Alessandra Moreira (@testchick) November 5, 2014

"I decided to stay in testing on purpose". @testchick #stpcon

— Teri Charles (@booksrg8) November 5, 2014

And rounding out the bunch, testing giant James Bach received a prestigious award on Wednesday. Congratulations, James!

Congratulations to @JamesMarcusBach awarded the 2014 Soft Test Luminary Award. Check it out @ the Reg desk. #STPCon

— SoftwareTestPro (@SoftwareTestPro) November 5, 2014

@SoftwareTestPro @JamesMarcusBach "Luminary" as an understatement: that's surely a first! Well deserved, James – congratulations. #STPCon

— Neil Studd (@neilstudd) November 5, 2014

Congratulations @jamesmarcusbach! #STPCon

— Teri Charles (@booksrg8) November 5, 2014

Categories: Companies

Authors in Testing Q&A With Agile Testing Champion Lisa Crispin

Wed, 11/05/2014 - 16:30

Lisa Crispin was voted the Most Influential Agile Testing Professional Person at Agile Testing Days 2012 by her peers, and enjoys working as a tester with an awesome agile team. She shares her experiences via writing, presenting, teaching and participating in agile testing communities around the world.

She is also the author of and a contributor to numerous software testing books, including her latest, More Agile Testing: Learning Journeys for the Whole Team, released in October and co-authored with Janet Gregory. You can learn more about Lisa’s work on her site, and follow her on Twitter @lisacrispin.

In this uTest interview, Lisa explains the reality of agile adoption and suggests ways teams can succeed with agile.

uTest: Where have companies or teams gone most wrong when rolling out agile in their organizations?

Lisa Crispin: Many organizations don’t understand that to succeed at software development, we have to focus on delivering the best possible quality, rather than focusing on speed. Too many think that “agile” means “fast.” You need a big investment in time and training so that teams learn important practices such as TDD, CI, specification by example/behavior-driven development, helping business stakeholders identify the most valuable features, and so on. Teams that don’t nurture a learning culture where failure is tolerated, experiments are supported, and the team has diversity, accumulate too much technical debt and fail.

uTest: From your experience, what’s your mix of either backgrounds or personalities that make up the ideal agile team?

LC: Any team needs diversity. It’s essential that each team member feels personally safe to ask questions, bring up issues, and suggest improvements. Each team member must learn to shift to an “agile” mindset and be willing to try small experiments, see what works, continually improve, and build quality into the product.

The downside of diversity can be communication gaps. We have to build a common language to communicate. Using executable tests to drive development helps with that. Sharing knowledge so that we each understand the others’ jobs also helps.

We need to build our T-shaped skills, or as Adam Knight puts it, square-shaped skills. For example, testers don’t need to be coders, but they should be technically aware, having a high-level understanding of the system architecture and some familiarity with the tools and frameworks in use. Everyone should feel safe to ask questions and ask for help.

uTest: Why is agile *the* answer for larger organizations stuck in their ways?

LC: I’d like for us to lose labels such as “agile.” Each organization should identify their biggest problem, brainstorm small experiments to make that problem better, try each experiment and grow incrementally from there. It’s a shame that “agile” has become, for many organizations, the latest in a long line of “miracle cure” methodologies.

I prefer Elisabeth Hendrickson’s agile acid test: A team is agile if it delivers business value frequently, at a sustainable pace. To have a sustainable pace, you have to embrace so many good values, principles and practices. Continuous delivery is the golden ring we’re all trying to grab these days. It takes a long time to achieve it – we have to be patient, try small experiments, sometimes take one step forward and two steps back. I know from experience that eventually we’ll get there. If it takes years, that’s OK – remember how many software delivery efforts fail.

uTest: You joined your first agile team in 2000. Could you talk a bit about the experience, particularly how it drove you to become an agile testing champion?

LC: On my last waterfall team, I was frustrated that despite how disciplined we were at following the waterfall process, we were always behind our competition. By the time we got a feature out the door, it was out of date.

During my first iteration as a tester on an agile team, I was shocked to discover that if two users logged into the product we were developing, the server crashed. “Unacceptable!” I cried. Our manager/coach had to explain to me that the customer didn’t care whether more than one user could use the product — they were a startup who needed features to demonstrate to potential investors. This was a giant mindset shift for me — the customer defines external quality, not the testers.

I had the opportunity to work for a small financial services company from 2003 to 2012. The co-founders knew they needed successful software delivery for their business model to work, but knew nothing about software. After researching who in the area could save them, they persuaded Mike Cohn to come on board full time, and luckily he brought me on board. I got to experience the magic that happens when business executives understand the value of quality, and a cross-functional team works together for quite a few years. Learning to deliver business value frequently at a sustainable pace (to paraphrase Elisabeth Hendrickson) was an eye-opening experience.

uTest: Your new book is a continuation of the first – ‘Agile Testing,’ which came out five years ago. How does this book differ and how is it a natural continuation of the first story being told?

LC: The “basics” we used in our first book, such as the agile testing quadrants and test automation pyramid, still prove useful today. But we’ve benefited from more disciplines joining the agile effort. For example, business analysis experts such as Ellen Gottesdiener and Mary Gorman have shown us more ways to help stakeholders specify how features should work. Longtime agile testing practitioners such as Gojko Adzic have come up with techniques such as impact mapping that achieve similar goals.

And there are new technological problems to solve, such as testing software embedded in so many things we use every day, from automobiles to exercise equipment. We are testing in many contexts: Globally distributed teams, enterprise organizations, business intelligence. We have to adapt and grow our models to accommodate all the new challenges.

uTest: The book is filled with stories – stories from real agile teams and firsthand experiences. What was one of the stories that surprised you the most, or was most inspirational?

LC: Wow, with around 70 sidebars from more than 40 contributors, it is hard to single any out. One that I thought had a unique perspective was Parimala Hariprasad’s story about leading an offshore test team working with agile development teams.

She has found a lot of simple ways to make the remote team members feel more connected with each other. She has a wonderful sense of humor, and her team clearly has a lot of fun, which in turn helps them collaborate better. For example, they share funny pictures and videos, creating a “butterfly effect” among the people in different geographical locations. It’s good to get ideas on how to make a distributed team “jell,” since so many of us work in organizations with individuals or teams who are remote.

Categories: Companies