
uTest

Can You Hack Into Google Chrome? It Could Net You an ‘Infinity Million’

Fri, 02/27/2015 - 20:55

Google once again is holding its annual hackathon for participants to search for holes and major flaws in its Chrome OS. Last year, the bounty was $2.71828 million in prizes.

However, this year, they’ve totally upped the ante — to the infinite degree. In fact, according to Entrepreneur, “Google has changed the nature of the prize money at stake…It now goes all the way up to $∞ million.”

Prizes in the hackathon range from $500 up to a new high of $50,000, and there’s no limit on the reward pool, but that could always be scrapped at the drop of a hat. Google says that the changes “are meant to lower the barrier of entry, and remove the incentive for hackers to sit on discovered bugs until the annual competition.”

This certainly sweetens the pot for hackers everywhere, although I could totally see that blank check of an “infinity million” being a very temporary experiment when competition gets out of hand (and Google’s bank accounts…low).

What would you do with an infinity million?

Not a uTester yet? Sign up today to comment on all of our blogs, and gain access to free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

uTest Takes: Best Software Testing Blogs From the Week of Feb. 23

Fri, 02/27/2015 - 15:30

From time to time, the uTest Blog highlights some of the recent blog entries that uTesters have crafted on their own personal blogs, along with some standouts from the outside testing world.

Here are some such notables from the week of Feb. 23, 2015:

Blogs This Week from uTesters & uTest Contributors
  • Aspects of a Good Session: Any testers out there presenting at an upcoming conference, or hoping to down the line? uTester Stephan Kämper penned this list of what he values in a “good” session, from humor and pain points to not overdoing it on the slides and telling good stories.
  • A Tester’s Portfolio: uTest contributor Stephen Janaway’s latest post from his own blog takes on the fact that while devs may have a robust portfolio, testers usually don’t — after all, they don’t have a final creation to show for their efforts. What does that mean? They have to create this portfolio themselves through arenas like blogging, sharing presentations online and speaking at testing conferences.
  • Less Eeyore, More Spock: I didn’t grow up a Star Trek fan (I was always a Star Wars guy, myself), but I do know Spock, and frequent contributor Michael Larsen’s view on why testers should aspire to be Spock is thought-provoking. Live long and prosper!
Others That Caught Our Eye
  • Letter to a Starting Tester: This recent post from Joel Montvelisky of PractiTest (whom uTest partnered with for the State of Testing survey) is in the form of a letter, writing back to the ‘1998 Joel’ just starting out. It’s a very cool read, and especially hammers home advice a lot of context-driven testers would be proud of — seeking out fellow testers within your own organization and always questioning/standing your ground. For the tester just starting out — read Joel’s advice!
  • These Chicks Were O.G. (Original Geeks): Why do men get all the love in programming? Statistically there may be more males in the industry, but that downplays the major contributions women have made to programming and testing. A nice post from the Testy Engineer pays homage to Ada Lovelace and Grace Hopper — two female pioneers in the field.

Have ideas or blogs of your own that you haven’t yet shared with the world? Become a contributor to the uTest Blog today.



uTest Platform Updates Focus on Bug Reports

Thu, 02/26/2015 - 18:47
[Testers] require the effective integration of technologies to simplify their workflow and boost efficiency.

- Anne M. Mulcahy

uTesters on paid projects: With this week’s release, we’re happy to announce new uTest Platform functionality that enhances the bug reporting experience.

Save Your Bug Reports

It’s now easier than ever to create and save your bug report templates. You may remember that in the previous release, we added a field that let you configure a custom bug report template. We’ve simplified the process: a “Save as Template” button in the lower right-hand corner of the bug report form now saves the bug report you’ve entered as your template.


We hope this will let you create bug report templates faster and more efficiently.

Custom Bug Report Fields

Customers will often require that testers provide specific details in their bug reports. Until now, testers had to refer back to the scope of the cycle to remember which information to include. Going forward, a customer or PM can add the following template fields directly to the bug report form:

  • Device Make and Model (for apps that run on phones, tablets, set-top boxes and game consoles)
  • Browsers with versions (some customers need the exact browser build version)
  • OS (Service pack version)
  • URL where issue occurs
  • Does issue occur on production (for staging cycles)
  • Login details
  • Does issue occur for multiple login providers
  • Number of times reproduced
  • Other pages with the same issue
  • Reproducibility (x of y times)

Additionally, customers and project managers will have the ability to create other custom inputs to ensure flexibility across all cycle types.


We hope that this will help streamline the bug reporting process for our testers, resulting in higher-quality reports for our customers.

If you like what you see, feel free to drop a note in the Forums to share your ideas on these and other recent platform updates!


Q&A: ‘Let’s Test’ Leader Talks Global Reach of Context-Driven Testing, Previews Conference

Wed, 02/25/2015 - 15:30

Johan Jonasson is one of the organizers of the Let’s Test conferences, which celebrate the context-driven school of thought. In addition to co-founding the testing consulting firm House of Test, Johan is a contributing delegate at the SWET peer conferences and has spoken at several national and international software testing conferences. He is also an active member of the Association for Software Testing (AST). Follow him on Twitter @johanjonasson.

Let’s Test 2015 is slated for May 25-27, 2015, in Stockholm, Sweden, and uTest has secured an exclusive 10% discount off new registrations. Email testers@utest.com for this special discount code, available only to registered uTest members.

In this interview, we talk with Johan on the global, inclusive context-driven testing community, and get a sense of what testers can expect at the 2015 edition of Let’s Test.

uTest: You have a lot of crossover with the CAST conference in the US — both are context-driven testing conferences featuring content by testers, for testers, and many sessions are driven by folks who were at CAST. What does it mean to you to have a fervent following that travels the world for these shows?

Johan Jonasson: The fact that many speakers and attendees are willing to travel lengthy distances to both events is, I think, a great testament to the fact that context-driven testing is something that excites a lot of people, and that there is a truly global and inclusive community eager to meet and share experiences. Last year, we had attendees from literally all parts of the world come to Let’s Test.

At the same time, there is a fairly large number of local testing experts on this year’s program too, which I think shows how much context-driven testing has grown in Europe since the first conference in 2012. The context-driven approach to testing is definitely gaining ground, even in some European markets that are traditionally considered ‘factory testing’ school strongholds, which is great.

uTest: Has there been a common theme in terms of tester pain points coming out of the conferences from the last couple of years?

JJ: There seems to be two main challenges that keep coming up. One is how to package and communicate the qualitative information that our testing reveals in a way that can be readily understood by our stakeholders, who might be used to making release decisions based on traditional and flawed quantitative metrics like bug counts and pass/fail ratios. In other words, how can you perform good test reporting that stands up to scrutiny?

The other one is how to convince managers and companies buying testing services to move away from wasteful and dehumanizing testing practices sold by the lowest bidder, and adopt approaches that focus on the value of information, and the demonstrable skill of the professionals delivering that value.

uTest: Speaking of pain points, ISO 29119, and its attempt to standardize testing, has been a pain for many in the context-driven community. It’s also the subject of one of the sessions this year at Let’s Test. What are your own views on 29119?

JJ: I actively oppose the work being done on ISO 29119. I think it is flawed thinking in the first place to even try to standardize adaptive, intellectual and creative work like testing.

Assume, for the sake of argument, that it would be a good idea. In order to standardize testing, we would have to agree on at least the fundamental aspects of testing, and for the longest time, the global testing community has never been in agreement about those. Don’t get me wrong, I think that’s a good thing. Consensus is highly overrated. Argument and disagreement are crucial if we are to move forward — which is another reason not to try to create a one-size-fits-all standard in a field that is still highly innovative and developing.

Those are just a couple of the issues I have with ISO 29119, and that’s before we even start talking about the archaic and long-since-discredited models of testing that this standard has presented so far, or the motivations behind the standard.

uTest: Which were some of the most impactful or memorable sessions for you personally from the 2014 edition of Let’s Test?

JJ: I very much appreciate all the experience-based talks as well as the inspirational or innovation-focused ones, and it wouldn’t be Let’s Test without them, but my favorite sessions are the experiential workshops and the learning that comes from doing and experiencing situations firsthand in those simulated environments.

Because of that, my favorite session last year was Steve Smith’s experiential keynote, in which the entire conference participated. It was a 150+ person simulation that ended with fascinating presentations from the participants and observations from Steve. I don’t think either Steve or we organizers really knew whether a simulation that big would work before we tried it, but we’re never content with just doing what we did the year before. We try to constantly raise the bar for both the content and format of Let’s Test.

uTest: You have had Let’s Test in Europe for several years now, and piloted Let’s Test Oz in Australia last year. Are there other areas where you want to bring context-driven testing or see it emerge more?

JJ: Absolutely! We’ve done smaller Let’s Test events (called Tasting Let’s Test) in both the Netherlands and South Africa during 2014, and we’re planning another Let’s Test Oz, and trying to find a good date that would fit well with other things going on in the near future.

The next big thing though is an upcoming three-day Let’s Test event in South Africa in November 2015 that we’ll be releasing more information on in the coming weeks at our site and on Twitter @Letstest_conf.

uTest: Could you give us a preview of what may be different at Let’s Test this year?

JJ: We’ve really tried to turn up the number of workshops for Let’s Test 2015. Like I said previously, there’s always a need for great talks and experiences at Let’s Test. However, by aligning with the residential, almost retreat-like format of the Let’s Test conference, we felt that what really gets people talking is hands-on sessions where we spend more than an hour on a certain topic.

So for Let’s Test 2015, we have an unprecedented 26 workshops and tutorials of different sizes lined up during the three days of the conference. Several of these are three-plus hours in length to make sure there’s enough to listen to, experience, debrief and discuss for everyone participating.



ISO 29119: Why is the Debate One-Sided?

Mon, 02/23/2015 - 17:23

In August, the Stop 29119 campaign and petition kicked off at CAST 2014 in New York. In September, I wrote on the uTest Blog about why the new ISO/IEC/IEEE 29119 software testing standards are dangerous to the software testing community and good testing.

I was amazed at the commotion ‘Stop 29119’ caused. It was the biggest talking point in testing in 2014. Over six months have passed, and it’s time to look back. What has actually happened?

The remarkable answer is: very little. The Stop 29119 campaigners haven’t given up — there has been a steady stream of blogs and articles. However, there has been no real debate; the discussion has been almost entirely one-sided.

There has been only one response from ISO. In September, Dr. Stuart Reid, the convener of the working group that produced the standard, issued a statement attempting to rebut the arguments of Stop 29119. That was it. ISO then retreated into its bunker and ignored invitations to debate.

Dr. Reid’s response was interesting, both in its content and the way it engaged with the arguments of Stop 29119. The Stop 29119 petition was initiated by the board of the International Society for Software Testing. ISST’s website had a link to the petition, and a long list of blogs and articles from highly credible testing experts criticizing ISO 29119. It is a basic rule of debate that one always tackles an opponent’s strongest points. However, Dr. Reid ignored these authoritative arguments and responded to a series of points that he quoted from the comments on the petition site.

To be more accurate, Dr. Reid paraphrased a selection of the comments and criticisms from elsewhere, framing them in a way that made it easier to refute them. Some of these points were no more than strawmen.

Cem Kaner, for example, argued that IEEE adopts a “software engineering standards process that I see as a closed vehicle that serves the interests of a relatively small portion of the software engineering community… The imposition of a standard that imposes practices and views on a community that would not otherwise agree to them, is a political power play.”

Dr. Reid presented such arguments as “no one outside the Working Group is allowed to participate” and “the standards ‘movement’ is politicized and driven by big business to the exclusion of others.”

These arguments were then dismissed by stating that anyone can join the Working Group, which consists of people from all parts of the industry. Dr. Reid also emphasized that “consensus” applies only to those within the ISO process, failing to address the criticism that this excludes those who believe, with compelling evidence, that ISO-style standardization is inappropriate for testing.

These criticisms had been made forcefully for many years, in articles and at conferences, yet Dr. Reid blithely presented the strawman that “no one knew about the standards and the Working Group worked in isolation.” He then effortlessly demolished the argument that nobody was making.

What about the content? There were concerns about how ISO 29119 deals with Agile and Exploratory Testing. For example, Rikard Edgren offered a critique arguing that the standards tried but failed to deal with Agile. Similarly, Huib Schoots argued that a close reading of the standards revealed that the writers didn’t understand exploratory testing at all.

These are serious arguments that defenders of the standard must deal with if they are to appear credible. What was the ISO response?

Dr. Reid reduced such concerns to bland and inaccurate statements — “the standards represent an old-fashioned view and do not address testing on agile projects” and “the testing standards do not allow exploratory testing to be used.” Again, these were strawmen that he could dismiss easily.

I could go on to highlight in detail other flaws in the ISO response — the failure to address the criticism that the standards weren’t based on research or experience that demonstrates the validity of that approach, the failure to answer the concern that the standards will lead to compulsion by the back door, the failure to address the charge from the founders of Context-Driven Testing that the standards are the antithesis of CDT, and the evasion of the documented links between certification and standards.

In the case of research, Dr. Reid told us of the distinctly underwhelming claims from a Finnish PhD thesis that the standards represent “a feasible process model for a practical organization with some limitations.” These limitations are pretty serious — “too detailed” and “the standard model is top-heavy.” It’s interesting to note that the PhD study was produced before ISO 29119 part 3 was issued; the study does not mention part 3 in the references. The study can therefore offer no support for the heavyweight documentation approach that ISO 29119 embodies.

So instead of standards based on credible research, we see a search for any research offering even lukewarm support for standards that have already been developed. That is not the way to advance knowledge and practice.

These are all huge concerns, and the software testing community has received no satisfactory answers. As I said, we should always confront our opponents’ strongest arguments in a debate. In this case, I’ve run through the only arguments that ISO have presented. Is it any wonder that the ‘Stop 29119’ campaigners don’t believe we have been given any credible answers at all?

What will ISO do? Does it wish to avoid public discussion in the hope that the ISO brand and the magic word “standards” will help them embed the standards in the profession? That might have worked in the past. Now, in the era of social media and blogging, there is no hiding place. Anyone searching for information about ISO 29119 will have no difficulty finding persuasive arguments against it. They will not find equally strong arguments in favor of the standards. That seems to be ISO’s choice. I wonder why.

James Christie has 30 years’ experience in IT, covering testing, development, IT auditing, information security management and project management. He is now a self-employed testing consultant based in Scotland. You can learn more about James and his work at his blog, and follow him on Twitter @james_christie.



What’s In Your Work-From-Home Toolkit as a Tester?

Fri, 02/20/2015 - 16:30

Many of our testers around the world at uTest work from the comforts of their own homes when testing away on the latest mobile apps.

Check that — if uTesters have decided to take on paid projects with our customers from around the globe, they are working from home, as our community of 150,000+ is always testing in the wild where they live, work and play. Additionally, there’s a large contingent that, while they may not be testing with uTest specifically, work on remote teams in their day jobs.

I myself enjoy the comfort and flexibility that uTest and Applause have graciously afforded me by letting me work from home from time to time, but I’m also a realist. I know it can be a distraction, and I need some tools in my digital toolkit to keep me productive.

There was a great piece recently in the Harvard Business Review (HBR) covering things to buy, download or do when working remotely, and some of them can no doubt be helpful to the testing world:

  • Instant Messaging: According to HBR, ‘use it to ask someone a quick question, or even for a little bit of lightweight socializing that can cut down on the isolation of remote work.’ At Applause, I use Skype, which, in addition to being very convenient because it’s so universal, is cost-effective if you’re on a budget — a Skype call is free to someone halfway around the world.
  • A Lightweight Laptop: Not an issue if I’m at home, but if I need to escape the distractions of a TV and a barking dog in La Casa and move to Starbucks, this is a lifesaver.
  • Chunking Your Day: By this, HBR is referring to blocking off portions of your day “that focus your attention on what kinds of work you want to do when.” For example, if the other parent and the kids are home while you’re snowed in, you could come to an agreement that for your conference call and heads-down work, it’s totally “closed-door” time for that three-hour period. And it doesn’t just have to be in this type of scenario — each morning from 8-9 could be ‘email check-in,’ while afternoons are reserved for long-term projects.

These are just a few suggestions, obviously. Throwing it out to testers on projects at uTest, and those who work on remote teams in their day jobs, what are some of the tools and best practices that make you most productive while at home? Let us know in the comments below.



This Week in Software Testing Fails: You’re In! (Or Not)

Thu, 02/19/2015 - 21:36

You’re a senior in high school and you eagerly open up the mailbox in anticipation of that college admissions letter. The day finally comes, and you tear apart the huge envelope in hopes of the good news to come.

It does. You have been accepted to your dream school (says the admissions letter, anyways)!

Except when that letter is, in reality, a soul-crusher of dreams — as was the case for 800 prospective students applying to Carnegie Mellon’s top-ranked master’s in computer science program. According to the Washington Post, a glitch in the school’s computer systems allowed an email like this one to go out, mistakenly informing these students that they had been accepted.

Like in the film Office Space, however, school officials soon “fixed the glitch,” and quickly followed with a second (far less congratulatory) email:

“Earlier this morning, we mistakenly sent you an offer of admission to Carnegie Mellon’s MS in CS program. This was an error on our part. While we certainly appreciate your interest in our program, we regret that we are unable to offer you admission this year.”

Ouch. After that rollercoaster of painful emotions — one student had even gone out for a celebratory dinner with his family — could these kids at least have a courtesy admission, please?

I’d say some testing of the admissions email procedures and programs may be a welcome (and warranted) solution for the next admissions season.



Registration is Now Open for uTest’s ‘Introduction to Android Testing’ Webinar

Thu, 02/19/2015 - 18:29

uTest University is happy to announce that registration is now open for our next live webinar opportunity.

Hot on the heels of the recent Introduction to Security Testing and Build the “right” regression suite using Behavior-Driven Testing (BDT) webinars, uTest University is offering a chance for testers to get familiar with Android testing. The webinar is taught by Iwona Pekala, a Gold-rated uTester and frequent contributor to the uTest Forums.

In this webinar, participants will learn how to:

  • Prepare your mobile device and PC for testing
  • Install applications
  • Record videos and take screenshots
  • Collect logs
  • Get information about the types of crashes
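For testers who want a head start on these topics before the webinar, most of them are driven through Android’s `adb` command-line tool. As a rough sketch (not taken from the webinar’s material — the device serial and file paths here are made-up placeholders), the tasks above map to `adb` invocations like so:

```python
# Illustrative sketch of driving the Android Debug Bridge (adb) from Python.
# Assumes adb is installed and a device/emulator is connected; the serial
# "emulator-5554" and the file names are hypothetical.
import subprocess

def adb_cmd(serial, *args):
    """Build the argv list for an adb command targeting one device."""
    return ["adb", "-s", serial] + list(args)

def run(serial, *args):
    """Run an adb command and return its captured stdout."""
    result = subprocess.run(adb_cmd(serial, *args),
                            capture_output=True, text=True, check=True)
    return result.stdout

# Typical calls for the webinar's topics (not executed here):
#   run("emulator-5554", "install", "-r", "app-under-test.apk")      # install app
#   run("emulator-5554", "shell", "screencap", "-p", "/sdcard/bug.png")  # screenshot
#   run("emulator-5554", "shell", "screenrecord", "/sdcard/bug.mp4")     # record video
#   run("emulator-5554", "logcat", "-d", "*:E")   # dump error-level logs (crashes)
```

The `-s` flag matters as soon as more than one device or emulator is attached; building the argv list in one place keeps every command pinned to the device under test.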

Webinar Details

  • What: A live webinar presented by Iwona Pekala called “Introduction to Android Testing”
  • When: Tuesday, March 3, 2015 from 1:00 p.m. to 2:00 p.m. EST
  • How: Register now. Seats are limited!

About Iwona 

Iwona holds a Master’s degree in applied computer science and has been a professional software tester since 2007. Previously, she worked at IBM as a tester and developer. A uTester since 2011, she has more than three years of experience as an Android tester and two years of experience as an Android developer.

Iwona is also a course author in uTest University. See her course called How to Set Up and Use Mobizen Screen Recording.

About uTest University

uTest University is free to all members of the uTest Community. We are constantly adding to our course catalog to keep you educated on the latest topics and trends. If you are an expert in UX, load & performance, security, or mobile testing, you can share your expertise with the community by authoring a University course. Contact the team at university@utest.com for more information. Not a uTester yet? Sign up today!


uTest Announces the 2014 uTester of the Year Awards

Wed, 02/18/2015 - 17:19

You may recall that 2014 was a landmark year for our 6 1/2-year-old community. The uTest brand lived on in a big way: We transformed our community into the professional network for testers, an open community that promotes and advances the testing profession and the people who do this vital work.

And no doubt, it was these same people, our uTesters, who were the heart and soul of this transformation through their continued hard work on paid projects and their contributions of valuable learning content for our testers. So it makes the uTest Community Management team especially proud to announce the brightest stars of the bunch as our 6th Annual uTesters of the Year.

The uTester of the Year Awards are the highest distinction a uTester can receive. They celebrate community members who not only have gone above and beyond the call of duty on paid projects at uTest, but have contributed valuable and impactful content for our testers in the uTest Forums and uTest University, and on the uTest Blog.

This year’s winners were once again chosen by our Community and Project Management teams, who have the privilege of working closely with top-notch testers from around the world. However, we also looped Test Team Leads (TTLs) into the voting process for the first time, given just how in tune they are with the pulse of our community. It’s no easy work boiling down 150,000+ testers (not literally!) into one small group, but we believe that after much sifting through votes and data, we’ve chosen an extremely dedicated and talented bunch for 2014.

Now, let’s get right to it. Meet 2014’s uTesters of the Year.

The top honors for this year’s awards go to… Sheryl Reed from the United States!

Sheryl is this year’s Most Valuable Tester (MVT), and just last week celebrated three years with uTest. Like Romulo in last year’s awards, this is also Sheryl’s first appearance in the uTester of the Year Awards, which makes this all the more impressive.

Sheryl is a prolific Gold-rated tester on paid projects here at uTest — she has filed over 5,300 bugs across more than 1,800 test cycles, countless of them listed as ‘exceptional’ — and has been named a ‘favorite tester’ by 16 customers. She is no stranger to testing in her day job, either: she has spent over 30 years testing and programming everything from mainframes to mobile apps, including a run of over 20 years at IBM.

Here’s what Sheryl had to say on receiving MVT honors for 2014:

It is an honor to be selected 2014 Most Valued Tester receiving recognition for my contributions. For the past three years, it has been my pleasure working for an organization where software test engineers are held in high regard.

When I stumbled upon uTest on a friend’s LinkedIn profile, I never imagined I would find a software testing job offering me the freedom to choose projects and job offers to test interesting apps, and education to grow my testing skills.

I am grateful for the vote of confidence placed in me by the Community and Project Managers along with the Test Team Leads. Looking forward to a great 2015 at uTest, working in-the-wild with amazing testers from all over the world.

Including Sheryl, here is the list of the 2014 uTesters of the Year:

In addition to the special category honors, we’re proud to recognize the following testers as Top Testers of 2014:

Please join me in congratulating all of our 2014 uTesters of the Year! Be sure to stop by the Forums and congratulate them there as well. This group once again showcases the outstanding testing talent that shines in our community, and these testers’ dedication not only to quality on paid testing projects, but to producing outstanding content at uTest.

In addition to having their names permanently etched in our uTest Hall of Fame, the winners will be taking home a fancy swag pack that includes a custom t-shirt with the new-for-2014 design you see above.

uTesters are already making their mark and paving their way to 2015 uTester of the Year nominations with their impactful work as we move deeper into the year.

Until 2015…


uTest Offers Discount for Test Management-Focused SQTM Conference

Tue, 02/17/2015 - 17:13

The International Conference on Software Quality and Test Management (SQTM) is the only conference that focuses on advancing the test management and quality management professions by providing practical methods based on best practices. The conference relies on experts and practitioners to bring success stories and case studies. The 2015 SQTM will be held May 31-June 5, 2015, in Washington, D.C.

uTest spoke with conference chair and CEO of the International Institute for Software Testing (IIST) Magdy Hanna, Ph.D., for a preview of what the 2015 edition of the show will bring. Stay tuned at the end of the interview to see how uTesters can receive an exclusive discount on new registrations to SQTM.

uTest: Could you give us a little preview of what testers can expect at this year’s conference?

Magdy Hanna: At SQTM 2015, attendees will have the opportunity to select from 26 full-day tutorial workshops. These tutorials provide in-depth coverage in many areas of quality and test management. This year, we also added a number of tutorials on soft skills. Also, for the first time, attendees will be able to achieve education-based certifications such as the Certified Agile Software Test Professional certification.

uTest: There are a lot of great testing conferences out there, from EUROSTAR to STPCon. If I’m a software tester looking to attend a show in 2015, what differentiates SQTM from some of these other conferences in the testing circuit?

MH: SQTM is actually the only conference that focuses mainly on management issues. All sessions are mainly for test and QA managers, project managers, and directors of corporate quality. Also, SQTM only accepts presentations on methods that have been tried and proven to produce good results.

uTest: What were some of the most impactful sessions – or stories to come out of the show – for you personally at last year’s SQTM?

MH: As a conference chair, I really did not have a chance to sit in any of the sessions for too long. However, I did speak at length with conference attendees throughout the conference to get their feedback and read through all feedback forms. Many of the attendees spoke very highly of the presentation on Identifying Software Quality Best Practices by David Herron. Almost all presentations on mobile testing received very high reviews.

uTest: From your years in experience leading the conference and networking with testers at SQTM, is there a common thread from your discussions that is the biggest pain point facing testers?

MH: Almost every test manager I spoke with is faced with challenges including late involvement in the project, inadequate resources, and lack of support from management.

As a special offer to our testing community, uTest has arranged for a 20% discount for new registrations to SQTM. Contact us at testers@utest.com to receive an exclusive promo code. NOTE: This promotion is only valid for registered uTest community members.

Not a uTester yet? Sign up today for free today to gain access to exclusive conference discounts like this one, and free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

Apple Splits Up Beta Testers With ‘Groups’ in TestFlight

Fri, 02/13/2015 - 17:46

Apple announced that it has rolled out the ‘Groups’ feature to TestFlight, its app beta testing tool.

According to Apple, developers will be able to “organize testers into groups to quickly send builds, provide separate instructions on where to focus, and apply an action to several testers at once in TestFlight.” In short, if a group of testers is focused more on, let’s say, minor GUI issues, while another group is focused on deeper-rooted problems, devs can split the beta testers and communicate totally different instructions.

This news comes on the heels of Apple closing its legacy TestFlightApp.com app beta testing site for good on February 26.

Not a uTester yet? Sign up today to comment on all of our blogs, and gain access to free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

Test Your Knowledge With uTest University Quizzes

Wed, 02/11/2015 - 17:57

Today, uTest University launched the beta of the “Test Your Knowledge” feature on select courses. You can now take an optional quiz at the end of a course to see if you understood the information presented. Your score can be emailed to you, and you can choose to share your quiz results on social media so that other uTesters can see how you did!

The first two courses in the beta quiz program are Accessibility Testing 101 Tutorial: Finding Color Contrast Ratios and What is White Hat vs. Black Hat Security Testing?

In the Accessibility Testing 101 Tutorial, expert Helen Burge explains the concept of color contrast ratios, why they’re important, and how to determine the color contrast ratio on a web page. At the conclusion of the short video, a five-question quiz checks whether you learned the information.
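For readers who want to try the math themselves before taking the course, here is a minimal sketch of the WCAG 2.0 contrast-ratio calculation the tutorial covers (the function names are our own, not part of the course):

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance for an sRGB color given as (r, g, b) in 0-255."""
    def channel(c):
        c = c / 255.0
        # Linearize the sRGB channel value per the WCAG 2.0 formula.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; ranges from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background yields the maximum ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # 21.0
```

WCAG 2.0 AA requires at least 4.5:1 for normal text, which is the kind of threshold a tool like this can check automatically.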

In What is White Hat vs. Black Hat Security Testing?, course author Pramod Lumb explains the differences between the two types of security testing. At the conclusion of the course, a three-question quiz is available for you to gauge whether you understood the course.

We’ll be adding more quizzes to the 150+ courses available in uTest University, your destination for free software testing training.

About uTest University

uTest University is free for all members of the uTest Community. We are constantly adding to our course catalog to keep you educated on the latest topics and trends. If you are an expert in UX, load & performance, security or mobile testing, you can share your expertise with the community by authoring a course. Contact the team at university@utest.com for more information. Not a uTester yet? Sign up!

Categories: Companies

Community Choice Announced Ahead of the 2014 uTester of the Year Awards

Wed, 02/11/2015 - 15:00

As a lead-in to next week’s prestigious 6th Annual uTester of the Year Awards, we are happy to announce the winner of the only uTester-sourced category of these awards, Community Choice.

This category was launched in conjunction with the 2013 edition of the ceremonies as a way to get community members involved in the voting mix, and this year was no different. Voting concluded just last week in this very special category, and we are proud to announce the community’s choice for this category of uTester of the Year:

Prasad Patoju

Prasad hails from India and is a 4 1/2-year uTest veteran and Gold tester on paid testing projects in the community. Like his counterparts who will be crowned 2014 uTesters of the Year next week, Prasad will be taking home all of the fame, along with the bounty, including a new-for-2014, custom-designed t-shirt. Congratulations, Prasad!

Stay tuned next Wednesday, February 18, here on the uTest Blog for the announcement of all 2014 uTester of the Year Award winners! In advance of the awards, stop by the Forums and congratulate Community Choice winner Prasad yourself.

Categories: Companies

Five Things to Consider When a Tester’s Job May Be Outsourced

Tue, 02/10/2015 - 16:30

The reality these days is that some testing jobs are vanishing. Development jobs, too, because of job outsourcing and other reasons.

This is an uncomfortable situation for the full-time tester who has done great testing for the same company for several years and expects this to continue. What can the tester do when his or her testing job vanishes? It is impossible to convince an employer not to outsource. What else can one do?

It’s been a few years since the tester had his last job interview. The job market has changed — there are many new technologies, new hardware, new methodologies, and new companies. What is the secret sauce that this tester can use to get by?

There is no secret sauce. But there are different ingredients that the tester can play with.

Stop being preoccupied with your employer. Focus instead on what you can do.

Since you cannot change your employer’s decision, that’s a bad path to go down. You can change yourself, however, to be better adapted to the present times and be ready for future jobs.

Learn how to navigate the job market.

Should the tester float in the water of the job market or swim? Should he stay where he is and be “safe” or take risks? Floating will take this tester wherever the currents go, while swimming will allow this individual to steer in any direction wanted. This direction can consist of so many different things:

  • Working on a different type of testing
  • Working for any company that the tester wants to
  • Diversifying knowledge
  • Trying out new things
Consider what you value more out of an employer.

Is it better to work for one employer for a longer period or work for multiple employers for shorter times? It’s up to the tester to decide which option is more beneficial.

Staying with the same employer for a longer time will keep you working on the same projects, with the same technologies and people. However, it can make you depend too much on one company when you should instead rely on yourself, for example by keeping your skills fresh.

Is it better then for a tester to change employers more often, and have new challenges working with new people, projects and products?

Consider what makes your services more valuable…and you less replaceable.

How can a tester become so good that he cannot be ignored? To become this good, it will require patience and time. But starting now will get the tester closer to the solution of the outsourcing problem. Not starting now, or at all, will just keep the individual treading water.

Becoming good starts with finding a learning strategy that suits the tester. One possible strategy is to take on challenges just above the current skill level. Then, start learning about things related to what the tester already knows. While learning, if something catches the tester’s attention and raises motivation and energy, that direction should be investigated further.

Consider a few directions before deciding on one.

Some testers want to do mobile testing next. Others are interested in load testing and test automation. Still, others are interested in testing web services or database systems.

Should the tester decide the new direction from the beginning without knowing much about it? Or go in a few different directions and then decide the one that’s best? Maybe it is a good idea to diversify first, expand skills and knowledge, and try new things before making a decision about the next career stop.

Remember, there is no secret sauce. If jobs are being outsourced, start working with — and on — yourself.

Alex Siminiuc is a Gold Tester at uTest and lives in Vancouver, Canada. He writes occasionally and teaches test automation with Java and WebDriver on his blog. His newest course is about Learning Load Testing with JMeter, which you can currently take for a 20% discount.

————————-
Not a uTester yet? Sign up today to comment on all of our blogs, and gain access to free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

Strategies to Use and Pitfalls to Avoid When Evaluating Software Tester Performance

Mon, 02/09/2015 - 17:50

Note: The following is a guest submission to the uTest Blog from Sanjay Zalavadia of Zephyr.

As more businesses move to an agile model of software development, they will need an effective method of evaluating their testers’ performance.

The process of ensuring that critical software runs properly is often arduous. There are numerous considerations to take into account that could affect the program’s performance, such as what operating system it runs on and if it has to interact with other applications. Despite the challenges posed by comprehensive software testing, it is a critical aspect of the development process, and neglecting it could lead to disastrous results for an organization.

A series of bug-ridden software releases could considerably hurt a company’s standing within its industry as more consumers and business partners begin to associate it with low quality assurance (QA) standards.

Furthermore, the amount of time and resources needed to go back and make changes to software after it has already gone to market can significantly cut into an organization’s bottom line. This is why so many companies put a great deal of focus on optimizing their software test management efforts.

The need for a robust test management tool increases as more businesses move from a traditional linear form of software development to an agile model where testing is an ongoing process continuing throughout the various stages of programming, QA and production. There are many benefits to introducing agility into a software production effort, including the ability to better address new client demands and shifting market trends.

Fostering an effective testing process by evaluating testers

In order to ensure that these ongoing efforts are being carried out at the highest level, accurate testing is needed to check on the development progress and to eliminate as many bugs or errors as possible before they show up in a finalized version.

This can be especially difficult when QA duties are outsourced to a team located on the other side of the globe. Test teams can easily become burdened looking through the code changes made or the bugs reported by the other unit at the outset of every working day, sapping precious time from the development process.

A real-time test management strategy allows those changes to be made across the network instantaneously, meaning teams on both sides of the world can be sure that they are working with the most up-to-date version or bug report available.

IT department leaders also need to be sure their QA teams are carrying out their duties effectively, and efficiently identifying software flaws when they arise. That is why it is a good practice to evaluate software testers on a regular basis. If a member of the QA team is creating incomplete or inaccurate bug reports, or failing to deliver them in a timely fashion, the entire production process can be delayed significantly. Creating a comprehensive evaluation process utilizing various software testing metrics can help organizations accurately identify poorly performing testers.

Researchers from the Institute for Computer Science at Germany’s University of Heidelberg outlined the components of an effective evaluation process. The review should be tailored to the specific parameters of a given project. Any constraints or uncontrollable issues that may affect a tester’s performance should be taken into consideration.

Additionally, any particular metrics or KPIs that may be important to the software’s success should be prioritized during the testing process. QA management officials can then create a test case and use test management software to gauge the ability of their employees to carry out their essential job functions. This way, organizations can be sure that their software development process is being conducted at an optimal level.

Company officials can also assess the performance of their software testers by looking at contributions to the production process. Not all testers need to be able to write test scripts, but those who do should be creating effective tools that can be reused for later projects. This will save the organization on production costs further down the line.

Additionally, QA leaders should monitor employee activity to see if any individuals are carrying out redundant or overlapping duties. It should be noted, however, that these instances can be minimized by incorporating a sophisticated test management system to provide comprehensive oversight across QA operations.

Avoid tester evaluation pitfalls

QA leaders should also be careful about ascribing too much value to any one particular metric. For instance, many quality assurance efforts are measured by the number of flaws that are listed in bug reports. These figures can be misleading, however, as they are entirely dependent on the software itself and not the effort or performance of the testers.

More importantly, the number of bugs found during testing cannot provide a clear indication of the quality of the whole product. A QA team that finds fewer flaws but helps release a more usable and engaging software product will have successfully performed its duties. Above all else, QA teams will be evaluated in regard to their ability to shepherd high-quality software products to completion on time and under budget.
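To make the pitfall concrete, here is a small, hypothetical sketch: tester A files the most bugs, but a severity-weighted view of approved reports tells a very different story. All names, statuses, and weights below are invented for illustration; real evaluation schemes will differ.

```python
# Hypothetical bug reports filed by two testers during one test cycle.
reports = [
    {"tester": "A", "severity": "low", "status": "approved"},
    {"tester": "A", "severity": "low", "status": "rejected"},
    {"tester": "A", "severity": "low", "status": "rejected"},
    {"tester": "B", "severity": "critical", "status": "approved"},
]

# Illustrative weights; any real scheme would be agreed on per project.
SEVERITY_WEIGHTS = {"low": 1, "medium": 3, "critical": 10}

def raw_bug_count(reports, tester):
    """The naive metric: how many reports did the tester file?"""
    return sum(1 for r in reports if r["tester"] == tester)

def weighted_score(reports, tester):
    """A less naive metric: severity-weighted count of approved reports only."""
    return sum(
        SEVERITY_WEIGHTS[r["severity"]]
        for r in reports
        if r["tester"] == tester and r["status"] == "approved"
    )

# Tester A "wins" on raw count (3 vs. 1), but tester B's single approved
# critical bug outweighs A's work once rejections and severity are considered.
print(raw_bug_count(reports, "A"), raw_bug_count(reports, "B"))   # 3 1
print(weighted_score(reports, "A"), weighted_score(reports, "B"))  # 1 10
```

Even the weighted metric is only one input; as the article argues, no single number should dominate an evaluation.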

Sanjay Zalavadia is the VP of Client Services for Zephyr, who offers a real-time, full-featured test management system. Learn more about Zephyr right here. Be sure to also check out their tool review page over at uTest’s Tool Reviews.

Not a uTester yet? Sign up today to comment on all of our blogs, and gain access to free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

Latest Testing in the Pub Podcast Looks Forward to 2015

Fri, 02/06/2015 - 22:26

It’s been a while since the great folks over at Testing in the Pub have broadcast, but they are back with their first edition of the new year.

In this episode, Stephen Janaway and company look forward to 2015 and talk about the conferences, meetups and other events that they are interested in this year.

Listen to the full podcast right here for download and streaming, and on YouTube and iTunes. Be sure to check out the entire back catalog of the series as well. And speaking of events, uTest is also interested in a bunch of them in 2015, some of those which were discussed during this podcast. To learn more about TestBash and other conferences throughout the year, check out our Events Calendar.

Categories: Companies

uTest Announces Behavior-Driven Testing Webinar with Anand Bagmar

Fri, 02/06/2015 - 16:00

uTest is excited to announce another live webinar opportunity. Registration is now open for the webinar Build the “right” regression suite using Behavior-Driven Testing (BDT), with Anand Bagmar. In this webinar, participants can learn:

  • How to build a good and valuable regression suite for the product under test
  • Different styles of identifying / writing scenarios that will validate the expected business functionality
  • How automating tests identified using the BDT approach validates your business functionality
  • Advantages of identifying regression tests using the BDT approach
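As a rough illustration of the BDT style the webinar covers, here is a hedged sketch: a business-readable Given/When/Then test over a tiny invented shopping-cart domain. The domain, names, and structure are ours for illustration only; Anand’s actual approach and tooling may differ.

```python
# A tiny, hypothetical "shopping cart" domain used only for illustration.
class Cart:
    def __init__(self):
        self.items = {}

    def add(self, sku, qty=1):
        # Accumulate quantity per product SKU.
        self.items[sku] = self.items.get(sku, 0) + qty

    def total_items(self):
        return sum(self.items.values())

def test_adding_an_item_updates_the_cart():
    # Given an empty cart
    cart = Cart()
    # When the shopper adds two units of the same product
    cart.add("SKU-42", qty=2)
    # Then the cart reflects the expected business behavior
    assert cart.total_items() == 2

test_adding_an_item_updates_the_cart()
```

The point of the style is that the scenario reads as business behavior first, with the automation hanging off the Given/When/Then steps, which is what makes such tests good regression-suite candidates.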

Anand is a familiar guest blogger on the uTest Blog. His recent post Selenium: 10 Years Later and Still Going Strong takes a look at the ecosystem that Selenium has nurtured over the past decade.

Webinar Details

  • What: A live webinar presented by Anand Bagmar called, “Build the ‘right’ regression suite using Behavior-Driven Testing (BDT)”
  • When: Wednesday, February 18, from 2-3 p.m. ET
  • How: Register now. Seats are limited!

About Anand Bagmar

Anand is a hands-on and results-oriented software quality evangelist with 17 years in the IT field. Passionate about shipping quality products, Anand specializes in building automated testing tools, infrastructure, and frameworks. He writes testing-related blogs and has built open-source tools related to software testing: the Web Analytics Automation Testing Framework (WAAT), TaaS (for automating integration testing in disparate systems), and Test Trend Analyzer (TTA). Anand is the lead organizer for vodQA, the popular testing conference in India. Follow him on Twitter or read his Essence of Testing blog.

Not a uTester yet? Sign up today to comment on all of our blogs, and gain access to free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

21 of the Most Popular Test Automation Blogs

Thu, 02/05/2015 - 19:09

We’ve put a lot of stock into the most popular blogs in the QA/testing circuit, but some of these focus exclusively on functional testing concepts.

Steven Machtelinckx, author of the blog TestMinded, pointed out a great list from TestBuffet of the 21 most popular blogs from 2014 that focus exclusively on automation, an especially hot area in testing right now.

You can check out the full list right here. It’s a great roster, and one of the blogs on it is actually headed up by uTester Stephan Kämper.

Not a uTester yet? Sign up today to comment on all of our blogs, and gain access to free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

uTest Platform Update for the Week of February 2, 2015

Thu, 02/05/2015 - 16:30
“Perfection is not attainable, but if we chase perfection we can catch excellence.”

– Vince Lombardi

In our continued pursuit of chasing perfection in our tester platform, we’re pleased to preview this week’s latest platform release for uTesters on paid projects.

New Feature: Customizable Bug Report Template

Oftentimes when participating in a test cycle, there are specific requirements for how bug reports need to be completed. If a tester is filing multiple bug reports, this can mean entering the same information again and again.

To help make bug reporting more efficient, we have launched a new feature in the bug report form that lets testers create customizable bug templates (per test cycle) to configure:

  • Bug title prefixes or standard bug title content
  • “Actions performed” prefixes, such as prerequisites or common steps to carry out before your steps to reproduce
  • Additional environment information as requested by the Project Manager or Customer

To use this feature, you will need to access the bug report form. In the top right, you will now see two new links: “Clear Form” and “Configure Template.”

Platform1

To create a template, click the “Configure Template” link. On this screen, enter the data you wish to see displayed in each bug report for that cycle and then click “Save Template.”

Platform2

To go back to the bug report, select the “Report Issue” button at the top right. The form will now be pre-filled with the template you configured for this test cycle.
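Conceptually, the pre-fill behaves like merging a per-cycle template into a fresh report. The sketch below is our own illustration of that idea; the field names and values are invented, and this is not the platform’s actual implementation.

```python
# Hypothetical representation of a per-cycle bug template.
template = {
    "title_prefix": "[iOS 8.1] ",
    "actions_prefix": "1. Install build 2.3\n2. Log in as a test user\n",
    "environment": "iPhone 6, iOS 8.1.3, Wi-Fi",
}

def new_report(template, title, steps):
    """Pre-fill a bug report from the cycle's template, mimicking the feature."""
    return {
        "title": template["title_prefix"] + title,
        "actions": template["actions_prefix"] + steps,
        "environment": template["environment"],
    }

report = new_report(template, "Crash on checkout", "3. Tap 'Buy now'")
print(report["title"])  # [iOS 8.1] Crash on checkout
```

The benefit is the same either way: the repeated boilerplate is written once per cycle, and each new report starts from it.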

If you like what you see, feel free to drop a note on the Forums to share your ideas on these and other recent platform updates.

Categories: Companies

uTest To Host Live Security Testing Webinar With Dave Ferguson

Tue, 02/03/2015 - 16:00

uTest is happy to announce a new live webinar, Introduction to Security Testing + Live Demo with Dave Ferguson, scheduled for Friday, February 13, 2015 from 1-2 p.m. ET.

Dave is a former application developer and security consultant, and now a Gold-rated security tester in the uTest Community. Over the years he’s found too many vulnerabilities to count, including a particularly scary one at a top-tier streaming media company.

Dave is a member and contributor to the Open Web Application Security Project (OWASP) and resides in Texas, USA. He holds CISSP and CSSLP security certifications.

As you may notice, Dave is a familiar face on the uTest Blog. He was featured in our popular Testing the Limits series in 2014; you can read Part I and Part II and see what he says about the current security testing landscape. You can also follow Dave on Twitter or check out his blog, AppSec Notes.

In this webinar, Dave will describe important security testing concepts, including how it differs from functional testing, the special skills and knowledge needed, and critical tools. A brief security test against a live web application will be demonstrated, and participants will have a chance to ask questions in a Q&A segment.
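As a taste of the kind of quick check a security tester might run, here is a small, hypothetical sketch that flags common defensive HTTP headers missing from a response. The header list is illustrative, the function is our own, and this is not necessarily what Dave will demonstrate.

```python
def missing_security_headers(headers):
    """Return common defensive headers absent from a response (illustrative list)."""
    expected = [
        "Content-Security-Policy",
        "Strict-Transport-Security",
        "X-Content-Type-Options",
        "X-Frame-Options",
    ]
    # Header names are case-insensitive, so compare in lowercase.
    present = {name.lower() for name in headers}
    return [name for name in expected if name.lower() not in present]

# Hypothetical response headers captured from a web application under test.
headers = {"Content-Type": "text/html", "X-Frame-Options": "DENY"}
print(missing_security_headers(headers))
# ['Content-Security-Policy', 'Strict-Transport-Security', 'X-Content-Type-Options']
```

A missing header is not itself a vulnerability, but checks like this are a cheap first pass before deeper testing of the kind the webinar will cover.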

Registration is now open for this webinar. Please note: You must be a member of the uTest Community to register for this webinar, as registrations will be checked against membership prior to approval. Not a uTest member? Sign up and then register for the session. Becoming a uTester is 100% free and you’ll also have access to free training, networking, the latest software testing news and a lot more.

Seats are limited to 100 participants. Registrations will be processed on a first-come, first-served basis.

Other video courses, including recorded versions of previous webinars, are also available at uTest University.

Categories: Companies