We here at the uTest Blog have long been proponents of a harmonious tester-developer relationship. And according to a recent InfoWorld article, the best way to a dev’s heart this holiday season may be through testing their open source code.
According to the piece, “for proprietary software, the only option is to suck it up and hope your vendor will fix problems with a future release. But with open source software,” testers can actively be part of the action and “make contributions that lead to them being more effective for less effort.” This is the key to the developers’ hearts because devs on open source projects don’t have the support teams of robust enterprise players — they’re going to rely a lot on testers taking action versus complaining.
Here are the main benefits of testing open source code this holiday season:
- It’s good for your blood pressure!
- It makes the experience better for users of the open source program worldwide
- It’s better than complaining about crappy code (devs get enough of that feedback already, and in the open source realm often can’t act on it given time and resource constraints)
- It turns user feedback into something constructive (a bug report) that can be quickly used for a fix
So enjoy the full read, and give back this holiday season by testing your favorite open source project. Your developer is guaranteed to love the gift.
It’s been a while since we last updated the testing and development world on the most popular devices amongst our community of 150,000+ testers. But we thought — what better time than the holidays to get your favorite tester a gift?
Testers within our community often want to know on which devices they should be testing. Concurrently, developers also want to know where their babies should be given the most love. Based on customer and tester data from our platform, here are the 10 most popular mobile devices on which Applause customers’ apps were tested in the past 90 days:
- iPhone 5
- iPhone 6
- iPhone 4s
- iPhone 5s
- Galaxy S4
- Galaxy S3
- iPad 2 Wi-Fi
- iPad Mini Wi-Fi
- iPhone 4
- iPad Air Wi-Fi
As you can see, the iOS experience continually dominates our customers’ worries (especially the new iPhone 6), but some of our honorable mentions are in the Android realm. These include the Galaxy S5, Nexus 5, and Nexus 7. As an added bonus, the Nexus 6 has also surfaced as a device that many customers are starting to seek in terms of coverage for their apps.
This will of course fluctuate as soon as the next big device comes out (will wearables change the mobile scene next year?), which is why we plan to update you every quarter on the hottest devices. On the flipside, as the iPhone 4 and 5 can attest, having been released several years ago now, it’s always a safe bet that as long as a device still has widespread adoption “in the wild,” there’s a need to test your apps on it.
Happy Holidays, and enjoy your new mobile device when testing this season.
The holidays are upon us, but we here at uTest aren’t kicking back and relaxing. My colleagues and I have been hard at work building new features that focus on efficiency, and also optimizing existing features to improve performance. Here’s what was deployed today.

Tester – More Payment Statistics
The uTest Community is a great social network to connect to other testers, but at the same time a great place for professional testers to earn additional income. Given that we have many testers and Test Team Leads working for uTest full-time, we want to provide better insight into their earnings — vital for any freelancer. Therefore, we have added an Average Monthly Earnings statistic to the payments view:
You will be able to see how much money you earned per month in the current year as well as in the past, giving you better insight into your actual income generated at uTest.

Tester – Faster Navigation Between Cycles
The chat sidebar which lists your active and locked test cycles is the feature which testers interact with the majority of the time. As such, to improve navigation workflows in the platform, we have extended the existing navigation features to the chat sidebar:
The chat room names will now feature a navigation icon that will take you to that test cycle overview tab on left-click. The same feature has been added to open chat room windows, where we added the arrow icon to the top right of each chat room:
Now, when someone posts an announcement in chat and you need to review changes made to the Scope & Instructions, you can easily get there with just one click.
Note: This functionality completely replaces the former “Cycles in Progress” sidebar, which was causing a number of UI issues.

Tester – Performance Improvements
Besides removing redundant network calls to reduce traffic, we tweaked the performance of both the bug report form and the bug details pages.
The bug details should now load at least 25% faster than before, and the bug report form will open instantly when accessed multiple times in a row, which fits well into the usual bulk-bug-reporting workflows that our testers follow.
We are now caching more information, helping testers with slow connections have a more enjoyable experience while reporting issues.

TTL – Tester Messenger Template
We strive to constantly improve the workflows of our testers and our Test Team Leads (TTLs). Going forward, to improve communication between testers and TTLs, TTLs will have a message template that incorporates a greeting and a signature. The template is configurable, so TTLs can choose how they want to greet testers: some will use the first name, some may use the last name, and some might not use the template at all (it is an optional feature).

TTL – Quick Triage Buttons
TTLs who can recommend approval or rejection generally send the same recommendation again and again on the bugs they triage, since the majority of reported bugs are reproducible, in scope, valid and not duplicates.
From now on, TTLs will have three buttons in a dropdown on the recommendation button to allow quicker bug triaging for the common scenarios:
Selecting one of these options will pre-fill the form with the following options preselected:
The only variation between these presets is the issue’s reproducibility, which varies more often. TTLs are still able to send a recommendation manually, but the presets will save a lot of time in the 80% of triage situations they cover.
If you like what you see, feel free to leave your comments below, or share your ideas on these and other recent platform updates by visiting the uTest Forums. We’d love to hear your suggestions, and frequently share this valuable feedback with our development team for future platform iterations!
With over 10 years’ software development and testing experience, Alexander Waldmann hails from Germany. He is a senior software developer, and global security team lead for Applause and product owner of the community testing services components for testers. Alex constantly strives to improve our business and the quality of our community’s output through a better user experience for our testers. He is also a Gold-rated tester in the community and has been a uTester for three years.
After a very successful initial launch earlier in the year, uTest is proud to announce the Testers of the Quarter for Q4 2014 in the community.
If you remember, this quarterly program exists solely to recognize and reward the rock stars of our global community. Testers recently concluded voting for their peers and mentors, recognizing their dedication and quality work across various facets of uTest participation, including test cycle performance, course writing and blogging.
In addition to the callouts below, you can also view their names now in our eternal uTest Hall of Fame, which also includes all past uTest award-winners from 2009 and beyond:
Outstanding Forums Contributors
Milos Dedijer, Serbia
Marek Langhans, Czech Republic
John Schultz, United States
Patryk Raba, Poland
Outstanding uTest University Instructors
Lucas Dargis and Allyson Burk, United States
Alex Siminiuc, Canada
Outstanding Test Team Leads (TTLs)
Romulo Olivera, Brazil
Stephanie Russon, United States
Carl Schrader, United States
Federico Capucci, Argentina
Outstanding Testers, TTLs’ Choice
Karthik Vallabhaneni, United States
Andrei Kouhan, United States
Nicolas Bellagamba, Argentina (tie)
Patryk Raba, Poland (tie)
A big congratulations to all of those that had the distinction of being recognized by their peers for Tester of the Quarter this time around. We cannot reiterate enough that there were also countless other testers that got individual praise along the way. While their names may not be here, their hard work did not go unnoticed!
Stay tuned for some exciting recognition programs within the community as we move into 2015, including our most prestigious awards in uTester of the Year, and the introduction of our very first Lifetime Achievement Award. Be sure to also leave your congratulations in the Comments below, or visit the Forums to see the full announcement…along with some of the tester praise that led to these distinctions!
QA analysts and IT firms are often confronted with the same question when testing a software project — whether to go with manual software testing options or to try out new automated techniques.
In certain situations, there are clear advantages to working with automated software testing solutions, and other times the automated software technology is too leading-edge and could wind up costing you way more than it’s worth. That’s why it’s essential to weigh the costs and benefits according to each project.
Manual Software Testing
Manual Software Testing is the process of going in and running each individual program or series of tasks and comparing the results to the expectations, in order to find the defects in the program. Essentially, manual testing is using the program as the user would under all possible scenarios, and making sure all of the features act appropriately.
Selecting every setting within a software package can be rather tedious. For instance, you might be testing global software for 300 countries, so you need to make sure each country aligns with the appropriate currency. To test this manually, you would select a country and verify that it shows the appropriate currency. But if a program has only a few options, it is much more manageable to manually run through the selections and the outcomes of each.
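The country-to-currency check above is exactly the kind of repetitive verification that illustrates the trade-off. As a rough sketch, here is how a table-driven check might look in Python; the country list, expected currencies and `currency_for` stand-in are all hypothetical, since a real test would read the displayed currency from the application under test:

```python
# Illustrative expected country-to-currency mapping (not real test data).
EXPECTED_CURRENCY = {
    "United States": "USD",
    "Japan": "JPY",
    "Germany": "EUR",
    "Brazil": "BRL",
}

def currency_for(country):
    # Hypothetical stand-in for the application under test; a real check
    # would drive the UI or an API to read the currency actually shown.
    return EXPECTED_CURRENCY.get(country)

def check_currency_mapping(expected):
    # Compare the currency the app shows against the expected one
    # for every country, collecting any mismatches.
    failures = []
    for country, currency in expected.items():
        actual = currency_for(country)
        if actual != currency:
            failures.append((country, currency, actual))
    return failures

print(check_currency_mapping(EXPECTED_CURRENCY))  # [] means every country passed
```

With 300 countries, the loop stays the same and only the table grows, which is why this style of check is a natural automation candidate while a few-option program is still easy to verify by hand.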
Usually, when you’re working for a small company with few financial resources, this is going to be your best option. A big advantage of manual testing is the ability to see real user issues: unlike an automated script, a manual pass lets you see the bugs a real user could face.
Manual testing also gives you a lot more flexibility. With automated tools, it’s difficult to change values in the program once you’ve begun testing. When testing manually, you can quickly check the results and see which ideas work best.
In general, automated testing wouldn’t make sense for short-term projects because the upfront cost is too high. In addition, if you’re testing for things that require the human touch like usability, it’s better to have a ‘human’ tester. Companies that have little expertise in the area are also recommended to begin with manual testing. Once the team has mastered testing risks and test coverage, they can then move toward automation.
Automated Software Testing
Automated software testing uses automated tools to run tests based on algorithms to compare the developing program’s expected outcomes with the actual outcomes. If the outcomes are aligned, your program is running properly and you are most likely bug free. If the two don’t align, you have to take another look at your code and alter it and continue to run tests until the outcomes align.
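The expected-versus-actual comparison described above can be sketched in a few lines. This is a minimal, hypothetical example (the `apply_discount` function and its test cases are invented for illustration, not taken from any real project):

```python
def apply_discount(price, percent):
    # Hypothetical function under test: apply a percentage discount.
    return round(price * (1 - percent / 100), 2)

def run_checks():
    # Each case pairs inputs with the outcome we expect; the harness
    # reports every case where actual and expected disagree.
    cases = [
        ((100.0, 10), 90.0),
        ((59.99, 0), 59.99),
        ((200.0, 25), 150.0),
    ]
    mismatches = []
    for args, expected in cases:
        actual = apply_discount(*args)
        if actual != expected:
            mismatches.append((args, expected, actual))
    return mismatches

print(run_checks())  # [] means expected and actual outcomes align
```

An empty result means the program behaves as expected; any mismatch points you back at the code to alter it and rerun, which is the loop the paragraph above describes.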
According to John Overbaugh, a senior SDET lead at Microsoft, “It only makes sense to use automated testing tools when the costs of acquiring the tool and building and maintaining the tests is less than the efficiency gained from the effort.”
Automated testing is best to use when you’re working on a large project, and there are many system users. The biggest advantages of automated testing are its relative quickness and effectiveness. Once your initial test is set up, it’s an easy process to repeat tests, continuously fill out the same information and everything is done for you automatically.
Richard Fennel, the engineering director at Black Marble, explains the significance in using automated tools: “The addition of automated testing has helped to shorten the delivery cycle, as we are no longer limited to the slow and complex SharePoint development experience. It has not removed the need for the traditional development cycle completely, but much of the validation, particularly for web parts, has been made far easier.”
Automated testing also keeps developer minds more engaged. It’s tedious work typing the same information into the same forms over and over again in manual testing, while setting up test cases in automated testing takes a technical mind and keeps you on your toes. It’s also more visible to the rest of the team: when using automated tests, any member of the team can automatically see the results published to the testing system. This provides for better team collaboration, and a better overall product.
There are quite a few considerations to make based upon your project when deciding to go with manual or automated testing. Make sure you make an educated and well-informed decision that’s best for your specific project needs.
Do you have any other suggestions on whether to choose automated tools or just go manually? What’s worked for you in the past? Feel free to sound off in the comments below.
Eli Lopian is the Founder and Head of Products & Technology at Typemock. He enjoys a white board, code and transforming developing environments. Secretly, his one true love is Unit Testing and he has dedicated his life to making unit testing easier for everyone else.
Scripted testing naturally seems like a match made in heaven for the novice tester.
After all, you have steps and directions clearly defined — wouldn’t the inviting structure of scripted testing compensate for a lack of experience on the part of the tester? Not necessarily, if you ask our uTesters, who recently tackled the topic in a lively Analyze This testing debate in our uTest Forums.
Most of our community members found that while experienced testers may be spending their time creating test cases and junior testers executing them, there were several notable reasons as to why executing these important steps can’t just be left exclusively to the novice.
Junior Testers Can’t (Yet) Add Value
According to one tester, there may be improvements that need to be, or could be, made to streamline the test case process that the novice couldn’t yet imagine:
Given sufficient time, it is desirable for senior testers to execute scripted tests to 1) identify test cases which are already obsolete (have them deleted) or are not as effective anymore (rewrite or re-organize some test cases so they will be tested differently from that point on) and 2) catch new test cases which were not yet scripted.
Their Abilities are a Wildcard
Forrest Gump was all about the philosophy that ‘life is like a box of chocolates. You never know what you’re going to get.’ Color the novice tester as one of those chocolates:
In the end, these are brand new testers… what are they missing? You just really don’t know. Maybe not because they are junior, but because you just don’t know their ability yet. Or their reliability, either. It is always good to have that more experienced eye to give your scripted testing a run to make sure that you have the coverage you need.
It Involves Going Above and Beyond the Test Case
Does the junior tester have the ability to go above and beyond the written test case and cover things outside of the explicitly written instructions? If not, they may not be doing their job as a tester:
If they are really new to the field of testing, then whatever the script, even if well-written and detailed, will be of no help to stakeholders if the tester cannot think and test outside of the steps. It’s not all about what is on the script that matters.
What do you think? Should the work of the scripted test case fall exclusively into the hands of the junior tester? We’d like to hear from testers outside of the realm of the Forums in the comments below.
uTest is proud and excited to report that Governor of New York Andrew Cuomo’s Regional Economic Development Council has awarded $1.8 million to Per Scholas and its Urban Development Center (UDC) partner Doran Jones. The grant will help fund a software testing center currently being built in New York, bringing 150 software testing jobs to The Bronx.
“We are excited to receive this investment from the state for our expansion plans of the Urban Development Center,” said Keith Klain, CEO of Doran Jones in a Per Scholas press release. “It’s validating our vision for the next New York tech corridor in the South Bronx.”
Per Scholas is a non-profit with the mission of breaking the cycle of poverty by providing technology education, access, training and job placement services for people in underserved communities. You may remember that uTest was in The Bronx, NY, earlier in 2014 celebrating the groundbreaking of the $1 million, three-story, 90,000-square-foot software testing center. We’re proud of what Per Scholas, in conjunction with Doran Jones, is accomplishing by bringing valuable software testing jobs to areas that most need this employment.
A huge, well-deserved congratulations to Per Scholas and Doran Jones! You can also read the full Per Scholas press release for more details.
If you’re an Android user with a recent phone, chances are you’ve already played around with some of the cool features of Android 5.0, officially dubbed ‘Lollipop.’ If not, don’t worry, Galaxy S5 and other phone users, your time will be coming soon.
But as a tester or developer, there’s not much out there on what those changes mean for you, so we’ve compiled some new resources from uTest University and our friends at ARC not only about the fancy, shiny new things available with the new version of Android, but specifically what testers and devs need to know:
- What Software Testers Need to Know About Android Lollipop
- The Features of Android 5.0 Lollipop
- The App Developer’s Guide to Android Lollipop (e-book, requires registration)
While you’re checking out what testers need to know about Lollipop at uTest University, be sure to also check out all of the Android testing courses available as well.
Our goal each month is to recognize uTesters that have joined our community in the past two months and have shown that they are on their way to becoming testing rock stars.
We looked at how uTesters performed in test cycles and how they engage with the community when we considered potential candidates. Here are a few of the items we factored in:
- How often did you participate in the cycles that you accepted?
- What is the quality of your submitted work? (See our uTest University course for more details)
- How efficiently did you complete your projects?
- What was your interaction level on the various uTest domains (Forums, University, Events, Tools)?
All newly registered uTesters who have a Proven or lower Functional testing rating on paid projects are eligible for this recognition.
Without further ado, here are December’s Rising Stars:
Please help us in congratulating our awesome new uTesters, and be sure to stop by and say hi over in the Forums.
uTest recently conducted a survey within its own community for the first time, surveying uTesters on what motivates them, their testing aspirations, their views on certifications as a whole, and some of the biggest pain points in their organizations.
The survey was not scientific and wasn’t designed to be (our corporate brand lovingly took that on earlier this year surveying the greater testing industry), but rather aimed to provide the outside world a glimpse into what makes our community tick and give uTesters insight into what drives their peers.
The survey was launched on the uTest Blog on November 7, and submissions ran for just over two weeks. There were 125 total respondents — 80 male, and 44 female — ranging from entry-level QA/testers to senior-level test analysts.
The major qualification here was that respondents not only had to be a uTester, but had to make their primary source of income as a software tester. Here’s the story the submissions told us.

To Certify or Not Certify? That is the question.
Certifications as they relate to testing are something that couldn’t be any more polarizing to testers, and the story was no different in the 2014 State of the uTesters survey.
Out of the 125 respondent pool, 44% actually did have a testing-related certification, and 46% did feel like having one was important. But when we dug into why uTesters thought these certifications were important, the reasons were that they were a door-opener…and not much of anything else:
“It’s used only to filter out the resume during the interview process. I believe testing certifications do not check your testing intelligence, but rather, check for your theoretical knowledge. It benefited me during the interview process.”
“It helped me find my first job in a foreign country (Canada) where everyone insisted on North American experience. But skill-wise? It hasn’t benefited me in any way, AT ALL.”
While it may be surprising that so many of our uTesters have testing certifications, their stance once they have the certificate in hand falls in line with many testing experts, including The Social Tester, who recently told uTest that “employers [use] certifications as a filter because it makes recruitment easier and it gives them a benchmark of assumed competency within the team…It will take massive change by hiring managers to see the problems with certifications and to ask ‘is there a better way of hiring good testers?’”
Perhaps hiring managers will use this survey as a starting point for that change when they see that our testers view certifications as pretty worthless once the job is in hand.

Job Titles
James Bach would be angry if there were a lot of ‘testers’ in our survey with a QA label. The phrase QA or Quality Assurance appeared in nearly half of the job titles in our survey.
QA Engineer was the most common job title in the results, followed closely by Software Testing Engineer. It is interesting to note that the latter title was used predominantly by testers outside of the United States, while the third-most-popular title of QA Analyst (or a similar variant with Analyst in the title) was a phenomenon unique to the United States.

My biggest pain in the…
When it comes to organizational pain points, the State of the uTesters 2014 survey didn’t beat around the bush — we flat-out asked testers for their candid feedback on what’s bothering them the most in their companies.
By a resounding margin, the biggest gripes centered around having enough time to test, coupled with a lack of resources (they go hand in hand; having more of the latter would help alleviate the former):
“When other parts of a project are late, the installation date stays the same and testing time is cut to make it. Any bugs found in production are thus QA’s fault for not catching them – not development’s for putting them there.”
Pay was also cited (especially in respondents from Eastern Europe) as a secondary concern, although not nearly as widely as being overtaxed on testing activities.
“I feel we are not paid as well as programmers, but are expected to know much more than a programmer who may be a master at one aspect of a program, while testers are expected to know ALL. We juggle project deadlines, builds that are late and are often expected to work nights and weekends. I feel we should be compensated much more than we are.”

What I want to be when I grow up…
“I would love to say that I drew a software tester in my 1st grade art class when the subject was, ‘What I want to be when I grow up,’ but no. I came into testing by accident, and got invited to an interview while I was still a college freshman. I took the job and loved it.”
While some of our testers weren’t breathing software testing as young ones, they certainly retained the sharp creative wit, as evidenced by this response, that would serve them well in a testing career later on. Most of the testers in our 2014 edition of the survey got into testing by chance. The second and third most common routes were career switches: moving from development to testing (rather surprising!) and starting in IT in another capacity (i.e. tech support) before moving to testing, respectively.

I can’t get any respect?
If there were any lingering notions that testers were the redheaded stepchildren of their organizations, you wouldn’t have found them buried in the answers of the inaugural edition of this survey — a whopping 75% of respondents felt that not only did their organizations value QA/testing as a whole, but as a tester, their career paths were quite clear within their own organizations.
However, climbing the ladder wasn’t as within reach for the 25% of testers who felt there wasn’t much room for advancement. In some testers’ personal situations, the organizational makeup was “pretty flat”; there wasn’t room for advancement purely because the positions don’t exist.

The Road Ahead
As we look to 2015, we want to hear your own stories outside of uTest.
Do these scenarios ring true within your organization? Can you identify with some of the challenges? Is there something completely out of whack with the State of the uTesters in 2014? Sound off in the comments below, and we hope you enjoyed reading our inaugural report.
When I started to do accessibility testing, I would rely too heavily on HTML validators to verify if a site is compliant. I would use the web developer toolbar, W3C validator, and would do some basic testing with a screen reader, not fully understanding the complexity of the needs of the end users.
As I gained more knowledge of the end users and the constraints of technology, I had to take a step back from relying on the tools, due to their many false positives, and test the reported errors manually to ensure I reported actual bugs. For instance, the W3C validator would claim a variable had been duplicated, but one “duplicate” would be in a comment describing the variable — not a bug.
I have also found that what is marked as a semantic error is not always an issue for assistive technology, while something semantically correct may still be an issue (for example, an anchor with both an aria-label and a title may have both read out). There are also some HTML validators, like the WAVE tool, that will highlight errors other validators will not highlight – so which one is correct?
Some other tools, like color ratio analyzers, will miss valid bugs, or mark a failure that is not actually a failure. For instance, they will sometimes mark two colors as breaking the 1.4.3 checkpoint, but when you inspect the elements to check the colors actually used, you find they do not break the checkpoint. One such example is the Juicy Studio tool: the Firefox add-on tends to report a ratio slightly out from the actual one compared to their website analyzer. Often, I will note the highlighted error, and then validate the actual colors used on the website to see if the failure is genuine (which is also mentioned in a review of the tool).
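When validating a tool’s reported ratio by hand, it helps that the 1.4.3 contrast calculation is spelled out in WCAG 2.0 itself. A minimal sketch of that published formula, useful for cross-checking what an analyzer claims against the colors you actually inspected:

```python
def _linearize(channel):
    # Convert an sRGB channel (0-255) to its linear value,
    # per the WCAG 2.0 relative luminance definition.
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # Ratio of the lighter luminance to the darker, each offset by 0.05;
    # WCAG 2.0 AA (1.4.3) requires at least 4.5:1 for normal-size text.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # 21.0
```

Feeding in the RGB values you read from the element inspector gives you an independent number to compare against the tool’s verdict before deciding whether the flagged failure is genuine.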
Finally, another reason I make sure I can test without the tools is that some websites will prohibit the use of them, or, if they can be used, the website content is in a widget that cannot be validated. In such cases, a validator is of no use at all. Instead, you will have to test the site and note the failures manually. If you heavily rely on validators to do the work, this will be a difficult task.
I have found the best way to be a good tester in any discipline is to be able to check issues found manually, and to work out if the issue is valid in the scope given. This is something that can take years to understand and learn. One way I have found to help testers understand the complexities of issues, and their impact, is to search on technical forums like Stack Overflow, to see if others have found the issue, how it can be resolved and how relevant it is to the objective of the cycle you are testing.
In short, tools are a good starting point for testing, but should not be relied on to do the testing for you. There are always examples of bugs that cause developers massive headaches yet are only “valid” to the tool that flagged them. Make sure you do not fall into this trap: treat tools as the friend that is not always reliable!
Helen Burge has been testing for over 10 years in a mixture of disciplines including functional, accessibility (WCAG 2.0 checkpoints using JAWS or NVDA), performance and automation. She is a recognized accessibility expert with several articles in uTest University.
Back in May, you may remember that our company rebranded to Applause and that we, the uTest team, relaunched as an open community that exists to promote and advance the software testing profession.
As Matt Johnston, Chief Marketing & Strategy Officer, noted in his blog post at the time, “In line with this new mission, we will build a million-member testing community. Together, we’ll create the most inclusive, informative and important brand in the lives of testers. This means much more than simply getting paid projects.”
We introduced a brand new uTest.com back then and have since introduced many other enhancements over the past six months, such as:
- A revamped Events calendar
- “Follow Me” capability for community members
- Leaderboard for members with the most followers
- uTest.com Profile improvements
- A recommendations feature that allows fellow testers to commend each other for great work
Now that we are approaching the New Year, it’s time to look back at our year and to look forward at the exciting things we have planned for 2015. Join Matt and members of the Community Management team on Tuesday, December 16, 2014, for our end-of-year Town Hall Meeting that includes a preview of upcoming feature enhancements to uTest.com and a live Q&A session.
Register for the time that works best for you. Seats are limited, so please register for one time slot only:
A recording of the session will be posted on our site for community members who cannot attend the live sessions. We look forward to speaking with all of you on Dec. 16!
Sustaining a career in software quality assurance (QA) and software testing means constantly balancing keeping up with the latest technology, keen attention to detail and process, and understanding how to communicate (well) with the customer and testing team.
For this contest, we are looking for your best software QA and software testing career advice or tips for a chance at a $250 cash prize! Ideally, your entries will have a minimum of 600-800 words and include cited sources where necessary.
Sample topics may include the following (but we strongly encourage you to come up with your own!):
- 5 tips to ace the interview
- Resources to improve your testing abilities
- 3 bad testing habits you need to break
You can submit your entry directly to this thread in the uTest Forums (MS Word or plain text files preferred) by the end of day Friday, December 19, 2014! The winner will be announced the week of January 5, 2015.
A few small rules:
- You must be a uTest community member. If you or a co-author are not a community member, you can register today.
- The prize is awarded to the author or authors listed in your entry. If your submission has more than one author, the prize money will be split evenly among them.
- Your entry must be original content. Any entries that are found to be plagiarized will be disqualified.
- One entry per person, please!
Any questions about this contest? Contact us.
Our friends at BlazeMeter last week hosted a live session giving testers and developers everything they need to run performance testing with JMeter, the popular open source load testing tool. And we’re happy to share the session here on the uTest Blog.
The hour-long session starts with an overview of performance testing, then moves on to how to run performance testing with JMeter and why it’s worth using BlazeMeter with JMeter, and concludes with a Q&A hosted by BlazeMeter’s Ophir Prusak.
BlazeMeter, a proud uTest partner, provides next-generation, cloud-based performance testing solutions that are 100% Apache JMeter™-compatible. The company was founded with the goal of making it easy for everyone to run sophisticated, large-scale performance and load tests quickly and affordably.
After viewing, you can also check out our Load and Performance Testing course track at uTest University for even more in-depth learning.
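If you want to get your hands dirty before watching, a JMeter load test can also be run headlessly from the command line. A minimal sketch (the `.jmx` and `.jtl` file names below are placeholders — any test plan saved from the JMeter GUI will work):

```shell
# Run an existing test plan in non-GUI mode (recommended for real load tests)
# -n : non-GUI mode
# -t : path to the test plan (.jmx) file
# -l : file in which to log sample results
jmeter -n -t my_test_plan.jmx -l results.jtl
```

Running without the GUI keeps JMeter’s own resource usage from skewing your load results.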
Testing on a smartphone or tablet has become ubiquitous as more and more developers produce mobile apps. That said, there are always new ways to sharpen your mobile testing skill set.
Whether you are new to software testing or are a veteran tester, the mobile testing course track in uTest University has something for everyone.
What are the differences between iOS and Android testing?
This course reviews the main characteristics of iOS and Android, and outlines the impact of those differences on testing. You can also pick up tips and hints for testers, such as how to install an app, how to capture screenshots and video, and how to access log files.
Android Debug Tools for Capturing Logs, Screenshots, and Videos
This course provides instructions on how to accomplish the most common tasks when testing on the Android OS.
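For a taste of what the course covers, the standard Android Debug Bridge (adb) ships with commands for each of these tasks. A quick sketch (the device paths and file names are illustrative):

```shell
# Dump the current log buffer to a file for a bug report
adb logcat -d > bug_report.log

# Capture a screenshot on the device, then copy it to your computer
adb shell screencap -p /sdcard/screen.png
adb pull /sdcard/screen.png

# Record up to 30 seconds of on-screen video (Android 4.4+), then copy it over
adb shell screenrecord --time-limit 30 /sdcard/demo.mp4
adb pull /sdcard/demo.mp4
```

All of these require USB debugging to be enabled on the device and a connected `adb` session.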
How to Set Up Charles Web Debugging Proxy
The two courses for this testing tool — one for iOS devices and one for Android devices — look at how to install Charles, as well as how to find proxy information and update the settings for your device and computer.
Check out these courses and more in the mobile testing course track in uTest University!
About uTest University
uTest University is free for all members of the uTest Community. We are constantly adding to our course catalog to keep you educated on the latest topics and trends. If you are an expert in UX, load & performance, security, or mobile testing, you can share your expertise with the community by authoring a University course. Contact the team at email@example.com for more information.
James Donner is a Gold-rated tester on paid projects at uTest, based out of the United States. He is a professional driver, hobbyist programmer, software tester and perpetual learner. James has been interested in computer software for many years, and like many other uTesters, has a B.S. in Computer Science.
Be sure to follow James’ profile on uTest as well so you can stay up to date with his activity in the community!
uTest: Android or iOS?
James: Wait a minute…no Windows Phone option here? I love, and keep at least one of, all three. In the past, I’ve always gone with Android phones and iOS tablets. I think the iPad has always been the best tablet. I’m about to shake things up by going with a 5c as my main phone, and I’ve just ordered a Nexus 9 tablet. I really enjoy variety and testing on any/all of these devices.
uTest: What drew you into testing initially? What’s kept you at it?
James: I’ve always been interested in computers and especially computer software. I’ve dipped my feet into programming and published some very simple native mobile apps. I just wanted to try the other side of things and see if I liked to break things. It turns out that I do.
I’m only a part-time tester via uTest right now. What keeps me coming back at this point is the wide variety of software we get to test. And testing also helps me to justify purchasing some of the new devices I love so much. It’s nice to make a little money, too. I’m used to paying a good bit in order to take classes, so the paid software testing flips this around a bit.
uTest: What’s your go-to gadget?
James: I’ve settled on a MacBook Pro running Boot Camp as my main device. Although I still have several desktop systems and lots of mobile devices, I use this laptop daily. It is nice to be able to boot up Windows or OS X on the same system that I can take anywhere. I may receive hate mail for mentioning this, but I mainly run Windows 8.1 on my Mac. It actually works quite well.
uTest: What is the one tool you use as a tester that you couldn’t live without?
James: I think the most common answer to this question is without a doubt Jing. It is the one tool everyone uses and I’m certainly no exception. However, I also use Any Video Converter on a regular basis. I frequently capture videos using any mobile device I have handy. The thing is, most of these videos can bulk up size-wise very quickly. Aside from long upload times, there are limits to the file size that can be uploaded in the uTest platform or even required by the customer. Any Video Converter is freely available and does a great job bringing files down to a reasonable size for most of my bug report attachments.
uTest: What keeps you busy outside testing?
James: As I’ve already mentioned, I’m currently only a part-time tester. My day job involves driving a dump truck, or on occasion, running or hauling a piece of heavy machinery. In fact, I can legally drive any vehicle ranging from a motorcycle to a tractor trailer.
Additionally, I have two boys ages 13 and 8, so I try to spend as much time as possible with them.
You can also check out all of the past entries in our Meet the uTesters series.
If you’ll remember, this quarterly program exists solely to recognize and award the rock stars of our global community, and differs from uTester of the Year in that it puts the power of nominations directly in the hands of our testing community.
Testers can recognize their peers’ dedication and great work in various facets of their participation at uTest, from course writing and blogging to test cycle participation, and can recognize mentors who have helped them along their testing journey on paid projects at uTest.
Once the nominations are tallied up, winners will have their names forever enshrined in the uTest Hall of Fame, our recognition hub for the uTest Community, amongst prior Tester of the Quarter and uTester of the Year winners.
If you are a uTester, submit your nominations now through December 15, or just drop by to see what great things testers are saying about their peers. Don’t forget to also see our inaugural Testers of the Quarter.
Good luck to all!
Rob Lambert (aka The Social Tester) is a veteran Engineering Manager building a forward-thinking, creative and awesome team at NewVoiceMedia. His mission is to inspire testers to achieve great things in their careers and to take control of their own learning and self development.
Rob is the author of Remaining Relevant, a book about remaining relevant and employable in today’s testing world. Rob is a serial blogger about all things product testing on his own site, and is also active on Twitter @rob_lambert.
In this uTest interview, Rob discusses what makes a passionate tester, what holds testers back from getting the jobs they want, and the power of social media in the testing world. At the conclusion of the interview, you’ll also receive a link to an exclusive discount for the purchase of his book ‘Remaining Relevant.’
uTest: You recently posted that one of the eight lessons you learned from building and growing a test team was finding ‘people with a passion for testing.’ What is a ‘passionate’ tester to you?
Rob Lambert: I believe passion shows itself in a number of different behaviors. The first behavior to observe is that of a deep curiosity for the work being done and the surrounding environment people are working in. I look for people who wonder what the other testers do. They ask, “How could I do this better?” and “What problem does the software solve?” and “What does this company I work for do?”
The second behavior to observe is one of self-driven learning. Testers who are passionate about their career aim to be better at what they do every day. They read, watch and study topics and subjects that help them to become better testers.
A third behavior to observe is one of acceptance — acceptance that being a tester is OK. Too many testers believe that testing is inferior to other software project roles and that being a tester is a terrible job. Passionate testers show that they are happy and proud to be a tester.
The big challenge I have is working out whether it’s the passion for testing that drives these behaviors or whether exhibiting these behaviors fuels the passion.
uTest: And you also don’t think highly of certifications. What is it going to take for employers to stop seeking them? They obviously still wield power in many organizations.
RL: It’s not necessarily the certification and associated schemes that I don’t think highly of — it’s the way the industry uses the certifications as a marker of excellence.
In my wide experience of hiring testers, I can categorically say that I’ve never seen a correlation between excellence and the possession of a certification. But that doesn’t mean I don’t think the certification could be useful, especially when used in conjunction with a wide blend of learning resources. The problem is that many employers simply use certifications as a way to filter candidates – and they are losing out on hiring good testers.
Many employers though will not stop using certifications as a filter because it makes recruitment easier and it gives them a benchmark of assumed competency within the team, especially important if working on compliance-based projects. Some recruiters, hiring managers and the certification providers seem to be living in a happy symbiosis. It will take massive change by hiring managers to see the problems with certifications and to ask “is there a better way of hiring good testers?”
I refuse to get into recruiting and managing based on a certification. I’d rather focus on behaviors and results. In my experience, those testers that show valuable behaviors (good testing, good communication skills) and achieve the results we’re after (good software, effective teams) don’t tend to put their careers into the hands of hiring managers looking for certified testers.
uTest: You’re ‘The Social Tester.’ What have been some of the more rewarding experiences socializing with fellow testers at industry conferences?
RL: In my opinion, socializing at conferences is where the real conferring happens. The conversations happening outside of the conference talks are where the really interesting conversations take place. I’ve made lifelong connections with interesting and thought-provoking people at conferences. I may only see these people once a year, but it’s like we’ve been old friends when we meet up.
My fondest memory was at a conference about four years ago when I met two close friends in the testing world, Kris and Alex. Every time we meet up now at an event, it’s fabulous and it makes the event so much more rewarding. A conference shared and enjoyed with friends is, in my opinion, a much more rewarding conference.
uTest: On a side note, you also designed a pretty damn cool ‘It Works on My Machine’ t-shirt. Have you bought a bunch already for your developer friends?
RL: Thanks – glad you liked it. I haven’t bought any yet for my developer friends, but I did have a reader put in a request for quite a few orders. He was buying each and every member of his development team a t-shirt.
uTest: From your deep experience in hiring testers, your book focuses on testers taking control of their careers. What’s one of the biggest things holding testers back today from finding the job they want?
RL: I think the biggest thing holding most testers back from finding the job they want is a lack of confidence in their abilities. Many testers are much better at testing than they believe themselves to be. I wrote the book to help people draw out their abilities, skills and goals and to then help them to clearly articulate and shout about themselves.
There are so many talented testers working for companies that simply don’t see this talent or allow it to grow. I believe this has the effect of teaching testers, especially early in their careers, that their skills and abilities simply aren’t useful in our industry. The opposite is true — we need more people to show the testing community what skills they have. Only then will we nudge our industry forward and find new ways to solve tricky problems.
uTest: The book also offers career advice on joining social media communities. Why is social media especially an important medium for testers looking to bolster their careers?
RL: Social media is a hugely important medium because it’s a set of communication channels with epic reach. Whether that’s Twitter or Google Plus or Instagram, the chances are that you can find your tribe anywhere in the world and communicate with them.
Discussions about testing are happening every day on social media between lots of people all across the globe. This helps drive forward our understanding about how others are testing and how we can move our industry forward. It’s also democratizing learning sources and information, meaning more free content is available than ever before at the click of a mouse.
Social media also empowers people to find their niche tribes online and create strong relationships. If you’re into test automation, there’s a community for that. If you’re into test automation using a specific tool whilst wearing shorts, there’s most likely a community for that – if not, you can create one.
Social media can also lead people to online communities like uTest, a platform for discussions and learning. It’s also more than a platform for learning — it’s a platform for people to solve tricky testing problems in commercial environments, and it’s the social web that is helping to make these platforms and services viable.
A uTest-exclusive discounted price of $2.99 (originally $8.99) is also available for Rob Lambert’s book, ‘Remaining Relevant and Employable in a Changing World – Testers Edition.’
Iwona Pekala is a Gold-rated, full-time tester on paid projects at uTest, and has been a uTester for over three years. Iwona is also currently serving as a uTest Forums moderator for the second consecutive quarter. She is a fan of computers and technology, and lives in Kraków, Poland.
Be sure to follow Iwona’s profile on uTest as well so you can stay up to date with her activity in the community!
uTest: Android or iOS?
Iwona: Android. I can customize it in more ways than iOS. Additionally, apps have more capabilities, there is a lot of hardware to choose from, and it takes less time to accomplish basic tasks like selecting text or tapping small buttons.
uTest: What drew you into testing initially? What’s kept you at it?
Iwona: I became a tester accidentally. I was looking for a summer internship for computer science students (I was thinking about becoming a programmer), and the first offer I got was for the role of tester. I planned to change roles, and after some time I was indeed transitioned to a developer position. It was uTest that kept me a tester, particularly the flexibility of the work and the variety of projects.
uTest: Which areas do you want to improve in as a tester? Which areas of testing do you want to explore?
Iwona: I need to be more patient and increase my attention to detail. When it comes to hard skills, I would like to gain experience in security, usability and automation testing.
uTest: QA professional or tester?
Iwona: I describe myself as a tester, but those are just words, so it doesn’t really matter what you call that role as long as you know what its responsibilities are.
uTest: What’s one trait or quality you seek in a fellow software testing colleague?
Iwona: Flexibility and the skill of coping with grey areas. As a tester, you need to adapt to changing situations, and you hit grey areas on a daily basis. It’s important to use common sense, but still stay in scope.
You can also check out all of the past entries in our Meet the uTesters series.
Last week, uTest launched two new Platform features for uTesters on paid projects that continue to move the needle in our continuous pursuit of quality (plus a very useful change to existing tester dashboard functionality). Here’s a recap of what is included in the latest uTest Platform release.
Bug Report Integrity
Most testers understand that the role of a bug report is to provide information. A “good” or valuable bug report, however, takes that a step further and provides useful, actionable information in an efficient way. As such, in addition to approving tester issues, Test Team Leads (TTLs) and Project Managers (PMs) now have the ability to rate the integrity of a tester’s bug report by setting the bug report integrity to High, Unrated or Low. By default, all bugs are set to Unrated.
The Bug Report Integrity feature will reward testers who meet a high report integrity standard by providing a positive rating impact to the quality sub-rating. Conversely, we will also seek to educate testers who may be missing the mark by negating any positive impact that may have occurred based on the value of the bug itself.
For more information, please review the Bug Report Integrity uTest University course.
Tester Scorecard
When navigating into a test cycle, you will see a new tab called “Tester Scorecard.” Clicking this tab will bring up a ranked list of testers based on their bug submissions and the final decisions on these bugs — i.e. approvals and rejections.
Points are awarded according to the explanation at the top of the Scorecard and result in a score used to rank testers by performance. You can sort the table by any of its columns. If two testers have identical scores (i.e. the same number of bugs approved at the same value tiers), the tester who started reporting bugs first ranks higher.
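The ordering described above amounts to a simple two-key sort: points descending, with the earlier first bug breaking ties. As a rough illustration (the field names and point values here are invented for the sketch — the real Platform schema is not public):

```python
# Hypothetical sketch of the Scorecard ordering: rank by points (descending),
# break ties by who started reporting bugs first (earliest timestamp wins).
def rank_testers(testers):
    """Return testers ordered by score; earliest first-bug time wins ties."""
    return sorted(testers, key=lambda t: (-t["points"], t["first_bug_time"]))

testers = [
    {"name": "alice", "points": 12, "first_bug_time": "2014-12-02T10:15"},
    {"name": "bob",   "points": 12, "first_bug_time": "2014-12-02T09:40"},
    {"name": "carol", "points": 8,  "first_bug_time": "2014-12-01T08:00"},
]

for t in rank_testers(testers):
    print(t["name"], t["points"])
```

Here alice and bob are tied on points, so bob ranks first because he reported a bug earlier in the cycle.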
Our hope is that this Scorecard will spark some additional competition among top performers and will also be useful for PMs and TTLs to choose testers for participation bonuses. Of course, it is still at the discretion of the TTL or PM to decide who won any bug battles or is eligible for any bonus payments.
Note: Scores indicated on the scorecard do not impact the tester’s rating.
Additionally, there was an improvement to existing functionality within the tester dashboard: pending payouts are now included so that testers can easily see how much they have earned.
If you like what you see, feel free to leave your comments below, or share your ideas on these and other recent platform updates by visiting the uTest Forums. We’d love to hear your suggestions, and frequently share this valuable feedback with our development team for future platform iterations!