
Feed aggregator

User Conference on Advanced Automated Testing (UCAAT), Munich, Germany, September 16-18 2014

Software Testing Magazine - 12 hours 12 min ago
The international ETSI User Conference on Advanced Automated Testing (UCAAT) is a software testing conference dedicated to advanced test automation. Jointly organized by ETSI Technical Committee “Methods for Testing and Specification” (TC MTS) and QualityMinds, a software testing company, the conference introduces the latest innovations in test automation. In the agenda you can find topics like “From Test Legacy to Model-Based Testing – How to refactor an existing test repository into an MBT model?”, “Strategy-driven Test Generation with Open Source Frameworks”, “Property-Based Testing”, “Deploying MBT-based test automation in an ...
Categories: Communities

On-Demand Webinar: 5 Keys to Improving Visibility into Risk

The Seapine View - Fri, 08/22/2014 - 00:02

Thanks to everyone who participated in the “5 Keys to Improving Visibility into Risk” webinar. The webinar recording is now available if you weren’t able to attend or if you would like to watch it again. Additionally, the questions and answers from the webinar follow the video.

Questions & Answers

How should I address concerns internally that improving risk identification and visibility will provide fodder to litigants in the event of a lawsuit?

Lawsuits are a reality of life. Unfortunately, when something goes wrong, some people want to place the blame somewhere besides themselves, even if every reasonable safety precaution was taken by the manufacturer (including written instructions, which are often not read). My take on this issue is that each of us, as a trustee of our company’s intellectual property, should document all of the “reasonable” precautions we take in manufacturing our product. Courts look for negligence. For me, documenting and managing potential risks, and being able to show that I’m doing so, is better than not being able to produce anything showing the safety of those products or, worse, hiding data. Managing and documenting a sound risk management process will help more than it could hurt in any potential litigation. After all, being able to show how you eliminated, mitigated, or chose to accept a potential risk will always look better than having no plan at all. Remember that data is not just something to hit you over the head with. The numbers and data can’t be changed, and they can actually prove your side in a potential litigation.

Shouldn’t risk mitigations also feed into the Product Requirements specification on the schema diagram that was shown?

Yes. Part of a sound risk management strategy is not simply identifying a risk but also mitigating it. Often, adding requirements is a way in which a risk might be mitigated. In addition, analysis of the Product Requirements may in and of itself identify additional risks based on the feature set being implemented.

How hard is it to integrate TestTrack with Safety Assurance Case development tools?

Each integration is different, and the difficulty depends on the tool being integrated with TestTrack and exactly what the integration needs to accomplish. However, TestTrack has a robust API that enables nearly unlimited integration possibilities. If you have a specific integration need, contact our Sales team and they can provide more information about integration services.

Are the Severity field values you showed configurable, to enable organizations to match TestTrack with their risk processes and procedures?

Yes, all TestTrack field names, labels, and values are configurable.

Can RPN factors be tailored to include other factors, such as Detectability of Hazard should the risk be realized?

Yes, the RPN score field shown was built using TestTrack’s calculated fields, so additional factors can be included in the formula. To learn more about calculated fields, browse the related blog posts.
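To make the arithmetic behind such a calculated field concrete, here is a minimal sketch using the common FMEA definition RPN = Severity × Occurrence × Detection. This is plain Python for illustration, not TestTrack’s actual calculated-field syntax, and the 1–10 rating scale is the conventional assumption, not a TestTrack requirement.

```python
# Sketch of the arithmetic an RPN calculated field typically implements.
# Standard FMEA definition: RPN = Severity x Occurrence x Detection,
# where each factor is rated on a 1-10 scale (an assumed convention).

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Return the Risk Priority Number for one failure mode."""
    for factor in (severity, occurrence, detection):
        if not 1 <= factor <= 10:
            raise ValueError("each RPN factor must be rated 1-10")
    return severity * occurrence * detection

print(rpn(8, 3, 5))  # 120
```

Adding a factor such as Detectability of Hazard amounts to multiplying one more rated term into the same product.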

Are dFMEAs and pFMEAs linked or are they both their own risks?

dFMEAs and pFMEAs are their own type of risk. You would do a dFMEA on your designs and a pFMEA on your processes. While they are similar, they each address different types of risks to a system.

Are there Word templates or some other capability to get reports to external clients?

Yes, you can save any of the reports that TestTrack generates as HTML or PDF files and email those to clients. You can also provide limited access to external clients, enabling them to run reports in real-time without accessing other parts of the system.


Categories: Companies

The 10 Hottest Devices for Mobile App Testing

uTest - Thu, 08/21/2014 - 22:05

Testers within our community often want to know which devices they should be testing on. Concurrently, developers also want to know where their beautiful creations should be given the most love.

Thankfully we have a magical data team that can take any request we throw their way, and give us such statistics on the hottest devices requested by our customers.

We sent such a request over to our trusty data team, and magically (for me, anyways, as an English/Communications major), they came back with this list of the 10 most tested mobile devices at uTest. The criteria for this data were the devices (both phones and tablets) on which the most bugs were filed in the past 30 days. Here’s the top 10 in order of popularity:

  1. iPhone 5 (on iOS 6.1 all the way up to 8.0 beta, but those on 7.0+ were tested far more) (phone)
  2. iPhone 4s (on iOS 6.1.2 all the way up to 8.0 beta, but again, only a handful of testers had versions lower than 7.0) (phone)
  3. iPad (late 2012 model & iPad 2) (tablet)
  4. iPhone 5S (on iOS 7.0 up to 8.0 beta) (phone)
  5. Galaxy S4 (phone)
  6. iPhone 4 (on iOS 6.0 up to 7.1.2, but the vast majority of these were tested on 7.0+ versions) (phone)
  7. Galaxy S3 (phone)
  8. iPad Mini (Mostly wifi-only versions) (tablet)
  9. iPad Air (tablet)
  10. Nexus 7 (tablet)

Notables also included the LG Nexus 5, iPhone 5c and the HTC One, despite not making our Top 10. So, testers and devs, these are the devices your counterparts are getting the most out of based on our customer demand.

This of course will always fluctuate as soon as the next biggest and baddest device comes out (I’m lookin’ at you, iPhone 6), which is why we plan to update you every quarter on the hottest devices. On the flipside, as the iPhone 4 can attest, it’s always a safe bet that as long as a device still has widespread adoption “in the wild,” there’s a need for you to test your apps on it.

Categories: Companies

Web Application Testing in .NET

Software Testing Magazine - Thu, 08/21/2014 - 18:21
Web application testing is a rapidly evolving topic, so year by year it is reasonable to enumerate the possible options and re-evaluate the web testing strategy you have chosen for your .NET project. This presentation shares what we have learned about web testing during our projects. It shows strategies and tools that have worked to address the specific aspects of different applications, because there is no one-size-fits-all solution in web testing. You will hear about things like test-driven web development, problems and solutions of unit testing MVC controllers, efficient usage ...
Categories: Communities

Let’s Test Oz, Leura, Australia, September 15-17 2014

Software Testing Magazine - Thu, 08/21/2014 - 17:25
Let’s Test Oz is a three-day software testing conference that takes place in Australia and features a list of international and Australasian speakers. In the program you will find workshops, keynotes, tutorials and sessions. In the agenda you can find topics like “Test Cartography: Using Maps to Guide our Testing Journey”, “The Speed to Cool: Agile Testing and Building Quality In”, “What’s in a Name? Experimenting with Testing Job Titles”, “What do you mean Agile Tester?”, “Deprogramming the Testing Cargo Cult”, “Context-Driven Test Reporting” Web site: http://lets-test.com/?page_id=1863 Location for 2014 conference: Fairmont Resort ...
Categories: Communities

Understanding Application Performance on the Network – Part VIII: Chattiness and Application Windowing

In Part VII, we looked at the interaction between the TCP receive window and network latency. In Part VIII we examine the performance implications of two application characteristics – chattiness and application windowing – as they relate to network latency. Chattiness A “chatty” application has, as an important performance characteristic, a large number of remote requests […]
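The cost of chattiness can be sketched with back-of-the-envelope arithmetic: each request/response turn pays at least one network round trip, so the minimum elapsed time grows linearly with the number of turns. The turn counts and latencies below are illustrative assumptions, not figures from the article.

```python
# Why "chatty" applications suffer on high-latency links: minimum
# elapsed time is bounded below by (application turns) x (round-trip
# time), ignoring server processing and bandwidth entirely.

def min_elapsed_ms(app_turns: int, round_trip_ms: float) -> float:
    """Lower bound on elapsed time contributed by round trips alone."""
    return app_turns * round_trip_ms

# The same 200-turn operation on a 1 ms LAN vs. a 50 ms WAN:
print(min_elapsed_ms(200, 1.0))   # 200.0 (ms)
print(min_elapsed_ms(200, 50.0))  # 10000.0 (ms)
```

The fiftyfold latency increase produces a fiftyfold slowdown, regardless of how fast the server or the pipe is, which is why reducing turns matters more than adding bandwidth for chatty applications.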

The post Understanding Application Performance on the Network – Part VIII: Chattiness and Application Windowing appeared first on Compuware APM Blog.

Categories: Companies

Testers Aren’t Psychic: Six Ways Developers Can Be More Transparent With Testers

uTest - Thu, 08/21/2014 - 15:00

Dear Developers:

Testers aren’t mind readers.  We’re not psychic or telepathic, either.  Please write your test cases and review our bug reports accordingly.

With Love,
Your QA Testers

Many testers dislike seeing the dreaded “Working as Designed” rejection reason on our bug reports because, in many cases, we had no idea it was designed to work that way.  To us, it just seemed like something wasn’t quite right, and it’s our job to tell you these things.

In fact, if a tester who has the perspective of a new user (someone who has no prior experience with what they’re testing and sees the environment in a way a brand-new user would) writes up a bug report on a feature that seems to be broken or working poorly, there’s a good chance new users will also think the same thing. Perhaps you should review the design and user experience to see if there’s a flaw there.

After all, quality assurance is not just about telling developers when a page errors or when a link is broken. It’s also about telling you when your software does not work in such a way that a user will find it to be a useful, quality, worthwhile experience.

It might seem like a pain in the butt, but we’re really just trying to do our jobs and help make your project a complete success.  Really! We’re on the same team, I promise.

When you ask us to test something for you, try and do the following things. They will help us help you and, in turn, your projects:

  1. Provide us with at least a basic understanding of the software, what goals it has to meet, and how users are meant to see and use it. We are your first and last line of defense against poor user experiences.
  2. Give us the design and requirement specifications if you are able to. We can help ensure that they were met and let you know if we see discrepancies.
  3. Converse with us. Frequent communication and discussion helps us understand what you need from us and how we can help you better.
  4. Be detailed, accurate, objective, and constructive when you give us feedback. You don’t like when our bug reports only say, “It’s broken,” and similarly, we don’t like when your rejection notes say nothing but, “Working as intended.”
  5. Share your thoughts, expectations, needs, and wants with us. We want to be able to provide you with these things, but we can’t read your mind. You have to externalize them.
  6. Be proactive. Let us know what we can do for you sooner rather than later so that we can get it done for you when you need it.

Good testers are magical, yes, but we are far from psychic. Help us and we can help you so much more than you dreamed!

Tammy Shipps has experience working with testers as a developer and marketing engineer at Applause. She also is an active uTest Community member and tester herself.

Categories: Companies

The Grocer: Original Software helps drive growth of online

The Grocer is the UK’s leading source of information in the FMCG market. In a recent feature about The Grocer Gold Awards, the judges stated their reasons for selecting Original Software as winner of the Technology of the Year category. They were impressed with testimonials from Original Software’s clients including the Co-operative Food Group, Marston’s […]
Categories: Companies

SkyTap Integration for UrbanCode Deploy

IBM UrbanCode - Release And Deploy - Thu, 08/21/2014 - 12:39

The SkyTap team recently introduced their plugin integration for IBM UrbanCode Deploy. SkyTap is a SaaS-based tool that provides developers and testers with environments in its cloud. The plugin allows you to run an UrbanCode Deploy process that spins up a new SkyTap environment.

SkyTap is particularly interesting as it allows the testing infrastructure to scale with the size of the team, concurrent development tracks, or simply how hot the project is right now. Infrastructure teams can focus on other work, while developers and testers have a key bottleneck removed. With the ability to snapshot and share a full environment, testers can quickly share a working rig or a problematic setup with colleagues to investigate.

If you want to get started with this integration, you should check out the how-to guide put together by SkyTap. It will give you a good sense for what you need to do to take advantage of the integration.

A broader overview is in the PDF solution data sheet.

Categories: Companies

Ranorex among "The 20 Leading Software Testing Providers"

Ranorex - Thu, 08/21/2014 - 10:00
We at Ranorex are pleased to announce that Ranorex has been chosen as one of "The 20 Leading Software Testing Providers 2014".

Test Magazine – specifically created as a voice for the modern-day software professional and always aiming to give a true reflection of the issues affecting this important market – has compiled a list of software testing providers, of which we are honored to be a part.
Categories: Companies

Why are bug reports closed or ignored?

Testlio - Community of testers - Thu, 08/21/2014 - 09:09

Recently, one of our Testlions said that software testers sometimes feel undervalued. Software testing is an important step before launching a new product, but there are things that demotivate testers. Among them is having reported issues closed without being fixed.
So we asked around a little and got some answers on why issues are closed without further action.

Most Testlio clients follow a lean development methodology, and they expect much the same from testing and testers.

Testers are disturbed when an issue is closed without comment or further action. We are not talking about closing issues after they are solved.

There might be several reasons for that:

Low-priority bugs are closed because they are “nice-to-have” instead of “must-fix-now”. Every company needs to decide which features to develop or build next, and priorities are important in order to plan work effectively. But that doesn’t mean the issues weren’t noticed; they might already be on the list. Developing a software product is never done – it needs improvements, planning, and vision.

If an issue is closed because it wasn’t properly documented, it shows that the tester didn’t put enough effort into the explanation. This is a serious mistake on the tester’s part. If the tester didn’t give developers enough information to reproduce the issue, the developers don’t know where to look for a fix. We covered proper bug reporting, using the KISS method, in our last blog post.

Sometimes a reported issue is not a bug but an improvement. A test cycle assignment might request ideas for improvement as well, but most of the time it’s about finding broken behavior and bugs. This is why it’s important to read the assignment and understand the focus of testing during a particular test cycle, to make sure your report is relevant for the current development phase.

Sharing information and talking with other testers on the team helps reduce reports that might not be relevant. Team members should discuss whether and which bugs are important to report and in line with the client’s requirements. Great communication in the team is important for positive peer review and feedback, and for delivering quality results in the end.

If you feel demotivated because an issue was closed with no action taken, remember that your work and passion for testing matter to millions of happy end users out there.

#lovetesting

Written by Kristel – the cofounder of Testlio

Categories: Companies

Software Testing Complete Guide eBook (FREE Download)

Software Testing Zone - Thu, 08/21/2014 - 08:02
I'm glad to announce that we are giving away a FREE copy of the much-awaited eBook "Software Testing Complete Guide" to our readers. The regular price of this testing book is $19.99 USD, but for a limited period we are giving it away absolutely FREE to "Software Testing Tricks" readers.

This comprehensive book will help freshers and newbies in testing learn how to get started, understand various testing concepts, QA methodologies, and fundamentals, and master the day-to-day activities a software tester must know when joining the field.
How to Download the Software Testing Complete Guide eBook for FREE

Step 1: Refer to the "» Download FREE eBook" section on the right sidebar of the site.

Alternately, you can download your FREE “Software Testing Complete Guide” eBook directly by entering your email ID and then clicking the "Download" button below.

Step 2: Once you enter your email ID and click the Download button, you'll receive an email that looks something like the one shown below. It is important that you click the "Yes, Subscribe Me To This List." button to be able to grab your FREE Software Testing eBook.

Step 3: As soon as you click the "Yes, Subscribe Me To This List." button in that email, the FREE eBook link will be emailed to you.

That's it! You can click the download link to get this eBook, worth $19.99 USD, totally free. So what are you waiting for? Go ahead and grab a free copy while it's still FREE. Currently we are giving away a limited number of copies to our readers, and this offer may end soon. So HURRY!
And don't forget to let me know how you felt about this eBook. There are more such interesting testing books that I'm writing right now and your feedback is important to me.
Categories: Blogs

ISTQB Foundation Level Training in Software Testing (CTFL) - Chicago - Sept 16 - 18, 2014

If you are looking for ISTQB Foundation Level Software Testing Training at a reasonable price, in the Chicago area, this is your opportunity!

The class is Sept. 16 - 18, 2014, held in The Regus Offices at One Dearborn St. in downtown Chicago.

The instructor is Thomas Staab, CTFL.

We are only going to hold registrations open for one more week! We will close registrations at 5 p.m. CDT on Wed, Aug 27th.

The class is limited to 10 people. So if you want to get in on this unique opportunity - act fast!

The cost for the class is $2,000 per person, with the exam ($250 value) included in that price. The exam will be given on the final afternoon of the class.

If you bring two people, you can get $500 off your registrations. Just use code "ISTQB916" at check-out time.

You can register here: https://www.mysoftwaretesting.com/ISTQB_Foundation_Level_Course_in_Software_Testing_p/ctflpub.htm
 
To see the outline: http://riceconsulting.com/home/index.php/ISTQB-Training-for-Software-Tester-Certification/istqb-foundation-level-course-in-software-testing.html 

I hope you can make it!

Randy
Categories: Blogs

Hear no Evil, See no Evil, Deploy no Evil

Sonatype Blog - Wed, 08/20/2014 - 16:14
I was going to start off listing a series of what I think are easy questions that I reckon everyone in technology should be able to answer even if they are not or have never been involved with writing software. I gave this some serious thought and decided (perhaps a little arbitrarily) that,...

To read more, visit our blog at blog.sonatype.com.
Categories: Companies

Integrated Pipelines with Jenkins CI

This is part of a series of blog posts in which various CloudBees technical experts have summarized presentations from the Jenkins User Conferences (JUC). This post is written by Félix Belzunce, solutions architect, CloudBees about a presentation given by Mark Rendell, Accenture, at JUC Berlin.

Integrated Pipelines is a pattern that Mark Rendell uses at Accenture to reduce the complexity of integrating different packages when they come from different source control repositories.

The image below, which was one of the slides Mark presented, illustrates the problem of building several packages that need to be integrated at some point. Which build version to use, how to manage the control flow, and what exactly to release are the main pain points when you are working on such an integration.


Mark proposes a solution in which you create not only a CI pipeline but also an integration pipeline to address the problem. To stop displaying all the downstream jobs inside the pipeline, Mark uses a Groovy script. For deploying the right version of the application, several approaches can be used: Maven, Nexus, or even a simple plain text file.
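The plain-text-file option is the easiest to picture: a small manifest, checked in alongside the pipeline, maps each package to the build version the integration pipeline should deploy. The sketch below is a hypothetical illustration in Python (the talk itself used Jenkins and Groovy); the file format and package names are assumptions, not Mark's actual implementation.

```python
# Hypothetical version manifest: one "package=version" entry per line,
# read by the integration pipeline to decide which builds to deploy.

def read_versions(manifest_text: str) -> dict:
    """Parse 'package=version' lines, ignoring blanks and # comments."""
    versions = {}
    for line in manifest_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        package, _, version = line.partition("=")
        versions[package.strip()] = version.strip()
    return versions

manifest = """
# versions promoted by the integration pipeline
frontend=1.4.2
backend=2.0.17
"""
print(read_versions(manifest))  # {'frontend': '1.4.2', 'backend': '2.0.17'}
```

A repository manager such as Nexus serves the same purpose more robustly, but a checked-in file keeps the promoted versions visible and diffable in source control.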


The pattern can scale up, but using the same concept for microservices could be a big challenge, as the number of pipelines grows significantly. As Mark pointed out, it can be applied not only to microservices or applications; the same concept in Jenkins can also be used to manage your infrastructure when you do Continuous Delivery.

You might use similar job configurations across your different pipelines. The CloudBees Templates plugin is useful for templatizing your jobs, saving you time and making the process more reliable. It also lets you make a one-time modification in the template that is automatically pushed to all the jobs, without going from one job to another individually.

View the slides and video from this talk here.



Félix Belzunce
Solutions Architect
CloudBees

Félix Belzunce is a solutions architect for CloudBees based in Europe. He focuses on continuous delivery. Read more about him on his Meet the Bees blog post and follow him on Twitter.
Categories: Companies

Announcing the 2014 uTest Summer Bug Battle Champions

uTest - Wed, 08/20/2014 - 15:34

We’re proud and excited to crown the champions of the 2014 Summer Bug Battle, uTest’s first in almost four years.

If you’ll remember, in this recent edition of the uTest Bug Battle, testers were asked to submit the most impactful Desktop, Web and Mobile bugs from testing tools contained on our Tool Reviews site. After two weeks of heated competition, our Test Team Leads chose the top 10 most impactful finalists from the bunch, and the Community spoke by voting on their favorites from these.

Here’s who YOU, the Community, crowned as the 2014 Summer Bug Battle champions, winners of $1000 in prizes, along with their corresponding winning entries:

1st Prize ($500): Davide Savo, Italy, for his winning entry of a “ZD SOFT Screen Recorder tool bug that would allow a tester to have 2221 days remaining in a Free Trial of its software”

2nd Prize ($300): Iwona Pekala, Poland (Iwona is also a Forums Moderator this quarter!), for her winning entry of an “iTools bug causing a crash when a device is disconnected after recording a video”

3rd Prize ($200): Priyanka Halder, United States, for her winning entry of a bug where “JIRA doesn’t display the proper error message when the URL of an issue is too long”

Honorable Mentions (Winners of $50 each):

  • Ioan Nita, Romania
  • Ronny Sugianto, Indonesia
  • Faye Gu, China
  • Michael Sokolov, United States
  • Marek (Mark) Langhans, Czech Republic
  • Yulia Atlasova, Russia
  • Pankaj Sharma, India

10 Winners of a uTest t-shirt (randomly selected from those who participated):

  • Marius van Rees, The Netherlands
  • Terence Ching, Hong Kong
  • Alexander Silkin, United States
  • Massimo Sartori, Italy
  • Mallikarjun Hassan, India
  • Kirill Bilchenko, Ukraine
  • Vivek Gar, India
  • Pradyumna Dutta, United Kingdom
  • Diego Delgado, Colombia
  • Vanessa Yaginuma, Brazil

A huge congratulations to all of the champions in uTest’s 2014 Summer Bug Battle! Please be sure to send your own congratulations to all of the winners (and see all of the winning entries in their entirety) over at the uTest Forums or in the Comments below.

Beyond our champions seen here, we also thank everyone who made our first battle in a long time such a success, including all entrants, and especially our TTLs for all their hard work in triaging the entries: Peggy Fobbe, Lucas Dargis, Gagan Talwar, Atul Angra and Sonal Modi! The competition wouldn’t have been possible without this outside help, and we are grateful for it!

If you missed out on this one, don’t fear. We’ll see you next time.

Categories: Companies

More schools embracing predictive analytics

Kloctalk - Klocwork - Wed, 08/20/2014 - 15:00

The rise of big data analytics technology revolutionized operations within countless industries, as organizations gained the ability to develop insight to a previously impossible degree. Since then, the field of predictive analytics has taken this technology even further. And while predictive solutions are not yet as widespread as big data analytics, adoption of the former is accelerating quickly.

The extent of this trend can be seen in the field of education. From elementary schools to universities, educational institutions of all kinds are now leveraging predictive analytics capabilities.

K-12 predictions
Writing for The Journal, Vasuki Rethinam recently asserted that school district leaders are increasingly embracing predictive analytics tools in their K-12 schools. He explained that these school districts are using advanced analytics tools to develop new, innovative models and tools that can effectively monitor students' performance, enabling teachers and administrators to craft better learning plans.

More specifically, Rethinam indicated that predictive analytics tools are being used to develop early warning indicators for students who are likely to drop out of school by factoring in such information as grades and attendance records. These tools can also help leaders predict whether a particular child will likely graduate or not as early as grade 9. Additionally, schools increasingly use predictive analytics tools to identify and retain the most skilled, effective instructors within a given district.

All of this insight can have a significant impact on decision-making within school districts, Rethinam explained.

"District leaders can use this information to monitor progress of individuals and groups, identify needs and align resources, develop and evaluate intervention programs to assist students and re-allocate resources to address problems and deficiencies more effectively," he wrote. "Change in policy can be based on sound, objective information developed out of analytics."

Considering the budgetary constraints most public school districts face, the ability to distribute resources in the most efficient, effective way possible is a powerful advantage. However, as Rethinam noted, districts must take a number of steps in order to position themselves to take advantage of this technology.

"School districts must have a reasonably well developed data system with several years of data in various areas to form accurate predictions," the writer observed. "Predictive analytics is ubiquitous throughout the business and actuarial industries but as a specialty practice has gotten traction in the education sector only relatively recently."

One way for districts to harness predictive analytics, then, is to partner with third-party firms that specialize in this technology. Alternatively, Rethinam suggested that schools bring data analytics in-house by recruiting talented professionals who can leverage these resources.

College-level prediction
This trend is perhaps even more widespread at the university level. The Education Advisory Board recently announced that more than 100 institutions are now part of its Student Success Collaborative. This group utilizes a range of technologies along with best practices to help colleges and universities improve their graduation rates and student retention. Predictive analytics lies at the heart of this effort.

For example, Southern Illinois University was able to improve its student retention rate by 3.6 percent by leveraging predictive analytics.

Furthermore, Rethinam noted that many public school districts looking to leverage predictive analytics have partnered with local universities and colleges for this purpose.

Educational organizations are in some ways ideally situated to take advantage of predictive analytics technology, as these institutions inevitably create and have access to robust amounts of information concerning their students. By combining these resources with high-quality predictive analytics tools and best practices, schools of all sizes and levels can improve not only their bottom lines, but also the quality of the learning they provide.

Categories: Companies

Standard. Or Not.

The Social Tester - Wed, 08/20/2014 - 14:04
The international standard for software testing, ISO 29119, is soon to be upon us. People are writing it, expanding it, and creating it right now. I’ve signed the petition against it because I don’t agree with some of it; it’s supposed to represent the industry I work in, and I had no […]
Categories: Blogs

How is Holiday Readiness Testing different?

MindTree Blogs » Testing - Wed, 08/20/2014 - 12:35
Unlike other industries, where performance testing is often an afterthought, the retail industry cannot afford this approach. After all, Black Friday was last holiday season’s first billion-dollar-plus online shopping day – up a hefty 15% over the previous year …
Categories: Companies

Testing Usability Early

Software Testing Magazine - Wed, 08/20/2014 - 11:01
Without a full user-centered process, performing a usability test at the end of the development process merely serves to highlight the unacceptable nature of the design. It’s a sad and frustrating result, but there is usually little that can be done except to release the poor design. Source: Institutionalization of UX, Eric Schaffer & Apala Lahiri, Addison-Wesley
Categories: Communities
