Feed aggregator

How to Create Performance Models using Application Monitoring Data

Dynatrace collects a wealth of monitoring data on applications, and one of its great strengths is that it also provides interfaces that allow external applications to use this information. A recent blog post, for example, showed how you can use Dynatrace data to monitor your entire application landscape across a server farm. […]

The post How to Create Performance Models using Application Monitoring Data appeared first on Dynatrace APM Blog.

Categories: Companies

iOS Instrumentation on Windows Machines

Ranorex - Thu, 03/19/2015 - 11:00
Ranorex 5.3 introduces a whole new process for instrumenting your iOS app directly on the Windows machine you are using for testing. OS X and Xcode are no longer required to instrument your apps – simply use the Ranorex Instrumentation Wizard to instrument your app under test whenever it is updated.



In addition to the simplified instrumentation, Ranorex 5.3 comes with a brand new service app for your iOS device under test. This service lets you start and stop your instrumented apps via WiFi, provides an overview of the instrumented apps on the device, and supports logging for easier analysis of your app under test.

Start your iOS test automation now...
Categories: Companies

Running Unit Tests in LoadRunner just got free-er!

HP LoadRunner and Performance Center Blog - Thu, 03/19/2015 - 09:32

In LoadRunner 12.00, we introduced the Community Edition license, which provides 50 virtual users at no cost, for all protocols except COM/DCOM, Templates and GUI. The idea was that performance engineering practitioners and developers alike could harness the power of LoadRunner without a high cost of entry, and give more value to their organizations by introducing testing earlier in the app lifecycle. So whether you are trying it out, or putting LoadRunner on every tester's and developer's desktop, there is no cost unless you need to scale up. What's more, you can buy virtual users when you are ready to test at larger volumes, starting at just a few cents per virtual user day.

 

The response has been fantastic. The number of downloads per month nearly doubled, and LoadRunner was already consistently one of the most-downloaded products across all of HP Software!

 

Categories: Companies

Making a QA Analyst’s Life Easier: Linking Requirements and Tests in TestTrack

The Seapine View - Thu, 03/19/2015 - 09:30

It’s no secret that we at Seapine Software also use the products we develop. Earlier this week, I was running some tests and had a flashback to the (dark) days before requirements management was introduced in TestTrack. It quickly dawned on me how important it is to manage requirements alongside test cases, test runs, and other artifacts in TestTrack. Plus, it was a good example of how linking and traceability in TestTrack make everyone’s job much easier.

I was running some test runs from a regression suite and came across one that didn’t make sense to me. I decided to check the functional design to see how the feature was actually supposed to work. Since we link test runs with the test case they are generated from, it was easy to go directly from the test run to the related test case. However, when looking at the test case, I realized it was written before we managed requirements in TestTrack. So, I had to hunt for the Microsoft Word design document in Surround SCM. Then, I had to take the time to comb through the Word document to find the specific requirement that described the functionality. Not fun and kind of time-consuming.

Later that same day, I had a similar experience with another test run. However, this time I opened the test case and was happy to see that the test was written after we started managing requirements with TestTrack. Because the test case was linked to the requirement, it was easy to go directly to the exact requirement the test case was written for. I was then able to correct the test case and get back to testing quickly. Wow, that was so much easier!


Linking test cases to their originating requirements really does make a QA analyst’s life easier. I saw firsthand how much time is saved when researching questionable tests, not to mention the frustration avoided, since it was easy to find exactly what I needed.


Categories: Companies

How to use Evernote and Postachio for Product Documentation

The Social Tester - Thu, 03/19/2015 - 09:00

I posted before Christmas about using Evernote and MohioMap to map out the product you’re building. I’ve taken it a step further in this post, where I’ll explain how to use Evernote and Postachio for Product Documentation. So let’s...
Read more

The post How to use Evernote and Postachio for Product Documentation appeared first on The Social Tester.

Categories: Blogs

Testing in production: Points to ponder

HP LoadRunner and Performance Center Blog - Wed, 03/18/2015 - 22:14

Most software development groups base their go/no-go release decision on the results of the tests that were run during product testing.

 

But some organizations take the brave (ahem) step of deferring some – or all – of their testing till the product is deployed, and run their tests on the production system.

 

Continue reading to find out some of the considerations to help you decide whether testing in production is for you or not.

Categories: Companies

Master the Essentials of UI Test Automation Series: Chapter Five

Telerik TestStudio - Wed, 03/18/2015 - 19:00
You’re reading the fifth post in a series that’s intended to get you and your teams started on the path to success with your UI test automation projects: Look Before You Jump. By this point you should have a clear picture of the business-level problems you're hoping to solve, the team you'll need to build, and the tools/infrastructure you'll need in place. Now it's time to stitch everything together and build some tests.
Categories: Companies

Repost: Lessons Learned from Automating iOS Apps – What To Do When Tests Require Camera Roll Resources?

Sauce Labs - Wed, 03/18/2015 - 17:22

This post comes from our friend Jay Sirju at Animoto, who is leveraging Sauce, Appium, and CI methodologies to automate mobile testing for their iOS application.  Check out the original post on their blog.

Here at Animoto, the mobile application development team had spent some time over the past year investigating and implementing CI methodologies into the development cycle of the Animoto Video Maker application for iOS. A major part of this initiative involved creating automated test cases that would run at various times and circumstances.

First, a bit of background. When we started this, we had already implemented a good amount of automation for the Animoto website. We had chosen to use Selenium and ran our automated tests against various browsers using Sauce Labs. We decided to extend our existing infrastructure to support running automated tests against Sauce Labs using the Appium library. For those unfamiliar with mobile testing on Sauce Labs, it uses the iOS and Android simulators to run tests. I know, not ideal, but we can get to that another time.

The Problem

For anyone who has ever launched a fresh iOS Simulator (before Xcode 6), the OS is in its factory state. The Animoto Video Maker App transforms your pictures, video, and text into professional-looking videos… see where I’m going?


A lot of user flows depend on having some photos in the camera roll. A factory-fresh simulator without any photos means there are many flows we can’t automate. Unfortunately, at the time of writing, Sauce Labs does not have a way to upload assets to populate the Camera Roll, and the simulators are reset after executing each test case. Starting with Xcode 6, the Camera Roll does have some images out of the box, but what was really needed were meaningful pictures and videos. So what is needed is a way to populate the Camera Roll while working within the constraints of running tests in Sauce Labs. Well, the app already reads pictures and videos from the Camera Roll; what about writing to it as well?

Adding or Altering Configuration Profile

Before we get to actually populating the Camera Roll, we need a mechanism to ensure that this logic is performed only when the intent is to run automation. Out of the box, Xcode provides three build configurations (Debug, Release, and Distribution). We can edit these configurations, add new ones, or even delete unnecessary ones. In this case, we can simply add a Test configuration to the mix. Once we did that, we were able to change various build settings to help create hooks for app automation. We can start by adding a preprocessor macro for the Test build configuration, so that we can tell the preprocessor when to compile test hooks into the build, as sketched below.
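To make this concrete, here is a hypothetical xcconfig-style fragment for the Test configuration. GCC_PREPROCESSOR_DEFINITIONS is the standard Xcode build setting for preprocessor macros; the TEST=1 definition matches the #if TEST guard shown later in this post:

// Test.xcconfig (hypothetical): define the TEST macro only for the Test configuration
GCC_PREPROCESSOR_DEFINITIONS = $(inherited) TEST=1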


Okay, now we can do some fun stuff with the build. For the sake of brevity, let’s focus specifically on the original issue: Getting pictures and videos into the Camera Roll.

Change Test Configuration Profile Settings

First things first – how does one get pictures and videos up to Sauce Labs? We can simply add them to the iOS project, but that would increase the size of the application bundle regardless of which build configuration is being used. Definitely not ideal. A better choice would be to store them somewhere externally and copy them to the application bundle when the Test configuration is used. This can be done by running a script when building the project.

if [ "${CONFIGURATION}" == "Test" ]; then
    cp -r "${PICTURE_AND_VIDEO_LOCATION}/" "${BUILT_PRODUCTS_DIR}/${PRODUCT_NAME}.app"
fi
Populating the Camera Roll

Now we have an application bundle that contains a bunch of sample pictures and videos. This is great because when the application gets uploaded to Sauce Labs for testing, so does all the sample data. The following code example assumes all the sample images are in a folder within the application bundle named ‘TestImages’:

#import <AssetsLibrary/AssetsLibrary.h>
#import <ImageIO/ImageIO.h>
#import <UIKit/UIKit.h>

@implementation ANTestClass

+ (void)populateCameraRoll
{
    // Locate the bundled sample images copied in by the Test build script.
    NSString *absolutePicturePath = [[NSBundle mainBundle] pathForResource:@"TestImages" ofType:nil];
    NSArray *pictureList = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:absolutePicturePath error:nil];

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // A private serial queue so the writes below happen one at a time.
    dispatch_queue_t queue = dispatch_queue_create("ANTestClass.populateCameraRoll", DISPATCH_QUEUE_SERIAL);

    for (NSString *fileName in pictureList)
    {
        NSString *absolutePictureFilePath = [absolutePicturePath stringByAppendingPathComponent:fileName];

        NSData *jpeg = [NSData dataWithContentsOfFile:absolutePictureFilePath];
        UIImage *image = [UIImage imageWithContentsOfFile:absolutePictureFilePath];
        if (jpeg == nil || image == nil)
        {
            continue; // Skip anything that isn't a readable image.
        }

        // Pull the image metadata (EXIF, etc.) so it is preserved in the Camera Roll.
        CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);
        if (source == NULL)
        {
            continue;
        }
        NSDictionary *imageMetadata = CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(source, 0, NULL));
        CFRelease(source);

        dispatch_async(queue, ^{
            // Block this background queue until the asynchronous write finishes,
            // so images are written to the Camera Roll one by one.
            dispatch_semaphore_t sema = dispatch_semaphore_create(0);
            [library writeImageToSavedPhotosAlbum:[image CGImage]
                                         metadata:imageMetadata
                                  completionBlock:^(NSURL *assetURL, NSError *error)
             {
                 dispatch_semaphore_signal(sema);
             }];
            dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
        });
    }
}
@end

The above code makes a bunch of assumptions. First, it only works for images. A second folder of sample videos can be created and handled with similar logic, using ALAssetsLibrary's corresponding video method (writeVideoAtPathToSavedPhotosAlbum:completionBlock:) in place of writeImageToSavedPhotosAlbum. Next, calling writeImageToSavedPhotosAlbum can indeed fail. For the sake of keeping this example code more readable, error handling was excluded. Retry logic should be included if an error is returned; a sketch follows below.
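As a rough illustration of that retry logic (this is not from the original post; the method name and the caller-supplied retry count are hypothetical):

// Hypothetical retry wrapper: re-attempts the asynchronous write a limited
// number of times if the completion block reports an error.
+ (void)writeImage:(UIImage *)image
          metadata:(NSDictionary *)metadata
           library:(ALAssetsLibrary *)library
       retriesLeft:(NSInteger)retriesLeft
{
    [library writeImageToSavedPhotosAlbum:[image CGImage]
                                 metadata:metadata
                          completionBlock:^(NSURL *assetURL, NSError *error)
     {
         if (error != nil && retriesLeft > 0)
         {
             // The write failed; try again with one fewer retry remaining.
             [self writeImage:image metadata:metadata library:library retriesLeft:retriesLeft - 1];
         }
     }];
}

(In the semaphore-based version above, the semaphore would be signaled once the write either succeeds or runs out of retries.)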

Finally, you may have noticed the use of a semaphore in the example code. Writing images to the Camera Roll is actually an asynchronous call, meaning that the call returns immediately while a separate thread writes the image data to the Camera Roll. The writeImageToSavedPhotosAlbum method can fail if there are too many threads trying to write image data simultaneously. The semaphore is used to ensure that images are written to the Camera Roll sequentially, which makes using the writeImageToSavedPhotosAlbum method much more stable.

Okay, so now all that is left is to call the method when running the Test configuration. This can easily be done using the preprocessor macro setting mentioned above.

#if TEST
[ANTestClass populateCameraRoll];
#endif

It is recommended to call the method somewhere deterministic (i.e., on a button tap). Simply populating the Camera Roll at start-up may break launching Apple's instrumentation library because of the permission alert displayed when the app accesses the Camera Roll for the first time. This lesson was learned the hard way.

This opens up the ability to add more automated test hooks into the application under test, but a word of warning: the more hooks added, the more the test configuration of the app diverges from what is being released to customers.

 

Have an idea for a blog post, webinar, or more? We want to hear from you! Submit topic ideas (or questions!) here.

Categories: Companies

Android: JUnit XML Reports with Gradle

a little madness - Wed, 03/18/2015 - 08:32

The Android development tools project has seen big changes over the last year. The original Eclipse ADT development environment was superseded late last year by Android Studio — a new IDE based on IntelliJ. Under the hood Android Studio also uses a new command line build system based on Gradle, replacing the previous Ant-based system. I’ve been keen to find out how these changes impact the integration of Android test reports with continuous integration servers like Pulse.

Summary
  • Android JUnit Report is redundant.
  • Run on-device Android tests with: ./gradlew connectedAndroidTest
  • Collect reports from: app/build/outputs/androidTest-results/connected/*.xml

 

Details

The original Ant-based build system for Android didn’t produce XML test reports for instrumentation tests (i.e. those that run on-device), prompting me to create the Android JUnit Report project. Android JUnit Report produced XML output similar to the Ant JUnit task, making it compatible with most continuous integration servers. The good news is: Android JUnit Report is now redundant. The new Gradle-based build system produces sane XML test reports out of the box. In fact, they’re even more complete than those produced by Android JUnit Report, so they should work with even more continuous integration servers.

The only downside is the documentation, which is a little confusing (documents for the old system are still floating around) and not very detailed. With a bit of experimentation and poking around I found out how to run on-device (or emulator) tests and where the XML reports are stored. With a default project layout as created by Android Studio:

ASDemo.iml
app/
  app.iml
  build.gradle
  libs/
  proguard-rules.pro
  src/
    androidTest/
    main/
build.gradle
gradle
gradle.properties
gradlew
gradlew.bat
local.properties
settings.gradle

You get a built-in version of Gradle to use for building your project, launched via gradlew. To see available tasks, run:

$ ./gradlew tasks

(This will download a bunch of dependencies when first run.) Amongst plenty of output, take a look at the Verification Tasks section:

Verification tasks
------------------
check - Runs all checks.
connectedAndroidTest - Installs and runs the tests for Debug build on connected devices.
connectedCheck - Runs all device checks on currently connected devices.
deviceCheck - Runs all device checks using Device Providers and Test Servers.
lint - Runs lint on all variants.
lintDebug - Runs lint on the Debug build.
lintRelease - Runs lint on the Release build.
test - Run all unit tests.
testDebug - Run unit tests for the Debug build.
testRelease - Run unit tests for the Release build.

The main testing task, test, does not run on-device tests, only unit tests that run locally. For on-device tests you use the connectedAndroidTest task. Try it:

$ ./gradlew connectedAndroidTest
...
:app:compileDebugAndroidTestJava
:app:preDexDebugAndroidTest
:app:dexDebugAndroidTest
:app:processDebugAndroidTestJavaRes UP-TO-DATE
:app:packageDebugAndroidTest
:app:assembleDebugAndroidTest
:app:connectedAndroidTest
:app:connectedCheck

BUILD SUCCESSFUL

Total time: 33.372 secs

It’s not obvious, but this produces compatible XML reports under:

app/build/outputs/androidTest-results/connected

with names based on the application module and device. In your continuous integration setup you can just collect all *.xml files in this directory for reporting.
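For instance, a CI job step could copy them to wherever the server's JUnit report publisher expects them (the $CI_REPORTS destination here is a hypothetical placeholder):

$ cp app/build/outputs/androidTest-results/connected/*.xml $CI_REPORTS/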

Although the new build system has killed the need for my little Android JUnit Report project, this is a welcome development. Now all Android developers get better test reporting without an external dependency. Perhaps it will even encourage a few more people to use continuous integration servers like Pulse to keep close tabs on their tests!

Categories: Companies

JUC 2015 Call for Paper Deadlines Approaching!

The deadlines to speak at a 2015 Jenkins User Conference are fast approaching. Don’t miss out on this great opportunity to share your Jenkins tips, tricks, stories, and know-how with the community! Submit your proposal by the below deadlines to have your talk considered by a panel of Jenkins experts:

Please note: The deadline to submit a speaking proposal for East Coast US (DC) and Europe (London) is SUNDAY, MARCH 22, 2015. That is only FIVE days away!

2015 JUC Cities & Call for Papers Deadlines

  • East Coast US: Deadline to Submit - March 22, 2015
  • London: Deadline to Submit - March 22, 2015
  • West Coast US (Bay Area): Deadline to Submit - May 3, 2015
  • Israel: Deadline to Submit - May 15, 2015

Not interested in speaking? Contribute to the community in another way: nominate or refer a speaker you would like to hear from at JUC! Contact alytong13@gmail.com or simply become a sponsor.

Categories: Open Source


Load Impact 3.0 Released

Software Testing Magazine - Tue, 03/17/2015 - 18:59
Load Impact 3.0 features a completely redesigned UI as well as lots of cool features that are already implemented in the backend and will be rolled out steadily over the next few weeks. Load Impact 3.0 is ready for teams embracing DevOps and continuous delivery, but we know there is much more to do. Here are some of the changes in Load Impact 3.0: * Test configurations and results: Instead of one big list of test runs and one big list of test configurations, we’ve grouped all test results under their ...
Categories: Communities

Decreasing False Positives in Automated Testing [WEBINAR]

Sauce Labs - Tue, 03/17/2015 - 17:00

False positives: automated testing’s arch nemesis.

When automated tests are written well, they are part of a healthy CI/CD process that can save developer time and company money. But when a team gets false positives from unreliable tests, the entire build can get derailed. What’s worse, too many false positives can erode an organization’s belief in the value of using a test automation framework at all.

In this webinar, Anand Ramakrishnan of QASource will walk you through what false positives are and provide key strategies to reduce or eliminate them.

Anand will cover:

• What false positives are and why they occur

• Common causes and the challenges they create

• How to implement key strategies to reduce them

• Why implementing these strategies is essential for increasing productivity and reducing cost, as well as time to market

• Case studies, real world examples, a live demo and Q&A

Join us for this presentation on Tuesday, March 24th at 11am PDT/2pm EDT. There will be a Q&A following the presentation.

Click HERE to register today.

Categories: Companies

The Essential Omni Channel User Experience Measurement Index [VIDEO]

In the past we looked at the page load times of our desktop browser applications, and we used concepts like APDEX to find out what the user experience looks like; but a lot has changed since APDEX was defined. At Velocity last November I presented that APDEX is dead, and W3C […]

The post The Essential Omni Channel User Experience Measurement Index [VIDEO] appeared first on Dynatrace APM Blog.

Categories: Companies

uTest Announces Collaboration With expo:QA, Conference Discount

uTest - Tue, 03/17/2015 - 15:30

uTest is happy to announce its collaboration with expo:QA 2015, the meeting point for software testing and quality professionals in Spain and Europe.

The 2015 edition of the conference will be held June 8-11, 2015, in Madrid, Spain. expo:QA has successfully consolidated itself as the biggest and most prestigious event in southern Europe dedicated to this field.

The schedule once again is packed with keynotes from notable speakers including David Evans, Zeger Van Hese and Cesario Ramos, and diverse sessions including The Agile Testing Mindset, The Internet of Everything – How Will We Test it?, and Security Testing in Mobile Applications.

As part of this collaboration, uTest has secured an exclusive 15% discount off the already low early bird price for all conference/tutorial packages. Please email testers@utest.com for an exclusive discount code (available to uTesters only for one-time use) to use at registration checkout. Additionally, as an added bonus to uTesters, the first five to register with the discount will receive free admission to the conference networking dinner, a 70 Euro value!

For more information on expo:QA including a full schedule of all sessions and registration, visit the conference listing on the uTest Events Calendar or the official expo:QA 2015 website.

Not yet a uTester? Sign up for free today to gain access to exclusive tester discounts to events like this one, free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros.

The post uTest Announces Collaboration With expo:QA, Conference Discount appeared first on Software Testing Blog.

Categories: Companies

ELEKS - New European Service Partner

Ranorex - Tue, 03/17/2015 - 11:25
As our customer base in Europe and especially the UK is constantly growing, we have an increasing demand for Ranorex consulting and implementation services. In order to ensure that our clients have best-in-class service available, we have selected ELEKS as a Service Partner for the UK.

ELEKS is a global organization providing software engineering, technology consulting and quality assurance services. The company delivers innovative, reliable and award-winning solutions that support its customers' business growth, including Data Science, Mobility, Digital Production and Financial solutions. Since 1991, ELEKS software solutions have contributed significantly to the success of the company's customers, including Fortune 500 companies, and are recognized as a valuable part of international best practices.

Named a Top 100 Global Outsourcing Company by the International Association of Outsourcing Professionals® (IAOP®), ELEKS is a strategic partner with the proven ability to address the customers' most pressing needs. The company's delivery organization, consisting of approximately 1,000 professionals in Eastern Europe, is strengthened by a local presence in Europe and the UK.
Categories: Companies

Beer and Product Development: 11 Great Pairings

The Seapine View - Tue, 03/17/2015 - 10:30

***!!!DISCLAIMER!!!***

Please note that drinking of any kind while working is in direct contradiction to many companies’ employee manuals. Please refrain from drinking while working, or at least hide it well. This blog article is meant purely in jest and should not be taken seriously. In fact, I recommend you never drink, ever.

TestTrack Pairings

In recent years, drinking beer has become as common and sophisticated as drinking wine. There are microbreweries popping up all over the place. The influx of beer lovers, beer aficionados, and beer experts has led to an influx of beer styles, and along with different styles have come different pairings of these styles with meals. For example, one might like a hearty porter with barbecued meat or a hoppy IPA with a spicy curry.

All of this talk makes me think, “What beers go well with TestTrack and product development?” I’ve compiled a list of common beer styles and common product release tasks below. While I do enjoy a good beer and I’ve even brewed a few of my own, I do not consider myself an expert nor do I see myself as the definitive authority on these pairings.


Beer / Seapine Activity / Pairing Notes

  • American Lager / Keeping track of defect counts (did I mention I recently found my 1000th defect?) / American Lagers are often valued for quantity over quality. Oddly enough, this is my least favorite beer style.
  • IPA / Developer Testing / The hoppy aroma will have you hopping in your seat while testing new features!
  • Brown Ale / Sales Meeting / Brown ales have often been considered the ‘sweet teas’ of beer. Great for keeping focus on the goals while allowing one to relax, just a little.
  • WitBier / Company Meeting / The smooth taste of a wheat beer goes nicely with a company meeting, to discuss the future. Add a slice of orange, and this may help one think of important questions to ask.
  • Belgian Style Ale / Marketing Meetings / Wikipedia says beer in Belgium varies from pale lager to lambic beer and Flemish red. Is it a mere coincidence that the majority of people on Seapine’s Finance and Marketing teams have red or light-colored hair (blonde or graying)? I think not.
  • Doppelbock / Executing Automated Tests / These beers tend to be higher in alcohol, so it’s probably best not to work much when drinking. Sit back, pop open a doppelbock, and let automation do your work for you!
  • Coffee Stout / Writing Functional Designs / Stouts often have a roasted or coffee-like taste. It’s said that you should drink beer for good ideas and coffee to get them done. Stout seems relevant as a segue to coffee and code.
  • Scottish Ale / Test Case Writing / Test case writing is a gritty process that often involves writing functional steps for scenarios that do not yet exist. The mix between heavy and light beers provides the perfect balance needed to tackle this task!
  • Porter / Defect Triage / You need a good strong beer to get through this, even on the best of days.
  • Irish Red Ale / Blog Post Writing / Honestly, how do you think I came up with this post idea? Great things happen with Irish red ales!
  • English Bitter / Release Retrospective / It’s strong, doesn’t pull punches and really helps give one a great perspective on life, much like a release retrospective.

So, how about you? What is your favorite beer for daily tasks? This might be a great topic to bring up with your manager at a yearly review. As always, enjoy!

Cheers!


Categories: Companies

How to compare two PDF files with ITextSharp and C#

Testing tools Blog - Mayank Srivastava - Tue, 03/17/2015 - 09:06
I struggled a lot to compare two PDF files and display the differences. Finally I came up with an approach where I extract all the text from the PDF files, split it into lines, compare line by line, and show the differences. So if you have the same kind of requirement, you can use the code below to resolve it. […]
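Since the post is truncated here, the following is a minimal sketch of the approach it describes, using iTextSharp 5.x in C#. The class name, helper name, and output format are illustrative choices, not necessarily the post's own:

// Extract text per page with iTextSharp, split into lines, then compare
// the two line lists and print any mismatches.
using System;
using iTextSharp.text.pdf;
using iTextSharp.text.pdf.parser;

class PdfDiff
{
    // Extract all text from a PDF and split it into lines.
    static string[] ExtractLines(string path)
    {
        var reader = new PdfReader(path);
        try
        {
            var text = new System.Text.StringBuilder();
            for (int page = 1; page <= reader.NumberOfPages; page++)
            {
                text.Append(PdfTextExtractor.GetTextFromPage(reader, page)).Append('\n');
            }
            return text.ToString().Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries);
        }
        finally
        {
            reader.Close();
        }
    }

    static void Main(string[] args)
    {
        string[] left = ExtractLines(args[0]);
        string[] right = ExtractLines(args[1]);

        // Walk both line lists in parallel and report differences.
        int max = Math.Max(left.Length, right.Length);
        for (int i = 0; i < max; i++)
        {
            string l = i < left.Length ? left[i].Trim() : "(missing)";
            string r = i < right.Length ? right[i].Trim() : "(missing)";
            if (l != r)
            {
                Console.WriteLine("Line {0}:\n  file1: {1}\n  file2: {2}", i + 1, l, r);
            }
        }
    }
}

Note that this naive line-by-line comparison reports every subsequent line as different once one file gains an extra line; a proper diff algorithm would handle insertions and deletions more gracefully.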
Categories: Blogs

Exploratory Testing 3.0

DevelopSense Blog - Tue, 03/17/2015 - 06:09
This blog post was co-authored by James Bach and me. In the unlikely event that you don’t already read James’ blog, I recommend you go there now. The summary is that we are beginning the process of deprecating the term “exploratory testing”, and replacing it with, simply, “testing”. We’re happy to receive replies either here […]
Categories: Blogs

Test Automation Doesn’t Mean You Need Less Testers

Software Testing Magazine - Mon, 03/16/2015 - 21:34
It’s commonly said that test automation means you need fewer testers on the team, that it speeds up the testing process, and that it allows more time for exploratory testing. In this talk Richard shares his critique of these common outcomes by calling upon his experiences of working in and managing teams where automation has played a crucial part in the testing approach and has been used to great effect, but hasn’t resulted in the above outcomes. This presentation explains why he believes these common misconceptions of automation are unfounded and gives reasoning as ...
Categories: Communities
