
Feed aggregator

This Week in Software Testing Fails: You’re In! (Or Not)

uTest - Thu, 02/19/2015 - 21:36

You’re a senior in high school, eagerly opening the mailbox in anticipation of that college admissions letter. The day finally comes, and you tear open the huge envelope, hoping for the good news inside.

It does. You have been accepted to your dream school (says the admissions letter, anyway)!

Except sometimes that letter turns out to be a soul-crusher of dreams, as it did for 800 prospective students applying to Carnegie Mellon’s top-ranked master’s in computer science program. According to the Washington Post, a glitch in the school’s computer systems allowed an email to go out mistakenly informing these students that they had been accepted.

Like in the film Office Space, however, school officials soon “fixed the glitch,” and quickly followed with a second (far less congratulatory) email:

“Earlier this morning, we mistakenly sent you an offer of admission to Carnegie Mellon’s MS in CS program. This was an error on our part. While we certainly appreciate your interest in our program, we regret that we are unable to offer you admission this year.”

Ouch. After that rollercoaster of painful emotions (one student had already gone out for a celebratory dinner with his family), could these kids at least have a courtesy admission, please?

I’d say some testing of the admissions email procedures and programs would be a welcome (and warranted) step before the next admissions season.

Not a uTester yet? Sign up today to comment on all of our blogs like this one, and gain access to free training, the latest software testing news, opportunities to work on paid testing projects, and networking with over 150,000 testing pros. Join now.

Categories: Companies

Skytap Offers Test Environments On SoftLayer

Software Testing Magazine - Thu, 02/19/2015 - 19:49
Skytap, a leading provider of on-demand Environments-as-a-Service (EaaS), has introduced Skytap running on SoftLayer infrastructure. This new offering enables enterprises to rapidly deploy software development and test environments, and to seamlessly migrate entire complex environments between Skytap and IBM cloud centers globally. Skytap’s EaaS environment allows customers to rapidly develop and test even their most complex applications in the cloud, in a fraction of the time required with traditional methods. Built on the SoftLayer infrastructure, customers can use Skytap to drive a fully integrated software development experience and speed collaboration among globally ...
Categories: Communities

Development Challenges on the Internet of Things

The Kalistick Blog - Thu, 02/19/2015 - 19:45

Recently, I had the privilege of hosting a webinar with Chris Rommel of VDC Research discussing common development challenges for Internet of Things (IoT) systems. We had many interesting questions during the event—far more than we had time to address—and I’d like to use this post to address some of the common themes and continue the discussion.

Many development teams struggle to assemble their systems from a variety of different components, tools and frameworks. This is an unfortunate reality of today’s immature IoT development ecosystem. The ecosystem is in a state of flux, as the silos separating mobile, embedded, and server/cloud development slowly break down. As developer skill sets expand, tools, frameworks and standards will emerge to support these efforts.

One thing I find interesting is that this evolution of the development landscape is not really creating new problems. Rather, it is exposing developers to problems and capabilities that may be well known in other domains. For example, enterprise and web developers are very familiar with the need for robust security against local and remote attackers. IoT development is expanding the scope of those concerns: embedded, device, and mobile developers need to start considering security challenges during development as well. Those security challenges are not new; the connectedness of the IoT is simply forcing all developers to worry about security now.

Worse, the existing development silos mean that many developers don’t have access to expertise that will help them understand how to address their security problem—they simply don’t know what they don’t know. IoT teams will need to promote cross-pollination of expertise to ensure developers are able to adequately understand and address the challenges they face.

In the webinar we discussed the growing reliance on open source and other third-party components, especially in areas like embedded systems where open source has traditionally not had a large presence. Embedded development often involves specialized requirements and non-standard, purpose-built operating environments. Moreover, there has often been a lack of trust in open source code’s ability to operate within embedded systems’ constraints. These factors made it extremely difficult to provide commercial products and support for embedded developers, and the lack of commercial support further limited adoption.

The challenges of embedded development are not significantly different today, but there are now companies able to successfully provide and support mature open source and commercial embedded components. This is important because devices in IoT systems are growing more complex. For example, a device that used to operate independently may need to add network connectivity for telematics applications. Devices that used to run with a simple super loop or homegrown operating system/executive are increasingly adopting open source and commercial operating systems with sophisticated device support and full communication stacks, because it just doesn’t make sense to re-implement these sophisticated components.

Adopting third-party components allows development teams to offload development burden and leverage mature solutions, but introduces new complexities since these components need to be maintained and updated to address problems such as security vulnerabilities that might be discovered in the future. There is no simple solution; teams need to decide how best to address these complexities. In some cases they might need to fork the code and maintain it themselves; in other cases, they may monitor open source development and update components as needed.

All code, whether it is developed internally or integrated from a third party, comes with risk. Much of that risk can be identified through testing and static analysis, but those approaches may be overlooked or infeasible with third-party code. Moreover, they can be problematic when used on the interfaces to third-party code. That is, not only may you not know about problems in the third-party code, but you might also not know how to test whether you are using such components correctly. Those problems obviously include coding problems, but open source components can also yield IP compliance risks in cases where the lineage or licensing of acquired code might be unknown. In other words, do not make the decision to adopt third-party code lightly. Understand that you are trading some up-front cost savings for increased testing, maintenance and compliance costs.

Finally, I’d like to address the theme of concurrency in the IoT. This is one area with very wide variability. Certain applications, such as those covering industrial or transportation systems, might need to maintain strict connectivity and latency thresholds. Other systems, such as consumer wearables, might be extremely tolerant of delays and lost data. Regardless of the application requirements, IoT systems must be designed to fulfill those requirements in all reasonable scenarios. That might mean adding components to proactively monitor and manage the system: for example, degrading expectations or fidelity so that requirements can still be met with the available bandwidth, or automatically spinning up more server instances to keep up with demand spikes. Many teams have to solve these challenges on their own today, but I suspect that future components and frameworks will begin to provide help on this front.

Where do you think IoT development is headed? I’d love to continue this discussion in the comments below. And, in case you missed it, please watch the “Big Data, Big Testing and the IoT: Overcoming Development Challenges” webinar on-demand.

The post Development Challenges on the Internet of Things appeared first on Software Testing Blog.

Categories: Companies

Reliable database tests with Respawn

Jimmy Bogard - Thu, 02/19/2015 - 19:21

Creating reliable tests that exercise the database can be a tricky beast to tame. There are many different sub-par strategies for doing so, and most of the documented methods talk about resetting the database at teardown, either using rolled back transactions or table truncation.

I’m not a fan of either of these methods. For truly reliable tests, the fixture must have a known starting point at the start of the test, not rely on something cleaning up after itself. When a test fails, I want to be able to examine the data during or after the test run.

That’s why I created Respawn, a small tool to reset the database back to its clean beginning. Instead of using transaction rollbacks, database restores or table truncations, Respawn intelligently navigates the schema metadata to build out a static, correct order in which to clear out data from your test database, at fixture setup instead of teardown.

Respawn is available on NuGet, and can work with SQL Server or Postgres (or any ANSI-compatible database that supports INFORMATION_SCHEMA views correctly).

You create a checkpoint:

private static Checkpoint checkpoint = new Checkpoint
{
    TablesToIgnore = new[]
    {
        "sysdiagrams",
        "tblUser",
        "tblObjectType",
    },
    SchemasToExclude = new []
    {
        "RoundhousE"
    }
};

You can supply tables to ignore and schemas to exclude for tables you don’t want cleared out. In your test fixture setup, reset your checkpoint:

checkpoint.Reset("MyConnectionStringName");

Or if you’re using a database besides SQL Server, you can pass in an open DbConnection:

using (var conn = new NpgsqlConnection("ConnectionStringName"))
{
    conn.Open();

    var checkpoint = new Checkpoint {
        SchemasToInclude = new[]
        {
            "public"
        },
        DbAdapter = DbAdapter.Postgres
    };

    checkpoint.Reset(conn);
}

Because Respawn stores the correct SQL, in the right order, to clear your tables, you don’t need to maintain a list of tables to delete or recalculate the order on every checkpoint reset. And since table truncation won’t work on tables referenced by foreign key constraints, Respawn issues DELETEs instead, which are also typically faster than truncation for the small data volumes in test databases.
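
As a rough illustration of the idea (not Respawn’s actual implementation, which reads the schema metadata from INFORMATION_SCHEMA views), determining a safe delete order is essentially a topological sort of the foreign-key graph, clearing child tables before the tables they reference. The table names here are made up for the sketch:

```csharp
// Hedged sketch: computing a safe DELETE order from foreign-key
// relationships. Table names and graph shape are illustrative only.
using System;
using System.Collections.Generic;

class DeleteOrderSketch
{
    // Map: table -> tables that reference it via foreign keys.
    static readonly Dictionary<string, string[]> ReferencedBy =
        new Dictionary<string, string[]>
        {
            { "Customers",  new[] { "Orders" } },
            { "Orders",     new[] { "OrderLines" } },
            { "OrderLines", new string[0] },
        };

    static void Visit(string table, HashSet<string> seen, List<string> order)
    {
        if (!seen.Add(table)) return;
        // Clear referencing (child) tables first so no FK ever dangles.
        foreach (var child in ReferencedBy[table])
            Visit(child, seen, order);
        order.Add(table);
    }

    static void Main()
    {
        var seen = new HashSet<string>();
        var order = new List<string>();
        foreach (var table in ReferencedBy.Keys)
            Visit(table, seen, order);

        foreach (var table in order)
            Console.WriteLine("DELETE FROM [" + table + "]");
        // Prints OrderLines, then Orders, then Customers.
    }
}
```

The point of caching this order in the checkpoint is that the (relatively expensive) schema walk happens once; every subsequent Reset just replays the same DELETE statements.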

We’ve used this method at Headspring for the last six years or so, battle tested on a dozen projects we’ve put into production.

Stop worrying about unreliable database tests – respawn at the starting point instead!


Categories: Blogs

Registration is Now Open for uTest’s ‘Introduction to Android Testing’ Webinar

uTest - Thu, 02/19/2015 - 18:29

uTest University is happy to announce that registration is now open for our next live webinar opportunity.

Hot on the heels of the recent “Introduction to Security Testing” and “Build the ‘right’ regression suite using Behavior-Driven Testing (BDT)” webinars, uTest University is offering a chance for testers to get familiar with Android testing. The webinar is taught by Iwona Pekala, a Gold-rated uTester and frequent contributor to the uTest Forums.

In this webinar, participants will learn how to:

  • Prepare your mobile device and PC for testing
  • Install applications
  • Record videos and take screenshots
  • Collect logs
  • Get information about the types of crashes

Webinar Details

  • What: A live webinar presented by Iwona Pekala called “Introduction to Android Testing”
  • When: Tuesday, March 3, 2015 from 1:00 p.m. to 2:00 p.m. EST
  • How: Register now. Seats are limited!

About Iwona 

Iwona holds a Master’s degree in applied computer science and has been a professional software tester since 2007. Previously, she worked at IBM as a tester and developer. A uTester since 2011, she has more than three years of experience as an Android tester and two years of experience as an Android developer.

Iwona is also a course author in uTest University. See her course called How to Set Up and Use Mobizen Screen Recording.

About uTest University

uTest University is free to all members of the uTest Community. We are constantly adding to our course catalog to keep you educated on the latest topics and trends. If you are an expert in UX, load & performance, security, or mobile testing, you can share your expertise with the community by authoring a University course. Contact the team at university@utest.com for more information. Not a uTester yet? Sign up today!

Categories: Companies

Romania Testing Conference 2015 Call for Papers

Software Testing Magazine - Thu, 02/19/2015 - 16:50
The Romania Testing Conference is a two-day event focused on software testing that will take place in May in Cluj-Napoca, Romania. The theme for the 2015 Romania Testing Conference is communities of practice: share your experience from your local communities, how a group of people influenced your work, or how you managed to influence other people’s work. Beginning speakers are encouraged to submit proposals for this conference. The deadline to submit a proposal is February 27th, 2015. Visit http://www.romaniatesting.ro/call-for-papers/ for more information.
Categories: Communities

GTAC 2014 Coming to Seattle/Kirkland in October

Google Testing Blog - Thu, 02/19/2015 - 15:22
Posted by Anthony Vallone on behalf of the GTAC Committee

If you're looking for a place to discuss the latest innovations in test automation, then charge your tablets and pack your gumboots - the eighth GTAC (Google Test Automation Conference) will be held on October 28-29, 2014 at Google Kirkland! The Kirkland office is part of the Seattle/Kirkland campus in beautiful Washington state. This campus forms our third largest engineering office in the USA.



GTAC is a periodic conference hosted by Google, bringing together engineers from industry and academia to discuss advances in test automation and the test engineering computer science field. It’s a great opportunity to present, learn, and challenge modern testing technologies and strategies.

You can browse the presentation abstracts, slides, and videos from last year on the GTAC 2013 page.

Stay tuned to this blog and the GTAC website for application information and opportunities to present at GTAC. Subscribing to this blog is the best way to get notified. We're looking forward to seeing you there!

Categories: Blogs

GTAC 2014: Call for Proposals & Attendance

Google Testing Blog - Thu, 02/19/2015 - 15:21
Posted by Anthony Vallone on behalf of the GTAC Committee

The application process is now open for presentation proposals and attendance for GTAC (Google Test Automation Conference) (see initial announcement) to be held at the Google Kirkland office (near Seattle, WA) on October 28 - 29th, 2014.

GTAC will be streamed live on YouTube again this year, so even if you can’t attend, you’ll be able to watch the conference from your computer.

Speakers
Presentations are targeted at student, academic, and experienced engineers working on test automation. Full presentations and lightning talks are 45 minutes and 15 minutes respectively. Speakers should be prepared for a question and answer session following their presentation.

Application
For presentation proposals and/or attendance, complete this form. We will be selecting about 300 applicants for the event.

Deadline
The due date for both presentation and attendance applications is July 28, 2014.

Fees
There are no registration fees, and we will send out detailed registration instructions to each invited applicant. Meals will be provided, but speakers and attendees must arrange and pay for their own travel and accommodations.

Update: Our contact email was bouncing - this is now fixed.



Categories: Blogs

The Deadline to Sign up for GTAC 2014 is Jul 28

Google Testing Blog - Thu, 02/19/2015 - 15:21
Posted by Anthony Vallone on behalf of the GTAC Committee

The deadline to sign up for GTAC 2014 is next Monday, July 28th, 2014. There is a great deal of interest in both attending and speaking, and we’ve received many outstanding proposals. However, it’s not too late to add yours for consideration. If you would like to speak or attend, be sure to complete the form by Monday.

We will be making regular updates to our site over the next several weeks, and you can find conference details there:
  developers.google.com/gtac

For those that have already signed up to attend or speak, we will contact you directly in mid August.

Categories: Blogs

Announcing the GTAC 2014 Agenda

Google Testing Blog - Thu, 02/19/2015 - 15:20
by Anthony Vallone on behalf of the GTAC Committee

We have completed selection and confirmation of all speakers and attendees for GTAC 2014. You can find the detailed agenda at:
  developers.google.com/gtac/2014/schedule

Thank you to all who submitted proposals! It was very hard to make selections from so many fantastic submissions.

There was a tremendous amount of interest in GTAC this year with over 1,500 applicants (up from 533 last year) and 194 of those for speaking (up from 88 last year). Unfortunately, our venue only seats 250. However, don’t despair if you did not receive an invitation. Just like last year, anyone can join us via YouTube live streaming. We’ll also be setting up Google Moderator, so remote attendees can get involved in Q&A after each talk. Information about live streaming, Moderator, and other details will be posted on the GTAC site soon and announced here.

Categories: Blogs

GTAC 2014 is this Week!

Google Testing Blog - Thu, 02/19/2015 - 15:20
by Anthony Vallone on behalf of the GTAC Committee

The eighth GTAC commences on Tuesday at the Google Kirkland office. You can find the latest details on the conference at our site, including speaker profiles.

If you are watching remotely, we'll soon be updating the live stream page with the stream link and a Google Moderator link for remote Q&A.

If you have been selected to attend or speak, be sure to note the updated parking information. Google visitors will use off-site parking and shuttles.

We look forward to connecting with the greater testing community and sharing new advances and ideas.

Categories: Blogs

GTAC 2014 Wrap-up

Google Testing Blog - Thu, 02/19/2015 - 15:19
by Anthony Vallone on behalf of the GTAC Committee

On October 28th and 29th, GTAC 2014, the eighth GTAC (Google Test Automation Conference), was held at the beautiful Google Kirkland office. The conference was completely packed with presenters and attendees from all over the world (Argentina, Australia, Canada, China, many European countries, India, Israel, Korea, New Zealand, Puerto Rico, Russia, Taiwan, and many US states), bringing with them a huge diversity of experiences.


Speakers from numerous companies and universities (Adobe, American Express, Comcast, Dropbox, Facebook, FINRA, Google, HP, Medidata Solutions, Mozilla, Netflix, Orange, and University of Waterloo) spoke on a variety of interesting and cutting edge test automation topics.

All of the slides and video recordings are now available on the GTAC site. Photos will be available soon as well.


This was our most popular GTAC to date, with over 1,500 applicants and almost 200 of those for speaking. About 250 people filled our venue to capacity, and the live stream had a peak of about 400 concurrent viewers with 4,700 playbacks during the event. And, there was plenty of interesting Twitter and Google+ activity during the event.


Our goal in hosting GTAC is to make the conference highly relevant and useful not only for attendees, but for the larger test engineering community as a whole. Our post-conference survey shows that we are close to achieving that goal:



If you have any suggestions on how we can improve, please comment on this post.

Thank you to all the speakers, attendees, and online viewers who made this a special event once again. To receive announcements about the next GTAC, subscribe to the Google Testing Blog.

Categories: Blogs

The Art of DevOps: An Introduction to the Landscape

Welcome to my four-part series on what I’m going to call the Art of DevOps. We will embark on a mission to reveal the extremely valuable intelligence that’s been collected about a unique strategy to continuously deliver assets to the operational battleground safely, securely and quickly. This strategy drives optimal monitoring of the frontlines […]

The post The Art of DevOps: An Introduction to the Landscape appeared first on Dynatrace APM Blog.

Categories: Companies

Synchronization or Wait in Selenium WebDriver (C#)

Testing tools Blog - Mayank Srivastava - Thu, 02/19/2015 - 13:46
Synchronization (waiting) helps handle timing dependencies while executing a script, because the tool’s execution speed doesn’t always match the application’s speed, or a web element’s response time doesn’t match the script’s actions. To handle synchronization, Selenium WebDriver provides two effective mechanisms: implicit waits and explicit waits. An implicit wait tells […]
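
The excerpt cuts off, but the two mechanisms can be sketched in C# roughly as follows (the URL and element ID are illustrative assumptions, not from the post):

```csharp
// Sketch of Selenium WebDriver's two wait mechanisms in C#.
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;
using OpenQA.Selenium.Support.UI;

class WaitExamples
{
    static void Main()
    {
        IWebDriver driver = new FirefoxDriver();

        // Implicit wait: set once, applies globally. Every FindElement
        // call polls for up to 10 seconds before throwing
        // NoSuchElementException.
        driver.Manage().Timeouts().ImplicitlyWait(TimeSpan.FromSeconds(10));

        driver.Navigate().GoToUrl("http://example.com/login");

        // Explicit wait: applies to one specific condition. Polls until
        // the lambda returns a value (or the 15-second timeout elapses).
        var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(15));
        IWebElement button = wait.Until(d => d.FindElement(By.Id("submit")));
        button.Click();

        driver.Quit();
    }
}
```

The rule of thumb is that implicit waits cover the common case cheaply, while explicit waits let you wait for a precise condition (visibility, clickability, text) on a per-element basis.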
Categories: Blogs

Reminder for Workshops in March

Ranorex - Thu, 02/19/2015 - 12:20
We are very pleased to remind you about our upcoming (online) Ranorex training courses that are scheduled for March.


Get firsthand training with Ranorex professionals and learn how to get the most out of Ranorex Studio and the Ranorex Test Automation Tools at one of these workshops.

Look at the schedules for additional workshops in the next few months.

We look forward to seeing you there!
Categories: Companies

Announcing QA Wizard Pro 2015

The Seapine View - Thu, 02/19/2015 - 09:30

Have you heard the news? We recently released QA Wizard Pro 2015, the latest version of our automated functional testing and load testing tool. This version introduces new functionality for setting a main script for the workspace and quickly running it without opening it first, modifying linked Excel datasheets, and adding additional files to batches, as well as lots of new statements to enhance standard and load test scripts.

Setting and running a main script

Set the main script for a workspace if you need to quickly run a frequently used script or a script used to run a complete test suite. When a main script is set, you can run or debug it without opening it first, which eliminates the need to switch focus to the main script if you are working with another script or file in the Scripts pane.

Datasheet enhancements

You can now modify linked Excel datasheets directly in QA Wizard Pro and from scripts, which makes it easy to sync changes between QA Wizard Pro and the Excel data source.

The new GetDataSourceColumnNames and GetRecordsetColumnNames statements are also available to retrieve column names from data sources and recordsets.

Add additional files to batches

You can now add called scripts or datasheets that are accessed using variables or other expressions to batch files, making the dependent files available when the batch runs.

Perform new actions during script playback

Use the following new statements to perform actions during playback.

  • CreateGUID creates a globally unique identifier (GUID).
  • CreateTempFile creates a temporary file.
  • IgnoreErrors stops displaying known errors in the Errors pane or in run reports.
Send more web requests during load tests

Use the following new statements to identify virtual users and send additional requests to web servers during load tests: GetVirtualUserID, WebDelete, WebHead, WebOptions, WebPatch, WebPatchFromFile, WebPatchJSON, WebPostJSON, WebPut, WebPutFromFile, and WebPutJSON.

For a full list of features, enhancements, and bug fixes included in QA Wizard Pro 2015, check out the release notes.

To learn more about QA Wizard Pro, visit www.qawizard.com/software-testing-tools/automated-testing.


Categories: Companies

Why that Way?

Hiccupps - James Thomas - Thu, 02/19/2015 - 09:01
Most working days I go for a walk round the block after I've eaten my dinner. As I leave our office I've got two primary choices: lift or stairs. I say primary because there are obviously many things I could do at that point (although I have promised my wife that naked star jumps will not feature in my daily exercise regimen ... again). In any case I go for the stairs.

At the bottom of the stairs I have two more choices: left or right. Each takes me to a different exit from the building but both open onto the circuit that I stroll round, and if I leave by one of them I will naturally arrive back at the other so there's (again, to the level of granularity that I care about) no significant difference between them.

I can't go straight on at the bottom of the stairs because the lift is there and a u-turn sends me back into work so every day I am forced to make the choice - left or right.  And every day until recently I've been making that choice without any conscious thought.

But when I realised I was doing it, I started looking for patterns. Philosophical aspects of the observer effect to one side, I discovered that, over the period I watched, I tended to choose left and right roughly equally, and that (ignoring extraneous circumstances such as deliveries being in the way) I have a tendency to go in the direction closest to the side of the stairs I happen to be on.

For instance, if I've moved left to let someone else up on my right, I'll tend to go left at the bottom. If I've swung round the corner between flights a bit faster than normal and ended up on the right hand side, I'll naturally hang a right when I get to the ground floor too.

On my walk I listen to podcasts. A couple of weeks ago, while I was stair-spotting, Invisibilia told the story of how an experimenter's attitude towards their subjects can influence the performance of the subjects. In one landmark study, when an experimenter was told that a set of lab rats were smart, the rats performed better, and when told they were stupid, the rats performed worse.

The study concluded that the experimenter behaviour was unconsciously changed by their expectation of the animals. When told the rat was clever they might hold it more gently, talk to it more and so on. This in turn made the rat more comfortable and in a better mood to run around the maze or whatever.
Unconscious action can lead to unexpected but, crucially, predictable consequences.

I don't think about which way I'd go from the bottom of the stairs but I can discern a pattern to the behaviour when I look. We're all making decisions all day every day - both in trivial matters like which way to leave a building and in more serious stuff like which way we'll test something or how we'll speak to our colleagues.

I'm never short of questions, but now I have some new ones: Why did I choose that way? Did I notice there were options? Did I know I was choosing? How did that influence my behaviour? How can I know the effects of that?
Image: https://flic.kr/p/e2SoZc
Categories: Blogs

Continuous Testing in Practice [WEBINAR]

Sauce Labs - Wed, 02/18/2015 - 18:00

As web and mobile application software increases in complexity, the number and frequency of tests grow exponentially. But managing your tests with sub-optimal continuous integration (CI) workflows can lead to bottlenecks, delays, and lost developer productivity.

In our next webinar, Continuous Testing in Practice, Ophir Prusak from BlazeMeter and Abhijit Pendyal from Sauce Labs will show you how to integrate automated testing into your CI process so that you can test early and often to speed up deployment.

Ophir and Abhijit will cover:

  • Why Continuous Testing is so important today
  • How to ensure testing keeps pace with agile development cycles
  • The end-to-end flow of a continuous testing process
  • How to implement continuous automated functional & performance testing
  • How to integrate continuous testing with your existing tools

Join us for this presentation on Tuesday, February 24th at 10am PST/1pm EST. There will be a Q&A with both Ophir and Abhijit following the end of the presentation.

Click HERE to register today.

Categories: Companies

uTest Announces the 2014 uTester of the Year Awards

uTest - Wed, 02/18/2015 - 17:19

You may recall that 2014 was a landmark year for our 6 1/2-year-old community. The uTest brand lived on in a big way: We transformed our community into the professional network for testers, an open community that promotes and advances the testing profession and the people who do this vital work.

And no doubt, it was these same people, our uTesters, who were the heart and soul of this transformation through their continued hard work on paid projects and their contributions of valuable learning content for our testers. So it makes the uTest Community Management team especially proud to announce the brightest stars of the bunch as our 6th Annual uTesters of the Year.

The uTester of the Year Awards are the highest distinction a uTester can receive. They celebrate community members who not only have gone above and beyond the call of duty on paid projects at uTest, but have also contributed valuable and impactful content for our testers in the uTest Forums and uTest University, and on the uTest Blog.

This year’s winners once again were chosen by our Community and Project Management teams, who have the privilege of working closely with top-notch testers from around the world. However, we also looped Test Team Leads (TTLs) into the voting process for the first time given just how much they are in tune with the pulse of our community. It’s no easy work boiling down 150,000+ testers (not literally!) into one small group, but we believe that after much sifting through votes and data, we’ve chosen an extremely dedicated and talented bunch for 2014.

Now, let’s get right to it. Meet 2014’s uTesters of the Year.

The top honors for this year’s awards go to… Sheryl Reed from the United States!

Sheryl is this year’s Most Valuable Tester (MVT), and just last week celebrated three years with uTest. Like Romulo in last year’s awards, Sheryl is making her first appearance in the uTester of the Year Awards, which makes this all the more impressive.

Sheryl is a prolific Gold-rated tester on paid projects here at uTest — she has filed over 5,300 bugs on over 1,800 test cycles — countless numbers of these bugs listed as ‘exceptional’ — and has been listed as a ‘favorite tester’ by 16 customers. In her day job, too, she is no stranger to testing as she has spent over 30 years testing and programming everything from mainframes to mobile apps, including a run at IBM for over 20 years.

Here’s what Sheryl had to say on receiving MVT honors for 2014:

It is an honor to be selected 2014 Most Valued Tester receiving recognition for my contributions. For the past three years, it has been my pleasure working for an organization where software test engineers are held in high regard.

When I stumbled upon uTest on a friend’s LinkedIn profile, I never imagined I would find a software testing job offering me the freedom to choose projects and job offers to test interesting apps, and education to grow my testing skills.

I am grateful for the vote of confidence placed in me by the Community and Project Managers along with the Test Team Leads. Looking forward to a great 2015 at uTest, working in-the-wild with amazing testers from all over the world.

Including Sheryl, here is the list of the 2014 uTesters of the Year:

In addition to the special category honors, we’re proud to recognize the following testers as Top Testers of 2014:

Please join me in congratulating all of our 2014 uTesters of the Year! Be sure to stop by the Forums as well to congratulate them formally. This group once again showcases the outstanding testing talent that shines in our community, and these testers’ dedication not only to quality on paid testing projects, but to producing outstanding content at uTest.

In addition to having their names permanently etched in our uTest Hall of Fame, the winners will be taking home a fancy swag pack that includes a custom t-shirt with the new-for-2014 design you see above.

uTesters are already making their mark and paving their way to 2015 uTester of the Year nominations with their impactful work as we move deeper into the year.

Until 2015…

Categories: Companies

CloudBees Heads To Las Vegas For IBM Interconnect


Cloud, Mobile, DevOps and Security collide next week at the IBM Interconnect show in Las Vegas. This is proving to be one of the biggest conferences of the year as IBM incorporates 3 of their most dynamic events. So who's going? We know CloudBees is!

LEARN: Don't miss Kohsuke Kawaguchi, CTO of CloudBees and Jenkins founder, presenting “Automating Continuous Delivery with Jenkins and IBM UrbanCode Deploy" on Monday 2/23 at 2PM in the Islander A Ballroom at Mandalay Bay. This presentation will look at how new features in Jenkins, such as traceability and Workflow, are helping enterprises orchestrate their entire software delivery process. Attend to hear how Jenkins works together with IBM UrbanCode Deploy to deliver higher quality software in a repeatable fashion.

ACHIEVE RESULTS: What are you trying to accomplish this year? Faster time to market? Reduced costs of IT infrastructure? More automated processes? Make sure you pick up a copy of the Orbitz case study while visiting Booth #500 to learn how they were able to cut release cycles by more than 75%.

INTERACT: Want to see CloudBees Jenkins Enterprise in action? Stop by Booth #500 to interact with a Jenkins expert and get your very own demo. Don't forget to ask about Workflow!

WIN: Think you are a Jenkins Master, or maybe are just striving to be one? Then you don't want to miss out on your own Jenkins Master t-shirt. You will be the envy of the office when you return home sporting these threads!



Christina Pappas, Marketing Funnel Manager, CloudBees
Follow her on Twitter
Categories: Companies
