
Feed aggregator

Latest Test Studio Brings New Reports and HTML Results

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Test Studio continues to make test automation even easier with the first major release of 2017. 2017-03-10T14:33:30Z 2017-05-26T01:33:05Z Iliyan Panchev
Categories: Companies

How We Test Software: Chapter Four Part II—Developer Tools

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Have you wondered how the teams working on Telerik products test software? In the final chapter of our detailed guide, we give you deeper insight into the processes of our Web Division. 2016-12-13T20:41:09Z 2017-05-26T01:33:05Z Antonia Bozhkova
Categories: Companies

How We Test Software: Chapter Four—Telerik Developer Tools

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Have you wondered how the teams working on Telerik products test software? In the next chapter of our detailed guide, we give you deeper insight into the processes of our Dev Tools division. 2016-11-29T13:00:00Z 2017-05-26T01:33:05Z Antonia Bozhkova
Categories: Companies

How We Test Software: Chapter Three Part III—Telerik Business Services

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Have you wondered how the teams working on Telerik products test software? In the next chapter of our detailed guide, we give you deeper insight into the processes of our Business Services. 2016-11-21T19:48:49Z 2017-05-26T01:33:05Z Anton Angelov
Categories: Companies

Test Studio R3 Release Webinar Recording and Q&A

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Just a few weeks ago the third major release of 2016 for Telerik Test Studio ushered in loads of new features like Angular 2 support, API Testing Fiddler integration, support for NativeScript and iOS 10, and more. These were all demoed at our usual post-release webinar last week. Here's a recap of the Q&A session. 2016-10-21T18:24:16Z 2017-05-26T01:33:05Z Antonia Bozhkova
Categories: Companies

Introducing Fiddler for OS X Beta 1

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Over the years, we have received numerous requests from our user community to provide a Fiddler build for OS X. So we have ported the latest version of Fiddler to the Mono Framework which in turn supports OS X—and you can grab the beta bits today. 2016-10-17T13:49:40Z 2017-05-26T01:33:05Z Tsviatko Yovtchev
Categories: Companies

How We Test Software: Chapter Three Part II—Telerik Business Services

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Have you wondered how the teams working on Telerik products test software? We continue with the next chapter in our detailed guide, giving you deeper insight into our very own processes. This chapter focuses on Telerik Business Services. 2016-10-12T12:30:00Z 2017-05-26T01:33:05Z Anton Angelov
Categories: Companies

Test Studio Goes Big with Angular, Mobile and API Testing

Telerik TestStudio - Fri, 05/26/2017 - 03:47
The third major Test Studio update of the year just came out today and it adds loads of new bits to our API testing and Mobile testing solutions. 2016-09-28T15:16:16Z 2017-05-26T01:33:05Z Antonia Bozhkova
Categories: Companies

How We Test Software: Chapter Three—Telerik Business Services

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Have you wondered how the teams working on Telerik products test software? In the next chapter of our detailed guide, we give you deeper insight into the processes of our Business Services. 2016-09-09T14:53:23Z 2017-05-26T01:33:05Z Anton Angelov
Categories: Companies

How We Test Software: Chapter Two—Telerik Platform Part Two

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Have you wondered how the teams working on Telerik products test software? We continue with the next chapter in our detailed guide, giving you deeper insight into our very own processes. 2016-08-24T12:30:00Z 2017-05-26T01:33:05Z Angel Tsvetkov
Categories: Companies

Test Studio R2 Release Webinar Wrap Up and Q&A

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Last week we hosted a release webinar on the latest Test Studio features, including a new API tester. Here's a recap of some of the interesting questions we got during the live webcast. 2016-07-22T14:34:27Z 2017-05-26T01:33:05Z Antonia Bozhkova
Categories: Companies

Be an API Testing Hero with the New Test Studio

Telerik TestStudio - Fri, 05/26/2017 - 03:47
The new Test Studio release is here! We are now offering GIT integration, MS Edge support and provisioning for Android devices, along with the Test Studio for APIs Beta. 2016-06-30T13:12:42Z 2017-05-26T01:33:05Z Antonia Bozhkova
Categories: Companies

Webinar Follow-up: New Testing Battlefields

Telerik TestStudio - Fri, 05/26/2017 - 03:47
Four testing experts recently explored testing beyond the "traditional" desktop and web, including the new battlefields of Mobile and IoT. Read on for a recap or to watch a webinar replay. 2016-06-21T12:20:00Z 2017-05-26T01:33:05Z Jim Holmes
Categories: Companies

Are You Ready for the New Testing Battlefields?

Telerik TestStudio - Fri, 05/26/2017 - 03:47
A software-defined way of living and the digital transformation of traditional businesses are not the future. They are already here. This brings challenges to testers and developers alike. Join this one-hour roundtable discussion with industry experts to hear what’s new in testing today. 2016-06-13T15:20:23Z 2017-05-26T01:33:05Z Antonia Bozhkova
Categories: Companies

Telerik and Testdroid Webinar Bonus Q&A

Telerik TestStudio - Fri, 05/26/2017 - 03:47
In this blog, Ville-Veikko Helppi tackles some of the unanswered questions from the webinar. 2016-06-06T14:55:06Z 2017-05-26T01:33:05Z Ville-Veikko Helppi
Categories: Companies

Just Do It: A snapshot of APM & Unified Enterprise Monitoring

As Bob learned in the first post of this three-part series, technology alone can’t deliver a healthy lifestyle. Likewise, having a successful APM program isn’t just about seeing what’s happening on a computer screen; it’s about doing something about it. You have to align your strategy, culture, people and processes with your digital business goals to reach the summit, and that’s not easy. So, what does that kind of success look like? Is it the same for every company?

Steps along the way

The fact is that every organization has slightly — sometimes dramatically — different expectations of success depending on the business outcomes they want. Ultimate goals are always different, but all companies follow the same four-step process as they work towards them. It starts with the technology and ends with a culture change; a new way of doing business every day.

  1. Implementation

This is the beginning of the process, when an organization deploys a new technology of choice. This should be considered table stakes and should be achieved as quickly as possible — preferably during the proof-of-concept (POC) stage.

  2. Value Realization

The next step is when people in an organization start to use the new technology for detailed visibility into applications and digital experiences. Teams then gain understanding and perspective from what they now see and take action. This action — whether it means improving the performance of applications or making an online checkout process faster and easier — results in measurable value for the company. People like and use the new banking app. More visitors to an online store convert and buy. Financial analysts get the information they need without interruption.

  3. Adoption

Once individual teams start to realize value from enhanced visibility and information, word usually spreads within an organization. If the ops group can use this technology to find and eliminate problems in production, wouldn’t it be even better if development teams could use the same solution to diagnose problems with apps before they go into production? If it works for the team in NORAM, why not go global with it and see what could be accomplished on a large scale? Once companies start expanding their internal success with the adoption process, the measurable value starts to leap forward. The ROI can be remarkable.

  4. Operationalization

When the adoption process has spread like digital wildfire across the organization—through silos of business and technology—we often describe the company as having a “culture of performance.” The digital experience is an integrated part of the business. APM is embedded into the everyday operations of the organization, across the entire lifecycle. Everyone is a stakeholder in digital success.

Keys to success

“Sub-optimization is when everyone is for himself. Optimization is when everyone is working to help the company.” – W. Edwards Deming

How do companies open the sometimes-elusive door to performance culture? When I look at the ones who made it there, and keep improving, they have four things in common:

  • Executive-level leadership and a clear APM strategy articulated throughout the organization
  • A top-down monitoring approach that examines the health of applications from the end-user perspective, not just from an infrastructure standpoint (bottom-up)
  • Incentives and visibility into digital business that cross traditional silos bringing together teams like marketing, development and IT operations
  • Institutionalized, cross-functional collaboration between these different teams that make it easy for them to work together and speak the same language

One of our customers, a major insurance company, is a great example of the power of executive leadership. Initially, the company suffered some serious application issues during one of their annual open enrollment periods, and knew this had to change. The executive team sprang into action, communicated a clear strategy to the entire organization, and prioritized APM as a corporate goal.

They also established a dedicated APM team to drive broad and deep APM adoption across the company, supported by both Dynatrace Expert Services and Dynatrace University.

Together, we worked with the customer to develop an internal endorsement program to promote Dynatrace users who could demonstrate proficiency in APM technologies and concepts. Today, this company has won awards for its digital performance. A member of the team at this customer explained:

“The APM program has been the most successful IT initiative I have seen or heard of in more than 10 years working here.”

Another customer example is a large, global financial services company. IT operations leadership spearheaded initial APM efforts, and continued to support the business with a proactive and pervasive approach. Every ops team member knows their job is to make sure that financial advisers never have a single, noticeable, drop in performance or availability to ensure they generate the most money for their clients.

The team at this company is organized in a clear, almost military, fashion so that three groups can work together to focus on individual parts of the enterprise. One group deals with performance, handling onboarding and performance engineering. A tools infrastructure group is responsible for administrative tasks. Finally, a performance anomaly group is solely focused on hunting down and eliminating performance issues. The combined result of these groups working together as a team with Dynatrace is a virtually flawless digital enterprise.

“Without productivity objectives, a business does not have direction. Without productivity measurements, it does not have control.” – Peter Drucker

In the next blog entry, the final one in this series, I’ll explain how we developed our path-to-success methodology by working with customers like the ones I described here. Thanks to our customers, some of the most respected companies in the world, we’ve learned which objectives and measurements work best.

The post Just Do It: A snapshot of APM & Unified Enterprise Monitoring appeared first on Dynatrace blog – monitoring redefined.

Categories: Companies

Cambridge Lean Coffee

Hiccupps - James Thomas - Wed, 05/24/2017 - 21:48

This month's Lean Coffee was hosted by Redgate. Here are some brief, aggregated comments and questions on topics covered by the group I was in.

What benefit would pair testing give me?
  • I want to get my team away from scripted test cases and I think that pairing could help.
  • What do testers get out of it? How does it improve the product?
  • It encourages a different approach.
  • It lets your mind run free.
  • It can bring your team closer together.
  • It can increase the skills across the test group.
  • It can spread knowledge between teams.
  • You could use the cases as jumping-off points.
  • I am currently pairing with a senior tester on two approaches at the same time: functional and performance.
  • For pairing to work well, you need to know each other, to have a relationship.
  • There are different pairing approaches.
  • How long should you pair for?
  • We turned three hour solo sessions into 40 minute pair sessions.
  • You can learn a lot, e.g. new perspectives, short-cuts, tips.
  • Why not pair with developers?

Do you have a default first test? What is it? Why?
  • Ask what's in the build, ask what the expectation is.
  • A meta test: check that what you have in front of you is the right thing to test.
  • It changes over time; often you might be biased by recent bugs, events, reading etc to do a particular thing.
  • Make a mind map.
  • A meta test: inspect the context; what does it make sense to do here?
  • A pathetic test: just explore the software without challenging it. Allow it to demonstrate itself to you.
  • Check that the problem that is fixed in this build can be reproduced in an earlier build.

How do you tell your testing story to your team?
  • Is it a report, at the whiteboard, slides, a diagram, ...?
  • Great to hear it called a story, many people talk about a report, an output etc.
  • Some people just want a yes or no; a ship or not.
  • I like the RST approach to the content: what you did, what you found, the values and risks.
  • Start writing your story early; it helps to keep you on track and review what you've done.
  • Writing is like pairing with yourself!
  • In TDD, the tests are the story.

One thing that would turn you off a job advert? One thing that would make you interested?
  • Off: a list of skills (I prefer a story around the role).
  • Off: needing a degree.
  • Interested: the impression that there's challenge in the role and unknowns in the tasks.
  • The advert is never like the job!
  • Interested: describes what you would be working on.
  • Off: "you will help guarantee quality".
  • Interested: learning opportunities.
  • Interested: that it's just outside of my comfort zone.
Image: https://stocksnap.io/photo/A78EC1EB73
Categories: Blogs

Semaphore Releases Boosters

Software Testing Magazine - Wed, 05/24/2017 - 17:49
Semaphore, a cloud-based code delivery service provider, announced the launch of Boosters, a new feature that drastically speeds up automated software testing. Boosters allows software development...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Categories: Communities

Nouvola Integrates with AWS CodePipeline

Software Testing Magazine - Wed, 05/24/2017 - 16:57
Nouvola, a vendor of cloud-based performance testing and load testing, has announced integration with AWS CodePipeline, enabling developers using AWS CodePipeline to now include Nouvola tests in...

[[ This is a content summary only. Visit my website for full links, other content, and more! ]]
Categories: Communities

The Checking and Testing Debate Explained: Everything You Need to Know…

Gurock Software Blog - Wed, 05/24/2017 - 02:02


This is a guest posting by Simon Knight. Simon Knight works with teams of all shapes and sizes as a test lead, manager & facilitator, helping to deliver great software by building quality into every stage of the development process.

The terms “testing” and “checking” tend to get used interchangeably for activities performed by development teams to verify readiness and/or completeness of software products. Testing and checking could be easily interpreted to mean the same thing. However, as with most words in the English language, both testing and checking are in fact multi-faceted terms, layered with meaning and nuance depending on your context and audience.

For example, if you Google the question “what is software testing?”, you’ll get back a facsimile of the ISTQB definition:

“Software testing is a process of executing a program or application with the intent of finding the software bugs. It can also be stated as the process of validating and verifying that a software program or application or product: meets the business and technical requirements that guided its design and development, works as expected and can be implemented with the same characteristic.”

That settles the matter then, right?


You’d think… But on the other hand, Context Driven Testing (CDT) figurehead Michael Bolton asserts that testing is “a process of exploration, discovery, investigation, and learning.” Whereas according to him, checking is “a process of confirmation, verification, and validation.”

In case that’s not sufficiently clear in the context of the checking-versus-testing debate, checking may typically mean activities we can program a computer or robot to do; it’s low-skill, repetitive work. Testing on the other hand, is exploratory and dynamic, normally requiring human intellect, often utilizing a dedicated and skilled resource.

Thanks, I feel much more informed now. So, what’s all the fuss about?


Some people within the wider (i.e. non-CDT) testing community think that the distinction is used as a form of intellectual bullying. The way the difference between testing and checking is debated, taught or otherwise discussed in project teams or within organizations has been perceived to be uncharitable in nature.

Specifically, Brett Pettichord, co-author of the classic software testing book Lessons Learned in Software Testing (mentioned in a previous blog here), and Marlena Compton, who on May 18th decided to start a fire on Twitter with the tweets below:

Go and check out the full thread. It makes for an interesting read.

Is it important enough to argue about?


Although it may be counter-productive to bring it up in a conversation with Joe Developer, the distinction between checking and testing is an important one. Understanding the difference between tests that help you learn something new, versus tests that confirm something you already knew (checks) can help to steer a successful testing strategy.

The point of the Twitter debate (in my humble opinion – treading on dangerous/controversial ground here…) was to call out some people in the industry, a minority hopefully, who might use that distinction as a kind of blunt instrument with which to beat developers during discussions around what to test and how: Consultants and Trainers using “testing” and “checking” as buzzwords to drum up business; overzealous CDT followers looking to raise their profile and gain popularity with peers by using trending terminology.

So why do I need to know about this?


What you need to know about testing and checking, is that they’re both valuable strategies in a balanced approach to carrying out your testing.

Test to explore, learn about and detect new bugs in your product. Test to ensure you built the right thing, and that you built it right.

Check to verify that what you think you know about the product is still true. Check to ensure nothing has changed or regressed since you last tested it.
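In Bolton's terms, a check is a confirmation a machine can run unattended. As a minimal sketch of that idea (the function and values are hypothetical, using Python's built-in unittest), the checks below confirm behaviour we already believe to be true rather than discovering anything new:

```python
import unittest


def checkout_total(prices, discount=0.0):
    """Hypothetical function under test: sums prices and applies a discount."""
    subtotal = sum(prices)
    return round(subtotal * (1 - discount), 2)


class CheckoutChecks(unittest.TestCase):
    """Checks in Bolton's sense: confirmations of known behaviour.

    They guard against regression; they do not explore the product
    or discover new problems the way human testing does.
    """

    def test_total_without_discount(self):
        self.assertEqual(checkout_total([10.00, 5.50]), 15.50)

    def test_total_with_discount(self):
        self.assertEqual(checkout_total([100.00], discount=0.1), 90.00)


if __name__ == "__main__":
    unittest.main()
```

Running a suite like this on every build is checking; probing the same checkout flow for edge cases no assertion anticipated is testing.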

Distinguish between testing and checking to the extent it serves your test strategy well. Use checking as a heuristic to help you determine when, where and how to automate. Use testing for discovery, learning and creativity. Find a balance between the two, and communicate that to your people.

But for goodness sake, be careful how you talk about your testing. By all means replace the word “test” with “check” when referring to your automation stack if it helps. But don’t try to make everyone else do it. And don’t pick other people up when they don’t do it, unless there’s a very compelling reason to do so.

Precise language is important. Relationships more so. Your team will thank you.

Categories: Companies
