When you are a tester who operates in a traditional environment, it can be difficult to understand how testing could work any other way. Your organisation may use a tool that dictates aspects of your test process; test cases are written and linked back to requirements. Management may request test reports in a specific format, where the number of test cases executed, the percentage complete and bug counts are expected. There are often commercial drivers for testing being restricted to confirming that the software delivered meets what was requested. These are all barriers to changing how testing is perceived, not only by others in our industry but also by testers themselves.

Testing is not simply verifying that requirements have been met. Testing is discovering and communicating information about software, which is difficult in the environment described above. Traditional testing represents a small portion of what testing is, so the confidence the business has in its outcomes is misplaced. As testers, we can educate ourselves and our organisations to deliver better testing.

The danger of confirmatory testing
Test cases that link to requirements focus the tester on proving that the application can deliver what it has been asked to do. This narrow scope can prevent the discovery of undesired behaviour in the application and revelations about what the business should have requested in the first place. Both of these are important in reducing the risk associated with releasing software.

A real user of the application will not know how it was requested to behave. It is the role of the tester to anticipate how customers will interact with the application and identify any problems they may encounter in doing so. Confirming requirements will not necessarily do this. To broaden your testing, you must ask questions beyond what is written in the requirements document. Conversations of this nature will reveal a number of new test activities.

Parallel analysis and execution
Increasing the scope of testing activities will mean that you need more time to test. Fortunately, in a traditional process there is an obvious candidate for elimination. Writing detailed test cases is a wasteful activity, which often requires the tester to anticipate how the application will behave before interacting with it. When test analysis and execution are parallel activities, the tester has the opportunity to learn and alter their model of the software as they interact with it. If you stop writing pre-scripted test cases and instead identify only goals for your testing, you will create more time for real testing.

Having no test cases does not mean abandoning evidence of testing. Instead of documenting what you plan to do, document what has actually been done. This can be achieved in a variety of ways and is a much more useful record for test auditing than a suite of test cases. Similarly, having no test cases does not mean having no structure. Rather, the structure of your testing can reflect the needs of your project rather than the restrictions of a tool. Test analysis and critical thinking occur as before, but are now fostered to reach beyond the artificial boundaries dictated by test case templates and dependence on requirements. There are a number of heuristics to guide your thinking as you learn to explore with purpose; testing is not ad hoc.

Real Reporting
Eliminating traditional test cases will mean that you can no longer count them, which will make many testers and project managers nervous. Test reporting has been synonymous with numbers for a long time. However, when we interrogate what the numbers actually mean, we often find that they offer an illusion of testing rather than providing any real information.

The request for numbers stems from a desire to know how testing is progressing. Often a number is the most forthcoming piece of information from a traditional testing team, yet the very notion of reporting in this way should be as ridiculous as asking a business analyst “How many requirements do you plan to write and what percentage are already written?”.

As the culture of testing shifts beyond verification of requirements, testers will have a greater understanding of the application. This is what management really want to know. Rather than telling them that testing is 62% complete, tell them a story about what you have discovered. Describe issues you have encountered and communicate potential risk. Be a source of rich, transparent and useful information. Reporting in this fashion will increase the value of testing as a service to the project team, which will alter the expectations of your colleagues about what testing is.

What is testing?
Testing in a traditional environment can change from verification of requirements to a broader scope. It can change from counting test cases to reporting useful information. Though the testing that you do should always be dependent on context, it should always be testing. As a profession, we need to share this common definition.

Testing is discovering and communicating information about software.

What are you doing?

Katrina Clokie serves a team of more than 20 testers as a Testing Coach in Wellington, New Zealand. She is an active contributor to the international testing community as the editor of Testing Trapeze magazine, a mentor with Speak Easy, a co-founder of her local testing MeetUp WeTest Workshops, an international conference speaker, frequent blogger and tweeter.