Tester as Editor Metaphor

Sometimes when I’m asked to explain how I see the role of testers on development teams, I use an (admittedly simplified) editor role as an analogy. For example, I might be asked to write an article on a particular topic, say test automation, and the publisher asks me to touch on three areas: a test strategy, a list of test tools, and an example. These three requirements serve as the acceptance tests. I will write several drafts on my own, and with each draft I will remove spelling errors and grammatical mistakes. These are my unit tests. I move iteratively through article drafts until I feel I have expressed the ideas that are needed for the article. Once my unit tests have passed, and I’m confident I’ve touched on the three areas I’ve been asked to write about (my acceptance tests seem to pass), I send the article to an editor for feedback.

The editor checks the content. If I haven’t met the obligation of the acceptance tests, they will let me know quickly that I missed a requirement. Good editors point out areas where I haven’t expressed an idea as clearly as I could. Good editors really know the audience, and point out areas that the audience may have problems with. They may also point out spelling mistakes and grammatical errors I have missed. They help draw out ideas and help me make the article all it could be given the time and resources available.

The editor doesn’t merely point out flaws; they can also provide suggestions to help overcome them. A cluster of writing errors may indicate to an author that an idea is malformed. Editors can spot problems that the author might miss because they are a skilled “second set of eyes”. They can provide constructive criticism and help encourage the author prior to publishing. It isn’t necessary for all articles to have a formal editor, but when I work with a good editor I realize how much better the article is than if I had written it on my own.

In many ways a software program is also an expression of an idea. An idea may be technically correct and may meet the requirements of the customer, but it may not be expressed clearly for the intended audience. My role as a tester is not an adversarial one; like the editor, my goal is to encourage and help make the program all it could be prior to its release. Like a good editor, a tester knows the audience of the program and the context in which it will operate.

In this post, Brian Marick talks about writers’ workshops, which look like a good idea for software testing:

I’m drawn to the analogy of writers’ workshops as they are used in the patterns community. Considerable care is taken to prepare the author to be receptive to critical comments. At its worst, that leads to nicey-nice self-censorship of criticism. But at its best, it allows the author to really hear and reflect on critical ideas, freed of the need to be defensive, in a format where teaching what should change doesn’t overwhelm teaching what works and should stay the same.

Software development teams could learn a lot about constructive criticism from good writers’ workshops.

In another post, Brian sums up the idea I’m trying to get across here:

… I offer this picture of an Agile team. They – programmers, testers, business experts – are in the business of protecting and nurturing the growing work until it’s ready to face the world.

Good testers are not the adversaries of developers; they are part of a team that works collectively toward creating the best software it can. Bad testing, like bad editing, does not seem to be in the business of nurturing a growing work.

Jennitta Andrea on Tests as Documentation

Jennitta Andrea sent me her position paper developed for the Tests as Documentation workshop. Unfortunately, she was unable to attend the workshop, but sent this paper with test examples. I like her approach, and encourage you to take a look at her examples. Here are some of her thoughts in the paper on the Tests as Documentation concept:

  • You must work VERY hard for your tests to be actually used as documentation for the system requirements.
    • They must be both organized and named extremely well so that someone looking for the details on some aspect of the system can find them.
    • They must be written with the reader in mind … all of the well known test automation patterns describe these details.
  • The automated acceptance tests must be built upon a clear and concise domain specific testing language (this is a superset of the ubiquitous language described by Evans).
  • The system requirements are spread out amongst a number of different kinds of tests
    • User acceptance tests cover the big-picture work flow and business process description
    • Unit tests cover the specific business rules for individual / small groups of components.
  • I still think you need some actual textual / graphical documentation that serves as an overview of the business process and the system. We recently put this kind of thing on a wiki at the very end of the project for the support team (should have done it incrementally during the project).
  • Stories should be thrown away once they are implemented. They should not be used as a permanent artifact for describing the system. A story is only a small chunk of the big picture, and it may contradict/overturn an earlier story. We need to create the big picture as it evolves when we implement various stories.

It’s interesting to note her emphasis on writing tests with an audience in mind, which was a major theme in the workshop. She also noted that user acceptance tests document the business side of the product while the unit tests document the system process side. This was also noted by one of the groups in the workshop.
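
To make the “reader in mind” point a bit more concrete, here is a small sketch of my own (not one of Jennitta’s examples) of what a test built on a simple domain-specific testing vocabulary might look like, written with Python’s unittest. The OrderBook class and its place_order / open_orders_for methods are hypothetical stand-ins for real domain code; the point is that the names in the tests carry the business language a reader would look for.

    import unittest

    # Hypothetical domain code, included only so the example runs on its own.
    class OrderBook:
        def __init__(self):
            self.orders = []

        def place_order(self, customer, item, quantity):
            self.orders.append((customer, item, quantity))

        def open_orders_for(self, customer):
            return [o for o in self.orders if o[0] == customer]


    class PlacingAnOrderTest(unittest.TestCase):
        """Reads as a requirement: a placed order shows up as open for that customer."""

        def test_placed_order_appears_in_customers_open_orders(self):
            book = OrderBook()
            book.place_order(customer="Acme Ltd", item="widget", quantity=3)
            self.assertEqual(len(book.open_orders_for("Acme Ltd")), 1)

        def test_other_customers_do_not_see_the_order(self):
            book = OrderBook()
            book.place_order(customer="Acme Ltd", item="widget", quantity=3)
            self.assertEqual(book.open_orders_for("Globex"), [])


    if __name__ == "__main__":
        unittest.main()

Someone looking for “what happens when a customer places an order” could find and read these tests without digging through implementation details, which is the kind of organization and naming Jennitta is describing.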

I especially like the approach she has used for tabular test data. Check out “Example 2: Tabular Test Specification” among the test examples in her paper. The format for the test data in a spreadsheet is very intuitive and reads nicely.
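
I can’t reproduce her Example 2 here, but as a rough sketch of the general tabular idea, assuming a made-up shipping_cost rule, each row below reads like a spreadsheet line: inputs on the left, expected result on the right, and the test simply walks the table.

    import unittest

    # Hypothetical function under test, standing in for real domain logic.
    def shipping_cost(weight_kg, express):
        base = 5.0 + 2.0 * weight_kg
        return base * 1.5 if express else base


    class ShippingCostTableTest(unittest.TestCase):
        # Each row: weight in kg, express shipping?, expected cost.
        CASES = [
            (1.0, False, 7.0),
            (1.0, True, 10.5),
            (10.0, False, 25.0),
            (10.0, True, 37.5),
        ]

        def test_table(self):
            for weight, express, expected in self.CASES:
                with self.subTest(weight=weight, express=express):
                    self.assertAlmostEqual(shipping_cost(weight, express), expected)


    if __name__ == "__main__":
        unittest.main()

The appeal of the tabular form is that a business reader can review the rows for correctness without reading any test plumbing at all.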

Go Flames Go

One win away from the Stanley Cup.

Go Flames!!

Edit: It was a heartbreaker last night, but Calgary and Canada are proud of the Flames’ amazing playoff run.

In related news, Tim Van Tongeren has a post on his blog about a publishing error where the wrong article ran in a Tampa newspaper. They accidentally published the version about the Lightning losing the cup. Sounds like they needed a second set of eyes in production last night to help with testing.