I got called into a conference call today because of an emerging question a client had about combination testing.

They wanted us to test their online survey, an application consisting of several questions, most of which offered an array of answers from which the respondent could pick. At the end of the survey, the respondent gets a score.

One section of the survey had 40 questions. Each question had anywhere from 2 to 28 possible answers. This made a total of 23 quintillion combinations to test.
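To see where a number that size comes from: the total is just the product of each question's answer count. A quick sketch in Python, using made-up answer counts (the survey's actual distribution isn't reproduced here):

```python
from math import prod

# Hypothetical answer counts for 40 questions, each between 2 and 28.
# Illustrative only -- the real survey's counts aren't shown in this post.
answer_counts = [2] * 10 + [5] * 10 + [10] * 10 + [28] * 10

total = prod(answer_counts)
print(f"{total:.2e} combinations")
```

Even with modest counts per question, the product blows past quintillions, which is why exhaustive testing was never on the table.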

So what to do when they only have a few weeks to release?

Enter a heuristic combination method called all-pairs. All-pairs is a mathematical technique that takes a table of options and pairs each option in one column with each option in each of the other columns.

This approach is useful for flushing out the bugs that arise when two particular options are paired.
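To make the idea concrete, here is a minimal greedy all-pairs sketch in Python. This is not how Bach's tool or PICT work internally (real tools use much more efficient algorithms), but it shows the core idea: keep adding test cases until every pair of values across any two columns has appeared together at least once.

```python
from itertools import combinations, product

def all_pairs(columns):
    """Greedy pairwise sketch: return test cases (one value per column)
    that together cover every cross-column pair of values.
    Brute-force and slow -- for illustration only."""
    # Enumerate every pair that must be covered: ((col_i, value), (col_j, value))
    uncovered = set()
    for (i, a), (j, b) in combinations(enumerate(columns), 2):
        uncovered.update(((i, va), (j, vb)) for va, vb in product(a, b))

    tests = []
    while uncovered:
        # Pick the candidate row that covers the most still-uncovered pairs
        best_row, best_hits = None, set()
        for row in product(*columns):
            hits = {p for p in uncovered
                    if all(row[col] == val for col, val in p)}
            if len(hits) > len(best_hits):
                best_row, best_hits = row, hits
        tests.append(best_row)
        uncovered -= best_hits
    return tests

# Three toy "questions": 2 x 3 x 2 = 12 full combinations
cases = all_pairs([["A", "B"], [1, 2, 3], ["x", "y"]])
print(len(cases), "test cases instead of 12")
```

Even on this toy example the pairwise set is smaller than the full cross product, and the gap widens dramatically as columns are added, which is how 23 quintillion collapses into the hundreds.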

There are two FREE tools I know about that will do this kind of analysis: the Allpairs tool from James Bach and PICT, a free tool from Microsoft.

Using both tools, I found that the 23 quintillion combinations were reduced to just 673 test cases. The tools produce a nifty Excel-readable table of test cases, consisting of rows for the tester to follow — telling them which answers to select for each question.
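For reference, PICT's input is just a plain-text model file: one line per parameter, with its possible values comma-separated after a colon. A tiny hypothetical model for three survey questions (the real survey's questions and answers aren't reproduced here) might look like:

```
Q1: Yes, No, Maybe
Q2: 1, 2, 3, 4, 5
Q3: Agree, Disagree
```

Running `pict model.txt` prints a tab-separated table of test cases, each row assigning one answer per question, together covering every pair. PICT's order switch (`/o:3`) covers triples instead of pairs if you want deeper interaction coverage at the cost of more test cases.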

The important point to note when using all-pairs is that it is a heuristic — a fallible method for solving a problem — aka “a rule of thumb.” As such, stakeholders should be told that it’s not perfect, but that it can help expose certain risks quickly.

For the PICT tool (which also does triples and a variety of other options), click here. For my brother’s free all-pairs tool written in Perl, click here.

TV Shows for Testing

I love TV. Perhaps too much. But I don’t sweat my addiction, because most of the shows I like have a testing component to them.

Notice that your favorite shows, movies, and books all have an important element that you’ll also see in testing.


The difference between “desired” and “actual” is called a problem. (It’s also called a bug.)

There are many definitions of testing, but my favorite is the discovery of problems as you assess capabilities.

Here’s my list of shows and movies that focus on the juxtaposition of problems and capabilities:

  1. Iron Chef — Food Network — chefs have one hour to cook 5 dishes for a panel of judges. A secret ingredient is revealed at the beginning of the show that the chef must use in all of the dishes — all the while competing with another chef.
  2. Mythbusters — Discovery — Two special effects guys with an extensive array of tools and props, set out to confirm or refute several urban legends.
  3. Ramsay’s Kitchen Nightmares — BBC America — a notoriously irascible chef serves as a consultant to see if he can turn around England’s failing restaurants.
  4. Survivorman — Discovery — a guy drops himself into remote places like wilderness, desert, and snow pack, armed with only a camera, a Leatherman tool, and the clothes on his back. His mission is to get himself out of danger, filming his journey.
  5. America’s Test Kitchen — PBS — culinary experts try out different tools, gadgets, and recipes, sometimes head-to-head.

Movies I can watch over and over again for their testing parallels:

  1. Apollo 13 — astronauts trapped in a failing capsule with only a few days to live.
  2. Super Size Me — a healthy man eats nothing but food listed on McDonald’s menu for breakfast, lunch, and dinner for 30 days. What happens to him?
  3. The Matrix — Life is nothing but a computer simulation — what bugs have you seen in the program?

Seattle CSI Files

Here are the notes I took after Mark Hanf, a detective from Seattle CSI, came to speak on 1/18/07:

  • “We are asked to go to different locations”; parallel: think about testing on different computer platforms.
  • “Look up, not just straight ahead”; parallel: change your perspective when thinking of software tests to run.
  • “Look in the garbage; we go into toilets quite a bit”; parallel: software bugs could reside in places we don’t associate with normally having problems.
  • “Proper documentation with photos”; parallel: we often document our tests and report our findings with screenshots.
  • “Can’t be afraid of heights”; parallel: can’t be afraid of testing on new platforms.
  • “Sometimes you have to match the bullet even though the crime is ‘solved’”; parallel: even though you have found the bug, there may be another cause.
  • “Crime scenes might have CS gas residue”; parallel: we may be digging in an area that complicates our ability to find bugs.
  • Tools: reflective UV imaging screen, forensic stepping plates, sifting screens; parallel: we have special tools as well (inControl, log file tracing, LoadRunner).
  • “We must gather, document, and demonstrate in court that we did everything possible”; parallel: software projects have “bug juries” that we are often called in to testify in front of to make our case.
  • “We study different disciplines: entomology, odontology, etc.”; parallel: we also study different domains… cognitive psychology for usability, brain physiology, and Crime Scene Investigation!
  • “Everybody’s interested in coming in and going right to the dead body”; parallel: We go right for the features that attract us or that are easy to test.
  • “Detectives should cut their own path to an outside crime scene”; parallel: there is more than one way to reproduce or find a software bug.
  • “You get to the scene, are briefed in an initial walkthrough”; parallel: we have client kick-off meetings that tell us what to focus on and where bugs may likely be hiding.
  • “Footwear impressions and fingerprints are there whether we see them or not”; parallel: same is true for software defects… they are almost always hidden.
  • “Sometimes you’re concerned about the floor, but can’t deal with it then and there”; parallel: bugs mask other bugs… we’re concerned about one feature but may not have time to test it right then.
  • “Take photos with scale and without scale”; parallel: when filing a bug, think about its impact not just to the user but on other programs on the system.
  • “Juries expect a lot more, so in some cases, we have to entertain (re: animation) as well as inform”; parallel: sometimes filing a bug is not enough, we have to be an advocate for what we find.
  • “Defense attorneys could discount elements of our case, so we have to be thorough and careful”; parallel: same is true when we deal with programmers we have to anticipate scrutiny.
  • Photogrammetry: a series of digital photographs taken in succession; parallel: we have mouse click and keystroke recording tools to document the repro of bugs.
  • Talked about how a boyfriend/girlfriend got into a fight and then violence happened; parallel: we develop user stories and scenarios to test for bug pathologies in software.
  • “We can’t say this is what happened, but we can give a logical range of possibilities”; parallel: we’re not always sure what the fault is, but we can suggest possibilities.
  • Projectiles go through glass and leave different signatures; parallel: same is true for bugs… programs leave different signatures on how they use memory or install files.
  • “We have to do presumptive tests sometimes (like the bullet through rubber)”; parallel: we also have to check our basic perceptions to make sure that a bug is really what we think it is.
  • “We take elimination fingerprints to rule out different suspects”; parallel: we do follow-up tests or peripheral tests to rule out other causes.
  • “Keep an open mind… don’t make your evidence fit your theory”; parallel: be mindful of your biases… don’t be fooled into thinking that this is a bug you’ve seen before.
  • “There is a high cost for processing evidence, homicides get priority”; parallel: there is a cost to doing tests… high risk features that lead to crash, hang or data loss get priority.
  • “Temporal evidence: fingerprints can last a long time… might have been there from months before”; parallel: this is the Primacy Bias… a bug might have started weeks ago and shown itself now.
  • Harris vs. United States, 1947: “Only human failure to find it, study and understand it, can diminish its value”; parallel: exactly the same for software testing.
  • “Two heads are better than one”; parallel: paired testing, “fresh eyes find bugs.”
  • Staff: Team Lead, Sketch preparer, Photographer, Recorder, Specialists; parallel: Team Lead, Recording Tools, Subject Matter Experts.

Time is on My Watch: The Oft-Forgotten Role of QA Management

Posted by Jacob Stevens on January 30, 2009

Time equals quality. We know this. We’re also wary of feature creep, and we know that quality isn’t guaranteed solely by simply putting forth the time. Not even close. But when it’s all said and done, we know it: quality takes time.

Most of the effort of a QA manager is not to manage time, but to manage within time. The effort is spent on process. Streamlining. Documenting. Tracking. Measuring. Organizations pay QA managers to assure quality by managing how to do things. And thankfully it isn’t uncommon for the how to include when to do things.

Or perhaps more accurately, when in a relational order of process. Too often, QA does not own, and is not included in, the decisions that set schedule and milestones. And even when QA is involved, the managerial role over time (assuring quality by assuring adequate time, not just for test and verification but for all components of the project or the product group) is too often forgotten or neglected, by QA management itself.

It’s easy to get into a project manager mentality when overseeing any quality assurance efforts. In essence, QA managers are simply specialized PMs, and test leads specialized PjMs. Each is one cog in a larger machine, a greater purpose, and I’m not suggesting QA managers ought to hold the importance of their own objectives in higher regard than that of the team or product group. I’m simply reflecting on the ultimate role of QA management, the ultimate goals, and noting that a very effective tool – one that also happens to be mandatory in some quantity or another – in assuring quality of project or product, is allocation of time.

Too often we forget that. But as QA managers, we’re still responsible for it, regardless of who sets the schedule. So don’t let Mick Jagger fool you – time isn’t always on your side. But your job is to assure quality, and quality takes time, so time is on your watch.
