
Testing, 1, 2, 3.. Testing... (Reflective Testing)

Carl Sassenrath, CTO
REBOL Technologies
26-Mar-2008 18:18 GMT

Article #0356

Testing, 1, 2, 3.. Testing. In my old TV studio days, that's how we tested microphones (and set the audio channel gain as well).

Thinking back on it now, it's funny how much we tested microphones. In fact, to someone who's never been part of live performance shows, those of us who set up microphones may seem... well, overly paranoid.

But when something did fail during a live show, more often than not it was a microphone. Failures didn't happen often, maybe once a year (1 in 1000 shows), and usually during the large-audience 6 PM evening news. For any of a variety of reasons, one of the microphones would fail. Murphy's law. And we'd all wonder (normally because the director was shouting it)... didn't we test that microphone? Did the talent pick up the wrong one? Or did a wire come loose inside? In the worst case, we'd have the floor manager hand over a new mic or get the talent to trade mics.

Anyway, what does this have to do with REBOL? Well, REBOL 3.0 needs testing. And it needs a lot of testing - even more than microphones.

Much of this testing can be done by me, the designer. But there's only so much I can do. With 56 datatypes and 68 polymorphic actions per datatype, that's already about 3'800 action/datatype pairs, plus a few hundred built-in natives and mezzanines on top of that. Add in function refinements (modifiers) and you get a sense of the level of testing required. I figure we need a minimum of 100'000 tests in the test suite. 500'000 may be more likely in the end.

Right now for R3, I have a "hand-made" test suite with 3'000 tests (what I call test vectors, because each one tests one specific element of the language). Realizing that more is needed, and that time is short, I've started using a technique I'd call "reflective testing". (There's probably an official term for it, not sure.)
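
For illustration, a test vector can be as simple as an expression paired with the molded result it should produce, plus a trivial loop to replay them. (The format below is just an example to show the idea, not necessarily the format the suite actually uses.)

    ; Hypothetical test vectors: an expression paired with its expected molded result.
    vectors: [
        [[add 1 2]        "3"]
        [[reverse "abc"]  {"cba"}]
        [[first [a b c]]  "a"]
    ]

    ; Replay each vector and report pass or fail.
    foreach vec vectors [
        print either (mold do first vec) = second vec ["pass"] ["fail"]
    ]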

Reflective testing is where the language itself provides the information used to generate the tests. Those of you who know REBOL know how this works, and some of you have already written various reflective tests and analyzers over the years. For you, there's nothing new here.
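
For example, REBOL can enumerate its own actions and datatypes directly from the global context. Here's a REBOL 2 style sketch (R3's system object is organized a bit differently, so the exact paths may vary):

    REBOL [Title: "Enumerate actions and datatypes (sketch)"]

    ; Collect every globally defined word whose value is an action or a datatype.
    action-words: copy []
    type-values:  copy []

    foreach word first system/words [
        if value? word [
            value: get word
            if action?   :value [append action-words word]
            if datatype? :value [append type-values :value]
        ]
    ]

    print ["Actions found:"   length? action-words]
    print ["Datatypes found:" length? type-values]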

I must admit that in the past I didn't see the value of reflective tests and generally discounted them, because they are generated not from a specification of expected behavior, but from what the language actually does. That lets serious errors go undetected simply because the language happens to allow them.

These days, I'm rethinking that position. Reflective testing can be valuable if you certify the results by inspection - that is, if you carefully check each generated result before accepting it as correct.

The basic method becomes:

  1. Use REBOL itself to generate tests for specific datatypes or functions (a rough sketch follows this list).
  2. Inspect the results. This is time consuming, but far less so than typing in thousands of tests and working out all the valid combinations by hand.
  3. Once certified, a test is stored and used later to check for regressions (new bugs).
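
Here's a rough sketch of step 1, using a hand-picked handful of sample values and one-argument actions just for illustration. A real generator would pull those lists from the system itself (as shown earlier) and cover far more combinations.

    REBOL [Title: "Reflective test generation (sketch)"]

    ; Hypothetical sample values, roughly one per datatype of interest:
    samples: reduce [1  2.5  "abc"  [a b c]  10:30  #"x"]

    ; A small hypothetical subset of one-argument actions to probe:
    actions: [first last length? negate even?]

    tests: copy []

    foreach act actions [
        foreach val samples [
            code: reduce [act val]              ; e.g. [negate "abc"]
            either error? result: try [do code] [
                append/only tests reduce [mold code "error!"]
            ][
                append/only tests reduce [mold code mold :result]
            ]
        ]
    ]

    ; Inspect (certify) the generated pairs by hand, then save them
    ; as regression vectors for later runs:
    save %tests.r tests

Each generated pair records the expression and what it actually returned; certification is simply reading through that list and confirming each result is what the language should do.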

Using this approach, tens of thousands of test vectors can be created in a week. True, there will still be holes in the coverage for special-case combinations, but I think this method yields greater coverage and leverage in the testing process. In fact, I've already found several bugs and oddities while generating the tests.

And, no, we never tested microphones by tapping on them. If you did, you'd risk the control room engineer running after you with a sharp object.


Updated 6-Mar-2024   -   Copyright Carl Sassenrath   -   WWW.REBOL.COM