Basic Test Generation

Create the models

Open the On/Off System model file “onoff/onoff.xml” in Uppaal. The model contains two templates, entitled System and User. System represents the system or device under test (e.g., a lamp), while User models the possible user interactions with the system.

The system can be in either the On or the Off location, with the channels on and off switching between them. The user can non-deterministically press on or off.

Figures: the System template and the User template.

Add Test Cases-specific model annotations

First, some declarations specific to the Test Cases feature must be made; the generator requires them for its depth-first reachability and single-step test-case generation techniques. These variables must not be altered by the model. Declare the following two variables in the global Declarations section:

int __reach__ = 0;
int __single__ = 0;

Add test prefix and postfix code

Second, there are two blocks of comments in the System declarations section of the model. The contents of a multi-line comment starting with TEST_PREFIX or TEST_POSTFIX are emitted as prefix code (before the actual test-case behavior) or postfix code (after the actual test-case behavior) of each test case.

In this example each test case is a Java class with a main method; executing this Java class constitutes running the test case. The prefix and postfix code is therefore the wrapper code that sets up this Java class.

/** TEST_PREFIX
package app;
import app.App;

class Test extends App {

    public static void main(String[] args) {

*/

/** TEST_POSTFIX
    }
}
*/

The test case class extends the App class we are testing. If you use a specific unit-test framework, these lines should be changed to match its requirements; a hypothetical variant is sketched below. Each step in the test case will execute some methods on this class.
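For instance, a JUnit 5 flavored prefix and postfix might look as follows. This is only an illustrative sketch: the class name GeneratedTest, the method name trace, and the JUnit import are assumptions, not part of the tutorial's files.

/** TEST_PREFIX
package app;
import org.junit.jupiter.api.Test;

// Hypothetical JUnit 5 wrapper; each generated trace becomes one @Test method.
class GeneratedTest extends App {

    @Test
    void trace() {

*/

/** TEST_POSTFIX
    }
}
*/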

By default, test cases are named and sequentially numbered, e.g. testcase0.code. If desired, the filename prefix and file extension can be changed with the following special comments in the System declarations section (here producing names such as test-0.txt). For now, stay with the defaults.

/** TEST_FILENAME test- */
/** TEST_FILEEXT .txt */

Add test step code annotations

The system model must be decorated with test code. Double-clicking the edge labeled on? and selecting the Test Code tab reveals an area for entering test code. In this model the test code should be set_on();. This will execute the method set_on() on the application whenever this edge is taken in a test case. Similarly, set_off(); is added to the edge labeled off?.

For locations, test code can be entered in two areas, Enter and Exit. This test code is added to the test case when the trace reaches or leaves the location, respectively. A sketch of the application class these calls land on is shown below.
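For reference, here is a minimal sketch of what the App class could look like, assuming assertion-based expectations (consistent with the AssertionError seen when running the mutant later); the actual implementations live in onoff/app and may differ. The methods are static here because the generated main method calls them directly.

package app;

// Sketch of the application under test: a lamp that is either on or off.
public class App {
    private static boolean on = false;   // the lamp starts in the Off state

    // Edge test code: user actions on the lamp.
    protected static void set_on()  { on = true; }
    protected static void set_off() { on = false; }

    // Location test code: expectations about the lamp's state.
    protected static void expect_on()  { if (!on) throw new AssertionError(); }
    protected static void expect_off() { if (on)  throw new AssertionError(); }
}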

Figures: adding test code to edges and adding test code to locations.

Generating the test cases

To generate test cases, go to the Test Cases tab. You may need to enable the tab in the GUI first.

Figures: enabling the Test Cases tab and the Test Cases tab itself.

Select which techniques to use for test-case generation. For now, select only the auto-depth search mode with a depth of 20, and click Add.

Figure: generating tests with auto-depth mode.

Clicking Add should generate a single trace. Each generated trace has a line in the list similar to Trace coverage: 4/4, showing that the trace covered four out of four edges. Further information about which locations and edges are or are not covered is shown in the Statistics panel on the right. Here, all 3 locations and all 4 edges are trivially covered.

Figure: test-case generation.

By double-clicking the trace and selecting the Simulator tab, the trace can be examined. Unsurprisingly, the trace simply runs in circles with alternating on/off presses.

Figure: simulating a test case.

Next, output the test cases to files by clicking Save Test Cases and selecting the output folder; make it point to the onoff folder of this tutorial.

Figure: Test Cases output.

Inspecting the test case code

Pressing the Save Test Cases button in the Test Cases tab will generate a file called testcase0.code in the selected output folder. If several traces have been generated, several files will be generated with sequential numbering.

Each of these will be a Java class with the sequence of method invocations induced by the generated trace. A snippet is shown below.

 1: package app;
 2: import app.App;
 3:
 4: class Test extends App {
 5:
 6:     public static void main(String[] args) {
 7:
 8: expect_off();
 9:
10: set_on();
11:
12: expect_on();
13:
14: set_off();
15:
   <... snip ... >
93:
94:     }
95: }
  • The test case starts with the prefix code on lines 1-6.
  • Line 8 is the first step of the trace: expect_off();, since the initial location is Off.
  • The first transition is the one labeled on?; its test code, set_on();, appears in the test case at line 10.
  • The trace then enters location On, and expect_on(); appears at line 12. The test case continues in this fashion.
  • The trace ends with the postfix code on lines 94-95.

Executing the test cases

Running test.sh (or test.bat on Windows) will compile and run the test cases one at a time, printing the name of each file as it executes. No further output from a test case signifies successful execution.

onoff$ ./test.sh
testcase0.code
onoff$

Running testMutant.sh (or testMutant.bat on Windows) will compile and run the test cases against the mutant (intentionally faulty) implementation. This should result in an exception being thrown, signifying a test failure.

onoff$ ./testMutant.sh
testcase0.code
Exception in thread "main" java.lang.AssertionError
  at app.App.expect_on(App.java:17)
  at app.Test.main(Test.java:15)
onoff$

The implementation code can be examined in the onoff/app folder. See onoff/app/AppC.java for the correct implementation and onoff/app/AppM.java for the mutant; a sketch of how such a mutant might look is given below.
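Purely as an illustration, and assuming the App sketch from earlier, a mutant could simply fail to update the state in set_on; the generated test's expect_on(); step then throws the AssertionError shown above. The real AppM.java may contain a different fault.

package app;

// Hypothetical mutant implementation: set_on forgets to turn the lamp on,
// so the first expect_on() step of the generated test fails.
public class AppM {
    private static boolean on = false;

    protected static void set_on()  { /* bug: state is never updated */ }
    protected static void set_off() { on = false; }

    protected static void expect_on()  { if (!on) throw new AssertionError(); }
    protected static void expect_off() { if (on)  throw new AssertionError(); }
}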

To transfer this to your own applications, you will have to decorate the model with test code such that the output from the generator constitutes an executable test case in whatever test-execution framework you use.