Squish Coco

Code Coverage Measurement for Tcl, QML, C# and C/C++

Part IX

Chapter 31  Creating an instrumented project

This chapter contains recipes for setting up a software project for instrumentation. The methods depend on the language and the development environment used. Here we show how to create a new software project so that it is prepared for instrumentation; this is another simple way to get acquainted with Squish Coco.

A description of a larger project can be found in Chapter 4.

31.1  C++ on Microsoft® Visual Studio® using the Visual Studio® Coco Wizard

There is an add-in for Microsoft® Visual Studio® (see Chapter 30) that supports the work with Squish Coco.

31.2  C# on Microsoft® Visual Studio®

Start Microsoft® Visual Studio® and create a new C# application:

  1. Click on "File->New->Project…" to pop up the new project wizard.
  2. Choose a project type of "Visual C#->Windows" and the "Console Application" template.
  3. Enter a project name of squishcoco_sample, then click the "OK" button.
  4. When the wizard’s second page appears, click the "Finish" button.

To activate the instrumentation, use the project properties:

  1. In the "Solution Explorer", double click "squishcoco_sample->Properties" to open the property dialog.
  2. Click on the "Build" tab.
  3. Select "Conditional compilation symbols" and enter COVERAGESCANNER_COVERAGE_ON.

The code coverage analysis is enabled when the symbol COVERAGESCANNER_COVERAGE_ON is defined at compile time.

Build the squishcoco_sample project. This will cause the executable squishcoco_sample.exe to be built and the code coverage instrumentation file squishcoco_sample.exe.csmes to be generated. Double click on squishcoco_sample.exe.csmes to inspect this file in CoverageBrowser.

Right now there are no code coverage statistics to be seen in CoverageBrowser: this is because the application has not yet been executed. Click on Program.cs in the source list to display the main function. All the instrumented lines are shown grayed out to indicate that nothing has been executed.

Now execute squishcoco_sample.exe by double clicking it. This will result in a file called squishcoco_sample.exe.csexe being generated. This file contains a code coverage snapshot which can be imported into CoverageBrowser:

  1. Click "File->Load Execution Report…".
  2. Select the "File" item and enter the path of the squishcoco_sample.exe.csexe file.
  3. Click on the "Import" button.

This will cause the code coverage statistics to be updated. Now, in the source code window, the main function’s return statement will be colored green to indicate that this line has been executed.

31.3  Command Line Tools

Open a console window (MS-DOS Prompt or Command Prompt window on Windows) and make sure that the Microsoft® Visual Studio® compiler (cl.exe) or GCC is installed on your system.

Create a simple hello.c source file that contains the following code:

#include <stdio.h>

int main(int argc, char *argv[])
{
    return 0;
}

Compile this program using the CoverageScanner compiler wrapper instead of the native compiler. This is done by prepending cs to the command line compiler’s name:

With Microsoft® Visual Studio®:

$ cscl.exe hello.c /Fehello.exe

With GCC:

$ csgcc hello.c -o hello.exe

Then the executable hello.exe and the file hello.exe.csmes, which contains the code coverage instrumentation, will be generated.

Execute hello.exe; this will cause the file hello.exe.csexe to be generated. It contains a code coverage snapshot which can be imported into hello.exe.csmes using cmcsexeimport:

cmcsexeimport --title="Hello execution" -m hello.exe.csmes -e hello.exe.csexe

Once imported, the hello.exe.csexe file is no longer needed and can be deleted. Now hello.exe.csmes contains a single execution record. We can generate an HTML report with cmreport:

cmreport --title="Hello application" -m hello.exe.csmes --html=hello.html

Here is a summary of the command line options that were used in this command:

--title="Hello application"
Set a title for the report. It appears on all generated HTML pages.
-m hello.exe.csmes
Select the instrumentation database.
--html=hello.html
Specify the name of the HTML file that the report should be written to.

Chapter 32  Generating instrumentations without modifying projects

Squish Coco can also generate code coverage information for a project without the need to modify it. The principle is to prepend the directory of the CoverageScanner compiler wrappers to the PATH variable and to set the instrumentation parameters in the COVERAGESCANNER_ARGS environment variable. To activate the instrumentation, --cs-on must be present in COVERAGESCANNER_ARGS; otherwise CoverageScanner is completely deactivated.

The variable COVERAGESCANNER_ARGS should only be set locally, e.g. in a script or on the command line. If it is set globally, it will influence every build.

32.1  GNU Make

On a UNIX® system, proceed as follows to instrument a project which can be generated using GNU Make:

export PATH=/opt/SquishCoco/wrapper/bin:$PATH
export COVERAGESCANNER_ARGS='--cs-on'
make clean
make

For macOS, replace the first line with

export PATH=/opt/SquishCoco/wrapper:$PATH

32.2  Microsoft® NMake

Proceed as follows to instrument a project that is generated with NMake:

set COVERAGESCANNER_ARGS=--cs-on
set PATH=%SQUISHCOCO%\visualstudio;%PATH%
nmake clean
nmake

32.3  Microsoft® Visual Studio®

Proceed as follows to instrument a project that is generated with Microsoft® Visual Studio®:

set COVERAGESCANNER_ARGS=--cs-on
set PATH=%SQUISHCOCO%\visualstudio;%PATH%
devenv /useenv myproject.sln /Rebuild

32.4  Microsoft® MSBuild

Proceed as follows to instrument a project that is generated with Microsoft® MSBuild:

set COVERAGESCANNER_ARGS=--cs-on
set PATH=%SQUISHCOCO%\visualstudio;%PATH%
msbuild /p:UseEnv=true myproject.sln /t:ReBuild

32.5  Mono C# XBuild

Proceed as follows to instrument a project that is generated with Mono C# XBuild:

msbuild \
   /p:UseEnv=true \
   /p:UseHostCompilerIfAvailable=true \
   /p:CscToolPath="${SQUISHCOCO}/wrapper" \
   myproject.sln /t:ReBuild

The environment variable SQUISHCOCO contains the path to the CoverageScanner executable.

Chapter 33  Code Coverage of Libraries

33.1  Code Coverage of Static/Shared Libraries and DLL

During the linking operation, CoverageScanner includes all instrumentations of the shared libraries (if these are compiled with CoverageScanner). CoverageBrowser displays the code coverage of the complete application (executable and its libraries) in one view.

To get an analysis of the code coverage of a library only, it is necessary to compile the main application and exclude its sources from the code coverage (for example, by adding --cs-exclude-file-regex=.* to the command line).

33.2  Code Coverage of Plugins/Manually Loaded Shared Libraries

Libraries loaded dynamically can also be instrumented, but it is necessary to handle the generation of the execution report in the main application or in the plugin code itself.

33.2.1  Generating Code Coverage Information directly from the Main Application

Handling the plugins in the main application can easily be performed using the register/unregister mechanism of the CoverageScanner API: it is only necessary to call __coveragescanner_register_library() after loading a library and __coveragescanner_unregister_library() just before unloading it.


#include <stdio.h>
#include <stdlib.h>
#include <dlfcn.h>

int main(int argc, char **argv)
{
   void *handle;
   double (*cosine)(double);
   char *error;

   handle = dlopen("libm.so", RTLD_LAZY);
   if (!handle) {
       fprintf(stderr, "%s\n", dlerror());
       exit(EXIT_FAILURE);
   }

#ifdef __COVERAGESCANNER__
   /* Register the instrumentation of the freshly loaded library */
   __coveragescanner_register_library("libm.so");
#endif

   dlerror();    /* Clear any existing error */

   /* Writing: cosine = (double (*)(double)) dlsym(handle, "cos");
      would seem more natural, but the C99 standard leaves
      casting from "void *" to a function pointer undefined.
      The assignment used below is the POSIX.1-2003 (Technical
      Corrigendum 1) workaround; see the Rationale for the
      POSIX specification of dlsym(). */

   *(void **) (&cosine) = dlsym(handle, "cos");

   if ((error = dlerror()) != NULL)  {
       fprintf(stderr, "%s\n", error);
       exit(EXIT_FAILURE);
   }

   printf("%f\n", (*cosine)(2.0));

#ifdef __COVERAGESCANNER__
   /* Unregister the library again just before unloading it */
   __coveragescanner_unregister_library("libm.so");
#endif
   dlclose(handle);
   return 0;
}

Calling __coveragescanner_register_library() or __coveragescanner_unregister_library() on a non-instrumented library is allowed.

33.2.2  Generating Code Coverage Information directly from the Plugin

CoverageScanner cannot handle the instrumentation of plugins (i.e. shared libraries loaded manually) during the linking phase. In this case, the library must itself initialize the coverage measurement and save the execution results.
The shared library therefore needs to call __coveragescanner_filename() during its initialization to set the name of the execution report file, and __coveragescanner_save() to save the measurements when it is unloaded.

Code Coverage of Plugins Generated with Microsoft® Visual Studio®

The function DllMain is called on the initialization and the termination of a DLL generated with Microsoft® Visual Studio®. When the dwReason argument is equal to DLL_PROCESS_ATTACH, the function __coveragescanner_filename() should be called. To save the measurements on exit, __coveragescanner_save() should be called when dwReason is DLL_PROCESS_DETACH.


extern "C" BOOL WINAPI DllMain( HINSTANCE /*hinstDLL*/,
                    DWORD dwReason,
                    LPVOID /*lpReserved*/ )
{
  switch( dwReason )
  {
  case DLL_PROCESS_ATTACH:
    /* Initialization of the CoverageScanner library.        */
    /* Replace "mylib" with your filename without extension  */
    __coveragescanner_filename("mylib");
    break;
  case DLL_PROCESS_DETACH:
    /* Saves the execution report */
    __coveragescanner_save();
    break;
  }
  return TRUE;
}

Code Coverage of Plugins Generated with GNU gcc

The GNU compiler offers two attributes that let you execute a function when a library is loaded or unloaded:

void my_init(void) __attribute__ ((constructor));
This attribute specifies a function to be called when the library is loaded. Call the function __coveragescanner_filename() in the custom initialization function of the library.
void my_fini(void) __attribute__ ((destructor));
This attribute specifies a function to be called when the library is unloaded. Call the function __coveragescanner_save() on the termination of the library.


static void plugin_load(void)   __attribute__ ((constructor)) ;
static void plugin_unload(void) __attribute__ ((destructor))  ;

static void plugin_load(void)
{
  /* Initialization of the CoverageScanner library.        */
  /* Replace "mylib" with your filename without extension  */
  __coveragescanner_filename("mylib");
}

static void plugin_unload(void)
{
  /* Saves the execution report */
  __coveragescanner_save();
}

Chapter 34  Test suites and Squish Coco

34.1  Execution comment, name and status

Test suites are able to provide CoverageBrowser with the name of a test, a comment in HTML format and the status of its execution (passed or failed). Two methods are possible; the one described here appends directives directly to the execution report.

The .csexe file is a simple text file, to which it is possible to append additional lines that extend the information provided to CoverageBrowser.

To set the name of a test, simply add the following line before executing it: “*⟨name of the test⟩\n”
The character “*” must be the first character of the line. The name of the test is placed directly after it and the line is terminated with a carriage return.

To set the status of a test, simply add the following line after executing it: “!⟨status⟩\n”
The character “!” must be the first character of the line. The status of the test is placed directly after it and the line is terminated with a carriage return. The status can be one of the following strings:

PASSED
to indicate that the test was successfully executed,
FAILED
to indicate that the test did not pass, and
CHECK_MANUALLY
to indicate that it was not possible to determine whether the test was successfully executed.

To append an execution comment, insert the contents of an HTML document just after the execution of the application.

Example: The following batch file executes the test First Test and sets the status of the execution to CHECK_MANUALLY.

echo *First Test >> myapp.csexe
echo "<HTML><BODY>Execution of myapp</BODY></HTML>" >> myapp.csexe
echo !CHECK_MANUALLY >> myapp.csexe

There may be more than one HTML comment, and the comments and the status declaration may occur in any order, but they all must be appended to the .csexe file after the test has been executed.
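The append protocol described above can also be driven from a small C test driver. The following is a sketch only: the helper functions and the file names are illustrative and not part of the CoverageScanner API.

```c
/* Sketch: helpers a test driver can use to frame one test run in a .csexe file. */
#include <stdio.h>

/* Append one directive line to the execution report. */
static int append_line(const char *csexe, const char *line)
{
    FILE *f = fopen(csexe, "a");   /* the report is plain text; appending is safe */
    if (f == NULL)
        return -1;
    fprintf(f, "%s\n", line);
    fclose(f);
    return 0;
}

/* Before running the instrumented application: write the test name. */
static int begin_test(const char *csexe, const char *name)
{
    char buf[256];
    snprintf(buf, sizeof(buf), "*%s", name);
    return append_line(csexe, buf);
}

/* After the run: write the test status (PASSED, FAILED, CHECK_MANUALLY). */
static int end_test(const char *csexe, const char *status)
{
    char buf[256];
    snprintf(buf, sizeof(buf), "!%s", status);
    return append_line(csexe, buf);
}
```

Calling begin_test("myapp.csexe", "First Test") before launching the application, and end_test("myapp.csexe", "CHECK_MANUALLY") afterwards, reproduces the effect of the batch file above.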

34.2  Unit testing

CoverageBrowser imports execution results for all object files that are part of an application. This means that, if a unit test uses the same object files as the application, CoverageBrowser can import the execution results of the unit tests and merge them into the code coverage of the compiled application.


An application consists of three files:

app.cpp
This file contains the main() function of the application.
library.cpp
This file contains functions that are called by app.cpp.
testlibrary.cpp
This file contains the test code for the functions in library.cpp. It has its own main() function.

Under Microsoft® Windows, the application is compiled with the following commands:

cscl app.cpp /Foapp.obj
cscl library.cpp /Folibrary.obj
cscl library.obj app.obj /Feapp.exe

The following commands compile the unit test program:

cscl testlibrary.cpp /Fotestlibrary.obj
cscl library.obj testlibrary.obj /Fetestlibrary.exe

To import the execution report of the unit tests into the instrumentation database of the main application, two methods can be used: load testlibrary.exe.csexe into app.exe.csmes with CoverageBrowser ("File->Load Execution Report…"), or import it on the command line with cmcsexeimport.

In both cases, only the code coverage analysis of the file library.cpp is loaded and the execution report of the test code is ignored.

Chapter 35  Special compiler versions for cross-compiling

If a GNU compiler is used for cross-compiling, it often has a special name, like ‘arm-linux-gcc’ instead of ‘gcc’.

The installation script of Squish Coco automatically tries to create the compiler wrappers for every GNU compiler installed on the machine. In general, all necessary compiler wrappers are therefore already present in the system. A cross-compiler may, however, reside in a non-standard directory, where the Squish Coco installer will not find it.

If a compiler configuration is missing, the following can be done:

  1. Copy ’coveragescanner’ to the file ’cs⟨compiler⟩’.

    Example: On Microsoft® Windows, this would be something like:

    C:\Program Files\squishcoco>copy coveragescanner.exe csarm-linux-gcc.exe
  2. Copy the profile ’gcc.cspro’ or ’g++.cspro’ to ’⟨compiler⟩.cspro’.

    Example:

    C:\Program Files\squishcoco>copy gcc.cspro arm-linux-gcc.cspro

The new GNU cross-compiler ’cs⟨compiler⟩’ can now be used. It instruments the code and then calls ’⟨compiler⟩’ for compilation.

Chapter 36  Multi-Platform Application

36.1  Principle

Squish Coco has strict checks to verify that the coverage data is coherent with the source code. It is, for example, not possible to import an execution report (.csexe file) of an earlier project version, or to import the coverage information of an old version into a newer one. This prevents some usage errors and the generation of incoherent test metrics.

The main principle of the coherence analysis consists simply of verifying that no instrumented source file in the project has been modified. This is done during the build and when coverage data are imported or merged. If the C preprocessor produces different versions of a file across the build – which can occur when some files are built in debug mode and others in release mode – Squish Coco detects it, adds a suffix like #1 or #2 to the file name, and requires that both versions are tested to achieve full coverage.
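As a minimal illustration of how such duplicates arise, consider a file whose preprocessed form depends on the build mode (NDEBUG is the conventional release-mode macro). The two preprocessed variants of such a file would be tracked as separate versions:

```c
/* A file whose preprocessed form depends on the build mode. */
const char *build_mode(void)
{
#ifdef NDEBUG
    return "release";   /* only this branch survives preprocessing in a release build */
#else
    return "debug";     /* ... and only this one in a debug build */
#endif
}
```

Since the instrumented code differs between the two builds, both variants would have to be covered to reach full coverage.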

It is also possible to compile a program on several platforms and then merge the coverage data. This requires that the source code is identical for each build. If that is not the case, Squish Coco refuses to merge.

Let us take a small toy example to understand how it works: We suppose that the project consists of only one source file project.cpp and that we build it on Microsoft® Windows and Linux.

On Linux, we compile it with the command

$ csg++ /home/me/project.cpp -o project-unix

A file project-unix.csmes is generated and the tests can be executed and imported as usual into it.

On Microsoft® Windows, it is compiled with the command

C:\code> cscl C:\Home\me\project.cpp /Feproject-windows.exe

A file project-windows.exe.csmes is generated.

Merging the coverage information is possible with cmmerge:

cmmerge -o project.csmes project-unix.csmes project-windows.exe.csmes

Unfortunately this operation is not enough, because the file project.csmes will then contain two project.cpp files, one from each platform.

It is in fact necessary to indicate that both files are identical. To do that, we use cmedit to rename the two absolute file names to an identical one:

$ cmedit project.csmes --rename="/home/me/,/PRJ/" --rename="C:\Home\me\,/PRJ/" --verbose

These two rules rename the base directories /home/me/ and C:\Home\me\ to /PRJ/, so that both builds refer to a single source file: /PRJ/project.cpp. The coverage information is then merged as if the code had been tested identically on both platforms.

36.2  Restrictions

36.2.1  Code Generators

In some build processes, code generators are used, like flex/bison or, for Qt products, moc. The code generators then need to produce identical source code on each platform for the renaming to work.

If this is not the case, cmedit will refuse the renaming operation because it assumes that the project is not produced from the same source code. The solution is then to explicitly skip the files that are automatically generated.

Let us take the case of moc. qmake generates files with names of the form moc_*.cpp. The trick consists of using the switch --force, which skips the renaming operation for files with conflicts:

$ cmedit project.csmes --force --rename="/home/me/,/PRJ/" --rename="C:\Home\me\,/PRJ/" --verbose

As a result, all source files are merged together into the directory PRJ, except the moc files, on which a conflict is detected. These moc files then need to be covered separately on each platform.

36.2.2  Platform-dependent macros

In general, C macros can expand to platform-dependent code and should therefore be avoided. If macro expansions differ between the platforms, Squish Coco will create duplicates by appending #1 or #2 to the source file names. Both versions of the sources then need to be covered.

There are some common pitfalls that the developer needs to be careful about to avoid such conflicts.

Chapter 37  Control of execution report generation

By default, the CoverageScanner library produces an execution report when the instrumented application exits. This might not be enough for unit test suites (where it is preferable to generate a report after each single test) or for applications like daemons which never terminate.

For these purposes, the CoverageScanner library also lets you save an execution report explicitly through its API, or trigger the generation of a report from outside, with UNIX® signals or Microsoft® Windows events.

This chapter is about the second method.

37.1  Execution report generation with UNIX® signals

To enable execution report generation with UNIX® signals, use the option --cs-dump-on-signal=⟨sig⟩, where ⟨sig⟩ is the number of a signal or its common name (e.g. SIGUSR1).

With the UNIX® kill command it is then possible to trigger the report generation, for example: kill -SIGUSR1 ⟨pid⟩.

37.2  Execution report generation with Microsoft® Windows events

To enable execution report generation with Microsoft® Windows events, use the option --cs-dump-on-event=⟨event⟩, where ⟨event⟩ is a string which identifies the event.

Since there is no standard Windows equivalent to the UNIX® kill command, Squish Coco offers a replacement which is described in the following sections.

37.2.1  A program to send events to applications

Squish Coco contains a tool to send events to Windows applications, for use with --cs-dump-on-event. Since it is not needed very often, you must compile it yourself.

The program can be found in the directory Windows Coco\dump_on_event. It exists in two versions, one in C++ and one in C#. They are called dump_on_event and dump_on_event_cs, respectively.

To compile the program and see how it works, follow the instructions below. They refer to the C++ case – the C# version is quite similar and the differences are listed at the end.

  1. Copy the folder Windows Coco\dump_on_event to some other place, say to C:\dump_on_event, since the original version is write protected.
  2. Double-click on the file build_cpp. This starts the compilation of dump_on_event.cpp and of the example application event_sample_cpp.cpp. When the file event_sample_cpp.cpp is compiled, it is instrumented with the option --cs-dump-on-event=COVERAGE.
  3. After the programs have been compiled, three windows become visible.
    1. The window of the CoverageBrowser, with the contents of the file event_sample_cpp.exe.csmes loaded. It contains the source of the program event_sample_cpp, but not yet any coverage data.
    2. A window with a command prompt from which the program dump_on_event can be run later.
    3. A command window in which event_sample_cpp runs. Every second it prints a dot, and it will do so for five minutes.
  4. Now you can send the event COVERAGE to event_sample_cpp. Enter in the command prompt window:
    C:\dump_on_event>dump_on_event COVERAGE
    When the program event_sample_cpp receives the event, it creates a file event_sample_cpp.exe.csexe, which contains the coverage data that were generated so far.

    Send the command a second time. The file event_sample_cpp.exe.csexe then grows, since the new coverage data are appended to it. There is no danger that the same data are written twice.

  5. You can import the execution report from the CoverageBrowser with "File->Load Execution Report…". Each data dump appears as a separate execution, so if you have sent the event twice, you will see two executions in the executions window of the browser.

The same recipe also works with the C# versions of the programs. You only need to replace “build_cpp” with “build_cs”, “dump_on_event” with “dump_on_event_cs” and “event_sample_cpp” with “event_sample_cs”.

37.2.2  Global events

In general, Windows events are created for a specific account and are not accessible from other accounts on the same machine. But Windows also supports system-wide events. Their names begin with Global\.

So to generate a code coverage report with Windows events from an application on a different account, system-wide events are needed.

In the following example we use the program dump_on_event to generate the event:

  1. Compile your application with the argument --cs-dump-on-event=Global\COVERAGE.
  2. Start your application.
  3. On another account, execute the command
    C:\dump_on_event>dump_on_event Global\COVERAGE
    The application then writes its coverage report to the disk.

37.2.3  Write your own program

You can also use dump_on_event as an example for sending events from your test setup. It is in fact only a wrapper for the Microsoft® Windows function SetEvent():

To download: dump_on_event.cpp
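A sketch of such a wrapper is shown below. The Win32 calls (OpenEventA, SetEvent, CloseHandle) are the real API; the function send_event and the non-Windows fallback are illustrative only.

```c
/* Sketch of an event sender like dump_on_event: essentially one call to SetEvent().
   On non-Windows platforms the sketch only prints what it would do. */
#include <stdio.h>

#ifdef _WIN32
#include <windows.h>
#endif

/* Returns 0 on success, -1 if the event could not be opened. */
static int send_event(const char *name)
{
#ifdef _WIN32
    /* Open the named event created by the instrumented application and signal it. */
    HANDLE ev = OpenEventA(EVENT_MODIFY_STATE, FALSE, name);
    if (ev == NULL)
        return -1;
    SetEvent(ev);
    CloseHandle(ev);
    return 0;
#else
    printf("would signal the Windows event \"%s\"\n", name);
    return 0;
#endif
}
```

A main() that passes argv[1] to send_event() completes the tool: running it with the argument COVERAGE then triggers the report generation in an application instrumented with --cs-dump-on-event=COVERAGE.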

Chapter 38  Customizing I/O of CoverageScanner library

38.1  Custom I/O using C file access

The following example shows how to generate the execution report using the standard C file API.

To compile the example on Microsoft® Windows:

cscl customiofile.c

To compile the example on Linux or macOS:

csgcc customiofile.c -o customiofile

Source code:

To download: customiofile.c

38.2  Custom I/O using SFTP protocol

The following example shows how to generate the execution report directly on an SFTP server. The SFTP server is part of SSH v2 and is available on most Unix platforms. For Microsoft® Windows, a free SSH server is also available.

To compile the example on Microsoft® Windows:

  1. Download libSSH2 and generate the library; then set the environment variable LIBSSH2 to the location of the libSSH2 source code.
  2. To compile the example:
    cscl %LIBSSH2%\win32\debug_dll\libssh2.lib -DWIN32 --cs-libgen=/MTd /MTd -I %LIBSSH2%\include ws2_32.lib customiosftp.c
  3. Execute customiosftp.exe.

To compile the example on Linux:

  1. Install the development package of libssh2.
  2. To compile the example:
    csgcc -lssh2 customiosftp.c -o customiosftp
  3. Execute customiosftp.

Source code:

To download: customiosftp.c

Chapter 39  Using the CoverageScanner library with a memory pool

The CoverageScanner library can use an internal memory pool to handle memory allocations. This is necessary on platforms which do not provide the standard C functions malloc() and free().

In this case, it is necessary to allocate a C array at program start which serves as the pool. Doing this requires an estimate of the memory consumption. The CoverageScanner library also provides a way to monitor the usage and to detect memory overflows.

39.1  Estimation of the size of the memory pool

The size of the memory pool depends on the number of instrumented files and on the size of the temporary output buffer. The pool is used for dynamically loaded library support and for string manipulations.

For code that uses only static libraries and produces a single executable, it is possible to use the following approximation (in bytes): 512 + 100 * number_of_instrumented_sources

Here number_of_instrumented_sources includes the headers and the source files. The size of the temporary buffer used for the generation of the execution report file will be added automatically to this size.
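The approximation can be written as a small helper; this is only a sketch of the formula above, with the result in bytes (the report buffer that the library adds automatically is not counted here):

```c
/* Estimate of the memory pool size in bytes, following the approximation
   512 + 100 * number_of_instrumented_sources. */
static unsigned int pool_size_estimate(unsigned int number_of_instrumented_sources)
{
    return 512 + 100 * number_of_instrumented_sources;
}
```

For example, a build with 40 instrumented files (sources and headers) would reserve pool_size_estimate(40) = 4512 bytes.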

39.2  Monitoring of the size of the memory pool

The following example shows how to monitor the memory usage of the memory pool:

To download: memory-pool.c

To compile the example on Microsoft® Windows:

cscl memory-pool.c --cs-memory-pool=1000 --cs-memory-alloc-failure-function=memory_failure

To compile the example on Linux or macOS:

csgcc memory-pool.c -o memory-pool --cs-memory-pool=1000 --cs-memory-alloc-failure-function=memory_failure

The size of the memory pool is 1000 bytes plus the size of the temporary buffer that is used for serializing the execution report. To monitor the actual usage, the function __coveragescanner_memory_pool_stat() can be used.

If a memory overflow occurs (to simulate it, one can set the size of the memory pool to 5 bytes), the C function memory_failure() is called.
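A sketch of what such a handler could look like follows. The void(void) signature and the handler body are assumptions for illustration, not a documented interface; the simulate_overflow() helper exists only to demonstrate the handler.

```c
/* Sketch of a handler for --cs-memory-alloc-failure-function=memory_failure.
   Signature and body are illustrative assumptions. */
#include <stdio.h>

static int pool_overflowed = 0;

void memory_failure(void)
{
    /* Called when the memory pool overflows: record it and report on stderr. */
    pool_overflowed = 1;
    fprintf(stderr, "CoverageScanner memory pool exhausted\n");
}

/* Helper to simulate an overflow for demonstration purposes. */
static int simulate_overflow(void)
{
    memory_failure();
    return pool_overflowed;
}
```

In a real build, the handler would be invoked by the instrumented runtime, not by the application itself.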

Chapter 40  Generation of diff files

Squish Coco supports patch file analysis in the CoverageBrowser (see Section 20.6) and in cmreport with the option --patch (see Chapter 26). For this, a diff file in the “unified” format is needed. It is provided by the UNIX® diff tool and by most version control systems.

The following examples work both on the UNIX® shell and with a Microsoft® Windows command window.

40.1  Comparing directories

If the two versions of the source are in different directories of the file system, the diff command can be used. It is the standard UNIX® file comparison program which gives the functionality its name.

The “unified” format must be set with the -u flag, and the result must be redirected to a file, so that the command line will look like

$ diff -u project/version_1 project/version_2 > project.diff

Then the directories project/version_1 and project/version_2 are compared and the output is written to the file project.diff. Of the two directories, project/version_1 is viewed as the earlier version.

40.2  Version control systems

In a version control system, the diff functionality is used to compare two revisions of the source tree.

GUI wrappers exist for most of the newer version control systems, like TortoiseSVN, especially for Windows. If the underlying version control system supports unified diffs by default, the same can be expected from the GUI wrappers.

With git, one needs to specify two revisions and to redirect the output to a file. git uses the unified format by default. One can compare two revisions with
$ git diff ⟨rev1⟩ ⟨rev2⟩ > project.diff
By default, this command creates a diff of the whole source tree.

For a list of ways to specify the revisions under git, see the manpage of gitrevisions (7).

The distributed version control system Mercurial generates diff files in the unified format by default. One has to write
$ hg diff -r ⟨rev1⟩ -r ⟨rev2⟩ > project.diff
to compare the Mercurial changesets ⟨rev1⟩ and ⟨rev2⟩.

The version control system Subversion generates unified diffs by default. To generate the diff, write
$ svn diff -r ⟨rev1⟩ -r ⟨rev2⟩ > project.diff
where ⟨rev1⟩ and ⟨rev2⟩ may be tags or revision numbers.

The Concurrent Versions System (CVS) is the oldest of the revision control systems described here. It does not use the unified format by default. With it, one has to write
$ cvs diff -u -N -r ⟨rev1⟩ -r ⟨rev2⟩ > project.diff
to compare two revisions. The flag -u switches unified diffs on, and -N is necessary to track changes in which a file is added or removed. The revisions ⟨rev1⟩ and ⟨rev2⟩ must be tags or dates, since cvs does not have global version numbers. The command then creates a lot of additional console output, but the differences between the two revisions are still written to project.diff.

Note that the cvs command only lists the differences in the current directory and its subdirectories.