
Commit 708d32e4 authored by David S. Miller

Merge branch 'net-Introduction-of-the-tc-tests'



Lucas Bates says:

====================
net: Introduction of the tc tests

Apologies for sending this as one big patch. I've been sitting on this a little
too long, but it's ready and I wanted to get it out.

There are a limited number of tests to start - I plan to add more on a regular
basis.
====================

Signed-off-by: David S. Miller <davem@davemloft.net>
parents 93dda1e0 76b903ee
+1 −0
__pycache__/
+102 −0
tdc - Linux Traffic Control (tc) unit testing suite

Author: Lucas Bates - lucasb@mojatatu.com

tdc is a Python script to load tc unit tests from a separate JSON file and
execute them inside a network namespace dedicated to the task.


REQUIREMENTS
------------

*  Minimum Python version of 3.4. Earlier 3.X versions may work but are not
   guaranteed to work.

*  The kernel must have network namespace support.

*  The kernel must have veth support available, as a veth pair is created
   prior to running the tests.

*  All tc-related features must be built in or available as modules.
   To check what is required in the current setup, run:
   ./tdc.py -c

   Note:
   In the current release, a tdc run will abort if a setup or teardown
   command fails - which includes not being able to run a test simply
   because the kernel does not support a specific feature. (This will be
   handled in a future version - for now, the workaround is to run only
   the test categories that your kernel supports; see the example below.)
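
   For example, to list the available categories and then restrict a run to
   a single category that your kernel supports (the category name 'actions'
   below is only an illustration):
   ./tdc.py -c
   ./tdc.py -c actions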


BEFORE YOU RUN
--------------

The path to the tc executable that will be most commonly tested can be defined
in the tdc_config.py file. Find the 'TC' entry in the NAMES dictionary and
define the path.
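
As a rough sketch, the relevant part of the NAMES dictionary looks something
like the snippet below; the default values shown are placeholders, so check
your copy of tdc_config.py for the actual ones:

	NAMES = {
	    # Path to the tc executable substituted for $TC in test commands
	    'TC': '/sbin/tc',
	    # Device name substituted for $DEV1 in test commands
	    'DEV1': 'v0p1',
	    # ... other substitution values used by the test cases
	}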

If you need to test a different tc executable on the fly, you can do so by
using the -p option when running tdc:
	./tdc.py -p /path/to/tc


RUNNING TDC
-----------

To use tdc, root privileges are required. tdc will not run otherwise.

All tests are executed inside a network namespace to prevent conflicts
within the host.

Running tdc without any arguments will run all tests. Refer to the section
on command line arguments for more information, or run:
	./tdc.py -h

tdc will list the test names as they are being run, and print a summary in
TAP (Test Anything Protocol) format when they are done. If tests fail,
output captured from the failing test will be printed immediately following
the failed test in the TAP output.
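
For reference, TAP output consists of a plan line followed by one numbered
result line per test. The lines below only illustrate the convention and are
not verbatim tdc output:
	1..2
	ok 1 - Add ingress qdisc
	not ok 2 - Add duplicate ingress qdisc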


USER-DEFINED CONSTANTS
----------------------

The tdc_config.py file contains multiple values that can be altered to suit
your needs. Any value in the NAMES dictionary can be changed without affecting
which tests are run. These values are substituted into the tc commands that
are executed as part of each test. More entries will be added as test cases
require them.

Example:
	$TC qdisc add dev $DEV1 ingress
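
With hypothetical values of '/sbin/tc' for TC and 'v0p1' for DEV1 in the
NAMES dictionary, the example above would be executed as:
	/sbin/tc qdisc add dev v0p1 ingress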


COMMAND LINE ARGUMENTS
----------------------

Run tdc.py -h to see the full list of available arguments.

-p PATH           Specify the tc executable located at PATH to be used on this
                  test run
-c                Show the available test case categories in this test file
-c CATEGORY       Run only tests that belong to CATEGORY
-f FILE           Read test cases from the JSON file named FILE
-l [CATEGORY]     List all test cases in the JSON file. If CATEGORY is
                  specified, list test cases matching that category.
-s ID             Show the test case matching ID
-e ID             Execute the test case identified by ID
-i                Generate unique ID numbers for test cases with no existing
                  ID number
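
A few typical invocations, combining the arguments above (the category name,
file name, and ID used here are placeholders):

	# Run only the tests in the 'actions' category
	./tdc.py -c actions

	# List the test cases that match a category
	./tdc.py -l actions

	# Run the test cases defined in a custom JSON file
	./tdc.py -f my-tests.json

	# Show a single test case by ID, then execute it
	./tdc.py -s ab12
	./tdc.py -e ab12

	# Generate IDs for any new test cases that lack one
	./tdc.py -i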


ACKNOWLEDGEMENTS
----------------

Thanks to:

Jamal Hadi Salim, for providing valuable test cases.
Keara Leibovitz, who wrote the CLI test driver that I used as a base for the
   first version of the tc testing suite. This work was presented at
   Netdev 1.2 Tokyo in October 2016.
Samir Hussain, for providing help while I dove into Python for the first time
   and being a second eye for this code.
+10 −0
tc Testing Suite To-Do list:

- Determine what tc features are supported in the kernel. If features are not
  present, prevent the related categories from running.

- Add support for multiple versions of tc to run successively

- Improve error messages when tdc aborts its run

- Allow tdc to write its results to file
+69 −0
tdc - Adding test cases for tdc

Author: Lucas Bates - lucasb@mojatatu.com

ADDING TEST CASES
-----------------

User-defined tests should be added by defining a separate JSON file.  This
will help prevent conflicts when updating the repository. Refer to
template.json for the required JSON format for test cases.

Include the 'id' field, but do not assign a value. Running tdc with the -i
option will generate a unique ID for that test case.

tdc will recursively search the 'tc' subdirectory for .json files.  Any
test case files you create in these directories will automatically be included.
If you wish to store your custom test cases elsewhere, be sure to run tdc
with the -f argument and the path to your file.

Be aware of required escape characters in the JSON data - particularly when
defining the match pattern. Refer to the tctests.json file for examples when
in doubt.
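
For example, parentheses are regular expression metacharacters, so matching
a literal '(' requires '\(' in the regex, and the backslash itself must be
doubled inside a JSON string. A hypothetical pattern field could therefore
look like:

	"matchPattern": "qdisc ingress ffff: .*\\(dropped 0\\)",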


TEST CASE STRUCTURE
-------------------

Each test case contains the following required fields:

id:           A unique alphanumeric value to identify a particular test case
name:         Descriptive name that explains the command under test
category:     A list of single-word descriptions covering what the command
              under test is testing. Example: filter, actions, u32, gact, etc.
setup:        The list of commands required to ensure the command under test
              succeeds. For example: if testing a filter, the command to create
              the qdisc would appear here.
cmdUnderTest: The tc command being tested itself.
expExitCode:  The code returned by the command under test upon its termination.
              tdc will compare this value against the actual returned value.
verifyCmd:    The tc command to be run to verify successful execution.
              For example: if the command under test creates a gact action,
              verifyCmd should be "$TC actions show action gact"
matchPattern: A regular expression to be applied against the output of the
              verifyCmd to prove the command under test succeeded. This pattern
              should be as specific as possible so that a false positive is not
              matched.
matchCount:   How many times the regex in matchPattern should match. A value
              of 0 is acceptable.
teardown:     The list of commands to clean up after the test is completed.
              The environment should be returned to the same state as when
              this test was started: qdiscs deleted, actions flushed, etc.
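
Putting these fields together, a hypothetical test case for a gact action
(continuing the verifyCmd example above) could look like the one below. The
ID, commands, and match pattern are illustrative, not copied from the shipped
test files:

    {
        "id": "ab12",
        "name": "Add a gact pass action",
        "category": [
            "actions",
            "gact"
        ],
        "setup": [
            "$TC actions flush action gact"
        ],
        "cmdUnderTest": "$TC actions add action pass index 8",
        "expExitCode": "0",
        "verifyCmd": "$TC actions show action gact",
        "matchPattern": "action order [0-9]*: gact action pass.*index 8 ref",
        "matchCount": "1",
        "teardown": [
            "$TC actions flush action gact"
        ]
    }

If flushing gact actions can fail on your system when none exist, the setup
and teardown commands above can be given acceptable exit codes as described
in the next section.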


SETUP/TEARDOWN ERRORS
---------------------

If an error is detected during the setup/teardown process, execution of the
tests will immediately stop with an error message and the namespace in which
the tests are run will be destroyed. This is to prevent inaccurate results
in the test cases.

Repeated failures of the setup/teardown may indicate a problem with the test
case, or possibly even a bug in one of the setup/teardown commands - commands
that are not themselves under test.

It's possible to include acceptable exit codes with a setup/teardown command
so that an error that does not matter will not halt the script. To do so,
turn the individual command into a list, with the command itself first,
followed by all acceptable exit codes for that command.
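
For instance, a teardown entry that tolerates a non-zero return from flushing
actions (the specific exit codes shown are only an illustration) would be
written as:

    "teardown": [
        ["$TC actions flush action gact", 0, 1, 255]
    ]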
+40 −0
[
    {
        "id": "",
        "name": "",
        "category": [
            "",
            ""
        ],
        "setup": [
            ""
        ],
        "cmdUnderTest": "",
        "expExitCode": "",
        "verifyCmd": "",
        "matchPattern": "",
        "matchCount": "",
        "teardown": [
            ""
        ]
    },
    {
        "id": "",
        "name": "",
        "category": [
            "",
            ""
        ],
        "setup": [
            ""
        ],
        "cmdUnderTest": "",
        "expExitCode": "",
        "verifyCmd": "",
        "matchPattern": "",
        "matchCount": "",
        "teardown": [
            ""
        ]
    }
]