author     xudan <xudan16@huawei.com>     2018-12-12 04:08:02 -0500
committer  Dan Xu <xudan16@huawei.com>    2019-02-22 06:21:05 +0000
commit     383a21e6a2a493a82acf52a4b18f58f41e586497 (patch)
tree       78b3fb065f09397e28c4eb1fd8760fbf1df85b57 /docs/testing/developer
parent     17e6e66de82a90feafb5ca202e2443b580ee6695 (diff)
Add a doc introducing dovetail framework
This doc is used to introduce the Dovetail framework and how to develop with this framework.

JIRA: DOVETAIL-757
Change-Id: I3c56ce56151580d0e2aebf3485a55f4c7a23c8b6
Signed-off-by: xudan <xudan16@huawei.com>
Diffstat (limited to 'docs/testing/developer')
-rw-r--r--   docs/testing/developer/genericframework/index.rst   382
1 file changed, 382 insertions, 0 deletions
diff --git a/docs/testing/developer/genericframework/index.rst b/docs/testing/developer/genericframework/index.rst
new file mode 100644
index 00000000..9bafb3e4
--- /dev/null
+++ b/docs/testing/developer/genericframework/index.rst
@@ -0,0 +1,382 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) Huawei Technologies Co.,Ltd, and others
+
+====================================
+Dovetail as a Generic Test Framework
+====================================
+
+.. toctree::
+ :maxdepth: 2
+
+
+Overview
+========
+
+Dovetail is responsible for the technical realization of the OPNFV Verified
+Program (OVP) and other compliance verification projects within the scope of
+the Linux Foundation Networking (LFN) umbrella projects.
+Dovetail provides a generic framework for executing a specific set of test cases
+which define the scope of a given compliance verification program, such as OVP.
+
+This document introduces the Dovetail generic framework and explains how to
+develop within it.
+
+
+Introduction to the Dovetail Framework
+======================================
+
+The following diagram illustrates the Dovetail generic framework.
+
+.. image:: ../../../images/dovetail_generic_framework.png
+ :align: center
+ :scale: 50%
+
+The diagram shows five main parts: `TestcaseFactory`, `TestRunnerFactory`,
+`CrawlerFactory`, `CheckerFactory` and the test case groups. A minimal code
+sketch of the factory idea follows the list.
+
+- **TestcaseFactory**: Each project needs its own test case class, such as
+ `FunctestTestcase` or `OnapVtpTestcase`. All of these classes inherit from the
+ base class `Testcase`, which already provides many functions, mainly for
+ parsing test case configuration files. If a project has no special
+ requirements, its class only needs to be initialized with a different type;
+ otherwise it has to override or add some functions.
+
+- **TestRunnerFactory**: Similar to `TestcaseFactory`, each project has its own
+ test runner class. Dovetail supports two kinds of test runners, `DockerRunner`
+ and `ShellRunner`. Docker-based projects need their own runner classes, such
+ as `FunctestRunner`, which inherit from `DockerRunner`. Shell-based projects
+ can use `ShellRunner` directly. Test runners provide many functions that
+ support running test cases, such as preparing each project's test tool,
+ running all the commands defined by each test case and cleaning up the
+ environment.
+
+- **Test case groups**: Each group is composed of one project configuration file
+ and a set of test cases belonging to that project. These groups are used as the
+ input of the test runners and provide the project and test case information.
+ `ShellRunner` only needs the test case configurations as its input.
+
+- **CrawlerFactory**: This is used to parse the results of test cases and record
+ them in a unified format. Since the original result data reported by each
+ project is different, a separate crawler class is needed for each project to
+ parse its results.
+
+- **CheckerFactory**: This is used to check the result data generated by the
+ crawlers. Each project should have its own checker class because the
+ requirements of the projects differ.
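+
+To make the factory pattern above more concrete, below is a minimal,
+self-contained Python sketch of how a `TestcaseFactory` can map a test case's
+`type` to a project-specific class. The class names, attributes and the mapping
+dictionary are illustrative stand-ins under simplified assumptions, not
+Dovetail's actual implementation.
+
+.. code-block:: python
+
+ # Minimal, self-contained sketch of the factory idea; all names here are
+ # illustrative stand-ins, not Dovetail source code.
+
+ class Testcase(object):
+     """Stand-in for the Testcase base class that parses a config dict."""
+
+     validate_type = None  # overridden by each project-specific subclass
+
+     def __init__(self, config):
+         self.name = config['name']
+         self.validate = config.get('validate', {})
+
+
+ class FunctestTestcase(Testcase):
+     validate_type = 'functest'
+
+
+ class ShellTestcase(Testcase):
+     validate_type = 'shell'
+
+
+ class TestcaseFactory(object):
+     """Maps the 'validate: type' value of a test case to its class."""
+
+     TESTCASE_TYPE_MAP = {
+         'functest': FunctestTestcase,
+         'shell': ShellTestcase,
+     }
+
+     @classmethod
+     def create(cls, validate_type, config):
+         testcase_class = cls.TESTCASE_TYPE_MAP.get(validate_type, Testcase)
+         return testcase_class(config)
+
+
+ if __name__ == '__main__':
+     config = {'name': 'functest.vping.ssh',
+               'validate': {'type': 'functest', 'testcase': 'vping_ssh'}}
+     testcase = TestcaseFactory.create(config['validate']['type'], config)
+     print(type(testcase).__name__)  # FunctestTestcase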
+
+
+Development with Dovetail Framework
+===================================
+
+Anyone interested in extending the Dovetail framework to integrate new upstream
+test cases will face one of the following two scenarios:
+
+- **Adding test cases that belong to integrated projects**: Several projects
+ are already integrated in Dovetail. They come from the OPNFV (Open Platform
+ for NFV) and ONAP (Open Network Automation Platform) communities. It is much
+ easier to add new test cases that belong to these projects.
+
+- **Adding test cases that do not belong to integrated projects**: The test cases
+ may belong to other projects that have not been integrated into Dovetail yet.
+ These projects could be in OPNFV, ONAP or other communities. This scenario is a
+ little more complicated.
+
+
+Test cases belonging to integrated projects
+-------------------------------------------
+
+The Dovetail framework already includes a large number of test cases, all of
+which are implemented by upstream projects in OPNFV and ONAP. The upstream
+projects already integrated in Dovetail are Functest, Yardstick and Bottlenecks
+from OPNFV, and VNF SDK and VVP from ONAP.
+
+To add a test case belonging to one of these projects, only one test case
+configuration file in YAML format needs to be added. The following section
+explains how to use this file to add a new test case.
+Please refer to `Dovetail test case github
+<https://github.com/opnfv/dovetail/tree/master/etc/testcase>`_
+for the configuration files of all existing test cases.
+
+.. code-block:: yaml
+
+ ---
+ Test case name in Dovetail:
+   name: Test case name in Dovetail
+   objective: Test case description
+   validate:
+     type: 'shell' or name of the project already integrated in Dovetail
+     testcase: The test case name called in this project
+     image_name: Name of the Docker image used to run this test
+     pre_condition:
+       - 'Commands needed to be executed before running this test'
+       - 'e.g. cp src_file dest_file'
+     cmds:
+       - 'Commands used to run this test case'
+     post_condition:
+       - 'Commands needed to be executed after running this test case'
+   report:
+     source_archive_files:
+       - test.log
+     dest_archive_files:
+       - path/to/archive/test.log
+     check_results_file: results.json
+     sub_testcase_list:
+       - sub_test_1
+       - sub_test_2
+       - sub_test_3
+
+This is the complete format of a test case configuration file. Here is a
+detailed description of each configuration option; a concrete example follows
+the list.
+
+- **Test case name in Dovetail**: All test cases should be named as 'xxx.yyy.zzz'.
+ This is the name used within Dovetail and has no relationship with the test
+ case's name in its own project. The first part identifies the project this
+ test case comes from (e.g. functest, onap-vtp). The second part classifies the
+ test case according to its test area (e.g. healthcheck, ha). Dovetail supports
+ running all test cases of a test suite that share the same test area. The area
+ is also used to group the test cases and to generate the summary report at the
+ end of the test. The last part is specific to the test case itself (e.g. image,
+ haproxy, csar). It is better to keep the file name the same as the test case
+ name, so that the config file can easily be found from the test case name used
+ in Dovetail.
+
+- **validate**: This is the main section that defines how to run this test case.
+
+ - **type**: The type of this test case. It can be `shell`, which means the
+ test case is run with Linux bash commands within the Dovetail container, or it
+ can be one of the projects already integrated in Dovetail (functest, yardstick,
+ bottlenecks, onap-vtp and onap-vvp). The type is used to map to the project's
+ configuration yaml file. For example, to add a test case of the OPNFV project
+ Functest to the Dovetail framework, the type should be `functest`, which maps
+ to `functest_config.yml` for further project-level configuration. Please refer
+ to `Dovetail project config github
+ <https://github.com/opnfv/dovetail/tree/master/etc/conf>`_ for more details.
+
+ - **testcase**: This is the name defined in the test case's own project. A test
+ case is uniquely identified by `type` and `testcase`. Take `functest.vping.ssh`
+ as an example: its `type` is 'functest' and its `testcase` is 'vping_ssh', and
+ with these two properties it is uniquely identified. Users only need to know
+ that there is a test case named `functest.vping.ssh` in the OVP compliance test
+ scope; the Dovetail framework will run `vping_ssh` within the Functest Docker
+ container.
+
+ - **image_name**: [optional] If the type is `shell`, there is no need to provide
+ this. For other types, default Docker images are defined in the project
+ configuration files. If this test case uses a different Docker image, the
+ default has to be overwritten by adding `image_name` here. The `image_name`
+ should only be the Docker image name without a tag; the tag is defined in the
+ project's configuration file for all test cases belonging to that project.
+
+ - **pre_condition**: [optional] A list of all preparations needed by this
+ test case. If the list is the same as the default one in its project
+ configuration file, there is no need to repeat it here; otherwise it has to be
+ overwritten. If the type is `shell`, all commands in `pre_condition`, `cmds`
+ and `post_condition` should be executable within the Dovetail Ubuntu 14.04
+ Docker container. If the type is one of the Docker runner projects, all
+ commands should be executable within that project's own container: for Functest
+ it is Alpine 3.8, for Yardstick and Bottlenecks it is Ubuntu 16.04 and for
+ VNF SDK it is Ubuntu 14.04. None of these commands should require a network
+ connection, because some commercial platforms may be offline environments in
+ private labs.
+
+ - **cmds**: [optional] A list of all commands used to run this test case.
+
+ - **post_condition**: [optional] A list of all commands needed after executing
+ this test case, such as clean-up operations.
+
+- **report**: This is the section for this test case to archive some log files and
+ provide the result file for reporting PASS or FAIL.
+
+ - **source_archive_files**: [optional] If there is no need to archive any files,
+ this section can be removed. Otherwise, this is a list of all source files that
+ need to be archived. All files generated by the integrated projects are put
+ under `$DOVETAIL_HOME/results`. In order to classify them and avoid overwriting
+ them, some important files need to be renamed or moved to new directories.
+ Browse the directory `$DOVETAIL_HOME/results` to find all files that need to
+ be archived. The paths here should be relative to `$DOVETAIL_HOME/results`.
+
+ - **dest_archive_files**: [optional] This should be a list corresponding to the
+ list of `source_archive_files`. All paths here should also be relative to
+ `$DOVETAIL_HOME/results`.
+
+ - **check_results_file**: This should be the name and relative path of the result
+ file generated by this test case. Dovetail will parse this file to get the
+ result (PASS or FAIL).
+
+ - **sub_testcase_list**: [optional] This section is used almost exclusively for
+ Tempest tests in Functest. Take `functest.tempest.osinterop` as an example: the
+ `sub_testcase_list` is a check list for this kind of Tempest test, and the test
+ case is only taken as PASS when all sub test cases listed here have passed. The
+ other kind of Tempest test is `tempest_custom`, such as `functest.tempest.image`.
+ Besides serving as the check list, the `sub_testcase_list` is then also used to
+ generate a Functest input file that defines the list of sub test cases to be
+ tested.
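+
+As a concrete illustration of the structure described above, the following
+sketch loads a test case configuration with PyYAML and extracts the fields that
+identify and check the test case. The test case, its field values and the code
+itself are hypothetical and only meant to show how the YAML format fits
+together; this is not Dovetail source code.
+
+.. code-block:: python
+
+ # Illustrative sketch: parse a test case configuration of the format
+ # described above. The test case shown is hypothetical.
+ import yaml
+
+ EXAMPLE_CONFIG = """
+ ---
+ functest.example.check:
+   name: functest.example.check
+   objective: Hypothetical example used only to illustrate the format
+   validate:
+     type: functest
+     testcase: example_check
+   report:
+     check_results_file: functest_results.txt
+     sub_testcase_list:
+       - sub_test_1
+       - sub_test_2
+ """
+
+ config = yaml.safe_load(EXAMPLE_CONFIG)
+
+ # The top-level key is the test case name in Dovetail ('xxx.yyy.zzz').
+ testcase_name, testcase = list(config.items())[0]
+ validate = testcase['validate']
+ report = testcase.get('report', {})
+
+ print(testcase_name)                      # functest.example.check
+ print(validate['type'])                   # maps to functest_config.yml
+ print(validate['testcase'])               # name used in the upstream project
+ print(report.get('check_results_file'))   # file parsed for PASS or FAIL
+ print(report.get('sub_testcase_list'))    # sub test cases that must all pass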
+
+
+Test cases not belonging to integrated projects
+-----------------------------------------------
+
+If the test cases to be added to Dovetail do not belong to any project that is
+already integrated into the Dovetail framework, then besides adding the test
+case configuration files introduced above, some other files need to be added or
+modified.
+
+
+Step 1: Add a project configuration file
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+For a new test case that belongs to a new project, a project configuration file
+defining this new project in Dovetail has to be created first. Currently Dovetail
+only supports integrating projects through their Docker images. If the test case
+should be run with the shell runner, just add a test case configuration file
+with `type` 'shell' as described above and skip the following steps. The rest of
+this section explains how to use the project configuration file to add a new
+project to Dovetail. Please refer to `Dovetail projects configuration github
+<https://github.com/opnfv/dovetail/tree/master/etc/conf>`_ for the configuration
+files of all integrated projects.
+
+.. code-block:: yaml
+
+ ---
+
+ {% set validate_testcase = validate_testcase or '' %}
+ {% set testcase = testcase or '' %}
+ {% set dovetail_home = dovetail_home or '' %}
+ {% set debug = debug or 'false' %}
+ {% set build_tag = build_tag or '' %}
+ {% set userconfig_dir = '/tmp/userconfig' %}
+ {% set patches_dir = '/tmp/patches' %}
+ {% set result_dir = '/tmp/results' %}
+
+ project name:
+   image_name: name of the docker image
+   docker_tag: tag of the docker image
+   opts: options needed such as '-itd'
+   envs: envs used to create containers such as '-e DEBUG={{debug}}'
+   volumes:
+     - '-v {{dovetail_home}}/pre_config:/home/opnfv/pre_config'
+     - '-v {{dovetail_home}}/userconfig:{{userconfig_dir}}'
+     - '-v {{dovetail_home}}/patches:{{patches_dir}}'
+     - '-v {{dovetail_home}}/results:{{result_dir}}'
+   patches_dir: {{patches_dir}}
+   pre_condition:
+     - 'Commands needed to be executed before running this test'
+   cmds:
+     - 'Commands used to run this test case'
+   post_condition:
+     - 'Commands needed to be executed after running this test case'
+   openrc: absolute path of openstack credential files
+   extra_container:
+     - container1_name
+     - container2_name
+
+This is the complete format of a project configuration file. Here is a detailed
+description of each configuration option; a short sketch of how the Jinja
+parameters are rendered follows the list.
+
+- **Jinja Template**: At the beginning of this yaml file, a Jinja template is
+ used to define some parameters that are used elsewhere in the file (e.g.
+ result_dir). Some parameters are provided by the Dovetail framework as input to
+ this file, and other parameters can be defined in terms of them (e.g. testcase
+ and dovetail_home). The full list of input parameters is given below.
+
+ - **validate_testcase**: This is the name of the test case instance which calls this
+ project configuration file. The name is provided by the configuration file
+ of this test case (validate -> testcase).
+
+ - **testcase**: This is the name of the test case which calls this project
+ configuration file. Different from `validate_testcase`, this is the name
+ defined in Dovetail, not in its own project.
+
+ - **os_insecure**: This is only for test cases targeting OpenStack. It is
+ `True` or `False` according to the `env_config.sh` file.
+
+ - **cacert**: This is also only for OpenStack test cases. It is the absolute
+ path of the OpenStack certificate provided in the `env_config.sh` file.
+
+ - **deploy_scenario**: This is the input when running Dovetail with option
+ `--deploy-scenario`.
+
+ - **ram_num**: This is the input when running Dovetail with option
+ `--ram-num`.
+
+ - **dovetail_home**: This is `DOVETAIL_HOME` taken from the environment.
+
+ - **debug**: This is `True` or `False` depending on whether the test cases are
+ run with the option `--debug`.
+
+ - **build_tag**: This is a string that includes the UUID generated by Dovetail.
+
+ - **host_url**: This is only for ONAP VNF SDK, to get the HOST_URL provided
+ in the `env_config.sh` file.
+
+ - **csar_file**: This is also only for ONAP VNF SDK, to get the CSAR_FILE
+ provided in the `env_config.sh` file.
+
+- **project name**: This is the project name defined in Dovetail. For example,
+ the OPNFV Functest project is named 'functest' in Dovetail. This project name
+ is used by the test case configuration files as well as in parts of the
+ Dovetail source code.
+
+- **image_name**: This is the name of the default Docker image for all test cases
+ within this project. Each test case can overwrite it with its own configuration.
+
+- **docker_tag**: This is the tag of the Docker images for all test cases within
+ this project. For each release, a Docker image with a stable and official
+ release version should be used.
+
+- **opts**: Here are all options used to run Docker containers except envs and
+ volume mappings (e.g. '-it --privileged=true').
+
+- **envs**: Here are all envs used to run Docker containers (e.g. '-e ONE=one
+ -e TWO=two').
+
+- **volumes**: A volume mapping list used to run the Docker containers. Every
+ project should at least map `$DOVETAIL_HOME/pre_config` and
+ `$DOVETAIL_HOME/results` on the Test Host into the containers, to make the
+ config files available and to collect all result files.
+
+- **patches_dir**: This is an absolute path of the patches applied to the containers.
+
+- **pre_condition**: A list of all default preparations needed by this project.
+ It can be overwritten by configurations of test cases.
+
+- **cmds**: A list of all default commands used to run all test cases within
+ this project. Also it can be overwritten by configurations of test cases.
+
+- **post_condition**: A list of all default cleaning commands needed by this
+ project.
+
+- **openrc**: [optional] If the system under test is OpenStack, the absolute
+ path must be provided here so that the credential file on the Test Host can be
+ copied into the containers.
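+
+The following sketch shows, under simplified assumptions, how the Jinja
+parameters described above can be substituted before the YAML is parsed. It
+uses the `jinja2` and `yaml` libraries directly; the project name, parameter
+values and template content are hypothetical, and this is not Dovetail's actual
+rendering code.
+
+.. code-block:: python
+
+ # Illustrative sketch: render the Jinja parameters of a (shortened,
+ # hypothetical) project configuration file and then parse the YAML.
+ import jinja2
+ import yaml
+
+ PROJECT_CONFIG_TEMPLATE = """
+ ---
+ {% set testcase = testcase or '' %}
+ {% set dovetail_home = dovetail_home or '' %}
+ {% set debug = debug or 'false' %}
+ {% set result_dir = '/tmp/results' %}
+
+ example_project:
+   image_name: example/test-image
+   docker_tag: 1.0.0
+   opts: '-itd'
+   envs: '-e DEBUG={{debug}}'
+   volumes:
+     - '-v {{dovetail_home}}/pre_config:/home/opnfv/pre_config'
+     - '-v {{dovetail_home}}/results:{{result_dir}}'
+   cmds:
+     - 'run_tests {{testcase}}'
+ """
+
+ # Values that the framework would pass in, e.g. taken from the environment
+ # and from the test case configuration (hypothetical here).
+ rendered = jinja2.Template(PROJECT_CONFIG_TEMPLATE).render(
+     testcase='example_check',
+     dovetail_home='/home/ovp',
+     debug='true')
+
+ project_config = yaml.safe_load(rendered)
+ print(project_config['example_project']['cmds'])  # ['run_tests example_check']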
+
+
+Step 2: Add related classes
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+After adding the project and test case configuration files, some related classes
+also need to be added to the source code. An illustrative sketch of such classes
+follows the list below.
+
+- **Test Case class**: Each project should have its own test case class in
+ `testcase.py` for `TestcaseFactory`.
+
+- **Test Runner class**: Each project should have its own test runner class in
+ `test_runner.py` for `TestRunnerFactory`.
+
+- **Crawler class**: Each project should have its own test results crawler class
+ in `report.py` for `CrawlerFactory`.
+
+- **Checker class**: Each project should have its own test results checker class
+ in `report.py` for `CheckerFactory`.
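+
+The sketch below illustrates what these classes could look like for a
+hypothetical new project called 'exampleproj'. The base classes shown here are
+simplified stand-ins so that the example is self-contained; the real base
+classes in `testcase.py`, `test_runner.py` and `report.py` provide much more
+functionality, so the actual subclasses should follow the existing ones in the
+Dovetail source code.
+
+.. code-block:: python
+
+ # Illustrative stubs for a hypothetical new project 'exampleproj'; the base
+ # classes are simplified stand-ins, not the real Dovetail base classes.
+
+ class Testcase(object):
+     """Stand-in for the Testcase base class in testcase.py."""
+     validate_type = None
+
+
+ class DockerRunner(object):
+     """Stand-in for the DockerRunner base class in test_runner.py."""
+     config_file_name = None
+
+
+ class ExampleprojTestcase(Testcase):
+     """Test case class to be registered with TestcaseFactory."""
+     validate_type = 'exampleproj'
+
+
+ class ExampleprojRunner(DockerRunner):
+     """Docker-based runner to be registered with TestRunnerFactory."""
+     config_file_name = 'exampleproj_config.yml'
+
+
+ class ExampleprojCrawler(object):
+     """Crawler for CrawlerFactory: turn raw results into a unified dict."""
+
+     def crawl(self, testcase, file_path):
+         with open(file_path) as results_file:
+             # The parsing logic depends entirely on the project's own
+             # result format; this check is only a placeholder.
+             criteria = 'PASS' if 'OK' in results_file.read() else 'FAIL'
+         return {'criteria': criteria}
+
+
+ class ExampleprojChecker(object):
+     """Checker for CheckerFactory: decide PASS or FAIL from crawler data."""
+
+     def check(self, testcase, result):
+         testcase.passed = (result.get('criteria') == 'PASS')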
+
+
+Step 3: Create related logs
+^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+If the classes added in Step 2 have a function `create_log`, these functions
+need to be called in `run.py` to initialize the log instances at the very
+beginning, as in the sketch below.
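+
+Below is a minimal sketch of the idea, assuming the new runner class exposes a
+`create_log` classmethod that sets up a logger with Python's standard `logging`
+module. The names are illustrative, not Dovetail's actual API.
+
+.. code-block:: python
+
+ # Illustrative sketch: initialize the loggers of the new classes at the very
+ # beginning, the way run.py does for the existing classes.
+ import logging
+
+
+ class ExampleprojRunner(object):
+     logger = None
+
+     @classmethod
+     def create_log(cls):
+         # Create the logger once, before any test case is executed.
+         cls.logger = logging.getLogger(cls.__name__)
+         cls.logger.addHandler(logging.StreamHandler())
+         cls.logger.setLevel(logging.INFO)
+
+
+ def create_logs():
+     # Called early so that every new class has a usable logger instance.
+     for klass in (ExampleprojRunner,):
+         klass.create_log()
+
+
+ if __name__ == '__main__':
+     create_logs()
+     ExampleprojRunner.logger.info('Logger ready before any test case runs.')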
+
+
+Step 4: Update unit tests
+^^^^^^^^^^^^^^^^^^^^^^^^^
+
+A patch will not be verified by the acceptance check unless it has 100% unit
+test coverage, so the unit tests need to be updated to cover the new code, for
+example with tests like the sketch below.
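+
+As an illustration, a unit test for the hypothetical checker class from the
+Step 2 sketch could look like the following, written with the standard
+`unittest` module; it is only a sketch, not part of the existing Dovetail test
+suite.
+
+.. code-block:: python
+
+ # Illustrative sketch: unit tests for the hypothetical ExampleprojChecker.
+ import unittest
+
+
+ class ExampleprojChecker(object):
+     """Same stand-in checker as in the Step 2 sketch."""
+
+     def check(self, testcase, result):
+         testcase.passed = (result.get('criteria') == 'PASS')
+
+
+ class FakeTestcase(object):
+     passed = False
+
+
+ class ExampleprojCheckerTest(unittest.TestCase):
+
+     def test_check_pass(self):
+         testcase = FakeTestcase()
+         ExampleprojChecker().check(testcase, {'criteria': 'PASS'})
+         self.assertTrue(testcase.passed)
+
+     def test_check_fail(self):
+         testcase = FakeTestcase()
+         ExampleprojChecker().check(testcase, {'criteria': 'FAIL'})
+         self.assertFalse(testcase.passed)
+
+
+ if __name__ == '__main__':
+     unittest.main()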