 docker/Dockerfile.centos7                             |   6
 docs/dovetailtool/dovetail.tool.cli.rst               | 165
 docs/dovetailtool/dovetail.tool.configuration.rst     |  35
 docs/dovetailtool/dovetail.tool.installation.rst      |  91
 docs/dovetailtool/dovetail.tool.template.rst          |  55
   (renamed from docs/dovetailtool/dovetail.tool.configtemplate.rst)
 docs/dovetailtool/index.rst                           |   3
 dovetail/compliance/debug.yml                         |   6
 dovetail/conf/dovetail_config.yml                     |  20
 dovetail/conf/functest_config.yml                     |   9
 dovetail/report.py                                    |  81
 dovetail/testcase.py                                  |  30
 dovetail/testcase/example.tc001.yml                   |   2
 dovetail/testcase/ipv6.tc001.yml                      |   2
 dovetail/testcase/nfvi.tc101.yml                      |   8
 dovetail/testcase/nfvi.tc102.yml                      |   8
 dovetail/utils/dovetail_utils.py                      |  27
 16 files changed, 327 insertions(+), 221 deletions(-)
diff --git a/docker/Dockerfile.centos7 b/docker/Dockerfile.centos7
index acf60282..c423badd 100644
--- a/docker/Dockerfile.centos7
+++ b/docker/Dockerfile.centos7
@@ -6,8 +6,7 @@ ARG BRANCH=master
RUN yum update -y && yum install -y sudo iproute epel-release && \
yum install -y python-pip git docker && \
- sed -ie 's/requiretty/!requiretty/g' /etc/sudoers && \
- pip install pyyaml click jinja2
+ sed -ie 's/requiretty/!requiretty/g' /etc/sudoers
ENV HOME /home/opnfv
ENV REPOS_DIR ${HOME}/dovetail
@@ -15,6 +14,9 @@ WORKDIR /home/opnfv
RUN git config --global http.sslVerify false && \
git clone --depth 1 -b $BRANCH https://git.opnfv.org/dovetail ${REPOS_DIR} && \
+ pip install -U pip && \
+ pip install -r ${REPOS_DIR}/requirements.txt && \
+ pip install -e . && \
mkdir -p ${REPOS_DIR}/results
WORKDIR ${REPOS_DIR}/dovetail
diff --git a/docs/dovetailtool/dovetail.tool.cli.rst b/docs/dovetailtool/dovetail.tool.cli.rst
index bdcf46e4..f13bc289 100644
--- a/docs/dovetailtool/dovetail.tool.cli.rst
+++ b/docs/dovetailtool/dovetail.tool.cli.rst
@@ -3,21 +3,80 @@
.. http://creativecommons.org/licenses/by/4.0
.. (c) OPNFV, Huawei Technologies Co.,Ltd and others.
-Command Line Interface
-======================
+================================
+Dovetail Command-line Interface
+================================
-Dovetail supports modifying some parameters at the run-time by using the command
-line interface (CLI). The parameters can be defined through a config file by
-developers easily and then be used at the run-time.
-The CLI now fits all three kinds of running: directly running the python script,
-running after setup and running within Docker containers.
+Command Line Introduction
+==========================
-Define CLI with config file
----------------------------
+The Dovetail command-line interface provides a method for interacting with
+Dovetail from the console. For help on the ``dovetail`` command, enter:
-For easy to be modified, Dovetail provides ``dovetail/dovetail/conf/cmd_config.yml``
-to define CLI automatically.
+::
+
+ dovetail --help
+
+**dovetail** optional arguments:
+
+::
+
+ --version
+ show program's version number and exit
+ list <testsuite_name>
+ list the testsuite details
+ show <testcase_name>
+ show the testcase details
+ run <arguments>
+ run the testcases
+
+For **dovetail list**, if the ``<testsuite_name>`` is omitted,
+all the testsuites defined under the ``/dovetail/compliance`` directory
+and their related testcases will be listed; otherwise,
+the test areas and test cases defined in ``<testsuite_name>`` are listed.
+
+For **dovetail show**, the ``<testcase_name>`` is required; the contents
+defined in ``<testcase_name>.yml`` are shown.
+
+For **dovetail run**, by running ``dovetail run --help``, the ``dovetail run``
+usage is shown as:
+
++------------------------+-----------------------------------------------------+
+|Options | |
++========================+=====================================================+
+| -t, --SUT_TYPE |Installer type of the system under test (SUT). |
++------------------------+-----------------------------------------------------+
+| --creds |Openstack credential file location |
++------------------------+-----------------------------------------------------+
+| -i, --SUT_IP |IP of the system under test (SUT). |
++------------------------+-----------------------------------------------------+
+| -d, --debug |True for showing debug log on screen. |
++------------------------+-----------------------------------------------------+
+| -f, --func_tag |Overwrite tag for functest docker container (e.g. |
+| |stable or latest) |
++------------------------+-----------------------------------------------------+
+| -y, --yard_tag |Overwrite tag for yardstick docker container (e.g. |
+| |stable or latest) |
++------------------------+-----------------------------------------------------+
+| --testarea |compliance testarea within testsuite |
++------------------------+-----------------------------------------------------+
+| --testsuite |compliance testsuite. |
++------------------------+-----------------------------------------------------+
+| -h, --help |Show this message and exit. |
++------------------------+-----------------------------------------------------+
+
+
+If no arguments are given, the default testsuite will be performed, i.e., the ``compliance_set``
+testsuite with default configurations.
+
+For more information about the **dovetail** command-line interface, please refer to the wiki page [3]_.
+
+Parameter definition with a config file
+========================================
+
+The default **dovetail run** parameters can be modified in
+``dovetail/dovetail/conf/cmd_config.yml``, which is shown as:
::
@@ -71,40 +130,14 @@ to define CLI automatically.
default: 'full'
help: 'compliance testarea within testsuite'
-Dovetail uses click module in python to parse parameters defined in the above
-config file. The basic config file shown above contains two subsections:
-**arguments** and **options** corresponding to two types of parameters in click.
-
-Add options
-+++++++++++
-
-Just as the name suggested, option parameters can either be given or not by users
-after adding into CLI.
-
-Then how to add an option for developers?
-
-For each option, it at least needs a key **flags** to give its name. Customarily,
-each option has two names, full name and short name, and they are begin with '--'
-and '-' respectively. All other keys should be consistent with click's keys.
-
-Take option **scenario** as the example. Its full name is '--scenario', and its
-short name is '-s'. Actually full name is necessary but short name is optional.
-The full name '--scenario' should be the same with the block's name **scenario**.
-**default** section gives the default value of this option if it doesn't given
-by users. Without the **default** section, it will be set None. **help** section
-offers its help message that will be shown when excute -h/--help command. For
-more information about click, please refer to: http://click.pocoo.org/5/
-
-Add arguments
-+++++++++++++
+The Click module is used to parse the parameters defined in the above config
+file. The file contains two subsections, ``arguments`` and ``options``,
+which correspond to the two types of parameters in Click.
-Arguments must given orderly by users once they are defined in the config file.
-The Dovetail tool doesn't need any argument parameters currently. However, here
-just give a simple example for its format.
-
-Arguments also need subsection **flags** to give its name. Each argument can just
-have one name, and the name should be the same with the key of this section. Other
-keys should also be consistent with the click module.
+Arguments and Options
++++++++++++++++++++++
+Only ``options`` is currently used, which means parameters can be given in
+any order, or omitted.
Config and control
++++++++++++++++++
@@ -131,50 +164,10 @@ the configs will be changed into
-e NODE_NAME=dovetail-pod -e DEPLOY_SCENARIO=ha_nosdn
-e BUILD_TAG=dovetail -e CI_DEBUG=true -e DEPLOY_TYPE=baremetal'
-The config options/arguments can be added or deleted just by modifying
+The config options/arguments can be added or deleted by modifying
``cmd_config.yml`` rather than changing the source code. However, for control
command, besides adding it into ``cmd_config.yml``, some other operations about
the source code are also needed.
-Run with CLI
-------------
-
-For users, they can use CLI to input their own envs at the run-time instead of
-modifying the config files of functest or yardstick. So Dovetail can supports
-different environments more flexible with CLI. Dovetail now can be run with three
-methods, directly running ``run.py`` script, running after setup and running
-in Docker containers. The uses of CLI are almost the same for these three methods
-and here take the first one as the example.
-
-All parameters offered by Dovetail can be listed by using help option ``--help``.
-
-::
-
- root@90256c4efd05:~/dovetail/dovetail$ python run.py --help
- Usage: run.py [OPTIONS]
-
- Dovetail compliance test entry!
-
- Options:
- -t, --SUT_TYPE TEXT Installer type of the system under test (SUT).
- -f, --func_tag TEXT Overwrite tag for functest docker container (e.g.
- stable or latest)
- -i, --SUT_IP TEXT IP of the system under test (SUT).
- -y, --yard_tag TEXT Overwrite tag for yardstick docker container (e.g.
- stable or latest)
- -d, --DEBUG TEXT DEBUG for showing debug log.
- --testarea TEXT compliance testarea within testsuite
- --testsuite TEXT compliance testsuite.
- -h, --help Show this message and exit.
-
-All options listed can be used to input special environment values at the run-time.
-For example:
-
-::
-
- python run.py --SUT_TYPE compass -y stable
-There is no need to give all these options. If it is not given by CLI, it will
-be set with the system's environment value. If it is not included in system's
-environment variables, it will be set with the default value in functest/yardstick
-config file.
+.. [3] https://wiki.opnfv.org/display/dovetail/Dovetail+Command+Line
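The config-driven CLI described above can be sketched as follows. This is a hypothetical illustration, using the stdlib ``argparse`` in place of Click, of how option definitions read from ``cmd_config.yml`` might be turned into command-line options; the config fragment and function names are assumptions, not Dovetail's actual code:

```python
import argparse

# Hypothetical fragment of a parsed cmd_config.yml 'options' subsection;
# each entry carries the keys described above: flags, default, help.
CMD_CONFIG = {
    'options': {
        'SUT_TYPE': {'flags': ['-t', '--SUT_TYPE'], 'default': None,
                     'help': 'Installer type of the system under test (SUT).'},
        'testsuite': {'flags': ['--testsuite'], 'default': 'compliance_set',
                      'help': 'compliance testsuite.'},
    }
}


def build_parser(config):
    """Create one optional argument per entry in the 'options' subsection."""
    parser = argparse.ArgumentParser(prog='dovetail run')
    for name, opt in config['options'].items():
        parser.add_argument(*opt['flags'], dest=name,
                            default=opt.get('default'),
                            help=opt.get('help', ''))
    return parser


args = build_parser(CMD_CONFIG).parse_args(['-t', 'compass'])
print(args.SUT_TYPE, args.testsuite)  # compass compliance_set
```

Adding or removing an option then only requires editing the config data, not the parser-building code, which is the point of the mechanism described above.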
diff --git a/docs/dovetailtool/dovetail.tool.configuration.rst b/docs/dovetailtool/dovetail.tool.configuration.rst
deleted file mode 100644
index 8e97e73c..00000000
--- a/docs/dovetailtool/dovetail.tool.configuration.rst
+++ /dev/null
@@ -1,35 +0,0 @@
-.. This work is licensed under a Creative Commons Attribution 4.0 International
-.. License.
-.. http://creativecommons.org/licenses/by/4.0
-.. (c) OPNFV, Huawei Technologies Co.,Ltd and others.
-
-=========================
-Testcase Template Syntax
-=========================
-
-The testcases used for compliance and certification are defined in the ``dovetail/testcase`` directory,
-which are defined in yaml format. Take the testcase ``ipv6.tc001.yml`` as an example, it is shown as:
-
-::
-
- dovetail.ipv6.tc001:
- name: dovetail.ipv6.tc001
- objective: VIM ipv6 operations, to create/delete network, port and subnet in bulk operation
- scripts:
- type: functest
- testcase: tempest_smoke_serial
- sub_testcase_list:
- - tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_network
- - tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_port
- - tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_subnet
-
-- At least three sections named 'name', 'objective', 'scripts' must be included
-- Section 'name' distinguishes different test cases used for compliance and certification,
- and should start with ``dovetail.``
-- Section 'objective' describes what this testcase does
-- Section 'scripts' has subsections such as 'type', 'testcase' and 'sub_testcase_list'
-- Two kinds of 'type' is supported by now, functest and yardstick
-- For functest, the 'testcase' represents the testcases in slicing/tier,
- 'sub_testcase_list' represents the testcases in this slicing compliance and certification will use.
- For yardstick, since it is not sliced by now, 'sub_testcase_list' is not needed, only to edit the 'testcase' part
- such as ``yardstick_tc027``
diff --git a/docs/dovetailtool/dovetail.tool.installation.rst b/docs/dovetailtool/dovetail.tool.installation.rst
index 73a66cd1..9d3bc39f 100644
--- a/docs/dovetailtool/dovetail.tool.installation.rst
+++ b/docs/dovetailtool/dovetail.tool.installation.rst
@@ -19,13 +19,30 @@ running on the SUT (System Under Test):
::
- SUT_TYPE, SUT type, e.g., apex, compass, fuel, joid, etc
- SUT_IP, SUT external network IP, e.g., 192.168.200.2
- NODE_NAME, this can be shown in the test result for users to see which pod the dovetail tool runs
- DEPLOY_SCENARIO, deployment scenario, e.g., os-nosdn-nofeature-ha
- BUILD_TAG, this can be shown in the test result for users to identify logs
- CI_DEBUG, true for debug information printed and false for not printed
- DEPLOY_TYPE, baremetal or virtual
+ SUT_TYPE
+ SUT type, e.g., apex, compass, fuel, joid, etc
+ SUT_IP
+ SUT external network IP, e.g., 192.168.200.2
+ NODE_NAME
+ this can be shown in the test result for users to see which pod the dovetail tool runs
+ DEPLOY_SCENARIO
+ deployment scenario, e.g., os-nosdn-nofeature-ha
+ BUILD_TAG
+ this can be shown in the test result for users to identify logs
+ CI_DEBUG
+ true for debug information printed and false for not printed
+ DEPLOY_TYPE
+ baremetal or virtual
+
+The above configuration can be achieved by
+
+- modifying the environment variables in the files under the ``/dovetail/conf/`` directory
+- setting Linux environment variables with the ``export`` command
+- passing these variables on the ``dovetail run`` command line; for details see the
+  `Dovetail Command-line Interface`_ section
+- providing the OpenStack credential file, which can be achieved by using
+  ``dovetail run --creds </path/creds>``
+
Dovetail tool installation on local Linux host environment
##########################################################
@@ -89,6 +106,7 @@ by running:
::
+ pip install tox
tox
Compliance and certification test cases
@@ -114,33 +132,33 @@ After environment preparation is complete and test cases added, the Dovetail too
::
- python run.py --testsuite compliance_set
+ dovetail run --testsuite compliance_set
-The value ``compliance_set`` passed to the ``testsuite`` flag can be replaced with the test cases yaml file.
-If not argument is given, the compliance_set testsuite will be run as the default.
+The value ``compliance_set`` passed to the ``testsuite`` flag can be replaced
+with the name of the testsuite yaml file that is to be run.
+If no argument is given, the compliance_set testsuite will be run as the default.
Moreover, the testcases in given testarea can be run with ``testarea`` command line argument, such as
testarea ``ipv6`` in ``compliance_set``
::
- python run.py --testsuite compliance_set --testarea ipv6
+ dovetail run --testsuite compliance_set --testarea ipv6
Dovetail provides several testsuites: ``debug``, ``proposed_tests`` and ``compliance_set``.
``debug`` is used for local and Continuous Integration (CI) development purposes
and provides typical testcase examples; feel free to edit it when developing locally,
for example to run only a testcase that takes minutes. ``proposed_tests`` is the testcase
-candidate which mainly comes from the wiki link
-https://wiki.opnfv.org/display/dovetail/Dovetail+Test+Areas+and+Test+Cases.
+candidate which mainly comes from the wiki link [1]_.
``compliance_set`` is used for compliance. Moreover, dovetail tool can be easily
extended to support more complicated compliance requirements,
such as feature set based or scenario based compliance.
-If you want to run ``debug``, just run with
+If you want to run the ``debug`` testsuite, just run with
::
- python run.py --testsuite debug
+ dovetail run --testsuite debug
Running Dovetail in a Docker container
########################################
@@ -165,7 +183,8 @@ Docker image.
::
- sudo docker build -t <your_new_image_name> -f <your_Dockerfile> .
+ cd {dovetail_path}/dovetail/docker
+ docker build --no-cache -t opnfv/dovetail:<Tag> --build-arg BRANCH=master .
Dovetail Docker container creation
----------------------------------
@@ -178,13 +197,15 @@ Next, create the ``dovetail-docker-env`` file to define the environment paramete
DEPLOY_SCENARIO=ha-nosdn
CI_DEBUG=true
+Alternatively, an OpenStack credential file can be provided instead of the environment file.
+
Then to instantiate the Dovetail Docker container, execute::
sudo docker run --privileged=true --rm -t \
- --env-file dovetail-docker-env \
+ --env-file dovetail-docker-env OR </path/creds> \
-v /home/opnfv/dovetail/results:/home/opnfv/dovetail/results \
-v /var/run/docker.sock:/var/run/docker.sock \
- --name <Dovetail_Container_Name> \
+ --name <Dovetail_Container_Name> (optional) \
opnfv/dovetail:<Tag> /bin/bash
To attach dovetail container and Running test cases
@@ -192,16 +213,27 @@ To attach dovetail container and Running test cases
Before connecting to the container, you can check the container status by running ::
- docker ps -a
+ sudo docker ps -a
Attach to the container by starting it and obtaining a bash prompt with ::
- docker exec -it <Dovetail_Container_Name> bash
+ sudo docker exec -it <Dovetail_Container_Name>/<Container_Id> bash
+
+Inside the container the following commands can be executed to trigger the testing ::
+
+ dovetail run --testsuite compliance_set
-Inside the container the following commands can be executed to trigger the testcases ::
+Offline Support
+################
- cd /home/opnfv/dovetail/dovetail
- python run.py --testsuite compliance_set
+Some SUTs are isolated from the public internet, so offline support is needed.
+The idea is to provide all packages of a Dovetail release at
+http://artifacts.opnfv.org, so that users can download them and transfer them
+to their internal development environment.
+
+The packages are listed in [2]_.
+
+TO DO: introduce more details when the offline support is mature enough.
Results Output
###############
@@ -212,11 +244,10 @@ The compliance report is stored in ``/home/opnfv/dovetail/results/dovetail_repor
Dovetail Version and Release
############################
-Dovetail version tag is shown in ``setup.cfg``, which will also shown in the
-``dovetail report``. At the time of version release, just to set the version value in
-``setup.cfg``.
+Dovetail version information is defined in ``setup.cfg``.
+At the time of release, it is the dovetail team's responsibility to set
+the ``version`` value in ``setup.cfg``.
+
-# TO DO: (which should be discussed)
-1)how to pubish version, such as both the online and offline package in some website
-or somewhere.
-2)provide version download address, userguide, etc.
+.. [1] https://wiki.opnfv.org/display/dovetail/Dovetail+Test+Areas+and+Test+Cases
+.. [2] http://artifacts.opnfv.org/dovetail.html
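The precedence among the configuration methods listed above (a command-line value wins over an exported environment variable, which in turn wins over the functest/yardstick config-file default) can be sketched as follows; ``resolve`` is an illustrative helper, not part of Dovetail:

```python
import os


def resolve(name, cli_value, config_default):
    """CLI value first, then the exported environment variable,
    then the default from the functest/yardstick config file."""
    if cli_value is not None:
        return cli_value
    return os.environ.get(name, config_default)


os.environ['SUT_TYPE'] = 'compass'
os.environ.pop('DEPLOY_TYPE', None)               # make sure it is unset
print(resolve('SUT_TYPE', 'apex', 'fuel'))        # apex (CLI wins)
print(resolve('SUT_TYPE', None, 'fuel'))          # compass (env wins)
print(resolve('DEPLOY_TYPE', None, 'baremetal'))  # baremetal (default)
```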
diff --git a/docs/dovetailtool/dovetail.tool.configtemplate.rst b/docs/dovetailtool/dovetail.tool.template.rst
index 9c0748a9..1a483dd9 100644
--- a/docs/dovetailtool/dovetail.tool.configtemplate.rst
+++ b/docs/dovetailtool/dovetail.tool.template.rst
@@ -3,9 +3,58 @@
.. http://creativecommons.org/licenses/by/4.0
.. (c) OPNFV, Huawei Technologies Co.,Ltd and others.
-======================
+==================
+Template Syntax
+==================
+
+Testcase Template Syntax
+=========================
+
+The testcases used for compliance and certification are defined in the
+``dovetail/testcase`` directory, which are written in yaml format.
+Take the testcase ``ipv6.tc001.yml`` as an example. It is shown as:
+
+::
+
+ ---
+ dovetail.ipv6.tc001:
+ name: dovetail.ipv6.tc001
+ objective: Bulk creation and deletion of IPv6 networks, ports and subnets
+ validate:
+ type: functest
+ testcase: tempest_smoke_serial
+ pre_condition:
+ - 'echo test for precondition in testcase'
+ cmds:
+ - 'functest env prepare'
+ - 'functest testcase run {{validate_testcase}}'
+ post_condition:
+ - 'echo test for precondition in testcase'
+ report:
+ sub_testcase_list:
+ - tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_network
+ - tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_port
+ - tempest.api.network.test_networks.BulkNetworkOpsIpV6Test.test_bulk_create_delete_subnet
+
+
+- At least four sections, named ``name``, ``objective``, ``validate`` and
+  ``report``, must be included
+- Section ``name`` distinguishes the different test cases used for compliance;
+  it is composed of three parts: the ``dovetail.`` prefix, the test area the
+  case belongs to, and a serial number
+- Section ``objective`` briefly describes what this testcase does
+- Section ``validate`` defines the scripts and configurations for the
+  validation of the test case. ``type`` defines which method is used to
+  validate; three types, i.e., functest, yardstick and shell, are currently
+  supported. ``testcase`` represents the testcase in the slicing/tier.
+- Section ``report`` defines the sub_testcases to be run.
+  For ``yardstick``, since it is not sliced by now, ``sub_testcase_list`` is
+  not needed; only the ``testcase`` part in section ``validate`` needs to be
+  edited, such as ``yardstick_tc027``
+
+
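A quick way to sanity-check a testcase definition against the required sections listed above. This is a minimal sketch; the parsed dict stands in for what ``yaml.safe_load`` would return for the example file, and the helper name is an assumption:

```python
REQUIRED_SECTIONS = ('name', 'objective', 'validate', 'report')


def missing_sections(testcase):
    """Return the required sections absent from one testcase definition
    (the mapping under a key such as 'dovetail.ipv6.tc001')."""
    return [s for s in REQUIRED_SECTIONS if s not in testcase]


# Abridged version of the ipv6.tc001 example above, as a parsed dict.
tc = {
    'name': 'dovetail.ipv6.tc001',
    'objective': 'Bulk creation and deletion of IPv6 networks, ports and subnets',
    'validate': {'type': 'functest', 'testcase': 'tempest_smoke_serial'},
    'report': {'sub_testcase_list': []},
}
print(missing_sections(tc))              # []
print(missing_sections({'name': 'x'}))   # ['objective', 'validate', 'report']
```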
Config Template Syntax
-======================
+=======================
For Dovetail tool, the config files are located in ``dovetail/dovetail/conf``, which are written
in yaml format. As both functest and yardstick are utilized by Dovetail, their configuration files
@@ -15,7 +64,7 @@ respectively.
Functest config template syntax
-------------------------------
-An example functest configuration is shown as follows:
+An example of ``functest`` configuration is shown as follows:
::
diff --git a/docs/dovetailtool/index.rst b/docs/dovetailtool/index.rst
index 676c72db..b095704a 100644
--- a/docs/dovetailtool/index.rst
+++ b/docs/dovetailtool/index.rst
@@ -12,6 +12,5 @@ Dovetail Overview
dovetail.tool.overview.rst
dovetail.tool.installation.rst
- dovetail.tool.configuration.rst
- dovetail.tool.configtemplate.rst
+ dovetail.tool.template.rst
dovetail.tool.cli.rst
diff --git a/dovetail/compliance/debug.yml b/dovetail/compliance/debug.yml
index 6fc4f0f1..87003c47 100644
--- a/dovetail/compliance/debug.yml
+++ b/dovetail/compliance/debug.yml
@@ -6,6 +6,10 @@ debug:
name: debug
testcases_list:
- dovetail.example.tc002
- - dovetail.ipv6.tc001
+ - dovetail.ipv6.tc008
+ - dovetail.ipv6.tc009
+ - dovetail.ipv6.tc018
+ - dovetail.ipv6.tc019
- dovetail.nfvi.tc001
- dovetail.nfvi.tc002
+ - dovetail.nfvi.tc101
diff --git a/dovetail/conf/dovetail_config.yml b/dovetail/conf/dovetail_config.yml
index b6f7b016..f8f18e46 100644
--- a/dovetail/conf/dovetail_config.yml
+++ b/dovetail/conf/dovetail_config.yml
@@ -25,6 +25,20 @@ testarea_supported:
- ipv6
- example
+functest_testsuite:
+ - tempest_smoke_serial
+ - tempest_full_parallel
+ - rally_sanity
+ - promise
+
+functest_testcase:
+ - healthcheck
+ - vping_ssh
+ - vping_userdata
+ - doctor
+ - copper
+ - cloudify_ims
+
# used for testcase cmd template in jinja2 format
# we have two variables available now
# parameter path, use this path to walk through python object and get value
@@ -53,3 +67,9 @@ validate_input:
valid_docker_tag:
- 'stable'
- 'latest'
+ - 'colorado.1.0'
+ - 'colorado.2.0'
+ - 'colorado.3.0'
+ - 'danube.1.0'
+ - 'danube.2.0'
+ - 'danube.3.0'
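The extended ``valid_docker_tag`` list above gates the ``-f``/``-y`` tag options. A minimal sketch of that input check (the function name is assumed, not Dovetail's actual code):

```python
# Supported functest/yardstick docker tags, mirroring valid_docker_tag above.
VALID_DOCKER_TAGS = [
    'stable', 'latest',
    'colorado.1.0', 'colorado.2.0', 'colorado.3.0',
    'danube.1.0', 'danube.2.0', 'danube.3.0',
]


def validate_docker_tag(tag):
    """Reject tags that are not in the supported list."""
    if tag not in VALID_DOCKER_TAGS:
        raise ValueError('unsupported docker tag: %s' % tag)
    return tag


print(validate_docker_tag('stable'))  # stable
```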
diff --git a/dovetail/conf/functest_config.yml b/dovetail/conf/functest_config.yml
index f20d1b7e..2c702cdb 100644
--- a/dovetail/conf/functest_config.yml
+++ b/dovetail/conf/functest_config.yml
@@ -4,18 +4,21 @@ functest:
docker_tag: latest
envs: '-e INSTALLER_TYPE=compass -e INSTALLER_IP=192.168.200.2
-e NODE_NAME=dovetail-pod -e DEPLOY_SCENARIO=ha_nosdn
- -e BUILD_TAG=dovetail -e CI_DEBUG=true -e DEPLOY_TYPE=baremetal'
+ -e BUILD_TAG=dovetail -e CI_DEBUG=true
+ -e DEPLOY_TYPE=baremetal
+ -e RESULTS_STORE=file:///home/opnfv/functest/results/functest_result.json'
opts: '-id --privileged=true'
pre_condition:
- 'echo test for precondition in functest'
cmds:
- 'functest env prepare'
- - 'functest testcase run {{validate_testcase}}'
+ - 'functest testcase run {{validate_testcase}} -r'
post_condition:
- 'echo test for postcondition in functest'
result:
dir: '/home/opnfv/functest/results'
store_type: 'file'
- file_path: 'tempest/tempest.log'
+ file_path: 'functest_result.json'
+ tp_path: 'tempest/tempest.log'
db_url: 'http://testresults.opnfv.org/test/api/v1/results?case=%s&last=1'
creds: '/home/opnfv/functest/conf/openstack.creds'
diff --git a/dovetail/report.py b/dovetail/report.py
index 11e3c244..b7b27930 100644
--- a/dovetail/report.py
+++ b/dovetail/report.py
@@ -214,23 +214,44 @@ class FunctestCrawler(object):
def crawl_from_file(self, testcase=None):
dovetail_config = dt_cfg.dovetail_config
- file_path = \
- os.path.join(dovetail_config['result_dir'],
- dovetail_config[self.type]['result']['file_path'])
- if not os.path.exists(file_path):
- self.logger.info('result file not found: %s', file_path)
- return None
-
- try:
+ criteria = 'FAIL'
+ timestart = 0
+ testcase_duration = 0
+ testcase_name = testcase.validate_testcase()
+ json_results = {}
+ if testcase_name in dt_cfg.dovetail_config['functest_testcase']:
+ file_path = \
+ os.path.join(dovetail_config['result_dir'],
+ dovetail_config[self.type]['result']['file_path'])
+ if not os.path.exists(file_path):
+ self.logger.info('result file not found: %s', file_path)
+ return None
+ with open(file_path, 'r') as f:
+ for jsonfile in f:
+ data = json.loads(jsonfile)
+ if testcase_name == data['case_name']:
+ criteria = data['details']['status']
+ timestart = data['details']['timestart']
+ testcase_duration = data['details']['duration']
+
+ json_results = {'criteria': criteria,
+ 'details': {"timestart": timestart,
+ "duration": testcase_duration,
+ "tests": '', "failures": ''}}
+ elif 'tempest' in testcase_name:
+ file_path = \
+ os.path.join(dovetail_config['result_dir'],
+ dovetail_config[self.type]['result']['tp_path'])
+ if not os.path.exists(file_path):
+ self.logger.info('result file not found: %s', file_path)
+ return None
with open(file_path, 'r') as myfile:
output = myfile.read()
- error_logs = ""
- for match in re.findall('(.*?)[. ]*FAILED', output):
- error_logs += match
+ error_logs = " ".join(re.findall('(.*?)[. ]*fail ', output))
+ skipped = " ".join(re.findall('(.*?)[. ]*skip:', output))
- criteria = 'PASS'
- failed_num = int(re.findall(' - Failed: (\d*)', output)[0])
+            criteria = 'PASS'
+            failed_num = int(re.findall(' - Failures: (\d*)', output)[0])
if failed_num != 0:
criteria = 'FAIL'
@@ -239,13 +260,11 @@ class FunctestCrawler(object):
json_results = {'criteria': criteria, 'details': {"timestart": '',
"duration": int(dur_sec_int),
"tests": int(num_tests), "failures": failed_num,
- "errors": error_logs}}
- self.logger.debug('Results: %s', str(json_results))
- return json_results
- except Exception as e:
- self.logger.error('Cannot read content from the file: %s, '
- 'exception: %s', file_path, e)
- return None
+ "errors": error_logs,
+ "skipped": skipped}}
+
+ self.logger.debug('Results: %s', str(json_results))
+ return json_results
def crawl_from_url(self, testcase=None):
url = \
@@ -289,17 +308,15 @@ class YardstickCrawler(object):
if not os.path.exists(file_path):
self.logger.info('result file not found: %s', file_path)
return None
- try:
- with open(file_path, 'r') as myfile:
- myfile.read()
- criteria = 'PASS'
- json_results = {'criteria': criteria}
- self.logger.debug('Results: %s', str(json_results))
- return json_results
- except Exception as e:
- self.logger.error('Cannot read content from the file: %s, '
- 'exception: %s', file_path, e)
- return None
+ criteria = 'FAIL'
+ with open(file_path, 'r') as f:
+ for jsonfile in f:
+ data = json.loads(jsonfile)
+ if 1 == data['status']:
+ criteria = 'PASS'
+ json_results = {'criteria': criteria}
+ self.logger.debug('Results: %s', str(json_results))
+ return json_results
def crawl_from_url(self, testcase=None):
return None
@@ -378,6 +395,8 @@ class FunctestChecker(object):
all_passed = True
for sub_testcase in sub_testcase_list:
self.logger.debug('check sub_testcase:%s', sub_testcase)
+            # TO DO: decide how skipped sub testcases should be handled,
+            # together with the "dovetail report" output
if sub_testcase in db_result['details']['errors']:
testcase.sub_testcase_passed(sub_testcase, False)
all_passed = False
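The new file-crawling logic for non-tempest functest cases can be sketched as follows: the result store is a file with one JSON object per line, and the crawler picks out the entry whose ``case_name`` matches, defaulting to ``FAIL`` when none is found. A simplified sketch using an in-memory stream in place of ``functest_result.json``:

```python
import io
import json


def crawl_functest_result(stream, testcase_name):
    """Scan a one-JSON-object-per-line result stream for testcase_name."""
    criteria, timestart, duration = 'FAIL', 0, 0
    for line in stream:
        data = json.loads(line)
        if data['case_name'] == testcase_name:
            criteria = data['details']['status']
            timestart = data['details']['timestart']
            duration = data['details']['duration']
    return {'criteria': criteria,
            'details': {'timestart': timestart, 'duration': duration,
                        'tests': '', 'failures': ''}}


sample = io.StringIO(
    '{"case_name": "healthcheck", "details":'
    ' {"status": "PASS", "timestart": 0, "duration": 12}}\n'
    '{"case_name": "vping_ssh", "details":'
    ' {"status": "PASS", "timestart": 5, "duration": 42}}\n')
print(crawl_functest_result(sample, 'vping_ssh'))
```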
diff --git a/dovetail/testcase.py b/dovetail/testcase.py
index 5ca23c4b..4ad2b361 100644
--- a/dovetail/testcase.py
+++ b/dovetail/testcase.py
@@ -105,24 +105,28 @@ class Testcase(object):
def pre_condition(self):
try:
pre_condition = self.testcase['validate']['pre_condition']
- if pre_condition == '':
- pre_condition = self.pre_condition_cls(self.validate_type())
+ except KeyError:
+ pre_condition = ''
+ if pre_condition:
return pre_condition
- except:
+ pre_condition = self.pre_condition_cls(self.validate_type())
+ if not pre_condition:
self.logger.debug('testcase:%s pre_condition is empty',
self.name())
- return ''
+ return pre_condition
def post_condition(self):
try:
post_condition = self.testcase['validate']['post_condition']
- if post_condition == '':
- post_condition = self.post_condition_cls(self.validate_type())
+ except KeyError:
+ post_condition = ''
+ if post_condition:
return post_condition
- except:
+ post_condition = self.post_condition_cls(self.validate_type())
+ if not post_condition:
self.logger.debug('testcae:%s post_condition is empty',
self.name())
- return ''
+ return post_condition
def run(self):
runner = TestRunnerFactory.create(self)
@@ -151,11 +155,17 @@ class Testcase(object):
@staticmethod
def pre_condition_cls(validate_type):
- return dt_cfg.dovetail_config[validate_type]['pre_condition']
+ try:
+ return dt_cfg.dovetail_config[validate_type]['pre_condition']
+ except KeyError:
+ return None
@staticmethod
def post_condition_cls(validate_type):
- return dt_cfg.dovetail_config[validate_type]['post_condition']
+ try:
+ return dt_cfg.dovetail_config[validate_type]['post_condition']
+ except KeyError:
+ return None
@classmethod
def update_validate_testcase(cls, testcase_name):
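The reworked lookup order in ``pre_condition()``/``post_condition()`` above is: the testcase-level value wins, otherwise fall back to the type-level default from the config, otherwise an empty value. The pattern can be sketched independently of the class (function and variable names here are illustrative, not Dovetail's):

```python
def resolve_condition(testcase, dovetail_config, validate_type,
                      key='pre_condition'):
    """Testcase-level value wins; otherwise fall back to the type-level
    default in the config; a missing or empty value becomes ''."""
    try:
        value = testcase['validate'][key]
    except KeyError:
        value = ''
    if value:
        return value
    try:
        return dovetail_config[validate_type][key] or ''
    except KeyError:
        return ''


cfg = {'functest': {'pre_condition': ['functest env prepare']}}
tc = {'validate': {'type': 'functest'}}           # no per-testcase override
print(resolve_condition(tc, cfg, 'functest'))     # ['functest env prepare']
tc2 = {'validate': {'pre_condition': ['echo hi']}}
print(resolve_condition(tc2, cfg, 'functest'))    # ['echo hi']
```

Catching only ``KeyError`` (instead of a bare ``except``) is what distinguishes "key not present" from genuine errors, which the rewritten methods above also do.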
diff --git a/dovetail/testcase/example.tc001.yml b/dovetail/testcase/example.tc001.yml
index e389a00f..0ba297a8 100644
--- a/dovetail/testcase/example.tc001.yml
+++ b/dovetail/testcase/example.tc001.yml
@@ -9,7 +9,7 @@ dovetail.example.tc001:
- 'echo test for precondition'
cmds:
- 'functest env prepare'
- - 'functest testcase run {{validate_testcase}}'
+ - 'functest testcase run {{validate_testcase}} -r'
post_condition:
- 'echo test for precondition'
report:
diff --git a/dovetail/testcase/ipv6.tc001.yml b/dovetail/testcase/ipv6.tc001.yml
index f9edf069..598e1cad 100644
--- a/dovetail/testcase/ipv6.tc001.yml
+++ b/dovetail/testcase/ipv6.tc001.yml
@@ -9,7 +9,7 @@ dovetail.ipv6.tc001:
- 'echo test for precondition in testcase'
cmds:
- 'functest env prepare'
- - 'functest testcase run {{validate_testcase}}'
+ - 'functest testcase run {{validate_testcase}} -r'
post_condition:
- 'echo test for precondition in testcase'
report:
diff --git a/dovetail/testcase/nfvi.tc101.yml b/dovetail/testcase/nfvi.tc101.yml
new file mode 100644
index 00000000..7c8fb3ec
--- /dev/null
+++ b/dovetail/testcase/nfvi.tc101.yml
@@ -0,0 +1,8 @@
+dovetail.nfvi.tc101:
+ name: dovetail.nfvi.tc101
+ objective: measure number of cores and threads, available memory size and cache size
+ validate:
+ type: yardstick
+ testcase: opnfv_yardstick_tc001
+ report:
+ sub_testcase_list:
diff --git a/dovetail/testcase/nfvi.tc102.yml b/dovetail/testcase/nfvi.tc102.yml
new file mode 100644
index 00000000..7ce0435e
--- /dev/null
+++ b/dovetail/testcase/nfvi.tc102.yml
@@ -0,0 +1,8 @@
+dovetail.nfvi.tc102:
+ name: dovetail.nfvi.tc102
+ objective: measure number of cores and threads, available memory size and cache size
+ validate:
+ type: yardstick
+ testcase: opnfv_yardstick_tc002
+ report:
+ sub_testcase_list:
diff --git a/dovetail/utils/dovetail_utils.py b/dovetail/utils/dovetail_utils.py
index 960801a8..a54081f5 100644
--- a/dovetail/utils/dovetail_utils.py
+++ b/dovetail/utils/dovetail_utils.py
@@ -10,7 +10,6 @@
#
import sys
-import time
import subprocess
from collections import Mapping, Set, Sequence
@@ -43,25 +42,21 @@ def exec_cmd(cmd, logger=None, exit_on_error=False, info=False,
exec_log(verbose, logger, msg_exec, level)
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
- stderr=subprocess.PIPE)
- seconds = 0
- while p.poll() is None:
- seconds += 1
- if seconds > 3:
- show_progress_bar(seconds)
- time.sleep(1)
-
- (stdout, stderr) = p.communicate()
- if p.returncode == 0:
- for line in stdout.strip().splitlines():
- exec_log(verbose, logger, line, level, True)
- else:
- exec_log(verbose, logger, stderr, 'error')
+ stderr=subprocess.STDOUT)
+ stdout = ''
+ for line in iter(p.stdout.readline, b''):
+ exec_log(verbose, logger, line.strip(), level, True)
+ stdout += line
+ stdout = stdout.strip()
+ returncode = p.wait()
+ p.stdout.close()
+
+ if returncode != 0:
exec_log(verbose, logger, msg_err, 'error')
if exit_on_error:
sys.exit(1)
- return p.returncode, stdout.strip()
+ return returncode, stdout
# walkthrough the object, yield path and value
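The refactor above replaces the progress-bar polling loop with line-by-line streaming of the child's combined output. The pattern, in a self-contained sketch (text mode is used here for brevity, where the original iterates over bytes; logging is reduced to ``print``):

```python
import subprocess


def exec_cmd(cmd):
    """Run cmd, merge stderr into stdout, and echo each line as it
    arrives, so long-running commands show live progress."""
    p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT,
                         universal_newlines=True)
    captured = []
    for line in iter(p.stdout.readline, ''):
        print(line, end='')              # stream instead of buffering
        captured.append(line)
    p.stdout.close()
    return p.wait(), ''.join(captured).strip()


rc, out = exec_cmd('echo hello && echo oops 1>&2')
```

Merging stderr into stdout means failures appear inline in the captured output, which is why the refactored code no longer needs a separate stderr branch.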