author Christopher Price <christopher.price@ericsson.com> 2017-03-30 15:11:28 +0200
committer Christopher Price <christopher.price@ericsson.com> 2017-04-26 08:15:29 +0000
commit 48489a3e205014420bdb7d604e99b07e4fd8736b (patch)
tree 8841836cc4e5c08d2c5abbfb52a4f84dca345648 /docs
parent d473f82ebfe5e18540702d6ef13cad19eae4f1a6 (diff)
JIRA: DOVETAIL-352 Updating the test specification document with iterative improvements.
JIRA tasks are being created for specific sections of the document. Refer to the doc for links to relevant JIRA tasks.
Change-Id: I1717d56b8817c38802a227db320f30029f68fbd0
Signed-off-by: Christopher Price <christopher.price@ericsson.com>
Diffstat (limited to 'docs')
-rw-r--r-- docs/testing/user/systempreparation/index.rst   72
-rw-r--r-- docs/testing/user/testspecification/index.rst   28
-rw-r--r-- docs/testing/user/teststrategy/index.rst       101
-rw-r--r-- docs/testing/user/userguide/index.rst          479
4 files changed, 632 insertions, 48 deletions
diff --git a/docs/testing/user/systempreparation/index.rst b/docs/testing/user/systempreparation/index.rst
new file mode 100644
index 00000000..3364becb
--- /dev/null
+++ b/docs/testing/user/systempreparation/index.rst
@@ -0,0 +1,72 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) Ericsson AB
+
+============================================================
+Compliance and Verification program system preparation guide
+============================================================
+
+.. toctree::
+ :maxdepth: 2
+
+Version history
+===============
+
++------------+----------+------------------+----------------------------------+
+| **Date** | **Ver.** | **Author** | **Comment** |
+| | | | |
++------------+----------+------------------+----------------------------------+
+| 2017-04-25 | 0.0.1 | Chris Price | Draft version |
+| | | | |
++------------+----------+------------------+----------------------------------+
+
+
+Introduction
+============
+
+This system preparation guide provides a detailed outline of the needed system prerequisites
+and expectations for running OPNFV evaluation testing.
+
+Test planning and preparation
+=============================
+
+https://jira.opnfv.org/browse/DOVETAIL-384
+
+Give an outline of the planning phase.
+
+--------------
+Pre-requisites
+--------------
+
+Describe what needs to be in place before starting.
+
+Required Infrastructure, connectivity needs, LF & CVP accounts and any additional security or peripheral needs.
+Use sub-chapters for instance for any accounts etc that need to be created.
+
+-------------------------------------------
+Preparing the virtualisation infrastructure
+-------------------------------------------
+
+Briefly state what will be tested as an intro, then outline the required system state to be achieved prior to running the tests.
+
+Preparing the test staging host system
+--------------------------------------
+
+https://jira.opnfv.org/browse/DOVETAIL-385
+
+What is required from the system running the CVP test suites?
+
+Required network configuration
+-------------------------------
+
+https://jira.opnfv.org/browse/DOVETAIL-386
+
+VLAN configurations required for NFV deployments. Non-VLAN configurations if needed.
+
+Preparing your system for CVP testing
+-------------------------------------
+
+https://jira.opnfv.org/browse/DOVETAIL-387
+
+Describe how to realise the "Pharos ready state" and the "Software ready state" in this section.
+
diff --git a/docs/testing/user/testspecification/index.rst b/docs/testing/user/testspecification/index.rst
new file mode 100644
index 00000000..4d70ea1d
--- /dev/null
+++ b/docs/testing/user/testspecification/index.rst
@@ -0,0 +1,28 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) Ericsson AB
+
+======================================================
+Compliance and Verification program test specification
+======================================================
+
+.. toctree::
+ :maxdepth: 2
+
+Version history
+===============
+
++------------+----------+------------------+----------------------------------+
+| **Date** | **Ver.** | **Author** | **Comment** |
+| | | | |
++------------+----------+------------------+----------------------------------+
+| 2017-03-15 | 0.0.0 | | Blank |
+| | | | |
++------------+----------+------------------+----------------------------------+
+
+
+Introduction
+============
+
+Write useful things about the test cases here. Break each section/area out into a separate
+RST file for manageability and tracking.
diff --git a/docs/testing/user/teststrategy/index.rst b/docs/testing/user/teststrategy/index.rst
index f5f2da80..db05e035 100644
--- a/docs/testing/user/teststrategy/index.rst
+++ b/docs/testing/user/teststrategy/index.rst
@@ -7,7 +7,7 @@ Compliance and Verification program test specification
======================================================
.. toctree::
-:maxdepth: 2
+ :maxdepth: 2
Version history
===============
@@ -100,7 +100,7 @@ achieved.
Compliance to the Pharos specification is evaluated as a component of the test suite and provides visibility into
the physical infrastructure's ability to scale in accordance with OPNFV requirements. The test suite itself does not
-require and infrastructure that is running to be deployed at scale in order to pass the tests. It is assumed the
+require an infrastructure that is running to be deployed at scale in order to pass the tests. It is assumed the
compliance to Pharos provides enough infrastructure validation that the capability is inherent.
Characteristics
@@ -152,43 +152,12 @@ Test planning and preparation
https://jira.opnfv.org/browse/DOVETAIL-384
-Give an outline of the planning phase.
+Give an outline of the planning phase; refer to the detailed system prep guide and test guides here.
---------------
-Pre-requisites
---------------
-
-Describe what needs to be in place before starting.
-
-Required Infrastructure, connectivity needs, LF & CVP accounts and any additional security or peripheral needs.
-Use sub-chapters for instance for any accounts etc that need to be created.
-
--------------------------------------------
-Preparing the virtualisation infrastructure
--------------------------------------------
-
-Briefly state what will be tested as an intro, then outline the required system state to be achieved prior to running the tests.
-
-Preparing the test staging host system
---------------------------------------
-
-https://jira.opnfv.org/browse/DOVETAIL-385
-
-What is required from the system running the CVP test suites?
-
-Required network configuration
--------------------------------
-
-https://jira.opnfv.org/browse/DOVETAIL-386
-
-VLAN configurations required for NFV deployments. Non-VLAN configurations if needed.
-
-Preparing your system for CVP testing
--------------------------------------
-
-https://jira.opnfv.org/browse/DOVETAIL-387
+ ../../systempreparation/index.rst
-Describe how to realise the "Pharos ready state" and the "Software ready state" in this section.
+Feature testing scope and approach
+==================================
-------------------
Pre-test validation
@@ -198,8 +167,6 @@ Describe how to evaluate test readiness here.
I suggest this be a process of doing a "dry run" and evaluating the results on the DB. This should not
need to be reproduced later in the document.
-Feature testing scope and approach
-==================================
-------------
Test approach
@@ -217,20 +184,46 @@ Feature test scope
Included test areas
-------------------
-This section should identify all the features areas and combinations of features that are to be tested.
-This should reference sections of test descriptions and test tooling documents to enable cross checking.
-A clear an concise description of the test areas and the purposes and content should be captured here, a
-little like an executive summary of the test case documentation.
+CVP testing for the Danube release focuses on evaluating the ability of a platform to host basic carrier
+networking related workloads. The testing focuses on establishing the ability of the SUT to perform
+basic NFVi operations such as managing images, instantiating workloads and networking these workloads in a
+secure and resilient manner.
+
+Many OPNFV features are derived from our target upstream communities resulting in the tests focusing
+on exposing behaviour present through those development efforts. This approach to OPNFV development
+is reflected in our CVP testing where upstream validation procedures are leveraged to validate the
+composite platform. The OpenStack `RefStack <https://refstack.openstack.org/#/>`_ test suites are
+leveraged in OPNFV for performing VIM validation according to expected behaviour in OPNFV.
+
+OPNFV CVP testing has explicit requirements on hardware, control plane and compute topologies which
+are assumed to be in place; these are covered in the `Preparing the virtualisation infrastructure`_
+section of this document. Tests may fail if the system is not prepared and configured in accordance
+with the preparation guidelines.
Excluded test areas
-------------------
-Describe what is not tested here. At a similar level to the above, not making excuses just being concise
-as to what is out of scope.
+The CVP testing procedures in Danube do not cover all aspects of the available OPNFV system, nor all feature
+suites. To ensure the highest quality of evaluation that provides a trustworthy foundation for industry
+compliance toward a common OPNFV standard, tests and test areas are selected based on three key principles:
+maturity of the test area and framework, availability of features and capabilities across OPNFV compositions,
+and relevance and value to the industry for evaluation.
+
+In the Danube release of the CVP we have elected to establish an evaluation suite for only the common base
+features of the platform. Feature areas that are optional or not yet mature in the platform are excluded.
+This includes a number of optional networking features such as BGP VPN networking and Service Chaining which
+are intended to be included as optional evaluation areas in future CVP releases.
+
+In addition, the Danube release of the OPNFV CVP testing suite does not attempt to provide evaluation criteria
+for hardware. Any hardware used in the evaluation testing must comply with the pre-requisites outlined in the
+`Preparing the virtualisation infrastructure`_ section of this document. However, the hardware itself is not
+tested and no qualification metric has been established for hardware in the Danube release.
Test criteria and reporting
---------------------------
+https://jira.opnfv.org/browse/DOVETAIL-389
+
This section should specify the criteria to be used to decide whether a test item has passed or failed.
As each area may have differences it is important to ensure the user can react to a failure in accordance
with its relevance to the overall test suites being run.
@@ -244,6 +237,8 @@ tester to know if they should submit, resolve an item in their stack, or try aga
Test design and tools
---------------------
+This section needs to be done once we know the tools and areas covered by the CVP testing suites. Parked for now.
+
VIM NBI testing
---------------
@@ -277,10 +272,21 @@ This section should identify the test procedure being executed. Include all peop
roles involved in the execution of a CVP test. Include procedures, mailing lists, escalation, support
and periods of waiting for results.
+The general workflow I assume should be something like this:
+
+* Log into the CVP test case staging computer
+* Open a Browser and Log into the CVP tool.
+* Select the CVP suite to run, agree to the questions (there should be only 1 for now)
+* Run the tests (we should be able to launch this from the Web UI if they are on the hosting machine)
+* Wait for the tool to report completed.
+* Review your results, refer to troubleshooting or support as needed
+* Submit your results for CVP evaluation
+
+------------
Test reports
-============
+------------
-Describe the process of producing and accessing the test report.
+Describe the process of producing and accessing the test report. This should be a sub-section of CVP test execution I think.
how do I connect a test suite to my account to get a report? How do I access the report when it is ready,
how do I identify one report from another in the toolchain? We should go into all the details here and point
@@ -293,4 +299,3 @@ References
.. _`"Pharos specification"`: https://opnfv.org/
.. _`OPNFV Glossary`: http://docs.opnfv.org/en/latest/glossary/index.html
-
diff --git a/docs/testing/user/userguide/index.rst b/docs/testing/user/userguide/index.rst
new file mode 100644
index 00000000..d8eb124b
--- /dev/null
+++ b/docs/testing/user/userguide/index.rst
@@ -0,0 +1,479 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) Ericsson AB
+
+==============================================
+Compliance and Verification program user guide
+==============================================
+
+.. toctree::
+ :maxdepth: 2
+
+Version history
+===============
+
++------------+----------+------------------+----------------------------------+
+| **Date** | **Ver.** | **Author** | **Comment** |
+| | | | |
++------------+----------+------------------+----------------------------------+
+| 2017-03-15 | 0.0.1 | Chris Price | Draft version |
+| | | | |
++------------+----------+------------------+----------------------------------+
+
+
+Dovetail CVP Testing Overview
+=============================
+
+The Dovetail testing framework consists of two major parts: the testing client that executes
+all test cases in a vendor lab (self-testing) or a third party lab, and the server system that
+is under OPNFV's administration, storing and viewing test results based on the OPNFV Test API. The
+following diagram illustrates this overall framework.
+
+/* here is a draft diagram that needs to be revised when exact information is known and fixed */
+
+This section mainly focuses on helping the testers in the vendor's domain attempting to run the
+CVP tests.
+
+The Dovetail client tool (or just the Dovetail tool, or Dovetail for short) can be installed on the
+jumphost either directly as Python software, or as a Docker(r) container. Comments on the pros
+and cons of the two options are TBD.
+
+The section `Installing the test tool`_ describes the steps the tester needs to take to install
+Dovetail directly from the source, and then the steps needed for installing the Dovetail
+Docker(r) container. Once installed and properly configured, the remaining test process is mostly
+identical for the two options. We then go over the steps of actually running the test suite,
+and discuss how to view test results and make sense of them, for example, what the tester
+may do in case of unexpected test failures. A final section describes additional Dovetail features
+that are not absolutely necessary in CVP testing but users may find useful for other purposes.
+One example is to run Dovetail for in-house testing as preparation before official CVP testing;
+another example is to run Dovetail experimental test suites other than the CVP test suite.
+Experimental tests may be made available by the community for experimenting with less mature test
+cases or functionalities for the purpose of getting feedback for improvement.
+
+Installing the test tool
+========================
+
+Before taking this step, testers should check the hardware and networking requirements of
+the POD, and the jumphost in particular, to make sure they are compliant.
+
+In this section, we describe the procedure to install Dovetail client tool that runs the CVP
+test suite from the jumphost. The jumphost must have network access to both the public Internet
+and to the O&M (Operation and Management) network with access rights to all VIM APIs being tested.
+
+-------------------------------
+Checking the Jumphost Readiness
+-------------------------------
+
+While Dovetail does not have a hard requirement on a specific operating system type or version, the
+following have been validated by the community through some level of exercise in OPNFV labs or PlugFests:
+
+* Ubuntu 16.04.2 LTS (Xenial) for x86_64
+* Ubuntu 14.04 LTS (Trusty) for x86_64
+* CentOS-7-1611 for x86_64
+* Red Hat Enterprise Linux 7.3 for x86_64
+* Fedora 24 Server for x86_64
+* Fedora 25 Server for x86_64
+
+------------------------------------
+Configuring the Jumphost Environment
+------------------------------------
+
+/* First, openstack env variables to be passed to Functest */
+
+The jumphost needs to have the right environment variable settings to enable access to the
+Openstack API. This is usually done through the Openstack credential file.
+
+Sample Openstack credential file environment_config.sh:
+
+/*Project-level authentication scope (name or ID), recommend admin project.*/
+
+export OS_PROJECT_NAME=admin
+
+/* Authentication username, belongs to the project above, recommend admin user.*/
+
+export OS_USERNAME=admin
+
+
+/* Authentication password.*/
+
+export OS_PASSWORD=secret
+
+
+/* Authentication URL, one of the endpoints of the keystone service. If this is the v3 version, some extra variables are needed, as follows.*/
+
+export OS_AUTH_URL='http://xxx.xxx.xxx.xxx:5000/v3'
+
+
+/* Default is 2.0. If using the keystone v3 API, this should be set to 3.*/
+
+export OS_IDENTITY_API_VERSION=3
+
+
+/* Domain name or ID containing the user above. Command to check the domain: openstack
+user show <OS_USERNAME>*/
+
+export OS_USER_DOMAIN_NAME=default
+
+
+/* Domain name or ID containing the project above. Command to check the domain: openstack
+project show <OS_PROJECT_NAME>*/
+
+export OS_PROJECT_DOMAIN_NAME=default
+
+
+/* home directory for dovetail; if installing the Dovetail Docker container, DOVETAIL_HOME can
+just be /home/opnfv*/
+
+export DOVETAIL_HOME=$HOME/cvp
+
+Export all these variables into the environment by,
+
+% source <OpenStack-credential-file-path>
+
+
+The tester should validate that the Openstack environmental settings are correct by,
+% openstack service list
+
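The credential setup and validation steps above can be scripted as a sanity check. This is a sketch only: the file path and all values are placeholder examples, not required settings, and the final `openstack service list` check from above is omitted so the script runs without a live cloud.

```shell
#!/bin/sh
# Sketch only: write a minimal Openstack credential file (placeholder values)
# and verify the required variables are actually exported before testing.
cat > /tmp/openrc-example <<'EOF'
export OS_PROJECT_NAME=admin
export OS_USERNAME=admin
export OS_PASSWORD=secret
export OS_AUTH_URL='http://192.0.2.10:5000/v3'
export OS_IDENTITY_API_VERSION=3
export OS_USER_DOMAIN_NAME=default
export OS_PROJECT_DOMAIN_NAME=default
export DOVETAIL_HOME=$HOME/cvp
EOF
# The heredoc is quoted, so $HOME is expanded at source time, not here.
. /tmp/openrc-example
# Fail fast if any required variable is missing or empty.
for var in OS_PROJECT_NAME OS_USERNAME OS_PASSWORD OS_AUTH_URL DOVETAIL_HOME; do
    eval "value=\${$var}"
    if [ -z "$value" ]; then
        echo "missing: $var"
        exit 1
    fi
done
echo "credential variables OK"
```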
+-----------------------------------
+Installing Prerequisite on Jumphost
+-----------------------------------
+
+1. Dovetail requires Python 2.7 or later
+
+Use the following steps to check if the right version of python is already installed,
+and if not, install it.
+
+% python --version
+
+2. Dovetail requires Docker 1.8.0 or later
+
+Use the following steps to check if the right version of Docker is already installed,
+and if not, install it.
+
+% docker --version
+
+As the Docker installation process is more involved, you can refer to the official
+documentation: https://docs.docker.com/engine/installation/linux/
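The two prerequisite checks above can be combined into one script. This is a hedged sketch: `version_ge` is a local helper name invented here (not a Dovetail command), and it relies on `sort -V` being available.

```shell
#!/bin/sh
# Sketch: check installed Python and Docker against the minimum versions
# named above. version_ge is a local helper, not part of Dovetail.
version_ge() {
    # True when $1 >= $2 in natural version order (needs sort -V).
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

if command -v python >/dev/null 2>&1; then
    # python 2 prints its version to stderr, hence 2>&1.
    python_ver=$(python --version 2>&1 | awk '{print $2}')
else
    python_ver=""
fi
if command -v docker >/dev/null 2>&1; then
    docker_ver=$(docker --version | sed 's/^Docker version \([0-9.]*\).*/\1/')
else
    docker_ver=""
fi

version_ge "${python_ver:-0}" 2.7   && echo "python OK ($python_ver)"  || echo "python missing or too old"
version_ge "${docker_ver:-0}" 1.8.0 && echo "docker OK ($docker_ver)"  || echo "docker missing or too old"
```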
+
+-------------------------------------
+Installing Dovetail on Jumphost
+-------------------------------------
+
+A tester can choose one of the following two methods for installing and running Dovetail.
+In Part 1, we explain the steps to install Dovetail from the source. In Part 2, an alternative
+using a Docker image with preinstalled Dovetail is introduced.
+
+Part 1: Installing Dovetail directly
+
+Update and install packages
+
+a) Ubuntu
+
+sudo apt-get update
+
+sudo apt-get -y install gcc git vim python-dev python-pip --no-install-recommends
+
+b) CentOS and Red Hat
+
+sudo yum -y update
+
+sudo yum -y install epel-release
+
+sudo yum -y install gcc git vim-enhanced python-devel python-pip
+
+c) Fedora
+
+sudo dnf -y update
+
+sudo dnf -y install gcc git vim-enhanced python-devel python-pip redhat-rpm-config
+
+P.S. When testing the SUT's HTTPS services, some extra packages may be needed, such as
+apt-transport-https. This still remains to be verified.
+
+
+Installing Dovetail
+
+Now we are ready to install Dovetail.
+
+/* Version of dovetail is not specified yet? we are still using the latest in the master
+- this needs to be fixed before launch. */
+
+First change directory to $DOVETAIL_HOME,
+
+% cd $DOVETAIL_HOME
+
+% sudo git clone https://git.opnfv.org/dovetail
+
+% cd $DOVETAIL_HOME/dovetail
+
+% sudo pip install -e ./
+
+/* test dovetail install is successful */
+
+% dovetail -h
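The install-success check above can also be scripted. A minimal sketch: `check_dovetail` is a helper name chosen here, and it only verifies the `dovetail` entry point landed on PATH after `pip install -e ./`.

```shell
#!/bin/sh
# Sketch: confirm the dovetail CLI entry point is on PATH after install.
# check_dovetail is a local helper name, not part of Dovetail itself.
check_dovetail() {
    if command -v dovetail >/dev/null 2>&1; then
        echo "dovetail found: $(command -v dovetail)"
    else
        echo "dovetail not found; re-run 'sudo pip install -e ./' in the source tree"
        return 1
    fi
}
check_dovetail || true
```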
+
+Part 2: Installing Dovetail Docker Container
+
+The Dovetail project also maintains a Docker image that has Dovetail test tools preinstalled.
+
+Running CVP Test Suite
+======================
+
+------------------
+Running Test Suite
+------------------
+
+The Dovetail client CLI allows the tester to specify which test suite to run.
+By default the results are stored locally under $DOVETAIL_HOME/dovetail/results.
+
+% dovetail run --testsuite <test suite name> --openrc <path-to-openrc-file> /*?? */
+
+Multiple test suites may be available. Test suites named "debug" and "proposed_tests" are provided for testing purposes only. For the purpose of running the CVP test suite, the test suite name follows the format,
+
+CVP_<major>_<minor>_<patch> /* test if this format works */
+
+For example, CVP_1_0_0
+
+% dovetail run --testsuite CVP_1_0_0
+
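Since the suite name format matters, a small guard before launching can catch typos. A sketch only: `is_cvp_suite` is a helper name invented here, and the `CVP_<major>_<minor>_<patch>` pattern is inferred from the CVP_1_0_0 example above, not an official specification.

```shell
#!/bin/sh
# Sketch: validate a suite name of the form CVP_<major>_<minor>_<patch>
# before handing it to 'dovetail run'. The pattern is an assumption based
# on the CVP_1_0_0 example.
is_cvp_suite() {
    printf '%s\n' "$1" | grep -Eq '^CVP(_[0-9]+){3}$'
}

suite=CVP_1_0_0
if is_cvp_suite "$suite"; then
    echo "suite name looks valid: $suite"
else
    echo "unexpected suite name: $suite"
fi
```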
+When the SUT's VIM (Virtual Infrastructure Manager) is Openstack, its configuration is commonly defined in the openrc file. In that case, you can specify the openrc file in the command line,
+
+% dovetail run --testsuite CVP_1_0_0 --openrc <path-to-openrc-file>
+
+In order to report official results to OPNFV, run the CVP test suite and report to OPNFV official URL,
+
+% dovetail run --testsuite <test suite name> --openrc <path-to-openrc-file> --report https://www.opnfv.org/cvp
+
+The official server https://www.opnfv.org/cvp is still under development; there is a temporary server to use: http://205.177.226.237:9997/api/v1/results
+
+--------------------------------
+Making Sense of CVP Test Results
+--------------------------------
+
+When a tester is performing trial runs, Dovetail stores results locally by default.
+
+% cd $DOVETAIL_HOME/dovetail/results
+
+
+
+1. local file
+
+a) Log file: dovetail.log
+
+/* review the dovetail.log to see if all important information has been captured - in default mode without DEBUG */
+
+/* the end of the log file has a summary of all test case test results */
+
+Additional log files may be of interest: refstack.log, opnfv_yardstick_tcXXX.out ...
+
+b) Example: Openstack refstack test case
+
+The log details can be seen in refstack.log, which records the passed/skipped/failed test case
+results; the failed test cases include rich debug information for the users to see why a test
+case failed.
+
+c) Example: OPNFV Yardstick test case
+
+For the Yardstick tool, the log is stored in yardstick.log. The logs for each Yardstick test
+case are stored in opnfv_yardstick_tcXXX.out, respectively.
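A quick tally of verdicts across a result log can be scripted. A sketch under stated assumptions: the PASS/FAIL tokens, the log layout, and the `summarize` helper name are all illustrative, so adjust the pattern to the actual log format.

```shell
#!/bin/sh
# Sketch: count PASS/FAIL verdicts in a result log. The token names are
# assumed for illustration; real dovetail logs may differ.
RESULTS=${DOVETAIL_HOME:-$HOME/cvp}/dovetail/results

summarize() {
    # $1: path to a log file containing per-test verdict lines
    grep -oE 'PASS|FAIL' "$1" | sort | uniq -c
}

if [ -f "$RESULTS/dovetail.log" ]; then
    summarize "$RESULTS/dovetail.log"
fi
```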
+
+
+
+2. OPNFV web interface
+
+Pending completion of work by LF, the test community, etc.
+
+-------------------------------
+Updating Dovetail or Test Suite
+-------------------------------
+
+% cd $DOVETAIL_HOME/dovetail
+
+% sudo git pull
+
+% sudo pip install -e ./
+
+This step is necessary if the Dovetail software or the CVP test suite has been updated.
+
+
+Other Dovetail Usage
+====================
+
+------------------------
+Running Dovetail Locally
+------------------------
+
+/*DB*/
+
+---------------------------------------------
+Running Dovetail with Experimental Test Cases
+---------------------------------------------
+
+
+--------------------------------------------------
+Running Individual Test Cases or for Special Cases
+--------------------------------------------------
+
+1. Refstack client to run Defcore testcases
+
+a) By default, the Defcore test cases run by refstack-client, which is consumed by
+Dovetail, are run with an automatically generated configuration file, i.e.,
+refstack_tempest.conf.
+
+In some circumstances, the automatically generated configuration file may not suit
+the SUT, so Dovetail provides a way for users to set the configuration file manually
+according to their own SUT.
+
+Besides, the users should define the Defcore test case file, i.e., defcore.txt, at the
+same time. The steps are shown below.
+
+When the "Installing Dovetail Docker Container" method is used,
+
+
+% sudo mkdir /home/opnfv/dovetail/userconfig
+
+% cd /home/opnfv/dovetail/userconfig
+
+% touch refstack_tempest.conf defcore.txt
+
+% vim refstack_tempest.conf
+
+% vim defcore.txt
+
+
+The recommended way to set refstack_tempest.conf is described at
+https://aptira.com/testing-openstack-tempest-part-1/
+
+The recommended way to edit defcore.txt is to open
+https://refstack.openstack.org/api/v1/guidelines/2016.08/tests?target=compute&type=required&alias=true&flag=false
+and copy all the test cases into defcore.txt.
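The copy step can be scripted once the guideline list has been saved locally. A sketch only: the two entries below are hypothetical placeholders standing in for the real downloaded content, so the filtering can be shown without network access.

```shell
#!/bin/sh
# Sketch: build defcore.txt from a locally saved copy of the RefStack
# guideline test list. The two entries are hypothetical placeholders for
# the real downloaded content.
workdir=$(mktemp -d)
cd "$workdir"

cat > guidelines.txt <<'EOF'
tempest.api.compute.images.test_images_oneserver.ImagesOneServerTestJSON.test_create_delete_image
tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_server_details
EOF

# Drop blank lines and write the list as defcore.txt for Dovetail.
grep -v '^$' guidelines.txt > defcore.txt
echo "defcore.txt contains $(grep -c '^tempest\.' defcore.txt) test cases"
```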
+
+Then use “docker run” to create a container,
+
+
+% sudo docker run --privileged=true -it -v <openrc_path>:<openrc_path> \
+  -v /home/opnfv/dovetail/results:/home/opnfv/dovetail/results \
+  -v /home/opnfv/dovetail/userconfig:/home/opnfv/dovetail/userconfig \
+  -v /var/run/docker.sock:/var/run/docker.sock \
+  --name <DoveTail_Container_Name> \
+  opnfv/dovetail:<Tag> /bin/bash
+
+(the --name option is optional)
+
+
+
+There is a need to adjust the CVP_1_0_0 test suite; for Dovetail,
+defcore.tc001.yml and defcore.tc002.yml are used for the automatic and
+manual running methods, respectively.
+
+Inside the dovetail container,
+
+
+% cd /home/opnfv/dovetail/compliance
+
+% vim CVP_1_0_0.yml
+
+
+to add defcore.tc002 and comment out defcore.tc001.
+
+
+b) When the "Installing Dovetail Directly" method is used, before running
+the dovetail commands, there is a need to set the configuration file and
+the Defcore test case file,
+
+
+% cd $DOVETAIL_HOME/dovetail
+
+% mkdir userconfig
+
+% cd userconfig
+
+% touch refstack_tempest.conf defcore.txt
+
+% vim refstack_tempest.conf
+
+% vim defcore.txt
+
+The recommended way to set refstack_tempest.conf and defcore.txt is the
+same as above in the "Installing Dovetail Docker Container" method section.
+
+
+
+For the manual Defcore running method, there is a need to adjust
+the compliance_set test suite; for Dovetail, defcore.tc001.yml and
+defcore.tc002.yml are used for the automatic and manual running methods,
+respectively.
+
+
+
+% cd $DOVETAIL_HOME/dovetail/compliance
+
+% vim CVP_1_0_0.yml
+
+
+to add defcore.tc002 and comment out defcore.tc001
+
+Dovetail Client CLI Manual
+==========================
+
+This section contains a brief manual for all the features available through the Dovetail client command line interface (CLI).
+
+--------------------------
+Checking dovetail commands
+--------------------------
+
+% dovetail -h
+
+dovetail.PNG
+
+Dovetail has three commands: list, run and show.
+
+----
+List
+----
+
+List help:
+
+% dovetail list -h
+
+list-help.PNG
+
+List a test suite:
+
+The list command will list all test cases belonging to the given test suite.
+
+% dovetail list compliance_set
+
+list-compliance.PNG
+
+% dovetail list debug
+
+list-debug.PNG
+
+The ipv6, example and nfvi are test areas. If no <TESTSUITE> is given, all test suites will be listed.
+
+----
+Show
+----
+
+The show command will give the detailed info of one certain test case.
+
+Show help:
+
+% dovetail show -h
+
+show-help.PNG
+
+Show a test case:
+
+show-ipv6.PNG
+
+---
+Run
+---
+
+Dovetail supports running a named test suite, or one named test area of a test suite.
+
+Run help:
+
+% dovetail run -h
+
+run-help.PNG
+
+There are some options:
+
+* func_tag: set Functest's Docker tag, for example stable, latest or danube.1.0
+* openrc: give the path of the OpenStack credential file
+* yard_tag: set Yardstick's Docker tag
+* testarea: set a certain test area within a certain test suite
+* offline: run without pulling the Docker images; this requires the jumphost to have these images locally, and ensures Dovetail can run in an offline environment
+* report: push results to the DB or store them in files
+* testsuite: set the test suite to be tested
+* debug: flag to show the debug log messages
+