Diffstat (limited to 'docs/testing/developer')
 docs/testing/developer/testcaserequirements/index.rst | 134 ++++++++++--------
 1 file changed, 87 insertions(+), 47 deletions(-)
diff --git a/docs/testing/developer/testcaserequirements/index.rst b/docs/testing/developer/testcaserequirements/index.rst
index 912f19da..ecb56a5e 100644
--- a/docs/testing/developer/testcaserequirements/index.rst
+++ b/docs/testing/developer/testcaserequirements/index.rst
@@ -9,71 +9,96 @@ Compliance Verification Program test case requirements
.. toctree::
:maxdepth: 2
+
CVP Test Suite Purpose and Goals
================================
-The CVP test suite is intended to provide a method for validating the interfaces
-and behaviors of an NFVi platform according to the expected capabilities exposed in
-OPNFV. The behavioral foundation evaluated in these tests should serve to provide
-a functional baseline for VNF deployment and portability across NFVi instances.
-All CVP tests are available in open source and are executed in open source test frameworks.
+The CVP test suite is intended to provide a method for validating the
+interfaces and behaviors of an NFVi platform according to the expected
+capabilities exposed in OPNFV. The behavioral foundation evaluated in these
+tests should serve to provide a functional baseline for VNF deployment and
+portability across NFVi instances. All CVP tests are available in open source
+and are executed in open source test frameworks.
+
Test case requirements
======================
-The following requirements are mandatory for a test to be submitted for consideration in the
- CVP test suite:
+The following requirements are mandatory for a test to be submitted for
+consideration in the CVP test suite:
+
+- All test cases must be fully documented in a common format. Please consider
+ the existing :ref:`dovetail_testspecifications` as examples.
+
+  - Documentation must clearly identify the test procedure and the expected
+    results / metrics that determine a “pass” or “fail” result.
+
+- Tests must be validated for the purpose of the CVP; tests should be run with
+  both an expected positive and a negative outcome.
+
+- At the current stage of the CVP, only functional tests are eligible;
+  performance testing is out of scope.
+
+ - Performance test output could be built in as “for information only”, but
+ must not carry pass/fail metrics.
+
+- Test cases should favor implementation of a published standard interface for
+ validation.
+
+  - Where no standard is available, provide API support references.
-- All test cases must be fully documented, in a common format (please refer to the test
- specification directory for examples).
+ - If a standard exists and is not followed, an exemption is required. Such
+ exemptions can be raised in the project meetings first, and if no consensus
+ can be reached, escalated to the TSC.
- - Clearly identifying the test procedure and expected results / metrics to determine a “pass”
- or “fail” result.
+- Test cases must pass on applicable OPNFV reference deployments and release
+ versions.
-- Tests must be validated for the purpose of CVP, tests should be run with both an expected
- positive and negative outcome.
-- At the current stage of CVP, only functional tests are eligible, performance testing
- is out of scope.
+ - Tests must not require a specific NFVi platform composition or installation
+ tool.
- - Performance test output could be built in as “for information only”, but must not carry
- pass/fail metrics.
+ - Tests and test tools must run independently of the method of platform
+ installation and architecture.
-- Test cases should favor implementation of a published standard interface for validation
+  - Tests and test tools must run independently of specific OPNFV components,
+    allowing the use of different components such as storage backends or SDN
+    controllers.
- - Where no standard is available provide API support references
- - If a standard exists and is not followed, an exemption is required. Such exemptions
- can be raised in the project meetings first, and if no consensus can be reached, escalated
- to the TSC.
+ - Tests must not require un-merged patches to the relevant upstream projects.
-- Test cases must pass on applicable OPNFV reference deployments and release versions.
+ - Tests must not require features or code which are out of scope for the
+ latest release of the OPNFV project.
- - Tests must not require a specific NFVi platform composition or installation tool
+ - Tests must have a documented history of recent successful verification in
+ OPNFV testing programs including CI, Functest, Yardstick, Bottlenecks,
+ Dovetail, etc. (i.e., all testing programs in OPNFV that regularly validate
+ tests against the release, whether automated or manual).
- - Tests and test tools must run independently of the method of platform installation
- and architecture.
- - Tests and tool must run independent of specific OPNFV components allowing different
- components such as storage backends or SDN controllers.
+ - Tests must be considered optional unless they have a documented history for
+ ALL OPNFV scenarios that are both
+
+ - applicable, i.e., support the feature that the test exercises, and
+
+ - released, i.e., in the OPNFV release supported by the CVP test suite
+ version.
- - Tests must not require un-merged patches to the relevant upstream projects
- - Tests must not require features or code which are out of scope for the latest release of
- the OPNFV project
- - Tests must have a documented history of recent successful verification in OPNFV testing
- programs including CI, Functest, Yardstick, Bottlenecks, Dovetail, etc (i.e. all testing
- programs in OPNFV that regularly validate tests against the release, whether automated or manual)
- - Tests must be considered optional unless they such have a documented history for ALL OPNFV
- scenarios that are both
- - applicable, i.e. support the feature that the test exercises
- - released, i.e. in the OPNFV release supported by the CVP test suite version
- Tests must run against a fully deployed and operational system under test.
-- Tests and test implementations must support stand alone OPNFV and commercial derived OPNFV based solution
+
+- Tests and test implementations must support standalone OPNFV and commercial
+ OPNFV-derived solutions.
- There can be no dependency on OPNFV resources or infrastructure.
+ - Tests must not require external resources while a test is running, e.g.,
+ connectivity to the Internet. All resources required to run a test, e.g.,
+ VM and container images, are downloaded and installed as part of the system
+ preparation and test tool installation.
+
- The following things must be documented for the test case:
- Use case specification
- Test preconditions
- - Basic test flow execution description
+ - Basic test flow execution description and test assertions
  - Pass/fail criteria
- The following things may be documented for the test case:
@@ -82,17 +107,32 @@ The following requirements are mandatory for a test to be submitted for consider
- Fault/Error test case descriptions
- Post conditions where the system state may be left changed after completion
-New test case proposals should complete a CVP test case worksheet to ensure that all
-of these considerations are met before the test case is approved for inclusion in the
-CVP test suite.
+New test case proposals should complete a CVP test case worksheet to ensure
+that all of these considerations are met before the test case is approved for
+inclusion in the CVP test suite.
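The mandatory and optional documentation items listed above lend themselves to a simple structured record. The following is a minimal, hypothetical sketch in Python; the class and field names are illustrative assumptions and do not correspond to any official Dovetail worksheet schema:

```python
from dataclasses import dataclass
from typing import List, Optional


# Hypothetical sketch of a CVP test case worksheet. The field names are
# illustrative only; they mirror the documentation items required above,
# not any official Dovetail schema.
@dataclass
class TestCaseWorksheet:
    # Mandatory documentation items
    use_case_specification: str
    test_preconditions: str
    test_flow_description: str       # basic test flow and test assertions
    pass_fail_criteria: str
    # Optional documentation items
    fault_error_cases: Optional[str] = None
    post_conditions: Optional[str] = None

    def missing_mandatory_items(self) -> List[str]:
        """Return the names of mandatory items that were left empty."""
        mandatory = (
            "use_case_specification",
            "test_preconditions",
            "test_flow_description",
            "pass_fail_criteria",
        )
        return [name for name in mandatory if not getattr(self, name).strip()]
```

A proposal checker could reject a worksheet whenever `missing_mandatory_items()` returns a non-empty list, before the test case is considered for approval.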
+
Dovetail Test Suite Naming Convention
=====================================
-Test case naming and structure for dovetail needs to be done, it should consider:
+Test case naming and structuring must comply with the following conventions.
+The fully qualified name of a test case must comprise three sections:
+
+``<community>.<test_area>.<test_case_name>``
+
+- **community**: The fully qualified test case name must identify the community
+ or upstream project which developed and maintains the test case. For test
+ cases originating in OPNFV projects, the community identifier is 'opnfv'.
+  Test cases consumed from the OpenStack Tempest test suite, for example,
+  are named 'tempest'.
+
+- **test_area**: The fully qualified test case name must identify the test case
+ area. For test cases originating in OPNFV projects, the test case area must
+ identify the project name.
-- Identifying the test area
-- Identifying the purpose of the test case
+- **test_case_name**: The fully qualified test case name must include a concise
+ description of the purpose of the test case.
-To be discussed, agreed and amended to this document.
+An example of a fully qualified test case name is
+``opnfv.sdnvpn.router_association_floating_ip``.
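The three-section convention above can be checked mechanically. The following sketch parses a fully qualified test case name into its sections; the allowed character set is an assumption, since the document does not define one:

```python
import re

# Pattern for the three-section fully qualified test case name:
# <community>.<test_area>.<test_case_name>.
# The lowercase/underscore character set is an assumption; the naming
# convention text does not specify allowed characters.
TEST_CASE_NAME = re.compile(
    r"^(?P<community>[a-z0-9]+)\."
    r"(?P<test_area>[a-z0-9_]+)\."
    r"(?P<test_case_name>[a-z0-9_]+)$"
)


def parse_test_case_name(name: str) -> dict:
    """Split a fully qualified test case name into its three sections."""
    match = TEST_CASE_NAME.match(name)
    if match is None:
        raise ValueError("not a valid fully qualified test case name: %r" % name)
    return match.groupdict()
```

For the example above, `parse_test_case_name("opnfv.sdnvpn.router_association_floating_ip")` yields the community `opnfv`, the test area `sdnvpn` (the originating OPNFV project), and the test case name `router_association_floating_ip`.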