author    Bryan Sullivan <bryan.sullivan@att.com>    2017-08-04 06:36:53 -0700
committer Wenjing Chu <chu.wenjing@gmail.com>        2017-08-22 12:07:39 -0700
commit    362fa29fc78d2c6c2ed77df2d5396d07929f12f8 (patch)
tree      98eceebe53102a7012f18666ed9dbd8c7c3a074a /docs/testing/developer/testcaserequirements
parent    47e451d80dd90ba534b6cfa54b535ee8b483ece5 (diff)
Clarify criteria for testcase inclusion as running in Functest/CI
And additional clarifications.

JIRA: DOVETAIL-477

Replace tabs. Further clarify testing programs and optionality criteria.

Change-Id: I8eebdf215fb7cc75a6da9748629b1a73d0b71d19
Signed-off-by: Bryan Sullivan <bryan.sullivan@att.com>
Diffstat (limited to 'docs/testing/developer/testcaserequirements')
-rw-r--r--  docs/testing/developer/testcaserequirements/index.rst  48
1 file changed, 32 insertions, 16 deletions
diff --git a/docs/testing/developer/testcaserequirements/index.rst b/docs/testing/developer/testcaserequirements/index.rst
index 38eb93a1..912f19da 100644
--- a/docs/testing/developer/testcaserequirements/index.rst
+++ b/docs/testing/developer/testcaserequirements/index.rst
@@ -1,9 +1,9 @@
.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. http://creativecommons.org/licenses/by/4.0
-.. (c) Ericsson AB
+.. (c) Ericsson AB, and others
==========================================================
-Compliance and Verification program test case requirements
+Compliance Verification Program test case requirements
==========================================================
.. toctree::
@@ -21,35 +21,51 @@ All CVP tests are available in open source and are executed in open source test
Test case requirements
======================
-The following requirements are mandatory for test to be submitted for consideration in the
+The following requirements are mandatory for a test to be submitted for consideration in the
CVP test suite:
-- All test cases must be fully documented, in a common format
+- All test cases must be fully documented, in a common format (please refer to the test
+ specification directory for examples).
- - Clearly identifying the test procedure and expected results / metrics to determine a “pass” or “fail” result.
+ - Clearly identifying the test procedure and expected results / metrics to determine a “pass”
+ or “fail” result.
-- Tests must be validated for purpose, tests should be run with both an expected positive and negative outcome.
-- Tests should focus on functionality, not performance.
+- Tests must be validated for the purpose of CVP; tests should be run with both an expected
+ positive and negative outcome.
+- At the current stage of CVP, only functional tests are eligible; performance testing
+ is out of scope.
- - Performance test output could be built in as “for information only”, but must not carry pass/fail metrics.
+ - Performance test output could be built in as “for information only”, but must not carry
+ pass/fail metrics.
- Test cases should favor implementation of a published standard interface for validation
- Where no standard is available provide API support references
- - If a standard exists and is not followed, an exemption is required
+ - If a standard exists and is not followed, an exemption is required. Such exemptions
+ can be raised in the project meetings first, and if no consensus can be reached, escalated
+ to the TSC.
-- Test cases must pass on OPNFV reference deployments
+- Test cases must pass on applicable OPNFV reference deployments and release versions.
- Tests must not require a specific NFVi platform composition or installation tool
- - Tests and test tools must run independently of the method of platform installation and architecture.
- - Tests and tool must run independent of specific OPNFV components allowing different components such as storage backends or SDN controllers.
+ - Tests and test tools must run independently of the method of platform installation
+ and architecture.
+ - Tests and test tools must run independently of specific OPNFV components, allowing for
+ different components such as storage backends or SDN controllers.
- Tests must not require un-merged patches to the relevant upstream projects
- - Tests must not require features or code which are out of scope for the latest release of the OPNFV project
-
+ - Tests must not require features or code which are out of scope for the latest release of
+ the OPNFV project
+ - Tests must have a documented history of recent successful verification in OPNFV testing
+ programs including CI, Functest, Yardstick, Bottlenecks, Dovetail, etc. (i.e. all testing
+ programs in OPNFV that regularly validate tests against the release, whether automated or manual)
+ - Tests must be considered optional unless they have such a documented history for ALL OPNFV
+ scenarios that are both:
+ - applicable, i.e. support the feature that the test exercises
+ - released, i.e. in the OPNFV release supported by the CVP test suite version
- Tests must run against a fully deployed and operational system under test.
-- Tests and test tools must support stand alone OPNFV and commercial derived OPNFV based solution
+- Tests and test implementations must support stand-alone OPNFV and commercially derived OPNFV-based solutions
- There can be no dependency on OPNFV resources or infrastructure.
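
As a back-reference to the requirement above that tests be run with both an expected positive
and an expected negative outcome, the following is a minimal pytest-style sketch. Every name in
it (NetworkClient, create_network, QuotaExceeded, the quota behaviour) is a hypothetical
placeholder for whichever published standard interface a real CVP test would exercise; it is not
part of Dovetail or Functest.

    import pytest


    class QuotaExceeded(Exception):
        """Raised when the system under test rejects a request, as expected."""


    class NetworkClient:
        """Stand-in for a client of a published standard interface (e.g. a REST API)."""

        def __init__(self, quota=1):
            self._quota = quota
            self._networks = []

        def create_network(self, name):
            # Reject the request once the (hypothetical) quota is reached.
            if len(self._networks) >= self._quota:
                raise QuotaExceeded(name)
            self._networks.append(name)
            return name


    def test_create_network_positive():
        """Positive outcome: a valid request succeeds and returns the created resource."""
        client = NetworkClient(quota=1)
        assert client.create_network("cvp-net-0") == "cvp-net-0"


    def test_create_network_negative():
        """Negative outcome: an invalid request is rejected with a well-defined error."""
        client = NetworkClient(quota=0)
        with pytest.raises(QuotaExceeded):
            client.create_network("cvp-net-1")

Both cases carry explicit pass/fail criteria, and neither depends on a particular installer,
storage backend, or SDN controller, in line with the platform-independence requirements above.
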
@@ -57,7 +73,7 @@ The following requirements are mandatory for test to be submitted for considerat
- Use case specification
- Test preconditions
- - Basic test flow execution descriptor
+ - Basic test flow execution description
- Pass fail criteria
- The following things may be documented for the test case:
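
To illustrate the mandatory documentation fields listed above (use case specification, test
preconditions, basic test flow execution description, and pass/fail criteria), here is a
hypothetical sketch of how a test case might carry that information alongside its code. The
field names mirror the list above, but the dict layout is an assumption made for this example
only, not a format defined by Dovetail.

    # Hypothetical illustration only: Dovetail does not prescribe this structure.
    TEST_CASE_DOC = {
        "use_case": "Tenant creates an isolated L2 network through the standard API.",
        "preconditions": [
            "System under test is fully deployed and operational.",
            "Credentials with permission to create networks are available.",
        ],
        "test_flow": [
            "Authenticate against the standard interface.",
            "Request creation of a network with a unique name.",
            "Verify the network is reported as active.",
        ],
        "pass_fail_criteria": "PASS if the network becomes active within the timeout; FAIL otherwise.",
    }

    if __name__ == "__main__":
        # Print the test case documentation in a human-readable form.
        for field, value in TEST_CASE_DOC.items():
            print(field, ":", value)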