Diffstat (limited to 'docs/testing/user')
19 files changed, 116 insertions, 491 deletions
diff --git a/docs/testing/user/certificationworkflow/ApplicationForm.rst b/docs/testing/user/certificationworkflow/ApplicationForm.rst
index c3d3e7eb..247167d8 100644
--- a/docs/testing/user/certificationworkflow/ApplicationForm.rst
+++ b/docs/testing/user/certificationworkflow/ApplicationForm.rst
@@ -2,11 +2,11 @@
 .. http://creativecommons.org/licenses/by/4.0
 .. (c) OPNFV, Intel Corporation and others.

-.. _cvp-application-form:
+.. _ovp-application-form:

-======================================================
-OPNFV COMPLIANCE VERIFICATION PROGRAM APPLICATION FORM
-======================================================
+=======================================
+OPNFV Verified Program Application Form
+=======================================

 +----------------------------------+--------------------------------------------------------------------------------------------+
@@ -48,7 +48,7 @@ OPNFV COMPLIANCE VERIFICATION PROGRAM APPLICATION FORM
 |                                  +--------------------------------------------------------------------------------------------+
 |                                  |                                                                                            |
 +----------------------------------+--------------------------------------------------------------------------------------------+
-| Primary business email           | *Only the Business email address should be used for official communication with OPNFV CVP* |
+| Primary business email           | *Only the Business email address should be used for official communication with OPNFV OVP* |
 |                                  +--------------------------------------------------------------------------------------------+
 |                                  |                                                                                            |
 |                                  +--------------------------------------------------------------------------------------------+
@@ -62,7 +62,7 @@ OPNFV COMPLIANCE VERIFICATION PROGRAM APPLICATION FORM
 |                                  +--------------------------------------------------------------------------------------------+
 |                                  |                                                                                            |
 +----------------------------------+--------------------------------------------------------------------------------------------+
-| User ID for CVP web portal       | *Choose one: (i) Linux Foundation (ii) Openstack (iii) Github (iv) Google (v) Fackbook ID* |
+| User ID for OVP web portal       | *Choose one: (i) Linux Foundation (ii) Openstack (iii) Github (iv) Google (v) Facebook ID* |
 |                                  +--------------------------------------------------------------------------------------------+
 |                                  |                                                                                            |
 |                                  +--------------------------------------------------------------------------------------------+
diff --git a/docs/testing/user/certificationworkflow/index.rst b/docs/testing/user/certificationworkflow/index.rst
index 022b2c2e..36346215 100644
--- a/docs/testing/user/certificationworkflow/index.rst
+++ b/docs/testing/user/certificationworkflow/index.rst
@@ -4,27 +4,27 @@

 .. _dovetail-certification_workflow:

-================================================================
-OPNFV Compliance Verification Program certification workflow
-================================================================
+=============================================
+OPNFV Verified Program certification workflow
+=============================================

 Introduction
 ============

 This document provides guidance for testers on how to obtain OPNFV compliance
-certification. The OPNFV Compliance Verification Program (CVP) is administered by
+certification. The OPNFV Verified Program (OVP) is administered by
 the OPNFV Compliance and Certification (C&C) committee.

 For further information about the workflow and general inquiries about the
-program, please check out the `CVP web portal`_, or contact
-the C&C committee by email address cvp@opnfv.org. This email address should be used
-for all communication with the CVP.
+program, please check out the `OVP web portal`_, or contact
+the C&C committee by email address verified@opnfv.org. This email address should be used
+for all communication with the OVP.

 Step 1: Applying
 ================

 A tester should start the process by completing an application.
-The application form can found on the `CVP web portal`_ and the following
+The application form can found on the `OVP web portal`_ and the following
 information should be provided:

 - Organization name
@@ -36,9 +36,9 @@ information should be provided:
   and third party hardware (please specify)
 - Primary contact name, business email, postal address and phone number
   Only the primary email address should be used for
-  official communication with OPNFV CVP.
-- User ID for CVP web portal
-  The CVP web portal supports the Linux Foundation user ID in the current release.
+  official communication with OPNFV OVP.
+- User ID for OVP web portal
+  The OVP web portal supports the Linux Foundation user ID in the current release.
   If a new user ID is needed, visit https://identity.linuxfoundation.org.
 - Location where the verification testing is to be conducted. Choose one: (internal
   vendor lab, third-party lab)
@@ -53,8 +53,8 @@ Once the application information is received and in order, an email response wil
 sent to the primary contact with confirmation and information to proceed.

 [Editor's note:
-No fee has been established at this time for CVP applications. Recommend
-we skip fee for the initial release of CVP.]
+No fee has been established at this time for OVP applications. Recommend
+we skip fee for the initial release of OVP.]

 Step 2: Testing
 ===============
@@ -71,20 +71,20 @@ of this ID for future reference.
 Step 3: Submitting Test Results
 ===============================

-Testers can upload the test results to the `CVP web portal`_.
+Testers can upload the test results to the `OVP web portal`_.
 By default, the results are visible only to the tester who uploaded the data.

 Testers can self-review the test results through the portal until they are ready to ask
-for CVP review. They may also update with or add new test results as needed.
+for OVP review. They may also update with or add new test results as needed.
 Once the tester is satisfied with the test result, the tester grants access to the test result
-for CVP review via the portal. The test result is identified by the unique Test ID.
+for OVP review via the portal. The test result is identified by the unique Test ID.

 When a test result is made visible to the reviewers, the web portal will notify
-cvp@opnfv.org and Cc the primary contact email that a review request has been made and reference
-the Test ID. This will alert the C&C Committee to start the CVP review process.
+verified@opnfv.org and Cc the primary contact email that a review request has been made and reference
+the Test ID. This will alert the C&C Committee to start the OVP review process.

-Step 4: CVP Review
+Step 4: OVP Review
 ===================

 Upon receiving the email notification and the Test ID, the C&C Committee conducts a
@@ -97,14 +97,13 @@ compliance or non-compliance to the C&C Committee.
 Normally, the outcome of the review should be communicated to the tester within 10
 business days after all required information is in order.

-If an application is denied, an appeal can be made to the C&C Committee or ultimately to the
-Board of Directors of OPNFV.
+If an application is denied, an appeal can be made to the C&C Committee.

 Step 5: Grant of Use of Logo
 ============================

 If an application is approved, further information will be communicated to the tester
-on the guidelines of using OPNFV CVP logos and the status of compliance for promotional purposes.
+on the guidelines of using OPNFV OVP logos and the status of compliance for promotional purposes.

 Appendix
@@ -117,4 +116,4 @@ Appendix

 .. References
-.. _`CVP web portal`: https://cvp.opnfv.org
+.. _`OVP web portal`: https://verified.opnfv.org
diff --git a/docs/testing/user/index.rst b/docs/testing/user/index.rst
deleted file mode 100644
index 120aba45..00000000
--- a/docs/testing/user/index.rst
+++ /dev/null
@@ -1,71 +0,0 @@
-.. This work is licensed under a Creative Commons Attribution 4.0 International License.
-.. http://creativecommons.org/licenses/by/4.0
-.. (c) Ericsson AB
-
-======================================================
-Compliance and Verification program user documentation
-======================================================
-
-.. toctree::
-   :maxdepth: 2
-
-Version history
-===============
-
-+------------+----------+------------------+----------------------------------+
-| **Date**   | **Ver.** | **Author**       | **Comment**                      |
-|            |          |                  |                                  |
-+------------+----------+------------------+----------------------------------+
-| 2017-03-15 | 0.0.1    | Chris Price      | Draft version                    |
-|            |          |                  |                                  |
-+------------+----------+------------------+----------------------------------+
-
-
-Introduction
-============
-
-This document provides all relevant user documentation for executing the OPNFV
-compliance and certification program (CVP). The CVP provides mechanisms for evaluating
-NFV infrastructures to establish their alignment with the target architectures, behaviours
-and functions defined by the OPNFV community.
-
-Compliance and Verification Program guidelines
-==============================================
-
-The CVP provides publicly available mechanisms for evaluating compliance to OPNFV concepts,
-behaviours and architectures. The CVP, through the Compliance and Certification
-committee, is responsible for evaluating the results of any evaluations and providing
-access to any associated OPNFV branding. Details of the CVP are described in the `CVP Document.`_
-
-"Add the final CVP doc URL in plain text here"
-
-Compliance and Verification program test specification
-======================================================
-
-The `test specification`_ provides an outline of the prerequisites, methods, tools
-& areas that are applicable to the CVP test process. The `test specification`_ is intended
-to give a use a comprehensive overview of what is expected of engineer prior to running
-the NFV platform evaluation suite, what tools and processes will be used during the
-evaluation and a guide as to how the results of the tests will be interpreted. It is advised
-that all practitioners of the CVP read the `test specification`_ prior to beginning
-the process of evaluation.
-
-"Add the final test spec URL in plain text here"
-
-Compliance and Verification program Test Plan
-=============================================
-
-The test spec will outline the processes & areas and the test case descriptions included.
-This document will in addition outline how to modularise the areas to be executed, how to
-run each area and when the testing process begins and ends.
-
-Compliance and Verification program Test results
-================================================
-
-To be added once our test result reporting system and processes are defined.
-
-References
-==========
-
-.. _`CVP Document.`: https://opnfv.org
-.. _`test specification`: https://opnfv.org
diff --git a/docs/testing/user/cvpaddendum/index.rst b/docs/testing/user/ovpaddendum/index.rst
index cd8a296a..2bb7e8c8 100644
--- a/docs/testing/user/cvpaddendum/index.rst
+++ b/docs/testing/user/ovpaddendum/index.rst
@@ -3,10 +3,8 @@
 .. http://creativecommons.org/licenses/by/4.0
 .. (c) Intel and others

-.. _dovetail-cvp_addendum:
-
 ====================================================================
-Compliance Verification Program - Guidelines Addendum for Danube
+OPNFV Verified Program - Guidelines Addendum for Danube
 ====================================================================

 .. toctree::
@@ -17,12 +15,12 @@ Introduction
 ============

 This addendum provides a high-level description of the testing scope and
-pass/fail criteria used in the Compliance Verification Program (CVP) for the
-OPNFV Danube release. This information is intended as an overview for CVP
+pass/fail criteria used in the OPNFV Verified Program (OVP) for the
+OPNFV Danube release. This information is intended as an overview for OVP
 testers and for the Dovetail Project to help guide test-tool and test-case development
 for the OPNFV Danube release. The Dovetail project is responsible for documenting
-test-case specifications as well as implementing the CVP tool-chain through collaboration
-with the OPNFV testing community. CVP testing focuses on establishing the
+test-case specifications as well as implementing the OVP tool-chain through collaboration
+with the OPNFV testing community. OVP testing focuses on establishing the
 ability of the System Under Test (SUT) to perform NFVI and VIM operations and support
 Service Provider oriented features that ensure manageable, resilient and secure networks.
@@ -61,8 +59,8 @@ Assumptions about the System Under Test (SUT) include ...

 Scope of Testing
 ================

-The `OPNFV CVP Guidelines`_, as approved by the Board of Directors, outlines
-the key objectives of the CVP as follows:
+The `OPNFV OVP Guidelines`_, as approved by the Board of Directors, outlines
+the key objectives of the OVP as follows:

 - Help build the market for
@@ -87,8 +85,8 @@ the implementation of the underlying system under test".

 OPNFV provides a broad range of capabilities, including the reference platform itself
 as well as tools-chains and methodologies for building infrastructures, and deploying
 and testing the platform.
-Not all these aspects are in scope for CVP and not all functions and
-components are tested in the initial version of CVP. For example, the deployment tools
+Not all these aspects are in scope for OVP and not all functions and
+components are tested in the initial version of OVP. For example, the deployment tools
 for the SUT and CI/CD toolchain are currently out of scope. Similarly, performance
 benchmarking related testing is also out of scope or for further study.
 Newer functional areas such as MANO (outside of APIs in the NFVI and
@@ -98,13 +96,13 @@ VIM) are still developing and are for future considerations.

 General Approach
 ----------------

-In order to meet the above objectives for CVP, we aim to follow a general approach
+In order to meet the above objectives for OVP, we aim to follow a general approach
 by first identifying the overall requirements for all stake-holders, then
 analyzing what OPNFV and the upstream communities can effectively test and verify
-presently to derive an initial working scope for CVP, and to recommend what the
+presently to derive an initial working scope for OVP, and to recommend what the
 community should strive to achieve in future releases.

-The overall requirements for CVP can be categorized by the basic cloud
+The overall requirements for OVP can be categorized by the basic cloud
 capabilities representing common operations needed by basic VNFs, and additional
 requirements for VNFs that go beyond the common cloud capabilities including
 functional extensions, operational capabilities and additional carrier grade
@@ -118,7 +116,7 @@ these basic requirements.

 We are not yet ready to include compliance requirements for capabilities
 such as hardware portability, carrier grade performance, fault management and
 other operational features, security, MANO and VNF verification. These areas are
-being studied for consideration in future CVP releases.
+being studied for consideration in future OVP releases.

 In some areas, we will start with a limited level of verification
 initially, constrained by what community resources are able to support at this
@@ -144,7 +142,7 @@ In order to define the scope of the Danube-release of the compliance and
 verification program, this section analyzes NFV-focused platform capabilities
 with respect to the high-level objectives and the general approach outlined in
 the previous section. The analysis determines which capabilities are suitable
-for inclusion in this release of the CVP and which capabilities are to be
+for inclusion in this release of the OVP and which capabilities are to be
 addressed in future releases.

 1. Basic Cloud Capabilities
@@ -173,12 +171,12 @@ services. Running such basic VNF leads to a set of common requirements, includin

 - simple virtual machine resource scheduling on multiple nodes

 OPNFV mainly supports OpenStack as the VIM up to the Danube release. The
-VNFs used in the CVP program, and features in scope for the program which are
+VNFs used in the OVP program, and features in scope for the program which are
 considered to be basic to all VNFs, require commercial OpenStack distributions to
 support a common basic level of cloud capabilities, and to be compliant to a
 common specification for these capabilities. This requirement significantly overlaps with
 OpenStack community's Interop working group's goals, but they are not
-identical. The CVP runs the OpenStack Refstack-Compute test cases to verify
+identical. The OVP runs the OpenStack Refstack-Compute test cases to verify
 compliance to the basic common API requirements of cloud management functions
 and VNF (as a VM) management for OPNFV.
 Additional NFV specific requirements are added in network data path validation,
@@ -199,12 +197,12 @@ NFV has functional requirements beyond the basic common cloud capabilities,
 esp. in the networking area. Examples like SDNVPN, IPv6, SFC may be
 considered additional NFV requirements beyond general purpose cloud
 computing. These feature requirements expand beyond common OpenStack (or other
-VIM) requirements. OPNFV CVP will incorporate test cases to verify
+VIM) requirements. OPNFV OVP will incorporate test cases to verify
 compliance in these areas as they become mature. Because these extensions may
 impose new API demands, maturity and industry adoption is a prerequisite for
 making them a mandatory requirement for OPNFV compliance. At the time of Danube,
-we have not identified a new functional area that is mandatory for CVP.
-In the meantime, CVP
+we have not identified a new functional area that is mandatory for OVP.
+In the meantime, OVP
 intends to offer tests in some of these areas as an optional extension of the
 test report to be submitted for review, noting that passing these tests will
 not be required to pass OPNFV compliance verification.
@@ -231,7 +229,7 @@ and should be a mandatory requirement.

 The current test cases in HA cover the basic area of failure and resource
 overload conditions for a cloud platform's service availability, including all
 of the basic cloud capability services, and basic compute and storage loads,
-so it is a meaningful first step for CVP. We expect additional high availability
+so it is a meaningful first step for OVP. We expect additional high availability
 scenarios be extended in future releases.

 4. Resiliency
@@ -245,14 +243,14 @@ OPNFV system resiliency in the Danube release that can be used to provide
 limited coverage in this area.
 However, this is a relatively new test methodology in
 OPNFV, additional study and testing experiences are still needed. We defer the resiliency testing to
-future CVP releases.
+future OVP releases.

 5. Security

 Security is among the top priorities as a carrier grade requirement by the
 end-users. Some of the basic common functions, including virtual network isolation,
 security groups, port security and role based access control are already covered as
-part of the basic cloud capabilities that are verified in CVP. These test cases
+part of the basic cloud capabilities that are verified in OVP. These test cases
 however do not yet cover the basic required security capabilities expected of
 an end-user deployment. It is an area that we should address in the near future, to define
 a common set of requirements and develop test cases for verifying those
@@ -262,7 +260,7 @@ Another common requirement is security vulnerability scanning.
 While the OPNFV security project integrated tools for security vulnerability
 scanning, this has not been fully analyzed or exercised in Danube
 release. This area needs further work to identify the required level of security for the
-purpose of OPNFV in order to be integrated into the CVP. End-user inputs on
+purpose of OPNFV in order to be integrated into the OVP. End-user inputs on
 specific requirements in security is needed.

 6. Service assurance
@@ -293,38 +291,38 @@ NFVI features that users care about.

 There are a lot of projects in OPNFV developing use cases and sample VNFs,
 however most are still in early phase and require further enhancements to
-become useful additions to the CVP. Examples such as vIMS, or those which are
+become useful additions to the OVP. Examples such as vIMS, or those which are
 not yet available in Danube release, e.g. vCPE, will be valuable additions to
-the CVP. These use cases need to be widely accepted, and since they are more
-complex, using these VNFs for CVP demands a higher level of community resources
+the OVP. These use cases need to be widely accepted, and since they are more
+complex, using these VNFs for OVP demands a higher level of community resources
 to implement, analyze and document these VNFs. Hence, use case testing is not
-ready for CVP at the time of Danube, but can be incorporated in Euphrates or as
+ready for OVP at the time of Danube, but can be incorporated in Euphrates or as
 a future roadmap area.

 8. Additional capabilities

 In addition to the capabilities analyzed above, there are further system
-aspects which are of importance for the CVP. These comprise operational and
+aspects which are of importance for the OVP. These comprise operational and
 management aspects such as platform in-place upgrades and platform operational
 insights such as telemetry and logging. Further aspects include API backward
 compatibility / micro-versioning, workload migration, multi-site federation
 and interoperability with workload automation platforms, e.g. ONAP. Finally,
 efficiency aspects such as the hardware and energy footprint of the platform
-are worth considering in the CVP.
+are worth considering in the OVP.

 OPNFV is addressing these items on different levels of details in different
 projects. However, the contributions developed in these projects are not yet
 considered widely available in commercial systems in order to include them in
-the CVP. Hence, these aspects are left for inclusion in future releases of the
-CVP.
+the OVP. Hence, these aspects are left for inclusion in future releases of the
+OVP.

-Scope of the Danube-release of the CVP
+Scope of the Danube-release of the OVP
 --------------------------------------

 Summarizing the results of the analysis above, the scope of the Danube-release
-of the CVP is as follows:
+of the OVP is as follows:

 - Test Area: Basic cloud capabilities
@@ -336,8 +334,8 @@ of the CVP is as follows:

   - *VM resource scheduling*
   - *Forwarding packets in the data path*

-\* The OPNFV CVP utilizes the same set of test cases as the OpenStack
-interoperability program *OpenStack Powered Compute*. Passing the OPNFV CVP
+\* The OPNFV OVP utilizes the same set of test cases as the OpenStack
+interoperability program *OpenStack Powered Compute*. Passing the OPNFV OVP
 does **not** imply that the SUT is certified according to the *OpenStack
 Powered Compute* program. *OpenStack Powered Compute* is a trademark of the OpenStack
 foundation and the corresponding certification label can only be awarded by the
@@ -365,8 +363,8 @@ OpenStack foundation.

 These tested areas represent significant advancement in the direction to meet
-the CVP's objectives and end-user expectations, and is a good basis for the
-initial phase of CVP.
+the OVP's objectives and end-user expectations, and is a good basis for the
+initial phase of OVP.
 Note: The SUT is limited to NFVI and VIM functions. While testing MANO
 component capabilities is out of scope, certain APIs exposed towards MANO are
@@ -375,7 +373,7 @@ elements may be part of the test infrastructure; for example used for workload
 deployment and provisioning.


-Scope considerations for future CVP releases
+Scope considerations for future OVP releases
 --------------------------------------------

 Based on the previous analysis, the following items are outside the scope of
@@ -417,6 +415,6 @@ Applicants who choose to run the optional test cases can include the results
 of the optional test cases to highlight the additional compliance.

 .. References
-.. _`OPNFV CVP Guidelines`: https://wiki.opnfv.org/display/dovetail/CVP+document
+.. _`OPNFV OVP Guidelines`: https://wiki.opnfv.org/display/dovetail/OVP+document
 .. _`Pharos specification`: https://wiki.opnfv.org/display/pharos/Pharos+Specification
diff --git a/docs/testing/user/systempreparation/index.rst b/docs/testing/user/systempreparation/index.rst
index fe1f60d6..afc5dc47 100644
--- a/docs/testing/user/systempreparation/index.rst
+++ b/docs/testing/user/systempreparation/index.rst
@@ -5,25 +5,25 @@

 .. _dovetail-system_preparation_guide:

-============================================================
-Compliance Verification Program system preparation guide
-============================================================
+===============================================
+OPNFV Verified Program system preparation guide
+===============================================

 This document provides a general guide to hardware system prerequisites
-and expectations for running OPNFV CVP testing. For detailed guide of
+and expectations for running OPNFV OVP testing. For detailed guide of
 preparing software tools and configurations, and conducting the test,
 please refer to the User Guide :ref:dovetail-testing_user_guide.

-The CVP test tools expect that the hardware of the System Under Test (SUT)
+The OVP test tools expect that the hardware of the System Under Test (SUT)
 is Pharos compliant `Pharos specification`_

 The Pharos specification itself is a general guideline, rather than a set of
 specific hard requirements at this time, developed by the OPNFV community. For
-the purpose of helping CVP testers, we summarize the main aspects of hardware to
-consider in preparation for CVP testing.
+the purpose of helping OVP testers, we summarize the main aspects of hardware to
+consider in preparation for OVP testing.

-As described by the CVP Testing User Guide, the hardware systems involved in
-CVP testing includes a Test Node, a System Under Test (SUT) system, and network
+As described by the OVP Testing User Guide, the hardware systems involved in
+OVP testing includes a Test Node, a System Under Test (SUT) system, and network
 connectivity between them.

 The Test Node can be a bare metal machine or a virtual machine that can support
@@ -39,13 +39,13 @@ ARM-64. Mixing different architectures in the same SUT is not supported.

 A minimum of 5 servers, 3 configured for controllers and 2 or more configured for
 compute resource are expected. However this is not a hard requirement
-at this phase. The CVP 1.0 mandatory test cases only require one compute server. At
+at this phase. The OVP 1.0 mandatory test cases only require one compute server. At
 lease two compute servers are required to pass some of the optional test cases
-in the current CVP release. CVP control service high availability tests expect two
+in the current OVP release. OVP control service high availability tests expect two
 or more control nodes to pass, depending on the HA mechanism implemented by the SUT.

-The SUT is also expected to include components for persistent storage. The CVP
+The SUT is also expected to include components for persistent storage. The OVP
 testing does not expect or impose significant storage size or performance requirements.

 The SUT is expected to be connected with high performance networks. These networks
@@ -56,7 +56,7 @@ and compute services in the SUT
 - A data network that supports the virtual network capabilities and data path testing

 Additional networks, such as Light Out Management or storage networks, may be
-beneficial and found in the SUT, but they are not a requirement for CVP testing.
+beneficial and found in the SUT, but they are not a requirement for OVP testing.

 .. References
 .. _`Pharos specification`: https://wiki.opnfv.org/display/pharos/Pharos+Specification
diff --git a/docs/testing/user/testspecification/dynamicnetwork/index.rst b/docs/testing/user/testspecification/dynamicnetwork/index.rst
index cda1d5b6..a25e4f66 100644
--- a/docs/testing/user/testspecification/dynamicnetwork/index.rst
+++ b/docs/testing/user/testspecification/dynamicnetwork/index.rst
@@ -59,7 +59,7 @@ a previous test. Specifically, every test performs clean-up operations which
 return the system to the same state as before the test.

 All these test cases are included in the test case dovetail.tempest.tc003 of
-cvp test suite.
+OVP test suite.

 Test Descriptions
 =================
diff --git a/docs/testing/user/testspecification/forwardingpackets/index.rst b/docs/testing/user/testspecification/forwardingpackets/index.rst
index c12e0939..4a4ffea9 100644
--- a/docs/testing/user/testspecification/forwardingpackets/index.rst
+++ b/docs/testing/user/testspecification/forwardingpackets/index.rst
@@ -49,7 +49,7 @@ The test area is structured based on the basic operations of forwarding packets
 in data path through virtual networks. Specifically, the test performs
 clean-up operations which return the system to the same state as before the test.

-This test case is included in the test case dovetail.tempest.tc001 of cvp test suite.
+This test case is included in the test case dovetail.tempest.tc001 of OVP test suite.
 Test Descriptions
 =================
diff --git a/docs/testing/user/testspecification/index.rst b/docs/testing/user/testspecification/index.rst
index 858d02ce..2a8298c9 100644
--- a/docs/testing/user/testspecification/index.rst
+++ b/docs/testing/user/testspecification/index.rst
@@ -5,23 +5,23 @@

 .. _dovetail-test_case_specification:

 ==================================================
-Compliance Verification program test specification
+OPNFV Verified Program test specification
 ==================================================

 Introduction
 ============

-The OPNFV CVP provides a series or test areas aimed to evaluate the operation
+The OPNFV OVP provides a series or test areas aimed to evaluate the operation
 of an NFV system in accordance with carrier networking needs. Each test area
 contains a number of associated test cases which are described in detail in the
 associated test specification.

-All tests in the CVP are required to fulfill a specific set of criteria in
-order that the CVP is able to provide a fair assessment of the system under
+All tests in the OVP are required to fulfill a specific set of criteria in
+order that the OVP is able to provide a fair assessment of the system under
 test.  Test requirements are described in the 'Test Case Requirements'_
 document.

-All tests areas addressed in the CVP are covered in the following test
+All tests areas addressed in the OVP are covered in the following test
 specification documents.

 .. toctree::
diff --git a/docs/testing/user/testspecification/machinelifecycle/index.rst b/docs/testing/user/testspecification/machinelifecycle/index.rst
index 5b46b90f..b0cc0d79 100644
--- a/docs/testing/user/testspecification/machinelifecycle/index.rst
+++ b/docs/testing/user/testspecification/machinelifecycle/index.rst
@@ -60,7 +60,7 @@ created by a previous test. Specifically, every test performs clean-up
 operations which return the system to the same state as before the test.
 All these test cases are included in the test case dovetail.tempest.tc004 of
-cvp test suite.
+OVP test suite.

 Test Descriptions
 =================
diff --git a/docs/testing/user/testspecification/multiplenodes/index.rst b/docs/testing/user/testspecification/multiplenodes/index.rst
index 2049f601..4b4859a9 100644
--- a/docs/testing/user/testspecification/multiplenodes/index.rst
+++ b/docs/testing/user/testspecification/multiplenodes/index.rst
@@ -54,7 +54,7 @@ the state created by a previous test. Specifically, every test performs
 clean-up operations which return the system to the same state as before the test.

 All these test cases are included in the test case dovetail.tempest.tc005 of
-cvp test suite.
+OVP test suite.

 Test Descriptions
 =================
diff --git a/docs/testing/user/testspecification/securitygroup/index.rst b/docs/testing/user/testspecification/securitygroup/index.rst
index daacd963..61aa1c4b 100644
--- a/docs/testing/user/testspecification/securitygroup/index.rst
+++ b/docs/testing/user/testspecification/securitygroup/index.rst
@@ -54,7 +54,7 @@ the state created by a previous test. Specifically, every test performs
 clean-up operations which return the system to the same state as before the test.

 All these test cases are included in the test case dovetail.tempest.tc002 of
-cvp test suite.
+OVP test suite.

 Test Descriptions
 =================
diff --git a/docs/testing/user/testspecification/vimoperationscompute/index.rst b/docs/testing/user/testspecification/vimoperationscompute/index.rst
index ea5441ac..64f4356b 100644
--- a/docs/testing/user/testspecification/vimoperationscompute/index.rst
+++ b/docs/testing/user/testspecification/vimoperationscompute/index.rst
@@ -69,7 +69,7 @@ For brevity, the test cases in this test area are summarized together based
 on the operations they are testing.

 All these test cases are included in the test case dovetail.osinterop.tc001 of
-cvp test suite.
+OVP test suite.
Test Descriptions ================= diff --git a/docs/testing/user/testspecification/vimoperationsidentity/index.rst b/docs/testing/user/testspecification/vimoperationsidentity/index.rst index c2282f85..9790e75e 100644 --- a/docs/testing/user/testspecification/vimoperationsidentity/index.rst +++ b/docs/testing/user/testspecification/vimoperationsidentity/index.rst @@ -54,7 +54,7 @@ The test area is structured based on VIM identity operations. Each test case is able to run independently, i.e. irrelevant of the state created by a previous test. All these test cases are included in the test case dovetail.osinterop.tc001 of -cvp test suite. +the OVP test suite. Dependency Description ====================== diff --git a/docs/testing/user/testspecification/vimoperationsimage/index.rst b/docs/testing/user/testspecification/vimoperationsimage/index.rst index 6bf3f4c8..2970207e 100644 --- a/docs/testing/user/testspecification/vimoperationsimage/index.rst +++ b/docs/testing/user/testspecification/vimoperationsimage/index.rst @@ -55,7 +55,7 @@ For brevity, the test cases in this test area are summarized together based on the operations they are testing. All these test cases are included in the test case dovetail.osinterop.tc001 of -cvp test suite. +the OVP test suite. Test Descriptions ================= diff --git a/docs/testing/user/testspecification/vimoperationsnetwork/index.rst b/docs/testing/user/testspecification/vimoperationsnetwork/index.rst index 72ad5766..bf929828 100644 --- a/docs/testing/user/testspecification/vimoperationsnetwork/index.rst +++ b/docs/testing/user/testspecification/vimoperationsnetwork/index.rst @@ -57,7 +57,7 @@ For brevity, the test cases in this test area are summarized together based on the operations they are testing. All these test cases are included in the test case dovetail.osinterop.tc001 of -cvp test suite. +the OVP test suite.
Test Descriptions ================= diff --git a/docs/testing/user/testspecification/vimoperationsvolume/index.rst b/docs/testing/user/testspecification/vimoperationsvolume/index.rst index c59deb2d..ce51b954 100644 --- a/docs/testing/user/testspecification/vimoperationsvolume/index.rst +++ b/docs/testing/user/testspecification/vimoperationsvolume/index.rst @@ -70,7 +70,7 @@ For brevity, the test cases in this test area are summarized together based on the operations they are testing. All these test cases are included in the test case dovetail.osinterop.tc001 of -cvp test suite. +the OVP test suite. Test Descriptions ================= diff --git a/docs/testing/user/teststrategy/index.rst b/docs/testing/user/teststrategy/index.rst deleted file mode 100644 index db05e035..00000000 --- a/docs/testing/user/teststrategy/index.rst +++ /dev/null @@ -1,301 +0,0 @@ -.. This work is licensed under a Creative Commons Attribution 4.0 International License. -.. http://creativecommons.org/licenses/by/4.0 -.. (c) Ericsson AB - -====================================================== -Compliance and Verification program test specification -====================================================== - -.. toctree:: - :maxdepth: 2 - -Version history -=============== - -+------------+----------+------------------+----------------------------------+ -| **Date** | **Ver.** | **Author** | **Comment** | -| | | | | -+------------+----------+------------------+----------------------------------+ -| 2017-03-15 | 0.0.1 | Chris Price | Draft version | -| | | | | -+------------+----------+------------------+----------------------------------+ - - -Introduction -============ - -This test specification provides a detailed outline of the prerequisites for performing evaluation -testing, testing methods and procedures used in the evaluation testing process and the tools provided -for running OPNFV evaluation testing.
- -A CVP system under test is assumed to be a stand alone cloud infrastructure running a virtualisation -software stack providing high availability, redundancy and resiliency. OPNFV CVP testing covers the -combination of hardware and software that provide an NFV platform for running virtual workloads, approximately -the VIM & VNFi as defined by ETSI NFV ISG. - -While the NFV platform is a composite system under test comprised of both hardware and software the VIM -& NFVi testing focuses on software evaluation where it is required and assumed the software is running on -a platform deemed to be Pharos compliant. A "Pharos compliant" stand alone hardware system can be summarised -as a POD of at least 3 control and two compute blades to exercise the minimum set of compliance testing. -Pharos compliance is further defined, and expected to be implemented according to the `"Pharos specification"`_. - - -------- -Purpose -------- - -This document is intended to be read by an engineer, intending to run or prepare a system for the evaluation -tests, prior to beginning the preparation for executing the evaluation tests. The document is also useful as -a reference to learn more about OPNFV CVP testing, it's assumptions, targets, methods & tools and expected outcomes. - -The engineer will be guided through configuring and populating an environment that is suitable for executing the -OPNFV compliance evaluation test suite. This includes interpretations of the Pharos specification and assumptions -made by the toolchain on how those interpretations are evaluated. - -In addition to system preparation the document provides a guide for working with the methods and tools associated -with the evaluation test procedure. - ----------------- -Scope of testing ----------------- - -The OPNFV CVP testing suite is implemented to evaluate the compliance of a virtualisation platform with standard -NFV, carrier and communications network, platform requirements. 
Testing focuses on evaluating the software layer -in the platform in it's ability to provide OPNFV features and behaviours required to host communication and networking -industry applications. This section will provide a brief overview of the target areas and scope addressed by the -testing suites - -Features --------- - -CVP testing addresses primarily features and capabilities of value to, and further developed by, the OPNFV community. -Target areas for CVP testing are to verify the presence and compliance of: - * Validation of common cloud environment requirements inherited from key communities such as OpenStack and ETSI - * Capabilities required to provide consistent application and workload on-boarding, lifecycle management, and scaling - * Networking features required to address a carrier environment including VPN, service chaining, Trunking and peering - * Others?? - -Resilience ----------- - -Availability of service and infrastructure are fundamental principals of carrier networking and as such form a -component of the compliance test suite aimed to establish the ability of a virtualisation infrastructure to identify -and recover from failures across the system. The evaluation criteria specifically target control plane resilience -and the ability to identify, accommodate and recover from hardware failures in the system. - -Security? ---------- - -https://jira.opnfv.org/browse/DOVETAIL-382 - -This section should outline the test strategy related to security, -representing the test cases and requirements on the SUT for security testing. - -Scale ------ - -The ability to scale is important for carrier networking infrastructure and applications. The first iteration of the -compliance evaluation test suites address the need to scale however do not enforce strict requirements on how scale is -achieved. 
- -Compliance to the Pharos specification is evaluated as a component of the test suite and provides an visibility into -the physical infrastructures ability to scale in accordance with OPNFV requirements. The test suite itself does not -require an infrastructure that is running to be deployed at scale in order to pass the tests. It is assumed the -compliance to Pharos provides enough infrastructure validation that the capability is inherent. - -Characteristics ---------------- - -The OPNFV community invests heavily in ensuring the features and capabilities of the stack are able to run in the -most performant manner according to the needs of the workloads. This can range from the ability to linearly scale -workloads to the ability to process traffic at line rates. - -While each of these is a critical factor in evaluating the performance of a virtualisation infrastructure the CVP -suite does not at this time specify strict requirements on any given capability as part of the evaluation. It is -expected that in future test suites concise performance metrics will be required to be achieved to achieve compliance -at this time the community has elected not to place pass/fail requirements on characteristics. - - ---------------------- -Definitions and terms ---------------------- - -This document uses a number of acronyms and terms the reader may not be familiar with. For a full glossary of -terms used in OPNFV, refer to the `OPNFV Glossary`_. - -+------------+----------------------------------------------------------------+ -| **Term** | **Description** | -| | | -+------------+----------------------------------------------------------------+ -| CVP | The OPNFVCompliance and Verification Program | -| | | -+------------+----------------------------------------------------------------+ -| SUT | System under test; the complete system targeted by the | -| | test cases, including software hardware and configuration. 
| -| | | -+------------+----------------------------------------------------------------+ -| More | Additional entries to be added to this table | -| | | -+------------+----------------------------------------------------------------+ - - -Overview -======== - -This section of the document will describe in details the processes and procedures required to perform OPNFV CVP -compliance testing. The section is structured to address; planning and preparation, the approach to testing, the -scope of test activities including details of the test areas, methods and tools used in the testing and the result, -reporting & output of the test suites when run. - -Test planning and preparation -============================= - -https://jira.opnfv.org/browse/DOVETAIL-384 - -Give an outline of the planning phase, refer to the detailed system prep guide and test guides here. - - ../../systempreparation/index.rst - -Feature testing scope and approach -================================== - -------------------- -Pre-test validation -------------------- - -Describe how to evaluate test readiness here. -I suggest this be a process of doing a "dry run" and evaluating the results on the DB. This should not -need to be reproduced later in the document. - - -------------- -Test approach -------------- - -Here we should describe the way we approach testing different areas. API through RefStack, resilience through -ETSI test implementations, security is done in xyz way. This should serve as an introduction to the following -feature test scope sections and provide common information not to be replicated further down. - - ------------------- -Feature test scope ------------------- - -Included test areas -------------------- - -CVP testing for the Danube release focuses on evaluating the ability of a platform to host basic carrier -networking related workloads. 
The testing focuses on establishing the ability of the SUT to perform -basic NFVi operations such as managing images, instatiating workload & networking these workloads in a -secure and resiliant manner. - -Many OPNFV features are derived from our target upstream communities resulting in the tests focusing -on exposing behaviour present through those development efforts. This approach to OPNFV development -is reflected in our CVP testing where upstream validation procedures are leveraged to validate the -composite platform. The OpenStack `RefStack <https://refstack.openstack.org/#/>`_ test suites are -leveraged in OPNFV for performing VIM validation according to expected behaviour in OPNFV. - -OPNFV CVP testing has explicit requirements on hardware, control plane and compute topologies which -are assumed to be in place, these are covered in the `Preparing the virtualisation infrastructure`_ -section of this document. Tests may fail if the system is not prepared and configured in accordance -the prpeparation guidelines. - -Excluded test areas -------------------- - -The CVP testing procedure in Danube do not cover all aspects of the available OPNFV system, nor feature -suites. To ensure the highest quality of evaluation that provides a trustworthy foundation for industry -compliance toward a common OPNFV standard tests and test areas are selected based on three key principals; -maturity of the test area and framework, availability of features and capabilities across OPNFV compositions, -relevance and value to the industry for evaluation. - -In the Danube release of the CVP we have elected to esbalish an evaluation suite for only the common base -features of the platform. Features areas that are optional or not yet mature in the platform are expluded. -This includes a number of optinal networking features such as BGP VPN networking and Service Chaining which -are intended to be included as optional evaluation areas in future CVP releases. 
- -The Danube release of the OPNFV CVP testing suite in addition dose not attempt to provide an evaluation criteria -for hardware. Any hardware used in the evaluation testing must comply to the pre-resuities outlined in the -`Preparing the virtualisation infrastructure`_ section of this document. Although the hardware is not tested -itself and no qualification metric has been established for the hardware in the Danube release. - -Test criteria and reporting ---------------------------- - -https://jira.opnfv.org/browse/DOVETAIL-389 - -This section should specify the criteria to be used to decide whether a test item has passed or failed. -As each area may have differences it is important to ensure the user can react to a failure in accordance -with it's relevance to the overall test suites running. - -Critical functions that are mandatory should be identified here. If any of these fail the system testing -should be halted. If we can having a grading or sorts described here would be helpful as a "guide" for a -tester to know if they should submit, resolve an item in their stack, or try again at another time. - - ---------------------- -Test design and tools ---------------------- - -This section needs to be done once we know the tools and areas covered by the CVP testing suites. Parked for now. - -VIM NBI testing ---------------- - -Describe the test areas addressed and how the tools are designed. It is important to understand the behaviour -of the testing framework when running these tests. Here we get into the details of behaviour for each of -the test frameworks we use, what they are testing and how the results are to be interpreted. - -Outline the tool in detail, in this case RefStack. How does it work, is it run to completion, is reporting -done per test case when do I as a user know I should do something? - -Summarise the tests to be executed by this test framework and the purpose of those tests in the evaluation -of the CVP. 
Are there dependancies between tests, does the tool expect a certain behaviour, do the test -cases have specific dependancies. This provides the overall context of the evaluation performed by this -toolchain / suite and I would not want to be surprised by something when I run the tests after reading this. - -Next test area --------------- - -What is the test area, what tools, how do they work what does it mean to me as a tester? - -Another test area ------------------ - -Again what is the test area, what tools, how do they work what does it mean to me as a tester? - - -CVP test execution -================== - -This section should identify the test procedure being executed. Include all people and -roles involved in the execution of a CVP test. Include procedures, mailing lists, escalation, support -and periods of waiting for results. - -The general workflow I assume should be something like this: - -* Log into the CVP test case staging computer -* Open a Browser and Log into the CVP tool. -* Select the CVP suite to run, agree to the questions (there should be only 1 for now) -* Run the tests (we should be able to launch this from the Web UI if they are on the hosting machine) -* Wait for the tool to report completed. -* Review your results, refer to trouble shooting or support as needed -* Submit your results for CVP evaluation - ------------- -Test reports ------------- - -Describe the process of producing and accessing the test report. This shoudl be a sub-section of CVP test execution I think. - -how do I connect a test suite to my account to get a report? How do I access the report when it is ready, -how do I identify one report from another in the toolchain? We should go into all the details here and point -to the tool, referring to the "preparation" section of this document if needed for context. - - -References -========== - -.. _`"Pharos specification"`: https://opnfv.org/ -.. 
_`OPNFV Glossary`: http://docs.opnfv.org/en/latest/glossary/index.html - diff --git a/docs/testing/user/userguide/index.rst b/docs/testing/user/userguide/index.rst index 89e76c11..70f78b5c 100644 --- a/docs/testing/user/userguide/index.rst +++ b/docs/testing/user/userguide/index.rst @@ -5,7 +5,7 @@ .. _dovetail-testing_user_guide: ******************************************************** -Compliance Verification Program Testing User Guide +OPNFV Verified Program Testing User Guide ******************************************************** .. toctree:: diff --git a/docs/testing/user/userguide/testing_guide.rst b/docs/testing/user/userguide/testing_guide.rst index e55c1595..54fa294b 100644 --- a/docs/testing/user/userguide/testing_guide.rst +++ b/docs/testing/user/userguide/testing_guide.rst @@ -3,15 +3,15 @@ .. (c) OPNFV, Huawei Technologies Co.,Ltd and others. ========================================== -Conducting CVP Testing with Dovetail +Conducting OVP Testing with Dovetail ========================================== Overview ------------------------------ -The Dovetail testing framework for CVP consists of two major parts: the testing client that +The Dovetail testing framework for OVP consists of two major parts: the testing client that executes all test cases in a lab (vendor self-testing or a third party lab), -and the server system that is hosted by the CVP administrator to store and +and the server system that is hosted by the OVP administrator to store and view test results based on a web API. The following diagram illustrates this overall framework. @@ -26,7 +26,7 @@ The above diagram assumes that the tester's Test Host is situated in a DMZ, whic has internal network access to the SUT and external access via the public Internet. The public Internet connection allows for easy installation of the Dovetail containers. A singular compressed file that includes all the underlying results can be pulled from -the Test Host and uploaded to the OPNFV CVP server. 
+the Test Host and uploaded to the OPNFV OVP server. This arrangement may not be supported in some labs. Dovetail also supports an offline mode of installation that is illustrated in the next diagram. @@ -41,11 +41,11 @@ the Test Host. While it is possible to run the Test Host as a virtual machine, this user guide assumes it is a physical machine for simplicity. The rest of this guide will describe how to install the Dovetail tool as a -Docker container image, go over the steps of running the CVP test suite, and +Docker container image, go over the steps of running the OVP test suite, and then discuss how to view test results and make sense of them. Readers interested -in using Dovetail for its functionalities beyond CVP testing, e.g. for in-house +in using Dovetail for its functionalities beyond OVP testing, e.g. for in-house or extended testing, should consult the Dovetail developer's guide for additional information. @@ -109,7 +109,7 @@ Installing Prerequisite Packages on the Test Host The main prerequisite software for Dovetail are Python and Docker. -In the CVP test suite for the Danube release, Dovetail requires Python 2.7. Various minor +In the OVP test suite for the Danube release, Dovetail requires Python 2.7. Various minor versions of Python 2.7.x are known to work Dovetail, but there are no assurances. Python 3.x is not supported at this time. @@ -375,7 +375,7 @@ Installing Dovetail on the Test Host The Dovetail project maintains a Docker image that has Dovetail test tools preinstalled. This Docker image is tagged with versions. Before pulling the Dovetail image, check the -OPNFV's CVP web page first to determine the right tag for CVP testing. +OPNFV's OVP web page first to determine the right tag for OVP testing. 
Online Test Host """""""""""""""" @@ -488,10 +488,10 @@ Build Local DB and Testapi Services ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ The steps in this section only need to be executed if the user plans on storing consolidated -results on the Test Host that can be uploaded to the CVP portal. +results on the Test Host that can be uploaded to the OVP portal. Dovetail needs to build the local DB and testapi service for storing and reporting results -to the CVP web portal. There is a script in the Dovetail container for building the local DB. +to the OVP web portal. There is a script in the Dovetail container for building the local DB. The ports 27017 and 8000 are used by the DB and testapi respectively. If the Test Host is using these ports for existing services, to avoid conflicts, remap the ports to values that are unused. Execute the commands below in the Dovetail container to remap ports, as required. @@ -515,8 +515,8 @@ To validate the DB and testapi services are running successfully, navigate to th the IP address of the Test Host and the testapi port number. If you can access this URL successfully, the services are up and running. -Running the CVP Test Suite --------------------------- +Running the OVP Test Suite +---------------------------- All or a subset of the available tests can be executed at any location within the Dovetail container prompt. You can refer to :ref:`cli-reference` @@ -528,7 +528,7 @@ for the details of the CLI. $ dovetail run --testsuite <test-suite-name> The '--testsuite' option is used to control the set of tests intended for execution -at a high level. For the purposes of running the CVP test suite, the test suite name follows +at a high level. For the purposes of running the OVP test suite, the test suite name follows the following format, ``ovp.<major>.<minor>.<patch>``. The latest and default test suite is ovp.1.0.0. @@ -555,7 +555,7 @@ arguments 'ipv6', 'sdnvpn' and 'tempest'. 
By default, results are stored in local files on the Test Host at ``$DOVETAIL_HOME/results``. Each time the 'dovetail run' command is executed, the results in the aforementioned directory -are overwritten. To create a singular compressed result file for upload to the CVP portal or +are overwritten. To create a singular compressed result file for upload to the OVP portal or for archival purposes, the results need to pushed to the local DB. This can be achieved by using the '--report' option with an argument syntax as shown below. Note, that the Test Host IP address and testapi port number must be substituted with appropriate values. @@ -594,10 +594,10 @@ When test execution is complete, a tar file with all result and log files is written in ``$DOVETAIL_HOME`` on the Test Host. An example filename is ``${DOVETAIL_HOME}/logs_20180105_0858.tar.gz``. The file is named using a timestamp that follows the convention 'YearMonthDay-HourMinute'. In this case, it was generated -at 08:58 on January 5th, 2018. This tar file is used to upload to the CVP portal. +at 08:58 on January 5th, 2018. This tar file is uploaded to the OVP portal. -Making Sense of CVP Test Results +Making Sense of OVP Test Results ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ When a tester is performing trial runs, Dovetail stores results in local files on the Test @@ -646,10 +646,10 @@ and dovetail_ha_tcXXX.out will not be created. ``sdnvpn_logs/dovetail.sdnvpn.tcXXX.log`` and ``tempest_logs/dovetail.tempest.tcXXX.log``, respectively. They all have the passed, skipped and failed test cases results. -CVP Portal Web Interface +OVP Portal Web Interface ------------------------ -The CVP portal is a public web interface for the community to collaborate on results +The OVP portal is a public web interface for the community to collaborate on results and to submit results for official OPNFV compliance verification.
The portal can be used as a resource by users and testers to navigate and inspect results more easily than by manually inspecting the log files. The portal also allows users to share results in a private manner @@ -699,7 +699,7 @@ Updating Dovetail or a Test Suite --------------------------------- Follow the instructions in section `Installing Dovetail on the Test Host`_ and -`Running the CVP Test Suite`_ by replacing the docker images with new_tags, +`Running the OVP Test Suite`_ by replacing the docker images with new_tags, .. code-block:: bash @@ -707,6 +707,6 @@ Follow the instructions in section `Installing Dovetail on the Test Host`_ and sudo docker pull opnfv/functest:<functest_new_tag> sudo docker pull opnfv/yardstick:<yardstick_new_tag> -This step is necessary if dovetail software or the CVP test suite have updates. +This step is necessary if the Dovetail software or the OVP test suite has been updated.
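The renamed testing guide relies on two naming conventions: the OVP test suite name, ``ovp.<major>.<minor>.<patch>`` (the default suite is ``ovp.1.0.0``), and the results tarball name written to ``$DOVETAIL_HOME``, e.g. ``logs_20180105_0858.tar.gz`` (a YearMonthDay date part followed by an HourMinute time part). The sketch below shows how both strings could be checked or parsed; it is an illustration only, not part of the Dovetail CLI.

```python
import re
from datetime import datetime

# Suite names follow ovp.<major>.<minor>.<patch>, e.g. "ovp.1.0.0".
_SUITE_RE = re.compile(r"^ovp\.(\d+)\.(\d+)\.(\d+)$")
# Results tarballs follow logs_<YYYYMMDD>_<HHMM>.tar.gz.
_TARBALL_RE = re.compile(r"^logs_(\d{8})_(\d{4})\.tar\.gz$")


def parse_suite_name(name):
    """Return (major, minor, patch) for a valid OVP suite name,
    or None if the name does not follow the convention."""
    m = _SUITE_RE.match(name)
    return tuple(int(g) for g in m.groups()) if m else None


def parse_results_timestamp(filename):
    """Recover the generation time encoded in a results tarball name
    such as 'logs_20180105_0858.tar.gz'."""
    m = _TARBALL_RE.match(filename)
    if m is None:
        raise ValueError("not a Dovetail results tarball: %r" % filename)
    # Concatenate the date and time parts and parse them together.
    return datetime.strptime(m.group(1) + m.group(2), "%Y%m%d%H%M")
```

For the examples used in the guide, ``parse_suite_name("ovp.1.0.0")`` yields ``(1, 0, 0)``, and ``parse_results_timestamp("logs_20180105_0858.tar.gz")`` yields 08:58 on January 5th, 2018, matching the text above.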