-rw-r--r--  docs/testing/user/reviewerguide/images/ovp_log_setup.png        bin  72844 -> 12385 bytes
-rw-r--r--  docs/testing/user/reviewerguide/images/ovp_pass_fraction.png    bin  83821 -> 36240 bytes
-rw-r--r--  docs/testing/user/reviewerguide/images/ovp_pass_percentage.png  bin  22057 -> 8757 bytes
-rw-r--r--  docs/testing/user/reviewerguide/images/ovp_result_overview.png  bin  71800 -> 0 bytes
-rw-r--r--  docs/testing/user/reviewerguide/images/ovp_result_review.png    bin  14835 -> 13652 bytes
-rw-r--r--  docs/testing/user/reviewerguide/images/ovp_top_nav.png          bin  21387 -> 20489 bytes
-rw-r--r--  docs/testing/user/reviewerguide/images/review_status.png        bin  0 -> 9887 bytes
-rw-r--r--  docs/testing/user/reviewerguide/images/sut_info.png             bin  17061 -> 12822 bytes
-rw-r--r--  docs/testing/user/reviewerguide/index.rst                       171
9 files changed, 90 insertions, 81 deletions
diff --git a/docs/testing/user/reviewerguide/images/ovp_log_setup.png b/docs/testing/user/reviewerguide/images/ovp_log_setup.png
index 4a68d9b6..f53b94d9 100644
--- a/docs/testing/user/reviewerguide/images/ovp_log_setup.png
+++ b/docs/testing/user/reviewerguide/images/ovp_log_setup.png
Binary files differ
diff --git a/docs/testing/user/reviewerguide/images/ovp_pass_fraction.png b/docs/testing/user/reviewerguide/images/ovp_pass_fraction.png
index 94dcd45a..30672e02 100644
--- a/docs/testing/user/reviewerguide/images/ovp_pass_fraction.png
+++ b/docs/testing/user/reviewerguide/images/ovp_pass_fraction.png
Binary files differ
diff --git a/docs/testing/user/reviewerguide/images/ovp_pass_percentage.png b/docs/testing/user/reviewerguide/images/ovp_pass_percentage.png
index 0d477a78..1a61f7b4 100644
--- a/docs/testing/user/reviewerguide/images/ovp_pass_percentage.png
+++ b/docs/testing/user/reviewerguide/images/ovp_pass_percentage.png
Binary files differ
diff --git a/docs/testing/user/reviewerguide/images/ovp_result_overview.png b/docs/testing/user/reviewerguide/images/ovp_result_overview.png
deleted file mode 100644
index 1f66a69c..00000000
--- a/docs/testing/user/reviewerguide/images/ovp_result_overview.png
+++ /dev/null
Binary files differ
diff --git a/docs/testing/user/reviewerguide/images/ovp_result_review.png b/docs/testing/user/reviewerguide/images/ovp_result_review.png
index 427127e0..56633447 100644
--- a/docs/testing/user/reviewerguide/images/ovp_result_review.png
+++ b/docs/testing/user/reviewerguide/images/ovp_result_review.png
Binary files differ
diff --git a/docs/testing/user/reviewerguide/images/ovp_top_nav.png b/docs/testing/user/reviewerguide/images/ovp_top_nav.png
index 3dfc0b09..a1c261f8 100644
--- a/docs/testing/user/reviewerguide/images/ovp_top_nav.png
+++ b/docs/testing/user/reviewerguide/images/ovp_top_nav.png
Binary files differ
diff --git a/docs/testing/user/reviewerguide/images/review_status.png b/docs/testing/user/reviewerguide/images/review_status.png
new file mode 100644
index 00000000..911b06fd
--- /dev/null
+++ b/docs/testing/user/reviewerguide/images/review_status.png
Binary files differ
diff --git a/docs/testing/user/reviewerguide/images/sut_info.png b/docs/testing/user/reviewerguide/images/sut_info.png
index 53c3d51a..29c249b2 100644
--- a/docs/testing/user/reviewerguide/images/sut_info.png
+++ b/docs/testing/user/reviewerguide/images/sut_info.png
Binary files differ
diff --git a/docs/testing/user/reviewerguide/index.rst b/docs/testing/user/reviewerguide/index.rst
index 374debfb..f08ae784 100644
--- a/docs/testing/user/reviewerguide/index.rst
+++ b/docs/testing/user/reviewerguide/index.rst
@@ -2,9 +2,9 @@
.. http://creativecommons.org/licenses/by/4.0
.. (c) Ericsson AB
-=============================================
+==================
OVP Reviewer Guide
-=============================================
+==================
.. toctree::
:maxdepth: 2
@@ -16,120 +16,110 @@ Introduction
This document provides detailed guidance for reviewers on how to handle the result review
process.
-The OPNFV Verified program (OVP) provides the ability for users to upload test results in
+The OPNFV Verification Program (OVP) provides the ability for users to upload test results in
the `OVP portal <https://nfvi-verified.lfnetworking.org>`_ and request a review from the OVP community.
-After the user submit for review the test results **Status** is changed from 'private' to 'review'
-(as shown in figure 2).
The OVP administrator will ask for review volunteers using the ovp-support@lfnetworking.org email alias.
The incoming results for review will be identified by the administrator with particular **Test ID**
and **Owner** values.
Volunteers who accept the review request can access the test results by logging in to the
-`OVP portal <https://nfvi-verified.lfnetworking.org>`_ and the click on the **My Results** tab in top-level
-navigation bar.
+`OVP portal <https://nfvi-verified.lfnetworking.org>`_ and then clicking on the **Incoming Reviews**
+tab in the top-level navigation bar.
.. image:: images/ovp_top_nav.png
:align: center
:scale: 100%
-Figure 1
-The corresponding OVP portal result will have a status of 'review'.
+After the user submits the test results for review, their **Status** changes from 'private' to 'review',
+so the corresponding OVP portal result will show a status of 'review'.
+The **Application** information is also listed here for review; all of it is submitted by users at the
+same time they submit their results for review. Reviewers can also see who has already approved or
+not approved the test results by clicking on **View Reviews**.
.. image:: images/ovp_result_review.png
:align: center
:scale: 100%
-Figure 2
Reviewers must follow the checklist below to ensure review consistency for the OPNFV
-Verified Program (OVP) 2018.09 (Fraser) release at a minimum.
+Verification Program (OVP) 2019.12 (Hunter) release at a minimum.
-#. **Mandatory Test Area Results** - Validate that results for all mandatory test areas are present.
-#. **Test-Case Pass Percentage** - Ensure all tests have passed (100% pass rate).
-#. **Log File Verification** - Inspect the log file for each test area.
+#. **Test Case Pass Percentage** - Ensure all mandatory tests have passed (100% pass rate).
+#. **Mandatory Test Case Results** - Validate that results for all mandatory test cases are present.
+#. **Log File Verification** - Inspect the log file for each test case.
#. **SUT Info Verification** - Validate the system under test (SUT) hardware and software endpoint info is present.
+Test Case Pass Percentage
+=========================
-1. Mandatory Test Area Results
-==============================
+All mandatory test cases have to run successfully. The **Test Run Results** figure below is one way
+to verify this; in the example shown, only 96.71% of the mandatory test cases have passed.
+This value must not be lower than 100%.
+
+.. image:: images/ovp_pass_percentage.png
+ :align: center
+ :width: 350 px
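+
+The check itself is simple arithmetic: the pass rate is the number of passed mandatory test cases
+divided by the total number of mandatory test cases. A minimal sketch, using purely hypothetical
+counts, could look like::
+
+    # Sketch of the 100% pass-rate check; the counts below are hypothetical.
+    def pass_percentage(passed, total):
+        """Return the pass rate over the mandatory test cases as a percentage."""
+        return 100.0 * passed / total
+
+    rate = pass_percentage(passed=294, total=304)   # hypothetical counts
+    print(f"pass rate: {rate:.2f}%")
+    print("OK to approve" if rate == 100.0 else "must not be approved")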
-Test results can be displayed by clicking on the hyperlink under the 'Test ID' column.
-User should validate that results for all mandatory test areas are included in the overall test suite. The required
-mandatory test cases are:
-- functest.vping.userdata
-- functest.vping.ssh
+Mandatory Test Case Results
+===========================
+
+Test results can be displayed by clicking on the hyperlink under the **Test ID** column.
+Reviewers should validate that results for all mandatory test cases are included in the overall
+test suite (one way to script this check is sketched below). The required mandatory test cases are:
+
- bottlenecks.stress.ping
-- functest.tempest.osinterop
+- functest.security.patrole
- functest.tempest.compute
- functest.tempest.identity_v3
- functest.tempest.image
+- functest.tempest.ipv6_api
- functest.tempest.network_api
-- functest.tempest.volume
- functest.tempest.neutron_trunk_ports
-- functest.tempest.ipv6_api
-- functest.security.patrole
-- yardstick.ha.nova_api
-- yardstick.ha.neutron_server
-- yardstick.ha.keystone
-- yardstick.ha.glance_api
+- functest.tempest.osinterop
+- functest.tempest.volume
+- functest.vping.ssh
+- functest.vping.userdata
- yardstick.ha.cinder_api
- yardstick.ha.cpu_load
+- yardstick.ha.database
- yardstick.ha.disk_load
+- yardstick.ha.glance_api
- yardstick.ha.haproxy
+- yardstick.ha.keystone
+- yardstick.ha.neutron_server
+- yardstick.ha.nova_api
- yardstick.ha.rabbitmq
-- yardstick.ha.database
*Note that the 'Test ID' column in this view condenses the UUID used for 'Test ID' to
eight characters even though the 'Test ID' is a longer UUID in the back-end.*
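+
+One way to script the presence check is to compare this list against the test case names in the
+result record. The sketch below assumes the results have been exported to a hypothetical
+'results.json' file containing a list of records with a 'name' field::
+
+    import json
+
+    MANDATORY = {
+        "bottlenecks.stress.ping", "functest.security.patrole",
+        "functest.tempest.compute", "functest.tempest.identity_v3",
+        "functest.tempest.image", "functest.tempest.ipv6_api",
+        "functest.tempest.network_api", "functest.tempest.neutron_trunk_ports",
+        "functest.tempest.osinterop", "functest.tempest.volume",
+        "functest.vping.ssh", "functest.vping.userdata",
+        "yardstick.ha.cinder_api", "yardstick.ha.cpu_load", "yardstick.ha.database",
+        "yardstick.ha.disk_load", "yardstick.ha.glance_api", "yardstick.ha.haproxy",
+        "yardstick.ha.keystone", "yardstick.ha.neutron_server", "yardstick.ha.nova_api",
+        "yardstick.ha.rabbitmq",
+    }
+
+    with open("results.json") as f:                  # hypothetical export of the result set
+        reported = {record["name"] for record in json.load(f)}
+
+    missing = MANDATORY - reported
+    print("all mandatory test cases present" if not missing
+          else f"missing: {sorted(missing)}")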
-.. image:: images/ovp_result_overview.png
- :align: center
- :scale: 100%
-
-Figure 3
-
-2. Test-Case Pass Percentage
-============================
+Failed test cases can be easily identified by the color of the pass/total number:
-All mandatory test-cases have to run successfully. The below diagram of the 'Test Run Results' is one method and
-shows that 98.15% of the mandatory test-cases have passed.
-This value must not be lower than 100%.
-
-.. image:: images/ovp_pass_percentage.png
- :align: center
- :width: 350 px
-
-Figure 4
-
-Failed test cases can also be easy identified by the color of pass/total number. :
-
-- Green when all test-cases pass
-- Orange when at least one fails
-- Red when all test-cases fail
+- **Green** when all test cases pass
+- **Orange** when at least one test case fails or is skipped
+- **Red** when all test cases fail or are skipped
.. image:: images/ovp_pass_fraction.png
:align: center
:width: 350 px
-Figure 5
-3. Log File Verification
-========================
+Log File Verification
+=====================
Each log file of the mandatory test cases has to be verified for content.
Log files can be displayed by clicking on the setup icon to the right of the results,
-as shown in figure below.
+as shown in the figure below.
.. image:: images/ovp_log_setup.png
:align: center
:scale: 100%
-Figure 6
*Note, all log files can be found in the results/ directory, as shown in the following table.*
@@ -148,37 +138,46 @@ Figure 6
+------------------------+--------------------------+
-The bottlenecks log must contain the 'SUCCESS' result as shown in following example:
+Bottlenecks Logs
+----------------
- 2018-08-22 14:11:21,815 [INFO] yardstick.benchmark.core.task task.py:127 Testcase: "ping_bottlenecks" **SUCCESS**!!!
+It must contain the 'SUCCESS' result at the end of the Bottlenecks log, as shown in the following example:
-Functest logs opens an html page that lists all test cases as shown in figure 7. All test cases must have run
-successfuly.
+ 2019-12-03 07:35:14,630 [INFO] yardstick.benchmark.core.task task.py:129 Testcase: "ping_bottlenecks" SUCCESS!!!
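+
+A quick way to confirm this is to look at the last test case line of the log. The sketch below
+assumes the log has been downloaded locally to a hypothetical 'results/bottlenecks.log' path::
+
+    # Check that the final test case line of the Bottlenecks log reports SUCCESS.
+    with open("results/bottlenecks.log") as f:       # hypothetical local path
+        verdict_lines = [line for line in f if "Testcase:" in line]
+
+    last = verdict_lines[-1] if verdict_lines else ""
+    print("bottlenecks: SUCCESS" if "SUCCESS" in last
+          else "bottlenecks: inspect the log manually")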
-.. image:: images/ovp_log_files_functest_image.png
- :align: center
- :scale: 100%
-Figure 7
+Functest Logs
+-------------
-For the vping test area log file (functest.log). The two entries displayed in the tables below must be present in
-this log file.
+There are two different types of Functest logs: plain text for the **vping** test cases and an HTML
+page for the **tempest** and **security** test cases.
-**functest.vping_userdata**
+For the **vping** test cases, the two entries displayed in the tables below must be present in the log files.
+
+**functest.vping.ssh**
.. image:: images/ovp_vping_ssh.png
:align: center
:scale: 100%
-Figure 8
-**functest.vping_ssh**
+**functest.vping.userdata**
.. image:: images/ovp_vping_user.png
:align: center
:scale: 100%
-Figure 9
+
+For the **tempest** and **security** test cases, the log link opens an HTML page that lists all test
+cases, as shown below. All test cases must have run successfully.
+
+.. image:: images/ovp_log_files_functest_image.png
+ :align: center
+ :scale: 100%
+
+
+Yardstick Logs
+--------------
The yardstick log must contain the 'SUCCESS' result for each of the test cases within this
test area. This can be verified by searching the log for the keyword 'SUCCESS'.
@@ -190,29 +189,39 @@ An example of a FAILED and a SUCCESS test case are listed below:
2018-08-28 10:23:41,907 [INFO] yardstick.benchmark.core.task task.py:127 Testcase: "opnfv_yardstick_tc052" **SUCCESS**!!!
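+
+A minimal sketch of such a search, assuming the log has been downloaded locally to a hypothetical
+'results/yardstick.log' path, is shown below::
+
+    import re
+
+    # Collect the verdict reported on every 'Testcase:' line of the Yardstick log.
+    pattern = re.compile(r'Testcase: "(?P<name>[^"]+)"\s+(?P<verdict>\w+)')
+    verdicts = {}
+    with open("results/yardstick.log") as f:         # hypothetical local path
+        for line in f:
+            match = pattern.search(line)
+            if match:
+                verdicts[match.group("name")] = match.group("verdict")
+
+    not_passed = sorted(name for name, verdict in verdicts.items() if verdict != "SUCCESS")
+    print("all test cases report SUCCESS" if not not_passed
+          else f"check these test cases: {not_passed}")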
-4. SUT Info Verification
-========================
+SUT Info Verification
+=====================
SUT information must be present in the results to validate that all required endpoint services
and at least two controllers were present during test execution. For the results shown below,
-click the '**info**' hyperlink in the **SUT** column to navigate to the SUT information page.
+click the **info** hyperlink in the **SUT** column to navigate to the SUT information page.
.. image:: images/sut_info.png
:align: center
:scale: 100%
-Figure 10
-In the '**Endpoints**' listing shown below for the SUT VIM component, ensure that services are
+In the **Endpoints** listing shown below for the SUT VIM component, ensure that services are
present for identity, compute, image, volume and network at a minimum by inspecting the
-'**Service Type**' column.
+**Service Type** column.
.. image:: images/sut_endpoints.png
:align: center
:scale: 100%
-Figure 11
-Inspect the '**Hosts**' listing found below the Endpoints secion of the SUT info page and ensure
+Inspect the **Hosts** listing found below the Endpoints section of the SUT info page and ensure
at least two hosts are present, as two controllers are required for the mandatory HA
-test-cases.
+test cases.
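+
+For larger result sets these two checks can also be scripted. The sketch below assumes the SUT
+information has been exported to a hypothetical 'sut_info.json' file with 'endpoints' and
+'hosts' lists::
+
+    import json
+
+    REQUIRED_SERVICE_TYPES = {"identity", "compute", "image", "volume", "network"}
+
+    with open("sut_info.json") as f:                 # hypothetical export of the SUT info page
+        sut = json.load(f)
+
+    service_types = {endpoint["service_type"] for endpoint in sut["endpoints"]}
+    missing = REQUIRED_SERVICE_TYPES - service_types
+
+    print("missing endpoint services:", sorted(missing) if missing else "none")
+    print("hosts found:", len(sut["hosts"]), "(at least two are required)")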
+
+
+Approve or Not Approve Results
+==============================
+
+To approve or not approve the test, click **Operation** and choose **approve** or **not approve**.
+Once you have made your choice, you can click **View Reviews** to see the review status, as shown
+below.
+
+.. image:: images/review_status.png
+ :align: center
+ :scale: 100%