Diffstat (limited to 'docs/testing/user/reviewerguide/index.rst')
-rw-r--r--  docs/testing/user/reviewerguide/index.rst  271
1 file changed, 146 insertions(+), 125 deletions(-)
diff --git a/docs/testing/user/reviewerguide/index.rst b/docs/testing/user/reviewerguide/index.rst
index bd615f19..f08ae784 100644
--- a/docs/testing/user/reviewerguide/index.rst
+++ b/docs/testing/user/reviewerguide/index.rst
@@ -2,9 +2,9 @@
.. http://creativecommons.org/licenses/by/4.0
.. (c) Ericsson AB
-=============================================
-OPNFV Verified Program 2018.01 Reviewer Guide
-=============================================
+==================
+OVP Reviewer Guide
+==================
.. toctree::
:maxdepth: 2
@@ -13,194 +13,215 @@ OPNFV Verified Program 2018.01 Reviewer Guide
Introduction
============
-This reviewer guide provides detailed guidance for reviewers on how to handle the result review
-process. Reviewers must follow the checklist below to ensure review consistency for the OPNFV
-Verified Program (OVP) 2018.01 (Danube) release at a minimum.
-
-#. **Mandatory Test Area Results** - Validate that results for all mandatory test areas are present.
-#. **Test-Case Count within Mandatory Test Area** - Check that the total number of test-cases are present in each mandatory test area.
-#. **Test-Case Pass Percentage** - Ensure all tests have passed (100% pass rate).
-#. **Log File Verification** - Inspect the log file for each test area (osinterop, ha, vping).
-#. **SUT Info Verification** - Validate the system under test (SUT) hardware and software endpoint info is present.
+This document provides detailed guidance for reviewers on how to handle the result review
+process.
+The OPNFV Verification Program (OVP) provides the ability for users to upload test results to the
+`OVP portal <https://nfvi-verified.lfnetworking.org>`_ and request a review from the OVP community.
-1. Mandatory Test Area Results
-==============================
+The OVP administrator will ask for review volunteers using the ovp-support@lfnetworking.org
+email alias. The administrator will identify the incoming results for review by their
+**Test ID** and **Owner** values.
-Validate that results for all mandatory test areas are included in the overall test suite. The
-required mandatory test areas are:
- - **osinterop**
- - **vping**
- - **ha**
+Volunteers who accept the review request can access the test results by logging in to the
+`OVP portal <https://nfvi-verified.lfnetworking.org>`_ and then clicking on the **Incoming Reviews**
+tab in the top-level navigation bar.
-Login to the OVP portal at:
+.. image:: images/ovp_top_nav.png
+ :align: center
+ :scale: 100%
-*https://verified.opnfv.org*
-Click on the 'My Results' tab in top-level navigation bar.
+After the user submits the test results for review, their **Status** changes from 'private' to
+'review', so the corresponding OVP portal result can be identified by its 'review' status.
+The **Application** information listed here is also part of the review; users submit it at the
+same time as they submit their results for review. Reviewers can also find who has already
+approved or not approved the test results by clicking on **View Reviews**.
-.. image:: danube/images/ovp_top_nav.png
+.. image:: images/ovp_result_review.png
:align: center
:scale: 100%
-The OVP administrator will ask for review volunteers using the verified@opnfv.org email alias. The
-incoming results for review will be identified by the administrator with particular 'Test ID'
-and 'Owner' values. The corresponding OVP portal result will have a status of 'review'.
-.. image:: danube/images/ovp_result_review.png
+Reviewers must, at a minimum, follow the checklist below to ensure review consistency for
+the OPNFV Verification Program (OVP) 2019.12 (Hunter) release.
+
+#. **Test Case Pass Percentage** - Ensure all mandatory tests have passed (100% pass rate).
+#. **Mandatory Test Case Results** - Validate that results for all mandatory test cases are present.
+#. **Log File Verification** - Inspect the log file for each test case.
+#. **SUT Info Verification** - Validate the system under test (SUT) hardware and software endpoint info is present.
+
+
+Test Case Pass Percentage
+=========================
+
+All mandatory test cases have to run successfully. The **Test Run Results** figure below is
+one way to check this; in the example shown, only 96.71% of the mandatory test cases have
+passed. This value must not be lower than 100%.
+
+.. image:: images/ovp_pass_percentage.png
:align: center
- :scale: 100%
+ :width: 350 px
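+
+The percentage is simple arithmetic over the pass/total counts. A minimal sketch, assuming
+hypothetical counts read off the **Test Run Results** pane:
+
+.. code-block:: python
+
+    # Hypothetical counts; 294 of 304 mandatory test cases passing yields
+    # the 96.71% shown in the figure above.
+    passed, total = 294, 304
+
+    pass_percentage = 100.0 * passed / total
+    print(f"{pass_percentage:.2f}%")  # 96.71% -- below the required 100%
+
+    # Results may only be approved when every mandatory test case passed.
+    if passed < total:
+        print("Reject: mandatory pass rate is below 100%")
+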
-In the example above, this information will be provided as:
-- Test ID: a00c47e8
-- Owner: jtaylor
-Click on the hyperlink within the 'Test ID' column.
+Mandatory Test Case Results
+===========================
+
+Test results can be displayed by clicking on the hyperlink under the **Test ID** column.
+Reviewers should validate that results for all mandatory test cases are included in the overall
+test suite. The required mandatory test cases are:
+
+- bottlenecks.stress.ping
+- functest.security.patrole
+- functest.tempest.compute
+- functest.tempest.identity_v3
+- functest.tempest.image
+- functest.tempest.ipv6_api
+- functest.tempest.network_api
+- functest.tempest.neutron_trunk_ports
+- functest.tempest.osinterop
+- functest.tempest.volume
+- functest.vping.ssh
+- functest.vping.userdata
+- yardstick.ha.cinder_api
+- yardstick.ha.cpu_load
+- yardstick.ha.database
+- yardstick.ha.disk_load
+- yardstick.ha.glance_api
+- yardstick.ha.haproxy
+- yardstick.ha.keystone
+- yardstick.ha.neutron_server
+- yardstick.ha.nova_api
+- yardstick.ha.rabbitmq
*Note, that the 'Test ID' column in this view condenses the UUID used for 'Test ID' to
eight characters even though the 'Test ID' is a longer UUID in the back-end.*
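+
+As a cross-check, the names shown in the result overview can be compared against the list
+above. A minimal sketch, assuming the displayed names have been copied by hand into the
+``reported`` set (an illustrative variable, not part of the portal):
+
+.. code-block:: python
+
+    # Mandatory test cases for OVP 2019.12 (Hunter), as listed above.
+    MANDATORY = {
+        "bottlenecks.stress.ping",
+        "functest.security.patrole",
+        "functest.tempest.compute",
+        "functest.tempest.identity_v3",
+        "functest.tempest.image",
+        "functest.tempest.ipv6_api",
+        "functest.tempest.network_api",
+        "functest.tempest.neutron_trunk_ports",
+        "functest.tempest.osinterop",
+        "functest.tempest.volume",
+        "functest.vping.ssh",
+        "functest.vping.userdata",
+        "yardstick.ha.cinder_api",
+        "yardstick.ha.cpu_load",
+        "yardstick.ha.database",
+        "yardstick.ha.disk_load",
+        "yardstick.ha.glance_api",
+        "yardstick.ha.haproxy",
+        "yardstick.ha.keystone",
+        "yardstick.ha.neutron_server",
+        "yardstick.ha.nova_api",
+        "yardstick.ha.rabbitmq",
+    }
+
+    # Filled by hand from the test suite overview in the portal.
+    reported = set()
+
+    missing = MANDATORY - reported
+    if missing:
+        print("Reject: missing mandatory test cases:")
+        for name in sorted(missing):
+            print("  -", name)
+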
-.. image:: danube/images/ovp_result_overview.png
- :align: center
- :scale: 100%
+Failed test cases can be easily identified by the color of the pass/total number:
-The 'Test ID' hyperlink toggles the view to a top-level listing of the results displayed above.
-Validate that osinterop, vping and ha test area results are all present within the view.
+- **Green** when all test cases pass
+- **Orange** when at least one fails/skips
+- **Red** when all test cases fail/skip
+.. image:: images/ovp_pass_fraction.png
+ :align: center
+ :width: 350 px
-2. Test-Case Count within Mandatory Test Area
-=============================================
-Validate the test-case count within each test area. For the OVP 2018.01 release, this must break
-down as outlined in the table below.
+Log File Verification
+=====================
-.. image:: danube/images/ovp_test_count.png
- :align: center
- :scale: 100%
+Each log file of the mandatory test cases has to be verified for content.
-In the diagram above (from section 1), these counts can be gleaned from the numbers to the
-right of the test-cases. The total number is given for the osinterop (dovetail.osinterop.tc001)
-test area at 205. The vping (dovetail.vping.tc00x) and ha (dovetail.ha.tc00x) test-cases are
-broken down separately with a line for each test-case. Directly above the 'Test Result Overview'
-listing there's a summary labelled 'Test Run Results' shown below. For OVP 2018.01, a mandatory
-total of **215** test-cases must be present (205 osinterop + 8 ha + 2 vping).
+Log files can be displayed by clicking on the setup icon to the right of the results,
+as shown in the figure below.
-.. image:: danube/images/ovp_missing_ha.png
+.. image:: images/ovp_log_setup.png
:align: center
:scale: 100%
-An example of a listing that should flag a negative review is shown above. The mandatory ha test
-area is missing one test case (dovetail.ha.tc008).
-3. Test-Case Pass Percentage
-============================
+*Note, all log files can be found in the results/ directory as shown in the following table.*
-All mandatory test-cases must pass. This can be validated in multiple ways. The below diagram of
-the 'Test Run Results' is one method and shows that 100% of the mandatory test-cases have passed.
-This value must not be lower than 100%.
++------------------------+--------------------------+
+| **Mandatory Test Case**| **Location**             |
++------------------------+--------------------------+
+| bottlenecks            | results/stress_logs/     |
++------------------------+--------------------------+
+| functest.vping         | results/vping_logs/      |
++------------------------+--------------------------+
+| functest.tempest       | results/tempest_logs/    |
++------------------------+--------------------------+
+| functest.security      | results/security_logs/   |
++------------------------+--------------------------+
+| yardstick              | results/ha_logs/         |
++------------------------+--------------------------+
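+
+When working from a downloaded result archive instead of the portal view, a short sketch
+like the one below can confirm that each location in the table exists and actually contains
+log files (paths are taken from the table and are relative to the unpacked archive):
+
+.. code-block:: python
+
+    from pathlib import Path
+
+    # Log locations from the table above, relative to the unpacked archive.
+    LOG_DIRS = {
+        "bottlenecks": "results/stress_logs/",
+        "functest.vping": "results/vping_logs/",
+        "functest.tempest": "results/tempest_logs/",
+        "functest.security": "results/security_logs/",
+        "yardstick": "results/ha_logs/",
+    }
+
+    for area, rel in LOG_DIRS.items():
+        path = Path(rel)
+        if not path.is_dir() or not any(path.iterdir()):
+            print(f"Reject: no logs found for {area} under {rel}")
+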
-.. image:: danube/images/ovp_pass_percentage.png
- :align: center
- :width: 350 px
-Another method to check that all mandatory test-cases have passed is shown in the diagram below.
-The pass/total is given as a fraction and highlighted here in yellow. For the osinterop test area,
-the result must display [205/205] and for each of the test-cases under the vping and ha test areas
-[1/1] must be displayed.
+Bottlenecks Logs
+----------------
-.. image:: danube/images/ovp_pass_fraction.png
- :align: center
- :width: 270 px
+The Bottlenecks log must contain the 'SUCCESS' result at its end, as shown in the following example:
-4. Log File Verification
-========================
+ 2019-12-03 07:35:14,630 [INFO] yardstick.benchmark.core.task task.py:129 Testcase: "ping_bottlenecks" SUCCESS!!!
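+
+That check can be scripted; a minimal sketch, assuming the log has been saved locally (the
+file name is illustrative, only the results/stress_logs/ location comes from the table above):
+
+.. code-block:: python
+
+    from pathlib import Path
+
+    # Illustrative file name; use the actual log from results/stress_logs/.
+    log = Path("results/stress_logs/bottlenecks.log")
+
+    # The SUCCESS marker must appear in the closing lines of the log.
+    tail = log.read_text().splitlines()[-20:]
+    if not any('"ping_bottlenecks" SUCCESS' in line for line in tail):
+        print("Reject: no SUCCESS marker at the end of the Bottlenecks log")
+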
-Three log files must be verified for content within each mandatory test area. The log files for
-each of the test areas is noted in the table below.
-.. image:: danube/images/ovp_log_files.png
- :align: center
- :scale: 100%
+Functest Logs
+-------------
+
+There are two different types of Functest logs: plain text for the **vping** test cases and
+an HTML file for the **tempest** and **security** test cases.
-The three log files can be displayed by clicking on the setup icon to the right of the results,
-as shown in the diagram below.
+For the **vping** test cases, the two entries displayed in the figures below must be present in the log files.
-*Note, while the vping and ha test areas list multiple test-cases in the below diagram, there is
-a single log file for all test-cases within these test areas.*
+**functest.vping.ssh**
-.. image:: danube/images/ovp_log_setup.png
+.. image:: images/ovp_vping_ssh.png
:align: center
:scale: 100%
-Within the osinterop log (dovetail.osinterop.tc001.log), scroll down to the area of the log that
-begins to list the results of each test-case executed. This can be located by looking for lines
-prefaced with '**tempest.api**' and ending with '**... ok**'.
-.. image:: danube/images/ovp_log_test_count.png
+**functest.vping.userdata**
+
+.. image:: images/ovp_vping_user.png
:align: center
:scale: 100%
-The number of lines within the osinterop log for test-cases must add up according to the table
-above, where test-cases are broken down according to compute, identity, image, network and volume,
-with respective counts given in the table. The ha log (yardstick.log) must contain the 'PASS'
-result for each of the eight test-cases within this test area. This can be verified by searching
-the log for the keyword 'PASS'.
+For the **tempest** and **security** test cases, the log opens as an HTML page that lists all
+test cases, as shown below. All test cases must have run successfully.
+.. image:: images/ovp_log_files_functest_image.png
+ :align: center
+ :scale: 100%
-The eight lines to validate are listed below:
- - 017-10-16 05:07:49,158 yardstick.benchmark.scenarios.availability.serviceha serviceha.py:81
- INFO The HA test case PASS the SLA
- - 2017-10-16 05:08:31,387 yardstick.benchmark.scenarios.availability.serviceha serviceha.py:81
- INFO The HA test case PASS the SLA
- - 2017-10-16 05:09:13,669 yardstick.benchmark.scenarios.availability.serviceha serviceha.py:81
- INFO The HA test case PASS the SLA
- - 2017-10-16 05:09:55,967 yardstick.benchmark.scenarios.availability.serviceha serviceha.py:81
- INFO The HA test case PASS the SLA
- - 2017-10-16 05:10:38,407 yardstick.benchmark.scenarios.availability.serviceha serviceha.py:81
- INFO The HA test case PASS the SLA
- - 2017-10-16 05:11:00,030 yardstick.benchmark.scenarios.availability.scenario_general
- scenario_general.py:71 INFO [92m Congratulations, the HA test case PASS! [0m
- - 2017-10-16 05:11:22,536 yardstick.benchmark.scenarios.availability.scenario_general
- scenario_general.py:71 INFO [92m Congratulations, the HA test case PASS! [0m
- - 2017-10-16 05:12:07,880 yardstick.benchmark.scenarios.availability.scenario_general
- scenario_general.py:71 INFO [92m Congratulations, the HA test case PASS! [0m
+Yardstick Logs
+--------------
+The yardstick log must contain the 'SUCCESS' result for each of the test cases within this
+test area. This can be verified by searching the log for the keyword 'SUCCESS'.
-The final validation is for the vping test area log file (functest.log). The two entries
-displayed in the diagrams below must be present in this log file.
+Examples of a FAILED and a SUCCESS test case are listed below:
- - vping_userdata
- - vping_ssh
+ 2018-08-28 10:25:09,946 [ERROR] yardstick.benchmark.scenarios.availability.monitor.monitor_multi monitor_multi.py:78 SLA **failure**: 14.015082 > 5.000000
-.. image:: danube/images/ovp_vping_user.png
- :align: center
- :scale: 100%
+ 2018-08-28 10:23:41,907 [INFO] yardstick.benchmark.core.task task.py:127 Testcase: "opnfv_yardstick_tc052" **SUCCESS**!!!
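+
+That search can also be scripted; a minimal sketch, assuming the log has been saved locally
+(the file name is illustrative, only the results/ha_logs/ location comes from the table above):
+
+.. code-block:: python
+
+    from pathlib import Path
+
+    # Illustrative file name; use the actual log from results/ha_logs/.
+    lines = Path("results/ha_logs/yardstick.log").read_text().splitlines()
+
+    # Every test case must report SUCCESS; any SLA failure means rejection.
+    successes = [line for line in lines if "SUCCESS" in line]
+    failures = [line for line in lines if "SLA" in line and "failure" in line]
+    print(f"{len(successes)} SUCCESS lines, {len(failures)} SLA failure lines")
+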
-.. image:: danube/images/ovp_vping_ssh.png
- :align: center
- :scale: 100%
-5. SUT Info Verification
-========================
+SUT Info Verification
+=====================
SUT information must be present in the results to validate that all required endpoint services
and at least two controllers were present during test execution. For the results shown below,
-click the '**info**' hyperlink in the **SUT** column to navigate to the SUT information page.
+click the **info** hyperlink in the **SUT** column to navigate to the SUT information page.
-.. image:: danube/images/sut_info.png
+.. image:: images/sut_info.png
:align: center
:scale: 100%
-In the '**Endpoints**' listing shown below for the SUT VIM component, ensure that services are
+
+In the **Endpoints** listing shown below for the SUT VIM component, ensure that services are
present for identity, compute, image, volume and network at a minimum by inspecting the
-'**Service Type**' column.
+**Service Type** column.
-.. image:: danube/images/sut_endpoints.png
+.. image:: images/sut_endpoints.png
:align: center
:scale: 100%
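+
+A minimal sketch of this minimum-services check, assuming the **Service Type** column values
+have been copied by hand into a set (the sample values are hypothetical):
+
+.. code-block:: python
+
+    # Minimum service types that must appear in the Endpoints listing.
+    REQUIRED = {"identity", "compute", "image", "volume", "network"}
+
+    # Hypothetical values copied from the Service Type column.
+    service_types = {"identity", "compute", "image", "network", "placement"}
+
+    missing = REQUIRED - service_types
+    if missing:
+        print("Reject: missing endpoint services:", ", ".join(sorted(missing)))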
-Inspect the '**Hosts**' listing found below the Endpoints secion of the SUT info page and ensure
+
+Inspect the **Hosts** listing found below the Endpoints section of the SUT info page and ensure
at least two hosts are present, as two controllers are required for the mandatory HA
-test-cases.
+test cases.
+
+
+Approve or Not Approve Results
+==============================
+
+When you decide to approve or not approve a test, click **Operation** and choose **approve**
+or **not approve**. Once you have done so, you can click **View Reviews** to find the review
+status, as shown below.
+
+.. image:: images/review_status.png
+ :align: center
+ :scale: 100%