author    Morgan Richomme <morgan.richomme@orange.com>  2017-09-14 11:53:28 +0200
committer Morgan Richomme <morgan.richomme@orange.com>  2017-09-15 18:09:12 +0200
commit    99500b5ab19db0c4e9ef704b6d892e3e1a034aa6
tree      c6349c72fa182c544968ef8d75f20952699cdce0
parent    ea001dac25d4348859d9f16f5009f0bc38e21fd2
Testing group documentation update for Euphrates
Change-Id: Ic27cbc0d29c3c1e162814e5314a70b75eebd1714
Signed-off-by: Morgan Richomme <morgan.richomme@orange.com>
Diffstat (limited to 'docs/testing/ecosystem/overview.rst')
 -rw-r--r--  docs/testing/ecosystem/overview.rst | 222
 1 file changed, 156 insertions(+), 66 deletions(-)
diff --git a/docs/testing/ecosystem/overview.rst b/docs/testing/ecosystem/overview.rst
index ed1657c87..43ee7771d 100644
--- a/docs/testing/ecosystem/overview.rst
+++ b/docs/testing/ecosystem/overview.rst
@@ -1,21 +1,21 @@
.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. SPDX-License-Identifier: CC-BY-4.0
-=============
-OPNFV testing
-=============
+======================
+OPNFV Testing Overview
+======================
Introduction
============
-Testing is one of the key activities in OPNFV and includes unit, feature, component, system
-level testing for development, automated deployment, performance characterization or stress
-testing.
+Testing is one of the key activities in OPNFV and includes unit, feature,
+component and system level testing for development, automated deployment,
+performance characterization and stress testing.
Test projects are dedicated to provide frameworks, tooling and test-cases categorized as
functional, performance or compliance testing. Test projects fulfill different roles such as
verifying VIM functionality, benchmarking components and platforms or analysis of measured
-KPIs for the scenarios released in OPNFV.
+KPIs for OPNFV release scenarios.
Feature projects also provide their own test suites that either run independently or within a
test project.
@@ -24,13 +24,10 @@ This document details the OPNFV testing ecosystem, describes common test compone
by individual OPNFV projects and provides links to project specific documentation.
-OPNFV testing ecosystem
-=======================
-
-The testing projects
---------------------
+The OPNFV Testing Ecosystem
+===========================
-The OPNFV testing projects may be summarized as follows:
+The OPNFV testing projects are represented in the following diagram:
.. figure:: ../../images/OPNFV_testing_working_group.png
:align: center
@@ -92,13 +89,20 @@ The major testing projects are described in the table below:
| | pass/fail thresholds for test, staging, and production |
| | NFVI environments. |
+----------------+---------------------------------------------------------+
-| VSperf | This project provides a framework for automation of NFV |
-| | data-plane performance testing and benchmarking. The |
-| | NFVI fast-path includes switch technology and network |
-| | with physical and virtual interfaces. VSperf can be |
-| | used to evaluate the suitability of different Switch |
-| | implementations and features, quantify data-path |
-| | performance and optimize platform configurations. |
+| VSPERF | VSPERF is an OPNFV project that provides an automated |
+| | test-framework and comprehensive test suite based on |
+| | Industry Test Specifications for measuring NFVI |
+| | data-plane performance. The data-path includes switching|
+| | technologies with physical and virtual network |
+| | interfaces. The VSPERF architecture is switch and |
+| | traffic generator agnostic and test cases can be easily |
+| | customized. Software versions and configurations |
+| | including the vSwitch (OVS or VPP) as well as the |
+| | network topology are controlled by VSPERF (independent |
+| | of OpenStack). VSPERF is used as a development tool for |
+| | optimizing switching technologies, qualification of |
+| | packet processing components and for pre-deployment |
+| | evaluation of the NFV platform data-path. |
+----------------+---------------------------------------------------------+
| Yardstick | The goal of the Project is to verify the infrastructure |
| | compliance when running VNF applications. NFV Use Cases |
@@ -112,16 +116,27 @@ The major testing projects are described in the table below:
+----------------+---------------------------------------------------------+
-===================================
-The testing working group resources
-===================================
+===============================
+Testing Working Group Resources
+===============================
-The assets
-==========
+Test Results Collection Framework
+=================================
-Overall Architecture
---------------------
-The Test result management can be summarized as follows::
+Any test project that runs in the global OPNFV lab infrastructure and is
+integrated with OPNFV CI can push test results to the community Test Database
+using a common Test API. This database can be used to track the evolution of
+testing and to analyse test runs, comparing results across installers,
+scenarios and technically and geographically diverse hardware environments.
+
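+A minimal sketch of how such results could be retrieved for offline analysis
+(the ``project``/``case`` query parameters and the result fields used below
+are illustrative and may differ between releases):
+
+.. code-block:: python
+
+    import requests
+
+    # Example only: fetch recent results for one project/test case and
+    # count passing runs per installer and scenario.
+    url = "http://testresults.opnfv.org/test/api/v1/results"
+    params = {"project": "functest", "case": "vping_ssh", "period": 30}
+    results = requests.get(url, params=params).json().get("results", [])
+
+    per_target = {}
+    for res in results:
+        key = (res.get("installer"), res.get("scenario"))
+        per_target.setdefault(key, []).append(res.get("criteria"))
+
+    for (installer, scenario), verdicts in per_target.items():
+        print("%s/%s: %d of %d passed"
+              % (installer, scenario, verdicts.count("PASS"), len(verdicts)))
+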
+Results from the database are used to generate a dashboard with the current test
+status for each testing project. Please note that you can also deploy the Test
+Database and Test API locally in your own environment.
+
+Overall Test Architecture
+-------------------------
+
+The management of test results can be summarized as follows::
+-------------+ +-------------+ +-------------+
| | | | | |
@@ -149,14 +164,14 @@ The Test result management can be summarized as follows::
| |
+----------------------+ +----------------------+
| | | |
- | Testing Dashboards | | Landing page |
+ | Testing Dashboards | | Test Landing page |
| | | |
+----------------------+ +----------------------+
-The testing databases
----------------------
-A Mongo DB Database has been introduced for the Brahmaputra release.
+The Test Database
+-----------------
+A MongoDB database was introduced for the Brahmaputra release.
The following collections are declared in this database:
* pods: the list of pods used for production CI
* projects: the list of projects providing test cases
@@ -164,21 +179,21 @@ The following collections are declared in this database:
* results: the results of the test cases
* scenarios: the OPNFV scenarios tested in CI
-This database can be used by any project through the testapi.
-Please note that projects may also use additional databases. This database is
-mainly use to colelct CI results and scenario trust indicators.
+This database can be used by any project through the Test API.
+Please note that projects may also use additional databases. The Test
+Database is mainly used to collect CI test results and generate scenario
+trust indicators. It is also cloned for OPNFV Plugfests in order to
+provide a private datastore only accessible to Plugfest participants.
-This database is also cloned for OPNFV Plugfest.
-
-The test API
-------------
+Test API description
+--------------------
The Test API is used to declare pods, projects, test cases and test results.
Pods correspond to the cluster of machines (3 controller and 2 compute nodes in
HA mode) used to run the tests and defined in Pharos project.
-The results pushed in the database are related to pods, projects and cases.
-If you try to push results of test done on non referenced pod, the API will
-return an error message.
+The results pushed into the database are related to pods, projects and test cases.
+Trying to push results generated from a non-referenced pod will cause the Test API
+to return an error message.
An additional method dashboard has been added to post-process the raw results in
the Brahmaputra release (deprecated in Colorado release).
@@ -192,53 +207,110 @@ The data model is very basic, 5 objects are available:
For detailed information, please go to http://artifacts.opnfv.org/releng/docs/testapi.html
+The code of the Test API is hosted in the releng repository `[TST2]`_.
+The static documentation of the Test API can be found at `[TST3]`_.
+The Test API has been dockerized and may be installed locally in your lab.
+
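+For a local installation, a sketch of the container setup could look as
+follows (the ``opnfv/testapi`` image name and the ``mongodb_url`` variable
+are given for illustration and may differ from the published image)::
+
+    docker run -d --name mongo -p 27017:27017 mongo
+    docker run -d --name testapi -p 8000:8000 \
+        -e mongodb_url=mongodb://172.17.0.1:27017/ opnfv/testapi
+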
+The deployment of the Test API has been automated.
+A Jenkins job manages:
+
+ * the unit tests of the Test API
+ * the creation of a new docker file
+ * the deployment of the new Test API
+ * the archive of the old Test API
+ * the backup of the Mongo DB
+
+Test API Authorization
+----------------------
+
+PUT/DELETE/POST operations of the Test API now require token-based authorization. The
+token needs to be added to the request using the 'X-Auth-Token' header in order to
+access the database.
+
+e.g.::
+
+    headers['X-Auth-Token']
+
+The value of the header, i.e. the token, can be accessed in the Jenkins environment
+variable *TestApiToken*. The token value is added as a masked password.
+
+.. code-block:: python
+
+ headers['X-Auth-Token'] = os.environ.get('TestApiToken')
+
+The above example is in Python. Token-based authentication has been added so
+that only CI pods running Jenkins jobs can access the database. Please note
+that token authorization is currently implemented but not yet enabled.
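+
+As an illustration, a CI job could push a result as follows (the payload
+field names and values are examples only and are not an authoritative
+description of the result model):
+
+.. code-block:: python
+
+    import os
+    import requests
+
+    # Illustrative result document; field names and values are examples only.
+    result = {
+        "pod_name": "intel-pod18",
+        "project_name": "functest",
+        "case_name": "vping_ssh",
+        "installer": "apex",
+        "scenario": "os-nosdn-nofeature-ha",
+        "criteria": "PASS",
+        "details": {"duration": 42},
+    }
+
+    # The token is read from the Jenkins environment variable described above.
+    headers = {"X-Auth-Token": os.environ.get("TestApiToken", "")}
+    response = requests.post(
+        "http://testresults.opnfv.org/test/api/v1/results",
+        json=result, headers=headers)
+    response.raise_for_status()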
-The reporting
--------------
+
+Test Project Reporting
+======================
The reporting page for the test projects is http://testresults.opnfv.org/reporting/
.. figure:: ../../images/reporting_page.png
:align: center
:alt: Testing group reporting page
-This page provides a reporting per OPNFV release and per testing project.
+This page provides reporting per OPNFV release and per testing project.
-.. figure:: ../../images/reporting_danube_page.png
+.. figure:: ../../images/reportingMaster.png
:align: center
- :alt: Testing group Danube reporting page
+ :alt: Testing group Euphrates reporting page
-An evolution of this page is planned.
-It was decided to unify the reporting by creating a landing page that should give
-the scenario status in one glance (it was previously consolidated manually
-on a wiki page).
+An evolution of the reporting page is planned to unify test reporting by creating
+a landing page that shows the scenario status at a glance (this information was
+previously consolidated manually on a wiki page). The landing page will be displayed
+per scenario and will show:
-The landing page (planned for Danube 2.0) will be displayed per scenario:
* the status of the deployment
- * the score of the test projectS
+ * the score from each test suite. There is no overall score; it is
+   determined by each test project.
* a trust indicator
-Additional filters (version, installer, test collection time window,... ) are
-included.
-The test case catalog
----------------------
-Until the Colorado release, each testing project was managing the list of its
-test cases. It was very hard to have a global view of the available test cases
-among the different test projects. A common view was possible through the API
+Test Case Catalog
+=================
+Until the Colorado release, each testing project managed the list of its
+test cases. This made it very hard to have a global view of the available test
+cases from the different test projects. A common view was possible through the API
but it was not very user friendly.
-In fact you may know all the cases per project calling:
+Test cases per project may be listed by calling:
http://testresults.opnfv.org/test/api/v1/projects/<project_name>/cases
with project_name: bottlenecks, functest, qtip, storperf, vsperf, yardstick
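+
+For instance, a short Python sketch (the ``testcases`` key in the response
+shown below is an assumption and may differ):
+
+.. code-block:: python
+
+    import requests
+
+    # Example: list the test cases declared by the functest project.
+    url = "http://testresults.opnfv.org/test/api/v1/projects/functest/cases"
+    for case in requests.get(url).json().get("testcases", []):
+        print(case.get("name"))
+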
-It was decided to build a web site providing a consistent view of the test cases
-per project and allow any scenario owner to build his/her custom list of tests
-(Danube 2.0).
+A test case catalog is now available `[TST4]`_. Roll over a project and click it
+to get the list of its test cases, then click on a case to get more details.
+
+.. figure:: ../../images/TestcaseCatalog.png
+ :align: center
+ :alt: Testing group testcase catalog
-Other resources
+Test Dashboards
===============
+The Test Dashboard is used to provide a consistent view of the results collected in CI.
+The results shown on the dashboard are post-processed from the Database, which only
+contains raw results.
+The dashboard can be used in addition to the reporting page (high-level view) to allow
+the creation of specific graphs according to what the test owner wants to show.
+
+In Brahmaputra, a basic dashboard was created in Functest.
+In Colorado, Yardstick used Grafana (time-based graphs) and ELK (complex
+graphs).
+Since Danube, the OPNFV testing community has adopted the ELK framework and
+uses Bitergia for creating highly flexible dashboards `[TST5]`_.
+
+.. figure:: ../../images/DashboardBitergia.png
+ :align: center
+ :alt: Testing group Bitergia dashboard
+
+
+OPNFV Test Group Information
+=============================
+
+For more information or to participate in the OPNFV test community please see the
+following:
+
wiki: https://wiki.opnfv.org/testing
mailing list: test-wg@lists.opnfv.org
@@ -249,8 +321,9 @@ weekly meeting (https://wiki.opnfv.org/display/meetings/TestPerf):
* Usual time: Every Thursday 15:00-16:00 UTC / 7:00-8:00 PST
* APAC time: 2nd Wednesday of the month 8:00-9:00 UTC
+
=======================
-Reference documentation
+Reference Documentation
=======================
+----------------+---------------------------------------------------------+
@@ -272,3 +345,20 @@ Reference documentation
+----------------+---------------------------------------------------------+
| Yardstick | https://wiki.opnfv.org/display/yardstick/Yardstick |
+----------------+---------------------------------------------------------+
+
+
+`[TST1]`_: OPNFV web site
+
+`[TST2]`_: Test utils in Releng
+
+`[TST3]`_: TestAPI autogenerated documentation
+
+`[TST4]`_: Testcase catalog
+
+`[TST5]`_: Testing group dashboard
+
+.. _`[TST1]`: http://www.opnfv.org
+.. _`[TST2]`: https://git.opnfv.org/functest/tree/releng/utils/tests
+.. _`[TST3]`: http://artifacts.opnfv.org/releng/docs/testapi.html
+.. _`[TST4]`: http://testresults.opnfv.org/testing/index.html#!/select/visual
+.. _`[TST5]`: https://opnfv.biterg.io:443/goto/283dba93ca18e95964f852c63af1d1ba