author    Stuart Mackie <wsmackie@juniper.net>  2017-08-29 05:21:36 -0700
committer Stuart Mackie <wsmackie@juniper.net>  2017-08-29 05:21:36 -0700
commit    fce102283bab73ed08c292fce03e39c52f4a1fe2 (patch)
tree      299e9f8e5daca49f74f207cbe6699295b9115876 /docs/testing
parent    711967ae9639095ce41500bb0e6f80c8b80fab95 (diff)
Added doc directories
Change-Id: I671d7c3ad4f4e5e476c98f53780d867dc94b3089
Signed-off-by: Stuart Mackie <wsmackie@juniper.net>
Diffstat (limited to 'docs/testing')
-rw-r--r--  docs/testing/developer/devguide/index.rst | 353
-rw-r--r--  docs/testing/ecosystem/index.rst          |  14
-rw-r--r--  docs/testing/ecosystem/overview.rst       | 274
-rw-r--r--  docs/testing/testing-dev.rst              |  51
-rw-r--r--  docs/testing/testing-user.rst             |  65
5 files changed, 757 insertions, 0 deletions
diff --git a/docs/testing/developer/devguide/index.rst b/docs/testing/developer/devguide/index.rst
new file mode 100644
index 0000000..8668859
--- /dev/null
+++ b/docs/testing/developer/devguide/index.rst
@@ -0,0 +1,353 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. SPDX-License-Identifier: CC-BY-4.0
+
+***********************
+Testing developer guide
+***********************
+
+.. toctree::
+ :numbered:
+ :maxdepth: 2
+
+
+============
+Introduction
+============
+
+The OPNFV testing ecosystem is broad.
+
+The goal of this guide is to provide guidelines for new developers
+involved in the test areas.
+
+For the description of the ecosystem, see `[1]`_.
+
+
+=================
+Developer journey
+=================
+
+Be involved in the testing group
+================================
+
+Best practices
+==============
+
+Unit tests
+----------
+
+Dockerization
+-------------
+
+API
+---
+
+CLI
+---
+
+Traffic generators
+------------------
+
+Towards a pseudo micro services approach
+----------------------------------------
+
+======================================
+Testing group configuration parameters
+======================================
+
+
+Testing categories
+==================
+
+The testing group has defined several categories, also known as tiers. These
+categories can be used to group test suites.
+
++----------------+-------------------------------------------------------------+
+| Healthcheck    | Simple and quick healthcheck test cases                     |
++----------------+-------------------------------------------------------------+
+| Smoke | Set of smoke test cases/suites to validate the release |
++----------------+-------------------------------------------------------------+
+| Features | Test cases that validate a specific feature on top of OPNFV.|
+| | Those come from Feature projects and need a bit of support |
+| | for integration |
++----------------+-------------------------------------------------------------+
+| Components | Tests on a specific component (e.g. OpenStack, OVS, DPDK,..)|
+| | It may extend smoke tests |
++----------------+-------------------------------------------------------------+
+| Performance | Performance qualification |
++----------------+-------------------------------------------------------------+
+| VNF            | Test cases related to deploying an open source VNF,         |
+|                | including an orchestrator                                   |
++----------------+-------------------------------------------------------------+
+| Stress | Stress and robustness tests |
++----------------+-------------------------------------------------------------+
+| In Service | In service testing |
++----------------+-------------------------------------------------------------+
+
+Testing domains
+===============
+
+The domains deal with the technical scope of the tests. They shall correspond
+to the domains defined for the certification program:
+
+ * compute
+ * network
+ * storage
+ * hypervisor
+ * container
+ * vim
+ * mano
+ * vnf
+ * ...
+
+Testing coverage
+================
+One of the goals of the testing working group is to identify poorly covered
+areas and to avoid testing overlap.
+Ideally, based on the declaration of the test cases through their tags, domain
+and tier fields, it shall be possible to create heuristic maps.
+
+
+==============================
+Testing group generic enablers
+==============================
+
+
+TestAPI framework
+=================
+
+The OPNFV testing group created a test collection database to collect
+the test results from CI:
+
+
+ http://testresults.opnfv.org/test/swagger/spec.html
+
+Any test project running on any lab integrated in CI can push its
+results to this database.
+The database can be used to see the evolution of the tests and to compare
+the results across installers, scenarios or labs.
+It is used to produce a dashboard with the current test status of the project.
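+
+As an illustration, results can be retrieved through simple HTTP queries.
+The sketch below uses the Python ``requests`` library; the query parameters
+shown (``project``, ``case``, ``period``) are assumptions to be checked
+against the TestAPI specification:
+
+.. code-block:: python
+
+   import requests
+
+   # Fetch recent results for one test case (parameter names are
+   # assumptions; see the TestAPI spec for the exact ones).
+   url = "http://testresults.opnfv.org/test/api/v1/results"
+   params = {"project": "functest", "case": "vping_ssh", "period": 10}
+   response = requests.get(url, params=params)
+   response.raise_for_status()
+   print(response.json())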
+
+
+Overall Architecture
+--------------------
+The Test result management can be summarized as follows::
+
+ +-------------+ +-------------+ +-------------+
+ | | | | | |
+ | Test | | Test | | Test |
+ | Project #1 | | Project #2 | | Project #N |
+ | | | | | |
+ +-------------+ +-------------+ +-------------+
+ | | |
+ V V V
+ +-----------------------------------------+
+ | |
+ | Test Rest API front end |
+ | |
+ +-----------------------------------------+
+ A |
+ | V
+ | +-------------------------+
+ | | |
+ | | Test Results DB |
+ | | Mongo DB |
+ | | |
+ | +-------------------------+
+ |
+ |
+ +----------------------+
+ | |
+ | test Dashboard |
+ | |
+ +----------------------+
+
+TestAPI description
+-------------------
+The TestAPI is used to declare pods, projects, test cases and test
+results. Pods are the sets of bare-metal or virtual servers and the
+networking equipment used to run the tests.
+
+The results pushed into the database are related to pods, projects and test
+cases. If you try to push results of tests run on a non-referenced pod, the
+API will return an error message.
+
+An additional method, dashboard, was added in the Brahmaputra release to
+post-process the raw results (deprecated in Colorado).
+
+The data model is very basic; 5 objects are created:
+
+ * Pods
+ * Projects
+ * Testcases
+ * Results
+ * Scenarios
+
+The code of the API is hosted in the releng repository `[6]`_.
+The static documentation of the API can be found at `[7]`_.
+The TestAPI has been dockerized and may be installed locally in your
+lab. See `[15]`_ for details.
+
+The deployment of the TestAPI has been automated.
+A Jenkins job manages:
+
+ * the unit tests of the TestAPI
+ * the creation of a new Docker file
+ * the deployment of the new TestAPI
+ * the archiving of the old TestAPI
+ * the backup of the Mongo DB
+
+TestAPI Authorization
+~~~~~~~~~~~~~~~~~~~~~
+
+PUT/DELETE/POST operations of the TestAPI now require token-based
+authorization. The token needs to be added to the request using an
+'X-Auth-Token' header for access to the database, e.g.::
+
+   headers['X-Auth-Token']
+
+The value of the header, i.e. the token, can be accessed via the Jenkins
+environment variable *TestApiToken*. The token value is added as a masked
+password.
+
+.. code-block:: python
+
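+   # Read the token from the masked Jenkins environment variable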
+ headers['X-Auth-Token'] = os.environ.get('TestApiToken')
+
+The above example is in Python. Token-based authentication has been added so
+that only the Jenkins jobs of the CI pods have access to the database.
+
+Please note that token authorization is currently implemented but not yet
+enabled.
+
+Reporting
+=========
+
+An automatic reporting page has been created in order to provide a
+consistent view of the scenarios.
+
+In this page, each scenario is evaluated according to test criteria.
+The code for the automatic reporting is available at `[8]`_.
+
+The results are collected from the centralized database every day and,
+per scenario, a score is calculated based on the results from the last
+10 days.
+
+Dashboard
+=========
+
+The dashboard is used to provide a consistent view of the results collected
+in CI. The results shown on the dashboard are post-processed from the
+database, which only contains raw results.
+
+It can be used in addition to the reporting page (high-level view) to allow
+the creation of specific graphs according to what the test owner wants to show.
+
+In Brahmaputra, a basic home-made dashboard was created in Functest.
+In Colorado, Yardstick adopted Grafana (time-based graphs) and ELK (complex
+graphs).
+Since Danube, the testing community has decided to adopt the ELK framework and
+to rely on Bitergia. It was not implemented for Danube but is planned for
+Euphrates.
+
+Bitergia already provides a dashboard for code and infrastructure.
+A new Test tab will be added. The dataset will be built by consuming
+the TestAPI.
+
+See `[3]`_ for details.
+
+
+=======
+How TOs
+=======
+
+Where can I find information on the different test projects?
+===========================================================
+
+
+How can I contribute to a test project?
+=======================================
+
+
+Where can I find hardware resources?
+====================================
+
+
+How do I integrate my tests in CI?
+==================================
+
+
+How to declare my tests in the Test Database?
+=============================================
+
+
+How to push your results into the Test Database?
+================================================
+
+The test database is used to collect test results. By default it is
+enabled only for CI tests from Production CI pods.
+
+Please note that it is possible to create your own local database.
+
+A dedicated database is, for instance, created for each plugfest.
+
+The architecture and the associated API are described in the previous chapter.
+If you want to push your results from CI, you just have to call the API
+at the end of your script.
+
+You can also reuse a Python function defined in functest_utils.py `[5]`_.
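+
+As an illustration, a minimal sketch of such a call using the Python
+``requests`` library; the payload fields shown are assumptions derived from
+the data model described above and should be checked against the TestAPI
+specification:
+
+.. code-block:: python
+
+   import os
+
+   import requests
+
+   # Push one test result to the TestAPI; the pod, project and test
+   # case must already be declared (field names are assumptions).
+   url = "http://testresults.opnfv.org/test/api/v1/results"
+   result = {
+       "project_name": "functest",
+       "case_name": "vping_ssh",
+       "pod_name": "my-ci-pod",
+       "installer": "apex",
+       "version": "master",
+       "criteria": "PASS",
+       "details": {},
+   }
+   headers = {"X-Auth-Token": os.environ.get("TestApiToken", "")}
+   response = requests.post(url, json=result, headers=headers)
+   response.raise_for_status()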
+
+
+Where can I find the documentation on the test API?
+===================================================
+
+http://artifacts.opnfv.org/releng/docs/testapi.html
+
+
+
+I have tests, to which category should I declare them?
+======================================================
+
+
+
+The main ambiguity could be between features and VNF.
+In fact, sometimes you have to spawn VMs to demonstrate the capabilities of
+the feature you introduced.
+We recommend declaring your test in the feature category.
+
+The VNF category is really dedicated to tests that include:
+
+ * creation of the resources
+ * deployment of an orchestrator/VNFM
+ * deployment of the VNF
+ * testing of the VNFM
+ * freeing of the resources
+
+The goal is not to study a particular feature on the infrastructure but to
+have a whole end-to-end test of a VNF automatically deployed in CI.
+Moreover, VNF tests are run in weekly jobs (once a week), while feature tests
+are run in daily jobs and are used to compute a scenario score.
+
+Where are the logs of CI runs?
+==============================
+
+Logs and configuration files can be pushed from CI to the artifact server,
+under http://artifacts.opnfv.org/<project name>
+
+
+==========
+References
+==========
+
+_`[1]`: http://docs.opnfv.org/en/stable-danube/testing/ecosystem/overview.html
+
+_`[2]`: http://www.opnfv.org
+
+_`[3]`: https://wiki.opnfv.org/display/testing/Result+alignment+for+ELK+post-processing
+
+_`[4]`: https://wiki.opnfv.org/display/INF/CI+Scenario+Naming
+
+_`[5]`: https://git.opnfv.org/functest/tree/functest/utils/functest_utils.py#176
+
+_`[6]`: https://git.opnfv.org/functest/tree/releng
+
+_`[7]`: http://artifacts.opnfv.org/releng/docs/testapi.html
+
+
+IRC support channel: #opnfv-testperf
diff --git a/docs/testing/ecosystem/index.rst b/docs/testing/ecosystem/index.rst
new file mode 100644
index 0000000..f51fa19
--- /dev/null
+++ b/docs/testing/ecosystem/index.rst
@@ -0,0 +1,14 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) Christopher Price (Ericsson AB)
+
+========================
+Test Framework Overview
+========================
+
+.. toctree::
+ :maxdepth: 2
+
+ ./abstract
+ ./overview
+
diff --git a/docs/testing/ecosystem/overview.rst b/docs/testing/ecosystem/overview.rst
new file mode 100644
index 0000000..ed1657c
--- /dev/null
+++ b/docs/testing/ecosystem/overview.rst
@@ -0,0 +1,274 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. SPDX-License-Identifier: CC-BY-4.0
+
+=============
+OPNFV testing
+=============
+
+Introduction
+============
+
+Testing is one of the key activities in OPNFV and includes unit, feature,
+component and system-level testing for development, automated deployment,
+performance characterization and stress testing.
+
+Test projects are dedicated to providing frameworks, tooling and test cases
+categorized as functional, performance or compliance testing. Test projects
+fulfill different roles, such as verifying VIM functionality, benchmarking
+components and platforms, or analyzing measured KPIs for the scenarios
+released in OPNFV.
+
+Feature projects also provide their own test suites that either run independently or within a
+test project.
+
+This document details the OPNFV testing ecosystem, describes common test components used
+by individual OPNFV projects and provides links to project specific documentation.
+
+
+OPNFV testing ecosystem
+=======================
+
+The testing projects
+--------------------
+
+The OPNFV testing projects may be summarized as follows:
+
+.. figure:: ../../images/OPNFV_testing_working_group.png
+ :align: center
+ :alt: Overview of OPNFV Testing projects
+
+The major testing projects are described in the table below:
+
++----------------+---------------------------------------------------------+
+| Project | Description |
++================+=========================================================+
+| Bottlenecks | This project aims to find system bottlenecks by testing |
+| | and verifying OPNFV infrastructure in a staging |
+| | environment before committing it to a production |
+|                | environment. Instead of debugging a deployment in a     |
+|                | production environment, an automatic method for         |
+|                | executing benchmarks, which aims to validate the        |
+|                | deployment during staging, is adopted. This project     |
+|                | forms a staging framework to find bottlenecks and to    |
+|                | analyze the OPNFV infrastructure.                       |
++----------------+---------------------------------------------------------+
+| CPerf | SDN Controller benchmarks and performance testing, |
+| | applicable to controllers in general. Collaboration of |
+| | upstream controller testing experts, external test tool |
+| | developers and the standards community. Primarily |
+| | contribute to upstream/external tooling, then add jobs |
+| | to run those tools on OPNFV's infrastructure. |
++----------------+---------------------------------------------------------+
+| Dovetail | This project intends to define and provide a set of |
+| | OPNFV related validation criteria that will provide |
+| | input for the evaluation of the use of OPNFV trademarks.|
+| | The dovetail project is executed with the guidance and |
+| | oversight of the Compliance and Certification committee |
+|                | and works to secure the goals of the C&C committee for  |
+| | each release. The project intends to incrementally |
+| | define qualification criteria that establish the |
+| | foundations of how we are able to measure the ability to|
+| | utilize the OPNFV platform, how the platform itself |
+| | should behave, and how applications may be deployed on |
+| | the platform. |
++----------------+---------------------------------------------------------+
+| Functest | This project deals with the functional testing of the |
+| | VIM and NFVI. It leverages several upstream test suites |
+| | (OpenStack, ODL, ONOS, etc.) and can be used by feature |
+| | project to launch feature test suites in CI/CD. |
+|                | projects to launch feature test suites in CI/CD.        |
++----------------+---------------------------------------------------------+
+| Qtip | QTIP as the project for "Platform Performance |
+|                | Benchmarking" in OPNFV aims to provide users a simple   |
+| | indicator for performance, supported by comprehensive |
+| | testing data and transparent calculation formula. |
+| | It provides a platform with common services for |
+| | performance benchmarking which helps users to build |
+| | indicators by themselves with ease. |
++----------------+---------------------------------------------------------+
+| Storperf | The purpose of this project is to provide a tool to |
+| | measure block and object storage performance in an NFVI.|
+| | When complemented with a characterization of typical VF |
+| | storage performance requirements, it can provide |
+| | pass/fail thresholds for test, staging, and production |
+| | NFVI environments. |
++----------------+---------------------------------------------------------+
+| VSperf | This project provides a framework for automation of NFV |
+| | data-plane performance testing and benchmarking. The |
+| | NFVI fast-path includes switch technology and network |
+| | with physical and virtual interfaces. VSperf can be |
+| | used to evaluate the suitability of different Switch |
+| | implementations and features, quantify data-path |
+| | performance and optimize platform configurations. |
++----------------+---------------------------------------------------------+
+| Yardstick | The goal of the Project is to verify the infrastructure |
+| | compliance when running VNF applications. NFV Use Cases |
+| | described in ETSI GS NFV 001 show a large variety of |
+| | applications, each defining specific requirements and |
+| | complex configuration on the underlying infrastructure |
+|                | and test tools. The Yardstick concept decomposes typical|
+| | VNF work-load performance metrics into a number of |
+|                | characteristics/performance vectors, each of which      |
+|                | can be represented by distinct test cases.              |
++----------------+---------------------------------------------------------+
+
+
+===================================
+The testing working group resources
+===================================
+
+The assets
+==========
+
+Overall Architecture
+--------------------
+The Test result management can be summarized as follows::
+
+ +-------------+ +-------------+ +-------------+
+ | | | | | |
+ | Test | | Test | | Test |
+ | Project #1 | | Project #2 | | Project #N |
+ | | | | | |
+ +-------------+ +-------------+ +-------------+
+ | | |
+ V V V
+ +---------------------------------------------+
+ | |
+ | Test Rest API front end |
+ | http://testresults.opnfv.org/test |
+ | |
+ +---------------------------------------------+
+ ^ | ^
+ | V |
+ | +-------------------------+ |
+ | | | |
+ | | Test Results DB | |
+ | | Mongo DB | |
+ | | | |
+ | +-------------------------+ |
+ | |
+ | |
+ +----------------------+ +----------------------+
+ | | | |
+ | Testing Dashboards | | Landing page |
+ | | | |
+ +----------------------+ +----------------------+
+
+
+The testing databases
+---------------------
+A MongoDB database was introduced for the Brahmaputra release.
+The following collections are declared in this database:
+ * pods: the list of pods used for production CI
+ * projects: the list of projects providing test cases
+ * testcases: the test cases related to a given project
+ * results: the results of the test cases
+ * scenarios: the OPNFV scenarios tested in CI
+
+This database can be used by any project through the TestAPI.
+Please note that projects may also use additional databases. This database is
+mainly used to collect CI results and scenario trust indicators.
+
+This database is also cloned for OPNFV Plugfest.
+
+
+The test API
+------------
+The Test API is used to declare pods, projects, test cases and test results.
+Pods correspond to the clusters of machines (3 controller and 2 compute nodes
+in HA mode) used to run the tests, as defined in the Pharos project.
+The results pushed into the database are related to pods, projects and cases.
+If you try to push results of tests run on a non-referenced pod, the API will
+return an error message.
+
+An additional method, dashboard, was added in the Brahmaputra release to
+post-process the raw results (deprecated in the Colorado release).
+
+The data model is very basic; 5 objects are available:
+ * Pods
+ * Projects
+ * Testcases
+ * Results
+ * Scenarios
+
+For detailed information, please go to http://artifacts.opnfv.org/releng/docs/testapi.html
+
+
+The reporting
+-------------
+The reporting page for the test projects is http://testresults.opnfv.org/reporting/
+
+.. figure:: ../../images/reporting_page.png
+ :align: center
+ :alt: Testing group reporting page
+
+This page provides a reporting per OPNFV release and per testing project.
+
+.. figure:: ../../images/reporting_danube_page.png
+ :align: center
+ :alt: Testing group Danube reporting page
+
+An evolution of this page is planned.
+It was decided to unify the reporting by creating a landing page that should
+give the scenario status at a glance (it was previously consolidated manually
+on a wiki page).
+
+The landing page (planned for Danube 2.0) will display, per scenario:
+ * the status of the deployment
+ * the scores of the test projects
+ * a trust indicator
+
+Additional filters (version, installer, test collection time window, ...)
+are included.
+
+The test case catalog
+---------------------
+Until the Colorado release, each testing project was managing the list of its
+own test cases. It was very hard to get a global view of the available test
+cases across the different test projects. A common view was possible through
+the API, but it was not very user-friendly.
+In fact, you may list all the cases of a given project by calling:
+
+   http://testresults.opnfv.org/test/api/v1/projects/<project_name>/cases
+
+where project_name is one of: bottlenecks, functest, qtip, storperf, vsperf,
+yardstick
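+
+For example, a minimal sketch using the Python ``requests`` library (the
+``testcases`` key in the response is an assumption to be checked against the
+TestAPI documentation):
+
+.. code-block:: python
+
+   import requests
+
+   # List the declared test cases of the functest project.
+   url = "http://testresults.opnfv.org/test/api/v1/projects/functest/cases"
+   response = requests.get(url)
+   response.raise_for_status()
+   for case in response.json().get("testcases", []):
+       print(case.get("name"))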
+
+It was decided to build a web site providing a consistent view of the test
+cases per project and allowing any scenario owner to build a custom list of
+tests (Danube 2.0).
+
+Other resources
+===============
+
+wiki: https://wiki.opnfv.org/testing
+
+mailing list: test-wg@lists.opnfv.org
+
+IRC channel: #opnfv-testperf
+
+weekly meeting (https://wiki.opnfv.org/display/meetings/TestPerf):
+ * Usual time: Every Thursday 15:00-16:00 UTC / 7:00-8:00 PST
+ * APAC time: 2nd Wednesday of the month 8:00-9:00 UTC
+
+=======================
+Reference documentation
+=======================
+
++----------------+---------------------------------------------------------+
+| Project | Documentation links |
++================+=========================================================+
+| Bottlenecks | https://wiki.opnfv.org/display/bottlenecks/Bottlenecks |
++----------------+---------------------------------------------------------+
+| CPerf | https://wiki.opnfv.org/display/cperf |
++----------------+---------------------------------------------------------+
+| Dovetail | https://wiki.opnfv.org/display/dovetail |
++----------------+---------------------------------------------------------+
+| Functest | https://wiki.opnfv.org/display/functest/ |
++----------------+---------------------------------------------------------+
+| Qtip | https://wiki.opnfv.org/display/qtip |
++----------------+---------------------------------------------------------+
+| Storperf | https://wiki.opnfv.org/display/storperf/Storperf |
++----------------+---------------------------------------------------------+
+| VSperf | https://wiki.opnfv.org/display/vsperf |
++----------------+---------------------------------------------------------+
+| Yardstick | https://wiki.opnfv.org/display/yardstick/Yardstick |
++----------------+---------------------------------------------------------+
diff --git a/docs/testing/testing-dev.rst b/docs/testing/testing-dev.rst
new file mode 100644
index 0000000..e7b6800
--- /dev/null
+++ b/docs/testing/testing-dev.rst
@@ -0,0 +1,51 @@
+.. _testing-dev:
+
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+
+========================
+Testing Developer Guides
+========================
+
+Testing group
+-------------
+.. include:: ./developer/devguide/index.rst
+
+Bottlenecks
+------------
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/bottlenecks/docs/testing/developer/devguide/index
+
+
+Functest
+---------
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/functest/docs/testing/developer/devguide/index
+
+
+QTIP
+-----
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/qtip/docs/testing/developer/devguide/index
+
+
+VSPERF
+-------
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/vswitchperf/docs/testing/developer/devguide/index
+
+
+Yardstick
+---------
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/yardstick/docs/testing/developer/devguide/index
diff --git a/docs/testing/testing-user.rst b/docs/testing/testing-user.rst
new file mode 100644
index 0000000..198b090
--- /dev/null
+++ b/docs/testing/testing-user.rst
@@ -0,0 +1,65 @@
+.. _testing-userguide:
+
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+
+===================
+Testing User Guides
+===================
+
+Bottlenecks
+------------
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/bottlenecks/docs/testing/user/configguide/index
+ ../submodules/bottlenecks/docs/testing/user/userguide/index
+
+
+Functest
+---------
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/functest/docs/testing/user/configguide/index
+ ../submodules/functest/docs/testing/user/userguide/index
+
+
+QTIP
+-----
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/qtip/docs/testing/user/configguide/index
+ ../submodules/qtip/docs/testing/user/userguide/index
+
+
+Storperf
+--------
+
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/storperf/docs/testing/user/index
+
+
+VSPERF
+------
+
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/vswitchperf/docs/testing/user/configguide/index
+ ../submodules/vswitchperf/docs/testing/user/userguide/index
+
+
+Yardstick
+----------
+.. toctree::
+ :maxdepth: 1
+
+ ../submodules/yardstick/docs/testing/user/configguide/index
+ ../submodules/yardstick/docs/testing/user/userguide/index
+
+
+