From 56931a0edc8149f04172419fe8b894f9aa03202a Mon Sep 17 00:00:00 2001 From: Morgan Richomme Date: Thu, 14 Sep 2017 11:53:28 +0200 Subject: Testing group documentation update for Euphrates Change-Id: Ic27cbc0d29c3c1e162814e5314a70b75eebd1714 Signed-off-by: Morgan Richomme (cherry picked from commit 99500b5ab19db0c4e9ef704b6d892e3e1a034aa6) --- docs/images/API-operations.png | Bin 0 -> 154005 bytes docs/images/CreateCase.png | Bin 0 -> 76355 bytes docs/images/DashboardBitergia.png | Bin 0 -> 101316 bytes docs/images/TestcaseCatalog.png | Bin 0 -> 129186 bytes docs/images/reportingMaster.png | Bin 0 -> 61020 bytes docs/images/swaggerUI.png | Bin 0 -> 31824 bytes docs/testing/developer/devguide/dev-guide.rst | 327 +++++++++++--------------- docs/testing/ecosystem/overview.rst | 222 +++++++++++------ docs/testing/testing-dev.rst | 2 +- docs/testing/testing-user.rst | 6 +- 10 files changed, 297 insertions(+), 260 deletions(-) create mode 100644 docs/images/API-operations.png create mode 100644 docs/images/CreateCase.png create mode 100644 docs/images/DashboardBitergia.png create mode 100644 docs/images/TestcaseCatalog.png create mode 100644 docs/images/reportingMaster.png create mode 100644 docs/images/swaggerUI.png diff --git a/docs/images/API-operations.png b/docs/images/API-operations.png new file mode 100644 index 000000000..f1ef662aa Binary files /dev/null and b/docs/images/API-operations.png differ diff --git a/docs/images/CreateCase.png b/docs/images/CreateCase.png new file mode 100644 index 000000000..311701716 Binary files /dev/null and b/docs/images/CreateCase.png differ diff --git a/docs/images/DashboardBitergia.png b/docs/images/DashboardBitergia.png new file mode 100644 index 000000000..1e3eb7bb5 Binary files /dev/null and b/docs/images/DashboardBitergia.png differ diff --git a/docs/images/TestcaseCatalog.png b/docs/images/TestcaseCatalog.png new file mode 100644 index 000000000..770199a01 Binary files /dev/null and b/docs/images/TestcaseCatalog.png differ 
diff --git a/docs/images/reportingMaster.png b/docs/images/reportingMaster.png new file mode 100644 index 000000000..171f33b84 Binary files /dev/null and b/docs/images/reportingMaster.png differ diff --git a/docs/images/swaggerUI.png b/docs/images/swaggerUI.png new file mode 100644 index 000000000..92d5688ba Binary files /dev/null and b/docs/images/swaggerUI.png differ diff --git a/docs/testing/developer/devguide/dev-guide.rst b/docs/testing/developer/devguide/dev-guide.rst index 95d3ac574..494c21e18 100644 --- a/docs/testing/developer/devguide/dev-guide.rst +++ b/docs/testing/developer/devguide/dev-guide.rst @@ -19,7 +19,7 @@ The OPNFV testing ecosystem is wide. The goal of this guide consists in providing some guidelines for new developers involved in test areas. -For the description of the ecosystem, see `[1]`_. +For the description of the ecosystem, see `[DEV1]`_. ================= Developer journey @@ -30,6 +30,8 @@ There are several ways to join test projects as a developer. In fact you may: * Develop new test cases * Develop frameworks * Develop tooling (reporting, dashboards, graphs, middleware,...) + * Troubleshoot results + * Post-process results These different tasks may be done within a specific project or as a shared resource accross the different projects. @@ -48,6 +50,11 @@ Tooling may be specific to a project or generic to all the projects. For specific tooling, please report to the test project user guide. The tooling used by several test projects will be detailed in this document. +The best event to meet the testing community is probably the plugfest. Such an +event is organized after each release. Most of the test projects are present. + +The summit is also a good opportunity to meet most of the actors `[DEV4]`_. + Be involved in the testing group ================================ @@ -57,8 +64,8 @@ consistant test strategy (test case definition, scope of projects, resources for long duration, documentation, ...) 
and align tooling or best practices. A weekly meeting is organized, the agenda may be amended by any participant. -2 slots have been defined (US and APAC). Agendas and minutes are public.See -`[8]`_ for details. +2 slots have been defined (US/Europe and APAC). Agendas and minutes are public. +See `[DEV3]`_ for details. The testing group IRC channel is #opnfv-testperf Best practices ============== @@ -71,7 +78,7 @@ indicative. Contact the testing group for further details. Repository structure ---------------------- +-------------------- Most of the projects have a similar structure, which can be defined as follows:: @@ -110,7 +117,7 @@ projects. This pseudo micro service approach should allow a flexible use of the different projects and reduce the risk of overlapping. In fact if project A provides an API to deploy a traffic generator, it is better to reuse it rather than implementing a new way to deploy it. This approach has not been implemented -yet but the prerequisite consiting in exposing and API has already been done by +yet but the prerequisites consisting of exposing an API have already been done by several test projects. @@ -122,25 +129,76 @@ possible to prepare the environement and run tests through a CLI. Dockerization ------------- -Dockerization has been introduced in Brahmaputra and adopted by the test +Dockerization has been introduced in Brahmaputra and adopted by most of the test projects. Docker containers are pulled on the jumphost of OPNFV POD. - - - - -Unit tests ---------- - - -Traffic generators ------------------ + + +Code quality +------------ + +It is recommended to control the quality of the code of the testing projects, +and more precisely to implement some verifications before any merge: + * pep8 + * pylint + * unit tests (python 2.7) + * unit tests (python 3.5) + + +The code of the test project must be covered by unit tests. The coverage +shall be reasonable and not decrease when adding new features to the framework. +The use of tox is recommended.
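The merge gate described in the Code quality section above is commonly wired together with tox so that a single command runs every check. A minimal tox.ini sketch follows; the environment names mirror the list above, while the "mypackage" module name, the tool choices (flake8, pylint, pytest) and the coverage threshold are illustrative assumptions rather than values taken from a specific OPNFV project:

```ini
[tox]
envlist = pep8,pylint,py27,py35

[testenv]
deps =
    -rrequirements.txt
    -rtest-requirements.txt

[testenv:pep8]
commands = flake8 mypackage

[testenv:pylint]
commands = pylint --rcfile=.pylintrc mypackage

[testenv:py27]
basepython = python2.7
commands = pytest --cov=mypackage --cov-fail-under=80 tests/

[testenv:py35]
basepython = python3.5
commands = pytest --cov=mypackage --cov-fail-under=80 tests/
```

A fixed --cov-fail-under floor is one simple way to guarantee that unit test coverage does not decrease when new features are added; stricter per-module pylint score gates can be layered on top for critical Python classes.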
+It is possible to implement strict rules (no decrease of pylint score, unit +test coverages) on critical python classes. + + +Third party tooling +------------------- + +Several test projects integrate third party tooling for code quality check +and/or traffic generation. Some of the tools can be listed as follows: + ++---------------+----------------------+------------------------------------+ +| Project | Tool | Comments | ++===============+======================+====================================+ +| Bottlenecks | TODO | | ++---------------+----------------------+------------------------------------+ +| Functest | Tempest | OpenStack test tooling | +| | Rally | OpenStack test tooling | +| | Refstack | OpenStack test tooling | +| | RobotFramework | Used for ODL tests | ++---------------+----------------------+------------------------------------+ +| QTIP | Unixbench | | +| | RAMSpeed | | +| | nDPI | | +| | openSSL | | +| | inxi | | ++---------------+----------------------+------------------------------------+ +| Storperf | TODO | | ++---------------+----------------------+------------------------------------+ +| VSPERF | TODO | | ++---------------+----------------------+------------------------------------+ +| Yardstick | Moongen | Traffic generator | +| | Trex | Traffic generator | +| | Pktgen | Traffic generator | +| | IxLoad, IxNet | Traffic generator | +| | SPEC | Compute | +| | Unixbench | Compute | +| | RAMSpeed | Compute | +| | LMBench | Compute | +| | Iperf3 | Network | +| | Netperf | Network | +| | Pktgen-DPDK | Network | +| | Testpmd | Network | +| | L2fwd | Network | +| | Fio | Storage | +| | Bonnie++ | Storage | ++---------------+----------------------+------------------------------------+ ====================================== Testing group configuration parameters ====================================== - Testing categories ================== @@ -195,184 +253,76 @@ Ideally based on the declaration of the test cases, through the tags, domains and 
tier fields, it shall be possible to create heuristic maps. -================= -TestAPI framework -================= - -The OPNFV testing group created a test collection database to collect -the test results from CI: - - - http://testresults.opnfv.org/test/swagger/spec.html - -Any test project running on any lab integrated in CI can push the -results to this database. -This database can be used to see the evolution of the tests and compare -the results versus the installers, the scenarios or the labs. -It is used to produce a dashboard with the current test status of the project. - - -Overall Architecture -==================== -The Test result management can be summarized as follows:: - - +-------------+ +-------------+ +-------------+ - | | | | | | - | Test | | Test | | Test | - | Project #1 | | Project #2 | | Project #N | - | | | | | | - +-------------+ +-------------+ +-------------+ - | | | - V V V - +-----------------------------------------+ - | | - | Test Rest API front end | - | | - +-----------------------------------------+ - | | - | V - | +-------------------------+ - | | | - | | Test Results DB | - | | Mongo DB | - | | | - | +-------------------------+ - | - | - +----------------------+ - | | - | test Dashboard | - | | - +----------------------+ - -TestAPI description -=================== -The TestAPI is used to declare pods, projects, test cases and test -results. Pods are the sets of bare metal or virtual servers and networking -equipments used to run the tests. - -The results pushed in the database are related to pods, projects and test cases. -If you try to push results of test done on non referenced pod, the API will -return an error message. - -An additional method dashboard has been added to post-process -the raw results in release Brahmaputra (deprecated in Colorado). 
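The result-publication flow described in this section boils down to a single JSON POST against the TestAPI. As an illustrative sketch only (the results endpoint path follows the public swagger spec quoted above, while the project, case and pod names are made-up placeholders, and the field names are assumed from the pods/projects/testcases/results data model), a minimal Python helper could look like this:

```python
import json
import os
import urllib.request

# Endpoint from the public swagger spec; adjust for a local TestAPI deployment.
TESTAPI_RESULTS_URL = "http://testresults.opnfv.org/test/api/v1/results"

def build_result(project, case, pod, installer, scenario, criteria, details):
    """Assemble the JSON document for one test run."""
    return {
        "project_name": project,
        "case_name": case,
        "pod_name": pod,        # must reference a pod already declared
        "installer": installer,
        "scenario": scenario,
        "criteria": criteria,   # e.g. "PASS" or "FAIL"
        "details": details,     # free-form raw results
    }

def push_result(result, url=TESTAPI_RESULTS_URL):
    """POST the result; the X-Auth-Token header comes from the Jenkins env."""
    headers = {"Content-Type": "application/json"}
    token = os.environ.get("TestApiToken")
    if token:
        headers["X-Auth-Token"] = token
    req = urllib.request.Request(url, data=json.dumps(result).encode("utf-8"),
                                 headers=headers)
    return urllib.request.urlopen(req)

# Placeholder values for illustration; push_result(result) would do the POST.
result = build_result("functest", "vping_ssh", "intel-pod1", "fuel",
                      "os-nosdn-nofeature-ha", "PASS", {"duration": 25})
```

As noted in this section, pushing a result that references a pod not declared in the database returns an error.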
- -The data model is very basic, 5 objects are created: - - * Pods - * Projects - * Testcases - * Results - * Scenarios - -The code of the API is hosted in the releng repository `[6]`_. -The static documentation of the API can be found at `[7]`_. -The TestAPI has been dockerized and may be installed locally in your -lab. See `[15]`_ for details. - -The deployment of the TestAPI has been automated. -A jenkins job manages: - - * the unit tests of the TestAPI - * the creation of a new docker file - * the deployment of the new TestAPI - * the archive of the old TestAPI - * the backup of the Mongo DB - -TestAPI Authorization ---------------------- - -PUT/DELETE/POST operations of the TestAPI now require token based authorization. The token needs -to be added in the request using a header 'X-Auth-Token' for access to the database. - -e.g:: - headers['X-Auth-Token'] - -The value of the header i.e the token can be accessed in the jenkins environment variable -*TestApiToken*. The token value is added as a masked password. - -.. code-block:: python - - headers['X-Auth-Token'] = os.environ.get('TestApiToken') - -The above example is in Python. Token based authentication has been added so that only ci pods -jenkins job can have access to the database. - -Please note that currently token authorization is implemented but is not yet enabled. - -=============================== -Feedback from the testing group -================================ - -Test case catalog -=================== - -A test case catalog has been realized. Roll over the project then click to get -the list of test cases, click on the case to get more details. - -.. raw:: html - :url: http://testresults.opnfv.org/reporting2/reporting/index.html#!/select/visual - -Reporting -========= - -An automatic reporting page has been created in order to provide a -consistent view of the scenarios. - -In this page, each scenario is evaluated according to test criteria. 
-The code for the automatic reporting is available at `[8]`_. - -The results are collected from the centralized database every day and, -per scenario. A score is calculated based on the results from the last -10 days. - -Dashboard -========= - -Dashboard is used to provide a consistent view of the results collected in CI. -The results showed on the dashboard are post processed from the Database, -which only contains raw results. - -It can be used in addition of the reporting page (high level view) to allow -the creation of specific graphs according to what the test owner wants to show. - -In Brahmaputra, a basic home made dashboard was created in Functest. -In Colorado, Yardstick adopted Grafana (time based graphs) and ELK (complex -graphs). -Since Danube, the testing community decided to adopt ELK framework and to rely -on bitergia. It was not implemented for Danube but it is planned for Euphrates. - -Bitergia already provides a dashboard for code and infrastructure. -A new Test tab will be added. The dataset will be built by consuming -the TestAPI. - -See `[3]`_ for details. - - ======= How TOs ======= Where can I find information on the different test projects? =========================================================== +On http://docs.opnfv.org! A section is dedicated to the testing projects. You +will find the overview of the ecosystem and the links to the project documents. + +Another source is the testing wiki on https://wiki.opnfv.org/display/testing + +You may also contact the testing group on the IRC channel #opnfv-testperf or by +mail at test-wg AT lists.opnfv.org (testing group) or opnfv-tech-discuss AT +lists.opnfv.org (generic technical discussions). How can I contribute to a test project? ======================================= +As any project, the best solution is to contact the project. 
The project +members with their email address can be found under +https://git.opnfv.org//tree/INFO + +You may also send a mail to the testing mailing list or use the IRC channel +#opnfv-testperf Where can I find hardware resources? ==================================== +You should discuss this topic with the project you are working with. If you need +access to an OPNFV community POD, it is possible to contact the infrastructure +group. Depending on your needs (scenario/installer/tooling), it should be +possible to find free time slots on one OPNFV community POD from the Pharos +federation. Create a JIRA ticket to describe your needs on +https://jira.opnfv.org/projects/INFRA. +You must already be an OPNFV contributor. See +https://wiki.opnfv.org/display/DEV/Developer+Getting+Started. + +Please note that lots of projects have their own "how to contribute" or +"get started" page on the OPNFV wiki. How do I integrate my tests in CI? ================================== - +It shall be discussed directly with the project you are working with. It is +done through Jenkins jobs calling testing project files but the way to onboard +cases differs from one project to another. How to declare my tests in the test Database? ============================================= +If you have access to the test API swagger (access granted to contributors), you +may use the swagger interface of the test API to declare your project. +The URL is http://testresults.opnfv.org/test/swagger/spec.html. + +.. figure:: ../../../images/swaggerUI.png + :align: center + :alt: Testing Group Test API swagger +Click on *Spec*, the list of available methods must be displayed. + +.. figure:: ../../../images/API-operations.png + :align: center + :alt: Testing Group Test API swagger + +For the declaration of a new project, use the POST /api/v1/projects method. +For the declaration of new test cases in an existing project, use the POST + /api/v1/projects/{project_name}/cases method + + ..
figure:: ../../../images/CreateCase.png + :align: center + :alt: Testing group declare new test case How to push your results into the Test Database? ================================================ @@ -388,20 +338,21 @@ The architecture and associated API is described in previous chapter. If you want to push your results from CI, you just have to call the API at the end of your script. -You can also reuse a python function defined in functest_utils.py `[5]`_ +You can also reuse a python function defined in functest_utils.py `[DEV2]`_ Where can I find the documentation on the test API? =================================================== +The Test API is now documented in this document (see sections above). +You may also find autogenerated documentation at http://artifacts.opnfv.org/releng/docs/testapi.html - - +A web portal is also under construction for certification at +http://testresults.opnfv.org/test/#/ I have tests, to which category should I declare them? ====================================================== - - +See table above. The main ambiguity could be between features and VNF.
In fact sometimes you have to spawn VMs to demonstrate the capabilities of the @@ -432,21 +383,17 @@ http://artifacts.opnfv.org/ References ========== -_`[1]`: http://docs.opnfv.org/en/stable-danube/testing/ecosystem/overview.html - -_`[2]`: http://www.opnfv.org - -_`[3]`: https://wiki.opnfv.org/display/testing/Result+alignment+for+ELK+post-processing - -_`[4]`: https://wiki.opnfv.org/display/INF/CI+Scenario+Naming - -_`[5]`: https://git.opnfv.org/functest/tree/functest/utils/functest_utils.py#176 +`[DEV1]`_: OPNFV Testing Ecosystem -_`[6]`: https://git.opnfv.org/functest/tree/releng +`[DEV2]`_: Python code sample to push results into the Database -_`[7]`: http://artifacts.opnfv.org/releng/docs/testapi.html +`[DEV3]`_: Testing group wiki page -_`[8]`: https://wiki.opnfv.org/display/meetings/Test+Working+Group+Weekly+Meeting +`[DEV4]`_: Conversation with the testing community, OPNFV Beijing Summit +.. _`[DEV1]`: http://docs.opnfv.org/en/latest/testing/ecosystem/index.html +.. _`[DEV2]`: https://git.opnfv.org/functest/tree/functest/utils/functest_utils.py#176 +.. _`[DEV3]`: https://wiki.opnfv.org/display/meetings/Test+Working+Group+Weekly+Meeting +.. _`[DEV4]`: https://www.youtube.com/watch?v=f9VAUdEqHoA IRC support chan: #opnfv-testperf diff --git a/docs/testing/ecosystem/overview.rst b/docs/testing/ecosystem/overview.rst index ed1657c87..43ee7771d 100644 --- a/docs/testing/ecosystem/overview.rst +++ b/docs/testing/ecosystem/overview.rst @@ -1,21 +1,21 @@ .. This work is licensed under a Creative Commons Attribution 4.0 International License. .. SPDX-License-Identifier: CC-BY-4.0 -============= -OPNFV testing -============= +====================== +OPNFV Testing Overview +====================== Introduction ============ -Testing is one of the key activities in OPNFV and includes unit, feature, component, system -level testing for development, automated deployment, performance characterization or stress -testing. 
+Testing is one of the key activities in OPNFV and includes unit, feature, +component, system level testing for development, automated deployment, +performance characterization and stress testing. Test projects are dedicated to provide frameworks, tooling and test-cases categorized as functional, performance or compliance testing. Test projects fulfill different roles such as verifying VIM functionality, benchmarking components and platforms or analysis of measured -KPIs for the scenarios released in OPNFV. +KPIs for OPNFV release scenarios. Feature projects also provide their own test suites that either run independently or within a test project. @@ -24,13 +24,10 @@ This document details the OPNFV testing ecosystem, describes common test compone by individual OPNFV projects and provides links to project specific documentation. -OPNFV testing ecosystem -======================= - -The testing projects --------------------- +The OPNFV Testing Ecosystem +=========================== -The OPNFV testing projects may be summarized as follows: +The OPNFV testing projects are represented in the following diagram: .. figure:: ../../images/OPNFV_testing_working_group.png :align: center @@ -92,13 +89,20 @@ The major testing projects are described in the table below: | | pass/fail thresholds for test, staging, and production | | | NFVI environments. | +----------------+---------------------------------------------------------+ -| VSperf | This project provides a framework for automation of NFV | -| | data-plane performance testing and benchmarking. The | -| | NFVI fast-path includes switch technology and network | -| | with physical and virtual interfaces. VSperf can be | -| | used to evaluate the suitability of different Switch | -| | implementations and features, quantify data-path | -| | performance and optimize platform configurations. 
| +| VSPERF | VSPERF is an OPNFV project that provides an automated | +| | test-framework and comprehensive test suite based on | +| | Industry Test Specifications for measuring NFVI | +| | data-plane performance. The data-path includes switching| +| | technologies with physical and virtual network | +| | interfaces. The VSPERF architecture is switch and | +| | traffic generator agnostic and test cases can be easily | +| | customized. Software versions and configurations | +| | including the vSwitch (OVS or VPP) as well as the | +| | network topology are controlled by VSPERF (independent | +| | of OpenStack). VSPERF is used as a development tool for | +| | optimizing switching technologies, qualification of | +| | packet processing components and for pre-deployment | +| | evaluation of the NFV platform data-path. | +----------------+---------------------------------------------------------+ | Yardstick | The goal of the Project is to verify the infrastructure | | | compliance when running VNF applications. NFV Use Cases | @@ -112,16 +116,27 @@ The major testing projects are described in the table below: +----------------+---------------------------------------------------------+ -=================================== -The testing working group resources -=================================== +=============================== +Testing Working Group Resources +=============================== -The assets -========== +Test Results Collection Framework +================================= -Overall Architecture --------------------- -The Test result management can be summarized as follows:: +Any test project running in the global OPNFV lab infrastructure and is +integrated with OPNFV CI can push test results to the community Test Database +using a common Test API. This database can be used to track the evolution of +testing and analyse test runs to compare results across installers, scenarios +and between technically and geographically diverse hardware environments. 
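Since the database keeps raw results keyed by project, case, installer, scenario and pod, the comparisons described above amount to filtered GET requests on the results collection. A small Python sketch, assuming query parameters (project, case, installer, period) as exposed by the public Test API; the filter values and the pass_rate post-processing are illustrative:

```python
from urllib.parse import urlencode

TESTAPI_RESULTS_URL = "http://testresults.opnfv.org/test/api/v1/results"

def results_query(**filters):
    """Build a filtered query URL against the results collection."""
    return TESTAPI_RESULTS_URL + "?" + urlencode(sorted(filters.items()))

def pass_rate(results):
    """Share of PASS verdicts in a list of raw result documents."""
    if not results:
        return 0.0
    return sum(1 for r in results if r.get("criteria") == "PASS") / len(results)

# Compare one test case over a 10-day window for a given installer.
url = results_query(project="functest", case="vping_ssh",
                    installer="apex", period=10)
# Fetching this URL returns a JSON document {"results": [...]} that can be
# post-processed, e.g. with pass_rate(), for dashboards or reporting.
```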
+ +Results from the database are used to generate a dashboard with the current test +status for each testing project. Please note that you can also deploy the Test +Database and Test API locally in your own environment. + +Overall Test Architecture +------------------------- + +The management of test results can be summarized as follows:: +-------------+ +-------------+ +-------------+ | | | | | | @@ -149,14 +164,14 @@ The Test result management can be summarized as follows:: | | +----------------------+ +----------------------+ | | | | - | Testing Dashboards | | Landing page | + | Testing Dashboards | | Test Landing page | | | | | +----------------------+ +----------------------+ -The testing databases ---------------------- -A Mongo DB Database has been introduced for the Brahmaputra release. +The Test Database +----------------- +A Mongo DB Database was introduced for the Brahmaputra release. The following collections are declared in this database: * pods: the list of pods used for production CI * projects: the list of projects providing test cases @@ -164,21 +179,21 @@ The following collections are declared in this database: * results: the results of the test cases * scenarios: the OPNFV scenarios tested in CI -This database can be used by any project through the testapi. -Please note that projects may also use additional databases. This database is -mainly use to colelct CI results and scenario trust indicators. +This database can be used by any project through the Test API. +Please note that projects may also use additional databases. The Test +Database is mainly used to collect CI test results and generate scenario +trust indicators. The Test Database is cloned for OPNFV Plugfests in +order to provide a private datastore only accessible to Plugfest participants. -This database is also cloned for OPNFV Plugfest. - -The test API ------------- +Test API description +-------------------- The Test API is used to declare pods, projects, test cases and test results.
Pods correspond to the cluster of machines (3 controller and 2 compute nodes in HA mode) used to run the tests and defined in Pharos project. -The results pushed in the database are related to pods, projects and cases. -If you try to push results of test done on non referenced pod, the API will -return an error message. +The results pushed in the database are related to pods, projects and test cases. +Trying to push results generated from a non-referenced pod will cause the Test +API to return an error message. An additional method dashboard has been added to post-process the raw results in the Brahmaputra release (deprecated in Colorado release). @@ -192,53 +207,110 @@ The data model is very basic, 5 objects are available: For detailed information, please go to http://artifacts.opnfv.org/releng/docs/testapi.html +The code of the Test API is hosted in the releng repository `[TST2]`_. +The static documentation of the Test API can be found at `[TST3]`_. +The Test API has been dockerized and may be installed locally in your lab. + +The deployment of the Test API has been automated. +A Jenkins job manages: + + * the unit tests of the Test API + * the creation of a new docker file + * the deployment of the new Test API + * the archive of the old Test API + * the backup of the Mongo DB + +Test API Authorization +---------------------- + +PUT/DELETE/POST operations of the TestAPI now require token based authorization. The token needs +to be added in the request using a header 'X-Auth-Token' for access to the database. + +e.g.:: + headers['X-Auth-Token'] + +The value of the header, i.e. the token, can be accessed in the Jenkins environment variable +*TestApiToken*. The token value is added as a masked password. + +.. code-block:: python + + headers['X-Auth-Token'] = os.environ.get('TestApiToken') + +The above example is in Python. Token based authentication has been added so +that only CI pods running Jenkins jobs can access the database.
Please note +that currently token authorization is implemented but is not yet enabled. -The reporting -------------- + +Test Project Reporting +====================== The reporting page for the test projects is http://testresults.opnfv.org/reporting/ .. figure:: ../../images/reporting_page.png :align: center :alt: Testing group reporting page -This page provides a reporting per OPNFV release and per testing project. +This page provides reporting per OPNFV release and per testing project. -.. figure:: ../../images/reporting_danube_page.png +.. figure:: ../../images/reportingMaster.png :align: center - :alt: Testing group Danube reporting page + :alt: Testing group Euphrates reporting page -An evolution of this page is planned. -It was decided to unify the reporting by creating a landing page that should give -the scenario status in one glance (it was previously consolidated manually -on a wiki page). +An evolution of the reporting page is planned to unify test reporting by creating +a landing page that shows the scenario status with one glance (this information was +previously consolidated manually on a wiki page). The landing page will be displayed +per scenario and show: -The landing page (planned for Danube 2.0) will be displayed per scenario: * the status of the deployment - * the score of the test projectS + * the score from each test suite. There is no overall score, it is determined + by each test project. * a trust indicator -Additional filters (version, installer, test collection time window,... ) are -included. -The test case catalog ---------------------- -Until the Colorado release, each testing project was managing the list of its -test cases. It was very hard to have a global view of the available test cases -among the different test projects. A common view was possible through the API +Test Case Catalog +================= +Until the Colorado release, each testing project managed the list of its +test cases. 
This made it very hard to have a global view of the available test +cases from the different test projects. A common view was possible through the API but it was not very user friendly. -In fact you may know all the cases per project calling: +Test cases per project may be listed by calling: http://testresults.opnfv.org/test/api/v1/projects//cases with project_name: bottlenecks, functest, qtip, storperf, vsperf, yardstick -It was decided to build a web site providing a consistent view of the test cases -per project and allow any scenario owner to build his/her custom list of tests -(Danube 2.0). +A test case catalog has now been created `[TST4]`_. Roll over the project then +click to get the list of test cases, click on the case to get more details. + +.. figure:: ../../images/TestcaseCatalog.png + :align: center + :alt: Testing group testcase catalog -Other resources +Test Dashboards =============== +The Test Dashboard is used to provide a consistent view of the results collected in CI. +The results shown on the dashboard are post-processed from the Database, which only +contains raw results. +The dashboard can be used in addition to the reporting page (high level view) to allow +the creation of specific graphs according to what the test owner wants to show. + +In Brahmaputra, a basic dashboard was created in Functest. +In Colorado, Yardstick used Grafana (time based graphs) and ELK (complex +graphs). +Since Danube, the OPNFV testing community decided to adopt the ELK framework and to +use Bitergia for creating highly flexible dashboards `[TST5]`_. + ..
figure:: ../../images/DashboardBitergia.png + :align: center + :alt: Testing group Bitergia dashboard + + +OPNFV Test Group Information +============================= + +For more information or to participate in the OPNFV test community please see the +following: + wiki: https://wiki.opnfv.org/testing mailing list: test-wg@lists.opnfv.org @@ -249,8 +321,9 @@ weekly meeting (https://wiki.opnfv.org/display/meetings/TestPerf): * Usual time: Every Thursday 15:00-16:00 UTC / 7:00-8:00 PST * APAC time: 2nd Wednesday of the month 8:00-9:00 UTC + ======================= -Reference documentation +Reference Documentation ======================= +----------------+---------------------------------------------------------+ @@ -272,3 +345,20 @@ Reference documentation +----------------+---------------------------------------------------------+ | Yardstick | https://wiki.opnfv.org/display/yardstick/Yardstick | +----------------+---------------------------------------------------------+ + + +`[TST1]`_: OPNFV web site + +`[TST2]`_: Test utils in Releng + +`[TST3]`_: TestAPI autogenerated documentation + +`[TST4]`_: Testcase catalog + +`[TST5]`_: Testing group dashboard + +.. _`[TST1]`: http://www.opnfv.org +.. _`[TST2]`: https://git.opnfv.org/functest/tree/releng/utils/tests +.. _`[TST3]`: http://artifacts.opnfv.org/releng/docs/testapi.html +.. _`[TST4]`: http://testresults.opnfv.org/testing/index.html#!/select/visual +.. _`[TST5]`: https://opnfv.biterg.io:443/goto/283dba93ca18e95964f852c63af1d1ba diff --git a/docs/testing/testing-dev.rst b/docs/testing/testing-dev.rst index e7b680044..5e23312fb 100644 --- a/docs/testing/testing-dev.rst +++ b/docs/testing/testing-dev.rst @@ -9,7 +9,7 @@ Testing Developer Guides Testing group ------------- -.. include:: ./developer/devguide/index.rst +..
include:: ./developer/devguide/index Bottlenecks ------------ diff --git a/docs/testing/testing-user.rst b/docs/testing/testing-user.rst index 198b090e6..6b533a26d 100644 --- a/docs/testing/testing-user.rst +++ b/docs/testing/testing-user.rst @@ -7,6 +7,9 @@ Testing User Guides =================== +This page provides the links to the installation, configuration and user guides +of the different test projects. + Bottlenecks ------------ .. toctree:: @@ -60,6 +63,3 @@ Yardstick ../submodules/yardstick/docs/testing/user/configguide/index ../submodules/yardstick/docs/testing/user/userguide/index - - - -- cgit 1.2.3-korg