Diffstat (limited to 'docs')
46 files changed, 580 insertions, 417 deletions
diff --git a/docs/how-to-use-docs/include-documentation.rst b/docs/how-to-use-docs/include-documentation.rst index 7e9d54462..d1a5a6227 100644 --- a/docs/how-to-use-docs/include-documentation.rst +++ b/docs/how-to-use-docs/include-documentation.rst @@ -240,4 +240,15 @@ Clone the opnfvdocs repository and your submodule to .gitmodules following the c git submodule init $reponame/ git submodule update $reponame/ git add . + git commit -sv git review + +Removing a project repository as a submodule +-------------------------------------------- + git rm docs/submodules/$reponame + rm -rf .git/modules/$reponame + git config -f .git/config --remove-section submodule.$reponame 2> /dev/null + git add . + git commit -sv + git review + diff --git a/docs/images/API-operations.png b/docs/images/API-operations.png Binary files differ new file mode 100644 index 000000000..f1ef662aa --- /dev/null +++ b/docs/images/API-operations.png diff --git a/docs/images/CreateCase.png b/docs/images/CreateCase.png Binary files differ new file mode 100644 index 000000000..311701716 --- /dev/null +++ b/docs/images/CreateCase.png diff --git a/docs/images/DashboardBitergia.png b/docs/images/DashboardBitergia.png Binary files differ new file mode 100644 index 000000000..1e3eb7bb5 --- /dev/null +++ b/docs/images/DashboardBitergia.png diff --git a/docs/images/TestcaseCatalog.png b/docs/images/TestcaseCatalog.png Binary files differ new file mode 100644 index 000000000..770199a01 --- /dev/null +++ b/docs/images/TestcaseCatalog.png diff --git a/docs/images/reportingMaster.png b/docs/images/reportingMaster.png Binary files differ new file mode 100644 index 000000000..171f33b84 --- /dev/null +++ b/docs/images/reportingMaster.png diff --git a/docs/images/swaggerUI.png b/docs/images/swaggerUI.png Binary files differ new file mode 100644 index 000000000..92d5688ba --- /dev/null +++ b/docs/images/swaggerUI.png diff --git a/docs/submodules/apex b/docs/submodules/apex -Subproject 002537a8dd9dde4f488e2be75f0f69ea43b500b +Subproject
d76b7e53517116a4b5cbe10dbf69a1a82e9b9a4 diff --git a/docs/submodules/armband b/docs/submodules/armband -Subproject e0d9c392488c74393574fcd93ad1f8a3ad57613 +Subproject 0ced943988c95f9e033dd5d14cfe54870c50fa0 diff --git a/docs/submodules/availability b/docs/submodules/availability -Subproject 115c90829b9cfd773853fd3ee0293599308d597 +Subproject 374fe9ba9a1fc9c10eb6fa44a25b177f1e9ad96 diff --git a/docs/submodules/barometer b/docs/submodules/barometer -Subproject 151a5d6db45763a4c130a37d69f120704fac803 +Subproject b104e2981f0b0ec19f109e220cddc76d97d0ed1 diff --git a/docs/submodules/bottlenecks b/docs/submodules/bottlenecks -Subproject 6edf7710c416949a9c1ad3520b6febd3a41deb6 +Subproject ad12c5707ac1191063af5cd33d54dd30ff64978 diff --git a/docs/submodules/compass4nfv b/docs/submodules/compass4nfv -Subproject c450b54ffb461e5a9a157b13679686170f7c385 +Subproject 074eab95235e35431b0439e42d253d44b139004 diff --git a/docs/submodules/container4nfv b/docs/submodules/container4nfv new file mode 160000 +Subproject f8745467295773dd0eea75079f5e2b0b50bfcd2 diff --git a/docs/submodules/daisy b/docs/submodules/daisy -Subproject f3ad3936a5d395ecb31849835e6517bdc5de8d5 +Subproject 2045ccff6a31ce649cfabc0ba896e9d0d708e3e diff --git a/docs/submodules/doctor b/docs/submodules/doctor -Subproject 96665938926b5083cf917517e2ecb7c7267d72c +Subproject a321477d00ccb84b0d99e7476e4306605498409 diff --git a/docs/submodules/domino b/docs/submodules/domino -Subproject 21f973902ee4428462564d27719f366b41ce04c +Subproject eb2fbb1315e6489dd159c8227030d035bdeb186 diff --git a/docs/submodules/dovetail b/docs/submodules/dovetail -Subproject 344f27b87856faa0f5440d198b81038ae7b4562 +Subproject 312db1d49ecfbefbb8b8795cfc1c2b58c07b711 diff --git a/docs/submodules/fds b/docs/submodules/fds -Subproject b0fb36a3ef00e7959a750a96b86fc87bc41a194 +Subproject f5e75a40dd663fd42645f5a334983ce519a697b diff --git a/docs/submodules/fuel b/docs/submodules/fuel -Subproject f2ea17e489b4c3f5e0b5cdc75860b094f3a05b9 +Subproject 
2e9fbb20072005831fe96af94b8d1495f5eb30a diff --git a/docs/submodules/functest b/docs/submodules/functest -Subproject af86e67845ff7d074d389aa54cd4c3744d03734 +Subproject 31524290992dad02d5ca1e6b17304bf31b56b7f diff --git a/docs/submodules/ipv6 b/docs/submodules/ipv6 -Subproject a3b23660532d314de57a51c98ef30dddf14f5e8 +Subproject be8ef89c7d887117aa6f0d6ef3086509b0dbb3b diff --git a/docs/submodules/joid b/docs/submodules/joid -Subproject fd8bd0c1beb084655a06a2cc5409fde190951e8 +Subproject 30f7cf181e9e3b275d9ce62c2f74e9ab443fe5a diff --git a/docs/submodules/kvmfornfv b/docs/submodules/kvmfornfv -Subproject d651cc71797f8f32b0fe40ca4ee1c21d50558fd +Subproject 827627ae5f5674775062ab6a8a31a0ae1bbba7c diff --git a/docs/submodules/nfvbench b/docs/submodules/nfvbench -Subproject 4d3864c3972250654c3750764c2cf18e13c631d +Subproject 846250f4c570798a1058dd8d3a405513c490093 diff --git a/docs/submodules/openretriever b/docs/submodules/openretriever deleted file mode 160000 -Subproject b7554f86392beedae07cf5e103d2da4c2c8971a diff --git a/docs/submodules/orchestra b/docs/submodules/orchestra new file mode 160000 +Subproject d897a8fae077e366a9a9e1fa53d160b0db6e356 diff --git a/docs/submodules/ovn4nfv b/docs/submodules/ovn4nfv new file mode 160000 +Subproject 810bb7f5c6615a2d90d9451337e6df6a01702e5 diff --git a/docs/submodules/parser b/docs/submodules/parser -Subproject d66af1c9113ec897049a55b80ca70496651502b +Subproject a224201e408aca8424b076b5e46138b5e322a9e diff --git a/docs/submodules/pharos b/docs/submodules/pharos -Subproject aa88673c38be12368dead5e8241fb915d790c43 +Subproject b201dbf1f3f8723793a06351b7cdd2389d44565 diff --git a/docs/submodules/promise b/docs/submodules/promise -Subproject 7c94bb3d81a1caabe54afef0387d7055e4e08f4 +Subproject c0f8f3487b1782eeb05ce6d0d2c4b8578594e5d diff --git a/docs/submodules/qtip b/docs/submodules/qtip -Subproject 4bc87f5dc55cfbcc1cc7cdcbd1b6414d0da1e98 +Subproject 583d7181b15ae6f8d6319c36bad6fd043f47c89 diff --git a/docs/submodules/releng 
b/docs/submodules/releng -Subproject eabeb68cdd73b39bfd9f1cd9e5283b7cbf87459 +Subproject 31056a7180802ecfcd3e8abb58e74911ef67964 diff --git a/docs/submodules/releng-xci b/docs/submodules/releng-xci new file mode 160000 +Subproject 9b0b4e1979633c4320dc0dea0707910c36cb056 diff --git a/docs/submodules/samplevnf b/docs/submodules/samplevnf -Subproject ada951272b7d7cc06dd0086276586d1d4032261 +Subproject be23f20bcddd63dec506515c1518c995b0f8aa8 diff --git a/docs/submodules/sdnvpn b/docs/submodules/sdnvpn -Subproject 8b31653352a27e1dc9495d6f19b25849ffbbda1 +Subproject 8a20b43bfb9723ae1f2410a5448363a5e3f38f7 diff --git a/docs/submodules/sfc b/docs/submodules/sfc -Subproject 0339379381aca3e0234125945c5dabbe895b503 +Subproject 9c11d59035ff1741c5e6d935aa7c2ed23d15f48 diff --git a/docs/submodules/snaps b/docs/submodules/snaps -Subproject 49aaa5d61e87e11c5d5b9ce7dd2fa598f16b82a +Subproject 6229af39550dfef0b44f79b5d17c184bb098e69 diff --git a/docs/submodules/storperf b/docs/submodules/storperf -Subproject 383d814f0d5d4db55c6f4c621fa346ceab580a3 +Subproject 85f0bd5bfe83456a7e73fe12d2e3232c4f58e35 diff --git a/docs/submodules/vswitchperf b/docs/submodules/vswitchperf -Subproject 3e22a019e196996230b47d91b9c3c4893656a3c +Subproject 0549aa1f1a694899fec3b16b44230b5c60d2fa2 diff --git a/docs/submodules/yardstick b/docs/submodules/yardstick -Subproject 952ccb549c08b620a35f20ae809e0cea88ae4d9 +Subproject b00112e33caffee6b6b01402537e68007fdc8cb diff --git a/docs/testing/developer/devguide/dev-guide.rst b/docs/testing/developer/devguide/dev-guide.rst new file mode 100644 index 000000000..494c21e18 --- /dev/null +++ b/docs/testing/developer/devguide/dev-guide.rst @@ -0,0 +1,399 @@ +.. This work is licensed under a Creative Commons Attribution 4.0 International License. +.. SPDX-License-Identifier: CC-BY-4.0 + +*********************** +Testing developer guide +*********************** + +.. 
toctree:: + :numbered: + :maxdepth: 2 + + +============ +Introduction +============ + +The OPNFV testing ecosystem is wide. + +This guide provides guidelines for new developers +involved in test areas. + +For the description of the ecosystem, see `[DEV1]`_. + +================= +Developer journey +================= + +There are several ways to join test projects as a developer. In fact you may: + + * Develop new test cases + * Develop frameworks + * Develop tooling (reporting, dashboards, graphs, middleware,...) + * Troubleshoot results + * Post-process results + +These different tasks may be done within a specific project or as a shared +resource across the different projects. + +If you develop new test cases, the best practice is to contribute upstream as +much as possible. You may contact the testing group to find out which project - in +OPNFV or upstream - would be the best place to host the test cases. Such +contributions are usually directly connected to a specific project; more details +can be found in the user guides of the testing projects. + +Each OPNFV testing project provides test cases and the framework to manage them. +As a developer, you can contribute to them. The developer guide of +the testing projects shall indicate the procedure to follow. + +Tooling may be specific to a project or generic to all the projects. For +specific tooling, please refer to the test project user guide. The tooling used +by several test projects will be detailed in this document. + +The best event to meet the testing community is probably the plugfest. Such an +event is organized after each release. Most of the test projects are present. + +The summit is also a good opportunity to meet most of the actors `[DEV4]`_. + +Be involved in the testing group +================================ + +The testing group is a self-organized working group.
The OPNFV projects dealing +with testing are invited to participate in order to elaborate and consolidate a +consistent test strategy (test case definition, scope of projects, resources for +long duration, documentation, ...) and align tooling or best practices. + +A weekly meeting is organized; the agenda may be amended by any participant. +Two slots have been defined (US/Europe and APAC). Agendas and minutes are public. +See `[DEV3]`_ for details. +The testing group IRC channel is #opnfv-testperf. + +Best practices +============== + +Not all test projects have the same maturity and/or number of +contributors. The nature of the test projects may also differ. The +following best practices may not be accurate for all the projects and are only +indicative. Contact the testing group for further details. + + +Repository structure +-------------------- + +Most of the projects have a similar structure, which can be defined as follows:: + + `-- home + |-- requirements.txt + |-- setup.py + |-- tox.ini + | + |-- <project> + | |-- <api> + | |-- <framework> + | `-- <test cases> + | + |-- docker + | |-- Dockerfile + | `-- Dockerfile.aarch64.patch + |-- <unit tests> + `-- docs + |-- release + | |-- release-notes + | `-- results + `-- testing + |-- developer + | `-- devguide + |-- user + `-- userguide + + +API +--- +Test projects install tools and trigger tests. When possible, it +is recommended to implement an API in order to perform the different actions. + +Each test project should be able to expose and consume APIs from other test +projects. This pseudo micro-service approach should allow a flexible use of +the different projects and reduce the risk of overlapping. For example, if project A +provides an API to deploy a traffic generator, it is better to reuse it rather +than implementing a new way to deploy it. This approach has not been implemented +yet, but the prerequisite of exposing an API has already been met by +several test projects.
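The expose-and-consume idea above can be sketched in a few lines of Python. This is a toy illustration only: the endpoint path, payload, and the "project A"/"project B" roles are assumptions made for the example, not an existing OPNFV API.

```python
# Sketch of the pseudo micro-service approach: one test project exposes a
# deployment action over HTTP, another consumes it instead of re-implementing
# the deployment logic. Endpoint and payload are hypothetical.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


class TrafficGenAPI(BaseHTTPRequestHandler):
    """Toy API of 'project A', which knows how to deploy a traffic generator."""

    def do_POST(self):
        if self.path == "/api/v1/trafficgen/deploy":
            body = json.dumps({"status": "deployed"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Keep example output quiet.
        pass


def deploy_traffic_generator(base_url):
    """'Project B' reuses project A's API rather than re-implementing it."""
    with urlopen(base_url + "/api/v1/trafficgen/deploy", data=b"{}") as resp:
        return json.load(resp)


if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), TrafficGenAPI)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    result = deploy_traffic_generator("http://127.0.0.1:%d" % server.server_port)
    print(result["status"])
    server.shutdown()
```

The point is not the HTTP plumbing but the contract: as long as project A keeps the endpoint stable, any other test project can trigger the deployment without knowing how it is done.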
+ + +CLI +--- +Most of the test projects provide a Docker container as a deliverable. Once connected, it is +possible to prepare the environment and run tests through a CLI. + + +Dockerization +------------- +Dockerization has been introduced in Brahmaputra and adopted by most of the test +projects. Docker containers are pulled onto the jumphost of an OPNFV POD. +<TODO Jose/Mark/Alec> + +Code quality +------------ + +It is recommended to control the quality of the code of the testing projects, +and more precisely to implement some verifications before any merge: + * pep8 + * pylint + * unit tests (Python 2.7) + * unit tests (Python 3.5) + + +The code of the test project must be covered by unit tests. The coverage +shall be reasonable and not decrease when adding new features to the framework. +The use of tox is recommended. +It is possible to implement strict rules (no decrease of pylint score, unit +test coverage) on critical Python classes. + + +Third party tooling +------------------- + +Several test projects integrate third party tooling for code quality check +and/or traffic generation.
Some of the tools can be listed as follows: + ++---------------+----------------------+------------------------------------+ +| Project | Tool | Comments | ++===============+======================+====================================+ +| Bottlenecks | TODO | | ++---------------+----------------------+------------------------------------+ +| Functest | Tempest | OpenStack test tooling | +| | Rally | OpenStack test tooling | +| | Refstack | OpenStack test tooling | +| | RobotFramework | Used for ODL tests | ++---------------+----------------------+------------------------------------+ +| QTIP | Unixbench | | +| | RAMSpeed | | +| | nDPI | | +| | openSSL | | +| | inxi | | ++---------------+----------------------+------------------------------------+ +| Storperf | TODO | | ++---------------+----------------------+------------------------------------+ +| VSPERF | TODO | | ++---------------+----------------------+------------------------------------+ +| Yardstick | Moongen | Traffic generator | +| | Trex | Traffic generator | +| | Pktgen | Traffic generator | +| | IxLoad, IxNet | Traffic generator | +| | SPEC | Compute | +| | Unixbench | Compute | +| | RAMSpeed | Compute | +| | LMBench | Compute | +| | Iperf3 | Network | +| | Netperf | Network | +| | Pktgen-DPDK | Network | +| | Testpmd | Network | +| | L2fwd | Network | +| | Fio | Storage | +| | Bonnie++ | Storage | ++---------------+----------------------+------------------------------------+ + + +====================================== +Testing group configuration parameters +====================================== + +Testing categories +================== + +The testing group defined several categories also known as tiers. These +categories can be used to group test suites. 
+ ++----------------+-------------------------------------------------------------+ +| Category | Description | ++================+=============================================================+ +| Healthcheck | Simple and quick healthcheck tests case | ++----------------+-------------------------------------------------------------+ +| Smoke | Set of smoke test cases/suites to validate the release | ++----------------+-------------------------------------------------------------+ +| Features | Test cases that validate a specific feature on top of OPNFV.| +| | Those come from Feature projects and need a bit of support | +| | for integration | ++----------------+-------------------------------------------------------------+ +| Components | Tests on a specific component (e.g. OpenStack, OVS, DPDK,..)| +| | It may extend smoke tests | ++----------------+-------------------------------------------------------------+ +| Performance | Performance qualification | ++----------------+-------------------------------------------------------------+ +| VNF | Test cases related to deploy an open source VNF including | +| | an orchestrator | ++----------------+-------------------------------------------------------------+ +| Stress | Stress and robustness tests | ++----------------+-------------------------------------------------------------+ +| In Service | In service testing | ++----------------+-------------------------------------------------------------+ + +Testing domains +=============== + +The domains deal with the technical scope of the tests. It shall correspond to +domains defined for the certification program: + + * compute + * network + * storage + * hypervisor + * container + * vim + * mano + * vnf + * ... + +Testing coverage +================= +One of the goals of the testing working group is to identify the poorly covered +areas and avoid testing overlap. 
+Ideally, based on the declaration of the test cases through the tags, domain +and tier fields, it shall be possible to create heuristic maps. + + +======= +How-Tos +======= + +Where can I find information on the different test projects? +============================================================ +On http://docs.opnfv.org! A section is dedicated to the testing projects. You +will find the overview of the ecosystem and the links to the project documents. + +Another source is the testing wiki on https://wiki.opnfv.org/display/testing + +You may also contact the testing group on the IRC channel #opnfv-testperf or by +mail at test-wg AT lists.opnfv.org (testing group) or opnfv-tech-discuss AT +lists.opnfv.org (generic technical discussions). + + +How can I contribute to a test project? +======================================= +As with any project, the best solution is to contact the project. The project +members, with their email addresses, can be found under +https://git.opnfv.org/<project>/tree/INFO + +You may also send a mail to the testing mailing list or use the IRC channel +#opnfv-testperf + + +Where can I find hardware resources? +==================================== +You should discuss this topic with the project you are working with. If you need +access to an OPNFV community POD, it is possible to contact the infrastructure +group. Depending on your needs (scenario/installer/tooling), it should be +possible to find free time slots on one OPNFV community POD from the Pharos +federation. Create a JIRA ticket to describe your needs on +https://jira.opnfv.org/projects/INFRA. +You must already be an OPNFV contributor. See +https://wiki.opnfv.org/display/DEV/Developer+Getting+Started. + +Please note that lots of projects have their own "how to contribute" or +"get started" page on the OPNFV wiki. + + +How do I integrate my tests in CI? +================================== +It shall be discussed directly with the project you are working with.
It is +done through Jenkins jobs calling testing project files, but the way to onboard +cases differs from one project to another. + +How to declare my tests in the test Database? +============================================= +If you have access to the test API swagger (access granted to contributors), you +may use the swagger interface of the test API to declare your project. +The URL is http://testresults.opnfv.org/test/swagger/spec.html. + +.. figure:: ../../../images/swaggerUI.png + :align: center + :alt: Testing Group Test API swagger + +Click on *Spec* and the list of available methods will be displayed. + +.. figure:: ../../../images/API-operations.png + :align: center + :alt: Testing Group Test API swagger + +For the declaration of a new project, use the POST /api/v1/projects method. +For the declaration of new test cases in an existing project, use the POST + /api/v1/projects/{project_name}/cases method. + + .. figure:: ../../../images/CreateCase.png + :align: center + :alt: Testing group declare new test case + +How to push your results into the Test Database? +================================================ + +The test database is used to collect test results. By default it is +enabled only for CI tests from Production CI pods. + +Please note that it is possible to create your own local database. + +A dedicated database is for instance created for each plugfest. + +The architecture and associated API are described in the previous chapter. +If you want to push your results from CI, you just have to call the API +at the end of your script. + +You can also reuse a Python function defined in functest_utils.py `[DEV2]`_. + + +Where can I find the documentation on the test API? +=================================================== + +The Test API is now documented in this document (see sections above).
+You may also find autogenerated documentation at +http://artifacts.opnfv.org/releng/docs/testapi.html +A web portal is also under construction for certification at +http://testresults.opnfv.org/test/#/ + +I have tests, to which category should I declare them? +====================================================== +See the table above. + +The main ambiguity could be between features and VNF. +In fact sometimes you have to spawn VMs to demonstrate the capabilities of the +feature you introduced. +We recommend declaring your test in the feature category. + +The VNF category is dedicated to tests including: + + * creation of resources + * deployment of an orchestrator/VNFM + * deployment of the VNF + * test of the VNFM + * freeing of resources + +The goal is not to study a particular feature on the infrastructure but to have +a complete end-to-end test of a VNF automatically deployed in CI. +Moreover, VNF tests run in weekly jobs (once a week), while feature tests run in daily +jobs and are used to compute a scenario score. + +Where are the logs of CI runs? +============================== + +Logs and configuration files can be pushed to the artifact server from CI under +http://artifacts.opnfv.org/<project name> + + +========== +References +========== + +`[DEV1]`_: OPNFV Testing Ecosystem + +`[DEV2]`_: Python code sample to push results into the Database + +`[DEV3]`_: Testing group wiki page + +`[DEV4]`_: Conversation with the testing community, OPNFV Beijing Summit + +.. _`[DEV1]`: http://docs.opnfv.org/en/latest/testing/ecosystem/index.html +.. _`[DEV2]`: https://git.opnfv.org/functest/tree/functest/utils/functest_utils.py#176 +.. _`[DEV3]`: https://wiki.opnfv.org/display/meetings/Test+Working+Group+Weekly+Meeting +..
_`[DEV4]`: https://www.youtube.com/watch?v=f9VAUdEqHoA + +IRC support chan: #opnfv-testperf diff --git a/docs/testing/developer/devguide/index.rst b/docs/testing/developer/devguide/index.rst index 866885956..f661ed335 100644 --- a/docs/testing/developer/devguide/index.rst +++ b/docs/testing/developer/devguide/index.rst @@ -1,353 +1,13 @@ .. This work is licensed under a Creative Commons Attribution 4.0 International License. -.. SPDX-License-Identifier: CC-BY-4.0 +.. http://creativecommons.org/licenses/by/4.0 +.. (c) Christopher Price (Ericsson AB) -*********************** -Testing developer guide -*********************** +======================== +Test Framework Overview +======================== .. toctree:: - :numbered: :maxdepth: 2 - -============ -Introduction -============ - -The OPNFV testing ecosystem is wide. - -The goal of this guide consists in providing some guidelines for new developers -involved in test areas. - -For the description of the ecosystem, see `[1]`_. - - -================= -Developer journey -================= - -Be involved in the testing group -================================ - -Best practices -============== - -Unit tests ----------- - -Dockerization -------------- - -API ---- - -CLI ---- - -Traffic generators ------------------- - -Towards a pseudo micro services approach ----------------------------------------- - -====================================== -Testing group configuration parameters -====================================== - - -Testing categories -================== - -The testing group defined several categories also known as tiers. These -categories can be used to group test suites. 
- -+----------------+-------------------------------------------------------------+ -| Healthcheck | Simple and quick healthcheck tests case | -+----------------+-------------------------------------------------------------+ -| Smoke | Set of smoke test cases/suites to validate the release | -+----------------+-------------------------------------------------------------+ -| Features | Test cases that validate a specific feature on top of OPNFV.| -| | Those come from Feature projects and need a bit of support | -| | for integration | -+----------------+-------------------------------------------------------------+ -| Components | Tests on a specific component (e.g. OpenStack, OVS, DPDK,..)| -| | It may extend smoke tests | -+----------------+-------------------------------------------------------------+ -| Performance | Performance qualification | -+----------------+-------------------------------------------------------------+ -| VNF | Test cases related to deploy an open source VNF including | -| | an orchestrator | -+----------------+-------------------------------------------------------------+ -| Stress | Stress and robustness tests | -+----------------+-------------------------------------------------------------+ -| In Service | In service testing | -+----------------+-------------------------------------------------------------+ - -Testing domains -=============== - -The domains deal with the technical scope of the tests. It shall correspond to -domains defined for the certification program: - - * compute - * network - * storage - * hypervisor - * container - * vim - * mano - * vnf - * ... - -Testing coverage -================= -One of the goals of the testing working group is to identify the poorly covered -areas and avoid testing overlap. -Ideally based on the declaration of the test cases, through the tags, domains -and tier fields, it shall be possible to create heuristic maps. 
- - -============================== -Testing group generic enablers -============================== - - -TestAPI framework -================= - -The OPNFV testing group created a test collection database to collect -the test results from CI: - - - http://testresults.opnfv.org/test/swagger/spec.html - -Any test project running on any lab integrated in CI can push the -results to this database. -This database can be used to see the evolution of the tests and compare -the results versus the installers, the scenarios or the labs. -It is used to produce a dashboard with the current test status of the project. - - -Overall Architecture --------------------- -The Test result management can be summarized as follows:: - - +-------------+ +-------------+ +-------------+ - | | | | | | - | Test | | Test | | Test | - | Project #1 | | Project #2 | | Project #N | - | | | | | | - +-------------+ +-------------+ +-------------+ - | | | - V V V - +-----------------------------------------+ - | | - | Test Rest API front end | - | | - +-----------------------------------------+ - A | - | V - | +-------------------------+ - | | | - | | Test Results DB | - | | Mongo DB | - | | | - | +-------------------------+ - | - | - +----------------------+ - | | - | test Dashboard | - | | - +----------------------+ - -TestAPI description -------------------- -The TestAPI is used to declare pods, projects, test cases and test -results. Pods are the sets of bare metal or virtual servers and networking -equipments used to run the tests. - -The results pushed in the database are related to pods, projects and test cases. -If you try to push results of test done on non referenced pod, the API will -return an error message. - -An additional method dashboard has been added to post-process -the raw results in release Brahmaputra (deprecated in Colorado). 
- -The data model is very basic, 5 objects are created: - - * Pods - * Projects - * Testcases - * Results - * Scenarios - -The code of the API is hosted in the releng repository `[6]`_. -The static documentation of the API can be found at `[7]`_. -The TestAPI has been dockerized and may be installed locally in your -lab. See `[15]`_ for details. - -The deployment of the TestAPI has been automated. -A jenkins job manages: - - * the unit tests of the TestAPI - * the creation of a new docker file - * the deployment of the new TestAPI - * the archive of the old TestAPI - * the backup of the Mongo DB - -TestAPI Authorization -~~~~~~~~~~~~~~~~~~~~~ - -PUT/DELETE/POST operations of the TestAPI now require token based authorization. The token needs -to be added in the request using a header 'X-Auth-Token' for access to the database. - -e.g:: - headers['X-Auth-Token'] - -The value of the header i.e the token can be accessed in the jenkins environment variable -*TestApiToken*. The token value is added as a masked password. - -.. code-block:: python - - headers['X-Auth-Token'] = os.environ.get('TestApiToken') - -The above example is in Python. Token based authentication has been added so that only ci pods -jenkins job can have access to the database. - -Please note that currently token authorization is implemented but is not yet enabled. - -Reporting -========= - -An automatic reporting page has been created in order to provide a -consistent view of the scenarios. - -In this page, each scenario is evaluated according to test criteria. -The code for the automatic reporting is available at `[8]`_. - -The results are collected from the centralized database every day and, -per scenario. A score is calculated based on the results from the last -10 days. - -Dashboard -========= - -Dashboard is used to provide a consistent view of the results collected in CI. -The results showed on the dashboard are post processed from the Database, -which only contains raw results. 
- -It can be used in addition of the reporting page (high level view) to allow -the creation of specific graphs according to what the test owner wants to show. - -In Brahmaputra, a basic home made dashboard was created in Functest. -In Colorado, Yardstick adopted Grafana (time based graphs) and ELK (complex -graphs). -Since Danube, the testing community decided to adopt ELK framework and to rely -on bitergia. It was not implemented for Danube but it is planned for Euphrates. - -Bitergia already provides a dashboard for code and infrastructure. -A new Test tab will be added. The dataset will be built by consuming -the TestAPI. - -See `[3]`_ for details. - - -======= -How TOs -======= - -Where can I find information on the different test projects? -=========================================================== - - -How can I contribute to a test project? -======================================= - - -Where can I find hardware resources? -==================================== - - -How do I integrate my tests in CI? -================================== - - -How to declare my tests in the test Database? -============================================= - - -How to push your results into the Test Database? -================================================ - -The test database is used to collect test results. By default it is -enabled only for CI tests from Production CI pods. - -Please note that it is possible to create your own local database. - -A dedicated database is for instance created for each plugfest. - -The architecture and associated API is described in previous chapter. -If you want to push your results from CI, you just have to call the API -at the end of your script. - -You can also reuse a python function defined in functest_utils.py `[5]`_ - - -Where can I find the documentation on the test API? -=================================================== - -http://artifacts.opnfv.org/releng/docs/testapi.html - - - -I have tests, to which category should I declare them? 
-======================================================
-
-
-
-The main ambiguity could be between features and VNF.
-In fact sometimes you have to spawn VMs to demonstrate the capabilities of the
-feature you introduced.
-We recommend declaring your test in the feature category.
-
-The VNF category is really dedicated to tests including:
-
- * creation of resources
- * deployment of an orchestrator/VNFM
- * deployment of the VNF
- * test of the VNFM
- * freeing of resources
-
-The goal is not to study a particular feature on the infrastructure but to have
-a whole end to end test of a VNF automatically deployed in CI.
-Moreover, VNFs are run in weekly jobs (once a week), while feature tests are run
-in daily jobs and are used to compute a scenario score.
-
-Where are the logs of CI runs?
-==============================
-
-Logs and configuration files can be pushed to the artifact server from the CI
-under http://artifacts.opnfv.org/<project name>
-
-
-==========
-References
-==========
-
-_`[1]`: http://docs.opnfv.org/en/stable-danube/testing/ecosystem/overview.html
-
-_`[2]`: http://www.opnfv.org
-
-_`[3]`: https://wiki.opnfv.org/display/testing/Result+alignment+for+ELK+post-processing
-
-_`[4]`: https://wiki.opnfv.org/display/INF/CI+Scenario+Naming
-
-_`[5]`: https://git.opnfv.org/functest/tree/functest/utils/functest_utils.py#176
-
-_`[6]`: https://git.opnfv.org/functest/tree/releng
-
-_`[7]`: http://artifacts.opnfv.org/releng/docs/testapi.html
-
-
-IRC support channel: #opnfv-testperf
+   ./abstract
+   ./dev-guide
diff --git a/docs/testing/ecosystem/overview.rst b/docs/testing/ecosystem/overview.rst
index ed1657c87..42e38ac04 100644
--- a/docs/testing/ecosystem/overview.rst
+++ b/docs/testing/ecosystem/overview.rst
@@ -1,21 +1,21 @@
 .. This work is licensed under a Creative Commons Attribution 4.0 International License.
 .. SPDX-License-Identifier: CC-BY-4.0
-=============
-OPNFV testing
-=============
+======================
+OPNFV Testing Overview
+======================
 
 Introduction
 ============
 
-Testing is one of the key activities in OPNFV and includes unit, feature, component, system
-level testing for development, automated deployment, performance characterization or stress
-testing.
+Testing is one of the key activities in OPNFV and includes unit, feature,
+component, system level testing for development, automated deployment,
+performance characterization and stress testing.
 
 Test projects are dedicated to provide frameworks, tooling and test-cases categorized as
 functional, performance or compliance testing. Test projects fulfill different roles such as
 verifying VIM functionality, benchmarking components and platforms or analysis of measured
-KPIs for the scenarios released in OPNFV.
+KPIs for OPNFV release scenarios.
 
 Feature projects also provide their own test suites that either run independently or within a
 test project.
@@ -24,13 +24,10 @@ This document details the OPNFV testing ecosystem, describes common test components used
 by individual OPNFV projects and provides links to project specific documentation.
 
-OPNFV testing ecosystem
-=======================
-
-The testing projects
---------------------
+The OPNFV Testing Ecosystem
+===========================
 
-The OPNFV testing projects may be summarized as follows:
+The OPNFV testing projects are represented in the following diagram:
 
 .. figure:: ../../images/OPNFV_testing_working_group.png
    :align: center
@@ -92,13 +89,20 @@ The major testing projects are described in the table below:
 |                | pass/fail thresholds for test, staging, and production  |
 |                | NFVI environments.                                      |
 +----------------+---------------------------------------------------------+
-| VSperf         | This project provides a framework for automation of NFV |
-|                | data-plane performance testing and benchmarking. The    |
-|                | NFVI fast-path includes switch technology and network   |
-|                | with physical and virtual interfaces. VSperf can be     |
-|                | used to evaluate the suitability of different Switch    |
-|                | implementations and features, quantify data-path        |
-|                | performance and optimize platform configurations.       |
+| VSPERF         | VSPERF is an OPNFV project that provides an automated   |
+|                | test-framework and comprehensive test suite based on    |
+|                | Industry Test Specifications for measuring NFVI         |
+|                | data-plane performance. The data-path includes switching|
+|                | technologies with physical and virtual network          |
+|                | interfaces. The VSPERF architecture is switch and       |
+|                | traffic generator agnostic and test cases can be easily |
+|                | customized. Software versions and configurations        |
+|                | including the vSwitch (OVS or VPP) as well as the       |
+|                | network topology are controlled by VSPERF (independent  |
+|                | of OpenStack). VSPERF is used as a development tool for |
+|                | optimizing switching technologies, qualification of     |
+|                | packet processing components and for pre-deployment     |
+|                | evaluation of the NFV platform data-path.               |
 +----------------+---------------------------------------------------------+
 | Yardstick      | The goal of the Project is to verify the infrastructure |
 |                | compliance when running VNF applications. NFV Use Cases |
@@ -112,16 +116,27 @@ The major testing projects are described in the table below:
 +----------------+---------------------------------------------------------+
 
-===================================
-The testing working group resources
-===================================
+===============================
+Testing Working Group Resources
+===============================
 
-The assets
-==========
+Test Results Collection Framework
+=================================
 
-Overall Architecture
---------------------
-The Test result management can be summarized as follows::
+Any test project running in the global OPNFV lab infrastructure that is
+integrated with OPNFV CI can push test results to the community Test Database
+using a common Test API. This database can be used to track the evolution of
+testing and analyse test runs to compare results across installers, scenarios
+and between technically and geographically diverse hardware environments.
+
+Results from the database are used to generate a dashboard with the current test
+status for each testing project. Please note that you can also deploy the Test
+Database and Test API locally in your own environment.
+
+Overall Test Architecture
+-------------------------
+
+The management of test results can be summarized as follows::
 
   +-------------+    +-------------+    +-------------+
   |             |    |             |    |             |
@@ -149,14 +164,14 @@ The Test result management can be summarized as follows::
        |                          |
   +----------------------+   +----------------------+
   |                      |   |                      |
-  |  Testing Dashboards  |   |     Landing page     |
+  |  Testing Dashboards  |   |   Test Landing page  |
   |                      |   |                      |
   +----------------------+   +----------------------+
 
-The testing databases
----------------------
-A Mongo DB Database has been introduced for the Brahmaputra release.
+The Test Database
+-----------------
+A Mongo DB Database was introduced for the Brahmaputra release.
 The following collections are declared in this database:
  * pods: the list of pods used for production CI
  * projects: the list of projects providing test cases
@@ -164,21 +179,21 @@ The following collections are declared in this database:
  * results: the results of the test cases
  * scenarios: the OPNFV scenarios tested in CI
 
-This database can be used by any project through the testapi.
-Please note that projects may also use additional databases. This database is
-mainly used to collect CI results and scenario trust indicators.
+This database can be used by any project through the Test API.
+Please note that projects may also use additional databases. The Test
+Database is mainly used to collect CI test results and generate scenario
+trust indicators. The Test Database is cloned for OPNFV Plugfests in
+order to provide a private datastore only accessible to Plugfest participants.
 
-This database is also cloned for OPNFV Plugfest.
-
-The test API
-------------
+Test API description
+--------------------
 
 The Test API is used to declare pods, projects, test cases and test results.
 Pods correspond to the cluster of machines (3 controller and 2 compute nodes in
 HA mode) used to run the tests and defined in Pharos project.
 
-The results pushed in the database are related to pods, projects and test cases.
-If you try to push results of tests done on a non-referenced pod, the API will
-return an error message.
+The results pushed to the database are related to pods, projects and test cases.
+Trying to push results generated from a non-referenced pod will cause the
+Test API to return an error message.
 
 An additional method dashboard has been added to post-process the raw results in
 the Brahmaputra release (deprecated in Colorado release).
 
@@ -192,53 +207,110 @@ The data model is very basic, 5 objects are available:
 For detailed information, please go to
 http://artifacts.opnfv.org/releng/docs/testapi.html
 
+The code of the Test API is hosted in the releng repository `[TST2]`_.
+The static documentation of the Test API can be found at `[TST3]`_.
+The Test API has been dockerized and may be installed locally in your lab.
+
+The deployment of the Test API has been automated.
+A Jenkins job manages:
+
+ * the unit tests of the Test API
+ * the creation of a new Docker file
+ * the deployment of the new Test API
+ * the archiving of the old Test API
+ * the backup of the Mongo DB
+
+Test API Authorization
+----------------------
+
+PUT/DELETE/POST operations of the TestAPI now require token based authorization.
+The token needs to be added to the request using an 'X-Auth-Token' header for
+access to the database.
+
+e.g.::
+
+    headers['X-Auth-Token']
+
+The value of the header, i.e. the token, can be accessed in the Jenkins
+environment variable *TestApiToken*. The token value is added as a masked
+password.
+
+.. code-block:: python
+
+    headers['X-Auth-Token'] = os.environ.get('TestApiToken')
+
+The above example is in Python. Token based authentication has been added so
+that only CI pods running Jenkins jobs can access the database. Please note
+that currently token authorization is implemented but is not yet enabled.
 
-The reporting
--------------
+
+Test Project Reporting
+======================
 
 The reporting page for the test projects is
 http://testresults.opnfv.org/reporting/
 
 .. figure:: ../../images/reporting_page.png
    :align: center
    :alt: Testing group reporting page
 
-This page provides a reporting per OPNFV release and per testing project.
+This page provides reporting per OPNFV release and per testing project.
 
-.. figure:: ../../images/reporting_danube_page.png
+.. figure:: ../../images/reportingMaster.png
    :align: center
-   :alt: Testing group Danube reporting page
+   :alt: Testing group Euphrates reporting page
 
-An evolution of this page is planned.
-It was decided to unify the reporting by creating a landing page that should give
-the scenario status in one glance (it was previously consolidated manually
-on a wiki page).
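To make the token-based authorization described above concrete, here is a minimal Python sketch that builds an authenticated POST request to the Test API. Only the 'X-Auth-Token' header and the *TestApiToken* Jenkins variable come from the text; the endpoint path and the payload field names are illustrative assumptions, not the authoritative Test API schema:

```python
import json
import os
import urllib.request

# Hypothetical result payload -- the field names are illustrative,
# not the authoritative Test API data model.
result = {
    "project_name": "functest",
    "case_name": "vping_ssh",
    "pod_name": "lf-pod1",
    "criteria": "PASS",
}

# In CI the token comes from the masked Jenkins variable described above;
# the fallback value only lets the sketch run outside Jenkins.
token = os.environ.get("TestApiToken", "dummy-token")

request = urllib.request.Request(
    "http://testresults.opnfv.org/test/api/v1/results",  # assumed endpoint
    data=json.dumps(result).encode("utf-8"),
    headers={"X-Auth-Token": token, "Content-Type": "application/json"},
    method="POST",
)

# urllib.request.urlopen(request) would actually submit the result;
# it is deliberately not called here so the sketch has no side effects.
print(request.get_method())
```

Since token authorization is implemented but not yet enabled, requests without the header may still succeed against the current deployment.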
+An evolution of the reporting page is planned to unify test reporting by creating
+a landing page that shows the scenario status at a glance (this information was
+previously consolidated manually on a wiki page). The landing page will be
+displayed per scenario and show:
 
-The landing page (planned for Danube 2.0) will be displayed per scenario:
   * the status of the deployment
-  * the score of the test projectS
+  * the score from each test suite. There is no overall score; it is
+    determined by each test project.
   * a trust indicator
 
-Additional filters (version, installer, test collection time window,... ) are
-included.
 
-The test case catalog
----------------------
-Until the Colorado release, each testing project was managing the list of its
-test cases. It was very hard to have a global view of the available test cases
-among the different test projects. A common view was possible through the API
+Test Case Catalog
+=================
+Until the Colorado release, each testing project managed the list of its
+test cases. This made it very hard to have a global view of the available test
+cases from the different test projects. A common view was possible through the API
 but it was not very user friendly.
 
-In fact you may know all the cases per project calling:
+Test cases per project may be listed by calling:
 http://testresults.opnfv.org/test/api/v1/projects/<project_name>/cases
 with project_name: bottlenecks, functest, qtip, storperf, vsperf, yardstick
 
-It was decided to build a web site providing a consistent view of the test cases
-per project and allow any scenario owner to build his/her custom list of tests
-(Danube 2.0).
+A test case catalog has now been realized `[TST4]`_. Roll over a project to get
+its list of test cases; click on a test case to get more details.
+
+.. figure:: ../../images/TestcaseCatalog.png
+   :align: center
+   :alt: Testing group testcase catalog
 
-Other resources
+Test Dashboards
 ===============
+
+The Test Dashboard is used to provide a consistent view of the results collected in CI.
+The results shown on the dashboard are post-processed from the Database, which only
+contains raw results.
+The dashboard can be used in addition to the reporting page (high level view) to allow
+the creation of specific graphs according to what the test owner wants to show.
+
+In Brahmaputra, a basic dashboard was created in Functest.
+In Colorado, Yardstick used Grafana (time based graphs) and ELK (complex
+graphs).
+Since Danube, the OPNFV testing community decided to adopt the ELK framework and to
+use Bitergia for creating highly flexible dashboards `[TST5]`_.
+
+.. figure:: ../../images/DashboardBitergia.png
+   :align: center
+   :alt: Testing group dashboard
+
+
+OPNFV Test Group Information
+============================
+
+For more information or to participate in the OPNFV test community please see the
+following:
+
 wiki: https://wiki.opnfv.org/testing
 
 mailing list: test-wg@lists.opnfv.org
@@ -249,8 +321,9 @@ weekly meeting (https://wiki.opnfv.org/display/meetings/TestPerf):
   * Usual time: Every Thursday 15:00-16:00 UTC / 7:00-8:00 PST
   * APAC time: 2nd Wednesday of the month 8:00-9:00 UTC
 
+
 =======================
-Reference documentation
+Reference Documentation
 =======================
 
 +----------------+---------------------------------------------------------+
@@ -272,3 +345,20 @@ Reference documentation
 +----------------+---------------------------------------------------------+
 | Yardstick      | https://wiki.opnfv.org/display/yardstick/Yardstick      |
 +----------------+---------------------------------------------------------+
+
+
+`[TST1]`_: OPNFV web site
+
+`[TST2]`_: Test utils in Releng
+
+`[TST3]`_: TestAPI autogenerated documentation
+
+`[TST4]`_: Testcase catalog
+
+`[TST5]`_: Testing group dashboard
+
+.. _`[TST1]`: http://www.opnfv.org
+.. _`[TST2]`: https://git.opnfv.org/functest/tree/releng/utils/tests
+.. _`[TST3]`: http://artifacts.opnfv.org/releng/docs/testapi.html
+.. _`[TST4]`: http://testresults.opnfv.org/testing/index.html#!/select/visual
+.. _`[TST5]`: https://opnfv.biterg.io:443/goto/283dba93ca18e95964f852c63af1d1ba
diff --git a/docs/testing/testing-dev.rst b/docs/testing/testing-dev.rst
index e7b680044..7f56b7bfa 100644
--- a/docs/testing/testing-dev.rst
+++ b/docs/testing/testing-dev.rst
@@ -9,7 +9,10 @@ Testing Developer Guides
 Testing group
 -------------
 
-.. include:: ./developer/devguide/index.rst
+.. toctree::
+   :maxdepth: 1
+
+   ./developer/devguide/index
 
 Bottlenecks
 ------------
diff --git a/docs/testing/testing-user.rst b/docs/testing/testing-user.rst
index 198b090e6..6b533a26d 100644
--- a/docs/testing/testing-user.rst
+++ b/docs/testing/testing-user.rst
@@ -7,6 +7,9 @@
 Testing User Guides
 ===================
 
+This page provides the links to the installation, configuration and user guides
+of the different test projects.
+
 Bottlenecks
 ------------
 .. toctree::
@@ -60,6 +63,3 @@ Yardstick
 
    ../submodules/yardstick/docs/testing/user/configguide/index
    ../submodules/yardstick/docs/testing/user/userguide/index
-
-
-
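The per-project test case listing endpoint mentioned in the overview (http://testresults.opnfv.org/test/api/v1/projects/<project_name>/cases) can also be consumed programmatically. A minimal sketch, assuming only the URL pattern documented above (the `cases_url` helper name is hypothetical):

```python
import urllib.parse

# URL pattern documented for the Test API test case listing.
CASES_URL = "http://testresults.opnfv.org/test/api/v1/projects/{project}/cases"

def cases_url(project_name):
    """Build the Test API URL listing the test cases of one project."""
    return CASES_URL.format(project=urllib.parse.quote(project_name))

# Project names accepted by the API, as listed in the overview.
for name in ("bottlenecks", "functest", "qtip", "storperf", "vsperf", "yardstick"):
    print(cases_url(name))

# urllib.request.urlopen(cases_url("functest")) would fetch the JSON
# payload; the call is omitted here to keep the sketch network-free.
```

Fetching one of these URLs returns the JSON description of the project's test cases, which is what the test case catalog web page renders.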