From 8a4427306c179cbf469d9d6e689a6e5427d50ea4 Mon Sep 17 00:00:00 2001 From: Morgan Richomme Date: Wed, 29 Mar 2017 09:33:39 +0200 Subject: Update Testing ecosystem documentation - new ecosystem figure - add reporting figure - indicate next steps for Danube 2.0 pending question (cperf) Change-Id: I3dcf5a4874f0585f433a693117764641aab8faf8 Signed-off-by: Morgan Richomme --- docs/testing/ecosystem/overview.rst | 154 ++++++++++-------------------------- 1 file changed, 43 insertions(+), 111 deletions(-) (limited to 'docs/testing/ecosystem/overview.rst') diff --git a/docs/testing/ecosystem/overview.rst b/docs/testing/ecosystem/overview.rst index a895d1ccc..7d793f4b4 100644 --- a/docs/testing/ecosystem/overview.rst +++ b/docs/testing/ecosystem/overview.rst @@ -12,7 +12,7 @@ Testing is one of the key activities in OPNFV and includes unit, feature, compon level testing for development, automated deployment, performance characterization or stress testing. -Test projects are dedicated to provide frameworks, tooling and test-cases catagorized as +Test projects are dedicated to provide frameworks, tooling and test-cases categorized as functional, performance or compliance testing. Test projects fulfill different roles such as verifying VIM functionality, benchmarking components and platforms or analysis of measured KPIs for the scenarios released in OPNFV. @@ -32,7 +32,7 @@ The testing projects The OPNFV testing projects may be summarized as follows: -.. figure:: https://wiki.opnfv.org/download/attachments/8688867/EcoSystem%20Copy.png +.. figure:: ../../images/OPNFV_testing_working_group.png :align: center :alt: Overview of OPNFV Testing projects @@ -169,136 +169,68 @@ This database is also cloned for OPNFV Plugfest. The test API ------------ -The Test API is used to declare pods, projects, test cases and test -results. Pods are the pods used to run the tests. -The results pushed in the database are related to pods, projects and -cases. 
If you try to push results of test done on non referenced pod,
-the API will return an error message.
+The Test API is used to declare pods, projects, test cases and test results.
+Pods correspond to the cluster of machines (3 controller and 2 compute nodes in
+HA mode) used to run the tests, as defined in the Pharos project.
+The results pushed in the database are related to pods, projects and cases.
+If you try to push results of a test run on a non-referenced pod, the API will
+return an error message.
 
-An additional method dashboard has been added to post-process
-the raw results in the Brahmaputra release (deprecated in Colorado release).
-
-The data model is very basic, 4 objects are created:
+An additional dashboard method was added to post-process the raw results in
+the Brahmaputra release (deprecated in the Colorado release).
+The data model is very basic; 5 objects are available:
 
  * Pods
  * Projects
  * Testcases
  * Results
+ * Scenarios
 
-Pods::
-
-    {
-      "id": ,
-      "details": ,
-      "creation_date": "YYYY-MM-DD HH:MM:SS",
-      "name": ,
-      "mode": ,
-      "role":
-    },
-
-Projects::
-
-    {
-      "id": ,
-      "name": ,
-      "creation_date": "YYYY-MM-DD HH:MM:SS",
-      "description":
-    },
-
-Testcases::
-
-    {
-      "id": ,
-      "name":,
-      "project_name":,
-      "creation_date": "YYYY-MM-DD HH:MM:SS",
-      "description": ,
-      "url":
-    },
-
-Results::
-
-    {
-      "_id": ,
-      "case_name": ,
-      "project_name": ,
-      "pod_name": ,
-      "installer": ,
-      "version": ,
-      "start_date": "YYYY-MM-DD HH:MM:SS",
-      "stop_date": "YYYY-MM-DD HH:MM:SS",
-      "build_tag": ,
-      "scenario": ,
-      "criteria": ,
-      "trust_indicator": {
-        "current": 0,
-        "histories": []
-      }
-    }
-
- Scenarios::
-
-    {
-      "id": ,
-      "name" : "os-odl_l2-nofeature-ha",
-      "installers":[
-        {
-          "installer" : ,
-          "versions": [
-            {
-              "version": ,
-              "owner": ,
-              "custom list": { "projects": [{
-                "functest" : [ "vping_ssh", "vping_userdata", "tempest_smoke_serial", "rally_sanity", "odl", "doctor"],
-                "yardstick" : [
"tc002","tc005","tc010","tc011","tc012","tc014","tc037","tc055","tc063","tc069","tc070","tc071","tc072","tc075"]}]}, - "score": { "projects": [{ - "functest" : [{"date": YYY-MM-DD HH:MM, "score":}, {"date": YYY-MM-DD HH:MM, "score":}, ...], - "yardstick" : [{"date": YYY-MM-DD HH:MM, "score":}, {"date": YYY-MM-DD HH:MM, "score":}, ...]}]}, - "trust_indicator": { "projects": [{ - "functest" : [{"date": YYY-MM-DD HH:MM,"status":}, {"date": YYY-MM-DD HH:MM,"status":},...], - "yardstick" : [{"date": YYY-MM-DD HH:MM,"status":}, {"date": YYY-MM-DD HH:MM,"status":},...]}]}}, - { .... - }, - -For detailed information, please go to - - http://testresults.opnfv.org/test/swagger/spec.html - - Authentication: opnfv/api@opnfv - -Please notes that POST/DELETE/PUT operations for test or study purpose via -swagger website is not allowed, because it will change the real data in -the database. +For detailed information, please go to http://artifacts.opnfv.org/releng/docs/testapi.html The reporting ------------- -Until the Colorado release, each testing project was reporting a status on a dedicated page. +The reporting page for the test projects is http://testresults.opnfv.org/reporting/ + +.. figure:: ../../images/reporting_page.png + :align: center + :alt: Testing group reporting page + +This page provides a reporting per OPNFV release and per testing project. + +.. figure:: ../../images/reporting_danube_page.png + :align: center + :alt: Testing group Danube reporting page + +An evolution of this page is planned. It was decided to unify the reporting by creating a landing page that should give the scenario status in one glance (it was previously consolidated manually -on a wiki page). The landing page will be display per scenario: +on a wiki page). + +The landing page (planned for Danube 2.0) will be displayed per scenario: * the status of the deployment * the score of the test projectS * a trust indicator -Additional filters (version, installer, test collection time window,... 
) +Additional filters (version, installer, test collection time window,... ) are +included. + +The test case catalog +--------------------- +Until the Colorado release, each testing project was managing the list of its +test cases. It was very hard to have a global view of the available test cases +among the different test projects. A common view was possible through the API +but it was not very user friendly. +In fact you may know all the cases per project calling: -This landing page has been dockerized. The back end relies on the testing DB. + http://testresults.opnfv.org/test/api/v1/projects//cases - TODO: add picture +with project_name: bottlenecks, functest, qtip, storperf, vsperf, yardstick -The test case catalog ----------------------- -Until the Colorado release, each testing project was managing the list of its test cases. It -was very hard to have a global view of the available test cases among the -different test projects. A common view was possible through the API but it was -not very user friendly. It was decided to build a web site providing a consistent view of the test cases -per project and allow any scenario owner to build his/her custom list of tests. -The test catalog can be described as below:: - - TODO: add picture +per project and allow any scenario owner to build his/her custom list of tests +(Danube 2.0). Other resources =============== -- cgit 1.2.3-korg