From 1b4f8bbedbdc81afdb8dfab009396fad92a0a3d4 Mon Sep 17 00:00:00 2001
From: ChristopherPrice
Date: Fri, 29 Jan 2016 20:08:03 +0100
Subject: Adding structure to pull content into central docs.

Including minor fixes and corrections.

Change-Id: I7c4bd9ca4f5fb35aeb2350096520c66b11aecbe8
Signed-off-by: ChristopherPrice
(cherry picked from commit 63c77f27ec173fc3684ed980e82a4a7ed97beb6b)
---
 docs/userguide/description.rst |  75 +++++++++++++
 docs/userguide/index.rst       | 247 +++--------------------------------------
 docs/userguide/runfunctest.rst | 140 +++++++++++++++++++++++
 3 files changed, 232 insertions(+), 230 deletions(-)
 create mode 100644 docs/userguide/description.rst
 create mode 100644 docs/userguide/runfunctest.rst

diff --git a/docs/userguide/description.rst b/docs/userguide/description.rst
new file mode 100644
index 000000000..a8fdcd0f2
--- /dev/null
+++ b/docs/userguide/description.rst
@@ -0,0 +1,75 @@
+Description of the test cases
+=============================
+
+Functest is an OPNFV project dedicated to functional testing.
+In continuous integration, it is launched after a fresh OPNFV installation.
+The Functest target is to verify the basic functions of the infrastructure.
+
+Functest includes different test suites, each containing several test cases.
+Test cases are developed in Functest and in feature projects.
+
+The current list of test suites can be grouped into 3 main domains::
+
+ +----------------+----------------+--------------------------------------------+
+ | Domain         | Test suite     | Comments                                   |
+ +================+================+============================================+
+ |                | vPing          | NFV "Hello World"                          |
+ |                +----------------+--------------------------------------------+
+ | VIM            | vPing_userdata | Ping using userdata and cloud-init         |
+ |                |                | mechanism                                  |
+ |                +----------------+--------------------------------------------+
+ |(Virtualised    | Tempest        | OpenStack reference test suite `[2]`_      |
+ | Infrastructure +----------------+--------------------------------------------+
+ | Manager)       | Rally scenario | OpenStack testing tool testing OpenStack   |
+ |                |                | modules `[3]`_                             |
+ +----------------+----------------+--------------------------------------------+
+ |                | OpenDaylight   | OpenDaylight Test suite                    |
+ |                +----------------+--------------------------------------------+
+ | Controllers    | ONOS           | Test suite of ONOS L2 and L3 functions     |
+ |                +----------------+--------------------------------------------+
+ |                | OpenContrail   |                                            |
+ +----------------+----------------+--------------------------------------------+
+ | Features       | vIMS           | Shows the capability to deploy a real NFV  |
+ |                |                | test case.                                 |
+ |                |                | The IP Multimedia Subsystem is a typical   |
+ |                |                | Telco test case, referenced by ETSI.       |
+ |                |                | It provides a fully functional VoIP System.|
+ |                +----------------+--------------------------------------------+
+ |                | Promise        | Resource reservation and management project|
+ |                |                | to identify NFV related requirements and   |
+ |                |                | realize resource reservation for future    |
+ |                |                | usage by capacity management of resource   |
+ |                |                | pools regarding compute, network and       |
+ |                |                | storage.                                   |
+ |                +----------------+--------------------------------------------+
+ |                | SDNVPN         |                                            |
+ +----------------+----------------+--------------------------------------------+
+
+
+Most of the test suites are developed upstream.
+For example, Tempest `[2]`_ is the OpenStack integration test suite.
+Functest is in charge of the integration of different functional test suites.
+
+The Tempest suite has been customized but no new test cases have been created.
+Some OPNFV feature projects (e.g. SDNVPN) have created Tempest test cases and
+pushed them upstream.
+
+The tests run from CI are pushed into a database.
+The goal is to populate the database with results and to show them on a Test
+Dashboard.
+
+There is no real notion of Test domain or Test coverage yet.
+Basic components (VIM, controllers) are tested through their own suites.
+Feature projects also provide their own test suites.
+
+The vIMS test case was integrated to demonstrate the capability to deploy a
+relatively complex NFV scenario on top of the OPNFV infrastructure.
+
+Functest considers OPNFV as a black box.
+OPNFV, since Brahmaputra, offers many possible combinations:
+
+  * 3 controllers (OpenDaylight, ONOS, OpenContrail)
+  * 4 installers (Apex, Compass, Fuel, Joid)
+
+However, most of the tests shall be runnable on any configuration.
diff --git a/docs/userguide/index.rst b/docs/userguide/index.rst
index ba20507b1..cf36d9e80 100644
--- a/docs/userguide/index.rst
+++ b/docs/userguide/index.rst
@@ -17,81 +17,7 @@ A presentation has been created for the first OPNFV Summit `[4]`_.
 
 It is assumed that Functest container has been properly installed `[1]`_.
 
-
-
-Description of the test cases
-=============================
-
-Functest is an OPNFV project dedicated to functional testing.
-In the continuous integration, it is launched after an OPNFV fresh installation.
-The Functest target is to verify the basic functions of the infrastructure.
-
-Functest includes different test suites which several test cases within.
-Test cases are developed in Functest and in feature projects.
-
-The current list of test suites can be distributed in 3 main domains::
-
- +----------------+----------------+--------------------------------------------+
- | Method         | Test suite     | Comments                                   |
- +================+================+============================================+
- |                | vPing          | NFV "Hello World"                          |
- |                +----------------+--------------------------------------------+
- | VIM            | vPing_userdata | Ping using userdata and cloud-init         |
- |                |                | mechanism                                  |
- |                +----------------+--------------------------------------------+
- |(Virtualised    | Tempest        | OpenStack reference test suite `[2]`_      |
- | Infrastructure +----------------+--------------------------------------------+
- | Manager)       | Rally scenario | OpenStack testing tool testing OpenStack   |
- |                |                | modules `[3]`_                             |
- +----------------+----------------+--------------------------------------------+
- |                | OpenDaylight   | Opendaylight Test suite                    |
- |                +----------------+--------------------------------------------+
- | Controllers    | ONOS           | Test suite of ONOS L2 and L3 functions     |
- |                +----------------+--------------------------------------------+
- |                | OpenContrail   |                                            |
- +----------------+----------------+--------------------------------------------+
- | Features       | vIMS           | Show the capability to deploy a real NFV   |
- |                |                | test cases.                                |
- |                |                | The IP Multimedia Subsytem is a typical    |
- |                |                | Telco test case, referenced by ETSI.       |
- |                |                | It provides a fully functional VoIP System.|
- |                +----------------+--------------------------------------------+
- |                | Promise        | Resource reservation and management project|
- |                |                | to identify NFV related requirements and   |
- |                |                | realize resource reservation for future    |
- |                |                | usage by capacity management of resource   |
- |                |                | pools regarding compute, network and       |
- |                |                | storage.
| - | +----------------+--------------------------------------------+ - | | SDNVPN | | - +----------------+----------------+--------------------------------------------+ - - -Most of the test suites are developed upstream. -For example, Tempest `[2]`_ is the OpenStack integration test suite. -Functest is in charge of the integration of different functional test suites. - -The Tempest suite has been customized but no new test cases have been created. -Some OPNFV feature projects (.e.g. SDNVPN) have created Tempest tests cases and -pushed to upstream. - -The tests run from CI are pushed into a database. -The goal is to populate the database with results and to show them on a Test -Dashboard. - -There is no real notion of Test domain or Test coverage yet. -Basic components (VIM, controllers) are tested through their own suites. -Feature projects also provide their own test suites. - -vIMS test case was integrated to demonstrate the capability to deploy a -relatively complex NFV scenario on top of the OPNFV infrastructure. - -Functest considers OPNFV as a black box. -OPNFV, since Brahmaputra, offers lots of possible combinations: - - * 3 controllers (OpenDayligh, ONOS, OpenContrail) - * 4 installers (Apex, Compass, Fuel, Joid) - -However most of the tests shall be runnable on any configuration. +.. include:: ./description.rst The different scenarios are described in the section hereafter. @@ -177,7 +103,7 @@ Tempest has batteries of tests for: * OpenStack API validation * Scenarios - * other specific tests useful in validating an OpenStack deployment + * Other specific tests useful in validating an OpenStack deployment We use Rally `[3]`_ to run Tempest suite. Rally generates automatically tempest.conf configuration file. @@ -187,10 +113,10 @@ When the Tempest suite is run, each test duration is measured. The full console output is stored in the tempest.log file. As an addition of Arno, Brahmaputra runs a customized set of Tempest test cases. -The list is specificed through --tests-file when running Rally. +The list is specificed through *--tests-file* when running Rally. This option has been introduced in Rally in version 0.1.2. -The customized test list is available in the Functest repo `[4]`_ +The customized test list is available in the Functest repo `[4]`_. This list contains more than 200 Tempest test cases. The list can be divied into two main parts: @@ -416,145 +342,7 @@ include:: flavor_ram: 512 flavor_disk: 0 - -Manual testing -============== - -Once the Functest docker container is running and Functest environment ready -(through /home/opnfv/repos/functest/docker/prepare_env.sh script), the system is -ready to run the tests. - -The script run_tests.sh is located in $repos_dir/functest/docker and it has -several options:: - - ./run_tests.sh -h - Script to trigger the tests automatically. - - usage: - bash run_tests.sh [--offline] [-h|--help] [-t ] - - where: - -h|--help show this help text - -r|--report push results to database (false by default) - -n|--no-clean do not clean up OpenStack resources after test run - -t|--test run specific set of tests - one or more of the following: vping,vping_userdata,odl,rally,tempest,vims,onos,promise. Separated by comma. - - examples: - run_tests.sh - run_tests.sh --test vping,odl - run_tests.sh -t tempest,rally --no-clean - -The -r option is used by the Continuous Integration in order to push the test -results into a test collection database, see in next section for details. 
-In manual mode, you must not use it, your try will be anyway probably rejected -as your POD must be declared in the database to collect the data. - -The -n option is used for preserving all the existing OpenStack resources after -execution test cases. - -The -t option can be used to specify the list of test you want to launch, by -default Functest will try to launch all its test suites in the following order -vPing, odl, Tempest, vIMS, Rally. -You may launch only one single test by using -t - -Within Tempest test suite you can define which test cases you want to execute in -your environment by editing test_list.txt file before executing run_tests.sh -script. - -Please note that Functest includes cleaning mechanism in order to remove -everything except what was present after a fresh install. -If you create your own VMs, tenants, networks etc. and then launch Functest, -they all will be deleted after executing the tests. Use --no-clean option with -run_test.sh in order to preserve all the existing resources. -However, be aware that Tempest and Rally create of lot of resources (users, -tenants, networks, volumes etc.) that are not always properly cleaned, so this -cleaning function has been set to keep the system as clean as possible after a -full Functest run. - -You may also add you own test by adding a section into the function run_test() - - -Automated testing -================= - -As mentioned in `[1]`, the prepare-env.sh and run_test.sh can be executed within -the container from jenkins. -2 jobs have been created, one to run all the test and one that allows testing -test suite by test suite. -You thus just have to launch the acurate jenkins job on the target lab, all the -tests shall be automatically run. - -When the tests are automatically started from CI, a basic algorithm has been -created in order to detect whether the test is runnable or not on the given -scenario. -In fact, one of the most challenging task in Brahmaputra consists in dealing -with lots of scenario and installers. -Functest test suites cannot be systematically run (e.g. run the ODL suite on an -ONOS scenario). - -CI provides several information: - - * The installer (apex|compass|fuel|joid) - * The scenario [controller]-[feature]-[mode] with - - * controller = (odl|onos|ocl|nosdn) - * feature = (ovs(dpdk)|kvm) - * mode = (ha|noha) - -Constraints per test case are defined in the Functest configuration file -/home/opnfv/functest/config/config_functest.yaml:: - - test-dependencies: - functest: - vims: - scenario: '(ocl)|(odl)|(nosdn)' - vping: - vping_userdata: - scenario: '(ocl)|(odl)|(nosdn)' - tempest: - rally: - odl: - scenario: 'odl' - onos: - scenario: 'onos' - .... - -At the end of the Functest environment creation (prepare_env.sh see `[1]`_), a -file (/home/opnfv/functest/conf/testcase-list.txt) is created with the list of -all the runnable tests. -We consider the static constraints as regex and compare them with the scenario. -For instance, odl can be run only on scenario including odl in its name. 
- -The order of execution is also described in the Functest configuration file:: - - test_exec_priority: - - 1: vping - 2: vping_userdata - 3: tempest - 4: odl - 5: onos - 6: ovno - #7: doctor - 8: promise - 9: odl-vpnservice - 10: bgpvpn - #11: openstack-neutron-bgpvpn-api-extension-tests - 12: vims - 13: rally - -The tests are executed as follow: - - * Basic scenario (vPing, vPing_userdata, Tempest) - * Controller suites: ODL or ONOS or OpenContrail - * Feature projects - * vIMS - * Rally (benchmark scenario) - -At the end of an automated execution, everything is cleaned. -We keep only the users/networks that have been statically declared in '[9]'_ - +.. include:: ./runfunctest.rst Test results ============ @@ -733,7 +521,7 @@ The results of ODL tests can be seen in the console:: * log.html * report.html - ODL result page +**ODL result page** .. figure:: ../images/functestODL.png :width: 170mm @@ -744,8 +532,8 @@ The results of ODL tests can be seen in the console:: ONOS ^^^^ -The ONOS test logs can be found in OnosSystemTest/TestON/logs -(ONOSCI_PATH to be added),and also can be seen in the console:: +The ONOS test logs can be found in OnosSystemTest/, and TestON/, and logs/ +(ONOSCI_PATH to be added), and can also be seen in the console:: ****************************** Result summary for Testcase4 @@ -902,7 +690,7 @@ steps: * INFO: environment prepared successfully => environment OK * INFO - Cloudify-manager server is UP ! => orchestrator deployed - * INFO The deployment of clearwater-opnfv is ended => VNF deployed + * INFO - The deployment of clearwater-opnfv is ended => VNF deployed * Multiple Identities (UDP) - (6505550771, 6505550675) Passed => tests run * DEBUG - Pushing results to DB.... => tests saved @@ -1035,7 +823,7 @@ But all were not ready for Brahmaputra. 
The API can described as follow: -**Version:**:: +Version: +--------+--------------------------+------------------------------------------+ | Method | Path | Description | @@ -1044,7 +832,7 @@ The API can described as follow: +--------+--------------------------+------------------------------------------+ -**Pods:**:: +Pods: +--------+--------------------------+------------------------------------------+ | Method | Path | Description | @@ -1059,7 +847,7 @@ The API can described as follow: | | | } | +--------+--------------------------+------------------------------------------+ -**Projects:**:: +Projects: +--------+--------------------------+------------------------------------------+ | Method | Path | Description | @@ -1087,7 +875,7 @@ The API can described as follow: +--------+--------------------------+------------------------------------------+ -**Test cases:**:: +Test cases: +--------+--------------------------+------------------------------------------+ | Method | Path | Description | @@ -1105,7 +893,7 @@ The API can described as follow: | | | } | +--------+--------------------------+------------------------------------------+ | PUT | /test_projects/{project}?| Modify a test case of {project} | - | | case_name={case} | | + | | case_name={case} | | | | | Content-Type: application/json | | | | { | | | | | @@ -1113,9 +901,9 @@ The API can described as follow: +--------+--------------------------+------------------------------------------+ | DELETE | /test_projects/{project}/| Delete a test case | | | case_name={case} | | - +----------------+----------------+--------------------------------------------+ + +--------+--------------------------+------------------------------------------+ -**Test Results:**:: +Test Results: +--------+--------------------------+------------------------------------------+ | Method | Path | Description | @@ -1155,7 +943,7 @@ The API can described as follow: +--------+--------------------------+------------------------------------------+ -**Dashboard:**:: +Dashboard: +--------+--------------------------+------------------------------------------+ | Method | Path | Description | @@ -1292,4 +1080,3 @@ IRC support chan: #opnfv-testperf .. _`Rally installation procedure`: https://rally.readthedocs.org/en/latest/tutorial/step_0_installation.html .. _`config_test.py` : https://git.opnfv.org/cgit/functest/tree/testcases/config_functest.py .. _`config_functest.yaml` : https://git.opnfv.org/cgit/functest/tree/testcases/config_functest.yaml - diff --git a/docs/userguide/runfunctest.rst b/docs/userguide/runfunctest.rst new file mode 100644 index 000000000..8125e8149 --- /dev/null +++ b/docs/userguide/runfunctest.rst @@ -0,0 +1,140 @@ +Executing the functest suites +============================= + +Manual testing +-------------- + +Once the Functest docker container is running and Functest environment ready +(through /home/opnfv/repos/functest/docker/prepare_env.sh script), the system is +ready to run the tests. + +The script *run_tests.sh* is located in $repos_dir/functest/docker and it has +several options:: + + ./run_tests.sh -h + Script to trigger the tests automatically. + + usage: + bash run_tests.sh [--offline] [-h|--help] [-t ] + + where: + -h|--help show this help text + -r|--report push results to database (false by default) + -n|--no-clean do not clean up OpenStack resources after test run + -t|--test run specific set of tests + one or more of the following: vping,vping_userdata,odl,rally,tempest,vims,onos,promise. Separated by comma. 
+
+    examples:
+      run_tests.sh
+      run_tests.sh --test vping,odl
+      run_tests.sh -t tempest,rally --no-clean
+
+The *-r* option is used by Continuous Integration to push the test results into
+a test collection database; see the next section for details.
+In manual mode you should not use it: your attempt will most likely be rejected,
+as your POD must be declared in the database before results can be collected.
+
+The *-n* option preserves all the existing OpenStack resources after the test
+cases have been executed.
+
+The *-t* option can be used to specify the list of tests you want to launch; by
+default Functest will try to launch all its test suites in the following order:
+vPing, odl, Tempest, vIMS, Rally.
+You may launch a single test suite by passing its name to *-t*.
+
+Within the Tempest test suite you can define which test cases to execute in
+your environment by editing the test_list.txt file before executing the
+*run_tests.sh* script.
+
+Please note that Functest includes a cleaning mechanism in order to remove
+everything except what was present after a fresh install.
+If you create your own VMs, tenants, networks etc. and then launch Functest,
+they will all be deleted after the tests have been executed. Use the
+*--no-clean* option with *run_tests.sh* in order to preserve all the existing
+resources.
+However, be aware that Tempest and Rally create a lot of resources (users,
+tenants, networks, volumes etc.) that are not always properly cleaned up, so
+this cleaning function has been set to keep the system as clean as possible
+after a full Functest run.
+
+You may also add your own test by adding a section to the function run_test().
+
+
+Automated testing
+-----------------
+
+As mentioned in `[1]`_, the *prepare_env.sh* and *run_tests.sh* scripts can be
+executed within the container from Jenkins.
+Two jobs have been created: one to run all the tests, and one that allows
+running the test suites one by one.
+You thus just have to launch the appropriate Jenkins job on the target lab, and
+all the tests shall be run automatically.
+
+When the tests are automatically started from CI, a basic algorithm detects
+whether a given test is runnable or not on the given scenario.
+In fact, one of the most challenging tasks in Brahmaputra consists in dealing
+with many scenarios and installers.
+Functest test suites cannot be run systematically on every combination (e.g.
+the ODL suite cannot be run on an ONOS scenario).
+
+CI provides several pieces of information:
+
+  * The installer (apex|compass|fuel|joid)
+  * The scenario [controller]-[feature]-[mode] with
+
+    * controller = (odl|onos|ocl|nosdn)
+    * feature = (ovs(dpdk)|kvm)
+    * mode = (ha|noha)
+
+Constraints per test case are defined in the Functest configuration file
+/home/opnfv/functest/config/config_functest.yaml::
+
+    test-dependencies:
+        functest:
+            vims:
+                scenario: '(ocl)|(odl)|(nosdn)'
+            vping:
+            vping_userdata:
+                scenario: '(ocl)|(odl)|(nosdn)'
+            tempest:
+            rally:
+            odl:
+                scenario: 'odl'
+            onos:
+                scenario: 'onos'
+            ....
+
+At the end of the Functest environment creation (prepare_env.sh, see `[1]`_), a
+file (/home/opnfv/functest/conf/testcase-list.txt) is created with the list of
+all the runnable tests.
+We consider the static constraints as regular expressions and compare them with
+the scenario name. For instance, odl can be run only on scenarios that include
+odl in their name.
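+
+The following sketch is not part of the Functest code base; it only illustrates
+how such a constraint could be evaluated in Python, assuming the
+*test-dependencies* section above has been loaded from config_functest.yaml
+with PyYAML::
+
+    import re
+
+    import yaml
+
+
+    def is_runnable(test_name, scenario, config_file="config_functest.yaml"):
+        """Return True if the test has no constraint or its regex matches the scenario."""
+        with open(config_file) as config:
+            deps = yaml.safe_load(config)["test-dependencies"]["functest"]
+        constraint = (deps.get(test_name) or {}).get("scenario")
+        # No declared constraint means the test is runnable on any scenario.
+        if not constraint:
+            return True
+        return re.search(constraint, scenario) is not None
+
+
+    # The odl suite only runs on scenarios whose name contains 'odl'.
+    print(is_runnable("odl", "odl-nofeature-ha"))     # True
+    print(is_runnable("odl", "onos-nofeature-ha"))    # False
+    print(is_runnable("vping", "onos-nofeature-ha"))  # True (no constraint)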
+
+The order of execution is also described in the Functest configuration file::
+
+    test_exec_priority:
+
+        1: vping
+        2: vping_userdata
+        3: tempest
+        4: odl
+        5: onos
+        6: ovno
+        7: doctor
+        8: promise
+        9: odl-vpnservice
+        10: bgpvpn
+        11: openstack-neutron-bgpvpn-api-extension-tests
+        12: vims
+        13: rally
+
+The tests are executed as follows:
+
+  * Basic scenario (vPing, vPing_userdata, Tempest)
+  * Controller suites: ODL or ONOS or OpenContrail
+  * Feature projects
+  * vIMS
+  * Rally (benchmark scenario)
+
+At the end of an automated execution, everything is cleaned.
+We keep only the users/networks that have been statically declared in
+`os_defaults.yaml <https://git.opnfv.org/cgit/functest/tree/testcases/VIM/OpenStack/CI/libraries/os_defaults.yaml>`_.
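+
+As a purely illustrative companion (again, not code from the Functest
+repository), the resulting execution order could be derived from
+*test_exec_priority* and the generated testcase-list.txt as sketched below,
+assuming the priority mapping shown above is a top-level key of
+config_functest.yaml::
+
+    import yaml
+
+
+    def ordered_runnable_tests(config_file="config_functest.yaml",
+                               list_file="testcase-list.txt"):
+        """Sort the runnable tests by their declared execution priority."""
+        with open(config_file) as config:
+            priorities = yaml.safe_load(config)["test_exec_priority"]
+        with open(list_file) as listing:
+            runnable = {line.strip() for line in listing if line.strip()}
+        # Keep only the runnable tests, in increasing priority order.
+        return [name for _, name in sorted(priorities.items()) if name in runnable]
+
+
+    # Example: if only vping, odl and rally are runnable on the scenario,
+    # the returned list is ['vping', 'odl', 'rally'].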