author     Morgan Richomme <morgan.richomme@orange.com>   2017-03-27 06:41:33 +0000
committer  Gerrit Code Review <gerrit@opnfv.org>          2017-03-27 06:41:33 +0000
commit     dcf1e078e3ffde4ed57c0cc6da05534ab419ca22 (patch)
tree       db927ca60fbd399038880c9c398544aa3d75da7f /docs/testing/developer/devguide/index.rst
parent     18745da8f335fe3049ffdd5acdafb5848981a1c9 (diff)
parent     19f9f2a99641cca98b896aa975a2182c43ba7f97 (diff)
Merge "Update documentation for Danube"
Diffstat (limited to 'docs/testing/developer/devguide/index.rst')
-rw-r--r--  docs/testing/developer/devguide/index.rst  503
1 file changed, 259 insertions, 244 deletions
diff --git a/docs/testing/developer/devguide/index.rst b/docs/testing/developer/devguide/index.rst
index ce5dc77b..d5295903 100644
--- a/docs/testing/developer/devguide/index.rst
+++ b/docs/testing/developer/devguide/index.rst
@@ -1,3 +1,6 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. SPDX-License-Identifier: CC-BY-4.0
+
 ******************************
 OPNFV FUNCTEST developer guide
 ******************************
@@ -62,18 +65,18 @@ Functest internal test cases
 ============================
 
 The internal test cases in Danube are:
 
- * healthcheck
- * connection_check
+ * api_check
+ * cloudify_ims
+ * connection_check
  * vping_ssh
  * vping_userdata
  * odl
- * snaps_smoke
- * tempest_smoke_serial
+ * rally_full
  * rally_sanity
+ * snaps_health_check
  * tempest_full_parallel
- * rally_full
- * cloudify_ims
+ * tempest_smoke_serial
 
 By internal, we mean that this particular test cases have been
 developped and/or integrated by functest contributors and the associated
@@ -86,7 +89,7 @@ The main internal test cases are in the opnfv_tests subfolder of the
 repository, the internal test cases are:
 
  * sdn: odl, onos
- * openstack: healthcheck, vping_ssh, vping_userdata, tempest_*, rally_*, connection_check, api_check, snaps_smoke
+ * openstack: api_check, connection_check, snaps_health_check, vping_ssh, vping_userdata, tempest_*, rally_*, snaps_smoke
  * vnf: cloudify_ims
 
 If you want to create a new test case you will have to create a new
@@ -99,19 +102,23 @@ especially the feature projects.
 
 The external test cases are:
 
- * promise
- * doctor
- * onos
+ * barometer
  * bgpvpn
- * copper
- * security_scan
- * sfc-odl
- * sfc-onos
- * parser
+ * doctor
  * domino
+ * odl-netvirt
+ * onos
+ * fds
  * multisite
- * opera_ims
+ * netready
  * orchestra_ims
+ * parser
+ * promise
+ * refstack_defcore
+ * security_scan
+ * snaps_smoke
+ * sfc-odl
+ * vyos_vrouter
 
 
 The code to run these test cases may be directly in the repository of
@@ -155,7 +162,7 @@ introduced in Danube:
 
 The goal is to unify the way to run test from Functest.
 
-feature_base and vnf_base inherit from testcase_base.
+feature_base and vnf_base inherit from testcase_base::
 
   +-----------------------------------------+
   |                                         |
@@ -174,9 +181,9 @@ feature_base and vnf_base inherit from testcase_base.
   | feature_base           |  | vnf_base                 |
   |                        |  |                          |
   | - prepare()            |  | - prepare()              |
-  | - post()               |  | - deploy_orchestrator()  |
-  | - parse_results()      |  | - deploy_vnf()           |
-  |                        |  | - test_vnf()             |
+  | - execute()            |  | - deploy_orchestrator()  |
+  | - post()               |  | - deploy_vnf()           |
+  | - parse_results()      |  | - test_vnf()             |
   |                        |  | - clean()                |
   |                        |  | - execute()              |
   |                        |  |                          |
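To make the hierarchy concrete, a small self-contained sketch of the pattern is
shown below; the class and method names simply mirror the diagram (the real base
classes live under functest/core and differ in detail)::

    # Minimal sketch of the Danube test case hierarchy (names mirror the
    # diagram above; the real classes in functest/core differ in detail).
    import time


    class TestcaseBase(object):
        """Common runner: init(), run() and criteria handling."""

        def __init__(self, case_name):
            self.case_name = case_name
            self.criteria = "FAILED"
            self.details = {}

        def execute(self):
            """To be overridden by concrete test cases: return 0 on success."""
            raise NotImplementedError

        def run(self):
            start = time.time()
            status = self.execute()
            self.details["duration"] = round(time.time() - start, 2)
            self.criteria = "PASS" if status == 0 else "FAILED"
            return status


    class FeatureBase(TestcaseBase):
        """Feature projects mainly provide execute() (plus post/parse hooks)."""


    class FooFeature(FeatureBase):
        def execute(self):
            # Call the real feature test suite here and map its result to 0/1.
            return 0


    if __name__ == "__main__":
        foo = FooFeature("foo_feature")
        foo.run()
        print("%s: %s %s" % (foo.case_name, foo.criteria, foo.details))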
@@ -270,232 +277,13 @@ the API will return an error message.
 
 An additional method dashboard has been added to post-process
 the raw results in release Brahmaputra (deprecated in Colorado).
 
-The data model is very basic, 4 objects are created:
+The data model is very basic, 5 objects are created:
 
  * Pods
  * Projects
  * Testcases
  * Results
-
-Pods::
-
-  {
-    "id": <ID>,
-    "details": <URL description of the POD>,
-    "creation_date": "YYYY-MM-DD HH:MM:SS",
-    "name": <The POD Name>,
-    "mode": <metal or virtual>,
-    "role": <ci-pod or community-pod or single-node>
-  },
-
-Projects::
-
-  {
-    "id": <ID>,
-    "name": <Name of the Project>,
-    "creation_date": "YYYY-MM-DD HH:MM:SS",
-    "description": <Short description>
-  },
-
-Testcases::
-
-  {
-    "id": <ID>,
-    "name": <Name of the test case>,
-    "project_name": <Name of belonged project>,
-    "creation_date": "YYYY-MM-DD HH:MM:SS",
-    "description": <short description>,
-    "url": <URL for longer description>
-  },
-
-Results::
-
-  {
-    "_id": <ID>,
-    "case_name": <Reference to the test case>,
-    "project_name": <Reference to project>,
-    "pod_name": <Reference to POD where the test was executed>,
-    "installer": <Installer Apex or Compass or Fuel or Joid>,
-    "version": <master or Colorado or Brahmaputra>,
-    "start_date": "YYYY-MM-DD HH:MM:SS",
-    "stop_date": "YYYY-MM-DD HH:MM:SS",
-    "build_tag": <such as "jenkins-functest-fuel-baremetal-daily-master-108">,
-    "scenario": <Scenario on which the test was executed>,
-    "criteria": <PASS or FAILED>,
-    "trust_indicator": {
-      "current": 0,
-      "histories": []
-    }
-  }
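For illustration, a result following the schema above can be pushed with a few
lines of python-requests; the base URL below is an assumption (the public Test
API instance) and the field values are examples only::

    # Sketch: push one result to the Test API with python-requests.
    # The base URL is an assumption; adjust it to your own deployment,
    # and note that authentication may be required.
    import requests

    TESTAPI_URL = "http://testresults.opnfv.org/test/api/v1"

    result = {
        "project_name": "functest",
        "case_name": "vping_ssh",
        "pod_name": "pod_foo",
        "installer": "fuel",
        "version": "master",
        "scenario": "os-nosdn-nofeature-ha",
        "build_tag": "",
        "criteria": "PASS",
        "start_date": "2017-03-27 06:00:00",
        "stop_date": "2017-03-27 06:05:00",
        "details": {"duration": 300},
    }

    response = requests.post("%s/results" % TESTAPI_URL, json=result)
    response.raise_for_status()
    print(response.text)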
-
-The API can described as follows. For detailed information, please go to
-
-  http://testresults.opnfv.org/test/swagger/spec.html
-
-  Authentication: opnfv/api@opnfv
-
-Version:
-
- +--------+----------------------------+------------------------------------------+
- | Method | Path                       | Description                              |
- +========+============================+==========================================+
- | GET    | /versions                  | Get all supported API versions           |
- +--------+----------------------------+------------------------------------------+
-
-
-Pods:
-
- +--------+----------------------------+------------------------------------------+
- | Method | Path                       | Description                              |
- +========+============================+==========================================+
- | GET    | /api/v1/pods               | Get the list of declared Labs (PODs)     |
- +--------+----------------------------+------------------------------------------+
- | POST   | /api/v1/pods               | Declare a new POD                        |
- |        |                            | Content-Type: application/json           |
- |        |                            | {                                        |
- |        |                            |   "name": "pod_foo",                     |
- |        |                            |   "mode": "metal",                       |
- |        |                            |   "role": "ci-pod",                      |
- |        |                            |   "details": "it is a ci pod"            |
- |        |                            | }                                        |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/pods/{pod_name}    | Get a declared POD                       |
- +--------+----------------------------+------------------------------------------+
-
-Projects:
-
- +--------+----------------------------+------------------------------------------+
- | Method | Path                       | Description                              |
- +========+============================+==========================================+
- | GET    | /api/v1/projects           | Get the list of declared projects        |
- +--------+----------------------------+------------------------------------------+
- | POST   | /api/v1/projects           | Declare a new test project               |
- |        |                            | Content-Type: application/json           |
- |        |                            | {                                        |
- |        |                            |   "name": "project_foo",                 |
- |        |                            |   "description": "whatever you want"     |
- |        |                            | }                                        |
- +--------+----------------------------+------------------------------------------+
- | DELETE | /api/v1/projects/{project} | Delete a test project                    |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/projects/{project} | Get details on a {project}               |
- +--------+----------------------------+------------------------------------------+
- | PUT    | /api/v1/projects/{project} | Update a test project                    |
- |        |                            |                                          |
- |        |                            | Content-Type: application/json           |
- |        |                            | {                                        |
- |        |                            |   <the field(s) you want to modify>      |
- |        |                            | }                                        |
- +--------+----------------------------+------------------------------------------+
-
-
-Testcases:
-
- +--------+----------------------------+------------------------------------------+
- | Method | Path                       | Description                              |
- +========+============================+==========================================+
- | GET    | /api/v1/projects/{project}/| Get the list of testcases of {project}   |
- |        | cases                      |                                          |
- +--------+----------------------------+------------------------------------------+
- | POST   | /api/v1/projects/{project}/| Add a new test case to {project}         |
- |        | cases                      | Content-Type: application/json           |
- |        |                            | {                                        |
- |        |                            |   "name": "case_foo",                    |
- |        |                            |   "description": "whatever you want",    |
- |        |                            |   "url": "whatever you want"             |
- |        |                            | }                                        |
- +--------+----------------------------+------------------------------------------+
- | DELETE | /api/v1/projects/{project}/| Delete a test case                       |
- |        | cases/{case}               |                                          |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/projects/{project}/| Get a declared test case                 |
- |        | cases/{case}               |                                          |
- +--------+----------------------------+------------------------------------------+
- | PUT    | /api/v1/projects/{project}?| Modify a test case of {project}          |
- |        | cases/{case}               |                                          |
- |        |                            | Content-Type: application/json           |
- |        |                            | {                                        |
- |        |                            |   <the field(s) you want to modify>      |
- |        |                            | }                                        |
- +--------+----------------------------+------------------------------------------+
-
-Results:
-
- +--------+----------------------------+------------------------------------------+
- | Method | Path                       | Description                              |
- +========+============================+==========================================+
- | GET    | /api/v1/results            | Get all the test results                 |
- +--------+----------------------------+------------------------------------------+
- | POST   | /api/v1/results            | Add a new test results                   |
- |        |                            | Content-Type: application/json           |
- |        |                            | {                                        |
- |        |                            |   "project_name": "project_foo",         |
- |        |                            |   "scenario": "odl-l2",                  |
- |        |                            |   "stop_date": "2016-05-28T14:42:58.384Z",|
- |        |                            |   "trust_indicator": 0.5,                |
- |        |                            |   "case_name": "vPing",                  |
- |        |                            |   "build_tag": "",                       |
- |        |                            |   "version": "Colorado",                 |
- |        |                            |   "pod_name": "pod_foo",                 |
- |        |                            |   "criteria": "PASS",                    |
- |        |                            |   "installer": "fuel",                   |
- |        |                            |   "start_date": "2016-05-28T14:41:58.384Z",|
- |        |                            |   "details": <your results>              |
- |        |                            | }                                        |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of {case}           |
- |        | case={case}                |                                          |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of build_tag {tag}  |
- |        | build_tag={tag}            |                                          |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get last {N} records of test results     |
- |        | last={N}                   |                                          |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of scenario         |
- |        | scenario={scenario}        | {scenario}                               |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of trust_indicator  |
- |        | trust_indicator={ind}      | {ind}                                    |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of the last days    |
- |        | period={period}            | {period}                                 |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of {project}        |
- |        | project={project}          |                                          |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of version          |
- |        | version={version}          | {version}                                |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of criteria         |
- |        | criteria={criteria}        | {criteria}                               |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the results on pod {pod}             |
- |        | pod={pod}                  |                                          |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the test results of installer {inst} |
- |        | installer={inst}           |                                          |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results?           | Get the results according to combined    |
- |        | <query conditions>         | query conditions supported above         |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/results/{result_id}| Get the test result by result_id         |
- +--------+----------------------------+------------------------------------------+
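The query parameters listed above can be combined in a single request. A hedged
sketch with python-requests (base URL, parameter values and the layout of the
JSON answer are assumptions)::

    # Sketch: combine query conditions on GET /api/v1/results.
    import requests

    TESTAPI_URL = "http://testresults.opnfv.org/test/api/v1"

    params = {
        "project": "functest",
        "case": "vping_ssh",
        "installer": "fuel",
        "scenario": "os-odl_l2-nofeature-ha",
        "last": 10,
    }

    response = requests.get("%s/results" % TESTAPI_URL, params=params)
    response.raise_for_status()
    # The exact payload layout may differ; adapt the keys as needed.
    for result in response.json().get("results", []):
        print("%s %s" % (result.get("build_tag"), result.get("criteria")))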
-
-Scenarios:
-
- +--------+----------------------------+------------------------------------------+
- | Method | Path                       | Description                              |
- +========+============================+==========================================+
- | GET    | /api/v1/scenarios          | Get the list of declared scenarios       |
- +--------+----------------------------+------------------------------------------+
- | POST   | /api/v1/scenario           | Declare a new scenario                   |
- +--------+----------------------------+------------------------------------------+
- | GET    | /api/v1/scenario?          | Get a declared scenario                  |
- |        | <query conditions>         |                                          |
- +--------+----------------------------+------------------------------------------+
-
+ * Scenarios
 
 The code of the API is hosted in the releng repository `[6]`_.
 The static documentation of the API can be found at `[17]`_.
@@ -574,6 +362,7 @@ Please note that currently token authorization is implemented but is not yet ena
 +---------------------+---------+---------+---------+---------+
 | copper              |    X    |         |         |    X    |
 +---------------------+---------+---------+---------+---------+
+
+src: colorado (see release note for the last matrix version)
 
 All the testcases listed in the table are runnable on
 os-odl_l2-nofeature scenarios.
@@ -736,8 +525,7 @@ Regex are standard regex. You can have a look at `[11]`_
 
 You can also easily test your regex via an online regex checker such as `[12]`_.
 Put your scenario in the TEST STRING window (e.g. os-odl_l3-ovs-ha), put
-your regex in the REGULAR EXPRESSION window, then you can test your rule
-.
+your regex in the REGULAR EXPRESSION window, then you can test your rule.
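The same check can also be done offline with Python's re module; the dependency
expression below is only an example, not a rule shipped with Functest::

    # Check locally whether a scenario matches a dependency regex.
    import re

    dependency_regex = r"^((?!bgpvpn|odl_l3).)*$"   # example expression only
    scenario = "os-odl_l3-ovs-ha"

    if re.search(dependency_regex, scenario):
        print("%s matches: the test would be selected" % scenario)
    else:
        print("%s does not match: the test would be skipped" % scenario)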
 
 
 How to know which test I can run?
@@ -941,6 +729,233 @@ You can also reuse a python function defined in functest_utils.py::
 
         return False
 
 
+Where can I find the documentation on the test API?
+===================================================
+
+http://artifacts.opnfv.org/releng/docs/testapi.html
+
+
+How to exclude a Tempest case from the default Tempest smoke suite?
+===================================================================
+
+The Tempest default smoke suite deals with 165 test cases.
+Since Colorado the success criterion is 100%, i.e. if 1 test fails, the
+success criterion is not met for the scenario.
+
+It is necessary to exclude some test cases that are expected to fail due to
+known upstream bugs (see release notes).
+
+A file has been created for this purpose:
+https://git.opnfv.org/cgit/functest/tree/functest/opnfv_tests/openstack/tempest/custom_tests/blacklist.txt.
+
+It can be described as follows::
+
+    -
+        scenarios:
+            - os-odl_l2-bgpvpn-ha
+            - os-odl_l2-bgpvpn-noha
+        installers:
+            - fuel
+            - apex
+        tests:
+            - tempest.api.compute.servers.test_create_server.ServersTestJSON.test_list_servers
+            - tempest.api.compute.servers.test_create_server.ServersTestJSON.test_verify_server_details
+            - tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_list_servers
+            - tempest.api.compute.servers.test_create_server.ServersTestManualDisk.test_verify_server_details
+            - tempest.api.compute.servers.test_server_actions.ServerActionsTestJSON.test_reboot_server_hard
+            - tempest.scenario.test_network_basic_ops.TestNetworkBasicOps.test_network_basic_ops
+            - tempest.scenario.test_server_basic_ops.TestServerBasicOps.test_server_basic_ops
+            - tempest.scenario.test_volume_boot_pattern.TestVolumeBootPattern.test_volume_boot_pattern
+            - tempest.scenario.test_volume_boot_pattern.TestVolumeBootPatternV2.test_volume_boot_pattern
+
+Please note that each exclusion must be justified: the goal is not to exclude
+test cases simply because they do not pass, and several scenarios already
+reached the 100% criterion. The patch submitted to exclude cases is therefore
+expected to state the reasons for the exclusion.
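For illustration, the sketch below collects the excluded cases for one
installer/scenario pair from such a blacklist file with PyYAML; the actual
filtering is performed by the Functest Tempest wrapper and may differ in
detail::

    # Sketch: list the blacklisted Tempest cases for one installer/scenario.
    import yaml

    INSTALLER = "fuel"
    SCENARIO = "os-odl_l2-bgpvpn-ha"

    with open("blacklist.txt") as black_file:
        blacklist = yaml.safe_load(black_file) or []

    excluded = set()
    for item in blacklist:
        if (INSTALLER in item.get("installers", []) and
                SCENARIO in item.get("scenarios", [])):
            excluded.update(item.get("tests", []))

    print("%d test case(s) excluded for %s/%s"
          % (len(excluded), INSTALLER, SCENARIO))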
+
+
+How do I know the Functest status of a scenario?
+================================================
+
+A Functest automatic reporting page is generated daily.
+This page is dynamically created through a cron job and is based on the
+results stored in the Test DB.
+You can access this reporting page at http://testresults.opnfv.org/reporting
+
+See https://wiki.opnfv.org/pages/viewpage.action?pageId=6828617 for details.
+
+
+I have tests, to which category should I declare them?
+======================================================
+
+CATEGORIES/TIERS description:
+
++----------------+-------------------------------------------------------------+
+| healthcheck    | Simple OpenStack health check test cases that validate the  |
+|                | basic operations in OpenStack                               |
++----------------+-------------------------------------------------------------+
+| Smoke          | Set of smoke test cases/suites to validate the most common  |
+|                | OpenStack and SDN Controller operations                     |
++----------------+-------------------------------------------------------------+
+| Features       | Test cases that validate a specific feature on top of OPNFV.|
+|                | Those come from Feature projects and need a bit of support  |
+|                | for integration                                             |
++----------------+-------------------------------------------------------------+
+| Components     | Advanced OpenStack tests: Full Tempest, Full Rally          |
++----------------+-------------------------------------------------------------+
+| Performance    | Out of Functest scope                                       |
++----------------+-------------------------------------------------------------+
+| VNF            | Test cases related to the deployment of an open source VNF  |
+|                | including an orchestrator                                   |
++----------------+-------------------------------------------------------------+
+
+The main ambiguity could be between the Features and VNF categories.
+In fact you sometimes have to spawn VMs to demonstrate the capabilities of the
+feature you introduced.
+We recommend declaring your test in the Features category.
+
+The VNF category is dedicated to tests that include:
+
+ * creation of resources
+ * deployment of an orchestrator/VNFM
+ * deployment of the VNF
+ * test of the VNFM
+ * release of the resources
+
+The goal is not to study a particular feature on the infrastructure but to
+have a whole end-to-end test of a VNF automatically deployed in CI.
+Moreover, VNF tests run in weekly jobs (once a week), while feature tests run
+in daily jobs and are used to compute the scenario score.
+
+Where are the logs?
+===================
+
+Functest deals with internal and external testcases. Each testcase can
+generate logs.
+
+Since Colorado it is possible to push the logs to the artifact repository.
+A new script (https://git.opnfv.org/releng/tree/utils/push-test-logs.sh) has
+been created for CI.
+
+When called, and assuming that the POD is authorized to push the logs to
+artifacts, the script will push all the results or logs locally stored under
+/home/opnfv/functest/results/.
+
+If the POD is not connected to CI, logs are not pushed.
+But in both cases, logs are stored in /home/opnfv/functest/results in the
+container.
+Projects are encouraged to push their logs here.
+
+Since Colorado it is also easy for a feature project to integrate this
+mechanism by passing the log file as the output_file parameter when calling
+execute_command from the functest_utils library::
+
+  ret_val = functest_utils.execute_command(cmd, output_file=log_file)
+
+
+How does Functest deal with VNF onboarding?
+===========================================
+
+VNF onboarding has been introduced in Brahmaputra through the automation of a
+Clearwater vIMS deployed by the Cloudify orchestrator.
+
+This automation has been described at the OpenStack Summit Barcelona:
+https://youtu.be/Jr4nG74glmY
+
+The goal of Functest consists in testing OPNFV from a functional perspective:
+the NFVI and/or the features developed in OPNFV. Feature test suites are
+provided by the feature projects. Functest just simplifies the integration of
+the suite into the CI and gives a consolidated view of the tests per scenario.
+
+Functest does not develop VNFs.
+
+Functest does not test any MANO stack.
+
+OPNFV projects dealing with VNF onboarding
+------------------------------------------
+
+Testing VNFs is not the main goal; however, it gives interesting and realistic
+feedback on OPNFV as a Telco cloud.
+
+Onboarding a VNF also allows testing of a full stack: orchestrator + VNF.
+
+Functest is VNF and MANO stack agnostic.
+
+An internship has been initiated to reference the open source VNFs: Intern
+Project Open Source VNF catalog.
+
+New projects dealing with orchestrators or VNFs are candidates for Danube.
+
+The 2 projects dealing with orchestration are:
+
+ * orchestra (Openbaton)
+ * opera (Open-O)
+
+The Models project addresses various goals for promoting availability and
+convergence of information and/or data models related to NFV service/VNF
+management, as being defined in standards (SDOs) and as developed in open
+source projects.
+
+Functest VNF onboarding
+-----------------------
+
+In order to simplify VNF onboarding, a new abstraction class has been
+developed in Functest.
+
+This class is based on vnf_base and can be described as follows::
+
+  +------------+          +--------------+
+  | test_base  |--------->|   vnf_base   |
+  +------------+          +--------------+
+                             |_ prepare
+                             |_ deploy_orchestrator (optional)
+                             |_ deploy_vnf
+                             |_ test_vnf
+                             |_ clean
+
+Several methods are declared in vnf_base:
+
+ * prepare
+ * deploy_orchestrator
+ * deploy_vnf
+ * test_vnf
+ * clean
+
+deploy_vnf and test_vnf are mandatory.
+
+prepare will create a user and a project.
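A minimal, hypothetical "aaa"-like test case could then look as follows; the
import path, base class name and constructor arguments are assumptions derived
from the description above (check functest/core in the repository for the exact
names)::

    # Sketch only: module path, class name and constructor of vnf_base
    # are assumptions.
    from functest.core import vnf_base


    class AaaVnf(vnf_base.VnfOnBoardingBase):
        """Hypothetical VNF onboarding test case."""

        def __init__(self):
            super(AaaVnf, self).__init__(project='functest', case='aaa')

        # deploy_orchestrator() is optional and omitted in this sketch.

        def deploy_vnf(self):
            # Create the VNF resources (images, networks, VMs) here.
            # Return value conventions are simplified for the sketch.
            return True

        def test_vnf(self):
            # Run the functional test suite against the deployed VNF here.
            return True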
+
+How to declare your orchestrator/VNF?
+-------------------------------------
+
+1) Test declaration
+
+You must declare your testcase in the file
+<Functest repo>/functest/ci/testcases.yaml
+
+2) Configuration
+
+You can specify some configuration parameters in config_functest.yaml
+
+3) Implement your test
+
+Create your own VnfOnboarding file.
+
+You must create your entry point through a Python class as referenced in the
+configuration file,
+e.g. aaa => creation of the file
+<Functest repo>/functest/opnfv_tests/vnf/aaa/aaa.py
+
+The class shall inherit from vnf_base.
+You must implement the methods deploy_vnf() and test_vnf() and may implement
+deploy_orchestrator().
+
+You can call the code from your own repo (but you need to add the repo to
+Functest if it is not already the case).
+
+4) Success criteria
+
+So far the test is considered PASS if deploy_vnf and test_vnf are PASS
+(see the example in aaa).
+
 ==========
 References
 ==========
@@ -989,4 +1004,4 @@ _`OpenRC`: http://docs.openstack.org/user-guide/common/cli_set_environment_varia
 
 _`Rally installation procedure`: https://rally.readthedocs.org/en/latest/tutorial/step_0_installation.html
 
-_`config_functest.yaml` : https://git.opnfv.org/cgit/functest/tree/testcases/config_functest.yaml
+_`config_functest.yaml` : https://git.opnfv.org/cgit/functest/tree/functest/ci/config_functest.yaml