authorjose.lausuch <jose.lausuch@ericsson.com>2017-02-21 15:21:45 +0100
committerjose.lausuch <jose.lausuch@ericsson.com>2017-03-01 10:54:32 +0100
commit83714aaa9a09e7df27e06d5d95dadc89a8e67f9e (patch)
tree8d52ed9b7ab325e795ee9795da2a9bca2715feda /docs/testing/developer
parent8ad45aea10e3e2ec5168eacd53c7f32cf3a17e9b (diff)
Restructure docs
DOCS directory restructured according to: https://wiki.opnfv.org/display/DOC/Documentation+Guide Now: - release - release notes - testing - developer - dev guide (this is to be released on docs.opnfv.org) - internship (docs about intern projects) - user (this is to be released on docs.opnfv.org) - config guide - user guide Change-Id: I1851189601aac3c5989f19f99d779efe23dbf3d1 Signed-off-by: jose.lausuch <jose.lausuch@ericsson.com>
Diffstat (limited to 'docs/testing/developer')
-rw-r--r--docs/testing/developer/devguide/index.rst993
-rw-r--r--docs/testing/developer/internship/security_group/index.rst70
-rw-r--r--docs/testing/developer/internship/testapi_evolution/index.rst237
-rw-r--r--docs/testing/developer/internship/unit_tests/index.rst70
-rw-r--r--docs/testing/developer/internship/vnf_catalog/index.rst170
5 files changed, 1540 insertions, 0 deletions
diff --git a/docs/testing/developer/devguide/index.rst b/docs/testing/developer/devguide/index.rst
new file mode 100644
index 000000000..eee013678
--- /dev/null
+++ b/docs/testing/developer/devguide/index.rst
@@ -0,0 +1,993 @@
+******************************
+OPNFV FUNCTEST developer guide
+******************************
+
+.. toctree::
+ :numbered:
+ :maxdepth: 2
+
+
+============
+Introduction
+============
+
+Functest is a project dealing with functional testing.
+Functest produces its own internal test cases but can also be considered
+as a framework to support feature and VNF onboarding project testing.
+Functest developed a test API and defined a test collection framework
+that can be used by any OPNFV project.
+
+Therefore there are many ways to contribute to Functest. You can:
+
+ * Develop new internal test cases
+ * Integrate the tests from your feature project
+ * Develop the framework to ease the integration of external test cases
+ * Develop the API / Test collection framework
+ * Develop dashboards or automatic reporting portals
+
+This document describes how, as a developer, you may interact with the
+Functest project. The first section details the main working areas of
+the project. The second section is a list of "How to" guides to help
+you join the Functest family, whatever your field of interest is.
+
+
+========================
+Functest developer areas
+========================
+
+
+Functest High level architecture
+================================
+
+Functest is a project delivering a test container dedicated to OPNFV.
+It includes the tools, the scripts and the test scenarios.
+
+Functest can be described as follows::
+
+    +----------------------+
+    |                      |
+    |  +--------------+    |               +-------------------+
+    |  |              |    |     Public    |                   |
+    |  |    Tools     | +------------------+       OPNFV       |
+    |  |   Scripts    | |                  | System Under Test |
+    |  |  Scenarios   | +------------------+                   |
+    |  |              |    |   Management  |                   |
+    |  +--------------+    |               +-------------------+
+    |                      |
+    |   Functest Docker    |
+    |                      |
+    +----------------------+
+
+Functest internal test cases
+============================
+The internal test cases in Danube are:
+
+ * healthcheck
+ * connection_check
+ * api_check
+ * vping_ssh
+ * vping_userdata
+ * odl
+ * snaps_smoke
+ * tempest_smoke_serial
+ * rally_sanity
+ * tempest_full_parallel
+ * rally_full
+ * cloudify_ims
+
+By internal, we mean that these particular test cases have been
+developed and/or integrated by Functest contributors and the associated
+code is hosted in the Functest repository.
+An internal case can be fully developed or a simple integration of
+upstream suites (e.g. Tempest/Rally developed in OpenStack are just
+integrated in Functest).
+The structure of this repository is detailed in `[1]`_.
+The main internal test cases are located in the opnfv_tests subfolder
+of the repository and are grouped as follows:
+
+ * sdn: odl, onos
+ * openstack: healthcheck, vping_ssh, vping_userdata, tempest_*, rally_*, connection_check, api_check, snaps_smoke
+ * vnf: cloudify_ims
+
+If you want to create a new test case you will have to create a new
+folder under the testcases directory.
+
+Functest external test cases
+============================
+The external test cases are inherited from other OPNFV projects,
+especially the feature projects.
+
+The external test cases are:
+
+ * promise
+ * doctor
+ * onos
+ * bgpvpn
+ * copper
+ * security_scan
+ * sfc-odl
+ * sfc-onos
+ * parser
+ * domino
+ * multisite
+ * opera_ims
+ * orchestra_ims
+
+
+The code to run these test cases may be directly in the repository of
+the project. There is also a **features** sub-directory under the
+opnfv_tests directory that may be used (it can be useful if you want to
+reuse the Functest library).
+
+Functest framework
+==================
+
+Functest can be considered as a framework.
+Functest is released as a Docker image, including tools, scripts and a
+CLI to prepare the environment and run tests.
+It simplifies the integration of external test suites in the CI
+pipeline and provides commodity tools to collect and display results.
+
+Since Colorado, test categories also known as tiers have been created to
+group similar tests, provide consistent sub-lists and, in the end,
+optimize test duration for CI (see the How To section).
+
+The definition of the tiers has been agreed by the testing working group.
+
+The tiers are:
+
+ * healthcheck
+ * smoke
+ * features
+ * components
+ * performance
+ * vnf
+ * stress
+
+Functest abstraction classes
+============================
+
+In order to harmonize test integration, 3 abstraction classes have been
+introduced in Danube:
+
+ * testcase_base: base for any test case
+ * feature_base: abstraction for feature projects
+ * vnf_base: abstraction for VNF onboarding
+
+The goal is to unify the way tests are run from Functest.
+
+feature_base and vnf_base inherit from testcase_base::
+
+       +-----------------------------------------+
+       |                                         |
+       |              Testcase_base              |
+       |                                         |
+       |  - init()                               |
+       |  - run()                                |
+       |  - publish_report()                     |
+       |  - check_criteria()                     |
+       |                                         |
+       +-----------------------------------------+
+            |                         |
+            V                         V
+    +--------------------+   +--------------------------+
+    |                    |   |                          |
+    |    feature_base    |   |         vnf_base         |
+    |                    |   |                          |
+    |  - prepare()       |   |  - prepare()             |
+    |  - post()          |   |  - deploy_orchestrator() |
+    |  - parse_results() |   |  - deploy_vnf()          |
+    |                    |   |  - test_vnf()            |
+    |                    |   |  - clean()               |
+    |                    |   |  - execute()             |
+    |                    |   |                          |
+    +--------------------+   +--------------------------+
+
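+As an illustration, a new feature test case could reuse the abstraction
+as follows (a minimal sketch: the method names mirror the diagram above,
+but the exact module path and class names in the Functest repository may
+differ):
+
+.. code-block:: python
+
+    # Hypothetical example: the module path and class name below are
+    # assumptions based on the diagram; check the repository for the
+    # real ones.
+    from functest.core import feature_base
+
+    class MyFeature(feature_base.FeatureBase):
+
+        def prepare(self):
+            # set up whatever the suite needs before running
+            pass
+
+        def parse_results(self):
+            # translate the suite output into the PASS/FAIL criteria
+            pass
+
+        def post(self):
+            # clean up and publish the result to the test DB
+            pass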
+
+Functest util classes
+=====================
+
+In order to simplify the creation of test cases, Functest provides some
+functions that can be used by any feature or internal test case.
+Several features are supported, such as logging, configuration
+management and OpenStack capabilities (snapshot, clean, tacker, ...).
+These functions can be found under <repo>/functest/utils and can be
+described as follows::
+
+    functest/utils/
+    |-- config.py
+    |-- constants.py
+    |-- env.py
+    |-- functest_constants.py
+    |-- functest_logger.py
+    |-- functest_utils.py
+    |-- openstack_clean.py
+    |-- openstack_snapshot.py
+    |-- openstack_tacker.py
+    `-- openstack_utils.py
+
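+For example, a test case could combine these helpers as follows (a
+hedged sketch: ``functest_logger`` is listed above, but the exact helper
+names in ``openstack_utils`` may differ from one release to another):
+
+.. code-block:: python
+
+    import functest.utils.functest_logger as ft_logger
+    import functest.utils.openstack_utils as os_utils
+
+    logger = ft_logger.Logger("my_test").getLogger()
+
+    # Illustrative call: build an OpenStack client from the sourced
+    # credentials, then log what was found.
+    nova = os_utils.get_nova_client()
+    logger.info("Flavors: %s", nova.flavors.list())
+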
+Note that for OpenStack, keystone v3 is now deployed by default by
+Compass, Fuel and Joid in Danube. All installers still support keystone
+v2 (deprecated in the next version).
+
+Test collection framework
+=========================
+
+The OPNFV testing group created a test collection database to collect
+the test results from CI:
+
+
+ http://testresults.opnfv.org/test/swagger/spec.html
+
+ Authentication: opnfv/api@opnfv
+
+Any test project running on any lab integrated in CI can push the
+results to this database.
+This database can be used to see the evolution of the tests and compare
+the results versus the installers, the scenarios or the labs.
+
+
+Overall Architecture
+--------------------
+The Test result management can be summarized as follows::
+
+    +-------------+    +-------------+    +-------------+
+    |             |    |             |    |             |
+    |    Test     |    |    Test     |    |    Test     |
+    |  Project #1 |    |  Project #2 |    |  Project #N |
+    |             |    |             |    |             |
+    +-------------+    +-------------+    +-------------+
+           |                  |                  |
+           V                  V                  V
+         +-----------------------------------------+
+         |                                         |
+         |         Test Rest API front end         |
+         |    http://testresults.opnfv.org/test    |
+         |                                         |
+         +-----------------------------------------+
+            A                             |
+            |                             V
+            |             +-------------------------+
+            |             |                         |
+            |             |     Test Results DB     |
+            |             |        Mongo DB         |
+            |             |                         |
+            |             +-------------------------+
+            |
+            |
+    +----------------------+
+    |                      |
+    |    test Dashboard    |
+    |                      |
+    +----------------------+
+
+Test API description
+--------------------
+The Test API is used to declare pods, projects, test cases and test
+results. Pods correspond to the infrastructure (CI or community labs)
+used to run the tests.
+The results pushed to the database are related to pods, projects and
+cases. If you try to push results for a test run on a non-referenced
+pod, the API will return an error message.
+
+An additional dashboard method was added in the Brahmaputra release to
+post-process the raw results (deprecated since Colorado).
+
+The data model is very basic; 4 objects are created:
+
+ * Pods
+ * Projects
+ * Testcases
+ * Results
+
+Pods::
+
+    {
+      "id": <ID>,
+      "details": <URL description of the POD>,
+      "creation_date": "YYYY-MM-DD HH:MM:SS",
+      "name": <The POD Name>,
+      "mode": <metal or virtual>,
+      "role": <ci-pod or community-pod or single-node>
+    },
+
+Projects::
+
+    {
+      "id": <ID>,
+      "name": <Name of the Project>,
+      "creation_date": "YYYY-MM-DD HH:MM:SS",
+      "description": <Short description>
+    },
+
+Testcases::
+
+    {
+      "id": <ID>,
+      "name": <Name of the test case>,
+      "project_name": <Name of the project it belongs to>,
+      "creation_date": "YYYY-MM-DD HH:MM:SS",
+      "description": <short description>,
+      "url": <URL for a longer description>
+    },
+
+Results::
+
+    {
+      "_id": <ID>,
+      "case_name": <Reference to the test case>,
+      "project_name": <Reference to project>,
+      "pod_name": <Reference to POD where the test was executed>,
+      "installer": <Installer Apex or Compass or Fuel or Joid>,
+      "version": <master or Colorado or Brahmaputra>,
+      "start_date": "YYYY-MM-DD HH:MM:SS",
+      "stop_date": "YYYY-MM-DD HH:MM:SS",
+      "build_tag": <such as "jenkins-functest-fuel-baremetal-daily-master-108">,
+      "scenario": <Scenario on which the test was executed>,
+      "criteria": <PASS or FAILED>,
+      "trust_indicator": {
+        "current": 0,
+        "histories": []
+      }
+    }
+
+The API can be described as follows. For detailed information, please go to
+
+ http://testresults.opnfv.org/test/swagger/spec.html
+
+ Authentication: opnfv/api@opnfv
+
+Version:
+
+ +--------+--------------------------+-----------------------------------------+
+ | Method | Path                     | Description                             |
+ +========+==========================+=========================================+
+ | GET    | /versions                | Get all supported API versions          |
+ +--------+--------------------------+-----------------------------------------+
+
+
+Pods:
+
+ +--------+----------------------------+-----------------------------------------+
+ | Method | Path                       | Description                             |
+ +========+============================+=========================================+
+ | GET    | /api/v1/pods               | Get the list of declared Labs (PODs)    |
+ +--------+----------------------------+-----------------------------------------+
+ | POST   | /api/v1/pods               | Declare a new POD                       |
+ |        |                            | Content-Type: application/json          |
+ |        |                            | {                                       |
+ |        |                            |   "name": "pod_foo",                    |
+ |        |                            |   "mode": "metal",                      |
+ |        |                            |   "role": "ci-pod",                     |
+ |        |                            |   "details": "it is a ci pod"           |
+ |        |                            | }                                       |
+ +--------+----------------------------+-----------------------------------------+
+ | GET    | /api/v1/pods/{pod_name}    | Get a declared POD                      |
+ +--------+----------------------------+-----------------------------------------+
+
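+A new POD can, for instance, be declared programmatically (a sketch
+using the ``requests`` library, with the body taken from the table
+above):
+
+.. code-block:: python
+
+    import json
+
+    import requests
+
+    url = "http://testresults.opnfv.org/test/api/v1/pods"
+    pod = {"name": "pod_foo", "mode": "metal",
+           "role": "ci-pod", "details": "it is a ci pod"}
+    r = requests.post(url, data=json.dumps(pod),
+                      headers={'Content-Type': 'application/json'})
+    print(r.status_code)
+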
+Projects:
+
+ +--------+----------------------------+-----------------------------------------+
+ | Method | Path                       | Description                             |
+ +========+============================+=========================================+
+ | GET    | /api/v1/projects           | Get the list of declared projects       |
+ +--------+----------------------------+-----------------------------------------+
+ | POST   | /api/v1/projects           | Declare a new test project              |
+ |        |                            | Content-Type: application/json          |
+ |        |                            | {                                       |
+ |        |                            |   "name": "project_foo",                |
+ |        |                            |   "description": "whatever you want"    |
+ |        |                            | }                                       |
+ +--------+----------------------------+-----------------------------------------+
+ | DELETE | /api/v1/projects/{project} | Delete a test project                   |
+ +--------+----------------------------+-----------------------------------------+
+ | GET    | /api/v1/projects/{project} | Get details on a {project}              |
+ |        |                            |                                         |
+ +--------+----------------------------+-----------------------------------------+
+ | PUT    | /api/v1/projects/{project} | Update a test project                   |
+ |        |                            |                                         |
+ |        |                            | Content-Type: application/json          |
+ |        |                            | {                                       |
+ |        |                            |   <the field(s) you want to modify>     |
+ |        |                            | }                                       |
+ +--------+----------------------------+-----------------------------------------+
+
+
+Testcases:
+
+ +--------+----------------------------+-----------------------------------------+
+ | Method | Path                       | Description                             |
+ +========+============================+=========================================+
+ | GET    | /api/v1/projects/{project}/| Get the list of testcases of {project}  |
+ |        | cases                      |                                         |
+ +--------+----------------------------+-----------------------------------------+
+ | POST   | /api/v1/projects/{project}/| Add a new test case to {project}        |
+ |        | cases                      | Content-Type: application/json          |
+ |        |                            | {                                       |
+ |        |                            |   "name": "case_foo",                   |
+ |        |                            |   "description": "whatever you want",   |
+ |        |                            |   "url": "whatever you want"            |
+ |        |                            | }                                       |
+ +--------+----------------------------+-----------------------------------------+
+ | DELETE | /api/v1/projects/{project}/| Delete a test case                      |
+ |        | cases/{case}               |                                         |
+ +--------+----------------------------+-----------------------------------------+
+ | GET    | /api/v1/projects/{project}/| Get a declared test case                |
+ |        | cases/{case}               |                                         |
+ +--------+----------------------------+-----------------------------------------+
+ | PUT    | /api/v1/projects/{project}/| Modify a test case of {project}         |
+ |        | cases/{case}               |                                         |
+ |        |                            | Content-Type: application/json          |
+ |        |                            | {                                       |
+ |        |                            |   <the field(s) you want to modify>     |
+ |        |                            | }                                       |
+ +--------+----------------------------+-----------------------------------------+
+
+Results:
+
+ +--------+----------------------------+------------------------------------------+
+ | Method | Path                       | Description                              |
+ +========+============================+==========================================+
+ | GET    | /api/v1/results            | Get all the test results                 |
+ +--------+----------------------------+------------------------------------------+
+ | POST   | /api/v1/results            | Add a new test result                    |
+ |        |                            | Content-Type: application/json           |
+ |        |                            | {                                        |
+ |        |                            |  "project_name": "project_foo",          |
+ |        |                            |  "scenario": "odl-l2",                   |
+ |        |                            |  "stop_date": "2016-05-28T14:42:58.384Z",|
+ |        |                            |  "trust_indicator": 0.5,                 |
+ |        |                            |  "case_name": "vPing",                   |
+ |        |                            |  "build_tag": "",                        |
+ |        |                            |  "version": "Colorado",                  |
+ |        |                            |  "pod_name": "pod_foo",                  |
+ |        |                            |  "criteria": "PASS",                     |
+ |        |                            |  "installer": "fuel",                    |
+ |        |                            | "start_date": "2016-05-28T14:41:58.384Z",|
+ |        |                            |  "details": <your results>               |
+ |        |                            | }                                        |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of {case}           |
+ |        | case={case}                |                                          |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of build_tag        |
+ |        | build_tag={tag}            | {tag}.                                   |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get last {N} records of test results     |
+ |        | last={N}                   |                                          |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of scenario         |
+ |        | scenario={scenario}        | {scenario}.                              |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of trust_indicator  |
+ |        | trust_indicator={ind}      | {ind}.                                   |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of last days        |
+ |        | period={period}            | {period}.                                |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of {project}        |
+ |        | project={project}          |                                          |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of version          |
+ |        | version={version}          | {version}.                               |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of criteria         |
+ |        | criteria={criteria}        | {criteria}.                              |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the results on pod {pod}             |
+ |        | pod={pod}                  |                                          |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the test results of installer {inst} |
+ |        | installer={inst}           |                                          |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results?           | Get the results according to combined    |
+ |        | <query conditions>         | query conditions supported above         |
+ +--------+----------------------------+------------------------------------------+
+ | GET    | /api/v1/results/{result_id}| Get the test result by result_id         |
+ +--------+----------------------------+------------------------------------------+
+
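+The result filters listed above can be combined. For instance, a sketch
+retrieving the latest Functest results for a given installer could look
+like this (the exact shape of the response envelope is an assumption):
+
+.. code-block:: python
+
+    import requests
+
+    url = "http://testresults.opnfv.org/test/api/v1/results"
+    params = {"project": "functest", "installer": "fuel", "last": 10}
+    r = requests.get(url, params=params)
+    # assumed: the records are wrapped in a "results" field
+    for res in r.json().get("results", []):
+        print(res["case_name"] + ": " + res["criteria"])
+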
+Scenarios:
+
+ +--------+----------------------------+-----------------------------------------+
+ | Method | Path                       | Description                             |
+ +========+============================+=========================================+
+ | GET    | /api/v1/scenarios          | Get the list of declared scenarios      |
+ +--------+----------------------------+-----------------------------------------+
+ | POST   | /api/v1/scenario           | Declare a new scenario                  |
+ +--------+----------------------------+-----------------------------------------+
+ | GET    | /api/v1/scenario?          | Get a declared scenario                 |
+ |        | <query conditions>         |                                         |
+ +--------+----------------------------+-----------------------------------------+
+
+
+The code of the API is hosted in the releng repository `[6]`_.
+The static documentation of the API can be found at `[17]`_.
+The test API has been dockerized and may be installed locally in your
+lab. See `[15]`_ for details.
+
+The deployment of the test API has been automated.
+A Jenkins job manages:
+
+ * the unit tests of the test API
+ * the creation of a new Docker image
+ * the deployment of the new test API
+ * the archiving of the old test API
+ * the backup of the Mongo DB
+
+Test API Authorization
+~~~~~~~~~~~~~~~~~~~~~~
+
+PUT/DELETE/POST operations of the testapi now require token-based
+authorization. The token needs to be added to the request using the
+'X-Auth-Token' header for access to the database, e.g.::
+
+    headers['X-Auth-Token']
+
+The value of the header, i.e. the token, can be accessed in the Jenkins
+environment variable *TestApiToken*. The token value is added as a
+masked password.
+
+.. code-block:: python
+
+ headers['X-Auth-Token'] = os.environ.get('TestApiToken')
+
+The above example is in Python. Token-based authentication has been
+added so that only the Jenkins jobs of CI pods can have access to the
+database.
+
+Please note that currently token authorization is implemented but is not yet enabled.
+
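+Putting it together, a result POST from a CI job could look as follows
+(a sketch: the URL and payload values are illustrative):
+
+.. code-block:: python
+
+    import json
+    import os
+
+    import requests
+
+    url = "http://testresults.opnfv.org/test/api/v1/results"
+    params = {"project_name": "functest", "case_name": "vping_ssh"}
+    headers = {
+        'Content-Type': 'application/json',
+        # Token exposed to the CI job as a masked Jenkins password
+        'X-Auth-Token': os.environ.get('TestApiToken'),
+    }
+    requests.post(url, data=json.dumps(params), headers=headers)
+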
+Automatic reporting
+===================
+
+An automatic reporting page has been created in order to provide a
+consistent view of the scenarios.
+On this page, each scenario is evaluated according to test criteria.
+The code for the automatic reporting is available at `[8]`_.
+
+The results are collected from the centralized database every day and,
+per scenario, a score is calculated based on the results from the last
+10 days. This score is the sum of the individual test scores. Each test
+case has a success criteria reflected in the criteria field of the
+results.
+
+Considering an instance of a scenario os-odl_l2-nofeature-ha, the
+scoring is the sum of the scores of all the runnable tests from the
+categories (tiers healthcheck, smoke and features)
+corresponding to this scenario.
+
+
+ +---------------------+---------+---------+---------+---------+
+ | Test                | Apex    | Compass | Fuel    | Joid    |
+ +=====================+=========+=========+=========+=========+
+ | vPing_ssh           | X       | X       | X       | X       |
+ +---------------------+---------+---------+---------+---------+
+ | vPing_userdata      | X       | X       | X       | X       |
+ +---------------------+---------+---------+---------+---------+
+ | tempest_smoke_serial| X       | X       | X       | X       |
+ +---------------------+---------+---------+---------+---------+
+ | rally_sanity        | X       | X       | X       | X       |
+ +---------------------+---------+---------+---------+---------+
+ | odl                 | X       | X       | X       | X       |
+ +---------------------+---------+---------+---------+---------+
+ | promise             |         |         | X       | X       |
+ +---------------------+---------+---------+---------+---------+
+ | doctor              | X       |         | X       |         |
+ +---------------------+---------+---------+---------+---------+
+ | security_scan       | X       |         |         |         |
+ +---------------------+---------+---------+---------+---------+
+ | parser              |         |         | X       |         |
+ +---------------------+---------+---------+---------+---------+
+ | copper              | X       |         |         | X       |
+ +---------------------+---------+---------+---------+---------+
+
+All the test cases listed in the table are runnable on
+os-odl_l2-nofeature scenarios.
+If no result is available or if all the results are failed, the test
+case gets 0 points.
+If it was successful at least once but not during the last 4 runs, the
+case gets 1 point (it worked once).
+If at least 3 of the last 4 runs were successful, the case gets 2
+points.
+If the last 4 runs of the test were successful, the test gets 3 points.
+
+In the example above, the target score for fuel/os-odl_l2-nofeature-ha
+is 3x6 = 18 points.
+
+The scenario is validated per installer when we get 3 points for all
+individual test cases (e.g. 18/18).
+Please note that complex or long-duration tests are not considered for
+the scoring. Their success criteria are not always easy to define and
+they may require specific hardware configurations. These results
+however provide a good level of trust in the scenario.
+
+A web page is automatically generated every day to display the status.
+This page can be found at `[9]`_. For the status, click on the Status
+menu; you may also get feedback for the vims and tempest_smoke_serial
+test cases.
+
+Any validated scenario is stored in a local file on the web server. In
+fact, as we are using a sliding window to get results, it may happen
+that a successful scenario is no longer run (because it is considered
+stable) and then the number of iterations (4 needed) would not be
+sufficient to get the green status.
+
+Please note that other test cases (e.g. sfc_odl, bgpvpn) also need ODL
+configuration add-ons and, as a consequence, specific scenarios.
+They are not considered as runnable on the generic odl_l2 scenario.
+
+Dashboard
+=========
+
+The Dashboard is used to provide a consistent view of the results
+collected in CI.
+The results shown on the dashboard are post-processed from the
+database, which only contains raw results.
+
+In Brahmaputra, we created a basic dashboard.
+Since Colorado, it was decided to adopt the ELK framework: Mongo DB
+results are extracted to feed an Elasticsearch database (`[7]`_).
+
+A script was developed to build the Elasticsearch data set. This
+script can be found in `[16]`_.
+
+For the next versions, it was decided to integrate the Bitergia
+dashboard. Bitergia already provides a dashboard for code and
+infrastructure. A new Test tab will be added. The dataset will be built
+by consuming the test API.
+
+
+=======
+How TOs
+=======
+
+How does Functest work?
+=======================
+
+The installation and configuration of the Functest docker image is
+described in `[1]`_.
+
+The procedure to start tests is described in `[2]`_.
+
+
+How can I contribute to Functest?
+=================================
+
+If you are already a contributor to any OPNFV project, you can
+contribute to Functest. If you are totally new to OPNFV, you must first
+create your Linux Foundation account, then contact us in order to be
+declared in the repository database.
+
+We distinguish 2 levels of contributors:
+
+ * the standard contributor can push patches and vote +1/0/-1 on any Functest patch
+ * the committer can vote -2/-1/0/+1/+2 and merge
+
+Functest committers are promoted by the Functest contributors.
+
+
+Where can I find some help to start?
+====================================
+
+This guide is made for you. You can also have a look at the project
+wiki page `[10]`_, which contains references to documentation, video
+tutorials, tips...
+
+You can also directly contact us by mail with [Functest] prefix in the
+title at opnfv-tech-discuss@lists.opnfv.org or on the IRC chan
+#opnfv-functest.
+
+
+What kind of testing do you do in Functest?
+===========================================
+
+Functest focuses on functional testing. The results must be PASS or
+FAIL. We do not deal with performance and/or qualification tests.
+We consider OPNFV as a black box and execute our tests from the
+jumphost according to the Pharos reference technical architecture.
+
+Upstream test suites are integrated (Rally/Tempest/ODL/ONOS, ...).
+If needed, Functest may temporarily bootstrap testing activities that
+are identified but not yet covered by an existing testing project (e.g.
+security_scan before the creation of the security repository).
+
+
+How are test constraints defined?
+=================================
+
+Test constraints are defined according to 2 parameters:
+
+ * The scenario (DEPLOY_SCENARIO env variable)
+ * The installer (INSTALLER_TYPE env variable)
+
+A scenario is a formal description of the system under test.
+The rules to define a scenario are described in `[4]`_.
+
+These 2 constraints are considered to determine whether the test is
+runnable or not (e.g. no need to run the onos suite on an odl scenario).
+
+In the test declaration for CI, the test owner shall indicate these 2
+constraints. The file testcases.yaml `[5]`_ must be patched in git to
+include new test cases. A more elaborate system based on templates is
+planned for future releases.
+
+For each dependency, it is possible to define a regex::
+
+    name: promise
+    criteria: 'success_rate == 100%'
+    description: >-
+        Test suite from Promise project.
+    dependencies:
+        installer: '(fuel)|(joid)'
+        scenario: ''
+
+In the example above, it means that the promise test will be runnable
+only with the fuel or joid installers, on any scenario.
+
+The vims criteria below means any installer, and excludes onos and odl
+with bgpvpn scenarios::
+
+    name: vims
+    criteria: 'status == "PASS"'
+    description: >-
+        This test case deploys an OpenSource vIMS solution from Clearwater
+        using the Cloudify orchestrator. It also runs some signaling traffic.
+    dependencies:
+        installer: ''
+        scenario: '(ocl)|(nosdn)|^(os-odl)((?!bgpvpn).)*$'
+
+
+How to write and check constraint regex?
+========================================
+
+Regexes are standard Python regexes; you can have a look at `[11]`_.
+
+You can also easily test your regex via an online regex checker such as
+`[12]`_.
+Put your scenario in the TEST STRING window (e.g. os-odl_l3-ovs-ha), put
+your regex in the REGULAR EXPRESSION window, then you can test your
+rule.
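+
+Checking a rule can also be done in a few lines of Python (using the
+vims scenario regex from the example above):
+
+.. code-block:: python
+
+    import re
+
+    scenario = "os-odl_l3-ovs-ha"
+    # Rule taken from the vims example: any os-odl scenario without bgpvpn
+    rule = '(ocl)|(nosdn)|^(os-odl)((?!bgpvpn).)*$'
+    if re.search(rule, scenario):
+        print("vims is runnable on " + scenario)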
+
+
+How to know which tests I can run?
+==================================
+
+You can use the API `[13]`_. The static declaration is in git `[5]`_.
+
+If you are in a Functest docker container (assuming that the
+environment has been prepared): just use the CLI.
+
+You can get the list per test case or per tier::
+
+ # functest testcase list
+ healthcheck
+ vping_ssh
+ vping_userdata
+ tempest_smoke_serial
+ rally_sanity
+ odl
+ doctor
+ security_scan
+ tempest_full_parallel
+ rally_full
+ vims
+ # functest tier list
+ - 0. healthcheck:
+       ['healthcheck']
+ - 1. smoke:
+       ['vping_ssh', 'vping_userdata', 'tempest_smoke_serial', 'rally_sanity']
+ - 2. sdn_suites:
+       ['odl']
+ - 3. features:
+       ['doctor', 'security_scan']
+ - 4. openstack:
+       ['tempest_full_parallel', 'rally_full']
+ - 5. vnf:
+       ['vims']
+
+
+How to manually start Functest tests?
+=====================================
+
+Assuming that you are connected to the jumphost and that the system is
+"Pharos compliant", i.e. the technical architecture is compatible with
+the one defined in the Pharos project::
+
+ # docker pull opnfv/functest:latest
+ # envs="-e INSTALLER_TYPE=fuel -e INSTALLER_IP=10.20.0.2 -e DEPLOY_SCENARIO=os-odl_l2-nofeature-ha -e CI_DEBUG=true"
+ # sudo docker run --privileged=true -id ${envs} opnfv/functest:latest /bin/bash
+
+
+Then you must connect to the docker container and source the
+credentials::
+
+ # docker ps (copy the id)
+ # docker exec -ti <container_id> bash
+ # source $creds
+
+
+You must first check if the environment is ready::
+
+ # functest env status
+ Functest environment ready to run tests.
+
+
+If not ready, prepare the env by launching::
+
+ # functest env prepare
+ Functest environment ready to run tests.
+
+Once the Functest env is ready, you can use the CLI to start tests.
+
+You can run tests per test case or per tier::
+
+ # functest testcase run <case name>
+ # functest tier run <tier name>
+
+
+e.g.::
+
+ # functest testcase run tempest_smoke_serial
+ # functest tier run features
+
+
+If you want to run all the tests you can type::
+
+ # functest testcase run all
+
+
+If you want to run all the tiers (which in the end is the same as
+running all the test cases) you can type::
+
+ # functest tier run all
+
+
+How to declare my tests in Functest?
+====================================
+
+If you want to add new internal test cases, you can submit a patch
+under the testcases directory of the Functest repository.
+
+For feature test integration, the code can be kept in your own
+repository. The Functest files to be modified are:
+
+ * functest/docker/Dockerfile: get your code in Functest container
+ * functest/ci/testcases.yaml: reference your test and its associated constraints
+
+
+Dockerfile
+----------
+
+This file lists the repositories (internal or external) to be cloned in
+the Functest container. You can also add external packages::
+
+ RUN git clone https://gerrit.opnfv.org/gerrit/<your project> ${REPOS_DIR}/<your project>
+
+testcases.yaml
+--------------
+
+All the test cases that must be run from CI / CLI must be declared in
+ci/testcases.yaml.
+
+This file is used to get the constraints related to the test::
+
+    name: <my_super_test_case>
+    criteria: <not used yet in Colorado, could be 'PASS', 'rate > 90%'>
+    description: >-
+        <the description of your super test suite>
+    dependencies:
+        installer: regex related to the installer, e.g. 'fuel', '(apex)|(joid)'
+        scenario: regex related to the scenario, e.g. 'ovs*no-ha'
+
+
+You must declare your test case in one of the categories (tiers).
+
+If you are integrating test suites from a feature project, the default
+category is **features**.
+
+
+How to select my list of tests for CI?
+======================================
+
+Functest can be run automatically from CI; a Jenkins job is usually
+called after a fresh OPNFV installation.
+By default we try to run all the possible tests (see `[14]`_ called
+from the Functest jenkins job)::
+
+ cmd="python ${FUNCTEST_REPO_DIR}/ci/run_tests.py -t all ${flags}"
+
+
+Each case can be configured as a daily and/or weekly task.
+Weekly tasks are used for long-duration or experimental tests.
+Daily tasks correspond to the minimum set of test suites needed to
+validate a scenario.
+
+When executing run_tests.py, a check based on the Jenkins build tag is
+performed to detect whether it is a daily and/or a weekly test.
+
+In your CI you can customize the list of tests you want to run by case
+or by tier; just change the line::
+
+ cmd="python ${FUNCTEST_REPO_DIR}/ci/run_tests.py -t <whatever you want> ${flags}"
+
+e.g.::
+
+ cmd="python ${FUNCTEST_REPO_DIR}/ci/run_tests.py -t healthcheck,smoke ${flags}"
+
+This command will run all the test cases of the first 2 tiers, i.e.
+healthcheck, connection_check, api_check, vping_ssh, vping_userdata,
+snaps_smoke, tempest_smoke_serial and rally_sanity.
+
+
+How to push your results into the Test Database
+===============================================
+
+The test database is used to collect test results. By default it is
+enabled only for CI tests from Production CI pods.
+
+The architecture and associated API is described in previous chapter.
+If you want to push your results from CI, you just have to call the API
+at the end of your script.
+
+You can also reuse a python function defined in functest_utils.py::
+
+    # json and requests are imported at the top of functest_utils.py;
+    # get_installer_type() is another helper from the same module.
+    import json
+
+    import requests
+
+    def push_results_to_db(db_url, case_name, logger, pod_name, version,
+                           payload):
+        """
+        POST results to the Result target DB
+        """
+        url = db_url + "/results"
+        installer = get_installer_type(logger)
+        params = {"project_name": "functest", "case_name": case_name,
+                  "pod_name": pod_name, "installer": installer,
+                  "version": version, "details": payload}
+
+        headers = {'Content-Type': 'application/json'}
+        try:
+            r = requests.post(url, data=json.dumps(params), headers=headers)
+            if logger:
+                logger.debug(r)
+            return True
+        except Exception, e:
+            print "Error [push_results_to_db('%s', '%s', '%s', '%s', '%s')]:" \
+                % (db_url, case_name, pod_name, version, payload), e
+            return False
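+
+A hedged usage example (the URL, version and payload values are
+illustrative; ``None`` is passed instead of a logger):
+
+.. code-block:: python
+
+    push_results_to_db("http://testresults.opnfv.org/test/api/v1",
+                       "vping_ssh", None, "pod_foo", "danube",
+                       {"status": "PASS"})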
+
+
+==========
+References
+==========
+
+_`[1]`: http://artifacts.opnfv.org/functest/docs/configguide/index.html Functest configuration guide
+
+_`[2]`: http://artifacts.opnfv.org/functest/docs/userguide/index.html functest user guide
+
+_`[3]`: https://wiki.opnfv.org/opnfv_test_dashboard Brahmaputra dashboard
+
+_`[4]`: https://wiki.opnfv.org/display/INF/CI+Scenario+Naming
+
+_`[5]`: https://git.opnfv.org/cgit/functest/tree/ci/testcases.yaml
+
+_`[6]`: https://git.opnfv.org/cgit/releng/tree/utils/test/result_collection_api
+
+_`[7]`: https://git.opnfv.org/cgit/releng/tree/utils/test/scripts
+
+_`[8]`: https://git.opnfv.org/cgit/releng/tree/utils/test/reporting/functest
+
+_`[9]`: http://testresults.opnfv.org/reporting/
+
+_`[10]`: https://wiki.opnfv.org/opnfv_functional_testing
+
+_`[11]`: https://docs.python.org/2/howto/regex.html
+
+_`[12]`: https://regex101.com/
+
+_`[13]`: http://testresults.opnfv.org/test/api/v1/projects/functest/cases
+
+_`[14]`: https://git.opnfv.org/cgit/releng/tree/jjb/functest/functest-daily.sh
+
+_`[15]`: https://git.opnfv.org/cgit/releng/tree/utils/test/result_collection_api/README.rst
+
+_`[16]`: https://git.opnfv.org/cgit/releng/tree/utils/test/scripts/mongo_to_elasticsearch.py
+
+_`[17]`: http://artifacts.opnfv.org/releng/docs/testapi.html
+
+OPNFV main site: http://www.opnfv.org
+
+OPNFV functional test page: https://wiki.opnfv.org/opnfv_functional_testing
+
+IRC support chan: #opnfv-functest
+
+_`OpenRC`: http://docs.openstack.org/user-guide/common/cli_set_environment_variables_using_openstack_rc.html
+
+_`Rally installation procedure`: https://rally.readthedocs.org/en/latest/tutorial/step_0_installation.html
+
+_`config_functest.yaml` : https://git.opnfv.org/cgit/functest/tree/testcases/config_functest.yaml
diff --git a/docs/testing/developer/internship/security_group/index.rst b/docs/testing/developer/internship/security_group/index.rst
new file mode 100644
index 000000000..d1cdbdd8f
--- /dev/null
+++ b/docs/testing/developer/internship/security_group/index.rst
@@ -0,0 +1,70 @@
+=======
+License
+=======
+
+Functest Docs are licensed under a Creative Commons Attribution 4.0
+International License.
+You should have received a copy of the license along with this.
+If not, see <http://creativecommons.org/licenses/by/4.0/>.
+
+==================================
+Functest Security group test cases
+==================================
+
+Author: Girish Sukhatankar
+Mentors: D.Blaisonneau, J.Lausuch, M.Richomme
+
+Abstract
+========
+
+
+Version history
+===============
+
++------------+----------+------------------+------------------------+
+| **Date**   | **Ver.** | **Author**       | **Comment**            |
+|            |          |                  |                        |
++------------+----------+------------------+------------------------+
+| 2016-??-?? | 0.0.1    | Morgan Richomme  | Beginning of the       |
+|            |          | (Orange)         | Internship             |
++------------+----------+------------------+------------------------+
+
+
+Overview:
+=========
+
+
+
+
+Problem Statement:
+------------------
+
+
+
+Curation Phase
+--------------
+
+
+
+
+
+Schedule:
+=========
+
+
+
++--------------------------+------------------------------------------+
+| **Date**                 | **Comment**                              |
+|                          |                                          |
++--------------------------+------------------------------------------+
+| December - January       | ........                                 |
++--------------------------+------------------------------------------+
+| January - February       | ........                                 |
++--------------------------+------------------------------------------+
+
+
+References:
+===========
+
+.. _`[1]`: https://wiki.opnfv.org/display/DEV/Intern+Project%3A+Security+groups+test+case+in+Functest
+
diff --git a/docs/testing/developer/internship/testapi_evolution/index.rst b/docs/testing/developer/internship/testapi_evolution/index.rst
new file mode 100644
index 000000000..6a1cde7df
--- /dev/null
+++ b/docs/testing/developer/internship/testapi_evolution/index.rst
@@ -0,0 +1,237 @@
+=======
+License
+=======
+
+Functest Docs are licensed under a Creative Commons Attribution 4.0
+International License.
+You should have received a copy of the license along with this.
+If not, see <http://creativecommons.org/licenses/by/4.0/>.
+
+==================
+Test API evolution
+==================
+
+Author: Sakala Venkata Krishna Rohit
+Mentors: S. Feng, J.Lausuch, M.Richomme
+
+Abstract
+========
+
+The testapi is used by all the OPNFV test projects to report results.
+It is also used to declare projects, test cases and labs. A major
+refactoring was done in Colorado with the introduction of swagger. The
+testapi is described in the Functest developer guide. The purpose of
+this project is to add more features to the testapi that automate the
+tasks that were previously done manually, though there are also tasks
+other than automation.
+
+Version history
+===============
+
++------------+----------+------------------+------------------------+
+| **Date**   | **Ver.** | **Author**       | **Comment**            |
+|            |          |                  |                        |
++------------+----------+------------------+------------------------+
+| 2016-11-14 | 0.0.1    | Morgan Richomme  | Beginning of the       |
+|            |          | (Orange)         | Internship             |
++------------+----------+------------------+------------------------+
+| 2017-02-17 | 0.0.2    | S.V.K Rohit      | End of the Internship  |
+|            |          | (IIIT Hyderabad) |                        |
++------------+----------+------------------+------------------------+
+
+Overview:
+=========
+
+The internship time period was from Nov 14th to Feb 17th. The project
+proposal page is here: `[1]`_.
+The intern project was assigned to Svk Rohit and was mentored by
+S. Feng, J.Lausuch and M.Richomme.
+The link to the submitted patches is `[2]`_. The internship was
+successfully completed and the documentation is as follows.
+
+Problem Statement:
+------------------
+
+The problem statement can be divided into the pending features that
+needed to be added to the testapi repo. The following were to be
+accomplished within the internship time frame.
+
+* **Add a verification Jenkins job for the testapi code**
+  The purpose of this job is to verify whether the unit tests are
+  successful or not with the inclusion of the submitted patchset.
+
+* **Automatic update of the opnfv/testapi Docker image**
+  The Docker image of the testapi is hosted in the OPNFV Docker hub. To
+  ensure that the testapi image is always in sync with the repository,
+  an automatic update of the image is necessary, so a job is triggered
+  whenever a new patch gets merged.
+
+* **Automated deployment of the testresults.opnfv.org/test/ website**
+  In the same manner as the testapi Docker image is updated, the
+  testapi website needs to be in sync with the repository code. So, a
+  job has been added to the OPNFV Jenkins CI to update the testresults
+  website.
+
+* **Generate static documentation of the testapi calls**
+  The purpose of this is to give a static/offline view of the testapi.
+  If someone wants to have a look at the Restful APIs of the testapi,
+  they don't need to go to the website; they can download an HTML page
+  and view it anytime.
+
+* **Backup of the testapi MongoDB**
+  The MongoDB needs to be backed up every week. Until now it was done
+  manually; as part of this internship, it is now automated using a
+  Jenkins job.
+
+* **Add token-based authorization to the testapi calls**
+  Token-based authorization was implemented to ensure that only CI pods
+  can access the database. Authentication has been added to
+  delete/put/post requests only.
+
+Curation Phase:
+---------------
+
+The curation phase covered the first 3 to 4 weeks of the internship.
+This phase was to get familiar with the testapi code and functionality
+and to propose solutions/tools for the tasks mentioned above. Swagger
+codegen was chosen out of the four proposed tools `[3]`_ for generating
+the static documentation.
+
+Also, a specific amount of time was spent on the script flow of the
+Jenkins jobs. The automatic deployment task involves accessing a remote
+server from inside the Jenkins build. The deployment had to be done
+only after the Docker image update was finished. For these constraints
+to be satisfied, a multijob Jenkins job was chosen instead of a
+freestyle job.
+
+Important Links:
+----------------
+
+* MongoDB Backup Link - `[4]`_
+* Static Documentation - `[5]`_
+* TestAPI Token addition to ci_pods - `[6]`_
+
+Schedule:
+=========
+
+The progress and completion of the tasks is described in the table below.
+
++--------------------------+------------------------------------------+
+| **Date**                 | **Comment**                              |
+|                          |                                          |
++--------------------------+------------------------------------------+
+| Nov 14th - Dec 31st      | Understand Testapi code and the          |
+|                          | requirements.                            |
++--------------------------+------------------------------------------+
+| Jan 1st - Jan 7th        | Add jenkins job to create static         |
+|                          | documentation and write build scripts.   |
++--------------------------+------------------------------------------+
+| Jan 8th - Jan 21st       | Add verification jenkins job for unit    |
+|                          | tests.                                   |
++--------------------------+------------------------------------------+
+| Jan 22nd - Jan 28th      | Add jenkins job for mongodb backup       |
+|                          |                                          |
++--------------------------+------------------------------------------+
+| Jan 29th - Feb 11th      | Enable automatic deployment of           |
+|                          | testresults.opnfv.org/test/              |
++--------------------------+------------------------------------------+
+| Feb 12th - Feb 17th      | Add token based authentication           |
+|                          |                                          |
++--------------------------+------------------------------------------+
+
+FAQs
+====
+
+This section lists the problems that I faced and the understanding that
+I acquired during the internship. It may help other developers in
+solving errors caused by the code written as a part of this internship.
+
+
+Test Api
+--------
+
+What is the difference between defining data_files as "/etc/.." and "etc/.." in setup.cfg ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+If in the setup.cfg it is defined as::
+
+    [files]
+    data_files =
+        etc/a.conf = etc/a.conf.sample
+
+then it ends up installed under /usr/etc/. With this configuration, it
+would be installed correctly within a venv. But when it is defined as::
+
+    [files]
+    data_files =
+        /etc/a.conf = etc/a.conf.sample
+
+then it ends up installed at the root of the filesystem instead of
+being properly installed within the venv.
+
+Which attribute does swagger-codegen use as the title in the generated document ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+It uses the nickname of the API call in swagger as the title in the
+generated document.
+
+Does swagger-codegen take more than one yaml file as input ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+No, swagger-codegen only takes one yaml file as input to its jar file.
+If there is more than one yaml file, one needs to merge them and give
+the result as input, keeping in mind the swagger specs.
+
+
+Jenkins & JJB
+-------------
+
+Which scm macro is used for verification jenkins jobs ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+There are two macros for scm: one is git-scm and the other is
+git-scm-gerrit. git-scm-gerrit is used for verification jenkins jobs.
+
+Does the virtualenv created in one build script exist in other build scripts too ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+No, the virtualenv created in one build script only exists in that
+build script/shell.
+
+What parameters are needed for the scm macros ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Project and Branch are the two parameters needed for scm macros.
+
+What is the directory inside the jenkins build ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The working directory of the jenkins build is the directory of the
+cloned repo. The ``ls $WORKSPACE`` command will give you all the
+contents of the directory.
+
+How to include a bash script in jenkins job yaml file ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+An example might be apt here as an answer::
+
+    builders:
+        - shell:
+            !include-raw: include-raw001-hello-world.sh
+
+
+How do you make a build server run on a specific machine ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+It can be done by defining a label parameter 'SLAVE_LABEL' or, in
+OPNFV, there are macros for each server and one can use those parameter
+macros, e.g. opnfv-build-defaults. Note that if we use a macro, there
+is no need to define GIT_BASE, but if one uses SLAVE_LABEL, one needs
+to define a GIT_BASE parameter. This is because the macro already has
+GIT_BASE defined.
+
+What job style should be used when one build should trigger other builds or when different build scripts need to be run on different machines ?
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The MultiJob style should be used, as it has phases where each phase
+can be taken as a build script and can have its own parameters, by
+which one can define the SLAVE_LABEL parameter.
+
+References:
+===========
+
+_`[1]` : https://wiki.opnfv.org/display/DEV/Intern+Project%3A+testapi+evolution
+
+_`[2]` : https://gerrit.opnfv.org/gerrit/#/q/status:merged+owner:%22Rohit+Sakala+%253Crohitsakala%2540gmail.com%253E%22
+
+_`[3]` : https://docs.google.com/document/d/1jWwVZ1ZpKgKcOS_zSz2KzX1nwg4BXxzBxcwkesl7krw/edit?usp=sharing
+
+_`[4]` : http://artifacts.opnfv.org/testapibackup.html
+
+_`[5]` : http://artifacts.opnfv.org/releng/docs/testapi.html
+
+_`[6]` : http://artifacts.opnfv.org/functest/docs/devguide/index.html#test-api-authorization
diff --git a/docs/testing/developer/internship/unit_tests/index.rst b/docs/testing/developer/internship/unit_tests/index.rst
new file mode 100644
index 000000000..f969aa72d
--- /dev/null
+++ b/docs/testing/developer/internship/unit_tests/index.rst
@@ -0,0 +1,70 @@
+=======
+License
+=======
+
+Functest Docs are licensed under a Creative Commons Attribution 4.0
+International License.
+You should have received a copy of the license along with this.
+If not, see <http://creativecommons.org/licenses/by/4.0/>.
+
+===================
+Functest Unit tests
+===================
+
+Author: Ashish Kumar
+Mentors: H.Yao, J.Lausuch, M.Richomme
+
+Abstract
+========
+
+
+Version history
+===============
+
++------------+----------+------------------+------------------------+
+| **Date**   | **Ver.** | **Author**       | **Comment**            |
+|            |          |                  |                        |
++------------+----------+------------------+------------------------+
+| 2016-??-?? | 0.0.1    | Morgan Richomme  | Beginning of the       |
+|            |          | (Orange)         | Internship             |
++------------+----------+------------------+------------------------+
+
+
+Overview:
+=========
+
+
+
+
+Problem Statement:
+------------------
+
+
+
+Curation Phase
+--------------
+
+
+
+
+
+Schedule:
+=========
+
+
+
++--------------------------+------------------------------------------+
+| **Date**                 | **Comment**                              |
+|                          |                                          |
++--------------------------+------------------------------------------+
+| December - January       | ........                                 |
++--------------------------+------------------------------------------+
+| January - February       | ........                                 |
++--------------------------+------------------------------------------+
+
+
+References:
+===========
+
+.. _`[1]`: https://wiki.opnfv.org/display/DEV/Intern+Project%3A+Functest+unit+tests
+
diff --git a/docs/testing/developer/internship/vnf_catalog/index.rst b/docs/testing/developer/internship/vnf_catalog/index.rst
new file mode 100644
index 000000000..df7633391
--- /dev/null
+++ b/docs/testing/developer/internship/vnf_catalog/index.rst
@@ -0,0 +1,170 @@
+=======
+License
+=======
+
+Functest Docs are licensed under a Creative Commons Attribution 4.0
+International License.
+You should have received a copy of the license along with this.
+If not, see <http://creativecommons.org/licenses/by/4.0/>.
+
+=======================
+Open Source VNF Catalog
+=======================
+
+Author: Kumar Rishabh
+Mentors: B.Souville, M.Richomme, J.Lausuch
+
+Abstract
+========
+
+
+
+Version history
+===============
+
++------------+----------+------------------+------------------------+
+| **Date**   | **Ver.** | **Author**       | **Comment**            |
+|            |          |                  |                        |
++------------+----------+------------------+------------------------+
+| 2016-12-12 | 0.0.1    | Morgan Richomme  | Beginning of the       |
+|            |          | (Orange)         | Internship             |
++------------+----------+------------------+------------------------+
+
+
+Overview:
+=========
+
+
+This project aims to create an Open Source catalog for the reference
+and classification of Virtual Network Functions (VNFs) available on the
+Internet. The classification method proposed will be in sync with the
+requirements of Telcos active in the NFV landscape. The project aims to
+have a running web platform similar to `[1]`_ by the middle of the
+internship (2nd week of March). By the penultimate month of the
+internship I aim to have a fully functional implementation of an Open
+Source VNF in Functest.
+
+
+Problem Statement:
+------------------
+
+OPNFV aims to be the reference platform for development,
+standardization and integration of Open Source NFV components across
+various Open Source Platforms. It mainly deals with the infrastructure
+through the Network Function Virtualization Infrastructure (NFVI) and
+Virtual Infrastructure manager (VIM). The MANO (Management and
+orchestration) stacks have been introduced recently. VNFs are not
+directly in OPNFV scope, however VNFs are needed to test and qualify the
+infrastructure. In this regard having a common curated Open Source
+Reference VNF catalog would be of immense importance to community.
+
+Since the major focus of OPNFV is Telcos, a curated platform targeted
+from an industry point of view would be very useful. We plan to divide
+the entire project into three major phases (with some iterative
+improvements and overlaps).
+
+
+Curation Phase
+--------------
+This phase pertains to studying the various Open Source VNFs available
+and classifying them based on certain parameters. The parameters that I
+currently have in mind are:
+
+ * Developer Metrics: these pertain to the repo characteristics of the
+   VNF under study.
+
+   * Usage Statistics - activity, number of commits, stars
+   * Maturity Statistics - for instance, if an NFV community decides
+     code coverage is important for them, it shows the community is
+     serious about taking the project forward
+
+ * Technical Tagging: these are the tags that pertain to the technical
+   characteristics of a VNF.
+
+   * Broad Use Cases - whether the VNF fits strictly in the IaaS, PaaS
+     or SaaS layer or is a hybrid of two/all.
+   * Generic Use Cases - this in my opinion is the broadest
+     classification category. For instance, a VNF could be built with a
+     broad idea of powering IoT devices at home or from the usage
+     perspective of Telco operators (vFW, vEPC, vIMS, vCDN, vAAA,
+     vCPE, ...). `[2]`_
+   * Fields of Application
+   * Library Status - whether APIs are standardized and support RESTful
+     services.
+   * Dependency Forwarding Graph - this is a pretty complex tagging
+     mechanism. It essentially tries to establish a graph relationship
+     between the VNFs (elementary VNFs are used in Service Function
+     Chaining chains such as Firewall, DPI, content enrichment, ...).
+     In my opinion this is immensely useful. It will allow users to go
+     to the platform and ask a question like: “I have this X tech stack
+     to support, Y and Z are my use cases, which VNFs should I use to
+     support this?”
+
+ * Visitor Score - based on `[1]`_ I plan to evolve a visitor score for
+   the platform. This will allow users to score a VNF on certain
+   parameters and maybe post comments.
+
+**I plan to use the above three scores and evolve a cumulative score,
+which will be displayed next to each VNF on the platform.**
+
+ * Platform building phase - this will involve erecting a web platform
+   similar to `[1]`_. I am decently familiar with Django and hence I
+   will write the platform in Django. There are two action plans that I
+   have in mind right now: either I start writing the platform
+   simultaneously, which will help keep track of my progress, or I
+   write the platform after 1.5 - 2 months into the internship. Either
+   way I aim to have the web platform ready by March 12.
+
+ * Functest VNF implementation phase - this is the last phase and will
+   involve writing a fully functional implementation of an Open Source
+   VNF in Functest. I will undertake this after I am 3 months into the
+   internship. I have a decent familiarity with Python and hence I
+   think it shouldn’t be too difficult. I need to decide how complex a
+   VNF I should undertake this exercise for (e.g. AAA such as
+   FreeRADIUS sounds relatively easy, vCDN is much more challenging).
+   This will be decided in agreement with my mentors.
+
+
+
+
+Schedule:
+=========
+I plan to take this project in a 6-month time frame as I want to use
+it as a chance to read more about NFV in particular and SDN in general.
+
+
++--------------------------+------------------------------------------+
+| **Date**                 | **Comment**                              |
+|                          |                                          |
++--------------------------+------------------------------------------+
+| December 12 - January 12 | Study the above mentioned metrics.       |
+|                          | Decide which of them are important for   |
+|                          | the community (and which are not).       |
++--------------------------+------------------------------------------+
+| January 12 - January 27  | Make a database for the above studied    |
+|                          | metrics and evolve it further based on   |
+|                          | my mentors’ input, plus associated API.  |
++--------------------------+------------------------------------------+
+| January 27 - February 5  | Compile the data collected above and     |
+|                          | make it public. Although I could keep    |
+|                          | everything public from the beginning     |
+|                          | too, my rationale for not making the     |
+|                          | entire data public at the initial stage  |
+|                          | is that errors caused by me could be     |
+|                          | misleading for developers.               |
++--------------------------+------------------------------------------+
+| February 5 - March 5     | Erect the web platform and release it    |
+|                          | to a restricted group for alpha testing. |
++--------------------------+------------------------------------------+
+| March 5 - March 12       | Make it public. Release it to the public |
+|                          | for beta testing. Fix bugs.              |
++--------------------------+------------------------------------------+
+| March 12 - April 12      | Start working on the implementation of   |
+|                          | an Open Source VNF in Functest.          |
++--------------------------+------------------------------------------+
+| April 12 - May 12        | I will decide what to do here based on   |
+|                          | discussion with my mentors.              |
++--------------------------+------------------------------------------+
+
+
+References:
+===========
+
+.. _`[1]`: https://www.openhub.net/explore/projects
+
+.. _`[2]`: https://portal.etsi.org/Portals/0/TBpages/NFV/Docs/NFV_White_Paper3.pdf
+
+.. _`[3]`: https://wiki.opnfv.org/display/DEV/Intern+Project%3A+Open+Source+VNF+catalog