Diffstat (limited to 'docs/testing')
-rw-r--r--  docs/testing/developer/devguide/index.rst                        1
-rw-r--r--  docs/testing/developer/devguide/web.rst                        100
-rw-r--r--  docs/testing/user/configguide/index.rst                          1
-rw-r--r--  docs/testing/user/configguide/web.rst                           74
-rw-r--r--  docs/testing/user/userguide/index.rst                            3
-rw-r--r--  docs/testing/user/userguide/network.rst                        115
-rw-r--r--  docs/testing/user/userguide/network_testcase_description.rst   127
-rw-r--r--  docs/testing/user/userguide/web.rst                             70
8 files changed, 491 insertions, 0 deletions
diff --git a/docs/testing/developer/devguide/index.rst b/docs/testing/developer/devguide/index.rst
index 0b583cc5..ab411005 100644
--- a/docs/testing/developer/devguide/index.rst
+++ b/docs/testing/developer/devguide/index.rst
@@ -16,5 +16,6 @@ QTIP Developer Guide
framework.rst
cli.rst
api.rst
+ web.rst
compute-qpi.rst
storage-qpi.rst
diff --git a/docs/testing/developer/devguide/web.rst b/docs/testing/developer/devguide/web.rst
new file mode 100644
index 00000000..ae4e3156
--- /dev/null
+++ b/docs/testing/developer/devguide/web.rst
@@ -0,0 +1,100 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+
+
+***************************************
+Web Portal for Benchmarking Services
+***************************************
+
+QTIP consists of different tools (metrics) to benchmark the NFVI. These metrics
+fall under different NFVI subsystems (QPIs) such as compute, storage and network.
+QTIP benchmarking tasks are built upon `Ansible`_ playbooks and roles.
+The QTIP web portal is a platform to expose QTIP as a benchmarking service hosted on a central host.
+
+Framework
+=========
+
+The web portal has been developed on the Python `Django`_ framework. Dig into the documentation to learn more about Django.
+
+Design
+======
+
+Django is an MTV (Model Template View) framework. Database objects are mapped to models in
+``models.py``. Views handle requests from the client side and interact with the database using
+the Django ORM. Templates are responsible for UI rendering based on the response context from
+views.
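
The MTV flow described above can be illustrated with a minimal, framework-free sketch (plain Python standing in for Django, so the class and function names here are purely illustrative, not the portal's actual code):

```python
from string import Template

# "Model": a plain object standing in for a database-backed Django model
class Repo:
    def __init__(self, name, git_link):
        self.name = name
        self.git_link = git_link

# "View": handles a request, queries the model layer and builds a response context
def repo_list_view(repos):
    return {"repos": [(r.name, r.git_link) for r in repos]}

# "Template": renders the UI from the response context
def render(context):
    row = Template("$name -> $link")
    return "\n".join(row.substitute(name=n, link=l)
                     for n, l in context["repos"])

repos = [Repo("qtip", "https://git.opnfv.org/qtip")]
print(render(repo_list_view(repos)))  # qtip -> https://git.opnfv.org/qtip
```

In real Django the same separation holds: models live in ``models.py``, views build the context, and templates render it.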
+
+Models
+------
+
+Repo
+~~~~
+
+Model for `workspace`_ repos
+
+::
+
+    Repo:
+        name
+        git_link
+
+
+Task
+~~~~
+
+Tasks keep track of every benchmark run through the QTIP-Web services. Whenever you run a
+benchmark, a new task is created which keeps track of time stats and logs the task progress
+and Ansible output for the respective playbook.
+
+::
+
+    Task:
+        start_time
+        end_time
+        status
+        run_time
+        repo
+        log
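
As a rough sketch (a plain dataclass, not the actual Django model), the fields above might relate like this, with ``run_time`` derived from the start and end timestamps; the status values are illustrative assumptions:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Task:
    repo: str
    start_time: datetime
    end_time: Optional[datetime] = None
    status: str = "RUNNING"   # illustrative status values
    log: str = ""             # captured Ansible output

    @property
    def run_time(self) -> timedelta:
        # Still-running tasks are measured up to "now"
        end = self.end_time or datetime.now()
        return end - self.start_time

task = Task(repo="qtip-workspace",
            start_time=datetime(2018, 1, 1, 12, 0, 0),
            end_time=datetime(2018, 1, 1, 12, 5, 30),
            status="SUCCESS")
print(task.run_time)  # 0:05:30
```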
+
+
+Views
+-----
+
+Dashboard
+~~~~~~~~~
+
+ - Base class - TemplateView
+
+Class based view serving as the home page for the application.
+
+
+ReposView
+~~~~~~~~~
+
+ - Base class - LoginRequiredMixin, CreateView
+
+Class based view for listing repos and adding new ones.
+
+
+RepoUpdate
+~~~~~~~~~~
+
+ - Base class - LoginRequiredMixin, UpdateView
+
+Class based view for listing and updating an existing repo's details.
+
+Both ReposView and RepoUpdate use the same template ``repo_form.html``. The context has an extra
+variable ``template_role`` which is used to distinguish whether the repo form is for a create or
+an edit operation.
+
+
+Run
+~~~
+
+ - Base class - LoginRequiredMixin, View
+ - template name - run.html
+
+Class based view for adding a new task and running a benchmark based on the task details. The
+logs are saved in the ``logs/run_<log_id>`` directory.
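
The log directory naming can be sketched as follows (the ``logs/run_<log_id>`` scheme comes from the text; the helper function itself is hypothetical):

```python
import os

def task_log_dir(log_id, base="logs"):
    # Each task's Ansible output lands under logs/run_<log_id>
    return os.path.join(base, "run_{}".format(log_id))

print(task_log_dir(7))  # e.g. logs/run_7
```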
+
+
+.. _Ansible: https://www.ansible.com/
+.. _Django: https://docs.djangoproject.com/en/1.11/
+.. _workspace: https://github.com/opnfv/qtip/blob/master/docs/testing/developer/devguide/ansible.rst#create-workspace
diff --git a/docs/testing/user/configguide/index.rst b/docs/testing/user/configguide/index.rst
index 9c72ecd2..fa893e5e 100644
--- a/docs/testing/user/configguide/index.rst
+++ b/docs/testing/user/configguide/index.rst
@@ -12,3 +12,4 @@ QTIP Installation Guide
:maxdepth: 2
./configuration.rst
+ ./web.rst
diff --git a/docs/testing/user/configguide/web.rst b/docs/testing/user/configguide/web.rst
new file mode 100644
index 00000000..83365abe
--- /dev/null
+++ b/docs/testing/user/configguide/web.rst
@@ -0,0 +1,74 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+
+
+***************************************
+Web Portal installation & configuration
+***************************************
+
+The Web Portal for Benchmarking is developed on the Python `Django`_ framework. Right now the
+installation needs to be done from source.
+
+
+
+Clone QTIP Repo
+===============
+
+::
+
+ git clone https://github.com/opnfv/qtip.git
+
+
+Setup database and Initialize user data
+=======================================
+
+Change into the ``web`` directory
+---------------------------------
+
+::
+
+ cd qtip/qtip/web
+
+
+Setup migrations
+----------------
+
+::
+
+ python manage.py makemigrations
+
+
+In the usual case migrations will already be available with the source. The console will notify
+you of this.
+
+Run migrations
+--------------
+
+::
+
+ python manage.py migrate
+
+
+Create superuser
+----------------
+::
+
+ python manage.py createsuperuser
+
+
+The console will prompt for adding a new web admin. Enter the new credentials.
+
+
+
+Collecting Static Dependencies
+------------------------------
+::
+
+ python manage.py importstatic
+
+
+This will import the JS and CSS dependencies for the UI into the static directory. Now the web
+application is ready to run.
+
+
+.. _Django: https://docs.djangoproject.com/en/1.11/
diff --git a/docs/testing/user/userguide/index.rst b/docs/testing/user/userguide/index.rst
index fbfdd394..e05a5e90 100644
--- a/docs/testing/user/userguide/index.rst
+++ b/docs/testing/user/userguide/index.rst
@@ -15,5 +15,8 @@ QTIP User Guide
getting-started.rst
cli.rst
api.rst
+ web.rst
compute.rst
storage.rst
+ network.rst
+ network_testcase_description.rst
diff --git a/docs/testing/user/userguide/network.rst b/docs/testing/user/userguide/network.rst
new file mode 100644
index 00000000..4d48d4d5
--- /dev/null
+++ b/docs/testing/user/userguide/network.rst
@@ -0,0 +1,115 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) 2018 Spirent Communications Corp.
+
+
+********************************
+Network Performance Benchmarking
+********************************
+Like the compute and storage QPIs, the network QPI gives users an overall score for system
+network performance. For now it focuses on L2 virtual switch performance on the NFVI. The
+current test cases come from the RFC2544 standard and the implementation is based on Spirent
+TestCenter Virtual.
+
+For now, the network QPI runs against the baremetal/virtual scenario deployed by
+the OPNFV installer `APEX`_.
+
+Getting started
+===============
+Notice: All descriptions are based on containers.
+
+Requirements
+------------
+
+* Git must be installed.
+* Docker and docker-compose must be installed.
+* The Spirent TestCenter Virtual image must be uploaded to the target cloud and the
+  associated flavor must be created before the test.
+* The Spirent License Server and Spirent LabServer must be set up and kept IP
+  reachable from the target cloud external network before the test.
+
+Git Clone QTIP Repo
+-------------------
+
+::
+
+ git clone https://git.opnfv.org/qtip
+
+Running QTIP container and Nettest Containers
+----------------------------------------------
+
+With Docker Compose, we can use a YAML file to configure the application's services and
+use a single command to create and start all the services.
+
+There is a YAML file ``./qtip/tests/ci/network/docker-compose.yaml`` in the QTIP repo.
+It helps you create and start the network QPI service.
+
+Before running docker-compose, you must specify these three variables:
+
+* DOCKER_TAG, which specifies the Docker tag (e.g. latest)
+* SSH_CREDENTIALS, a directory which includes an SSH key pair and will be mounted into the QTIP
+  container. QTIP uses this SSH key pair to connect to remote hosts.
+* ENV_FILE, which includes the environment variables required by the QTIP and Storperf containers
+
+  An example of ENV_FILE:
+
+ ::
+
+ INSTALLER_TYPE=apex
+ INSTALLER_IP=192.168.122.247
+ TEST_SUITE=network
+ NODE_NAME=zte-virtual5
+ SCENARIO=generic
+ TESTAPI_URL=
+ OPNFV_RELEASE=euphrates
+ # The below environment variables are Openstack Credentials.
+ OS_USERNAME=admin
+ OS_USER_DOMAIN_NAME=Default
+ OS_PROJECT_DOMAIN_NAME=Default
+ OS_BAREMETAL_API_VERSION=1.29
+ NOVA_VERSION=1.1
+ OS_PROJECT_NAME=admin
+ OS_PASSWORD=ZjmZJmkCvVXf9ry9daxgwmz3s
+ OS_NO_CACHE=True
+ COMPUTE_API_VERSION=1.1
+ no_proxy=,192.168.37.10,192.0.2.5
+ OS_CLOUDNAME=overcloud
+ OS_AUTH_URL=http://192.168.37.10:5000/v3
+ IRONIC_API_VERSION=1.29
+ OS_IDENTITY_API_VERSION=3
+ OS_AUTH_TYPE=password
+ # The below environment variables are extra info with Spirent.
+ SPT_LICENSE_SERVER_IP=192.168.37.251
+ SPT_LAB_SERVER_IP=192.168.37.122
+ SPT_STCV_IMAGE_NAME=stcv-4.79
+ SPT_STCV_FLAVOR_NAME=m1.tiny
+
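
The ENV_FILE is a plain ``KEY=value`` file as shown above. A small illustrative parser (docker-compose does its own parsing; this sketch only shows the file's structure):

```python
def parse_env_file(text):
    """Parse KEY=value lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """\
INSTALLER_TYPE=apex
# The below environment variables are Openstack Credentials.
OS_USERNAME=admin
TESTAPI_URL=
"""
env = parse_env_file(sample)
print(env["INSTALLER_TYPE"], repr(env["TESTAPI_URL"]))  # apex ''
```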
+Then, use the following commands to start the network QPI service.
+
+::
+
+ docker-compose -f docker-compose.yaml pull
+ docker-compose -f docker-compose.yaml up -d
+
+Execution
+---------
+
+You can run the network QPI with docker exec:
+::
+
+ docker exec <qtip container> bash -x /home/opnfv/repos/qtip/qtip/scripts/quickstart.sh
+
+QTIP generates results in the ``$PWD/results/`` directory, where they are listed
+under a timestamped name.
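
The timestamped layout can be sketched like this (the exact timestamp format QTIP uses is an assumption here):

```python
import os
from datetime import datetime

def results_dir(base="results", now=None):
    # Each run's results are grouped under a timestamped sub-directory
    stamp = (now or datetime.now()).strftime("%Y-%m-%d-%H-%M")
    return os.path.join(base, stamp)

print(results_dir(now=datetime(2018, 3, 9, 14, 30)))  # e.g. results/2018-03-09-14-30
```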
+
+Metrics
+-------
+
+Nettest provides the following `metrics`_:
+
+* RFC2544 throughput
+* RFC2544 latency
+
+
+.. _APEX: https://wiki.opnfv.org/display/apex
+.. _metrics: https://tools.ietf.org/html/rfc2544
+
diff --git a/docs/testing/user/userguide/network_testcase_description.rst b/docs/testing/user/userguide/network_testcase_description.rst
new file mode 100644
index 00000000..66fda073
--- /dev/null
+++ b/docs/testing/user/userguide/network_testcase_description.rst
@@ -0,0 +1,127 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) 2018 Spirent Communications Corp.
+.. Template to be used for test case descriptions in QTIP Project.
+
+
+Test Case Description
+=====================
+
++-----------------------------------------------------------------------------+
+|Network throughput                                                           |
++==============+==============================================================+
+|test case id  | qtip_throughput                                              |
++--------------+--------------------------------------------------------------+
+|metric        | rfc2544 throughput                                           |
++--------------+--------------------------------------------------------------+
+|test purpose  | get the max throughput of the pathway on same host or across |
+|              | hosts                                                        |
++--------------+--------------------------------------------------------------+
+|configuration | None                                                         |
++--------------+--------------------------------------------------------------+
+|test tool     | Spirent TestCenter Virtual                                   |
++--------------+--------------------------------------------------------------+
+|references    | RFC2544                                                      |
++--------------+--------------------------------------------------------------+
+|applicability | 1. test the switch throughput on same host or across hosts   |
+|              | 2. test the switch throughput for different packet sizes     |
++--------------+--------------------------------------------------------------+
+|pre-test      | 1. deploy STC license server and LabServer on public network |
+|conditions    |    and verify it can operate correctly                       |
+|              | 2. upload STC virtual image and create STCv flavor on the    |
+|              |    deployed cloud environment                                |
++--------------+------+----------------------------------+--------------------+
+|test sequence | step | description                      | result             |
+|              +------+----------------------------------+--------------------+
+|              | 1    | deploy STCv stack on the target  | 2 STCv VMs will be |
+|              |      | cloud with affinity attribute    | established on the |
+|              |      | according to requirements        | cloud              |
+|              +------+----------------------------------+--------------------+
+|              | 2    | run rfc2544 throughput test with | test result report |
+|              |      | different packet sizes           | will be produced in|
+|              |      |                                  | the QTIP container |
+|              +------+----------------------------------+--------------------+
+|              | 3    | destroy STCv stack               | STCv stack         |
+|              |      |                                  | destroyed          |
++--------------+------+----------------------------------+--------------------+
+|test verdict  | find the test result report in the QTIP container running    |
+|              | directory                                                    |
++--------------+--------------------------------------------------------------+
+
++-----------------------------------------------------------------------------+
+|Network latency                                                              |
++==============+==============================================================+
+|test case id  | qtip_latency                                                 |
++--------------+--------------------------------------------------------------+
+|metric        | rfc2544 latency                                              |
++--------------+--------------------------------------------------------------+
+|test purpose  | get the latency value of the pathway on same host or across  |
+|              | hosts                                                        |
++--------------+--------------------------------------------------------------+
+|configuration | None                                                         |
++--------------+--------------------------------------------------------------+
+|test tool     | Spirent TestCenter Virtual                                   |
++--------------+--------------------------------------------------------------+
+|references    | RFC2544                                                      |
++--------------+--------------------------------------------------------------+
+|applicability | 1. test the switch latency on same host or across hosts      |
+|              | 2. test the switch latency for different packet sizes        |
++--------------+--------------------------------------------------------------+
+|pre-test      | 1. deploy STC license server and LabServer on public network |
+|conditions    |    and verify it can operate correctly                       |
+|              | 2. upload STC virtual image and create STCv flavor on the    |
+|              |    deployed cloud environment                                |
++--------------+------+----------------------------------+--------------------+
+|test sequence | step | description                      | result             |
+|              +------+----------------------------------+--------------------+
+|              | 1    | deploy STCv stack on the target  | 2 STCv VMs will be |
+|              |      | cloud with affinity attribute    | established on the |
+|              |      | according to requirements        | cloud              |
+|              +------+----------------------------------+--------------------+
+|              | 2    | run rfc2544 latency test with    | test result report |
+|              |      | different packet sizes           | will be produced in|
+|              |      |                                  | the QTIP container |
+|              +------+----------------------------------+--------------------+
+|              | 3    | destroy STCv stack               | STCv stack         |
+|              |      |                                  | destroyed          |
++--------------+------+----------------------------------+--------------------+
+|test verdict  | find the test result report in the QTIP container running    |
+|              | directory                                                    |
++--------------+--------------------------------------------------------------+
+
++-----------------------------------------------------------------------------+
+|Test case template                                                           |
++==============+==============================================================+
+|test case id  | e.g. qtip_throughput                                         |
++--------------+--------------------------------------------------------------+
+|metric        | what will be measured, e.g. latency                          |
++--------------+--------------------------------------------------------------+
+|test purpose  | describe what is the purpose of the test case                |
++--------------+--------------------------------------------------------------+
+|configuration | what .yaml file to use, state SLA if applicable, state       |
+|              | test duration, list and describe the scenario options used in|
+|              | this TC and also list the options using default values.      |
++--------------+--------------------------------------------------------------+
+|test tool     | e.g. ping                                                    |
++--------------+--------------------------------------------------------------+
+|references    | RFC2544                                                      |
++--------------+--------------------------------------------------------------+
+|applicability | describe variations of the test case which can be            |
+|              | performed, e.g. run the test for different packet sizes      |
++--------------+--------------------------------------------------------------+
+|pre-test      | describe configuration in the tool(s) used to perform        |
+|conditions    | the measurements (e.g. fio, pktgen), POD-specific            |
+|              | configuration required to enable running the test            |
++--------------+------+----------------------------------+--------------------+
+|test sequence | step | description                      | result             |
+|              +------+----------------------------------+--------------------+
+|              | 1    | use this to describe tests that  | what happens in    |
+|              |      | require several steps e.g.       | this step          |
+|              |      | step 1 collect logs              | e.g. logs collected|
+|              +------+----------------------------------+--------------------+
+|              | 2    | remove interface                 | interface down     |
+|              +------+----------------------------------+--------------------+
+|              | N    | what is done in step N           | what happens       |
++--------------+------+----------------------------------+--------------------+
+|test verdict  | expected behavior, or SLA, pass/fail criteria                |
++--------------+--------------------------------------------------------------+
diff --git a/docs/testing/user/userguide/web.rst b/docs/testing/user/userguide/web.rst
new file mode 100644
index 00000000..79f180d9
--- /dev/null
+++ b/docs/testing/user/userguide/web.rst
@@ -0,0 +1,70 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+
+
+**********************
+Web Portal User Manual
+**********************
+
+QTIP consists of different tools (metrics) to benchmark the NFVI. These metrics
+fall under different NFVI subsystems (QPIs) such as compute, storage and network.
+QTIP benchmarking tasks are built upon `Ansible`_ playbooks and roles.
+The QTIP web portal is a platform to expose QTIP as a benchmarking service hosted on a central host.
+
+
+Running
+=======
+
+After setting up the web portal as instructed in the config guide, change into the ``web``
+directory and run:
+
+::
+
+ python manage.py runserver 0.0.0.0
+
+
+You can access the portal at ``<host>:8000/bench/login/``.
+
+If you want to use port 80, you may need sudo permission.
+
+::
+
+ sudo python manage.py runserver 0.0.0.0:80
+
+To deploy on `wsgi`_, use the Django `deployment tutorial`_.
+
+
+Features
+========
+
+After logging in you'll be redirected to the QTIP-Web dashboard. You'll see the following menus
+on the left.
+
+ * Repos
+ * Run Benchmarks
+ * Tasks
+
+Repo
+----
+
+ Repos are links to QTIP `workspaces`_. This menu lists all the added repos. Links to new repos
+ can be added here.
+
+Run Benchmarks
+--------------
+
+ To run a benchmark, select the corresponding repo and run. The QTIP benchmarking service will
+ clone the workspace and run the benchmarks. The inventories used are predefined in the
+ workspace repo in the ``/hosts/`` config file.
+
+Tasks
+-----
+
+ All running or completed benchmark jobs can be seen in the Tasks menu with their status.
+
+
+New users can be added by the admin in the Django admin app by logging into ``/admin/``.
+
+.. _Ansible: https://www.ansible.com/
+.. _wsgi: https://wsgi.readthedocs.io/en/latest/what.html
+.. _deployment tutorial: https://docs.djangoproject.com/en/1.11/howto/deployment/wsgi/
+.. _workspaces: https://github.com/opnfv/qtip/blob/master/docs/testing/developer/devguide/ansible.rst#create-workspace