-rw-r--r--  docs/testing/developer/devguide/index.rst                      1
-rw-r--r--  docs/testing/developer/devguide/web.rst                      100
-rw-r--r--  docs/testing/user/configguide/configuration.rst                5
-rw-r--r--  docs/testing/user/configguide/index.rst                        1
-rw-r--r--  docs/testing/user/configguide/web.rst                         74
-rw-r--r--  docs/testing/user/userguide/compute.rst                       35
-rw-r--r--  docs/testing/user/userguide/index.rst                          1
-rw-r--r--  docs/testing/user/userguide/network.rst                        1
-rw-r--r--  docs/testing/user/userguide/network_testcase_description.rst  37
-rw-r--r--  docs/testing/user/userguide/storage.rst                       19
-rw-r--r--  docs/testing/user/userguide/web.rst                           70
11 files changed, 43 insertions, 301 deletions
diff --git a/docs/testing/developer/devguide/index.rst b/docs/testing/developer/devguide/index.rst
index ab411005..0b583cc5 100644
--- a/docs/testing/developer/devguide/index.rst
+++ b/docs/testing/developer/devguide/index.rst
@@ -16,6 +16,5 @@ QTIP Developer Guide
framework.rst
cli.rst
api.rst
- web.rst
compute-qpi.rst
storage-qpi.rst
diff --git a/docs/testing/developer/devguide/web.rst b/docs/testing/developer/devguide/web.rst
deleted file mode 100644
index ae4e3156..00000000
--- a/docs/testing/developer/devguide/web.rst
+++ /dev/null
@@ -1,100 +0,0 @@
-.. This work is licensed under a Creative Commons Attribution 4.0 International License.
-.. http://creativecommons.org/licenses/by/4.0
-
-
-***************************************
-Web Portal for Benchmarking Services
-***************************************
-
-QTIP consists of different tools (metrics) to benchmark the NFVI. These metrics
-fall under different NFVI subsystems (QPIs) such as compute, storage and network.
-QTIP benchmarking tasks are built upon `Ansible`_ playbooks and roles.
-QTIP web portal is a platform to expose QTIP as a benchmarking service hosted on a central host.
-
-Framework
-=========
-
-The web portal has been developed on the Python `Django`_ framework. Dig into its documentation to learn more about Django.
-
-Design
-======
-
-Django is an MTV (Model Template View) framework. Database objects are mapped to models in
-``models.py``. Views handle requests from the client side and interact with the database using
-the Django ORM. Templates are responsible for UI rendering based on the response context from views.
-
-Models
-------
-
-Repo
-~~~~
-
-Model for `workspace`_ repos
-
-::
-
- Repo:
- name
- git_link
-
-
-Task
-~~~~
-
-Tasks keep track of every benchmark run through QTIP-Web Services. Whenever you run a benchmark,
-a new task is created, which keeps track of time stats and logs the task progress and Ansible
-output for the respective playbook.
-
-::
-
- Task
- start_time
- end_time
- status
- run_time
- repo
- log
-
-
-Views
------
-
-Dashboard
-~~~~~~~~~
-
- - Base class - TemplateView
-
-Class-based view serving as the home page of the application.
-
-
-ReposView
-~~~~~~~~~
-
- - Base class - LoginRequiredMixin, CreateView
-
-Class-based view for listing and adding new repos.
-
-
-RepoUpdate
-~~~~~~~~~~
-
- - Base class - LoginRequiredMixin, UpdateView
-
-Class-based view for updating an existing repo's details.
-
-*Both ReposView and RepoUpdate use the same template ``repo_form.html``. The context has an extra variable ``template_role``, which is used to distinguish whether the repo form is for a create or an edit operation.*
-
-
-Run
-~~~
-
- - Base class - LoginRequiredMixin, View
- - template name - run.html
-
-Class-based view for adding a new task and running a benchmark based on the task details. The logs
-are saved in the ``logs/run_<log_id>`` directory.
-
-
-.. _Ansible: https://www.ansible.com/
-.. _Django: https://docs.djangoproject.com/en/1.11/
-.. _workspace: https://github.com/opnfv/qtip/blob/master/docs/testing/developer/devguide/ansible.rst#create-workspace
diff --git a/docs/testing/user/configguide/configuration.rst b/docs/testing/user/configguide/configuration.rst
index ae745341..d04f5bab 100644
--- a/docs/testing/user/configguide/configuration.rst
+++ b/docs/testing/user/configguide/configuration.rst
@@ -40,8 +40,9 @@ Run and enter the docker instance
1. If you want to run benchmarks:
::
- envs="INSTALLER_TYPE={INSTALLER_TYPE} -e INSTALLER_IP={INSTALLER_IP} -e NODE_NAME={NODE_NAME}"
+ envs="INSTALLER_TYPE={INSTALLER_TYPE} -e INSTALLER_IP={INSTALLER_IP} -e NODE_NAME={NODE_NAME}"
docker run -p [HOST_IP:]<HOST_PORT>:5000 --name qtip -id -e $envs opnfv/qtip
+ docker start qtip
docker exec -i -t qtip /bin/bash
``INSTALLER_TYPE`` should be one of the OPNFV installers, e.g. apex, compass, daisy or fuel
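
For example, a hypothetical invocation for a Fuel deployment (the IP address,
node name and host port below are placeholders, not values from these docs):
::

    # placeholder values; substitute the details of your own installer
    envs="INSTALLER_TYPE=fuel -e INSTALLER_IP=10.20.0.2 -e NODE_NAME=node-1"
    docker run -p 8888:5000 --name qtip -id -e $envs opnfv/qtip
    docker exec -i -t qtip /bin/bash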
@@ -90,7 +91,7 @@ Environment configuration
Hardware configuration
----------------------
-QTIP does not have specific hardware requriements, and it can runs over any
+QTIP does not have specific hardware requirements, and it can run over any
OPNFV installer.
diff --git a/docs/testing/user/configguide/index.rst b/docs/testing/user/configguide/index.rst
index fa893e5e..9c72ecd2 100644
--- a/docs/testing/user/configguide/index.rst
+++ b/docs/testing/user/configguide/index.rst
@@ -12,4 +12,3 @@ QTIP Installation Guide
:maxdepth: 2
./configuration.rst
- ./web.rst
diff --git a/docs/testing/user/configguide/web.rst b/docs/testing/user/configguide/web.rst
deleted file mode 100644
index 83365abe..00000000
--- a/docs/testing/user/configguide/web.rst
+++ /dev/null
@@ -1,74 +0,0 @@
-.. This work is licensed under a Creative Commons Attribution 4.0 International License.
-.. http://creativecommons.org/licenses/by/4.0
-
-
-***************************************
-Web Portal installation & configuration
-***************************************
-
-The web portal for benchmarking is developed on the Python `Django`_ framework. Currently, the
-installation needs to be done from source.
-
-
-
-Clone QTIP Repo
-===============
-
-::
-
- git clone https://github.com/opnfv/qtip.git
-
-
-Setup database and Initialize user data
-=======================================
-
-CD into the ``web`` directory
------------------------------
-
-::
-
- cd qtip/qtip/web
-
-
-Setup migrations
-----------------
-
-::
-
- python manage.py makemigrations
-
-
-In the usual case, migrations will already be available with the source. The console will
-notify you if so.
-
-Run migrations
---------------
-
-::
-
- python manage.py migrate
-
-
-Create superuser
-----------------
-::
-
- python manage.py createsuperuser
-
-
-The console will prompt for a new web admin account. Enter the new credentials.
-
-
-
-Collecting Static Dependencies
-------------------------------
-::
-
- python manage.py importstatic
-
-
-This will import the JS and CSS dependencies for the UI into the static directory. Now the web
-application is ready to run.
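-
-For convenience, here is the whole setup as one sequence (the same commands as
-above, run from the directory where the repo was cloned):
-
-::
-
-    cd qtip/qtip/web
-    python manage.py makemigrations
-    python manage.py migrate
-    python manage.py createsuperuser
-    python manage.py importstatic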
-
-
-.. _Django: https://docs.djangoproject.com/en/1.11/
diff --git a/docs/testing/user/userguide/compute.rst b/docs/testing/user/userguide/compute.rst
index f889bfe6..7c5adc26 100644
--- a/docs/testing/user/userguide/compute.rst
+++ b/docs/testing/user/userguide/compute.rst
@@ -16,10 +16,11 @@ test compute components.
All the compute benchmarks could be run in the following scenarios:
On Baremetal Machines provisioned by an OPNFV installer (Host machines)
+On virtual machines provisioned by OpenStack (deployed by an OPNFV installer)
Note: The Compute benchmark contains relatively old benchmarks such as dhrystone
and whetstone. The suite will be updated with better benchmarks such as Linbench in
-the OPNFV E release.
+a future OPNFV release.
Getting started
@@ -32,7 +33,7 @@ Inventory File
QTIP uses Ansible to trigger benchmark tests. Ansible uses an inventory file to
determine what hosts to work against. QTIP can automatically generate an inventory
-file via OPNFV installer. Users also can write their own inventory infomation into
+file via the OPNFV installer. Users can also write their own inventory information into
``/home/opnfv/qtip/hosts``. This file is just a text file containing a list of host
IP addresses. For example:
::
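
    # hypothetical inventory entries; the addresses are placeholders
    10.20.0.3
    10.20.0.4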
@@ -53,19 +54,33 @@ manual. If *CI_DEBUG* is not set or set to *false*, QTIP will delete the key from
remote hosts before the execution ends. Please make sure the key is deleted from remote
hosts, or it can introduce a security flaw.
-Commands
---------
+Execution
+---------
-In a QTIP container, you can run compute QPI by using QTIP CLI:
-::
+There are two ways to execute compute QPI:
+
+* Script
+
+ You can run compute QPI with docker exec:
+ ::
+
+ # run with baremetal machines provisioned by an OPNFV installer
+ docker exec <qtip container> bash -x /home/opnfv/repos/qtip/qtip/scripts/quickstart.sh -q compute
+
+ # run with virtual machines provisioned by OpenStack
+ docker exec <qtip container> bash -x /home/opnfv/repos/qtip/qtip/scripts/quickstart.sh -q compute -u vnf
+
+* Commands
+
+ In a QTIP container, you can run compute QPI using the QTIP CLI. You can get more details
+ from *userguide/cli.rst*.
- mkdir result
- qtip plan run <plan_name> -p $PWD/result
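+
+  For example, a minimal CLI run (the plan name is a placeholder):
+  ::
+
+    mkdir result
+    qtip plan run <plan_name> -p $PWD/result
+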
+Test result
+------------
-QTIP generates results in the ``$PWD/result`` directory are listed down under the
+QTIP generates results in the ``/home/opnfv/<project_name>/results/`` directory, listed under the
timestamp name.
-you can get more details from *userguide/cli.rst*.
Metrics
-------
diff --git a/docs/testing/user/userguide/index.rst b/docs/testing/user/userguide/index.rst
index e05a5e90..93adc8a9 100644
--- a/docs/testing/user/userguide/index.rst
+++ b/docs/testing/user/userguide/index.rst
@@ -15,7 +15,6 @@ QTIP User Guide
getting-started.rst
cli.rst
api.rst
- web.rst
compute.rst
storage.rst
network.rst
diff --git a/docs/testing/user/userguide/network.rst b/docs/testing/user/userguide/network.rst
index 4d48d4d5..68c39974 100644
--- a/docs/testing/user/userguide/network.rst
+++ b/docs/testing/user/userguide/network.rst
@@ -112,4 +112,3 @@ Nettest provides the following `metrics`_:
.. _APEX: https://wiki.opnfv.org/display/apex
.. _metrics: https://tools.ietf.org/html/rfc2544
-
diff --git a/docs/testing/user/userguide/network_testcase_description.rst b/docs/testing/user/userguide/network_testcase_description.rst
index 66fda073..0f1a0b45 100644
--- a/docs/testing/user/userguide/network_testcase_description.rst
+++ b/docs/testing/user/userguide/network_testcase_description.rst
@@ -88,40 +88,3 @@ Test Case Description
|test verdict  | find the test result report in QTIP container running        |
|              | directory                                                    |
+--------------+--------------------------------------------------------------+
-
-+-----------------------------------------------------------------------------+
-|Network Latency                                                              |
-+==============+==============================================================+
-|test case id  | e.g. qtip_throughput                                         |
-+--------------+--------------------------------------------------------------+
-|metric        | what will be measured, e.g. latency                          |
-+--------------+--------------------------------------------------------------+
-|test purpose  | describe what is the purpose of the test case                |
-+--------------+--------------------------------------------------------------+
-|configuration | what .yaml file to use, state SLA if applicable, state       |
-|              | test duration, list and describe the scenario options used in|
-|              | this TC and also list the options using default values.      |
-+--------------+--------------------------------------------------------------+
-|test tool     | e.g. ping                                                    |
-+--------------+--------------------------------------------------------------+
-|references    | RFC2544                                                      |
-+--------------+--------------------------------------------------------------+
-|applicability | describe variations of the test case which can be            |
-|              | performed, e.g. run the test for different packet sizes      |
-+--------------+--------------------------------------------------------------+
-|pre-test      | describe configuration in the tool(s) used to perform        |
-|conditions    | the measurements (e.g. fio, pktgen), POD-specific            |
-|              | configuration required to enable running the test            |
-+--------------+------+----------------------------------+--------------------+
-|test sequence | step | description                      | result             |
-|              +------+----------------------------------+--------------------+
-|              | 1    | use this to describe tests that  | what happens in    |
-|              |      | require several steps e.g.       | this step          |
-|              |      | step 1 collect logs              | e.g. logs collected|
-|              +------+----------------------------------+--------------------+
-|              | 2    | remove interface                 | interface down     |
-|              +------+----------------------------------+--------------------+
-|              | N    | what is done in step N           | what happens       |
-+--------------+------+----------------------------------+--------------------+
-|test verdict  | expected behavior, or SLA, pass/fail criteria                |
-+--------------+--------------------------------------------------------------+
diff --git a/docs/testing/user/userguide/storage.rst b/docs/testing/user/userguide/storage.rst
index 7681ff7a..9457e67e 100644
--- a/docs/testing/user/userguide/storage.rst
+++ b/docs/testing/user/userguide/storage.rst
@@ -87,12 +87,23 @@ Then, you use the following commands to start the storage QPI service.
Execution
---------
-You can run storage QPI with docker exec:
-::
+* Script
+
+ You can run storage QPI with docker exec:
+ ::
+
+ docker exec <qtip container> bash -x /home/opnfv/repos/qtip/qtip/scripts/quickstart.sh
+
+* Commands
- docker exec <qtip container> bash -x /home/opnfv/repos/qtip/qtip/scripts/quickstart.sh
+ In a QTIP container, you can run storage QPI using the QTIP CLI. You can get more
+ details from *userguide/cli.rst*.
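+
+  For example, a minimal CLI run (the plan name is a placeholder; this mirrors
+  the compute CLI usage):
+  ::
+
+    mkdir result
+    qtip plan run <plan_name> -p $PWD/result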
+
+
+Test result
+------------
-QTIP generates results in the ``$PWD/results/`` directory are listed down under the
+QTIP generates results in the ``/home/opnfv/<project_name>/results/`` directory, listed under the
timestamp name.
Metrics
diff --git a/docs/testing/user/userguide/web.rst b/docs/testing/user/userguide/web.rst
deleted file mode 100644
index 79f180d9..00000000
--- a/docs/testing/user/userguide/web.rst
+++ /dev/null
@@ -1,70 +0,0 @@
-.. This work is licensed under a Creative Commons Attribution 4.0 International License.
-.. http://creativecommons.org/licenses/by/4.0
-
-
-**********************
-Web Portal User Manual
-**********************
-
-QTIP consists of different tools (metrics) to benchmark the NFVI. These metrics
-fall under different NFVI subsystems (QPIs) such as compute, storage and network.
-QTIP benchmarking tasks are built upon `Ansible`_ playbooks and roles.
-QTIP web portal is a platform to expose QTIP as a benchmarking service hosted on a central host.
-
-
-Running
-=======
-
-After setting up the web portal as instructed in the config guide, cd into the ``web``
-directory and run.
-
-::
-
- python manage.py runserver 0.0.0.0
-
-
-You can access the portal by logging in at `<host>:8000/bench/login/`
-
-If you want to use port 80, you may need sudo privileges.
-
-::
-
- sudo python manage.py runserver 0.0.0.0:80
-
-To deploy on `wsgi`_, use the Django `deployment tutorial`_.
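-
-For example, a minimal sketch using gunicorn (gunicorn itself and the WSGI
-module path are assumptions, not taken from the QTIP docs):
-
-::
-
-    pip install gunicorn
-    # 'web.wsgi' is an assumed module path; adjust to the actual project module
-    gunicorn web.wsgi:application --bind 0.0.0.0:8000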
-
-
-Features
-========
-
-After logging in, you'll be redirected to the QTIP-Web Dashboard. You'll see the following menus on the left.
-
- * Repos
- * Run Benchmarks
- * Tasks
-
-Repo
-----
-
- Repos are links to QTIP `workspaces`_. This menu lists all the added repos. New repos
- can be added here.
-
-Run Benchmarks
---------------
-
- To run a benchmark, select the corresponding repo and run. The QTIP benchmarking service will clone
- the workspace and run the benchmarks. The inventories used are predefined in the workspace repo's `/hosts/` config file.
-
-Tasks
------
-
- All running or completed benchmark jobs can be seen in the Tasks menu, along with their status.
-
-
-*New users can be added by an admin in the Django Admin app by logging into `/admin/`.*
-
-.. _Ansible: https://www.ansible.com/
-.. _wsgi: https://wsgi.readthedocs.io/en/latest/what.html
-.. _deployment tutorial: https://docs.djangoproject.com/en/1.11/howto/deployment/wsgi/
-.. _workspaces: https://github.com/opnfv/qtip/blob/master/docs/testing/developer/devguide/ansible.rst#create-workspace