 README                            | 73
 cli/commands/perftest.py          |  4
 cli/helper.py                     | 14
 docs/designspec/dashboard.rst     | 75
 docs/releasenotes/brahmaputra.rst | 78
 docs/releasenotes/index.rst       | 14
 6 files changed, 183 insertions(+), 75 deletions(-)
diff --git a/README b/README
index 4f5759bc..083b1ad6 100644
--- a/README
+++ b/README
@@ -1,71 +1,6 @@
-QTIP Benchmark Suite
----------------------
+QTIP is a performance benchmark service for the OPNFV platform
-QTIP is a benchmarking suite intended to benchmark the following components of the OPNFV Platform:
+QTIP aims to benchmark OPNFV platforms through a "bottom-up" approach, with
+emphasis on platform performance through quantitative benchmarks rather than
+validation.
-1. Computing components
-2. Networking components
-3. Storage components
-
-The efforts in QTIP are mostly focused on identifying
-
-1. Benchmarks to run
-2. Test cases in which these benchmarks to run
-3. Automation of suite to run benchmarks within different test cases
-4. Collection of test results
-
-QTIP Framework can now be called: (qtip.py).
-
-The Framework can run 5 computing benchmarks:
-
-1. Dhrystone
-2. Whetstone
-3. RamBandwidth
-4. SSL
-5. nDPI
-
-These benchmarks can be run in 2 test cases:
-
-1. VM vs Baremetal
-2. Baremetal vs Baremetal
-
-Instructions to run the script:
-
-1. Download and source the OpenStack `adminrc` file for the deployment on which you want to create the VM for benchmarking
-2. run `python qtip.py -s {SUITE} -b {BENCHMARK}`
-3. run `python qtip.py -h` for more help
-4. list of benchmarks can be found in the `qtip/test_cases` directory
-5. SUITE refers to compute, network or storage
-
-Requirements:
-
-1. Ansible 1.9.2
-2. Python 2.7
-3. PyYAML
-
-Configuring Test Cases:
-
-Test cases can be found within the `test_cases` directory.
-For each Test case, a Config.yaml file contains the details for the machines upon which the benchmarks would run.
-Edit the IP and the Password fields within the files for the machines on which the benchmark is to run.
-A robust framework that would allow to include more tests would be included within the future.
-
-Jump Host requirements:
-
-The following packages should be installed on the server from which you intend to run QTIP.
-
-1: Heat Client
-2: Glance Client
-3: Nova Client
-4: Neutron Client
-5: wget
-6: PyYaml
-
-Networking
-
-1: The Host Machines/compute nodes to be benchmarked should have public/access network
-2: The Host Machines/compute nodes should allow Password Login
-
-QTIP support for Foreman
-
-{TBA}
+See [project wiki](https://wiki.opnfv.org/display/qtip) for more information.
diff --git a/cli/commands/perftest.py b/cli/commands/perftest.py
index c163070a..0eb6d062 100644
--- a/cli/commands/perftest.py
+++ b/cli/commands/perftest.py
@@ -10,12 +10,14 @@
from prettytable import PrettyTable
import yaml
import click
+import os
+from cli import helper
class PerfTest:
def __init__(self):
- self.path = 'benchmarks/perftest/summary'
+ self.path = os.path.join(helper.fetch_root(), 'perftest/summary')
def list(self):
table = PrettyTable(["Name", "Description"])
diff --git a/cli/helper.py b/cli/helper.py
new file mode 100644
index 00000000..a5865bce
--- /dev/null
+++ b/cli/helper.py
@@ -0,0 +1,14 @@
+##############################################################################
+# Copyright (c) 2016 ZTE Corp and others.
+#
+# All rights reserved. This program and the accompanying materials
+# are made available under the terms of the Apache License, Version 2.0
+# which accompanies this distribution, and is available at
+# http://www.apache.org/licenses/LICENSE-2.0
+##############################################################################
+
+import os
+
+
+def fetch_root():
+ return os.path.join(os.path.dirname(__file__), os.pardir, 'benchmarks/')
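The new `helper.fetch_root()` resolves the benchmarks directory relative to the `cli/` package, and `PerfTest.__init__` joins it with `perftest/summary`. A minimal runnable sketch of that path resolution, where the `/opt/qtip/...` install location is only an illustrative assumption:

```python
import os


def fetch_root(module_file):
    # Mirrors cli/helper.py: resolve benchmarks/ relative to the cli/ package.
    # 'module_file' stands in for __file__ so the sketch is runnable anywhere.
    return os.path.join(os.path.dirname(module_file), os.pardir, 'benchmarks/')


# PerfTest.__init__ then builds its summary path from this root:
root = fetch_root('/opt/qtip/cli/helper.py')  # assumed install location
path = os.path.join(root, 'perftest/summary')
print(os.path.normpath(path))  # /opt/qtip/benchmarks/perftest/summary
```

Keeping the relative lookup in one helper means the CLI works no matter which directory it is launched from.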
diff --git a/docs/designspec/dashboard.rst b/docs/designspec/dashboard.rst
index ad5520b6..555b3a24 100644
--- a/docs/designspec/dashboard.rst
+++ b/docs/designspec/dashboard.rst
@@ -57,14 +57,79 @@ The condition of a benchmark result includes
Conditions that do NOT have an obvious effect on the test result may be ignored,
e.g. temperature, power supply.
-Deviation
----------
-
-Performance tests are usually repeated many times to reduce random disturbance.
-This view shall show an overview of deviation among different runs.
+Stats
+-----
+
+Performance tests are actually measurements of specific metrics. All
+measurements come with uncertainty. The final result is normally one metric or
+a group of metrics calculated from many repeats.
+
+For each metric, the stats board shall consist of a diagram of all measured
+values and a box of stats::
+
+ ^ +------------+
+ | | count: ? |
+ | |average: ? |
+ | | min: ? |
+ | X | max: ? |
+ | XXXX XXXX X XXXXX | |
+ |X XX XX XX XXX XXX XX | |
+ | XXXXXX X XXXXX XX | |
+ | | |
+ | | |
+ | | |
+ | | |
+ | | |
+ +---------------------------------------------> +------------+
+
+The type of diagram and selection of stats shall depend on what metric to show.
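The stats box sketched above reduces to simple aggregates over the repeated runs. A hypothetical sketch of that computation, with made-up sample values:

```python
def stats_box(samples):
    # Aggregate repeated measurements of one metric into the values
    # shown in the stats box: count, average, min, max.
    return {
        'count': len(samples),
        'average': sum(samples) / len(samples),
        'min': min(samples),
        'max': max(samples),
    }


runs = [102.0, 98.5, 101.2, 99.3, 100.0]  # e.g. five repeats of one benchmark
print(stats_box(runs))
```

A metric-specific board could swap in other aggregates (percentiles, standard deviation) without changing this shape.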
Comparison
----------
Comparison can be done between different PODs or different configuration on the
same PODs.
+
+In a comparison view, the metrics are displayed in the same diagram, and the
+parameters are listed side by side.
+
+Both common and differing parameters are listed. Common values are merged
+into the same cell, and the user may configure the view to hide common rows.
+
+A draft design is as follows::
+
+ ^
+ |
+ |
+ |
+ | XXXXXXXX
+ | XXX XX+-+ XXXXXXXXXX
+ | XXX +XXXX XXXXX
+ +-+XX X +--+ ++ XXXXXX +-+
+ | X+-+X +----+ +-+ +----+X
+ |X +--+ +---+ XXXXXX X
+ | +-------+ X
+ |
+ |
+ +----------------------------------------------------->
+
+ +--------------------+----------------+---------------+
+ | different param 1 | | |
+ | | | |
+ +-----------------------------------------------------+
+ | different param 2 | | |
+ | | | |
+ +-------------------------------------+---------------+
+ | common param 1 | |
+ | | |
+ +-------------------------------------+---------------+
+ | different param 3 | | |
+ | | | |
+ +-------------------------------------+---------------+
+ | common param 2 | |
+ | | |
+ +--------------------+--------------------------------+
+ +------------+
+ | HIDE COMMON|
+ +------------+
+
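The common/differing row split drafted in the table above could be computed as follows; the POD names and parameters are invented for illustration, and "HIDE COMMON" simply drops the common rows:

```python
def split_params(pods):
    # pods: mapping of POD name -> {parameter: value}.
    # A parameter whose value is identical on every POD goes into
    # 'common' (one merged cell); otherwise into 'different'
    # (one value per POD, shown side by side).
    keys = set().union(*(p.keys() for p in pods.values()))
    common, different = {}, {}
    for key in sorted(keys):
        values = {name: p.get(key) for name, p in pods.items()}
        if len(set(values.values())) == 1:
            common[key] = next(iter(values.values()))
        else:
            different[key] = values
    return common, different


pods = {
    'pod1': {'cpu': 'E5-2699', 'ram': '64G', 'nic': '10G'},
    'pod2': {'cpu': 'E5-2699', 'ram': '128G', 'nic': '40G'},
}
common, different = split_params(pods)
print(common)     # {'cpu': 'E5-2699'}
print(different)  # 'ram' and 'nic' differ between the two PODs
```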
diff --git a/docs/releasenotes/brahmaputra.rst b/docs/releasenotes/brahmaputra.rst
new file mode 100644
index 00000000..92fafd80
--- /dev/null
+++ b/docs/releasenotes/brahmaputra.rst
@@ -0,0 +1,78 @@
+***********
+Brahmaputra
+***********
+
+NOTE: The release note for OPNFV Brahmaputra is missing. This is a copy of the
+README.
+
+QTIP Benchmark Suite
+====================
+
+QTIP is a benchmarking suite intended to benchmark the following components of the OPNFV Platform:
+
+1. Computing components
+2. Networking components
+3. Storage components
+
+The efforts in QTIP are mostly focused on identifying
+
+1. Benchmarks to run
+2. Test cases in which these benchmarks are run
+3. Automation of suite to run benchmarks within different test cases
+4. Collection of test results
+
+The QTIP framework can now be invoked as `qtip.py`.
+
+The Framework can run 5 computing benchmarks:
+
+1. Dhrystone
+2. Whetstone
+3. RamBandwidth
+4. SSL
+5. nDPI
+
+These benchmarks can be run in 2 test cases:
+
+1. VM vs Baremetal
+2. Baremetal vs Baremetal
+
+Instructions to run the script:
+
+1. Download and source the OpenStack `adminrc` file for the deployment on which you want to create the VM for benchmarking
+2. run `python qtip.py -s {SUITE} -b {BENCHMARK}`
+3. run `python qtip.py -h` for more help
+4. list of benchmarks can be found in the `qtip/test_cases` directory
+5. SUITE refers to compute, network or storage
+
+Requirements:
+
+1. Ansible 1.9.2
+2. Python 2.7
+3. PyYAML
+
+Configuring Test Cases:
+
+Test cases can be found within the `test_cases` directory.
+For each test case, a Config.yaml file contains the details of the machines on which the benchmarks run.
+Edit the IP and Password fields within the files for the machines on which the benchmark is to run.
+A more robust framework that allows adding further tests will be included in the future.
+
+Jump Host requirements:
+
+The following packages should be installed on the server from which you intend to run QTIP.
+
+1: Heat Client
+2: Glance Client
+3: Nova Client
+4: Neutron Client
+5: wget
+6: PyYaml
+
+Networking
+
+1: The Host Machines/compute nodes to be benchmarked should have public/access network
+2: The Host Machines/compute nodes should allow Password Login
+
+QTIP support for Foreman
+
+{TBA}
diff --git a/docs/releasenotes/index.rst b/docs/releasenotes/index.rst
new file mode 100644
index 00000000..5d045388
--- /dev/null
+++ b/docs/releasenotes/index.rst
@@ -0,0 +1,14 @@
+.. This work is licensed under a Creative Commons Attribution 4.0 International License.
+.. http://creativecommons.org/licenses/by/4.0
+.. (c) 2015 Dell Inc.
+.. (c) 2016 ZTE Corp.
+
+
+##################
+QTIP Release Notes
+##################
+
+.. toctree::
+ :maxdepth: 2
+
+ brahmaputra.rst