path: root/behave_tests/features/steps/steps.py
Age | Commit message | Author | Files | Lines
2022-10-27 | behave_tests: refactor TestAPI DB lookup | Gwenael Lambrouin | 1 | -52/+54
- use testapi.TestapiClient everywhere
- relax search constraints: match only project name (nfvbench), test case name (characterization or non-regression), scenario tag (throughput or latency) and user_label (test chain identifier: identifies, among other things, the platform, the compute class under test, ...)
- add unit tests for some of the related behave steps

Change-Id: I26763f845c2286601cb958b326525b29320a1627
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
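A minimal sketch of such a relaxed lookup, for illustration only: the /results route, the query parameters and the result fields below are assumptions and do not reflect the actual testapi.TestapiClient API.

    import requests

    def find_last_result(testapi_url, case_name, scenario_tag, user_label):
        """Return the most recent result matching the relaxed constraints, or None."""
        # only the project and the test case name are passed to the server
        params = {"project": "nfvbench", "case": case_name}
        response = requests.get(f"{testapi_url}/results", params=params)
        response.raise_for_status()
        for result in response.json().get("results", []):
            # relaxed client-side filter: scenario tag and test chain identifier
            if (result.get("scenario") == scenario_tag
                    and result.get("user_label") == user_label):
                return result
        return None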
2022-05-31 | behave_tests: fix infinite recursion | Gwenael Lambrouin | 1 | -1/+1
Fix a bug in behave_tests that occurs when behave looks for a previous result (either non-regression or characterization) in the testapi database. When the result cannot be found on the first results page, behave enters an infinite recursion. Eventually, the operating system kills the python interpreter, leading to a return code of 137. When behave is run by the xtesting run_tests command, run_tests is also killed, so the error is not reported by xtesting. This is now fixed.

Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
Change-Id: I6b0feafb5ebadf7d0d1df6d0ee03fd22cbe6899d
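For illustration, one way to walk the result pages without recursing forever is a bounded loop; the client call, the predicate and the pagination fields below are assumptions, not the actual behave_tests code.

    def find_previous_result(client, matches, page=1):
        """Search the paginated results and stop at the last page."""
        while True:
            data = client.get_results(page=page)   # hypothetical paginated call
            for result in data["results"]:
                if matches(result):                # hypothetical predicate
                    return result
            if page >= data["pagination"]["total_pages"]:
                return None                        # all pages searched: give up
            page += 1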
2021-08-31 | behave_tests: tweak nfvbench results JSON filename | Gwenael Lambrouin | 1 | -4/+12
When the nfvbench packet rate is expressed as a percentage of the max throughput rather than as a value in pps or bps, use that percentage to build the nfvbench results JSON filename. This is sometimes needed for results post-processing because the percentage information cannot be found in the nfvbench results themselves.

Change-Id: I7d16dba16a733a8ee58a6f80ce4df40cb40e9843
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
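A hedged sketch of how a percentage rate could be folded into the filename; the function name and the naming scheme are assumptions made for illustration.

    import re

    def results_filename(base_name, rate):
        """Embed a percentage rate (e.g. "70%") in the results JSON filename."""
        match = re.fullmatch(r"(\d+(?:\.\d+)?)%", rate)
        if match:
            # keep the percentage so post-processing can recover it later
            return f"{base_name}-{match.group(1)}pct.json"
        return f"{base_name}.json"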
2021-07-22 | behave_tests: change packet rate for latency non regression tests | Gwenael Lambrouin | 1 | -0/+90
Base the rate of the latency test on the latest characterization max throughput test instead of the latest non-regression max throughput test. The goal is to use the same packet rate for all latency tests and to avoid variations of the latency result due to variations of the max throughput results, i.e. to decouple the max throughput and latency test results.

This is achieved with a new "Given" behave phrase:

    Given packet rate equal to {percentage} of max throughput of last characterization

This new phrase is now used by default in non-regression.feature, but it is still possible to use the previous behaviour with the phrase:

    Given <throughput> rate of previous scenario

Change-Id: I15b5d7a68cd57c67d01d2119781f65114e6d41ce
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
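A sketch of how the new phrase could be wired as a behave step: the step phrase comes from this commit message, but the step body and the get_last_characterization_throughput() helper are assumptions used for illustration.

    from behave import given

    @given('packet rate equal to {percentage} of max throughput of last characterization')
    def step_rate_from_last_characterization(context, percentage):
        # get_last_characterization_throughput() is a hypothetical helper that
        # queries TestAPI for the latest characterization max throughput (pps)
        max_throughput_pps = get_last_characterization_throughput(context)
        ratio = float(percentage.strip('%')) / 100
        context.rate = f"{int(max_throughput_pps * ratio)}pps"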
2021-07-22 | behave_tests: doc and log start nfvbench server | Gwenael Lambrouin | 1 | -1/+21
Change-Id: I36b7a32525f75bf1dc2b7ec150428afa5298d478
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
2021-07-22 | behave_tests: log latency test (fixed threshold) | Gwenael Lambrouin | 1 | -3/+10
Change-Id: I8285829a854f146fb9736d44655a7e848923203e
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
2021-07-22 | behave_tests: log nfvbench API test | Gwenael Lambrouin | 1 | -5/+7
Change-Id: I67bfba22393f2f324b3c052b443b24c520231172
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
2021-07-22 | behave_tests: refactor max result search | Gwenael Lambrouin | 1 | -19/+16
Remove duplicate code introduced by logging and make the max result search easier to read.

Change-Id: If88c6d5a8b57ae9e26edab206e0f61526a98d09d
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
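As an illustration of a compact max result search (the throughput field name is an assumption, not necessarily the refactored code):

    def max_throughput_result(results):
        """Return the result with the highest throughput, or None if the list is empty."""
        return max(results, key=lambda r: r["throughput_pps"], default=None)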
2021-07-22 | behave_tests: log nfvbench traffic runs | Gwenael Lambrouin | 1 | -1/+19
Change-Id: I791b57c78f98252f01c08a6539762725888a3514
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
2021-07-22 | behave_tests: code cleaning (TEST_DB_EXT_URL) | Gwenael Lambrouin | 1 | -5/+0
Remove all references to the TEST_DB_EXT_URL env variable, which is not used.

Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
Change-Id: I552255f11c04da344aac1f2d9dd9f3da4293e553
2021-07-22 | behave_tests: configure nfvbench ip/port with env vars | Gwenael Lambrouin | 1 | -7/+13
It is now possible to configure the nfvbench server IP address and port number with the environment variables NFVBENCH_SERVER_HOST and NFVBENCH_SERVER_PORT. It is still possible to configure them in feature files, and the values found in feature files take precedence. This makes it possible to run the behave tests and the nfvbench server on different machines without changing feature files, which is especially useful for testing.

Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
Change-Id: I98dc7f87a1a233b90b44dfc8b26a1e63961fff3c
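A sketch of the described precedence (feature file value first, environment variable as fallback); the userdata keys and the default host/port are assumptions made for illustration.

    import os

    def nfvbench_endpoint(userdata):
        """Resolve the nfvbench server endpoint, feature file values taking precedence."""
        host = userdata.get("nfvbench_ip") or os.getenv("NFVBENCH_SERVER_HOST", "127.0.0.1")
        port = userdata.get("nfvbench_port") or os.getenv("NFVBENCH_SERVER_PORT", "7555")
        return host, int(port)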
2021-07-22 | Compare the latency result with a fixed threshold of 1ms | Gwenael Lambrouin | 1 | -0/+30
Change-Id: I2b4ea4ee6e6442d4ceac268e7bf3c6bf9277ff54
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
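A hedged sketch of such a fixed-threshold check: only the 1 ms threshold comes from the commit; the step wording and the context attribute are assumptions.

    from behave import then

    LATENCY_THRESHOLD_USEC = 1000  # fixed threshold of 1 ms, in microseconds

    @then('the average latency is below the fixed threshold')
    def step_check_fixed_latency_threshold(context):
        # context.avg_latency_usec is assumed to be set by a previous traffic step
        assert context.avg_latency_usec <= LATENCY_THRESHOLD_USEC, (
            f"average latency {context.avg_latency_usec} us exceeds "
            f"{LATENCY_THRESHOLD_USEC} us")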
2021-07-08 | NFVBENCH-215 Fix wrong throughput ratio in latency tests | fmenguy | 1 | -1/+0
Change-Id: I5c976dd49a2c17b47559b1d6a565a6e78f7cfd0e
Signed-off-by: fmenguy <francoisregis.menguy@orange.com>
2021-06-08 | Fix pps error message in behave tests | fmenguy | 1 | -1/+1
Change-Id: I2f050f8a6f193c4e04ac8a427aedb7c241633b73
Signed-off-by: fmenguy <francoisregis.menguy@orange.com>
2021-06-02 | behave_tests: increase nfvbench_test_api timeout | Gwenael Lambrouin | 1 | -1/+1
In some cases, the 50-second timeout to wait for the nfvbench HTTP server to be ready is too short. (Convoluted but real example: the DNS servers are not properly configured and nfvbench tries to reach the OpenStack APIs even though it does not need them, because we just want to do a loopback test without a loop VM.) The new timeout is 120 seconds.

Change-Id: I4932eff7c9a100370e7ceaaa2a467efbbceb5993
Signed-off-by: Gwenael Lambrouin <gwenael.lambrouin@orange.com>
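For illustration, a polling loop that waits up to 120 seconds for the server to answer; the /status route and function name are assumptions, not the actual behave_tests code.

    import time
    import requests

    def wait_for_nfvbench_server(url, timeout=120, interval=2):
        """Poll the nfvbench HTTP server until it answers or the timeout expires."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            try:
                if requests.get(f"{url}/status", timeout=5).ok:
                    return True
            except requests.RequestException:
                pass  # server not reachable yet, retry after a short pause
            time.sleep(interval)
        return False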
2021-04-27 | NFVBENCH-205 - Add behave tests for characterization and non-regression | fmenguy | 1 | -0/+455
Change-Id: I708eee21a9fd11e7a276707fb0b43d8598381ce7
Signed-off-by: fmenguy <francoisregis.menguy@orange.com>