# Xtesting in a nutshell

Xtesting is a simple framework to assemble sparse test cases and to accelerate
the adoption of continuous integration best practices. By managing all
the interactions with the components (test scheduler, test results database,
artifact repository), it allows the developer to work only on the test suites
without diving into CI/CD.

It imposes only a few constraints, which are
[quickly achievable](https://www.sdxcentral.com/articles/news/opnfvs-6th-release-brings-testing-capabilities-that-orange-is-already-using/2018/05/),
to verify multiple components in the same CI/CD toolchain. It even brings
the capability to run third-party test cases in our CI toolchains and then to
rate network functions by their test coverage.

Please see
[the Katacoda scenarios](https://www.katacoda.com/ollivier/courses/xtestingci)
to try Xtesting. You will love them!

## [Write your own Xtesting driver](https://www.katacoda.com/ollivier/courses/xtestingci/firstdriver)

Note that [running MongoDB 5.0+ requires the _avx_ CPU instruction set](https://www.mongodb.com/docs/manual/administration/production-notes/#x86_64),
which is usually shipped in all recent _x86_ hardware processors.
However, it may not be available in your virtualized environments.
For example, QEMU _avx_ support is only available [since version 7.2](https://github.com/nodkz/mongodb-memory-server/issues/710#issuecomment-1297462935)
and must be explicitly enabled (e.g. with the argument _-cpu max_).
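
A minimal sketch of such a QEMU invocation, assuming a hypothetical guest image
named _guest.qcow2_ (adjust the disk and memory options to your own
environment):

```bash
# -cpu max exposes the widest CPU feature set supported by the host/QEMU,
# including avx, to the guest.
qemu-system-x86_64 -cpu max -m 4096 -drive file=guest.qcow2,format=qcow2
```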

You can check the presence of the _avx_ CPU instruction set on your processor
with the following command.
```bash
grep '^processor\|^flags.* avx' /proc/cpuinfo
```

### dump all the following files in an empty dir

weather.py

```python
#!/usr/bin/env python

# pylint: disable=missing-docstring

import json
import os
import sys
import time

import requests

from xtesting.core import testcase


class Weather(testcase.TestCase):

    url = "https://samples.openweathermap.org/data/2.5/weather"
    city_name = "London,uk"
    app_key = "439d4b804bc8187953eb36d2a8c26a02"

    def run(self, **kwargs):
        try:
            self.start_time = time.time()
            # Query the sample OpenWeatherMap endpoint for the configured city.
            req = requests.get("{}?q={}&appid={}".format(
                self.url, self.city_name, self.app_key))
            req.raise_for_status()
            data = req.json()
            # Dump the raw payload as an artifact in the test case result dir.
            os.makedirs(self.res_dir, exist_ok=True)
            with open('{}/dump.txt'.format(self.res_dir), 'w+') as report:
                json.dump(data, report, indent=4, sort_keys=True)
            # Every kwarg is a threshold: the result grows towards 100 as
            # thresholds are exceeded, and is later compared to the criteria.
            for key in kwargs:
                if data["main"][key] > kwargs[key]:
                    self.result = self.result + 100 / len(kwargs)
            self.stop_time = time.time()
        except Exception:  # pylint: disable=broad-except
            print("Unexpected error:", sys.exc_info()[0])
            self.result = 0
            self.stop_time = time.time()
```

setup.py

```python
#!/usr/bin/env python

# pylint: disable=missing-docstring

import setuptools

setuptools.setup(
    setup_requires=['pbr>=2.0.0'],
    pbr=True)
```

setup.cfg

```
[metadata]
name = weather
version = 1

[files]
packages = .

[entry_points]
xtesting.testcase =
    weather = weather:Weather
```

requirements.txt

```
xtesting
requests!=2.20.0,!=2.24.0 # Apache-2.0
```
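
Once setup.py, setup.cfg and requirements.txt sit next to weather.py, you can
optionally check locally that the package installs and that the
_xtesting.testcase_ entry point resolves. This is only a sketch mirroring what
the Dockerfile does later (the `git init` is there because pbr expects a git
repository) and assumes `pip` and `python3` are available, preferably in a
virtualenv:

```bash
# Install the package from the current directory, then print the class
# registered under the xtesting.testcase entry point group.
git init . && pip install .
python3 -c "import pkg_resources; print(pkg_resources.load_entry_point('weather', 'xtesting.testcase', 'weather'))"
```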

testcases.yaml

```yaml
---
tiers:
    -
        name: simple
        order: 0
        description: ''
        testcases:
            -
                case_name: humidity
                project_name: weather
                criteria: 100
                blocking: true
                clean_flag: false
                description: ''
                run:
                    name: weather
                    args:
                        humidity: 80
            -
                case_name: pressure
                project_name: weather
                criteria: 100
                blocking: true
                clean_flag: false
                description: ''
                run:
                    name: weather
                    args:
                        pressure: 1000
            -
                case_name: temp
                project_name: weather
                criteria: 100
                blocking: true
                clean_flag: false
                description: ''
                run:
                    name: weather
                    args:
                        temp: 280
    -
        name: combined
        order: 1
        description: ''
        testcases:
            -
                case_name: half
                project_name: weather
                criteria: 50
                blocking: true
                clean_flag: false
                description: ''
                run:
                    name: weather
                    args:
                        humidity: 90
                        pressure: 1000
                        temp: 280
```

Dockerfile

```
FROM alpine:3.18

ADD . /src/
RUN apk --no-cache add --update python3 py3-pip py3-wheel git py3-lxml && \
    git init /src && pip3 install /src
COPY testcases.yaml /etc/xtesting/testcases.yaml
CMD ["run_tests", "-t", "all"]
```

site.yml

```yaml
---
- hosts:
    - 127.0.0.1
  roles:
    - role: collivier.xtesting
      project: weather
      registry_deploy: true
      repo: 127.0.0.1
      dport: 5000
      suites:
        - container: weather
          tests:
            - humidity
            - pressure
            - temp
            - half
```

### make world

Deploy your own Xtesting toolchain

```bash
virtualenv xtesting -p python3 --system-site-packages
. xtesting/bin/activate
pip install ansible
ansible-galaxy install collivier.xtesting
ansible-galaxy collection install ansible.posix community.general community.grafana \
    community.kubernetes community.docker community.postgresql
ansible-playbook site.yml
deactivate
rm -r xtesting
```
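
As a quick, optional check that the toolchain deployed by the role is up, you
can probe the Jenkins login page (the role exposes Jenkins on port 8080 by
default; a 200 status code is expected):

```bash
# Print the HTTP status code returned by the Jenkins login page.
curl -s -o /dev/null -w '%{http_code}\n' http://127.0.0.1:8080/login
```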

Build your container

```bash
sudo docker build -t 127.0.0.1:5000/weather .
```
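
Optionally, you can run a single test case inside the freshly built container
before publishing it; this uses the standard `run_tests` command shipped by
Xtesting and does not report anything to the test database:

```bash
# Override the default CMD ("run_tests -t all") to run only the humidity case.
sudo docker run --rm 127.0.0.1:5000/weather run_tests -t humidity
```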

Publish your container on your local registry

```bash
sudo docker push 127.0.0.1:5000/weather
```
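
You can confirm that the image is now listed by the local registry deployed by
the role (standard Docker Registry v2 API):

```bash
# The catalog endpoint lists the repositories stored in the registry.
curl http://127.0.0.1:5000/v2/_catalog
```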

### play

Jenkins is accessible via http://127.0.0.1:8080 and you can log in as admin
to be allowed to trigger a build:
- login: admin
- password: admin

The default Jenkins view lists all the Jenkins jobs. You can easily find your
main job, weather-latest-daily, via the Jenkins view named weather.

You're ready to start a new build of weather-latest-daily without changing
the default parameters.
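
If you prefer the command line over the web UI, the same job can usually be
triggered through the Jenkins REST API; depending on the CSRF (crumb) settings
of your instance, you may need an API token instead of the admin password:

```bash
# POST to the job's /build endpoint to queue a new build with default parameters.
curl -X POST http://admin:admin@127.0.0.1:8080/job/weather-latest-daily/build
```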

The test cases are executed after a few seconds and all the test outputs are
accessible via the console icons. If you open
weather-127_0_0_1-weather-latest-humidity-run, you will first see:
- the test output highlighting its status
- a link to the test database where its results are stored (see the query
  sketch below)
- a couple of links to its automatically published artifacts
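
A possible way to query those results directly, assuming the role's default
TestAPI endpoint listening on port 8000 (adjust the URL if your deployment
differs):

```bash
# Filter the stored results by project and test case name.
curl "http://127.0.0.1:8000/api/v1/results?project=weather&case=humidity"
```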

A zip file gathering all the test campaign data is also generated, as
reported in the weather-latest-zip console output.

### That's all folks!