The information model describes the structure and format of configuration and monitoring information collected by the CLMC, and how that information is used to support service management decision-making.

Implementation documentation and discussion can be found in the [FLAME CLMC Information Model Specification](https://gitlab.it-innovation.soton.ac.uk/FLAME/flame-clmc/blob/integration/docs/monitoring.md).
#### Development Environment
The development environment is currently Vagrant/VirtualBox, providing the infrastructure for development and testing.

The vagrant-disksize plugin must be installed:

`vagrant plugin install vagrant-disksize`

#### Testing

Testing is implemented using pytest. The pytest installation script is:

`sudo clmctest/services/pytest/install.sh`

Tests use the following convention:
* Tests are written in Python using pytest
* Related tests are stored in a Python module `clmctest/<testmodule>` to create a suite of tests. All tests are stored in files named `test_*.py`; there can be many tests per file, and many files per module (a minimal sketch of such a test file follows this list)
...
...
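As an illustration of this convention, a minimal sketch of a test file that could live at `clmctest/<testmodule>/test_example.py` (the file name and the endpoint it checks are assumptions, not actual repository code):

```python
import requests


def test_influx_ping():
    # InfluxDB's /ping endpoint returns HTTP 204 when the service is up;
    # host and port assume the default forwarding described further below
    response = requests.get("http://localhost:8086/ping", timeout=5)
    assert response.status_code == 204
```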
#### Creating a deployment for a test
Each test has a fixture defining the set of VMs required for the test. The fixture is described in an rspec.yml file stored within the `clmctest/<testmodule>` directory. For example, the monitoring test fixture, which sets up a simulation of the adaptive streaming use case scenario, is stored in the following file:

`clmctest/monitoring/rspec.yml`

The fixture is created by running the vagrant up command with the test module name as a parameter:

`vagrant --fixture=monitoring -- up`
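To inspect a fixture programmatically, something along these lines can be used (a minimal sketch assuming PyYAML is available; the exact structure of rspec.yml is defined by the repository and not reproduced here):

```python
import yaml

# Load the monitoring fixture and print the VM definitions it contains
with open("clmctest/monitoring/rspec.yml") as f:
    fixture = yaml.safe_load(f)

print(fixture)
```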
...
...
The **clmc-service** VM includes InfluxDB, Kapacitor and Chronograf. The following ports on the clmc-service VM are forwarded to the host:

* Influx: 8086
* Chronograf: 8888
* Kapacitor: 9092
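A quick way to verify the forwarded services from the host is to poke their HTTP APIs (a minimal sketch, assuming the ports above are forwarded to localhost and the `requests` package is installed):

```python
import requests

# InfluxDB answers on 8086; SHOW DATABASES lists the databases created so far
r = requests.get("http://localhost:8086/query", params={"q": "SHOW DATABASES"})
print(r.json())

# Kapacitor exposes a ping endpoint on 9092 (returns HTTP 204 when healthy)
r = requests.get("http://localhost:9092/kapacitor/v1/ping")
print(r.status_code)
```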
#### Running the monitoring tests

Typically, tests are run from a test-runner VM that has pytest installed (using the installation script given above).
The next step is to generate the test data, which can be done in two ways.

The first option is to run a Python script that generates the test data sets:
`python3 clmctest/monitoring/StreamingSim.py`
This script can also be used to clear the generated data with the `-c` option:

`python3 clmctest/monitoring/StreamingSim.py -c`

Then run pytest:
`pytest -s clmctest/monitoring/`
The second option is to run the testing module directly; it will detect whether the test data has been generated and, if not, will generate it automatically before executing the tests. Keep in mind that when the test data is generated this way, a 10-second timeout is applied after the generation finishes so that the data can be properly inserted into the database. If the data was already generated using the first option, only the tests are executed.
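As a rough illustration of that behaviour (the names below are hypothetical, not the actual module code):

```python
import time


def ensure_test_data(simulator, db_client):
    # Generate the simulation data only if it is not already in the database
    # (hypothetical helpers standing in for StreamingSim and an InfluxDB client)
    if not db_client.data_present():
        simulator.generate()
        time.sleep(10)  # give the database time to ingest the generated points
```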
#### Tests run from the host

Tests can be run from the host by sshing commands to the test runner. This is described in the .gitlab-ci.yml file, which also shows how a Python package is created.
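For example, something along these lines should work (the `test-runner` VM name is an assumption, and the `--fixture` switch is passed as it is for `up`):

`vagrant --fixture=monitoring -- ssh test-runner -c "pytest -s clmctest/monitoring/"`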