Commit ccddb77b authored by MJB
updated readme to make dev and testing documentation consistent

parent c80f02c5
-->
# FLAME Cross Layer Management and Control
#### Version: 1.0.0
### About this document
### Contents
#### Documentation
##### Information Model
Implementation documentation and discussion can be found in the [FLAME CLMC Information Model Specification](https://gitlab.it-innovation.soton.ac.uk/FLAME/flame-clmc/blob/integration/docs/monitoring.md)
#### Development Environment
The development environment is currently Vagrant/Virtualbox providing the infrastructure for dev and testing.
The vagrant-disksize plugin must be installed:
`vagrant plugin install vagrant-disksize`
#### Testing
Testing is implemented with pytest, using the following conventions:
* Tests are written in python using pytest
* Related tests are stored in a python module `clmctest/<testmodule>` to create a suite of tests. All tests are stored in files named `test_*.py`; there can be many tests per file, and many files per module
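As a sketch of these conventions, a minimal test module might look like the following. The file path, function names and values are all invented for illustration and are not taken from the repository:

```python
# Hypothetical test module illustrating the convention above; it would
# live at clmctest/<testmodule>/test_example.py. Not part of the repo.
import pytest


def influx_url(host, port=8086):
    """Build the URL of a service endpoint used by the tests."""
    return "http://{0}:{1}".format(host, port)


@pytest.mark.parametrize("host", ["localhost", "clmc-service"])
def test_influx_url(host):
    # One file can hold many tests; pytest collects every test_* function.
    assert influx_url(host) == "http://{0}:8086".format(host)
```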
#### Creating a deployment for a test
Each test has a fixture defining the set of VMs required by the test. The fixture is described in an `rspec.yml` file stored within the `clmctest/<testmodule>` directory. For example, the monitoring test fixture is stored in the following file:
`clmctest/monitoring/rspec.yml`
The fixture is created by running the `vagrant up` command with the test module name as a parameter:
`vagrant --fixture=monitoring -- up`
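The module name passed via `--fixture` maps onto the fixture file location described above. A small sketch of that mapping, in illustrative Python (the actual Vagrantfile logic is written in Ruby):

```python
import os


def fixture_path(testmodule, root="clmctest"):
    """Return the expected location of a test module's fixture file,
    following the clmctest/<testmodule>/rspec.yml convention."""
    return os.path.join(root, testmodule, "rspec.yml")
```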
The **clmc-service** vm includes influx, Kapacitor and Chronograf. The following ports are exposed:
* Chronograf: 8888
* Kapacitor: 9092
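As a quick sanity check (a sketch, not part of the repository), the listed ports can be probed with Python's standard library. The host name `clmc-service` is an assumption and may need to be replaced with the VM's actual address:

```python
import socket

# Ports listed above for the clmc-service VM.
SERVICES = {"Chronograf": 8888, "Kapacitor": 9092}


def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for name, port in SERVICES.items():
        state = "open" if port_open("clmc-service", port) else "closed"
        print("%-10s port %d: %s" % (name, port, state))
```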
Typically, tests are run from a test-runner VM that has pytest installed. The script used to install pytest is:
`clmctest/services/pytest/install.sh`
To run a test, SSH into the test-runner machine:
`vagrant --fixture=monitoring -- ssh test-runner`
Then go to the `/vagrant` directory:
`cd /vagrant`
Run pytest:
`pytest -s clmctest/monitoring/`
#### Tests run from the host
Tests can be run from the host by ssh'ing commands to the test runner, as described in the `.gitlab-ci.yml` file. First, a python source package is created:
`python setup.py sdist --dist-dir=build`
Then the package is installed on the test runner:
`vagrant --fixture=scripts -- ssh test-runner -- -tt "pip3 install /vagrant/build/clmctest-SNAPSHOT.tar.gz"`
Then the tests are run:
`vagrant --fixture=scripts -- ssh test-runner -- -tt "pytest -s --pyargs clmctest.scripts"`
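The pytest invocation inside that ssh'd command can also be expressed programmatically via `pytest.main()`. The script below is an equivalent sketch for running on the test runner, not a file that exists in the repository:

```python
# Sketch: programmatic equivalent of `pytest -s --pyargs clmctest.scripts`.
import sys

import pytest

if __name__ == "__main__":
    # -s shows prints from the tests; --pyargs resolves the installed
    # clmctest.scripts package rather than a filesystem path.
    sys.exit(pytest.main(["-s", "--pyargs", "clmctest.scripts"]))
```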