
FLAME Cross Layer Management and Control

Version: X.X

About this document

Authors

Authors            Organisation
Michael Boniface   University of Southampton, IT Innovation Centre
Simon Crowle       University of Southampton, IT Innovation Centre

Contents

Information Model

The information model describes the structure and format of configuration and monitoring information collected by the CLMC, and how that information is used to support service management decision-making.

https://gitlab.it-innovation.soton.ac.uk/mjb/flame-clmc/blob/integration/docs/monitoring.md

Adaptive Streaming Use Case Scenario

The use case scenario provides an example of using the information model for an MPEG-DASH adaptive streaming service.

https://gitlab.it-innovation.soton.ac.uk/mjb/flame-clmc/blob/integration/docs/adaptive-streaming-usecase-scenario.md

Development Environment

tbd

Testing

Testing is implemented using pytest.

The installation script is here:

test/services/pytest/install.sh

Tests follow these conventions:

  • Tests are written in Python using pytest
  • Related tests are stored in a Python module test/<testmodule> to create a suite of tests. All tests are stored in files named test_*.py; there can be many tests per file and many files per module
  • Each test module has an rspec.yml that provides the baseline "fixture" for the tests in the module
  • Tests are executed against fixtures. Fixtures are modular "setups" created for a test and inserted into the Python code using dependency injection, which offers more flexibility than xUnit-style testing. The baseline deployment is created using vagrant up with an appropriate rspec, and the pytest fixture reads the rspec.yml and makes the configuration available to the test (a sketch of such a fixture follows this list)
  • Tests are executed from a guest VM (not the host) in the repository root using the command pytest test/<testmodule>
  • Pytest scans the directory for all tests in files matching test_*.py and runs them
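
As an illustration, a module-level fixture along these lines could load rspec.yml and hand it to each test. This is a minimal sketch, not code from the repository: the fixture and test names are invented, and it assumes PyYAML is installed and that rspec.yml sits next to the test file.

  import os

  import pytest
  import yaml

  @pytest.fixture(scope="module")
  def test_config():
      """Load the module's rspec.yml and expose it to every test in the module."""
      rspec_path = os.path.join(os.path.dirname(__file__), "rspec.yml")
      with open(rspec_path) as f:
          return yaml.safe_load(f)

  def test_fixture_is_available(test_config):
      # pytest injects the fixture by matching the argument name above.
      assert test_config is not None

Because the fixture is injected by argument name, each test declares exactly the setup it needs, which is the flexibility the convention above refers to.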

Creating a deployment for a test

To set up a simulation of the adaptive streaming use case scenario, first install the vagrant-disksize plugin (if it is not already installed):

vagrant plugin install vagrant-disksize

and then execute the following command:

vagrant --fixture=streaming-sim -- up

This will provision three VMs: clmc-service, ipendpoint1 and ipendpoint2.

The clmc-service VM includes InfluxDB, Kapacitor and Chronograf. The following ports are forwarded from the host machine to the clmc-service VM:

  • InfluxDB: 8086
  • Chronograf: 8888
  • Kapacitor: 9092
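
A quick way to confirm the forwarding works is to hit InfluxDB's /ping endpoint from the host. This is a sketch, assuming the requests package is installed and an InfluxDB 1.x server (whose /ping endpoint answers HTTP 204 when the service is up):

  import requests

  # InfluxDB 1.x answers /ping with HTTP 204 (no body) when it is running.
  response = requests.get("http://localhost:8086/ping", timeout=5)
  assert response.status_code == 204, "InfluxDB not reachable on forwarded port 8086"
  print("InfluxDB reachable via the forwarded port")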

Running the streaming-sim test

SSH into the CLMC server:

vagrant --fixture=streaming-sim -- ssh clmc-service

Then change to the /vagrant directory:

cd /vagrant

The next step is to generate the test data, which can be done in two ways.

The first option is to run a Python script that generates the test data sets:

python3 test/streaming-sim/StreamingSim.py

The script can also be used to clear the generated data via the -c option:

python3 test/streaming-sim/StreamingSim.py -c

The second option is to run the test module directly; it detects whether the data has already been generated and, if not, generates it automatically before executing the tests. Note that when the data is generated this way, a 10-second pause follows generation so that the data can be properly inserted into the database. If the data was already generated using the first option, only the tests are executed.
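
The detect-generate-wait behaviour described above could be sketched as follows. This is illustrative only, not the repository's code; the two helper functions are hypothetical stand-ins:

  import time

  def data_already_generated() -> bool:
      # Hypothetical check, e.g. querying the database for an expected measurement.
      return False

  def generate_data() -> None:
      # Hypothetical stand-in for running StreamingSim.py's generation routine.
      pass

  def ensure_test_data() -> None:
      if not data_already_generated():
          generate_data()
          # Pause so the database can finish ingesting the points before queries run.
          time.sleep(10)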

The command for running the test module is:

pytest -s test/streaming-sim/test_simresults.py

The -s option makes pytest display print output from the test code and is therefore optional.
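
For a feel of what such a test module might assert, the sketch below queries InfluxDB and checks that simulated points exist. It is not the repository's actual test: the database and measurement names are placeholders, and it assumes the influxdb 1.x Python client is installed:

  from influxdb import InfluxDBClient

  def test_simulated_points_exist():
      # Placeholder names; the real database/measurement names live in the repo.
      client = InfluxDBClient(host="localhost", port=8086, database="example_db")
      result = client.query('SELECT COUNT(*) FROM "example_measurement"')
      points = list(result.get_points())
      print(points)  # visible when pytest is run with -s
      assert points, "no simulated data found in the database"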

If pytest is not installed, an easy solution is to install it from the Python Package Index (PyPI):

sudo apt-get install python3-pip

pip3 install pytest