Formal EdgeX testing is currently focused on ensuring that the functional aspects of EdgeX are working correctly. This level of testing involves a number of automated test suites for Unit, Integration and Blackbox API testing and validation. A set of tests that can be used to determine the non-functional characteristics of EdgeX, specifically performance, is also required. This document defines the scope of these tests, identifies the key test cases required for each category of tests, and makes recommendations for possible testing approaches.

Scope

EdgeX Performance Testing should minimally address the following general test case categories:

...

  • Support Notifications – tests that measure the time from when a Notification is sent to the service to the point at which the message has been pushed to all registered receivers, for different load conditions. For example, measure fan-out performance, where one publisher sends a Notification to the service and a varying number of clients subscribe to receive it, or fan-in, where a varying number of concurrent publishers send Notifications to the service and a single client subscribes to receive all of them. (MKB: these should be do-able in isolation of EdgeX given it is a separate service?)
  • Core Command – tests that measure the time it takes to issue GET and PUT commands to a device/sensor via the Command Service for different load conditions. For example, measure the time it takes for a varying number of concurrent clients to each issue a GET command to read a property value from a device, or for a varying number of concurrent clients to set a property on a device with a PUT command (see the load-driver sketch after this list).
  • Rules Engine – do we need dedicated Rules Engine tests?
  • Export Services – do we need dedicated Export Service performance tests? For example, measure the performance when writing to a specific Cloud instance (e.g. Google IoT Core)?
  • Device Services – for baseline and regression test purposes, many of the general performance tests outlined above may be able to be performed using a Virtual Device Service. However, it is also necessary and desirable to be able to repeat at least a subset of these tests with real Device Services (e.g. Modbus, MQTT or BACnet Device Services), perhaps connected to a real device or, minimally, to a simulator. The performance of each individual Device Service will be implementation specific.
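
To illustrate the load-condition measurements described in the Core Command item above, the Go sketch below drives a configurable number of concurrent clients against a single HTTP endpoint and reports latency percentiles. This is a minimal sketch of the measurement approach only; the endpoint URL, port and command path shown are assumptions and must be replaced with the actual Core Command (or Support Notifications) API of the EdgeX version under test.

// loadtest.go - minimal sketch of a concurrent load driver for Core Command GET requests.
// The default URL below is an assumption; adjust it to the deployed EdgeX API version,
// device and command under test.
package main

import (
	"flag"
	"fmt"
	"net/http"
	"sort"
	"sync"
	"time"
)

func main() {
	url := flag.String("url", "http://localhost:48082/api/v1/device/mydevice/command/mycommand",
		"Core Command GET endpoint (placeholder path)")
	clients := flag.Int("clients", 10, "number of concurrent clients")
	requests := flag.Int("requests", 100, "requests issued per client")
	flag.Parse()

	var mu sync.Mutex
	var latencies []time.Duration

	var wg sync.WaitGroup
	for c := 0; c < *clients; c++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := 0; i < *requests; i++ {
				start := time.Now()
				resp, err := http.Get(*url)
				elapsed := time.Since(start)
				if err != nil {
					fmt.Println("request error:", err)
					continue
				}
				resp.Body.Close()
				mu.Lock()
				latencies = append(latencies, elapsed)
				mu.Unlock()
			}
		}()
	}
	wg.Wait()

	if len(latencies) == 0 {
		fmt.Println("no successful requests")
		return
	}
	// Report simple latency statistics for this load level.
	sort.Slice(latencies, func(i, j int) bool { return latencies[i] < latencies[j] })
	fmt.Printf("requests: %d  median: %v  p95: %v  max: %v\n",
		len(latencies),
		latencies[len(latencies)/2],
		latencies[int(float64(len(latencies))*0.95)],
		latencies[len(latencies)-1])
}

Repeating the same run while varying the -clients flag produces the latency-versus-load curves described above.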

Requirements

  1. Automated tests – the performance tests must be able to be integrated into the EdgeX build/test (CI pipeline) infrastructure and run on demand. (MKB: why only performance tests here?)
  2. Standalone tests - the performance tests must be able to be run standalone on a developer’s desktop without dependencies on the EdgeX build/test infrastructure. (MKB: why only performance tests here?)
  3. Results logging and display – the results of the performance tests must be recorded in a format that enables a set of graphical performance test curves to be produced that can be easily displayed in a web browser. This includes being able to display historical performance trends to easily determine any regression in EdgeX performance (see the sketch of a possible result record after this list).
  4. The performance tests should be able to be run on different platforms - CPU (ARM 32 and 64 bit, x86 64 bit) and OS combinations (Linux and Windows).
  5. Performance tests should be able to be run against both dockerized (default) and un-dockerized versions of EdgeX.
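
As a possible approach to requirement 3, the Go sketch below shows one way a structured result record could be defined and emitted as JSON at the end of each test run, so that historical trend curves can be generated and rendered in a browser. All field names and example values are illustrative assumptions, not an agreed schema.

// result.go - sketch of a possible per-run performance result record.
// Field names are illustrative only; the intent is that each run is stored as
// structured data from which historical trend graphs can be produced.
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

type PerfResult struct {
	TestName    string    `json:"testName"`    // e.g. "core-command-get-load"
	EdgeXCommit string    `json:"edgexCommit"` // build under test
	Platform    string    `json:"platform"`    // e.g. "x86_64-linux-docker"
	Clients     int       `json:"clients"`     // load level
	Requests    int       `json:"requests"`    // total successful requests
	MedianMs    float64   `json:"medianMs"`
	P95Ms       float64   `json:"p95Ms"`
	Timestamp   time.Time `json:"timestamp"`
}

func main() {
	r := PerfResult{
		TestName:    "core-command-get-load",
		EdgeXCommit: "abc1234",
		Platform:    "x86_64-linux-docker",
		Clients:     50,
		Requests:    5000,
		MedianMs:    12.3,
		P95Ms:       48.7,
		Timestamp:   time.Now().UTC(),
	}
	out, _ := json.MarshalIndent(r, "", "  ")
	fmt.Println(string(out))
}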

Test Cases

Footprint

  1. Measure the file size in bytes of each EdgeX Microservice executable.
  2. Measure the file size in bytes of each EdgeX Microservice docker image.
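
A minimal Go sketch of how both footprint measurements could be automated is shown below: a file stat for the executable size and docker image inspect for the image size in bytes. The binary paths and image names are placeholders only and would need to be replaced with the actual EdgeX build artifacts.

// footprint.go - sketch of the two footprint measurements above.
// Binary paths and image names are placeholders.
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

func main() {
	// 1. Executable size in bytes via a file stat.
	binaries := []string{"./core-data", "./core-metadata", "./core-command"} // placeholder paths
	for _, b := range binaries {
		info, err := os.Stat(b)
		if err != nil {
			fmt.Println("stat error:", err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", b, info.Size())
	}

	// 2. Docker image size in bytes via `docker image inspect`.
	images := []string{"edgexfoundry/docker-core-data-go"} // placeholder image names
	for _, img := range images {
		out, err := exec.Command("docker", "image", "inspect", "--format", "{{.Size}}", img).Output()
		if err != nil {
			fmt.Println("docker inspect error:", err)
			continue
		}
		fmt.Printf("%s: %s bytes\n", img, strings.TrimSpace(string(out)))
	}
}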

...