Formal EdgeX testing is currently focused on ensuring that the functional aspects of EdgeX are working correctly. This level of testing involves a number of automated test suites for Unit, Integration and Blackbox API testing and validation. A set of tests that can be used to determine the non-functional characteristics of EdgeX, specifically performance, is also required. This document defines the scope of these tests, identifies the key test cases required for each category of tests, and makes recommendations for possible testing approaches.

Scope

EdgeX Performance Testing should minimally address the following general test case categories:

...

  • Core Data – tests that measure the read/write performance of Core Data and the underlying database for different load conditions. For example, varying the number of concurrent clients writing to Core Data with different payload sizes and measuring the time it takes to issue each write call, or varying the number of concurrent clients reading different size collections of events from Core Data and measuring the time it takes to issue each read call (a sketch of such a write load test follows this list).
  • Support Logging – tests that measure log write and log query performance of the Logging Services for different load conditions. For example, measure the time it takes to write a log message for a varying number of Logging (write) clients and varying payload sizes, or measure the time it takes to query a log message for a varying number of Logging (read) clients and varying payload sizes.
  • Support Notifications – tests that measure the time it takes from when a Notification is sent to the service to the point at which the message has been pushed to all registered receivers, for different load conditions. For example, measure the fan-out performance where one publisher sends a Notification to the service and a varying number of clients subscribe to receive the Notification, or the fan-in performance where a varying number of concurrent publishers send Notifications to the service and a single client subscribes to receive all of the Notifications.
  • Core Command – tests that measure the time it takes to issue GET and PUT commands to a device/sensor via the Command Service for different load conditions. For example, measure the time it takes for a varying number of concurrent clients to each issue a GET command to read a property value from a device, or for a varying number of concurrent clients to set a property on a device with a PUT command.
  • Rules Engine – do we need dedicated Rules Engine tests?
  • Export Services – do we need dedicated Export Service performance tests? For example, measure the performance when writing to a specific Cloud instance (e.g. Google IoT Core)?
  • Device Services – for baseline and regression test purposes, many of the general performance tests outlined above may be able to be performed using a Virtual Device Service. However, it is also necessary and desirable to be able to repeat at least a subset of these tests with real Device Services (e.g. Modbus, MQTT or BACnet Device Services), perhaps connected to real devices or minimally connected to a simulator. The performance of each individual Device Service will be implementation specific.
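
As an illustration of the Core Data category above, the sketch below fans out a configurable number of concurrent writer goroutines, each issuing event writes to Core Data over HTTP and recording the latency of each call. The endpoint URL, the payload shape and the reporting are assumptions made purely for illustration; a real harness would take these from test configuration and vary the client count and payload size per run.

    package main

    import (
        "bytes"
        "fmt"
        "net/http"
        "sync"
        "time"
    )

    // Assumed default Core Data event endpoint; adjust for the deployment under test.
    const coreDataURL = "http://localhost:48080/api/v1/event"

    // writeEvent posts one event payload and returns the observed call latency.
    func writeEvent(client *http.Client, payload []byte) (time.Duration, error) {
        start := time.Now()
        resp, err := client.Post(coreDataURL, "application/json", bytes.NewReader(payload))
        if err != nil {
            return 0, err
        }
        resp.Body.Close()
        return time.Since(start), nil
    }

    func main() {
        const clients = 10         // concurrency level, varied per test run
        const callsPerClient = 100

        // Hypothetical minimal event payload; a real test would vary its size.
        payload := []byte(`{"device":"perf-test-device","readings":[{"name":"temperature","value":"42"}]}`)

        latencies := make(chan time.Duration, clients*callsPerClient)
        var wg sync.WaitGroup

        for i := 0; i < clients; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                client := &http.Client{Timeout: 10 * time.Second}
                for j := 0; j < callsPerClient; j++ {
                    if d, err := writeEvent(client, payload); err == nil {
                        latencies <- d
                    }
                }
            }()
        }
        wg.Wait()
        close(latencies)

        // Report a simple mean; a real harness would also record percentiles.
        var total time.Duration
        var count int
        for d := range latencies {
            total += d
            count++
        }
        if count > 0 {
            fmt.Printf("%d successful writes, mean latency %v\n", count, total/time.Duration(count))
        }
    }

The same pattern can be inverted for the read case by replacing the POST with a GET of an event collection and measuring each read call in the same way.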

Requirements

  1. Automated tests – the performance tests must be able to be integrated into the EdgeX build/test (CI pipeline) infrastructure and run on demand.
  2. Standalone tests - the performance tests must be able to be run standalone on a developer’s desktop without dependencies on the EdgeX build/test infrastructure.
  3. Results logging and display – the results of the performance tests must be recorded in a format that enables a set of graphical performance test curves to be produced that can be easily displayed in a web browser. This includes being able to display historical performance trends so that any regression in EdgeX performance can be easily identified (a sketch of one possible results record format follows this list).
  4. The performance tests should be able to be run on different platforms - CPU (ARM 32 and 64 bit, x86 64 bit) and OS combinations (Linux and Windows).
  5. Performance tests should be able to be run against both dockerized (default) and un-dockerized versions of EdgeX.
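
As one possible way to meet requirement 3, each scenario run could emit a small machine-readable record that a charting front end can consume and compare across builds. The record layout and field names below are assumptions for illustration only, not an agreed EdgeX results format.

    package main

    import (
        "encoding/json"
        "fmt"
        "time"
    )

    // PerfResult is a hypothetical per-scenario result record; the fields and
    // naming are assumptions chosen so that runs can be plotted as performance
    // curves and overlaid historically to spot regressions.
    type PerfResult struct {
        Scenario     string        `json:"scenario"`     // e.g. "core-data-write"
        GitCommit    string        `json:"gitCommit"`    // build under test
        Platform     string        `json:"platform"`     // e.g. "linux/arm64"
        Dockerized   bool          `json:"dockerized"`
        Clients      int           `json:"clients"`      // concurrency level
        PayloadBytes int           `json:"payloadBytes"`
        Calls        int           `json:"calls"`
        MeanLatency  time.Duration `json:"meanLatencyNs"` // marshals as nanoseconds
        P95Latency   time.Duration `json:"p95LatencyNs"`
        Timestamp    time.Time     `json:"timestamp"`
    }

    func main() {
        r := PerfResult{
            Scenario:     "core-data-write",
            GitCommit:    "abc1234",
            Platform:     "linux/amd64",
            Dockerized:   true,
            Clients:      10,
            PayloadBytes: 512,
            Calls:        1000,
            MeanLatency:  8 * time.Millisecond,
            P95Latency:   21 * time.Millisecond,
            Timestamp:    time.Now().UTC(),
        }
        out, _ := json.MarshalIndent(r, "", "  ")
        fmt.Println(string(out))
    }

Appending one such record per run to a results store would make it straightforward to plot latency against client count and to overlay historical runs when looking for regressions.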

Test Cases

Footprint

  1. Measure the file size in bytes of each EdgeX Microservice executable.
  2. Measure the file size in bytes of each EdgeX Microservice docker image.
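
A minimal sketch of how both footprint measurements might be automated is shown below. The binary paths and image names are placeholders, and the docker CLI is assumed to be available on the test host.

    package main

    import (
        "fmt"
        "os"
        "os/exec"
        "strings"
    )

    // executableSize reports the on-disk size in bytes of a microservice binary.
    func executableSize(path string) (int64, error) {
        info, err := os.Stat(path)
        if err != nil {
            return 0, err
        }
        return info.Size(), nil
    }

    // imageSize reports the size in bytes of a local docker image by shelling
    // out to the docker CLI ("docker image inspect" prints Size in bytes).
    func imageSize(image string) (string, error) {
        out, err := exec.Command("docker", "image", "inspect", "--format", "{{.Size}}", image).Output()
        if err != nil {
            return "", err
        }
        return strings.TrimSpace(string(out)), nil
    }

    func main() {
        // Placeholder names: substitute the real EdgeX binaries and images under test.
        binaries := []string{"./core-data", "./core-command"}
        images := []string{"edgexfoundry/docker-core-data", "edgexfoundry/docker-core-command"}

        for _, b := range binaries {
            if size, err := executableSize(b); err == nil {
                fmt.Printf("binary %-35s %d bytes\n", b, size)
            }
        }
        for _, img := range images {
            if size, err := imageSize(img); err == nil {
                fmt.Printf("image  %-35s %s bytes\n", img, size)
            }
        }
    }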

...