...

  • Improved unit tests across services (improvements should be driven by specific WG dev teams)
  • Improve blackbox test structure, including reorganization of the tests and better test case documentation (source: ramya.ranganathan@intel.com)
  • New test framework (e.g. Robot or Cucumber) to support additional types of functional/blackbox and system integration tests, e.g. Device Service or system-level latency tests (a minimal example of the kind of blackbox test the framework would host appears after this list)
  • Dockerize blackbox testing infrastructure for deployment simplification/flexibility
  • System integration tests – we currently have no end-to-end tests, e.g. Device Service data read -> Core Data -> Rules Engine or Export Service
  • Blackbox tests for new Application Services microservices
  • Blackbox tests for Device Services - Virtual, Modbus, MQTT, BACnet, OPC UA. Develop a common set of tests that can be run against all Device Services.
  • Configuration testing – all blackbox tests currently run with a single static configuration; we should test additional EdgeX configurations
  • Automated performance testing
    • API load testing (measure response time) and metrics (CPU, memory) collection for all EdgeX microservices (this work was started during the Edinburgh iteration; see the load-test sketch after this list)
    • EdgeX microservice startup times (STRETCH GOAL FOR EDINBURGH; see the startup-timing sketch after this list)
    • Automated system-level latency and throughput testing (e.g. device read to export, or device read to analytics to device actuation)
    • Baseline performance of service binaries running without containers
    • Additional performance test runs against other container technologies supported (e.g. snaps)
    • The ability to create summary reports/dashboards of key EdgeX performance indicators, with alerts when thresholds are exceeded
  • Test coverage analysis using tools such as Codecov.io
  • Static code analysis using tools such as SonarQube or Coverity to identify badly written code, memory leaks, and security vulnerabilities
  • Tracing during testing, e.g. based on the OpenTracing standard with tools such as Zipkin or Jaeger
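
To make the test framework item above concrete, here is a minimal sketch of a blackbox test written in Go (any of the candidate frameworks could express the same check). It assumes a local deployment with Core Data answering its v1 ping route on the default port 48080; the URL and expected reply are assumptions to adjust for the actual deployment:

    package blackbox

    import (
        "io"
        "net/http"
        "testing"
    )

    // TestCoreDataPing verifies that Core Data is up and answering its health
    // endpoint. Driving this from a table of service names/ports would let one
    // test body cover every EdgeX microservice.
    func TestCoreDataPing(t *testing.T) {
        resp, err := http.Get("http://localhost:48080/api/v1/ping") // assumed local v1 route
        if err != nil {
            t.Fatalf("ping request failed: %v", err)
        }
        defer resp.Body.Close()

        if resp.StatusCode != http.StatusOK {
            t.Fatalf("expected HTTP 200, got %d", resp.StatusCode)
        }
        body, err := io.ReadAll(resp.Body)
        if err != nil {
            t.Fatalf("reading response body: %v", err)
        }
        if string(body) != "pong" { // assumed reply for the v1 ping endpoint
            t.Errorf("expected body %q, got %q", "pong", body)
        }
    }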

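For the API load-testing item, a rough sketch of how response times could be collected, again in Go and against an assumed local endpoint. A real harness would iterate over every service's API and sample CPU/memory (for example via docker stats) while the load runs:

    package main

    import (
        "fmt"
        "net/http"
        "sort"
        "sync"
        "time"
    )

    func main() {
        const (
            url     = "http://localhost:48080/api/v1/ping" // assumed target endpoint
            workers = 10                                   // concurrent clients
            reqs    = 100                                  // requests per client
        )

        var (
            mu        sync.Mutex
            latencies []time.Duration
            wg        sync.WaitGroup
        )

        for w := 0; w < workers; w++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                for i := 0; i < reqs; i++ {
                    start := time.Now()
                    resp, err := http.Get(url)
                    elapsed := time.Since(start)
                    if err != nil {
                        continue // count only successful requests
                    }
                    resp.Body.Close()
                    mu.Lock()
                    latencies = append(latencies, elapsed)
                    mu.Unlock()
                }
            }()
        }
        wg.Wait()

        if len(latencies) == 0 {
            fmt.Println("no successful requests")
            return
        }
        sort.Slice(latencies, func(i, j int) bool { return latencies[i] < latencies[j] })
        pct := func(q float64) time.Duration { return latencies[int(q*float64(len(latencies)-1))] }
        fmt.Printf("n=%d p50=%v p95=%v max=%v\n",
            len(latencies), pct(0.50), pct(0.95), latencies[len(latencies)-1])
    }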

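And for the startup-time item, one plausible measurement approach: start a service container, then poll its ping route until it answers, timing the gap. The container name and URL are assumptions for a local docker-based deployment:

    package main

    import (
        "fmt"
        "net/http"
        "os/exec"
        "time"
    )

    func main() {
        const (
            container = "edgex-core-data"                    // assumed container name
            pingURL   = "http://localhost:48080/api/v1/ping" // assumed ping route
            timeout   = 60 * time.Second
        )

        start := time.Now()
        if err := exec.Command("docker", "start", container).Run(); err != nil {
            fmt.Println("docker start failed:", err)
            return
        }

        // Poll until the service answers or the timeout expires.
        for time.Since(start) < timeout {
            resp, err := http.Get(pingURL)
            if err == nil {
                resp.Body.Close()
                if resp.StatusCode == http.StatusOK {
                    fmt.Printf("%s ready in %v\n", container, time.Since(start))
                    return
                }
            }
            time.Sleep(100 * time.Millisecond)
        }
        fmt.Printf("%s not ready after %v\n", container, timeout)
    }
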
Item# | Description | Priority | Target Release
1 | Improved unit tests across services (improvements should be driven by specific WG dev teams) | 1 | Fuji
2 | Improve blackbox test structure, including reorganization of the tests and better test case documentation (source: ramya.ranganathan@intel.com) | 1 | Fuji
3 | New test framework (e.g. Robot or Cucumber) to support additional types of functional/blackbox and system integration tests, e.g. Device Service or system-level latency tests | 1 | Fuji
5 | Dockerize blackbox testing infrastructure for deployment simplification/flexibility | 2 | Geneva
6 | System integration tests – we currently have no end-to-end tests, e.g. Device Service data read -> Core Data -> Rules Engine or Export Service | 2 | Geneva
7 | Blackbox tests for new Application Services microservices | 1 | Fuji
8 | Blackbox tests for Device Services (Virtual, Modbus, MQTT, BACnet, OPC UA); develop a common set of tests that can be run against all Device Services | 1 | Fuji
9 | Configuration testing – all blackbox tests currently run with a single static configuration; additional EdgeX configurations should be tested | 2 | Geneva
10 | Automated performance testing - API load testing (measure response time) and metrics (CPU, memory) collection for all EdgeX microservices (this work was started during the Edinburgh iteration) | 1 | Fuji
11 | Automated performance testing - EdgeX microservice startup times (STRETCH GOAL FOR EDINBURGH) | 1 | Fuji
12 | Automated performance testing - automated system-level latency and throughput testing (e.g. device read to export, or device read to analytics to device actuation) | 1 | Fuji
13 | Automated performance testing - baseline performance of service binaries running without containers | 3 | Hanoi
14 | Automated performance testing - additional performance test runs against other supported container technologies (e.g. snaps) | 3 | Hanoi
15 | Automated performance testing - summary reports/dashboards of key EdgeX performance indicators, with alerts when thresholds are exceeded | 2 | Geneva
16 | Test coverage analysis using tools such as Codecov.io | 1 | Fuji
17 | Static code analysis using tools such as SonarQube or Coverity to identify badly written code, memory leaks, and security vulnerabilities | 2 | Geneva
18 | Tracing during testing, e.g. based on the OpenTracing standard with tools such as Zipkin or Jaeger | 3 | Hanoi

Fuji Release (Proposed)

  • Improved unit tests across services (improvements should be driven by specific WG dev teams)
  • Improve blackbox test structure, including reorganization of the tests and better test case documentation (source: ramya.ranganathan@intel.com)
  • New test framework (e.g. Robot or Cucumber) to support additional types of functional/blackbox and system integration tests, e.g. Device Service or system-level latency tests
  • Blackbox tests for new Application Services microservices
  • Blackbox tests for Device Services - Virtual, Modbus, MQTT, BACnet, OPC UA. Initially develop a common set of tests that can be run against all Device Services.
  • Automated performance testing
    • API load testing (measure response time) and metrics (CPU, memory) collection for all EdgeX microservices (this work was started during the Edinburgh iteration)
    • EdgeX microservice startup times
    • Automated system-level latency and throughput testing (e.g. device read to export, or device read to analytics to device actuation) - STRETCH GOAL
    • The ability to create summary reports/dashboards of key EdgeX performance indicators, with alerts when thresholds are exceeded
  • Test coverage analysis using tools such as Codecov.io

...