...
- Improve unit tests across services (improvements should be driven by the specific WG dev teams)
- Improve blackbox test structure, including reorganization of the tests and better test case documentation (source: ramya.ranganathan@intel.com)
- Dockerize the blackbox testing infrastructure for simpler, more flexible deployment
- System integration tests – currently we have no end-to-end tests, e.g. Device Service reads data -> Core Data -> Rules Engine or Export Service (a minimal sketch of such a test appears after this list)
- Blackbox tests for new Application Services microservices
- Blackbox tests for Device Services – Virtual, Modbus, MQTT, BACnet, OPC UA. Develop a common set of tests that can be run against all Device Services (see the table-driven sketch after this list).
- Configuration testing – currently all blackbox tests run with a single static configuration
- Automated performance testing
  - API load testing (measuring response time) and metrics (CPU, memory) collection for all EdgeX microservices (this work was started during the Edinburgh iteration); a measurement-loop sketch appears after this list
  - EdgeX microservice startup times
  - Automated system-level latency and throughput testing (e.g. device read to export, or device read to analytics to device actuation)
  - Baseline performance of service binaries (no container)
  - Additional performance test runs against other container technologies supported (e.g. snaps)
  - The ability to create summary reports/dashboards of key EdgeX performance indicators, with alerts when thresholds are exceeded
- Test coverage analysis, e.g. using tools such as Codecov.io
- Static code analysis, e.g. using tools such as SonarQube or Coverity, to identify badly written code, memory leaks, and security vulnerabilities
- Tracing during testing, e.g. based on the OpenTracing standard with tools such as Zipkin or Jaeger
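
To make the system integration gap above concrete, here is a minimal Go sketch of one end-to-end check: poll Core Data until events produced by a Device Service show up. It is a starting point, not a definitive implementation: it assumes Core Data's v1 REST API on its default port (48080), and the device name `Random-Integer-Device` is a placeholder that would need to match the deployment under test.

```go
package integration

import (
	"encoding/json"
	"fmt"
	"net/http"
	"testing"
	"time"
)

// Assumed default Core Data base URL (v1 API, port 48080).
const coreData = "http://localhost:48080/api/v1"

func TestDeviceReadingsReachCoreData(t *testing.T) {
	const device = "Random-Integer-Device" // placeholder; must match the deployment

	deadline := time.Now().Add(30 * time.Second)
	for time.Now().Before(deadline) {
		// Ask Core Data for the latest events recorded for this device.
		resp, err := http.Get(fmt.Sprintf("%s/event/device/%s/10", coreData, device))
		if err != nil {
			t.Fatalf("Core Data unreachable: %v", err)
		}
		var events []map[string]interface{}
		decodeErr := json.NewDecoder(resp.Body).Decode(&events)
		resp.Body.Close()
		if decodeErr == nil && len(events) > 0 {
			return // at least one reading made the Device Service -> Core Data hop
		}
		time.Sleep(2 * time.Second) // the device service pushes on its own schedule
	}
	t.Fatal("no events for the device reached Core Data within 30s")
}
```

The same polling pattern extends to the later legs of the pipeline (Rules Engine, Export Service) by asserting on those services' outputs instead.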
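For the common Device Service test set, one low-cost pattern is a table-driven test keyed by service base URL, so identical assertions run against Virtual, Modbus, MQTT, and the rest. This sketch only checks the standard `/api/v1/ping` health endpoint; the base URLs and ports are illustrative assumptions that a real harness would load from configuration.

```go
package blackbox

import (
	"net/http"
	"testing"
)

// One entry per Device Service under test. Ports are illustrative
// assumptions; a real harness would load these from configuration.
var deviceServices = map[string]string{
	"device-virtual": "http://localhost:49990",
	"device-modbus":  "http://localhost:49991",
	"device-mqtt":    "http://localhost:49982",
}

func TestDeviceServicesRespondToPing(t *testing.T) {
	for name, base := range deviceServices {
		t.Run(name, func(t *testing.T) {
			// Every EdgeX service exposes the same health-check endpoint,
			// which is what lets one test run against all Device Services.
			resp, err := http.Get(base + "/api/v1/ping")
			if err != nil {
				t.Fatalf("%s unreachable: %v", name, err)
			}
			defer resp.Body.Close()
			if resp.StatusCode != http.StatusOK {
				t.Fatalf("%s ping returned status %d", name, resp.StatusCode)
			}
		})
	}
}
```

Protocol-specific assertions (e.g. a Modbus register read) would slot in as additional subtests per service.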
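For the API load testing item, a sketch of the core measurement loop: fire N concurrent GETs at one endpoint and report min/mean/max latency. The target (Core Data's ping endpoint on its assumed default port) is illustrative; a full run would sweep every EdgeX microservice and pair the latencies with CPU/memory samples from the container runtime.

```go
package main

import (
	"fmt"
	"net/http"
	"sort"
	"sync"
	"time"
)

// loadTest fires n concurrent GETs at url and returns the sorted latencies.
func loadTest(url string, n int) []time.Duration {
	latencies := make([]time.Duration, n)
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			start := time.Now()
			resp, err := http.Get(url)
			if err == nil {
				resp.Body.Close()
			}
			latencies[i] = time.Since(start) // each goroutine writes its own slot
		}(i)
	}
	wg.Wait()
	sort.Slice(latencies, func(a, b int) bool { return latencies[a] < latencies[b] })
	return latencies
}

func main() {
	// Assumed target: Core Data's ping endpoint on its default port.
	lats := loadTest("http://localhost:48080/api/v1/ping", 100)
	var total time.Duration
	for _, l := range lats {
		total += l
	}
	fmt.Printf("min=%v mean=%v max=%v\n",
		lats[0], total/time.Duration(len(lats)), lats[len(lats)-1])
}
```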
...