This process was approved by TSC vote on 7/22/20.

This page outlines the process to objectively assess the security risk of third-party open source components or dependencies, taking into account both the legacy way of performing the assessment and the new process discussed within the project during the Hanoi development time frame.

The Process

The process should take into consideration relevant data such as the project's age, popularity / maturity, evidence of security practices, recent commit history, diversity of committers, established CVE practices, or other observable evidence. In terms of licensing compliance, the ideal process should also consider the license associated with the component.

Additionally, the ideal process should take into account the following scenarios:

  1. Existing Code (Skeletons in the Closet)
  2. Code in Holding (Analysis of code before it is accepted)
  3. Pull Request with new dependency

Developer Process for Each Use Case

Each use case below is described in terms of its Process, the Tools involved, and its Applicability.

Use Case 1: Existing Code (Skeletons in the Closet)

Process:

  • Automated scan within build automation via the Snyk CLI (see the example commands at the end of this use case).
  • Scans of the published Docker images via Snyk, with notifications to SIR Team members / Snyk administrators.
  • Working groups should review any issues identified via the build automation tools and address them within the context of the working group reviews.

Tools:

  • Snyk
  • Community Bridge Advanced Snyk Reports
  • Clair

Applicability: Scan automation occurs within the build, on PR merge to master.
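As a concrete illustration of what the automated tooling runs, the commands below are a sketch only - the exact CI invocations and the image name are placeholders, not the project's actual pipeline configuration:

# Scan the project's dependencies for known vulnerabilities (run at the repo root)
snyk test

# Scan a published Docker image; the image name here is a hypothetical example
snyk container test edgexfoundry/example-service:latest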


Use Case 2: Code in Holding (Analysis of code before it is accepted)

Process:

  • Review the catalog of approved packages in this wiki. Compare that list to the list being submitted. Provide a summary of the differences, including the list of new packages and why they are needed.
  • Complete the paper study for each package. See the paper study process as written for use case 3.

Applicability: When code is under consideration for moving out of holding and into the main EdgeX Foundry Org.


Use Case 3: Pull Request with new dependency

Process:

The submitter of a Pull Request (PR) will complete the Pull Request template, noting any changes that introduce new dependencies (e.g. imports or Go module dependencies).

The standard Pull Request template includes a question that asks: "Are there any new imports or modules? If so, what are they used for and why?"

Submitter of the PR will add a dependency label to the pull request.

If the dependency is security related, the submitter will add the security-review label to the PR so a member of the Security WG can help review.

The submitter should include scan results that cover both compliance (license) and security vulnerability (e.g. CVE) data, so that they can be reviewed by a Security WG member.
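For the license portion of those scan results, one possible approach (a sketch only; go-licenses is an example tool chosen here, not one mandated by this process) is:

# Report the license of every package the module pulls in
go install github.com/google/go-licenses@latest
go-licenses csv ./...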

Note: for Go projects, reviewers will see that go.mod is one of the changed files.

Run these commands at the root of your repo to list all module dependencies and the module dependency graph:

GO111MODULE=on go list -m all 

GO111MODULE=on go mod graph
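To see exactly which modules a PR adds, one approach (a sketch; the branch names and the example module path are placeholders) is to capture the module list on the base branch and on the PR branch and compare them:

# Capture the full module list on each branch, then compare
git checkout master
GO111MODULE=on go list -m all > /tmp/deps-base.txt
git checkout my-pr-branch
GO111MODULE=on go list -m all > /tmp/deps-pr.txt
diff /tmp/deps-base.txt /tmp/deps-pr.txt

# For any new module, go mod why explains what pulls it in
GO111MODULE=on go mod why -m github.com/example/newdep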

For a PR with new dependencies, the submitter of the PR will complete a manual paper study to collect the following data points for review (example commands for gathering several of these counts follow the list):

  • Total increase in new imports: (count)
    • Does the new import introduce additional import dependencies? If so, how many?
    • Ensure that every one of the new dependencies is checked against the same criteria.
  • Releases/Tags: (count)
    • We should avoid new imports that have never had a release and/or tag. How many is too few is a judgement call, and it probably also involves considering how long ago the last release was and how far apart releases have been.
  • Contributors: (count)
  • License - what is the license, and is it Apache 2.0 compatible?
  • Stars/Forks/Watchers: (count)
    • These are all indications of how wide-spread the package is used.
  • godoc.org metrics: (count)
    • The individual package pages hosted by godoc.org include metrics at the base of the page which indicate how many packages import the package.
  • Subjective opinion of the reviewers – at the end of the day, we rely on our reviewers to vet new code. Reviewers should give thought to whether the code is improving our project, whether we'd be better off to implement the functionality ourselves, and at the same time considering whether this new import itself comes with too many dependencies (e.g. go-kit).

  • When submitting the PR, complete the PR template and set both labels: dependency, and security-review (security components only).
  • On approval, notify the working group chair to update the catalog of approved packages if required.
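Several of the counts above (releases, tags, contributors, stars, forks, watchers, license) can be gathered for a GitHub-hosted candidate via the GitHub REST API. The sketch below assumes OWNER/REPO is replaced with the real repository; note that the API paginates results, so counts are capped at 100 per request unless you follow the Link header:

# Stars, forks, watchers, and declared license
curl -s https://api.github.com/repos/OWNER/REPO | jq '{stars: .stargazers_count, forks: .forks_count, watchers: .subscribers_count, license: .license.spdx_id}'

# Rough release, tag, and contributor counts (capped at 100 per page)
curl -s "https://api.github.com/repos/OWNER/REPO/releases?per_page=100" | jq 'length'
curl -s "https://api.github.com/repos/OWNER/REPO/tags?per_page=100" | jq 'length'
curl -s "https://api.github.com/repos/OWNER/REPO/contributors?per_page=100" | jq 'length'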



Applicability: On a Pull Request, whenever a new dependency is introduced, as shown through changes to go.mod.
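A quick way for a reviewer or submitter to spot such changes (a sketch; the base branch name is a placeholder) is to diff the module files against the target branch:

# Show dependency changes introduced by the PR
git diff origin/master -- go.mod go.sum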



Approved Go Modules

See Approved Go Modules/Packages for the catalog of approved modules (those in red are being investigated for replacement - avoid them if possible).

Process Research

Explore Documentation: Issue-1947




8 Comments

  1. James/Tony - some points for consideration...
    On the current process, I think the list is good, but do we need to provide some guidance on the criteria - if not some #'s - then some suggestions on example modules that would fail and those that would pass based on that criteria set?
    Also, the license has to be Apache or equivalent and we should define what those are (MIT, BSD, etc.)

    Are we saying the Ideal process is our target?  All of it or just parts? 

  2. Jim - thanks for the feedback. 

    Per our meeting yesterday, I understood Tony would try to simplify the acceptance criteria from the laundry list of questions I had compiled in my explore work.

    As for the license criteria I noted: License Acceptance Criteria has already been defined within the project.

  3. A good example of something that should fail acceptance and would be considered as rejected for use would be something like this one:

    Package: github.com/goburrow/modbus:
    evidence: https://github.com/goburrow/modbus/pulse/monthly


  4. A good example of a dependency that should pass acceptance and would be considered as accepted for use would be something like this one: 

    Package: github.com/hashicorp/vault:
    evidence: https://github.com/hashicorp/vault/pulse

    or 

    Package: github.com/Kong/kong
    evidence: https://github.com/Kong/kong/pulse

  5. In terms of the ideal process, the TSC can make the call on conditional acceptance of a component that does not measure up to the acceptance criteria.

    1. Perhaps a simplified process for making the decision to accept or reject an open source dependency (use case 2 / use case 3 only) could be the following:

      • Review the GitHub pulse data - evidence of lack of maintenance on a project (no active commits, no recent releases, low number of committers, stale activity with PRs and Issues management etc.) equates to rejection for use within the project
      • Review Security vulnerability data - evidence of lack of a process to address reported security vulnerabilities and / or evidence of a publicly reported vulnerability without any course of action to address equates to rejection for use within the project
      • Review License - lack of evidence that the license fits per the criteria outlined for acceptance, equates to rejection for use within the project


  6. James, as promised, here's my take on the simplified requirements for vetting a new 3rd party import:

    • Total increase in new imports: does the new import introduce additional import dependencies, if so, how many?
      • Ensure that every one of the new dependencies is checked for the same criteria.
    • Releases/Tags: count
      • We should avoid new imports that have never had a release and/or tag. How many is too few is a judgement call, and it probably also involves considering how long ago the last release was and how far apart releases have been.
    • Contributors: count
    • License - what is the license, and is it Apache 2.0 compatible?
    • Stars/Forks/Watchers: counts
      • These are all indications of how wide-spread the package is used.
    • godoc.org metrics: count
      • The individual package pages hosted by godoc.org include metrics at the base of the page which indicate how many packages import the package.
    • Subjective opinion of the reviewers – at the end of the day, we rely on our reviewers to vet new code. Reviewers should give thought to whether the code is improving our project, whether we'd be better off to implement the functionality ourselves, and at the same time considering whether this new import itself comes with too many dependencies (e.g. go-kit).

    I think we need to be careful about being too prescriptive about things like security vulnerability processes (or lack thereof), as I would wager that very few of our existing package imports include published security vulnerability processes. Likewise, if we want folks to do a CVE search on the imports, we should provide a link and instructions on how to do this. I think a better ask would be to ask reviewers to consider the existing bug/issue list of the project as a part of the overall process.

  7. Only comment I have is about the formatting of the approved Go modules. I liked the way it was initially presented in table format with more detail about each module. Not sure if all those details would be helpful here or not.

    Example:

    Module Name | License | AVOID | Notes | Etc.

    bitbucket.org/bertimus9/systemstat v0.0.0-20180207000608-0eeff89b0690 | ... | YES | Too old | ...

    github.com/armon/circbuf v0.0.0-20150827004946-bbbad097214e | ... | YES | ...

    github.com/armon/go-metrics v0.0.0-20180917152333-f0300d1749da | ... | NO | ...