The evaluation is not yet fully automated. The current evaluation results are identical to those reported in the PLoS ONE collection review on "Open source software in quantum computing". We are working hard on fully automating the evaluation so that we can publish monthly updates soon.
Project selection
We only evaluate a subset of the projects presented in our exhaustive list of open quantum software repositories. The following decision tree outlines the process used to select free and open source software projects for evaluation. The main goal is to identify projects that have already built a community. The acronym PR stands for pull request, a form of code contribution on software hosting platforms.
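To make the idea concrete, the sketch below expresses this kind of decision logic as a small Python function. The criteria and thresholds shown here are purely illustrative assumptions, not the exact rules of our selection process.

```python
# Purely illustrative sketch of a community-oriented selection rule.
# The fields and thresholds are hypothetical, not the published criteria.

def select_for_evaluation(project):
    """Return True if a project plausibly has a community built around it."""
    # Only free and open source projects with a public repository qualify.
    if not (project["open_source"] and project["public_repository"]):
        return False
    # A community is assumed to show up as more than a single author...
    if project["contributors"] < 2:
        return False
    # ...and as code contributions (PRs) or issues from external users.
    if project["external_prs"] == 0 and project["open_issues"] == 0:
        return False
    return True

# Example usage with a hypothetical project record:
print(select_for_evaluation({
    "open_source": True,
    "public_repository": True,
    "contributors": 5,
    "external_prs": 3,
    "open_issues": 12,
}))
```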
These rules are not set in stone and we are always open to suggestions. If you have thoughts, comments or concerns, please open an issue on our GitHub page.
Repository evaluation
The table below summarizes the static analysis of each project and its source code. We report the version control and issue tracking systems as well as the total number, attention rate and average response time of all open and closed issues and pull requests (PRs). We define the attention rate as 1 - I, where I is the fraction of ignored issues and pull requests relative to the total number of issues and pull requests. An ideal project never ignores any of its user or developer questions or contributions and would have an attention rate of 1.0. The average response time measures how long it takes a core contributor (or an employee of the company hosting the project) to respond to issues or pull requests. We also check for the existence of a test suite and report the resulting code coverage for most projects. Code complexity is only reported for projects written in Python, since we are not aware of tools that allow fast retrieval of this metric for other languages (if you know how to do this, please let us know through a GitHub issue or email).
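As a concrete illustration of these two metrics, the attention rate and average response time can be computed from a list of issues and PRs roughly as follows. This is a minimal sketch that assumes each item records how long it took a core contributor to respond (or that it was never answered); it is not the exact script behind the numbers below.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Item:
    """An issue or pull request (hypothetical minimal record)."""
    days_to_first_core_response: Optional[float]  # None if never answered by a core contributor

def attention_rate(items):
    """Attention rate = 1 - I, where I is the fraction of ignored issues/PRs."""
    ignored = sum(1 for it in items if it.days_to_first_core_response is None)
    return 1.0 - ignored / len(items)

def average_response_time(items):
    """Mean time in days until a core contributor responded, over answered items only."""
    answered = [it.days_to_first_core_response
                for it in items if it.days_to_first_core_response is not None]
    return sum(answered) / len(answered)

items = [Item(0.5), Item(3.0), Item(None), Item(1.5)]
print(attention_rate(items))         # 0.75
print(average_response_time(items))  # ~1.67 days
```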
Name | Version control system | Issue tracking system | Issues/PRs | Attention rate | Average response time (days) | Test suite | Code coverage | Cyclomatic Complexity |
---|---|---|---|---|---|---|---|---|
Cirq | Git | GitHub | 448/686 | 0.54 | 2.6 | ✓ | 94% | 2.99 |
Cliffords.jl | Git | GitHub | 6/12 | 0.33 | <1 | ✓ | - | - |
dimod | Git | GitHub | 110/201 | 0.30 | 5.3 | ✓ | 94% | 2.96 |
dwave-system | Git | GitHub | 54/72 | 0.24 | 8.2 | ✓ | 87% | 3.47 |
FermiLib | Git | GitHub | 24/134 | 0.31 | <1 | ✓ | 99% | 2.43 |
Forest - Grove | Git | GitHub | 53/130 | 0.51 | 17.7 | ✓ | 72% | 3.25 |
Forest - pyQuil | Git | GitHub | 293/385 | 0.41 | 10.6 | ✓ | 88% | 2.65 |
OpenFermion | Git | GitHub | 137/345 | 0.61 | 1.3 | ✓ | 100% | 2.46 |
ProjectQ | Git | GitHub | 84/198 | 0.75 | 4.0 | ✓ | 100% | 4.02 |
PyZX | Git | GitHub | 6/2 | 0.80 | <1 | ✓ | 51% | 4.42 |
QGL.jl | Git | GitHub | 17/13 | 0.75 | 130.6 | ✓ | - | - |
Qbsolv | Git | GitHub | 50/85 | 0.17 | 22.2 | ✓ | 95% | - |
Qiskit Aqua | Git | GitHub | 43/141 | 0.20 | 1.8 | ✓ | 67% | 3.04 |
Qiskit Terra | Git | GitHub | 526/713 | 0.11 | 16.0 | ✓ | 76% | 2.56 |
Qiskit Tutorials | Git | GitHub | 94/274 | 0.40 | 8.6 | ❌ | - | - |
Qiskit.js | Git | GitHub | 19/8 | 0.33 | 4.4 | ✓ | 66% | - |
Qrack | Git | GitHub | 7/78 | 0.07 | 8.7 | ✓ | 87% | - |
Quantum Fog | Git | GitHub | 17/1 | 1.00 | <1 | ❌ | 0% | 3.32 |
Quantum++ | Git | GitHub | 8/45 | 0.88 | <1 | ✓ | 72% | - |
Qubiter | Git | GitHub | 14/3 | 0.75 | <1 | ❌ | 0% | - |
Quirk | Git | GitHub | 286/131 | 0.96 | <1 | ✓ | - | - |
reference-qvm | Git | GitHub | 6/14 | 0.44 | 75.6 | ✓ | 80% | 3.99 |
ScaffCC | Git | GitHub | 15/11 | 0.18 | 10.1 | ✓ | - | - |
Strawberry Fields | Git | GitHub | 16/20 | 0.73 | 1.2 | ✓ | 97% | 2.70 |
XACC | Git | GitHub | 65/14 | 0.65 | <1 | ✓ | - | - |
XACC VQE | Git | GitHub | 22/4 | 0.33 | 8.8 | ✓ | - | - |
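The cyclomatic complexity values for Python projects can be reproduced with a tool such as radon. The snippet below is a sketch using radon's Python API that averages the complexity over all functions, methods and classes in a source tree; the path handling and averaging choices are assumptions and may not match the exact settings used for the table above.

```python
# Sketch: average cyclomatic complexity of a Python project using radon
# (https://pypi.org/project/radon/). Averaging over all blocks is our choice.
from pathlib import Path
from radon.complexity import cc_visit

def average_complexity(project_root):
    complexities = []
    for path in Path(project_root).rglob("*.py"):
        source = path.read_text(encoding="utf-8", errors="ignore")
        try:
            blocks = cc_visit(source)  # functions, methods and classes, each with a .complexity score
        except SyntaxError:
            continue  # skip files that do not parse (e.g. Python 2 only code)
        complexities.extend(block.complexity for block in blocks)
    return sum(complexities) / len(complexities) if complexities else float("nan")

print(average_complexity("path/to/project"))
```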
Documentation evaluation
The image below shows the detailed results of our qualitative documentation analysis in the form of a colour-coded heatmap, with scores ranging from 1 (bad) to 5 (good). We evaluated each project on the quality of its source code documentation, README files, changelogs, user documentation and tutorials. The detailed rubric used for scoring each of the five aspects can be found in the Supporting Information of the review paper “Open source software in quantum computing”.
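For reference, a heatmap of this kind can be rendered with matplotlib from a table of scores. The snippet below is a generic sketch with placeholder project names and scores, not the script that produced the published figure.

```python
# Generic sketch of a colour-coded documentation heatmap (scores 1-5).
# Project names and scores are placeholders, not the published values.
import matplotlib.pyplot as plt
import numpy as np

projects = ["Project A", "Project B", "Project C"]
aspects = ["Source code docs", "README", "Changelog", "User docs", "Tutorials"]
scores = np.array([
    [5, 4, 2, 5, 4],
    [3, 5, 1, 4, 3],
    [4, 3, 3, 2, 5],
])

fig, ax = plt.subplots()
im = ax.imshow(scores, cmap="RdYlGn", vmin=1, vmax=5)
ax.set_xticks(range(len(aspects)))
ax.set_xticklabels(aspects, rotation=45, ha="right")
ax.set_yticks(range(len(projects)))
ax.set_yticklabels(projects)
for i in range(len(projects)):
    for j in range(len(aspects)):
        ax.text(j, i, str(scores[i, j]), ha="center", va="center")
fig.colorbar(im, ax=ax, label="Score (1 = bad, 5 = good)")
fig.tight_layout()
plt.show()
```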
Community evaluation
Lastly, the following table shows the evaluation results for the community analysis. For each project, we analyse whether a public development roadmap exists and whether the software is published in the form of releases. Additionally, we report the GitHub community profile score, the total number of contributors, the type of user- and developer-centric discussion channels, and the type of public code review process, specifically whether it applies to internal (I) and/or external (E) contributors.
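The community profile score corresponds to GitHub's community profile checklist, which can also be queried programmatically via the community profile endpoint of the GitHub REST API. The sketch below, using the requests library, shows one way such a lookup could work; the mapping of the response to an "X/7" score is our interpretation, not necessarily the exact tooling behind the table.

```python
# Sketch: query GitHub's community profile metrics for a repository.
# Endpoint: GET /repos/{owner}/{repo}/community/profile (GitHub REST API).
import requests

def community_profile(owner, repo, token=None):
    headers = {"Accept": "application/vnd.github+json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    url = f"https://api.github.com/repos/{owner}/{repo}/community/profile"
    response = requests.get(url, headers=headers, timeout=30)
    response.raise_for_status()
    data = response.json()
    # 'files' lists recommended community files (README, license, contributing, ...);
    # 'health_percentage' is GitHub's own aggregate score.
    present = sum(1 for value in data.get("files", {}).values() if value)
    return data["health_percentage"], present

print(community_profile("quantumlib", "Cirq"))
```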
Project | Roadmap | Releases | Contributors | User-discussion channels | Developer-discussion channels | Public review process | Community Profile |
---|---|---|---|---|---|---|---|
Cirq | ❌ | ✓ | 28 | Stack Exchange | - | E+I | 4/7 |
Cliffords.jl | ❌ | ✓ | 7 | - | - | E | 3/7 |
dimod | ❌ | ✓ | 11 | Forum | - | E+I | 5/7 |
dwave-system | ❌ | ✓ | 6 | Forum | - | E+I | 4/7 |
FermiLib | ❌ | ✓ | 10 | - | - | E+I | 3/7 |
Forest - Grove | ❌ | ✓ | 24 | Slack | Slack | E+I | 3/7 |
Forest - pyQuil | ❌ | ✓ | 46 | Slack | Slack | E+I | 3/7 |
OpenFermion | ❌ | ✓ | 26 | - | - | E+I | 3/7 |
ProjectQ | ❌ | ✓ | 10 | - | - | E+I | 3/7 |
PyZX | ❌ | ❌ | 3 | - | - | - | 3/7 |
QGL.jl | ❌ | ❌ | 3 | - | - | E+I | 3/7 |
Qbsolv | ❌ | ✓ | 18 | Forum | - | E+I | 5/7 |
Qiskit Aqua | ❌ | ✓ | 14 | Forum | - | E+I | 7/7 |
Qiskit Terra | ✓ | ✓ | 67 | Forum, Slack | Slack | E+I | 7/7 |
Qiskit Tutorials | ❌ | ❌ | 37 | - | - | E+I | 3/7 |
Qiskit.js | ❌ | ✓ | 4 | Forum | - | E | 7/7 |
Qrack | ❌ | ✓ | 2 | - | - | E+I | 3/7 |
Quantum Fog | ❌ | ❌ | 2 | - | - | E | 3/7 |
Quantum++ | ❌ | ✓ | 3 | Gitter | - | E | 5/7 |
Qubiter | ❌ | ❌ | 2 | - | - | E | 3/7 |
Quirk | ❌ | ✓ | 3 | - | - | E | 4/7 |
reference-qvm | ❌ | ✓ | 8 | - | - | E+I | 3/7 |
ScaffCC | ❌ | ✓ | 7 | - | - | E | 3/7 |
Strawberry Fields | ❌ | ✓ | 5 | Slack | Slack | E+I | 7/7 |
XACC | ❌ | ❌ | 6 | - | - | E | 4/7 |
XACC VQE | ❌ | ❌ | 2 | - | - | E | 3/7 |
Let us know if you disagree with any of these results. If you are maintaining one of the projects above and you think that we made a mistake in our evaluation, we want to hear from you! Please open an issue on our GitHub page.