ICPP Reproducibility Badges
Based on the NISO reproducibility badging standard, ICPP will consider the following badges:
Open Research Objects (ORO)
This badge signals that author-created digital objects used in the research (including data and code) are permanently archived in a public repository that assigns a global identifier and guarantees persistence, and are made available under standard open licenses that maximize artifact availability.
The following requirements are necessary to receive this badge:
- Artifacts used in the research (including data and code) are permanently archived in a public repository that assigns a global and unique identifier (DOI) and guarantees persistence, and are made available under standard open licenses that maximize artifact availability. DOIs can be acquired via Zenodo, FigShare, Dryad, or Software Heritage. If authors already host their code on GitHub, Zenodo provides an integration with GitHub that automatically generates DOIs from Git tags.
Please do NOT provide Dropbox links or gzipped files hosted on personal webpages.
Note that for physical objects relevant to the research, metadata about the object should be made available.
Research Objects Reviewed (ROR)
This badge signals that all relevant author-created digital objects used in the research (including data and code) were reviewed according to the following criteria:
- Metadata / Documentation: Are the artifacts sufficiently documented to enable them to be exercised by readers of the paper?
- Completeness: Do the submitted artifacts include all the key components described in the paper?
- Exercisability: Do the submitted artifacts include the scripts and data needed to run the experiments described in the paper, and can the software be successfully executed?
We encourage authors to describe:
- the workflow underlying the paper;
- some of the black boxes in that workflow, or a white box (e.g., source, configuration files, build environment);
- the input data: either the process used to generate the input data should be made available, or, when the data is not generated, the actual data itself or a link to it should be provided;
- the environment (system configuration and initialization, scripts, workload, measurement protocol) used to produce the raw experimental data; and
- the scripts needed to transform the raw data into the graphs included in the paper.
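As a concrete illustration of the last item, a transformation script might look like the following minimal sketch. The file layout, column names, and metrics are hypothetical, not prescribed by ICPP; the point is that reviewers should be able to regenerate the numbers behind each figure from the raw data with a single script.

```python
# Hypothetical sketch of a raw-data-to-figure script for an artifact.
# Column names ("algorithm", "response_time_ms") are illustrative only.
import statistics
from collections import defaultdict

def summarize(rows):
    """Group raw (algorithm, response time) measurements and
    return the mean response time per algorithm."""
    groups = defaultdict(list)
    for row in rows:
        groups[row["algorithm"]].append(float(row["response_time_ms"]))
    return {name: statistics.mean(times) for name, times in groups.items()}

if __name__ == "__main__":
    # In a real artifact these rows would be read from the archived
    # raw-data files; they are inlined here so the sketch is self-contained.
    raw = [
        {"algorithm": "baseline", "response_time_ms": "120.0"},
        {"algorithm": "baseline", "response_time_ms": "118.0"},
        {"algorithm": "proposed", "response_time_ms": "80.0"},
        {"algorithm": "proposed", "response_time_ms": "84.0"},
    ]
    for name, mean_ms in sorted(summarize(raw).items()):
        print(f"{name}: {mean_ms:.1f} ms")
```

A script of this shape, checked into the artifact alongside the raw data, lets evaluators verify that the plotted values follow mechanically from the measurements.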
Results Reproduced (ROR-R)
This badge signals that the research objects were reviewed and that the evaluators then successfully reproduced the key computational results using the author-created research objects, methods, code, and conditions of analysis. Note that we do not aim to recreate exact or identical results, especially hardware-based results. However, we do aim to:
- Reproduce Behavior: This is of specific importance where results are hardware-dependent. Bit-wise reproducibility is not our goal. If we can get access to the same hardware used in the experiments, we will aim to reproduce the results on that hardware. If not, we will work with the authors to determine equivalent or approximate behavior on available hardware. For example, if results concern response time, our objective will be to check that a given algorithm is significantly faster than another, or that a given parameter affects the behavior of a system positively or negatively.
- Reproduce the Central Results and Claims of the Paper: We do not aim to reproduce all the results and claims of the paper. The reproducibility committee will determine the central results of the accepted paper and will work with the authors to confirm them. Once confirmed, the badge will be assigned based on the committee being able to reproduce the behavior of these central results.