CODES+ISSS is proud to join the other ESWEEK conferences in adopting an Artifact Evaluation (AE) process for accepted papers. This initiative promotes transparency, reproducibility, and reusability in research. Authors of accepted papers are invited to submit artifacts for evaluation under the following categories:
- Artifacts Available: Ensures the artifact is permanently archived and publicly accessible.
- Artifacts Evaluated: Confirms the artifact is functional and reusable.
- Results Reproduced: Validates that the key results of the paper can be fully reproduced using the submitted artifact.
Artifact Badges
Artifacts are reusable products that other researchers can use to bootstrap their own work. Experience shows that papers accompanied by artifacts earn more citations and greater recognition in the research community. The badges described below recognize different levels of artifact availability, quality, and reproducibility.
| Badge Name | Description |
|---|---|
| Artifact Available | The artifact is permanently archived in a publicly accessible repository and is relevant to the paper. |
| Artifact Evaluated | Functional: the artifact is executable and can produce results supporting the claims in the paper. Reusable: in addition to being functional, the artifact is well documented and structured to facilitate reuse and future adaptation; reviewers will assess the ease of reuse in new contexts. |
| Artifact Reproduced | Reviewers were able to fully reproduce the key results reported in the paper using the submitted artifact. |
Note: Before claiming the Artifact Evaluated or Artifact Reproduced badges, authors are responsible for testing their artifacts in environments other than the one in which they were originally developed.
Submission Guidelines
Authors are required to prepare a single PDF containing the following information:
- Archive Link: Provide a link to the artifact in a publicly accessible, permanent repository (e.g., Zenodo, Figshare, or institutional repositories). Note that GitHub/GitLab is not considered permanent.
- Execution Steps: Include clear instructions for installation, usage, and expected outcomes. It is highly recommended to provide a single script, run.sh, that enables single-command execution (an illustrative run.sh sketch follows this list). To keep the review tractable, ensure that the complete artifact evaluation can be finished within 30 minutes. It is the authors’ responsibility to streamline the evaluation experience for reviewers.
- Badge Claims: Specify which badges are being claimed (e.g., Artifact Available, Artifact Evaluated – Functional, Reusable, or Artifact Reproduced) and provide justification for each claim.
- Precompiled Environment: Include a link to a precompiled environment (e.g., on DockerHub) that bundles the artifact’s dependencies (a container usage sketch also follows this list). If the artifact requires special hardware, specify the requirements clearly.
- Camera-Ready Paper: Attach the camera-ready version of the paper to the end of the PDF.
- Upload the PDF: Submit the completed PDF via the Artifact section of your submission on the CODES+ISSS HotCRP site.
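
As a point of reference for the Execution Steps item above, the sketch below shows what a single-command run.sh might look like. It is a minimal illustration only, not a required layout: the pinned requirements.txt, the experiments/run_all.py entry point, and the results/ output directory are all hypothetical names to be replaced by your artifact’s own.

```bash
#!/usr/bin/env bash
# Hypothetical run.sh sketch: one command that installs dependencies,
# runs the experiments, and reports where the outputs are written.
set -euo pipefail

# 1. Install dependencies (here assumed to be pinned in requirements.txt).
pip install -r requirements.txt

# 2. Run the experiments; experiments/run_all.py is a placeholder entry point.
python experiments/run_all.py --output results/

# 3. Tell the reviewer what to look for and how it maps to the paper.
echo "Done: figures and tables for the paper's evaluation section are in results/."
```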
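For the Precompiled Environment item, publishing an image on DockerHub lets reviewers skip dependency installation entirely. The commands below are a hedged illustration of how a reviewer might use such an image; the image name yourname/codes-isss-artifact and the /artifact paths inside the container are placeholders, not a prescribed layout.

```bash
# Hypothetical reviewer workflow against a precompiled image on DockerHub.
docker pull yourname/codes-isss-artifact:latest

# Mount a host directory so results produced inside the container are preserved.
docker run --rm -v "$(pwd)/results:/artifact/results" \
    yourname/codes-isss-artifact:latest /artifact/run.sh
```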
Artifact Preparation Requirements
Authors must include the following files when preparing their artifacts:
- README: A main file describing what the artifact does and where it can be obtained (with hidden links and access passwords if necessary). It should include a clear description of how to repeat, replicate, or reproduce the results presented in the paper. Artifacts focusing on data should cover aspects relevant to understanding the context, data provenance, ethical and legal statements (if relevant), and storage requirements. Artifacts focusing on software should cover aspects relevant to installation and usage, accompanied by a small example.
- REQUIREMENTS: For software-focused artifacts, this file should cover hardware requirements (e.g., performance, storage, or non-commodity peripherals) and the software environment (e.g., Docker, VM, and operating system). If relevant, include a requirements.txt file with explicit version pinning (e.g., for Python-only environments; an example follows this list). Any deviation from standard environments must be reasonably justified.
- STATUS: A file stating what kind of badge(s) the authors are applying for and the reasons why the artifact deserves those badge(s).
- LICENSE: A file describing the distribution rights. To score “available” or higher, the license must be some form of open-source license. Details are under the respective badges and the FSE 2025 open science policy.
- INSTALL: A file with installation instructions. These instructions should include notes illustrating a very basic usage example or a method to test the installation (a smoke-test sketch follows this list). For instance, specify the expected output that confirms the code is installed, working, and doing something interesting and useful.
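
To illustrate the explicit versioning asked for in the REQUIREMENTS file, the snippet below shows one way a Python-only artifact might record pinned dependencies. The package names and versions are examples only, not recommendations.

```bash
# Record the exact environment the results were produced with (Python-only case).
pip freeze > requirements.txt

# The resulting file should contain pinned entries, for example:
#   numpy==1.26.4
#   torch==2.2.1
# Reviewers can then recreate the same environment from this file.
```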
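The basic usage example in the INSTALL file can be as small as a smoke test whose expected output is stated up front. The following sketch assumes a hypothetical entry point tool/main.py and a hypothetical --smoke-test flag; substitute your artifact’s own commands and expected output.

```bash
# Hypothetical smoke test to include in the INSTALL instructions.
# tool/main.py and --smoke-test are placeholders for the artifact's own entry point.
python tool/main.py --smoke-test

# Expected output (illustrative): a short summary line such as
#   smoke test passed: analyzed 3 sample inputs, 0 errors
# confirming that the code is installed, runs, and does real work.
```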
Important Dates and Instructions
By July 20, 2025, please submit your research artifact description (PDF) on the CODES+ISSS HotCRP submission site. The actual artifact is expected to be available in a persistent repository such as Zenodo and to have a DOI, as described above. The Evaluation Committee may contact the authors during the reviewing period (July 20 – August 11) to request clarification of the basic installation and start-up procedures or to resolve simple installation problems. Please continue checking the submission site during this period.
AUTHORS TAKE NOTE: The official publication date is the date the proceedings are made available in the ACM Digital Library. This date may be up to two weeks prior to the first day of the conference. The official publication date affects the deadline for any patent filings related to published work.
Given the short review time available, authors are expected to respond within 48 hours. Authors may update their research artifacts after submission only to address changes requested by reviewers during the reviewing phase. Further information will be posted on this website as it becomes available.
In case of questions, please do not hesitate to contact the Artifact Evaluation chair.
Artifact Evaluation Committee Members
Chair: Aruna Jayasena (arunajayasena@ufl.edu)
Committee:
| Name | Affiliation |
|---|---|
| Dakshina Tharindu | University of Florida | 
| Gabriel Ott | University of Oldenburg | 
| Gayatri Madduri | Indian Institute of Science, Bangalore | 
| Gokul Krishnan | Apple | 
| Hansika Weerasena | University of Florida | 
| Haocheng Xu | University of California, Irvine | 
| Sahan Nelundeniyalage | University of Florida | 
| Teng Wang | University of Science and Technology of China | 
| Yifan Zhang | University of California, Irvine | 


