Application#

The CI2023 Reproducibility Challenge will be a virtual event held May 1-31, 2023, where teams of 2-4 will collaborate to create a Jupyter notebook that reproduces the key contributions of a published environmental data science paper, for eventual integration into the open-source Environmental Data Science (EDS) book.

A pool of peer reviewers, consisting of original authors of pre-approved papers, reviewers for the Climate Informatics conference, and other members of the Climate Informatics and EDS book communities, will begin working with participants at the two-week mark of the competition to sharpen each submission and ensure it is ready for publication in the EDS book after the competition.

Below we provide further information on the key stages, requirements and registration links for notebook teams and reviewers.

Information for teams#

Stages#

The key stages participants will be involved in are outlined below:

  • Registration and team formation: Participants should fill in a registration form for the challenge, indicating the study they would like to reproduce, their programming proficiency and other relevant skills. After the registration deadline, the Organising Committee will mediate team formation and assign team members based on paper preferences and skills.

  • Reproducibility: Teams must follow the EDS book submission guidelines. They should open a Notebook Idea in the EDS book repository to describe and discuss initial ideas for the paper they will reproduce. The EDS book provides notebook repository templates in Python, R and Julia to facilitate team submissions. Teams should reproduce the study’s results as closely as possible using the available data and code, and document their process and any deviations from the original study in a Jupyter notebook (a minimal environment-recording sketch follows this list).

  • Submission and Review: Teams should submit the notebook repository detailing their reproduction process, including any challenges or differences from the original study, along with their findings and any additional insights or observations. The review process will be documented in PRE-REVIEW and REVIEW issues, which are used to assign reviewers and moderate high-level conversations between teams and reviewers.

  • Evaluation: Notebooks will be evaluated by a panel of judges based on their accuracy, completeness, and clarity. The judges may also consider the participants’ insights and additional observations. Teams can find further details of the scoring criteria here.

  • Award: The winning team will receive Cambridge University Press books of their choosing (up to £500 in value, split across the winning team).
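
As an illustration only (the EDS book templates define the actual notebook structure), a reproduction notebook could open with a cell that records the computational environment, so results can be traced back to exact package versions; the packages listed below are placeholders.

```python
# Illustrative sketch: record the computational environment at the top of the
# reproduction notebook so results can be traced to exact package versions.
import platform
import sys
from importlib.metadata import version

# Placeholder list; include whatever packages the reproduction actually uses.
packages = ["numpy", "pandas", "xarray", "matplotlib"]

print(f"Python {sys.version.split()[0]} on {platform.platform()}")
for pkg in packages:
    print(f"{pkg}=={version(pkg)}")
```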

Tip

Participants are welcome to follow the tips below for a successful experience in the reproducibility challenge:

  • Understand the study: carefully read and analyse the original study to fully understand the methods and procedures used to generate the results.

  • Obtain data and resources: obtain the data and resources provided by the challenge organisers and ensure access to any necessary software tools and computing resources.

  • Reproduce results: attempt to reproduce the results of the original study, following the same methods and procedures as closely as possible.

  • Document and report: document the relevant process and report results, including any challenges or issues encountered during the reproduction (a tolerance-based comparison sketch follows this list).

  • Share code and resources: share code and resources with team members through the collaborative platforms provided in the challenge: the C-Scale JupyterHub or a GitHub repository. This will make it easier for others to replicate and verify the work.

  • Engage with the community: engage with the CI2023 and EDS book communities and seek feedback and input from others, including organisers, reviewers, and other participants.

  • Adhere to best practices: adhere to best practices for reproducible research, including using open-source tools and sharing work openly and transparently.
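
By way of example, one simple way to document how closely reproduced numbers match those reported in the original paper is a tolerance-based comparison; the values and the tolerance below are hypothetical.

```python
# Hypothetical comparison of reproduced values against figures reported in the
# original paper; the numbers and the 0.05 tolerance are made up for this sketch.
import numpy as np

reported = np.array([0.82, 1.74, 3.10])    # values quoted in the paper (hypothetical)
reproduced = np.array([0.81, 1.76, 3.08])  # values obtained by the team (hypothetical)

# Flag any value outside the chosen tolerance so deviations can be reported
# explicitly alongside the results.
for rep, obt in zip(reported, reproduced):
    diff = abs(rep - obt)
    status = "OK" if diff <= 0.05 else "DEVIATION"
    print(f"reported={rep:.2f} reproduced={obt:.2f} |diff|={diff:.2f} -> {status}")
```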

Application form#

The application period has now closed. Thank you for your interest.

Information for reviewers#

The role of a reviewer is to sharpen the submission and ensure it is ready for publication in the EDS book following the competition. Reviewers should assess the accuracy, completeness, and clarity of each team’s reproduction process and findings. Here is some information for reviewers:

  • Read the guidelines: carefully read the EDS book guidelines and criteria for reviewers to ensure the notebooks are assessed appropriately.

  • Evaluate the notebook: assess the notebook based on its clarity, completeness, and accuracy, and consider any additional insights or observations provided by the team.

  • Verify the reproduction process: verify that the team has followed the reproduction process as closely as possible, and note any deviations or challenges they encountered (see the execution-check sketch below).

  • Evaluate the findings: evaluate the accuracy and reliability of the team’s findings and compare them to the original study’s results.

  • Provide constructive feedback: provide constructive feedback to the team, highlighting the strengths and weaknesses of their notebook and offering suggestions for improvement.

  • Meet the deadlines: meet the set deadlines and submit evaluations in a timely manner (within one week).

It is important for reviewers to be objective and impartial, and to provide constructive feedback to help participants improve their reproducibility skills.
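
One practical check, assuming a reviewer has the team’s environment installed, is to execute the submitted notebook end to end with the nbformat and nbclient libraries; the filename below is hypothetical.

```python
# Optional end-to-end check of a submission; "submission.ipynb" is a
# hypothetical filename and the team's environment must already be installed.
import nbformat
from nbclient import NotebookClient

nb = nbformat.read("submission.ipynb", as_version=4)
client = NotebookClient(nb, timeout=600, kernel_name="python3")
client.execute()  # raises CellExecutionError if any cell fails

print("Notebook executed without errors.")
```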

Application form#

The application period has now closed. Thank you for your interest.