  • Entries to the Procgen challenge can be either “open” or “closed.” Teams submitting “open” entries will be expected to reveal most details of their method, including source code (special exceptions may be made for pending publications). Teams may instead choose to submit a “closed” entry and are then not required to provide any details beyond an abstract. The motivation for introducing this division is to allow greater participation from industrial teams that may be unable to reveal algorithmic details, while also allocating more time at the workshop to teams that are able to give more detailed presentations. Participants are strongly encouraged to submit “open” entries if possible.

  • For a team to be eligible to move to round two, each member must satisfy the following: (1) be at least 18 years old and at least the age of majority in their place of residence; (2) not reside in any region or country subject to U.S. Export Regulations; and (3) not be an organizer of this competition nor a family member of a competition organizer. In addition, to receive any awards from our sponsors, competition winners must attend the NeurIPS workshop.

  • Participants may only use the provided dataset and environments; no additional datasets or environments may be included in the source file submissions, nor may any be downloaded during training or evaluation. During evaluation of submitted code, the individual containers will not have access to any external network, to prevent information leakage. All submitted code repositories will be scrubbed to remove any files larger than 30 MB, to ensure participants are not checking in model weights pre-trained on the released training dataset (a simple pre-submission size check is sketched after this list). While the container running the submitted code will not have external network access, relevant exceptions are added to ensure participants can download and use the pre-trained models included in popular frameworks like PyTorch and TensorFlow. Participants can request additional network exceptions for any other publicly available pre-trained models, which will be validated by AICrowd on a case-by-case basis.

  • In all rounds, participants will be allotted 8 million timesteps in each environment to train their agents. When evaluating generalization, we will provide participants with 500 levels from each environment during the training phase. Participants will also be restricted to no more than 2 hours of compute per environment, using a P100 GPU and 16 vCPUs (these budgets are illustrated in the sketch after this list).

  • Participants are expected to operate in good faith and to not attempt to circumvent these restrictions.

  • The organizers reserve the right to make amendments to the above-mentioned rules.
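
The 30 MB scrubbing rule can be checked locally before submitting. The following is a minimal sketch, not the organizers' actual scrubbing tool, that walks a repository and reports any file exceeding the limit; the repository path and the limit are the only inputs.

    # Sketch: pre-submission check for files over the 30 MB limit.
    # Illustrative helper only; not part of the official submission tooling.
    import os

    SIZE_LIMIT = 30 * 1024 * 1024  # 30 MB, as stated in the rules

    def oversized_files(repo_root="."):
        """Yield (path, size) for every file above the size limit."""
        for dirpath, _dirnames, filenames in os.walk(repo_root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > SIZE_LIMIT:
                    yield path, size

    if __name__ == "__main__":
        for path, size in oversized_files():
            print(f"{path}: {size / (1024 * 1024):.1f} MB exceeds the 30 MB limit")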
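
The training budgets map directly onto the public Procgen API. The following is a minimal sketch, assuming the standard gym interface of the procgen package; the environment name, distribution mode, and random-action loop are placeholders for a participant's own training setup, and the official evaluation harness may differ.

    # Sketch: training-phase constraints expressed with the public procgen API.
    import gym

    TRAIN_LEVELS = 500            # levels provided during the training phase
    TIMESTEP_BUDGET = 8_000_000   # per-environment training budget

    env = gym.make(
        "procgen:procgen-coinrun-v0",  # any of the Procgen environments
        num_levels=TRAIN_LEVELS,       # restrict level sampling to the training set
        start_level=0,
        distribution_mode="easy",      # mode chosen here only for illustration
    )

    obs = env.reset()
    steps = 0
    while steps < TIMESTEP_BUDGET:
        action = env.action_space.sample()          # placeholder for the agent's policy
        obs, reward, done, info = env.step(action)
        steps += 1
        if done:
            obs = env.reset()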