We welcome submissions in the areas of pre-trained models, few-shot learning, transfer learning, self-supervised learning, meta-learning, etc. We also invite submissions from researchers in other application areas, such as physics, chemistry, and biology. To summarize, the topics include but are not limited to:
- Theoretical foundations of the connections and differences among pre-training methods (supervised pre-training, self-supervised pre-training with different auxiliary tasks, meta-learning, etc.)
- Empirical analysis of various pre-training methods
- Generalization bounds of different pre-training methods
- Novel pre-training methods to maximize generalization
- Model selection among a zoo of pre-trained models
- New fine-tuning techniques for maximally leveraging a pre-trained model
- Pre-training for various application domains, such as computer vision, natural language processing, robotics, physics, drug discovery, and environmental sustainability
Submission URL: https://openreview.net/group?id=ICML.cc/2022/Workshop/Pre-Training
Format: All submissions must be in PDF format and anonymized. Submissions are limited to four content pages, including all figures and tables; unlimited additional pages containing references and supplementary materials are allowed. Reviewers may choose to read the supplementary materials but will not be required to. Camera-ready versions may go up to five content pages.
Style file: You must format your submission using the ICML 2022 LaTeX style file. For your convenience, we have modified the main conference style file to refer to the Pre-training workshop: pre-training.sty. Please include the references and supplementary materials in the same PDF as the main paper. The maximum file size for submissions is 50MB. Submissions that violate the ICML style (e.g., by decreasing margins or font sizes) or page limits may be rejected without further review.
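For reference, here is a minimal LaTeX preamble sketch. It assumes the workshop style file follows the usage pattern of the main-conference icml2022.sty; in particular, the anonymized default submission mode and the [accepted] camera-ready option are assumptions carried over from the main-conference style. Consult the template distributed with pre-training.sty for the full author and affiliation boilerplate.

```latex
\documentclass{article}

% Packages used by the main-conference ICML template (assumed to apply
% to the workshop style file as well).
\usepackage{microtype}
\usepackage{graphicx}
\usepackage{hyperref}

% Workshop style file (pre-training.sty). By main-conference convention,
% loading it without options produces the anonymized submission format;
% the [accepted] option for the camera-ready version is an assumption
% carried over from icml2022.sty.
\usepackage{pre-training}
% \usepackage[accepted]{pre-training}  % camera-ready (assumed option)

\begin{document}

\twocolumn[
\icmltitle{Your Paper Title}
% Author and affiliation commands omitted; copy them from the full
% template so that names stay hidden in submission mode.
]

% Main text, references, and supplementary materials all go in this
% single PDF, as required above.

\end{document}
```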
Dual-submission policy: We welcome ongoing and unpublished work. We will also accept papers that are under review at the time of submission, or that have recently been accepted at venues without published proceedings.
Non-archival: The workshop is a non-archival venue and will not have official proceedings. Workshop submissions can be subsequently or concurrently submitted to other venues.
Visibility: Submissions and reviews will not be public. Only accepted papers will be made public.
Contact: For any questions, please contact us at pretraining2022@googlegroups.com.
If you would like to become a reviewer for this workshop, please let us know at https://t.co/gaHT0VA2Sl.