On Pre-training Language Model for Antibody

Danqing Wang, Fei Ye†, Hao Zhou††
UC Santa Barbara, †ByteDance Research, ††Tsinghua University
ICLR 2023

Abstract

Antibodies are vital proteins that offer robust protection for the human body against pathogens. Both general protein and antibody-specific pre-trained language models facilitate antibody prediction tasks. However, few studies have comprehensively explored the representation capability of different pre-trained language models across antibody tasks. To investigate this problem, we aim to answer several key questions in this paper, such as how pre-trained language models perform on antibody tasks of varying specificity, and how introducing specific biological mechanisms into pre-training can benefit the model. We also evaluate whether the learned antibody representations can be applied to real-world antibody problems, such as drug discovery and understanding of the immune process. Previously, the lack of a benchmark largely hindered the study of these questions. To aid our investigation, we provide an AnTibody Understanding Evaluation (ATUE) benchmark. We comprehensively evaluate the performance of protein pre-trained language models through an empirical study, reporting conclusions and new insights. Our ATUE and code are released.
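To make the kind of evaluation described above concrete, the sketch below shows one common way to extract a fixed-size antibody representation from a general pre-trained protein language model and pool it for a downstream task head. The checkpoint name (a small public ESM-2 model), the pooling strategy, and the example heavy-chain fragment are illustrative assumptions, not the paper's exact experimental setup.

    # Minimal sketch: embed an antibody sequence with a general protein LM.
    # Checkpoint, pooling choice, and example sequence are assumptions.
    import torch
    from transformers import AutoTokenizer, AutoModel

    MODEL_NAME = "facebook/esm2_t6_8M_UR50D"  # small ESM-2 checkpoint (assumption)

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModel.from_pretrained(MODEL_NAME)
    model.eval()

    # Hypothetical antibody heavy-chain fragment (one-letter amino acid codes).
    sequence = "EVQLVESGGGLVQPGGSLRLSCAASGFTFS"

    inputs = tokenizer(sequence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)

    # Mean-pool over all non-padded token positions (special tokens included,
    # for simplicity) to get one sequence-level embedding per antibody, which
    # a task-specific classifier head could then consume.
    mask = inputs["attention_mask"].unsqueeze(-1)
    embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    print(embedding.shape)  # torch.Size([1, 320]) for this checkpoint

A downstream ATUE-style task would then fit a lightweight classifier or regressor on such embeddings; mean pooling is only one choice here, and per-residue representations can be kept instead for residue-level tasks.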

BibTeX

@inproceedings{DBLP:conf/iclr/WangY023,
  author    = {Danqing Wang and
               Fei Ye and
               Hao Zhou},
  title     = {On Pre-training Language Model for Antibody},
  booktitle = {The Eleventh International Conference on Learning Representations,
               {ICLR} 2023, Kigali, Rwanda, May 1-5, 2023},
  publisher = {OpenReview.net},
  year      = {2023},
  url       = {https://openreview.net/pdf?id=zaq4LV55xHl},
  timestamp = {Fri, 30 Jun 2023 14:55:53 +0200},
  biburl    = {https://dblp.org/rec/conf/iclr/WangY023.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}