SciBERT: A Pretrained Language Model for Scientific Text
I. Beltagy, K. Lo, and A. Cohan. Proceedings of the 2019 Conference on Empirical Methods in Natural
Language Processing and the 9th International Joint Conference on
Natural Language Processing, EMNLP-IJCNLP 2019, Hong Kong, China,
November 3-7, 2019, pp. 3613--3618. Association for Computational Linguistics, (2019)
DOI: 10.18653/V1/D19-1371
Cite this publication
%0 Conference Paper
%1 DBLP:conf/emnlp/BeltagyLC19
%A Beltagy, Iz
%A Lo, Kyle
%A Cohan, Arman
%B Proceedings of the 2019 Conference on Empirical Methods in Natural
Language Processing and the 9th International Joint Conference on
Natural Language Processing, EMNLP-IJCNLP 2019, Hong Kong, China,
November 3-7, 2019
%D 2019
%E Inui, Kentaro
%E Jiang, Jing
%E Ng, Vincent
%E Wan, Xiaojun
%I Association for Computational Linguistics
%K diss foundations imported
%P 3613--3618
%R 10.18653/V1/D19-1371
%T SciBERT: A Pretrained Language Model for Scientific Text
%U https://doi.org/10.18653/v1/D19-1371
@inproceedings{DBLP:conf/emnlp/BeltagyLC19,
added-at = {2024-03-15T09:52:22.000+0100},
author = {Beltagy, Iz and Lo, Kyle and Cohan, Arman},
bibsource = {dblp computer science bibliography, https://dblp.org},
biburl = {https://www.bibsonomy.org/bibtex/2866ff3c609590f77c8986c9b7a29b00e/tobias.koopmann},
booktitle = {Proceedings of the 2019 Conference on Empirical Methods in Natural
Language Processing and the 9th International Joint Conference on
Natural Language Processing, {EMNLP-IJCNLP} 2019, Hong Kong, China,
November 3-7, 2019},
doi = {10.18653/V1/D19-1371},
editor = {Inui, Kentaro and Jiang, Jing and Ng, Vincent and Wan, Xiaojun},
interhash = {7d9144fc5d18b1c19854cefc489fda99},
intrahash = {866ff3c609590f77c8986c9b7a29b00e},
keywords = {diss foundations imported},
pages = {3613--3618},
publisher = {Association for Computational Linguistics},
timestamp = {2024-03-15T09:52:22.000+0100},
title = {SciBERT: {A} Pretrained Language Model for Scientific Text},
url = {https://doi.org/10.18653/v1/D19-1371},
year = 2019
}