Title: Transformers: State-of-the-Art Natural Language Processing
Type: Software. Wolf, Thomas; Debut, Lysandre; Sanh, Victor; Chaumond, Julien; Delangue, Clement; Moi, Anthony; Cistac, Perric; Ma, Clara; Jernite, Yacine; Plu, Julien; Xu, Canwen; Le Scao, Teven; Gugger, Sylvain; Drame, Mariama; Lhoest, Quentin; Rush, Alexander M. (2020): Transformers: State-of-the-Art Natural Language Processing. Zenodo. https://zenodo.org/record/5553107
Summary
v4.11.3: Patch release
This patch release fixes a few issues encountered since the release of v4.11.2:
- [DPR] Correct init (#13796)
- Fix warning situation: UserWarning "max_length is ignored when padding=True" (#13829; see the sketch after this list)
- Bart: check if decoder_inputs_embeds is set (#13800)
- Include megatron_gpt2 in installed modules (#13834)
- Fixing 1-length special tokens cut (#13862)
- Fixing empty prompts for text-generation when BOS exists (#13859)
- Fixing question-answering with long contexts (#13873)
- Fixing GPU for token-classification in a better way (#13856)
- Fixing backward compatibility for zero-shot (#13855)
- Fix hp search for non-sigopt backends (#13897)
- Fix trainer logging_nan_inf_filter in torch_xla mode (#13896) (@ymwangg)
- [Trainer] Fix nan-loss condition (#13911) (@anton-l)
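As an illustration of the padding-warning fix above (#13829), here is a minimal sketch, assuming the bert-base-uncased checkpoint, of the call pattern that could raise that UserWarning: with padding=True the tokenizer pads to the longest sequence in the batch, so max_length only takes effect together with an explicit truncation or padding="max_length" strategy.

```python
# Minimal sketch (not from the release notes) of the call pattern behind
# the UserWarning addressed by #13829. The checkpoint name is only an
# example; any tokenizer works the same way.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

batch = tokenizer(
    ["a short sentence", "a slightly longer example sentence"],
    padding=True,    # pad to the longest sequence in the batch
    max_length=16,   # enforced only with truncation=True or padding="max_length"
)
print(batch["input_ids"])
```

Passing truncation=True (or padding="max_length") makes max_length take effect, so the warning does not apply in those configurations.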
More information
- DOI: 10.5281/zenodo.5553107
Dates
- Publication date: 2020
- Issued: October 01, 2020
Notes
Other: If you use this software, please cite it using these metadata.
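One way to follow that request is to assemble the record's metadata into a BibTeX entry. The sketch below is illustrative only: the entry key and field layout are assumptions, not Zenodo's exported citation, while the authors, title, year, version, DOI, and URL are taken from this record.

```bibtex
@software{transformers_v4_11_3,
  author    = {Wolf, Thomas and Debut, Lysandre and Sanh, Victor and
               Chaumond, Julien and Delangue, Clement and Moi, Anthony and
               Cistac, Perric and Ma, Clara and Jernite, Yacine and
               Plu, Julien and Xu, Canwen and Le Scao, Teven and
               Gugger, Sylvain and Drame, Mariama and Lhoest, Quentin and
               Rush, Alexander M.},
  title     = {Transformers: State-of-the-Art Natural Language Processing},
  year      = {2020},
  publisher = {Zenodo},
  version   = {v4.11.3},
  doi       = {10.5281/zenodo.5553107},
  url       = {https://zenodo.org/record/5553107}
}
```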
Rights
- Open Access (info:eu-repo/semantics/openAccess)
Format
electronic resource
Related items
| Description | Item type | Relationship | URI |
|---|---|---|---|
| | | IsSupplementTo | https://github.com/huggingface/transformers/tree/v4.11.3 |
| | | IsVersionOf | https://doi.org/10.5281/zenodo.3385997 |
| | | IsPartOf | https://zenodo.org/communities/zenodo |