DistilBERT
- Editorial Staff
- Oct 8, 2024
- 1 min read
Updated: Oct 9, 2024

DistilBERT was introduced in 2019 by Hugging Face as an efficient alternative to BERT for natural language processing tasks. It is a smaller, faster version of BERT created through knowledge distillation: it retains 97% of BERT's language-understanding capability while being 40% smaller and 60% faster. DistilBERT is available as part of the Hugging Face Transformers library.
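
Because DistilBERT ships with the Transformers library, using it takes only a few lines. A minimal sketch, assuming the `transformers` package is installed and using `distilbert-base-uncased-finetuned-sst-2-english`, a publicly available DistilBERT checkpoint on the Hugging Face Hub fine-tuned for sentiment analysis:

```python
from transformers import pipeline

# Load a DistilBERT checkpoint fine-tuned for sentiment analysis.
# The model weights are downloaded from the Hugging Face Hub on first use.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

result = classifier("DistilBERT is small, fast, and easy to use.")
print(result)
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

For tasks beyond ready-made pipelines, the base model and tokenizer can be loaded directly with `DistilBertTokenizer.from_pretrained("distilbert-base-uncased")` and `DistilBertModel.from_pretrained("distilbert-base-uncased")` and fine-tuned like any other Transformers model.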