
DistilBERT

  • Writer: Editorial Staff
  • Oct 8, 2024
  • 1 min read

Updated: Oct 9, 2024



DistilBERT was introduced in 2019 by Hugging Face as an efficient alternative to BERT for natural language processing tasks. It is a smaller, faster version of BERT created through knowledge distillation, in which a compact student model is trained to reproduce the behavior of the larger teacher model. DistilBERT retains 97% of BERT's language understanding capabilities while being 40% smaller and 60% faster, and it is available as part of the Hugging Face Transformers library.
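
As a quick illustration of that availability, here is a minimal sketch of loading DistilBERT with the Transformers library. It assumes transformers and torch are installed; distilbert-base-uncased is the standard base checkpoint, and the example sentence is arbitrary:

from transformers import DistilBertTokenizer, DistilBertModel

# Load the pretrained DistilBERT tokenizer and encoder weights.
tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertModel.from_pretrained("distilbert-base-uncased")

# Tokenize a sentence and run it through the model.
inputs = tokenizer("DistilBERT is a distilled version of BERT.", return_tensors="pt")
outputs = model(**inputs)

# Contextual embeddings, shape (batch, sequence_length, 768).
print(outputs.last_hidden_state.shape)

From here, the same checkpoint can be fine-tuned for downstream tasks such as classification or question answering, just as with BERT.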





