SEW-tiny (SEW by ASAPP Research). The base model pretrained on 16kHz sampled speech audio. When using the...
- huggingface.co
- 2025-05-06
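A minimal sketch for the SEW-tiny entry above, extracting frame-level features from audio that is already sampled at 16 kHz (the card warns that inputs must match the pretraining rate). The checkpoint id "asapp/sew-tiny-100k" is an assumption inferred from the card title, not stated in the snippet.

```python
# Sketch: SEW-tiny feature extraction from 16 kHz audio.
# Checkpoint id "asapp/sew-tiny-100k" is an assumption.
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModel

extractor = AutoFeatureExtractor.from_pretrained("asapp/sew-tiny-100k")
model = AutoModel.from_pretrained("asapp/sew-tiny-100k")

waveform = np.zeros(16000, dtype=np.float32)  # 1 s of silence; real audio must also be 16 kHz
inputs = extractor(waveform, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, frames, hidden)
print(hidden.shape)
```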
Please refer to https://github.com/SKTBrain/KoBERT
- huggingface.co
- 2025-05-06
Overview. Language model: gbert-base-germandpr. Language: German. Training data: GermanDPR train set (~ 56...
- huggingface.co
- 2025-05-06
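A hedged sketch of DPR-style retrieval scoring with the German DPR model above. The two checkpoint ids with "-question_encoder" and "-ctx_encoder" suffixes are assumptions based on deepset's usual naming; the snippet itself does not spell them out.

```python
# Sketch: German DPR bi-encoder relevance scoring.
# Both checkpoint ids are assumptions (deepset naming convention).
import torch
from transformers import (
    DPRQuestionEncoder, DPRQuestionEncoderTokenizer,
    DPRContextEncoder, DPRContextEncoderTokenizer,
)

q_tok = DPRQuestionEncoderTokenizer.from_pretrained("deepset/gbert-base-germandpr-question_encoder")
q_enc = DPRQuestionEncoder.from_pretrained("deepset/gbert-base-germandpr-question_encoder")
c_tok = DPRContextEncoderTokenizer.from_pretrained("deepset/gbert-base-germandpr-ctx_encoder")
c_enc = DPRContextEncoder.from_pretrained("deepset/gbert-base-germandpr-ctx_encoder")

question = "Wie hoch ist die Zugspitze?"
passage = "Die Zugspitze ist mit 2962 Metern der höchste Berg Deutschlands."
with torch.no_grad():
    q_vec = q_enc(**q_tok(question, return_tensors="pt")).pooler_output
    c_vec = c_enc(**c_tok(passage, return_tensors="pt")).pooler_output
# DPR scores relevance as the dot product of the two embeddings.
print(torch.matmul(q_vec, c_vec.T).item())
```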
LiLT + XLM-RoBERTa-base. This model is created by combining the Language-Independent Layout Transforme...
- huggingface.co
- 2025-05-06
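A minimal forward-pass sketch for the LiLT entry above. LiLT consumes one 0-1000-normalized bounding box per token alongside the token ids. The checkpoint id "nielsr/lilt-xlm-roberta-base" and the sample words and boxes are assumptions for illustration.

```python
# Sketch: LiLT forward pass with per-token bounding boxes.
# Checkpoint id "nielsr/lilt-xlm-roberta-base" is an assumption.
import torch
from transformers import AutoTokenizer, AutoModel

ckpt = "nielsr/lilt-xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModel.from_pretrained(ckpt)

words = ["Rechnung", "Nr.", "12345"]
boxes = [[60, 40, 210, 60], [220, 40, 260, 60], [270, 40, 350, 60]]  # 0-1000 scale

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Repeat each word's box for every subword piece; zero box for special tokens.
bbox = [boxes[i] if i is not None else [0, 0, 0, 0] for i in enc.word_ids()]
enc["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    out = model(**enc).last_hidden_state  # (1, seq_len, hidden)
print(out.shape)
```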
SciNCL. SciNCL is a pre-trained BERT language model to generate document-level embeddings of research ...
- huggingface.co
- 2025-05-06
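A hedged sketch of document embeddings with SciNCL, following the common SPECTER-style recipe: feed "title [SEP] abstract" and take the [CLS] vector. The checkpoint id "malteos/scincl" and the sample paper are assumptions.

```python
# Sketch: SPECTER-style document embeddings with SciNCL.
# Checkpoint id "malteos/scincl" is an assumption.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("malteos/scincl")
model = AutoModel.from_pretrained("malteos/scincl")

papers = [{"title": "BERT", "abstract": "We introduce a new language representation model."}]
texts = [p["title"] + tokenizer.sep_token + p["abstract"] for p in papers]
inputs = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    embeddings = model(**inputs).last_hidden_state[:, 0, :]  # [CLS] per document
print(embeddings.shape)
```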
This is a copy of the original BLOOM weights that is more efficient to use with the DeepSpeed-MII an...
- huggingface.co
- 2025-05-07
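A heavily hedged sketch of what "use with DeepSpeed-Inference" looks like in practice: wrapping a loaded causal LM so its layers run on fused inference kernels. Argument names follow older DeepSpeed releases, the call requires a CUDA device, and "bigscience/bloom-560m" is a small stand-in for the resharded 176B copy described above.

```python
# Sketch: DeepSpeed-Inference kernel injection for a BLOOM-family model.
# "bigscience/bloom-560m" is a stand-in; kwargs per older deepspeed releases.
import torch
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

ckpt = "bigscience/bloom-560m"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForCausalLM.from_pretrained(ckpt, torch_dtype=torch.float16)

engine = deepspeed.init_inference(model, mp_size=1, dtype=torch.float16,
                                  replace_with_kernel_inject=True)
model = engine.module  # the kernel-injected model

inputs = tokenizer("Hello, my name is", return_tensors="pt").to("cuda")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```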
rubert-base-cased-conversational. Conversational RuBERT (Russian, cased, 12‑layer, 768‑hidden, 12‑head...
- huggingface.co
- 2025-05-12
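A minimal sketch of masked-token prediction with Conversational RuBERT via the fill-mask pipeline. The checkpoint id "DeepPavlov/rubert-base-cased-conversational" is assumed from the card title.

```python
# Sketch: fill-mask with Conversational RuBERT.
# Checkpoint id is an assumption based on the card title.
from transformers import pipeline

fill = pipeline("fill-mask", model="DeepPavlov/rubert-base-cased-conversational")
for pred in fill("Привет, как [MASK]?"):  # "Hi, how [MASK]?"
    print(pred["token_str"], round(pred["score"], 3))
```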
Model Card for sup-simcse-roberta-large. Model Details. Model Description. Developed by: Princeton-nlp. Shar...
- huggingface.co
- 2025-05-12
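A hedged sketch of sentence similarity with supervised SimCSE, using the pooler output (CLS plus MLP head) as the sentence embedding, which matches the SimCSE paper's supervised setup; the sample sentences are illustrative.

```python
# Sketch: sentence embeddings and cosine similarity with sup-SimCSE.
import torch
from transformers import AutoTokenizer, AutoModel

ckpt = "princeton-nlp/sup-simcse-roberta-large"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModel.from_pretrained(ckpt)

sentences = ["A man is playing a guitar.", "Someone plays an instrument."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    emb = model(**inputs).pooler_output  # CLS + MLP pooler
print(torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0).item())
```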
IndoBERT Base Model (phase2 – uncased). IndoBERT is a state-of-the-art language model for Indone...
- huggingface.co
- 2025-05-13
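A minimal sketch of encoding Indonesian text with IndoBERT. The checkpoint id "indobenchmark/indobert-base-p2" is an assumption inferred from the phase-2 uncased title.

```python
# Sketch: encoding text with IndoBERT (checkpoint id assumed).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("indobenchmark/indobert-base-p2")
model = AutoModel.from_pretrained("indobenchmark/indobert-base-p2")

inputs = tokenizer("halo dunia", return_tensors="pt")  # "hello world"
with torch.no_grad():
    cls = model(**inputs).last_hidden_state[:, 0, :]  # [CLS] representation
print(cls.shape)
```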
This is a Japanese sentence-LUKE model. It was trained with the same dataset and settings as the Japanese Sentence-BERT model...
- huggingface.co
- 2025-05-13
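A hedged sketch of Sentence-BERT-style mean pooling over LUKE token states, the usual recipe for models trained with Sentence-BERT settings. The checkpoint id "sonoisa/sentence-luke-japanese-base-lite" is an assumption.

```python
# Sketch: sentence embeddings via attention-masked mean pooling.
# Checkpoint id "sonoisa/sentence-luke-japanese-base-lite" is an assumption.
import torch
from transformers import AutoTokenizer, AutoModel

ckpt = "sonoisa/sentence-luke-japanese-base-lite"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModel.from_pretrained(ckpt)

batch = tokenizer(["今日はいい天気です。"], padding=True, return_tensors="pt")  # "Nice weather today."
with torch.no_grad():
    token_states = model(**batch).last_hidden_state
# Mean-pool over real tokens only, weighting by the attention mask.
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_vec = (token_states * mask).sum(1) / mask.sum(1)
print(sentence_vec.shape)
```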