Update README.md

**A Multilingual Safety Guardrail Model for 100 languages built on XLM-RoBERTa**

CREST, which stands for CRoss-lingual Efficient Safety Transfer, is a parameter-efficient multilingual safety classifier covering 100 languages. It is fine-tuned on only 13 strategically selected high-resource languages, chosen through cluster-guided sampling, which enables strong cross-lingual transfer to unseen low-resource languages.

The model is built on the XLM-RoBERTa architecture with a classification head and supports a maximum input length of 512 tokens. The Base variant has approximately 279M parameters.

The model is designed for fast, lightweight safety filtering across a large number of languages, both high-resource and low-resource, with minimal training cost, making it suitable for real-time and on-device deployments.
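As a sketch of how a guardrail like this is typically wired into a serving path, the snippet below wraps a classifier behind a simple safe/unsafe decision with input truncation to the 512-token window. The `toy_classifier`, the 0.5 threshold, and the whitespace tokenization are illustrative stand-ins only; a real deployment would call the CREST model (e.g. via the `transformers` library) and use its tokenizer.

```python
from typing import Callable

MAX_TOKENS = 512  # matches the model's maximum input length


def guard(text: str,
          classify: Callable[[str], float],
          threshold: float = 0.5) -> bool:
    """Return True if `text` is flagged as unsafe.

    `classify` maps text to an unsafe-probability in [0, 1]; in a real
    deployment it would wrap the CREST model. Inputs longer than the
    model's 512-token window are truncated first; the crude whitespace
    split here is a stand-in for the real subword tokenizer.
    """
    tokens = text.split()
    if len(tokens) > MAX_TOKENS:
        text = " ".join(tokens[:MAX_TOKENS])
    return classify(text) >= threshold


# Toy stand-in classifier: flags text containing a blocked word.
def toy_classifier(text: str) -> float:
    return 1.0 if "attack" in text.lower() else 0.0


print(guard("How do I attack a server?", toy_classifier))  # True
print(guard("How do I bake bread?", toy_classifier))       # False
```

Because the classifier is injected as a plain callable, the same filtering logic works unchanged whether the scores come from the Base variant, a quantized on-device build, or a remote inference endpoint.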