This checkpoint belongs to the project's FFT2SD collection of fine-tuned models.
This model is a fine-tuned version of [ltg/norbert3-base](https://huggingface.co/ltg/norbert3-base) on an unspecified dataset. Final evaluation-set results are not reported here; per-step validation losses are listed in the training results table below.
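The card does not say how the checkpoint is meant to be used. Below is a minimal loading sketch under assumptions not stated in this card: the repo id is a hypothetical placeholder, and the checkpoint is assumed to keep the masked-LM head of the base model. NorBERT 3 ships custom model code, so `trust_remote_code=True` is required, as for ltg/norbert3-base.

```python
# Minimal loading sketch, not taken from this card.
# Assumptions: the repo id below is a hypothetical placeholder, and the checkpoint
# keeps the masked-LM head of ltg/norbert3-base. NorBERT 3 uses custom model code,
# hence trust_remote_code=True.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "your-org/fft2sd-norbert3-finetuned"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)
model.eval()

# Fill a single masked token in a Norwegian sentence.
text = "Hovedstaden i Norge er [MASK]."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

mask_id = tokenizer.convert_tokens_to_ids("[MASK]")
mask_positions = (inputs["input_ids"] == mask_id).nonzero(as_tuple=True)[1]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```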
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The hyperparameter values used during training are not listed in this card; a placeholder configuration sketch follows the training results table below.

### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 16.8069 | 0.1953 | 100 | 3.4757 |
| 4.8249 | 0.3906 | 200 | 1.0589 |
| 2.6989 | 0.5859 | 300 | 0.5940 |
| 1.8387 | 0.7812 | 400 | 0.4043 |
| 1.4115 | 0.9766 | 500 | 0.3205 |
| 1.1266 | 1.1719 | 600 | 0.2622 |
| 1.0318 | 1.3672 | 700 | 0.2170 |
| 0.7614 | 1.5625 | 800 | 0.1910 |
| 0.6442 | 1.7578 | 900 | 0.1684 |
| 0.6635 | 1.9531 | 1000 | 0.1541 |
| 0.6724 | 2.1484 | 1100 | 0.1338 |
| 0.6284 | 2.3438 | 1200 | 0.1222 |
| 0.5602 | 2.5391 | 1300 | 0.1120 |
| 0.45 | 2.7344 | 1400 | 0.1009 |
| 0.4228 | 2.9297 | 1500 | 0.0975 |
| 0.3959 | 3.125 | 1600 | 0.0891 |
| 0.3395 | 3.3203 | 1700 | 0.0849 |
| 0.3663 | 3.5156 | 1800 | 0.0802 |
| 0.3785 | 3.7109 | 1900 | 0.0741 |
| 0.3077 | 3.9062 | 2000 | 0.0700 |
| 0.3077 | 4.1016 | 2100 | 0.0632 |
| 0.2746 | 4.2969 | 2200 | 0.0578 |
| 0.2928 | 4.4922 | 2300 | 0.0564 |
| 0.2677 | 4.6875 | 2400 | 0.0517 |
| 0.2449 | 4.8828 | 2500 | 0.0508 |
| 0.1824 | 5.0781 | 2600 | 0.0478 |
| 0.2439 | 5.2734 | 2700 | 0.0428 |
| 0.1905 | 5.4688 | 2800 | 0.0402 |
| 0.2193 | 5.6641 | 2900 | 0.0384 |
| 0.1955 | 5.8594 | 3000 | 0.0365 |
| 0.1353 | 6.0547 | 3100 | 0.0314 |
| 0.123 | 6.25 | 3200 | 0.0302 |
| 0.0964 | 6.4453 | 3300 | 0.0307 |
| 0.1457 | 6.6406 | 3400 | 0.0276 |
| 0.1173 | 6.8359 | 3500 | 0.0268 |
| 0.1187 | 7.0312 | 3600 | 0.0242 |
| 0.0816 | 7.2266 | 3700 | 0.0212 |
| 0.0883 | 7.4219 | 3800 | 0.0219 |
| 0.1125 | 7.6172 | 3900 | 0.0212 |
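Purely as an illustration, the sketch below shows how a comparable run could be configured with the Hugging Face `Trainer`. The only detail taken from the table above is the 100-step evaluation and logging cadence; everything else (task head, data, learning rate, batch size, epoch count) is a placeholder assumption, not the configuration actually used for this model.

```python
# Placeholder Trainer setup consistent with the table above (evaluation and logging
# every 100 steps). All concrete values are assumptions, not this model's actual
# hyperparameters, and the tiny corpus stands in for the undocumented training data.
from datasets import Dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("ltg/norbert3-base", trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained("ltg/norbert3-base", trust_remote_code=True)

# Tiny stand-in corpus; the real training data is not described in this card.
texts = ["Dette er et eksempel.", "Norsk tekst brukes her som plassholder."]
dataset = Dataset.from_dict({"text": texts}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

args = TrainingArguments(
    output_dir="fft2sd-finetuned",   # placeholder
    eval_strategy="steps",           # called evaluation_strategy in transformers < 4.41
    eval_steps=100,                  # matches the 100-step cadence in the table
    logging_steps=100,
    num_train_epochs=8,              # placeholder; the table reaches roughly epoch 7.6
    learning_rate=5e-5,              # placeholder
    per_device_train_batch_size=16,  # placeholder
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    eval_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer),  # dynamic [MASK] corruption
)
trainer.train()
```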
Base model: [ltg/norbert3-base](https://huggingface.co/ltg/norbert3-base)