---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: tiny-audio-embedded
  results: []
---

# tiny-audio-embedded

This model is a fine-tuned version of [](https://huggingface.co/) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9147

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 2000
- num_epochs: 2
- label_smoothing_factor: 0.1
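
These settings correspond roughly to the following `transformers.TrainingArguments` configuration. This is a sketch, not the exact training script: the `output_dir` name is an assumption, and any argument not listed above is left at its library default.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is assumed.
args = TrainingArguments(
    output_dir="tiny-audio-embedded",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 32 * 2 = 64
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=2000,
    num_train_epochs=2,
    label_smoothing_factor=0.1,
)
```

Note that with gradient accumulation, the reported total_train_batch_size of 64 is the number of examples consumed per optimizer step, not per forward pass.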

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 2.4254 | 0.0309 | 1000 | 1.0182 |
| 2.2814 | 0.0617 | 2000 | 0.9839 |
| 2.2150 | 0.0926 | 3000 | 0.9564 |
| 2.1892 | 0.1234 | 4000 | 0.9486 |
| 2.1949 | 0.1543 | 5000 | 0.9442 |
| 2.1758 | 0.1851 | 6000 | 0.9387 |
| 2.1483 | 0.2160 | 7000 | 0.9367 |
| 2.1435 | 0.2469 | 8000 | 0.9336 |
| 2.1568 | 0.2777 | 9000 | 0.9336 |
| 2.1350 | 0.3086 | 10000 | 0.9298 |
| 2.1592 | 0.3394 | 11000 | 0.9312 |
| 2.1291 | 0.3703 | 12000 | 0.9279 |
| 2.1377 | 0.4011 | 13000 | 0.9271 |
| 2.1456 | 0.4320 | 14000 | 0.9278 |
| 2.1421 | 0.4628 | 15000 | 0.9254 |
| 2.1457 | 0.4937 | 16000 | 0.9239 |
| 2.1270 | 0.5246 | 17000 | 0.9247 |
| 2.1311 | 0.5554 | 18000 | 0.9232 |
| 2.1075 | 0.5863 | 19000 | 0.9220 |
| 2.1268 | 0.6171 | 20000 | 0.9221 |
| 2.1180 | 0.6480 | 21000 | 0.9210 |
| 2.0914 | 0.6788 | 22000 | 0.9211 |
| 2.1120 | 0.7097 | 23000 | 0.9214 |
| 2.1319 | 0.7406 | 24000 | 0.9206 |
| 2.1328 | 0.7714 | 25000 | 0.9203 |
| 2.1336 | 0.8023 | 26000 | 0.9192 |
| 2.0920 | 0.8331 | 27000 | 0.9193 |
| 2.0895 | 0.8640 | 28000 | 0.9191 |
| 2.1330 | 0.8948 | 29000 | 0.9184 |
| 2.1262 | 0.9257 | 30000 | 0.9179 |
| 2.0950 | 0.9566 | 31000 | 0.9177 |
| 2.1082 | 0.9874 | 32000 | 0.9177 |
| 2.0877 | 1.0183 | 33000 | 0.9175 |
| 2.1202 | 1.0491 | 34000 | 0.9170 |
| 2.1147 | 1.0800 | 35000 | 0.9168 |
| 2.0989 | 1.1108 | 36000 | 0.9165 |
| 2.0941 | 1.1417 | 37000 | 0.9162 |
| 2.1437 | 1.1725 | 38000 | 0.9163 |
| 2.0914 | 1.2034 | 39000 | 0.9160 |
| 2.0870 | 1.2343 | 40000 | 0.9160 |
| 2.0900 | 1.2651 | 41000 | 0.9159 |
| 2.1074 | 1.2960 | 42000 | 0.9158 |
| 2.0863 | 1.3268 | 43000 | 0.9156 |
| 2.0879 | 1.3577 | 44000 | 0.9155 |
| 2.0966 | 1.3885 | 45000 | 0.9151 |
| 2.0793 | 1.4194 | 46000 | 0.9151 |
| 2.0587 | 1.4503 | 47000 | 0.9148 |
| 2.0919 | 1.4811 | 48000 | 0.9148 |
| 2.0917 | 1.5120 | 49000 | 0.9149 |
| 2.0948 | 1.5428 | 50000 | 0.9148 |
| 2.1051 | 1.5737 | 51000 | 0.9148 |
| 2.1150 | 1.6045 | 52000 | 0.9148 |
| 2.0989 | 1.6354 | 53000 | 0.9149 |
| 2.0856 | 1.6663 | 54000 | 0.9147 |
| 2.0850 | 1.6971 | 55000 | 0.9148 |
| 2.0982 | 1.7280 | 56000 | 0.9147 |
| 2.1025 | 1.7588 | 57000 | 0.9147 |
| 2.0903 | 1.7897 | 58000 | 0.9148 |
| 2.0694 | 1.8205 | 59000 | 0.9147 |
| 2.1191 | 1.8514 | 60000 | 0.9148 |
| 2.0871 | 1.8823 | 61000 | 0.9147 |
| 2.0957 | 1.9131 | 62000 | 0.9147 |
| 2.0817 | 1.9440 | 63000 | 0.9147 |
| 2.1124 | 1.9748 | 64000 | 0.9147 |
| 2.0836 | 2.0 | 64816 | 0.9147 |
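
As a quick sanity check on the table, the Epoch column is simply the optimizer step divided by the steps per epoch: the run ends at step 64816 after 2 epochs, so one epoch is 32408 optimizer steps.

```python
# Reproduce the epoch values reported in the training-results table.
total_steps, num_epochs = 64816, 2
steps_per_epoch = total_steps / num_epochs  # 32408.0 optimizer steps per epoch

for step in (1000, 32000, 64816):
    print(step, round(step / steps_per_epoch, 4))  # matches the Epoch column
```

With the effective batch size of 64 listed above, one epoch covers roughly 32408 × 64 ≈ 2.07M training examples (the last batch may be partial).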

### Framework versions

- Transformers 5.6.1
- Pytorch 2.11.0+cu130
- Datasets 3.6.0
- Tokenizers 0.22.2