---
license: apache-2.0
tags:
- pruned
- python
- optimized
base_model: LGAI-EXAONE/EXAONE-4.0-1.2B
---

# EXAONE-4.0-1.2B-python-extra-light

This model is an **extra-light** pruned version of [LGAI-EXAONE/EXAONE-4.0-1.2B](https://huggingface.co/LGAI-EXAONE/EXAONE-4.0-1.2B), specialized for **Python** tasks.

## Pruning Details

- **Base Model**: LGAI-EXAONE/EXAONE-4.0-1.2B
- **Specialization**: Python
- **Prune Mode**: Extra-light
- **Method**: Activation-based weight pruning

## Performance Comparison

| Category  | Original | Pruned |
|-----------|----------|--------|
| Python    | 20.0%    | 20.0%  |
| HTML      | 6.7%     | 0.0%   |
| Trivia    | 86.7%    | N/A    |
| Math      | 60.0%    | N/A    |
| Reasoning | N/A      | N/A    |
| Medical   | 93.3%    | N/A    |
| Linux     | 93.3%    | N/A    |
| Writing   | 46.7%    | N/A    |

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-extra-light")
tokenizer = AutoTokenizer.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-extra-light")
```

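Since the model is specialized for Python, a typical next step after loading is generating code from a prompt. The sketch below uses the standard `generate` API from `transformers`; the prompt, `max_new_tokens`, and greedy decoding are illustrative choices, not settings recommended by the model authors, and EXAONE 4.0 architecture support may require a recent `transformers` release.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "CompactAI/EXAONE-4.0-1.2B-python-extra-light"
model = AutoModelForCausalLM.from_pretrained(repo)
tokenizer = AutoTokenizer.from_pretrained(repo)

# Illustrative Python-coding prompt; not from the model card.
prompt = "Write a Python function that reverses a string:\n"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=False,  # greedy decoding for reproducible output
    )

# The decoded sequence includes the prompt followed by the completion.
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```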

## License

This model inherits the license from the base model.