---
license: apache-2.0
tags:
- pruned
- python
- optimized
base_model: LGAI-EXAONE/EXAONE-4.0-1.2B
---
# EXAONE-4.0-1.2B-python-light

This model is a **light** pruned version of [LGAI-EXAONE/EXAONE-4.0-1.2B](https://huggingface.co/LGAI-EXAONE/EXAONE-4.0-1.2B), specialized for **Python** tasks.
## Pruning Details

- **Base Model**: LGAI-EXAONE/EXAONE-4.0-1.2B
- **Specialization**: Python
- **Prune Mode**: Light
- **Method**: Activation-based weight pruning (see the sketch below)
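The exact pruning pipeline for this model is not documented here. The snippet below is only a minimal, hypothetical sketch of what activation-based weight pruning typically looks like: weights are scored by their magnitude times the norm of the calibration activations feeding them (Wanda-style), and the lowest-scoring entries in each row are zeroed. The function name, sparsity level, and calibration data are illustrative placeholders.

```python
import torch
import torch.nn as nn

def prune_linear_by_activation(layer: nn.Linear, calib_inputs: torch.Tensor, sparsity: float = 0.2):
    """Zero out the lowest-scoring weights of a Linear layer.

    calib_inputs: (num_samples, in_features) activations that feed this layer.
    sparsity: fraction of weights to remove in each output row.
    """
    with torch.no_grad():
        # Per-input-channel activation norm over the calibration batch.
        act_norm = calib_inputs.norm(p=2, dim=0)        # (in_features,)
        # Importance score: |W_ij| * ||x_j|| (broadcast over output rows).
        scores = layer.weight.abs() * act_norm          # (out_features, in_features)
        # Zero the k lowest-scoring weights per row.
        k = int(layer.in_features * sparsity)
        if k > 0:
            idx = torch.topk(scores, k, dim=1, largest=False).indices
            mask = torch.ones_like(layer.weight)
            mask.scatter_(1, idx, 0.0)
            layer.weight.mul_(mask)

# Toy example: prune a single layer at 20% sparsity with random calibration data.
layer = nn.Linear(64, 32)
calib = torch.randn(128, 64)
prune_linear_by_activation(layer, calib, sparsity=0.2)
print(f"zeroed weights: {(layer.weight == 0).float().mean():.2%}")
```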
## Performance Comparison

| Category  | Original | Pruned |
|-----------|----------|--------|
| Python    | 20.0%    | 20.0%  |
| HTML      | 6.7%     | 0.0%   |
| Trivia    | 86.7%    | N/A    |
| Math      | 60.0%    | N/A    |
| Reasoning | N/A      | N/A    |
| Medical   | 93.3%    | N/A    |
| Linux     | 93.3%    | N/A    |
| Writing   | 46.7%    | N/A    |

## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the pruned model and its tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-light")
tokenizer = AutoTokenizer.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-light")
```
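A minimal generation sketch follows. The chat-template usage, dtype, device placement, prompt, and generation settings are assumptions and may need adjusting for your environment and transformers version.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CompactAI/EXAONE-4.0-1.2B-python-light"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed dtype; use float32 on CPU-only setups
    device_map="auto",
)

# Illustrative Python-coding prompt, formatted with the model's chat template.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```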
## License

This model inherits the Apache-2.0 license from the base model.