---
license: apache-2.0
tags:
- pruned
- python
- optimized
base_model: openai-community/openai-gpt
---
# openai-gpt-python-heavy
This model is a **heavy** pruned version of [openai-community/openai-gpt](https://huggingface.co/openai-community/openai-gpt), specialized for **Python** tasks.
## Pruning Details
- **Base Model**: openai-community/openai-gpt
- **Specialization**: Python
- **Prune Mode**: Heavy
- **Method**: Activation-based weight pruning (see the sketch after this list)
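The exact pruning recipe is not published in this card, but the general idea of activation-based weight pruning can be sketched as follows: run a small Python calibration sample through the model, score each weight by its magnitude times the norm of the activations it sees, and zero out the lowest-scoring weights. The calibration text, the 50% sparsity ratio, and the restriction to `nn.Linear` layers below are illustrative assumptions, not the settings used to produce this checkpoint (GPT-style models in `transformers` also implement most projections as `Conv1D`, which a real pass would need to cover).

```python
# Illustrative sketch of activation-based weight pruning, NOT the exact
# recipe used for this model. Weights are scored as |W| * ||input activation||.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("openai-community/openai-gpt")
tokenizer = AutoTokenizer.from_pretrained("openai-community/openai-gpt")

# Tiny placeholder calibration sample; a real run would use many Python files.
calib = tokenizer("def add(a, b):\n    return a + b", return_tensors="pt")

# Record the per-input-feature activation norms seen by each nn.Linear layer.
act_norms, hooks = {}, []

def make_hook(name):
    def hook(module, inputs, output):
        x = inputs[0].detach().reshape(-1, inputs[0].shape[-1])
        act_norms[name] = x.norm(p=2, dim=0)  # shape: (in_features,)
    return hook

for name, module in model.named_modules():
    if isinstance(module, torch.nn.Linear):
        hooks.append(module.register_forward_hook(make_hook(name)))

with torch.no_grad():
    model(**calib)
for h in hooks:
    h.remove()

# Zero out the lowest-importance weights in each instrumented layer.
sparsity = 0.5  # assumed "heavy" prune ratio, for illustration only
for name, module in model.named_modules():
    if isinstance(module, torch.nn.Linear) and name in act_norms:
        importance = module.weight.abs() * act_norms[name]  # (out, in)
        k = int(sparsity * importance.numel())
        if k > 0:
            threshold = importance.flatten().kthvalue(k).values
            module.weight.data[importance <= threshold] = 0.0
```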
## Performance Comparison
| Category  | Original | Pruned |
|-----------|----------|--------|
| Python    | 0.0%     | 0.0%   |
| HTML      | 0.0%     | 0.0%   |
| Trivia    | 0.0%     | 0.0%   |
| Math      | 0.0%     | 0.0%   |
| Reasoning | 0.0%     | 0.0%   |
| Medical   | 0.0%     | 0.0%   |
| Linux     | 0.0%     | 0.0%   |
| Writing   | 0.0%     | 0.0%   |
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the pruned checkpoint and its tokenizer from the Hub.
model = AutoModelForCausalLM.from_pretrained("CompactAI/openai-gpt-python-heavy")
tokenizer = AutoTokenizer.from_pretrained("CompactAI/openai-gpt-python-heavy")
```
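A quick generation smoke test built on the snippet above; the prompt and decoding settings are placeholders, not tuned recommendations:

```python
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the example deterministic.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```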
## License
This model inherits its license from the base model, [openai-community/openai-gpt](https://huggingface.co/openai-community/openai-gpt).