---
base_model:
- lainlives/codellama-34b-python-merge
---
# lainlives/codellama-34b-python-merge (Quantized)
## Description
This model is a quantized version of the original model [`lainlives/codellama-34b-python-merge`](https://huggingface.co/lainlives/codellama-34b-python-merge).
It was quantized to 4-bit with the BitsAndBytes library via the [bnb-my-repo](https://huggingface.co/spaces/bnb-community/bnb-my-repo) space.
## Quantization Details
- **Quantization Type**: int4
- **bnb_4bit_quant_type**: nf4
- **bnb_4bit_use_double_quant**: True
- **bnb_4bit_compute_dtype**: bfloat16
- **bnb_4bit_quant_storage**: uint8