# Phi-3.5-mini-instruct Python Coding Assistant (GGUF, 8-bit)

A Python code generation specialist fine-tuned from Phi-3.5-mini-instruct. 66+ downloads. Runs fully locally.
## Python-First Design

Fine-tuned exclusively for Python code generation with:
- 50,000+ Python scripts from GitHub
- 200,000 Stack Overflow Q&A pairs
- 15,000 Jupyter notebooks
- PEP 8 compliant output
- Type hints and docstrings
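
As an illustration of the target output style (a hypothetical example written for this card, not a captured model response), a PEP 8 compliant function with type hints and a docstring looks like:

```python
import json
from typing import Any


def parse_json(raw: str) -> dict[str, Any]:
    """Parse a JSON string and return it as a dictionary.

    Raises:
        ValueError: If the input is valid JSON but not a JSON object.
    """
    data = json.loads(raw)
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object")
    return data
```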
## Quick Start

### Via Ollama

```bash
ollama run Agnuxo/Phi-3.5-mini-instruct-python_coding_assistant-GGUF_8bit
```
### Via Transformers

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "Agnuxo/Phi-3.5-mini-instruct-python_coding_assistant-GGUF_8bit",
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(
    "Agnuxo/Phi-3.5-mini-instruct-python_coding_assistant-GGUF_8bit"
)

prompt = "Write a Python function to parse JSON and validate schema"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# temperature only takes effect with sampling enabled
outputs = model.generate(**inputs, max_new_tokens=256, temperature=0.2, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## P2PCLAW Ecosystem
| Component | Purpose | Link |
|---|---|---|
| CAJAL-9B | Scientific papers | HF Model |
| CAJAL-4B | Lightweight papers | HF Model |
| BenchClaw | Code evaluation | HF Space |
| P2PCLAW | Research network | Website |
## Author

Francisco Angulo de Lafuente (Agnuxo1) · ORCID: 0009-0001-1634-7063

Built by the P2PCLAW Collective.