How to use with the Transformers library

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="jackysnake/RedCoder")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("jackysnake/RedCoder")
model = AutoModelForCausalLM.from_pretrained("jackysnake/RedCoder")
messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
# Decode only the newly generated tokens, dropping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```
REDCODER: Automated Multi-Turn Red Teaming for Code LLMs

🔬 A model fine-tuned for adversarial multi-turn prompt generation to induce vulnerabilities in Code LLMs.
📄 Paper: arXiv:2507.22063 • 💻 Full code & data: GitHub – luka-group/RedCoder


🧠 Model Summary

REDCODER is a red-teaming LLM trained to engage target Code LLMs in multi-turn conversations that gradually steer them into generating code containing CWE-listed vulnerabilities (e.g., path traversal, SQL injection).

This model is designed to support:

  • ⚔️ Red-teaming evaluations for Code LLMs
  • 🧪 Security benchmarking of model guardrails and filters
  • 🧩 Multi-turn adversarial prompt generation in research settings

⚠️ This model should not be used to generate real-world exploits. Its intended use is for research, safety evaluation, and secure LLM development.
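As a rough illustration of the multi-turn setup, a red-teaming harness can alternate attacker and target turns over a shared transcript. This is a minimal sketch, not the official REDCODER harness: `run_conversation`, `attacker_reply`, and `target_reply` are hypothetical helpers standing in for calls to this model and to the target Code LLM.

```python
# Hypothetical multi-turn harness sketch. `attacker_reply` and `target_reply`
# are placeholders for generate() calls to the red-teaming model and the
# target Code LLM, respectively; each receives the full message history.

def run_conversation(attacker_reply, target_reply, opening_prompt, max_turns=3):
    """Alternate target/attacker turns and return the full transcript."""
    # The attacker's opening prompt starts the conversation as a user turn.
    transcript = [{"role": "user", "content": opening_prompt}]
    for _ in range(max_turns):
        # The target model answers the latest adversarial prompt.
        answer = target_reply(transcript)
        transcript.append({"role": "assistant", "content": answer})
        # The attacker crafts the next follow-up from the history so far.
        follow_up = attacker_reply(transcript)
        transcript.append({"role": "user", "content": follow_up})
    return transcript
```

In the real setting, the attacker turn would be a `generate()` call on REDCODER and the target turn a call to the Code LLM under evaluation, each rendered through its own chat template.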


If you find this work useful, please cite:

@article{mo2025redcoder,
  title   = {REDCODER: Automated Multi-Turn Red Teaming for Code LLMs},
  author  = {Wenjie Jacky Mo and Qin Liu and Xiaofei Wen and Dongwon Jung and
             Hadi Askari and Wenxuan Zhou and Zhe Zhao and Muhao Chen},
  journal = {arXiv preprint arXiv:2507.22063},
  year    = {2025}
}
Model size: 8B parameters · Tensor type: BF16 · Format: Safetensors