Self-Alignment with Instruction Backtranslation
Paper: arXiv:2308.06259
```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Spico/Humback-M0")
model = AutoModelForCausalLM.from_pretrained("Spico/Humback-M0")
```

Humback is a framework for augmenting supervised fine-tuning data with high-quality, self-generated instruction–response pairs.
This is the SFT (supervised fine-tuning) model $M_{0}$ from the Humback reproduction. It is trained on the seed data, a sample drawn from the oasst1 dataset. You may find more details and usage examples in Spico197/Humback.
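As a rough illustration of the self-augmentation step in instruction backtranslation (not the authors' actual code), the backward model generates a candidate instruction for each unlabeled response; here that model call is replaced by a placeholder stub, and all function and field names are illustrative:

```python
# Sketch of Humback-style self-augmentation. The backward
# ("instruction prediction") model is replaced by a stub here;
# names are illustrative, not taken from the Spico197/Humback repo.

def predict_instruction(response: str) -> str:
    """Stand-in for the backward model, which would generate a
    plausible instruction given an unlabeled response."""
    return f"Write a passage like: {response[:30]}"

def self_augment(unlabeled_responses):
    """Pair each unlabeled response with a generated instruction,
    yielding candidate (instruction, response) training examples."""
    return [
        {"instruction": predict_instruction(resp), "response": resp}
        for resp in unlabeled_responses
    ]

candidates = self_augment(["The capital of France is Paris."])
```

In the paper, these candidate pairs are then scored by the seed model and only the highest-rated ones are kept for the next fine-tuning round (self-curation).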
```bibtex
@misc{li2023selfalignment,
    title={Self-Alignment with Instruction Backtranslation},
    author={Xian Li and Ping Yu and Chunting Zhou and Timo Schick and Luke Zettlemoyer and Omer Levy and Jason Weston and Mike Lewis},
    year={2023},
    eprint={2308.06259},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Spico/Humback-M0")
```