Terms of Use: This model is governed by the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license (CC BY-NC-ND 4.0). By using these weights, you agree to the following conditions:

Personal Use: Granted for local inference and research.

No Redistribution: Do not re-upload these files to any other platform.

No Derivatives: You are strictly prohibited from modifying or fine-tuning these weights.

Non-Commercial: You cannot sell this model or use it in paid products.

Social media content: You are permitted to feature nanobit-300m in YouTube, TikTok, or other social media content. Requirement: you must include a direct link to this repository in the video description.

Implementation:

```python
import torch
from transformers import AutoTokenizer, LlamaForCausalLM

path = "imsuprtwo2/nanobit-300m"

tokenizer = AutoTokenizer.from_pretrained(path)
model = LlamaForCausalLM.from_pretrained(path).to("cuda")

prompt = "Instruction: Write a python script to sort a list. \nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
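The snippet above implies an `Instruction: ... \nOutput:` prompt template. If you are sending multiple instructions to the model, a small helper keeps the template consistent; note that `build_prompt` below is a hypothetical convenience function for illustration, not part of this repository, and the exact template (including the trailing space before the newline) is inferred from the example above.

```python
def build_prompt(instruction: str) -> str:
    # Reproduce the prompt template shown in the usage example:
    # "Instruction: <task> \nOutput:" (hypothetical helper, assumed template).
    return f"Instruction: {instruction} \nOutput:"

# Example: build the same prompt used in the snippet above.
prompt = build_prompt("Write a python script to sort a list.")
print(prompt)
```

Keeping prompt construction in one place makes it easy to adjust if the model was actually trained with a slightly different template.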
