Instructions for using TheMistoAI/MistoLine with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Diffusers
How to use TheMistoAI/MistoLine with Diffusers:
```shell
pip install -U diffusers transformers accelerate
```

```python
import torch
from diffusers import DiffusionPipeline

# switch to "mps" for apple devices
pipe = DiffusionPipeline.from_pretrained(
    "TheMistoAI/MistoLine",
    dtype=torch.bfloat16,
    device_map="cuda",
)

prompt = "Astronaut in a jungle, cold color palette, muted colors, detailed, 8k"
image = pipe(prompt).images[0]
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- Draw Things
- DiffusionBee
Inference API Broken
#11
by Cyteon - opened
When I try to use the Inference API on the right side of the page, or call it via the API, it says "Model type not found".
MistoLine cannot run without the SDXL base model. You might want to try it in ComfyUI, WebUI, or Diffusers.
StrugglerXYH changed discussion status to closed
Is it possible to use with inference api?
I don't think so. The Inference API widget is generated automatically by Hugging Face; I don't know where its code could be modified.
Hi @Cyteon , did you find a solution to this? I am facing the same issue!
@StrugglerXYH can you please open the discussion again?
StrugglerXYH changed discussion status to open
I have not been able to find a solution to this either.