22nd February 2026: Model loading change

See the details here: https://huggingface.co/Kijai/LTXV2_comfy I will update the workflows as soon as possible to reflect the changes in ComfyUI's model loader logic.

If you have some experience with nodes, see the link above for what to change if you want to do it yourself (basically just swapping out the main model loader for a new one).

Update: ComfyUI merged a temporary "fix" so that the old workflows still work for now. Eventually you will need to switch to the updated workflows with the new model loaders: https://github.com/Comfy-Org/ComfyUI/pull/12605

I am in the process of updating my workflows to reflect the changes.


The workflows are based on the extracted models from https://huggingface.co/Kijai/LTXV2_comfy The extracted models (separate files) may run more easily on your machine and also enable GGUF support, but you can easily swap the model loader for ComfyUI's default loader if you want to load the "all in one" checkpoint with the VAE built in.
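If you would rather patch an exported workflow (API/JSON format) programmatically than re-wire nodes in the editor, a small script can swap loader node types. This is only a sketch: the node class names (`CheckpointLoaderSimple`, `UNETLoader`) and the file name are illustrative, and simply renaming the class type is not always enough, since different loaders expose different inputs and outputs that may also need re-wiring.

```python
import json

def swap_loader(workflow: dict, old_type: str, new_type: str) -> int:
    """Replace every node of old_type with new_type in a ComfyUI
    workflow dict (API format: node-id -> node definition).
    Returns the number of nodes changed."""
    changed = 0
    for node in workflow.values():
        if isinstance(node, dict) and node.get("class_type") == old_type:
            node["class_type"] = new_type
            changed += 1
    return changed

# Illustrative two-node workflow fragment (not a real LTX-2 graph):
wf = {
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "ltx2.safetensors"}},
    "2": {"class_type": "CLIPTextEncode",
          "inputs": {"text": "a cat", "clip": ["1", 1]}},
}
n = swap_loader(wf, "CheckpointLoaderSimple", "UNETLoader")
print(n, json.dumps(wf["1"]))
```

After the swap you would still open the workflow in ComfyUI and check that the new loader's inputs (and any widgets like the checkpoint file name) are wired correctly before running it.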

Model Downloads:

Needed nodes:

(Video made with LTX-2, credit to https://www.reddit.com/user/fantazart/) https://www.reddit.com/r/StableDiffusion/comments/1qeovkh/ltx2_cinematic_love_letter_to_opensource_community/


A general guide: https://docs.ltx.video/open-source-model/integration-tools/comfy-ui

More workflows:

ComfyUI official workflows: https://docs.comfy.org/tutorials/video/ltx/ltx-2

LTX-Video official workflows: https://github.com/Lightricks/ComfyUI-LTXVideo/tree/master/example_workflows

Some really nice clean workflows here: https://comfyui.nomadoor.net/en/basic-workflows/ltx-2/

RunComfy (workflows can be downloaded for local use):

LTX-2 Controlnet (pose, depth etc) https://www.runcomfy.com/comfyui-workflows/ltx-2-controlnet-in-comfyui-depth-controlled-video-workflow

LTX-2 First Last Frame https://www.runcomfy.com/comfyui-workflows/ltx-2-first-last-frame-in-comfyui-audio-visual-motion-control
