SECRET SAUCE WAN 2.2

Model description

This Pack Contains:

INGREDIENTS for specialized features:

  • 14 High-Noise Models

  • 10 Low-Noise Models

  • 7 Face Models (HuggingFace link included)

SAUCE STATES:

  • 1 High & Low Noise Merged Model
    A mixture of selected ingredients, baked together with Lightning and HPS LoRAs. This versatile, pre-made stack is based on the fp8_scaled low-noise model from Kijai and the Lightning high-noise "DYNO" model from lightx2v.

  • 1 High & Low Noise Single-Train LoRA
    A branch of a continuous training run, offered as an alternative to stacking the individual feature LoRAs (the singular Ingredients models) or using the merged models.

While the "INGREDIENTS" LoRAs are mostly trained on specific concepts, these "SS0_SINGLE_TRAIN" LoRAs are trained on multiple concepts, much like my previously released models. They are potentially more stable than a stack, but also less "tunable".

Be careful when stacking models not to overcook your renders; the settings used for the merged models are available in the MODEL_MERGE folder.
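
As a rough illustration of keeping a stack from overcooking (not the exact recipe from the MODEL_MERGE folder), here is a minimal diffusers-style sketch. The repo ID, file names, and strengths are placeholders, and the showcase workflow itself uses the Kijai wrapper in ComfyUI rather than this API.

```python
# Hypothetical sketch: stacking a few "INGREDIENTS" LoRAs at moderate strengths.
# Repo ID, file names, and weights are placeholders, not the MODEL_MERGE settings.
import torch
from diffusers import WanPipeline  # assumes a diffusers build with WAN support

pipe = WanPipeline.from_pretrained(
    "<wan-2.2-t2v-model-repo>", torch_dtype=torch.bfloat16
)

# Load individual feature LoRAs under distinct adapter names.
pipe.load_lora_weights("SS_ingredient_a.safetensors", adapter_name="ingredient_a")
pipe.load_lora_weights("SS_ingredient_b.safetensors", adapter_name="ingredient_b")
pipe.load_lora_weights("lightx2v_lightning_dyno.safetensors", adapter_name="lightning")

# Keep combined strengths conservative: lowering each weight as the stack grows
# is the usual way to avoid "overcooked" (oversaturated, artifacted) renders.
pipe.set_adapters(
    ["ingredient_a", "ingredient_b", "lightning"],
    adapter_weights=[0.6, 0.5, 1.0],
)
```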

Some video samples contain male-specific features that I embedded by mistake. I didn’t include these LoRAs in the pack, as they don’t fit the art direction of this release. I might release them later on Hugging Face or elsewhere. 

All showcase videos were generated using T2V. I recommend Euler, 8/12 steps, shift 8.
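
For context, here is how those settings could map onto the sketch above (Euler flow-matching scheduler, 8 steps, shift 8). This is only an assumption-labeled illustration, not the ComfyUI workflow linked below, and the prompt is a placeholder.

```python
# Hypothetical continuation of the sketch above: Euler sampling with the
# recommended step count and shift. Prompt and guidance value are placeholders.
from diffusers import FlowMatchEulerDiscreteScheduler

pipe.scheduler = FlowMatchEulerDiscreteScheduler.from_config(
    pipe.scheduler.config, shift=8.0
)

video = pipe(
    prompt="<your prompt>",
    num_inference_steps=8,   # 8 or 12 steps, per the recommendation above
    guidance_scale=1.0,      # Lightning-style LoRAs are typically run with low CFG
).frames[0]
```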

My workflows are here:

/model/1818841/wan-22-workflow-t2v-i2v-t2i-kijai-wrapper

For easy and fast LoRA comparisons/tests, see this article and the workflow provided with it. It's great:
https://civitai.com/articles/19612


Think of this pack as a large toolbox for WAN 2.2 inference.

I’m not posting every model separately, but feel free to keep only what you need if disk space is limited. The full content is about 30GB, including the Faces pack from the HuggingFace link.

If you enjoy this pack and want to support my work, you can buy me a coffee here:
👉 buymeacoffee.com/designedbycrt

Thanks.
