Wan2.2+TTP Tile Upscale Workflow
Model description
This is an image upscaling workflow that leverages the powerful image generation capabilities of the Wan2.2 LOW model, combined with the TTP tiled upscaling node and the KJ nodes. In testing, it upscaled a 1024x2048 image to 2048x4096 in only 2 minutes while using 16GB of VRAM and no acceleration LoRA.
The first demonstration image is the original and the second is the upscaled version. As you can see, Wan2.2 accomplished the task with almost no loss in quality. Thanks to all the open-source node developers and the Tongyi Wanxiang team for making this possible.
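The point of tiled upscaling is to process the image in fixed-size patches so that peak VRAM stays bounded regardless of output resolution. A minimal conceptual sketch of that idea (this is an illustration only, not the actual TTP node implementation; `nearest2x` is a stand-in for the diffusion model's per-tile upscaler):

```python
import numpy as np

def upscale_tiled(img, upscale_fn, factor=2, tile=512):
    """Apply upscale_fn to fixed-size tiles and stitch the results.

    Processing one tile at a time bounds peak memory, which is the
    core idea behind tiled upscaling. Real tiled upscalers also blend
    overlapping tile borders to hide seams; that is omitted here.
    """
    h, w = img.shape[:2]
    out = np.zeros((h * factor, w * factor) + img.shape[2:], dtype=img.dtype)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            patch = img[y:y + tile, x:x + tile]
            out[y * factor:(y + patch.shape[0]) * factor,
                x * factor:(x + patch.shape[1]) * factor] = upscale_fn(patch)
    return out

# Demo: nearest-neighbour 2x upscaling standing in for the model.
nearest2x = lambda t: t.repeat(2, axis=0).repeat(2, axis=1)
small = np.arange(12, dtype=np.uint8).reshape(3, 4)
big = upscale_tiled(small, nearest2x, factor=2, tile=2)  # shape (6, 8)
```

In the actual workflow, each tile is re-generated by the Wan2.2 LOW model at higher resolution, so the per-tile step is far more expensive than this toy version, but the memory-bounding structure is the same.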
Wan2.2 A14B I2V low model: https://huggingface.co/Kijai/WanVideo_comfy_fp8_scaled/tree/main/I2V
DAT upscale model: https://github.com/zhengchen1999/DAT
This workflow was tested with Triton and SageAttention 2.1.1 installed. If these acceleration modules are not available, you can try disconnecting the WanVideo Torch Compile Settings node and switching the attention_mode in the WanVideo Model Loader node to the default sdpa.
Generation will take somewhat longer, but output quality is unaffected.
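The sdpa fallback corresponds to PyTorch's built-in fused attention, `torch.nn.functional.scaled_dot_product_attention`, which is available in PyTorch 2.0+ with no extra modules. A minimal sketch of that call (the tensor shapes here are illustrative, not the workflow's actual dimensions):

```python
import torch
import torch.nn.functional as F

# Illustrative shapes: (batch, heads, sequence_length, head_dim).
q = torch.randn(1, 8, 256, 64)
k = torch.randn(1, 8, 256, 64)
v = torch.randn(1, 8, 256, 64)

# PyTorch picks the fastest available backend (flash, memory-efficient,
# or plain math) automatically; no Triton or SageAttention required.
out = F.scaled_dot_product_attention(q, k, v)  # same shape as q
```

SageAttention and Triton-compiled kernels implement the same operation with faster custom kernels, which is why switching to sdpa changes speed but not output quality.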



