Token Merging: Speed up Stable Diffusion with this one simple trick!

Installation: https://github.com/dbolya/tomesd#installation
Token Merging (ToMe) speeds up transformers by merging redundant tokens, reducing the amount of work the transformer has to do without compromising quality. ToMe for SD applies the technique to the transformer blocks inside Stable Diffusion, preserving the speed-up and memory benefits while minimizing quality loss. It requires no training, works with any Stable Diffusion model, and can reduce the workload by up to 60%.

Importantly, ToMe for SD is not another efficient reimplementation of transformer modules: it is an actual reduction of the total workload required to generate an image. In practice, it produces images close to the originals while running faster and using less memory, making it an efficient tool for image generation.
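To give a feel for what "merging redundant tokens" means, here is a minimal, illustrative NumPy sketch of the bipartite soft matching idea that ToMe is built on: tokens are split into two sets, the most similar pairs across the sets are found, and the r most redundant tokens are averaged into their matches, shrinking the token count from n to n - r. The function name, the alternating partition, and the averaging details are simplifications for illustration, not the library's exact implementation.

```python
import numpy as np

def bipartite_soft_matching_merge(tokens: np.ndarray, r: int) -> np.ndarray:
    """Merge r redundant tokens via bipartite soft matching (illustrative sketch).

    tokens: (n, d) array of token embeddings; returns an (n - r, d) array.
    """
    n, d = tokens.shape
    # Partition alternating tokens into a source set (candidates to be merged)
    # and a destination set (merge targets).
    src, dst = tokens[0::2], tokens[1::2]

    # Cosine similarity between every source token and every destination token.
    a = src / np.linalg.norm(src, axis=1, keepdims=True)
    b = dst / np.linalg.norm(dst, axis=1, keepdims=True)
    sim = a @ b.T

    best_dst = sim.argmax(axis=1)  # most similar destination for each source token
    best_sim = sim.max(axis=1)

    # The r source tokens with the highest similarity are the most redundant;
    # merge each of them into its best destination by (running) averaging.
    merge_idx = np.argsort(-best_sim)[:r]
    keep_mask = np.ones(len(src), dtype=bool)
    keep_mask[merge_idx] = False

    merged_dst = dst.copy()
    counts = np.ones(len(dst))
    for i in merge_idx:
        j = best_dst[i]
        merged_dst[j] = (merged_dst[j] * counts[j] + src[i]) / (counts[j] + 1)
        counts[j] += 1

    # Unmerged source tokens plus the (possibly averaged) destination tokens:
    # n - r tokens total.
    return np.concatenate([src[keep_mask], merged_dst], axis=0)
```

Because the merged tokens are averages of near-duplicates, the transformer block that runs afterwards sees fewer tokens but roughly the same information, which is where the speed and memory savings come from.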