ETC: Training-free Diffusion Model Acceleration with Error-aware Trend Consistency

Abstract

Diffusion models achieve remarkable generative quality but remain bottlenecked by costly iterative sampling. Recent training-free methods accelerate the diffusion process by reusing model outputs. However, these methods ignore denoising trends and lack error control tailored to model-specific tolerance, leading to trajectory deviations under multi-step reuse and exacerbating inconsistencies in the generated results. To address these issues, we introduce Error-aware Trend Consistency (ETC), a framework that (1) introduces a consistent trend predictor that leverages the smooth continuity of diffusion trajectories, projecting historical denoising patterns into stable future directions and progressively distributing them across multiple approximation steps to achieve acceleration without trajectory deviation; and (2) proposes a model-specific error tolerance search mechanism that derives corrective thresholds by identifying the transition point from volatile semantic planning to stable quality refinement. Experiments show that ETC achieves a 2.65× acceleration over FLUX with negligible consistency degradation (-0.074 SSIM).

 

Method

Overview of the proposed ETC. ETC leverages all historical model outputs to estimate future trends and dynamically adjusts the approximation frequency according to each model's error tolerance limit. The trend predictor computes weighted projections of cross-step changes in model outputs and dynamically extends or contracts the approximation window depending on whether deviations between projected trends and periodic model inferences remain within the model's corrective capacity. The model-specific error tolerance search quantifies the perceptual influence of deviation perturbations on generation quality and derives the critical transition point from the volatile semantic planning phase to the smooth quality refinement phase, which reflects the model's limit for error correction.
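The trend predictor can be illustrated with a minimal sketch. The weighting scheme and window bounds below are assumptions for illustration (the source does not specify them); the sketch shows the core idea of projecting a weighted sum of cross-step output differences forward, and growing or shrinking the approximation window depending on whether the deviation stays within tolerance.

```python
import numpy as np

def predict_trend(history, weights=None):
    """Project the next model output from historical outputs.

    history: list of model outputs (np.ndarray), oldest first.
    Computes a weighted sum of cross-step differences (recent steps
    weighted more heavily, a hypothetical choice) and adds it to the
    latest output as the projected trend direction.
    """
    diffs = [history[i + 1] - history[i] for i in range(len(history) - 1)]
    if weights is None:
        weights = np.array([2.0 ** i for i in range(len(diffs))])
        weights /= weights.sum()  # normalize so the projection is a convex combination
    trend = sum(w * d for w, d in zip(weights, diffs))
    return history[-1] + trend

def adjust_window(deviation, tolerance, window, w_min=1, w_max=8):
    """Extend the approximation window while the deviation between the
    projected trend and a periodic real inference stays within the
    model's tolerance; contract it otherwise."""
    if deviation <= tolerance:
        return min(window + 1, w_max)  # reuse trend predictions for longer
    return max(window - 1, w_min)      # fall back to real inference sooner
```

In use, the sampler would call `predict_trend` in place of the denoiser for `window` consecutive steps, run a real inference, measure the deviation, and feed it to `adjust_window`.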
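The error tolerance search can likewise be sketched under stated assumptions: given a per-step sensitivity curve (e.g., how much a fixed perturbation injected at step t degrades the final output, measured by a perceptual score such as 1 - SSIM), the transition point is taken here as the steepest drop in sensitivity. Both the sensitivity measure and the steepest-drop criterion are illustrative choices, not necessarily the authors' exact procedure.

```python
import numpy as np

def find_transition_step(sensitivities):
    """Locate the step where generation shifts from volatile semantic
    planning to stable quality refinement.

    sensitivities[t]: hypothetical degradation (e.g., 1 - SSIM) caused
    by a fixed perturbation injected at denoising step t.
    Returns the transition step and the sensitivity just after it,
    which serves as the model-specific error tolerance threshold.
    """
    s = np.asarray(sensitivities, dtype=float)
    drops = s[:-1] - s[1:]          # decrease between adjacent steps
    t_star = int(np.argmax(drops))  # steepest decrease marks the transition
    return t_star, s[t_star + 1]
```

Steps before the transition are sensitive (perturbations alter semantics), so approximation there is risky; after it, the model can absorb deviations up to the returned threshold.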

 

Results


SDXL-Base


FLUX


OpenSora-1.2


Wan-2.1