Textile Research Journal, 2025 (SCI-Expanded, Scopus)
This study proposes a perceptually guided workflow for transforming digital images into knit-compatible outputs constrained by predefined yarn palettes. The workflow applies multiple error-diffusion-based color quantization methods and evaluates each result with seven perceptual and statistical quality metrics: CIEDE2000 color difference, Structural Similarity Index Measure (SSIM), Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Visual Information Fidelity (VIF), Feature Similarity Index Measure (FSIM), and Gradient Magnitude Similarity Deviation (GMSD). These metrics are normalized, perceptually weighted, and aggregated into a composite score to support method comparison. Designers can predefine target yarn colors and compare alternative reductions both visually and quantitatively. The selected image is automatically converted into a bitmap format in which each color is linked to a predefined knitting code. Experimental results show that the workflow preserves visual fidelity while respecting material constraints such as yarn availability and machine resolution. By combining color reduction, quality assessment, and code generation in a unified process, the system enables efficient and reproducible translation of digital visuals into textile-ready formats.
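The two central steps of the workflow, palette-constrained error-diffusion quantization with knitting-code mapping and the weighted aggregation of normalized quality metrics, can be sketched as below. This is a minimal illustration, not the paper's implementation: Floyd–Steinberg stands in for the paper's family of error-diffusion methods, and the yarn palette, knitting codes, metric values, and weights are all hypothetical placeholders.

```python
import numpy as np

def quantize_error_diffusion(img, palette):
    """Floyd-Steinberg error diffusion onto a fixed yarn palette.
    img: (H, W, 3) float array in [0, 255]; palette: (K, 3) array.
    Returns an (H, W) array of palette indices (the knit bitmap)."""
    work = img.astype(np.float64).copy()
    h, w, _ = work.shape
    out = np.zeros((h, w), dtype=np.int64)
    for y in range(h):
        for x in range(w):
            old = work[y, x]
            # nearest palette color in RGB (the paper weights perceptually)
            idx = int(np.argmin(((palette - old) ** 2).sum(axis=1)))
            out[y, x] = idx
            err = old - palette[idx]
            # distribute quantization error to unvisited neighbors
            if x + 1 < w:
                work[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    work[y + 1, x - 1] += err * 3 / 16
                work[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    work[y + 1, x + 1] += err * 1 / 16
    return out

def composite_score(per_method_metrics, weights, higher_is_better):
    """Min-max normalize each metric across candidate methods, flip
    error-type metrics so higher is always better, then take a
    weighted sum per method (one simple aggregation scheme)."""
    methods = list(per_method_metrics)
    scores = {m: 0.0 for m in methods}
    for name, wgt in weights.items():
        vals = np.array([per_method_metrics[m][name] for m in methods])
        span = vals.max() - vals.min()
        norm = (vals - vals.min()) / span if span else np.ones_like(vals)
        if not higher_is_better[name]:
            norm = 1.0 - norm
        for m, v in zip(methods, norm):
            scores[m] += wgt * v
    return scores

# Hypothetical yarn palette and knitting codes (illustrative only).
yarn_palette = np.array([[250.0, 250.0, 250.0],   # white yarn
                         [200.0, 30.0, 30.0],     # red yarn
                         [20.0, 20.0, 20.0]])     # black yarn
knit_codes = {0: "A", 1: "B", 2: "C"}

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(8, 8, 3))
indices = quantize_error_diffusion(img, yarn_palette)
pattern = [[knit_codes[i] for i in row] for row in indices.tolist()]

# Made-up metric values for two candidate methods (not from the paper).
metrics = {
    "floyd_steinberg": {"SSIM": 0.91, "MSE": 120.0},
    "jarvis":          {"SSIM": 0.88, "MSE": 100.0},
}
weights = {"SSIM": 0.6, "MSE": 0.4}
higher = {"SSIM": True, "MSE": False}   # MSE: lower is better
scores = composite_score(metrics, weights, higher)
```

In practice the composite score would aggregate all seven metrics listed above; two metrics and uniform-style weights are used here only to keep the sketch short.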