Fig. 14 | Journal of Engineering and Applied Science


From: A survey on GAN acceleration using memory compression techniques


Comparison between different distillation and pruning techniques using the CycleGAN model and the horse2zebra dataset. Panels a, b, c, d, e, and f represent the works in [25, 28, 30, 33, 34, 37], respectively. (f) [25] is a pruning technique, while all the others are distillation techniques. In (e), we report only the distillation results and omit the quantization effect for a fair comparison. Works (a) and (c) had a stronger teacher than the rest. In (c), we estimated the compression ratio as the ratio between the number of MAC operations in the original and compressed models.
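The compression-ratio estimate mentioned for work (c) can be illustrated with a short sketch. This is a hypothetical example, not code from the surveyed works: the function name and the MAC counts below are placeholders chosen for illustration only.

```python
def compression_ratio(original_macs: float, compressed_macs: float) -> float:
    """Estimate compression ratio as the ratio of MAC (multiply-accumulate)
    operation counts between the original and compressed models."""
    return original_macs / compressed_macs

# Placeholder numbers: an uncompressed generator with 60 GMACs
# compressed down to 3 GMACs gives a 20x compression ratio.
print(compression_ratio(60e9, 3e9))
```

Counting MACs rather than parameters captures the computational cost of a model, which is why it is a common proxy when reported compression metrics differ between papers.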
