RATQ: A Universal Fixed-Length Quantizer for Stochastic Optimization
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1399-1409, 2020.
Abstract
We present Rotated Adaptive Tetra-iterated Quantizer (RATQ), a fixed-length quantizer for gradients in first-order stochastic optimization. RATQ is easy to implement and involves only a Hadamard transform computation and adaptive uniform quantization with appropriately chosen dynamic ranges. For noisy gradients with almost surely bounded Euclidean norms, we establish an information-theoretic lower bound on the optimization accuracy attainable with finite-precision gradients and show that RATQ almost attains this lower bound. For mean-square-bounded noisy gradients, we use a gain-shape quantizer that separately quantizes the Euclidean norm and uses RATQ to quantize the normalized unit-norm vector. We establish lower bounds on the performance of any optimization procedure and any shape quantizer when used with a uniform gain quantizer. Finally, we propose an adaptive gain quantizer which, when used with RATQ as the shape quantizer, outperforms uniform gain quantization and is, in fact, close to optimal.
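To make the rotate-then-quantize structure described in the abstract concrete, the following is a minimal illustrative sketch: a random diagonal sign flip followed by a Hadamard rotation, then coordinate-wise uniform quantization over a fixed dynamic range. It does not reproduce the paper's tetra-iterated, adaptively chosen dynamic ranges or the gain-shape construction; the function names, the choice of `dynamic_range`, and the number of levels `k` are assumptions made only for illustration.

```python
import numpy as np
from scipy.linalg import hadamard

def randomized_hadamard_rotate(x, rng):
    """Rotate x by (H / sqrt(d)) D, where D is a random +/-1 diagonal matrix."""
    d = len(x)                          # assumed to be a power of 2
    signs = rng.choice([-1.0, 1.0], size=d)
    H = hadamard(d) / np.sqrt(d)        # orthonormal Hadamard matrix
    return H @ (signs * x), signs

def uniform_quantize(y, dynamic_range, k):
    """Map each coordinate to the nearest of k uniformly spaced levels in
    [-dynamic_range, +dynamic_range]; out-of-range values snap to the extremes."""
    levels = np.linspace(-dynamic_range, dynamic_range, k)
    idx = np.argmin(np.abs(y[:, None] - levels[None, :]), axis=1)
    return levels[idx]

rng = np.random.default_rng(0)
g = rng.normal(size=8)                  # a toy "gradient" with d = 8
rotated, signs = randomized_hadamard_rotate(g, rng)
# Fixed dynamic range set to ||g|| here purely for illustration; the paper's
# quantizer instead adapts the range per coordinate group.
q = uniform_quantize(rotated, dynamic_range=np.linalg.norm(g), k=16)
# Decoding inverts the orthonormal rotation (H is symmetric, D is its own inverse).
g_hat = signs * ((hadamard(8) / np.sqrt(8)) @ q)
print(np.linalg.norm(g - g_hat))        # quantization error of the toy sketch
```

The rotation spreads the gradient's energy roughly evenly across coordinates, which is what lets a uniform quantizer with a modest dynamic range cover all coordinates without large clipping error.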