Search Results for author: Simla Burcu Harma

Found 2 papers, 0 papers with code

Effective Interplay between Sparsity and Quantization: From Theory to Practice

no code implementations • 31 May 2024 • Simla Burcu Harma, Ayan Chakraborty, Elizaveta Kostenok, Danila Mishin, Dongho Ha, Babak Falsafi, Martin Jaggi, Ming Liu, Yunho Oh, Suvinay Subramanian, Amir Yazdanbakhsh

In addition, through rigorous analysis, we demonstrate that sparsity and quantization are not orthogonal; their interaction can significantly harm model accuracy, with quantization error playing a dominant role in this degradation.
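Since no implementation is linked, the snippet below is only a minimal numpy sketch of the point being made, not code from the paper: magnitude pruning and uniform symmetric int4 quantization are stand-ins for whichever sparsity and quantization schemes the paper actually studies, and the helper names (magnitude_prune, quantize_int, rel_err), tensor shape, sparsity level, and bit width are arbitrary illustration choices. It simply shows that the error of sparsifying and then quantizing exceeds either error alone.

```python
# Minimal sketch (not from the paper): magnitude pruning followed by uniform
# symmetric int4 quantization, to illustrate how the two errors compound.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=(1024, 1024)).astype(np.float32)

def magnitude_prune(x, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of entries (unstructured)."""
    k = int(sparsity * x.size)
    thresh = np.partition(np.abs(x).ravel(), k)[k]
    return np.where(np.abs(x) >= thresh, x, 0.0)

def quantize_int(x, bits=4):
    """Uniform symmetric quantization to `bits` bits, dequantized back to float."""
    qmax = 2 ** (bits - 1) - 1
    scale = np.abs(x).max() / qmax
    return np.clip(np.round(x / scale), -qmax, qmax) * scale

def rel_err(ref, approx):
    """Relative L2 error of an approximation."""
    return np.linalg.norm(ref - approx) / np.linalg.norm(ref)

w_s = magnitude_prune(w, sparsity=0.5)   # sparsity only
w_q = quantize_int(w, bits=4)            # quantization only
w_sq = quantize_int(w_s, bits=4)         # sparsity, then quantization

print(f"sparsity only     : {rel_err(w, w_s):.4f}")
print(f"quantization only : {rel_err(w, w_q):.4f}")
print(f"sparsity + quant  : {rel_err(w, w_sq):.4f}")
```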

Accuracy Booster: Enabling 4-bit Fixed-point Arithmetic for DNN Training

no code implementations • 19 Nov 2022 • Simla Burcu Harma, Ayan Chakraborty, Nicholas Sperry, Babak Falsafi, Martin Jaggi, Yunho Oh

Based on our findings, we propose Accuracy Booster, a mixed-mantissa HBFP technique that uses 4-bit mantissas for over 99% of all arithmetic operations in training and 6-bit mantissas only in the last epoch and first/last layers.
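No code is linked here either, so the following is a rough numpy sketch of the general block floating-point idea behind HBFP (values in a block share one power-of-two scale and keep narrow signed mantissas), plus a toy mantissa-width policy mirroring the mixed-mantissa description above. The function names (bfp_quantize, mantissa_bits_for), the block size of 64, and the layer-indexing policy are assumptions for illustration, not the authors' implementation.

```python
# Sketch of block floating point with a shared per-block power-of-two scale
# and a configurable mantissa width (dequantized back to float for inspection).
import numpy as np

def bfp_quantize(x, mantissa_bits=4, block_size=64):
    """Quantize a 1-D array block-wise: one shared scale per block,
    signed mantissas of `mantissa_bits` bits."""
    x = np.asarray(x, dtype=np.float32)
    pad = (-x.size) % block_size
    blocks = np.pad(x, (0, pad)).reshape(-1, block_size)
    qmax = 2 ** (mantissa_bits - 1) - 1
    # Shared power-of-two scale chosen so the block's max magnitude fits in qmax.
    max_abs = np.abs(blocks).max(axis=1, keepdims=True)
    scale = 2.0 ** np.ceil(np.log2(np.maximum(max_abs, 1e-30) / qmax))
    mant = np.clip(np.round(blocks / scale), -qmax, qmax)
    out = (mant * scale).ravel()
    return out[: x.size] if pad else out

def mantissa_bits_for(layer_idx, num_layers, last_epoch=False):
    """Toy mixed-mantissa policy: 6-bit mantissas only for the first/last
    layers or in the last epoch, 4-bit everywhere else (hypothetical)."""
    if last_epoch or layer_idx in (0, num_layers - 1):
        return 6
    return 4

# Example: compare 4-bit and 6-bit BFP reconstruction error on random data.
x = np.random.default_rng(0).normal(size=4096).astype(np.float32)
for bits in (4, 6):
    err = np.linalg.norm(x - bfp_quantize(x, bits)) / np.linalg.norm(x)
    print(f"{bits}-bit BFP relative error: {err:.4f}")
```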
