The present disclosure relates to a method and system for training neural networks. It describes an adaptive training algorithm, Adam Q, that addresses the challenges associated with its predecessors. A look-up table is fused with the existing Adam algorithm so that Adam Q need not run all of the computational operations when determining the updated weights; instead, it can read the results directly from the fused look-up table. The disclosure further proposes a quantization technique in which the inputs received from previous iterations are first quantized and then used to index the look-up table, making the proposed algorithm more efficient. Thus, by quantizing the inputs and fusing the look-up table, Adam Q aims to provide a more computationally efficient and financially sustainable way of training while ensuring data privacy.
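The idea described above can be sketched as follows. This is a minimal, illustrative interpretation, not the disclosed implementation: it assumes the quantized quantity is the bias-corrected second moment, that the fused look-up table stores the precomputed per-element factor 1/(sqrt(v_hat) + eps), and that table size and value range (the names `AdamQ`, `n_levels`, and `v_max`) are free design choices.

```python
import numpy as np

class AdamQ:
    """Illustrative sketch of Adam with a fused look-up table.

    The costly per-element factor 1/(sqrt(v_hat) + eps) is precomputed
    for a small set of quantization levels, so the weight update reads
    the table instead of computing sqrt and division each step.
    All parameter names and defaults here are assumptions.
    """

    def __init__(self, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8,
                 n_levels=256, v_max=1.0):
        self.lr, self.b1, self.b2, self.eps = lr, beta1, beta2, eps
        self.n_levels, self.v_max = n_levels, v_max
        # Fused look-up table: one entry per quantized second-moment
        # level, each storing the precomputed update factor.
        levels = np.linspace(0.0, v_max, n_levels)
        self.table = 1.0 / (np.sqrt(levels) + eps)
        self.m = self.v = None
        self.t = 0

    def _quantize(self, v_hat):
        # Map each second-moment value to its nearest table index
        # (values above v_max saturate at the last level).
        idx = v_hat / self.v_max * (self.n_levels - 1)
        return np.rint(np.clip(idx, 0, self.n_levels - 1)).astype(int)

    def step(self, w, grad):
        if self.m is None:
            self.m = np.zeros_like(w)
            self.v = np.zeros_like(w)
        self.t += 1
        # Standard Adam moment estimates with bias correction.
        self.m = self.b1 * self.m + (1 - self.b1) * grad
        self.v = self.b2 * self.v + (1 - self.b2) * grad ** 2
        m_hat = self.m / (1 - self.b1 ** self.t)
        v_hat = self.v / (1 - self.b2 ** self.t)
        # Quantize v_hat, then read the result from the table instead
        # of recomputing sqrt and division per element.
        factor = self.table[self._quantize(v_hat)]
        return w - self.lr * m_hat * factor
```

For example, one step on a scalar weight w = 5 with gradient 2w produces the same update a plain Adam step would, but the factor comes from a single table read. In a real deployment the quantization range and table resolution would trade accuracy against memory.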

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.