Abstract

Currently, training of Large Language Models (LLMs) consumes considerable power, and network switches account for a significant share of that power consumption. There are several ways to save power in switches. For example, in network-switch application-specific integrated circuit (ASIC) designs, techniques such as adjusting chip voltage or frequency, implementing clock gating, and using activity monitors can be applied. These power-saving techniques, however, could cause switch performance to drop. To overcome this issue, a technique is proposed herein that significantly reduces the average running power of a switch by learning specific traffic models and lowering switch power/performance during low-traffic periods, such that an average power saving is realized.
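
The disclosure describes the technique only at a high level. As a rough, non-authoritative illustration, the following Python sketch (all names, operating points, and numbers are hypothetical assumptions, not taken from the disclosure) shows one way a learned time-of-day traffic profile might be used to select a lower switch power/frequency operating point during predicted low-traffic periods, while keeping headroom so that performance is only reduced when traffic is expected to be low.

# Hypothetical sketch of traffic-model-driven switch power management.
# Not the disclosed implementation; operating points and thresholds are assumed.
from dataclasses import dataclass, field
from statistics import mean

# Assumed operating points: (name, relative throughput capacity, relative power).
OPERATING_POINTS = [
    ("low",    0.25, 0.40),
    ("medium", 0.50, 0.65),
    ("high",   1.00, 1.00),
]

@dataclass
class TrafficModel:
    """Learns average link utilization per hour of day (a toy traffic model)."""
    samples: dict = field(default_factory=lambda: {h: [] for h in range(24)})

    def observe(self, hour: int, utilization: float) -> None:
        self.samples[hour].append(utilization)

    def predict(self, hour: int) -> float:
        history = self.samples[hour]
        # Until enough history is learned, conservatively assume full utilization.
        return mean(history) if history else 1.0

def choose_operating_point(predicted_util: float, headroom: float = 0.2):
    """Pick the lowest-power point whose capacity covers predicted traffic plus headroom."""
    target = min(1.0, predicted_util + headroom)
    for name, capacity, power in OPERATING_POINTS:
        if capacity >= target:
            return name, power
    # Fall back to the highest-performance point if nothing else suffices.
    return OPERATING_POINTS[-1][0], OPERATING_POINTS[-1][2]

if __name__ == "__main__":
    model = TrafficModel()
    # Simulated observations: low traffic overnight, high traffic during the day.
    for hour in range(24):
        util = 0.1 if hour < 6 else 0.8
        for _ in range(10):
            model.observe(hour, util)

    for hour in (3, 12):
        point, power = choose_operating_point(model.predict(hour))
        print(f"hour={hour:02d} predicted={model.predict(hour):.2f} "
              f"point={point} relative_power={power:.2f}")

In this sketch the controller drops to a lower-power operating point at night (when the learned model predicts low utilization) and returns to full performance during the day, which is one plausible way an "average" power saving could be realized without degrading performance under heavy traffic.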

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
