Inventor(s)

N/A

Abstract

The present disclosure is directed to achieving zero-latency cold starts in cloud computing environments by utilizing in-band application context streaming to eliminate disk I/O bottlenecks. When a service becomes idle, a micro-snapshot generating engine performs differential state stripping to identify and serialize only modified memory pages and CPU register values into a micro-snapshot, which is then offloaded to the static random-access memory (SRAM) of a programmable network switch or top-of-rack (ToR) switch. Upon detecting an incoming user request for the hibernated service, the switch may intercept the packet and inject the stored application context as a burst stream of packets directly ahead of the original request. Utilizing remote direct memory access (RDMA) and data direct input/output (DDIO), a target compute node's smart network interface card (NIC) may be configured to write the incoming state stream directly into the processor's last-level cache (LLC) or main memory, rehydrating the execution environment at line rate. This process allows the application to resume execution from its exact prior state the moment the user request reaches the CPU, enabling energy-efficient infrastructure without the latency penalties associated with traditional state restoration.
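The differential state stripping step described above may be sketched as follows. This is an illustrative model only, not an implementation from the disclosure: the `MicroSnapshot` container, the page-granular diff against a baseline image, and the `serialize` packing format are all hypothetical, and a real engine would operate on hardware dirty-page tracking rather than byte comparison.

```python
import zlib
from dataclasses import dataclass

PAGE_SIZE = 4096  # assumed page granularity for the diff

@dataclass
class MicroSnapshot:
    # Hypothetical container for the stripped application context.
    dirty_pages: dict  # page number -> page bytes (only modified pages)
    registers: dict    # register name -> value

def differential_state_strip(memory: bytes, baseline: bytes,
                             registers: dict) -> MicroSnapshot:
    """Capture only pages that differ from the baseline image, plus
    CPU register values (simplified byte-compare model)."""
    dirty = {}
    for page_no in range(len(memory) // PAGE_SIZE):
        start = page_no * PAGE_SIZE
        page = memory[start:start + PAGE_SIZE]
        if page != baseline[start:start + PAGE_SIZE]:
            dirty[page_no] = page
    return MicroSnapshot(dirty_pages=dirty, registers=dict(registers))

def serialize(snapshot: MicroSnapshot) -> bytes:
    """Pack the snapshot into one compressed blob, keeping it small
    enough to fit in the limited SRAM of a programmable switch."""
    payload = b"".join(
        page_no.to_bytes(8, "little") + page
        for page_no, page in sorted(snapshot.dirty_pages.items())
    )
    regs = repr(sorted(snapshot.registers.items())).encode()
    return zlib.compress(len(regs).to_bytes(4, "little") + regs + payload)

# Example: a 4-page memory image where only page 1 was modified.
baseline = bytes(PAGE_SIZE * 4)
memory = bytearray(baseline)
memory[PAGE_SIZE:PAGE_SIZE + 5] = b"hello"
snap = differential_state_strip(bytes(memory), baseline, {"rip": 0x401000})
blob = serialize(snap)
```

Because unmodified pages are omitted and the result is compressed, the serialized blob is far smaller than the full memory image, which is what makes offload to switch SRAM plausible.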

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
