IEEE Access (Jan 2021)

In-Network Flow Classification With Knowledge Distillation

  • Kate Ching-Ju Lin,
  • Chen-Yang Li

DOI
https://doi.org/10.1109/ACCESS.2021.3100057
Journal volume & issue
Vol. 9
pp. 111879–111889

Abstract

Recent research has incorporated machine learning into software-defined networking to support intelligent traffic engineering. However, most frameworks enable machine learning only in remote controllers, which introduces significant signaling overhead and data forwarding costs. In this work, we present a new architecture called in-network inference (INI) that realizes local learning on a Neural Compute Stick (NCS), a portable device that can be connected to a programmable switch via a USB port. While an NCS can flexibly extend the computing power of a switch, its limited capacity cannot sustain real-time inference under enormous traffic demands. To develop a practical local learning architecture, we design a two-phase learning framework that combines local learning via knowledge distillation with remote learning to achieve lightweight yet accurate traffic classification. We further design an inference model deployment and adaptation algorithm that allows multiple NCS devices attached to different switches to share the inference workload of a network. Our testbed experiments show that the two-phase learning framework reduces the inference rejection rate by 46.5% while maintaining an inference accuracy of 98.10%. Trace-driven simulations verify that the proposed adaptive model placement scheme balances load and, hence, better utilizes the computing resources of NCS devices to serve dynamic inference requests.
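
The sketch below illustrates the kind of teacher-student knowledge distillation the abstract refers to: a compact "student" flow classifier is trained to mimic a larger "teacher" model so that it is small enough to run on an NCS-class device while retaining accuracy. This is only a minimal, generic example (Hinton-style softened targets); the network sizes, flow-feature dimension, temperature, and loss weighting are assumptions for illustration, not values taken from the paper.

```python
# Illustrative knowledge-distillation sketch for a lightweight flow classifier.
# All sizes and hyperparameters below are assumed, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_FLOW_FEATURES = 40   # assumed per-flow feature vector size
NUM_CLASSES = 10         # assumed number of traffic classes

teacher = nn.Sequential(  # larger model, e.g. trained remotely at the controller
    nn.Linear(NUM_FLOW_FEATURES, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, NUM_CLASSES),
)
student = nn.Sequential(  # compact model intended for an NCS-class device
    nn.Linear(NUM_FLOW_FEATURES, 32), nn.ReLU(),
    nn.Linear(32, NUM_CLASSES),
)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a softened KL term (teacher -> student) and the
    usual cross-entropy against ground-truth labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy training step on random data, just to show how the pieces fit together.
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
x = torch.randn(64, NUM_FLOW_FEATURES)
y = torch.randint(0, NUM_CLASSES, (64,))
with torch.no_grad():
    t_logits = teacher(x)          # teacher predictions serve as soft targets
s_logits = student(x)
loss = distillation_loss(s_logits, t_logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In this setup, the temperature T softens both output distributions so the student also learns the teacher's relative confidence across traffic classes, which is what allows a much smaller model to approach the teacher's accuracy.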

Keywords