Early Classification in Split Computing with Regularized Bottlenecks

gabrielecastellano/early-exit-distillation

 
 

Early Classification with Network Distillation

This repository implements a pipeline that introduces a bottleneck into an existing deep neural network so that the network can easily be split for distributed deployment (inspired by the work on Head Network Distillation). Alongside the bottleneck, the pipeline trains a small local classifier that performs early prediction on the bottleneck output. The bottleneck and the early classifier are trained jointly through a regularization scheme that pushes the bottleneck to produce high-quality embeddings, which in turn helps the early classification task. Several early classifiers are supported: Logistic, KNN, K-means, and a novel Gaussian Mixture Layer implemented in PyTorch. The pipeline can be extended with additional classifiers by implementing the corresponding modules under src/early_classifier.
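To give an idea of how an early classifier can score bottleneck embeddings, below is a minimal PyTorch sketch of a Gaussian-mixture-style layer with one learnable diagonal Gaussian per class. The class name, constructor arguments, and diagonal-covariance parameterization are illustrative assumptions and do not reproduce the repository's actual Gaussian Mixture Layer or the module interface expected under src/early_classifier.

```python
# Illustrative sketch only: one diagonal Gaussian per class over the
# bottleneck embedding space; per-class log-likelihoods are used as logits.
import torch
import torch.nn as nn


class GaussianMixtureLayer(nn.Module):
    """Scores a bottleneck embedding against one diagonal Gaussian per class."""

    def __init__(self, embedding_dim: int, n_classes: int):
        super().__init__()
        # Learnable per-class means and log-variances over the embedding space.
        self.means = nn.Parameter(torch.randn(n_classes, embedding_dim))
        self.log_vars = nn.Parameter(torch.zeros(n_classes, embedding_dim))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, embedding_dim) bottleneck embeddings.
        diff = z.unsqueeze(1) - self.means           # (batch, classes, dim)
        inv_var = torch.exp(-self.log_vars)          # (classes, dim)
        # Per-class log-likelihood under a diagonal Gaussian (up to a constant).
        log_probs = -0.5 * ((diff ** 2) * inv_var + self.log_vars).sum(dim=-1)
        return log_probs                             # shape (batch, classes)


# Hypothetical usage: early prediction on a (batch, 128) bottleneck output.
if __name__ == "__main__":
    layer = GaussianMixtureLayer(embedding_dim=128, n_classes=10)
    z = torch.randn(4, 128)
    logits = layer(z)
    print(logits.argmax(dim=1))  # predicted class for each embedding
```

Because the layer outputs per-class log-likelihoods, it can be trained end to end with the usual cross-entropy loss on the bottleneck output, which is in the spirit of the joint bottleneck/early-classifier training described above.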
