Knowledge Distillation, a Methodology for Compressing Neural Networks
Author(s): Behzad Benam

Originally published on Towards AI.

Teacher-student architecture to create a smaller model for embedded systems

Photo by Sharon McCutcheon on Unsplash

Large neural networks are complex and …