Knowledge-Distillation_MNIST

Single-teacher to single-student knowledge distillation using a DNN structure

Paper: https://arxiv.org/abs/1503.02531
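For context, the method from the paper trains the student against the teacher's temperature-softened output distribution in addition to the true labels. Below is a minimal sketch of that loss following Hinton et al. (2015); the function and argument names (`temperature`, `alpha`) are illustrative, not this repository's actual API.

```python
import tensorflow as tf

def soft_targets(logits, temperature):
    # Temperature-scaled softmax from Hinton et al. (2015);
    # higher temperatures produce softer probability distributions.
    return tf.nn.softmax(logits / temperature)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=5.0, alpha=0.9):
    # Weighted sum of (a) cross-entropy between the teacher's and the
    # student's softened outputs and (b) ordinary cross-entropy against
    # the hard labels. The T**2 factor keeps the soft-target gradients
    # on the same scale as the hard-label term, as noted in the paper.
    soft = tf.keras.losses.categorical_crossentropy(
        soft_targets(teacher_logits, temperature),
        soft_targets(student_logits, temperature))
    hard = tf.keras.losses.sparse_categorical_crossentropy(
        labels, tf.nn.softmax(student_logits))
    return alpha * (temperature ** 2) * soft + (1.0 - alpha) * hard
```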

Data

MNIST (loaded via TensorFlow)

Usage

In the directory containing the code, run `python ./main.py` from the console.

The menu offers 10 options:

  1. Train the teacher model
  2. Generate soft targets at various temperatures (see the sketch after this list)
  3. Train the student model without soft targets
  4. Train the student model with soft targets
  5. Select the trained student model with the best accuracy among the candidates
  6. View the student model's training log
  7. View the teacher model's training log
  8. Build a DNN student model on the spot and test it
  9. Show the result log of the model created in option 8
  10. Export the student model's training log to an Excel file and run some analysis
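As an illustration of option 2, here is a hypothetical helper that runs a trained teacher model once and saves one soft-target set per temperature; the `export_soft_targets` name and file-naming scheme are made up for this sketch, and `teacher` is assumed to be a trained Keras model that outputs raw logits.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = logits / temperature
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

def export_soft_targets(teacher, images, temperatures, out_prefix):
    # Run the teacher once for raw logits, then save a soft-target
    # array per temperature for later student training.
    logits = teacher.predict(images)
    for t in temperatures:
        np.save("{}_T{}.npy".format(out_prefix, t), softmax(logits, t))

# e.g. export_soft_targets(teacher, x_train, [1, 2, 5, 10, 20], "soft_targets")
```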

Programming Language

Python 3.6, TensorFlow, Keras

OS dependency

Windows 10, Ubuntu Linux

Teacher Model

Student Model
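The exact layer configurations are not spelled out in the text here. As a rough sketch, assuming two-hidden-layer fully connected nets along the lines of the paper's MNIST experiments (the 1200-unit teacher and 800-unit student widths come from Hinton et al., not from a confirmed description of this repository's models):

```python
from tensorflow import keras

def build_mlp(hidden_units, num_classes=10, input_dim=784, dropout=0.5):
    # Generic two-hidden-layer fully connected net; the widths passed
    # in below follow Hinton et al. (2015), not this repo's code.
    return keras.Sequential([
        keras.layers.Dense(hidden_units, activation="relu",
                           input_shape=(input_dim,)),
        keras.layers.Dropout(dropout),
        keras.layers.Dense(hidden_units, activation="relu"),
        keras.layers.Dropout(dropout),
        keras.layers.Dense(num_classes),  # raw logits; softmax goes in the loss
    ])

teacher = build_mlp(1200)              # large, dropout-regularized teacher
student = build_mlp(800, dropout=0.0)  # smaller student to be distilled
```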

Training Method

Training stops early if overfitting occurs, i.e. the validation-set accuracy fails to improve for more than 50 epochs.
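A minimal way to express that rule with a Keras callback, assuming the models are trained via `model.fit` (the repo may implement the check by hand instead):

```python
from tensorflow import keras

# Early stopping matching the rule above: stop once validation accuracy
# has failed to improve for 50 consecutive epochs, keeping the best
# weights seen so far. ("val_accuracy" is "val_acc" on older Keras/TF.)
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_accuracy",
    patience=50,
    restore_best_weights=True)

# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=1000, callbacks=[early_stop])
```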

Result
