Alex Jiang

Flying code monkey


Model Training

March 2016 · Machine Learning

My Understanding of Knowledge Distilling

Knowledge distillation is a method for compressing a large model into a smaller one: a small "student" model is trained to match the softened output distribution of a large "teacher" model, rather than only the hard ground-truth labels.
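As an illustrative sketch (not code from the original post), the standard distillation objective combines a temperature-softened cross-entropy against the teacher's distribution with the usual hard-label cross-entropy. All names and the choice of NumPy here are my own assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    # temperature-scaled softmax; higher T produces a softer distribution
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # soft term: cross-entropy between softened teacher and student outputs,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T) + 1e-12)
    soft = -(p_teacher * log_p_student).sum(axis=-1).mean() * T * T
    # hard term: ordinary cross-entropy with the ground-truth labels
    log_p = np.log(softmax(student_logits) + 1e-12)
    hard = -log_p[np.arange(len(labels)), labels].mean()
    return alpha * soft + (1 - alpha) * hard
```

A student whose logits match the teacher's incurs a lower loss than one that disagrees, which is what drives the small model to mimic the large one during training.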
