Alex Jiang
Flying code monkey
© 2007-2024 All rights reserved.
Model Training
March 2016
Machine Learning
My Understanding of Knowledge Distilling
Knowledge distillation is a method for compressing a large model into a smaller one: a small "student" model is trained to match the softened output distribution of a large "teacher" model, rather than only the hard labels.
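As a rough illustration, the core distillation objective can be sketched in plain Python below. This is a minimal sketch, assuming the Hinton-style formulation (temperature-softened softmax plus a KL-divergence loss scaled by T²); the function names and the temperature value are illustrative, not from the article.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution
    # that exposes the teacher's relative preferences among classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the softened teacher distribution to the
    # softened student distribution, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2
```

In practice this soft-target loss is usually combined with the ordinary cross-entropy on the true labels, weighted by a mixing coefficient.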