[Paper Notes] (2015, Defensive Distillation) Distillation as a Defense to Adversarial Perturbations against Deep Neural Networks

Abstract:
Papers related to distillation: (2006) Model Compression; (2014) Do Deep Nets Really Need to be Deep?; (2015) Distilling the Knowledge in a Neural Network…
        
posted @ 2022-06-26 21:10 by 李斯赛特 · Views (2188) · Comments (0) · Recommendations (0)