50 records in total
- [2] Distilling Knowledge for Non-Neural Networks. 2019 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 2019: 1411-1416.
- [3] Distilling Holistic Knowledge with Graph Neural Networks. 2021 IEEE/CVF International Conference on Computer Vision (ICCV 2021), 2021: 10367-10376.
- [4] Pictures to Remember: Generational Memories About the Recent Past in Chile. IC-Revista Científica de Información y Comunicación, 2020, (17): 247-271.
- [5] Genetic Evolution of Neural Networks That Remember. Proceedings of the 2002 International Joint Conference on Neural Networks, Vols 1-3, 2002: 1148-1153.
- [7] Distilling Spikes: Knowledge Distillation in Spiking Neural Networks. 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 4536-4543.
- [8] Distilling Neural Networks for Greener and Faster Dependency Parsing. 16th International Conference on Parsing Technologies and IWPT 2020 Shared Task on Parsing into Enhanced Universal Dependencies, 2020: 2-13.
- [9] On Neural Networks That Design Neural Associative Memories. IEEE Transactions on Neural Networks, 1997, 8(2): 360-372.
- [10] How Convolutional Neural Networks Remember Art. 2018 25th International Conference on Systems, Signals and Image Processing (IWSSIP), 2018.