
Number of authors: 

4

Publication details

Publication type: 

Conference paper

Title: 

Anti-Distillation: Knowledge Transfer from a Simple Model to the Complex One

ISBN/ISSN: 

2767-9535

DOI: 

10.1109/ispras57371.2022.10076855

Conference: 

  • 2022 Ivannikov Ispras Open Conference (ISPRAS)

Source: 

  • Proceedings of the Ivannikov Memorial Workshop (IVMEM), 2022

City: 

  • Moscow

Publisher: 

  • IEEE

Year: 

2022

URL: 

https://ieeexplore.ieee.org/document/10076855

Abstract
The paper considers the problem of adapting a model to new data containing a large amount of information. We propose building a more complex model using the parameters of a simple one. We take into account not only prediction accuracy on the original samples but also adaptability to new data and the robustness of the obtained solution. The work develops a method that adapts a pre-trained model to a more heterogeneous dataset. In the computational experiment, we analyse prediction quality and model robustness on the Fashion-MNIST dataset.
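The abstract does not spell out the authors' exact construction, but one standard, function-preserving way to initialize a more complex model from a simple pretrained one is Net2Net-style widening: new hidden units copy existing ones, and the outgoing weights of each duplicated unit are split among its copies so the widened network computes exactly the same function before further training. The sketch below is an illustration of that general idea under these assumptions, not the method proposed in the paper.

```python
import numpy as np

def widen_layer(W1, b1, W2, new_width, rng=None):
    """Grow a hidden layer from W1.shape[0] to new_width units while
    preserving the network's output exactly (Net2Net-style widening).
    New units duplicate randomly chosen existing units; outgoing weights
    of each duplicated unit are divided among its copies."""
    rng = np.random.default_rng(rng)
    old_width = W1.shape[0]
    # mapping[j] = index of the old unit that unit j copies
    # (identity for the first old_width units)
    mapping = np.concatenate([
        np.arange(old_width),
        rng.integers(0, old_width, new_width - old_width),
    ])
    W1_new = W1[mapping]          # duplicate incoming weights
    b1_new = b1[mapping]          # duplicate biases
    # split each old unit's outgoing weight evenly among its copies
    counts = np.bincount(mapping, minlength=old_width)
    W2_new = W2[:, mapping] / counts[mapping]
    return W1_new, b1_new, W2_new
```

Because copies share incoming weights, they produce identical activations, and dividing the outgoing weights by the copy count makes their contributions sum to the original unit's contribution; the widened model then has extra capacity to fit the new, more heterogeneous data during subsequent fine-tuning.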

Citation: 

Petrushina K.E., Bakhteev O.Yu., Grabovoi A.V., Strijov V.V. Anti-Distillation: Knowledge Transfer from a Simple Model to the Complex One // Proceedings of the Ivannikov Memorial Workshop (IVMEM), 2022. Moscow: IEEE, 2022. https://ieeexplore.ieee.org/document/10076855.