Show simple item record

dc.contributor.author: Castro Payán, Francisco Manuel
dc.contributor.author: Marín-Jiménez, Manuel J.
dc.contributor.author: Guil-Mata, Nicolás
dc.contributor.author: Schmid, Cordelia
dc.contributor.author: Alahari, Karteek
dc.date.accessioned: 2018-07-06T11:58:46Z
dc.date.available: 2018-07-06T11:58:46Z
dc.date.created: 2018
dc.date.issued: 2018-07-06
dc.identifier.uri: https://hdl.handle.net/10630/16158
dc.description.abstract: Although deep learning approaches have stood out in recent years due to their state-of-the-art results, they continue to suffer from catastrophic forgetting, a dramatic decrease in overall performance when training with new classes added incrementally. This is because current neural network architectures require the entire dataset, consisting of all the samples from the old as well as the new classes, to update the model, a requirement that becomes easily unsustainable as the number of classes grows. We address this issue with our approach to learning deep neural networks incrementally, using new data and only a small exemplar set corresponding to samples from the old classes. It is based on a loss composed of a distillation measure, to retain the knowledge acquired from the old classes, and a cross-entropy loss, to learn the new classes. Our incremental training is achieved while keeping the entire framework end-to-end, i.e., learning the data representation and the classifier jointly, unlike recent methods with no such guarantees. [en_US]
dc.description.sponsorship: This work has been funded by projects TIC-1692 (Junta de Andalucía) and TIN2016-80920R (Spanish Ministry of Science and Technology), and by Universidad de Málaga, Campus de Excelencia Internacional Andalucía Tech. [en_US]
dc.language.iso: eng [en_US]
dc.rights: info:eu-repo/semantics/openAccess [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Neural networks (Computer science) [en_US]
dc.subject.other: Incremental learning [en_US]
dc.subject.other: CNN [en_US]
dc.subject.other: Distillation loss [en_US]
dc.subject.other: Image classification [en_US]
dc.title: End-to-end Incremental Learning [en_US]
dc.type: info:eu-repo/semantics/conferenceObject [en_US]
dc.centro: E.T.S.I. Informática [en_US]
dc.relation.eventtitle: European Conference on Computer Vision [en_US]
dc.relation.eventplace: Munich, Germany [en_US]
dc.relation.eventdate: September 2018 [en_US]
dc.rights.cc: Attribution-NonCommercial-NoDerivatives 4.0 International
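The objective described in the abstract, a cross-entropy loss for the new classes combined with a distillation measure that preserves the old model's responses, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the temperature T=2.0, and the weighting factor `lam` are assumptions chosen for clarity.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of raw scores.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, label):
    # Standard cross-entropy against a hard ground-truth label
    # (used to learn the new classes).
    probs = softmax(logits)
    return -math.log(probs[label])

def distillation(new_logits, old_logits, T=2.0):
    # Distillation term: cross-entropy between the softened outputs of the
    # frozen old model (targets) and the current model, restricted to the
    # old classes, to retain previously acquired knowledge.
    targets = softmax(old_logits, T)
    preds = softmax(new_logits, T)
    return -sum(t * math.log(p) for t, p in zip(targets, preds))

def total_loss(logits, label, old_logits, lam=1.0, T=2.0):
    # Combined objective: cross-entropy plus a weighted distillation
    # measure. `lam` and T are illustrative hyperparameters, not values
    # taken from the paper.
    old_part = logits[:len(old_logits)]  # scores for the old classes only
    return cross_entropy(logits, label) + lam * distillation(old_part, old_logits, T)
```

In an incremental step, `old_logits` would come from a frozen copy of the network evaluated on the same input (new data plus the small exemplar set), so gradients flow only through the current model.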


