A proposal for mixture of experts with entropic regularization

dc.contributor.authorPeralta, Billy
dc.contributor.authorSaavedra Domínguez, Ariel
dc.contributor.authorCaro, Luis
dc.date.accessioned2022-05-16T13:00:30Z
dc.date.available2022-05-16T13:00:30Z
dc.date.issued2017
dc.description.abstractThere is growing interest in pattern recognition for tasks such as weather-event prediction, route recommendation, intrusion detection, and face detection. Each of these tasks can be modelled as a classification problem, where a common approach is to use an ensemble classification model. A well-known example is the Mixture-of-Experts model, a probabilistic artificial neural network consisting of local expert classifiers weighted by a gate network, whose combination creates competition among the experts seeking to capture patterns in the data source. We observe that this architecture assumes that one gate influences only one data point; consequently, training can be misguided on real datasets where the data are better explained by multiple experts. In this work, we present a variant of the regular Mixture-of-Experts model, which consists of maximizing the entropy of the gate network in addition to minimizing the classification cost. The results show the advantage of our approach on multiple datasets in terms of accuracy. As future work, we plan to apply this idea to the Mixture-of-Experts with embedded feature selection.
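Note: the abstract describes the regularized objective only in words. The following minimal sketch (Python with NumPy) illustrates one way such an entropy-regularized mixture-of-experts loss could be written; it is not the authors' implementation, and the linear softmax experts and gate, the parameter shapes, the toy data, and the weight lam are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_entropic_loss(X, y, expert_W, gate_W, lam=0.1):
    # Gate probabilities per sample: shape (n, K).
    gates = softmax(X @ gate_W)
    # Class probabilities of each expert: shape (n, K, C).
    expert_probs = softmax(np.einsum('nd,kdc->nkc', X, expert_W))
    # Mixture class probabilities: shape (n, C).
    mix = np.einsum('nk,nkc->nc', gates, expert_probs)
    # Classification cost: negative log-likelihood of the true labels.
    nll = -np.log(mix[np.arange(len(y)), y] + 1e-12).mean()
    # Mean entropy of the gate distribution; subtracting it means that
    # minimizing the loss also maximizes gate entropy (the entropic term).
    gate_entropy = -(gates * np.log(gates + 1e-12)).sum(axis=1).mean()
    return nll - lam * gate_entropy

# Toy usage: 2 experts, 3 classes, 5 features, random parameters and data.
X = rng.normal(size=(100, 5))
y = rng.integers(0, 3, size=100)
expert_W = rng.normal(size=(2, 5, 3))
gate_W = rng.normal(size=(5, 2))
print(moe_entropic_loss(X, y, expert_W, gate_W))

In a training loop this loss would be minimized with respect to expert_W and gate_W, so the entropy term discourages gate distributions that collapse onto a single expert.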
dc.fuente.origenIEEE
dc.identifier.doi10.1109/CLEI.2017.8226425
dc.identifier.eisbn9781538630570
dc.identifier.isbn9781538630587
dc.identifier.urihttps://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8226425
dc.identifier.urihttps://doi.org/10.1109/CLEI.2017.8226425
dc.identifier.urihttps://repositorio.uc.cl/handle/11534/63935
dc.information.autorucEscuela de Ingeniería ; Saavedra Domínguez, Ariel ; S/I ; 1050244
dc.language.isoen
dc.nota.accesoPartial content
dc.publisherIEEE
dc.relation.ispartofLatin American Computer Conference (CLEI) (23° : 2017 : Córdoba, Argentina)
dc.rightsrestricted access
dc.subjectElectronic mail
dc.subjectLogic gates
dc.subjectComputational modeling
dc.subjectColor
dc.subjectProposals
dc.subjectPattern recognition
dc.titleA proposal for mixture of experts with entropic regularization
dc.typeconference paper
sipa.codpersvinculados1050244
Files
Original bundle
Name: A proposal for mixture of experts with entropic regularization.pdf
Size: 2.85 KB
Format: Adobe Portable Document Format