Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Vaněk, Jan | |
dc.contributor.author | Machlica, Lukáš | |
dc.contributor.author | Psutka, Josef | |
dc.date.accessioned | 2016-01-07T11:54:32Z | - |
dc.date.available | 2016-01-07T11:54:32Z | - |
dc.date.issued | 2013 | |
dc.identifier.citation | VANĚK, Jan; MACHLICA, Lukáš; PSUTKA, Josef. Estimation of Single-Gaussian and Gaussian mixture models for pattern recognition. In: Progress in pattern recognition, image analysis, computer vision, and applications. Berlin: Springer, 2013, p. 49-56. (Lecture notes in computer science; 8258). ISBN 978-3-642-41821-1. | en |
dc.identifier.isbn | 978-3-642-41821-1 | |
dc.identifier.uri | http://www.kky.zcu.cz/cs/publications/JanVanek_2013_Estimationof | |
dc.identifier.uri | http://hdl.handle.net/11025/17160 | |
dc.format | 8 s. | cs |
dc.format.mimetype | application/pdf | |
dc.language.iso | en | en |
dc.publisher | Springer | en |
dc.relation.ispartofseries | Lecture notes in computer science; 8258 | en |
dc.rights | © Jan Vaněk - Lukáš Machlica - Josef V. Psutka - Josef Psutka | cs |
dc.subject | odhad míry pravděpodobnosti | cs |
dc.subject | směsi Gaussovských modelů | cs |
dc.subject | Kullback-Leiblerova divergence | cs |
dc.subject | odchylka | cs |
dc.subject | měřítko | cs |
dc.title | Estimation of Single-Gaussian and Gaussian mixture models for pattern recognition | en |
dc.type | článek | cs |
dc.type | article | en |
dc.rights.access | openAccess | en |
dc.type.version | publishedVersion | en |
dc.description.abstract-translated | Single-Gaussian and Gaussian-Mixture Models are utilized in various pattern recognition tasks. The model parameters are usually estimated via Maximum Likelihood Estimation (MLE) with respect to available training data. However, if only a small amount of training data is available, the resulting model will not generalize well. Loosely speaking, classification performance given an unseen test set may be poor. In this paper, we propose a novel estimation technique for the model variances. Once the variances have been estimated using MLE, they are multiplied by a scaling factor, which reflects the amount of uncertainty present in the limited sample set. The optimal value of the scaling factor is based on the Kullback-Leibler criterion and on the assumption that the training and test sets are sampled from the same source distribution. In addition, in the case of GMM, the proper number of components can be determined. | en |
dc.subject.translated | maximum likelihood estimation | en |
dc.subject.translated | Gaussian mixture models | en |
dc.subject.translated | Kullback-Leibler divergence | en |
dc.subject.translated | variance | en |
dc.subject.translated | scaling | en |
dc.type.status | Peer-reviewed | en |
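The abstract above describes the estimation scheme only at a high level: fit the Gaussian parameters by maximum likelihood, then multiply the resulting variances by a scaling factor that reflects the uncertainty of a small training set. The following Python sketch illustrates that idea for a single diagonal-covariance Gaussian; the helper names (`mle_gaussian`, `scaled_variance`, `placeholder_scale`) and the scaling function itself are illustrative assumptions, not the Kullback-Leibler-based optimum derived in the paper.

```python
import numpy as np

def mle_gaussian(samples):
    """Maximum-likelihood estimate of mean and (biased) per-dimension variance."""
    mu = samples.mean(axis=0)
    var = samples.var(axis=0)  # MLE variance divides by N, so it tends to underestimate
    return mu, var

def scaled_variance(var, n_samples, scale_fn):
    """Inflate the MLE variance by a sample-size-dependent factor."""
    alpha = scale_fn(n_samples)  # should approach 1 as n_samples grows
    return alpha * var

# Hypothetical scaling function (placeholder only): larger inflation for
# smaller training sets, converging to 1 for large ones.
placeholder_scale = lambda n: 1.0 + 1.0 / max(n - 1, 1)

rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, scale=1.0, size=(20, 3))  # deliberately small training set
mu, var = mle_gaussian(train)
var_robust = scaled_variance(var, train.shape[0], placeholder_scale)
print(mu, var, var_robust)
```

In the paper the scaling factor is derived from the Kullback-Leibler criterion under the assumption that training and test data come from the same source distribution; the placeholder above only mimics the qualitative behaviour, namely an inflation that shrinks toward 1 as the sample size grows.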
Appears in Collections: | Články / Articles (KIV); Články / Articles (KKY) |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
JanVanek_2013_Estimationof.pdf | Full text | 208.74 kB | Adobe PDF | View/Open |
Please use this identifier to cite or link to this item:
http://hdl.handle.net/11025/17160
All items in DSpace are protected by copyright, with all rights reserved.