Title: Unsupervised methods for language modeling: technical report no. DCSE/TR-2012-03
Authors: Brychcín, Tomáš
Issue Date: 2012
Publisher: University of West Bohemia in Pilsen
Document type: report
URI: http://www.kiv.zcu.cz/publications/
http://hdl.handle.net/11025/21549
Keywords: jazykový model;n-gram
Keywords in different language: language model;n-gram
Abstract in different language: Language models are crucial for many tasks in NLP, and n-grams are the best way to build them. Huge effort is being invested in improving n-gram language models. By introducing external information (morphology, syntax, partitioning into documents, etc.) into the models, a significant improvement can be achieved. The models can, however, be improved with no external information, and smoothing is an excellent example of such an improvement. The thesis summarizes the state-of-the-art approaches to unsupervised language modeling, with emphasis on inflectional languages, which are particularly hard to model. It focuses on methods that can discover hidden patterns already present in the training corpora. These patterns can be very useful for enhancing the performance of language modeling; moreover, they do not require additional information sources.
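The abstract mentions smoothing as a way to improve n-gram models without any external information. As a purely illustrative aid (not taken from the report, which surveys unsupervised modeling techniques in general), the following Python sketch shows a bigram language model with add-one (Laplace) smoothing and a perplexity computation on held-out text; all names and the toy data are hypothetical.

```python
# Minimal sketch of a bigram language model with add-one (Laplace) smoothing.
# Illustrative only; the report does not prescribe this particular implementation.
from collections import Counter
import math

def train_bigram(sentences):
    """Count unigrams and bigrams over tokenized sentences (lists of words)."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(w_prev, w, unigrams, bigrams, vocab_size):
    """P(w | w_prev) with add-one smoothing, so unseen bigrams get nonzero mass."""
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + vocab_size)

def perplexity(sentences, unigrams, bigrams):
    """Perplexity of held-out sentences under the smoothed bigram model."""
    vocab_size = len(unigrams)
    log_prob, n_tokens = 0.0, 0
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        for w_prev, w in zip(tokens, tokens[1:]):
            log_prob += math.log(bigram_prob(w_prev, w, unigrams, bigrams, vocab_size))
            n_tokens += 1
    return math.exp(-log_prob / n_tokens)

if __name__ == "__main__":
    train = [["the", "cat", "sat"], ["the", "dog", "sat"]]  # toy training data
    test = [["the", "cat", "sat"]]                          # toy held-out data
    uni, bi = train_bigram(train)
    print(perplexity(test, uni, bi))
```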
Rights: © University of West Bohemia in Pilsen
Appears in Collections:Zprávy / Reports (KIV)

Files in This Item:
File: Brychcin.pdf | Description: Full text | Size: 425.44 kB | Format: Adobe PDF


