Title: BERT-Based Sentiment Analysis Using Distillation
Authors: Lehečka, Jan
Švec, Jan
Ircing, Pavel
Šmídl, Luboš
Citation: LEHEČKA, J., ŠVEC, J., IRCING, P., ŠMÍDL, L. BERT-Based Sentiment Analysis Using Distillation. In Statistical Language and Speech Processing, SLSP 2020. Cham: Springer, 2020. pp. 58-70. ISBN 978-3-030-59429-9, ISSN 0302-9743.
Issue Date: 2020
Publisher: Springer
Document type: conference paper
Scopus ID: 2-s2.0-85092196103
URI: http://hdl.handle.net/11025/42765
ISBN: 978-3-030-59429-9
ISSN: 0302-9743
Keywords: Sentiment analysis; BERT; Knowledge distillation
Abstract: In this paper, we present our experiments with BERT (Bidirectional Encoder Representations from Transformers) models on the task of sentiment analysis, which aims to predict the sentiment polarity of a given text. We trained an ensemble of BERT models on a large self-collected movie review dataset and distilled the knowledge into a single production model. Moreover, we propose an improved BERT pooling layer architecture, which outperforms the standard classification layer while enabling per-token sentiment predictions. We demonstrate our improvements on a publicly available dataset of Czech movie reviews.
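The abstract describes distilling an ensemble of teacher BERT models into a single student model. A minimal sketch of that setup, assuming the standard soft-target recipe (temperature-softened teacher distributions averaged across the ensemble, with the student trained against them via a soft cross-entropy loss scaled by T²); all function names here are illustrative, not taken from the paper:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T gives softer distributions.
    exps = [math.exp(x / T) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def ensemble_soft_targets(teacher_logit_sets, T=2.0):
    # Average the temperature-softened class distributions of all teachers
    # to form the soft targets the student is distilled against.
    probs = [softmax(logits, T) for logits in teacher_logit_sets]
    n = len(probs)
    return [sum(p[i] for p in probs) / n for i in range(len(probs[0]))]

def distillation_loss(student_logits, soft_targets, T=2.0):
    # Soft cross-entropy between the teacher-ensemble targets and the
    # student's temperature-scaled predictions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    p_s = softmax(student_logits, T)
    return -T * T * sum(t * math.log(s) for t, s in zip(soft_targets, p_s))
```

Minimizing this loss pushes the student's distribution toward the averaged teacher distribution; a student whose logits already agree with the ensemble incurs a lower loss than one that disagrees.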
Rights: The full text is accessible to logged-in users within the university.
© Springer
Appears in Collections:Konferenční příspěvky / Conference papers (NTIS)
Konferenční příspěvky / Conference Papers (KKY)
OBD

Files in This Item:
Lehečka2020_Chapter_BERT-BasedSentimentAnalysisUsi.pdf (482.82 kB, Adobe PDF)



