Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations

dc.catalogador: grr
dc.contributor.author: Araujo Vasquez, Vladimir Giovanny
dc.contributor.author: Villa, Andres
dc.contributor.author: Mendoza Rocha, Marcelo Gabriel
dc.contributor.author: Moens, Marie-Francine
dc.contributor.author: Soto, Alvaro
dc.date.accessioned: 2024-05-28T20:16:05Z
dc.date.available: 2024-05-28T20:16:05Z
dc.date.issued: 2021
dc.description.abstract: Current language models are usually trained using a self-supervised scheme, where the main focus is learning representations at the word or sentence level. However, there has been limited progress in generating useful discourse-level representations. In this work, we propose to use ideas from predictive coding theory to augment BERT-style language models with a mechanism that allows them to learn suitable discourse-level representations. As a result, our proposed approach is able to predict future sentences using explicit top-down connections that operate at the intermediate layers of the network. By experimenting with benchmarks designed to evaluate discourse-related knowledge using pre-trained sentence representations, we demonstrate that our approach improves performance in 6 out of 11 tasks by excelling in discourse relationship detection.
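The abstract's mechanism can be illustrated with a toy sketch: from each intermediate layer's state for sentence t, a top-down head predicts the same layer's state for sentence t+1, and the mismatch becomes an auxiliary predictive-coding loss. This is a minimal illustration under assumed names (`W_enc`, `W_td`, `predictive_coding_loss`, the random "encoder" layers), not the authors' actual architecture or training code.

```python
import math
import random

random.seed(0)

HIDDEN = 8   # toy hidden size (hypothetical, for illustration)
LAYERS = 3   # number of intermediate layers carrying top-down connections

def rand_matrix(n):
    """Random square weight matrix scaled by 1/sqrt(n)."""
    return [[random.gauss(0, 1 / math.sqrt(n)) for _ in range(n)] for _ in range(n)]

def matvec(W, x):
    return [sum(w * v for w, v in zip(row, x)) for row in W]

def tanh_vec(x):
    return [math.tanh(v) for v in x]

# Toy "encoder" layers standing in for a BERT-style stack, plus one
# top-down head per layer that predicts the NEXT sentence's layer state.
W_enc = [rand_matrix(HIDDEN) for _ in range(LAYERS)]
W_td = [rand_matrix(HIDDEN) for _ in range(LAYERS)]

def encode(x):
    """Return the list of intermediate-layer states for one sentence vector."""
    states, h = [], x
    for W in W_enc:
        h = tanh_vec(matvec(W, h))
        states.append(h)
    return states

def predictive_coding_loss(sent_t, sent_next):
    """Sum over layers of the MSE between the top-down prediction made from
    sentence t's layer-l state and sentence t+1's actual layer-l state."""
    loss = 0.0
    for W, h_t, h_next in zip(W_td, encode(sent_t), encode(sent_next)):
        pred = tanh_vec(matvec(W, h_t))  # top-down prediction of the future state
        loss += sum((p - h) ** 2 for p, h in zip(pred, h_next)) / HIDDEN
    return loss

loss = predictive_coding_loss(
    [random.gauss(0, 1) for _ in range(HIDDEN)],
    [random.gauss(0, 1) for _ in range(HIDDEN)],
)
print(f"auxiliary predictive-coding loss: {loss:.4f}")
```

In training, such a loss would be added to the usual self-supervised objective so that intermediate layers are pushed to carry information predictive of upcoming sentences, i.e. discourse-level structure.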
dc.fuente.origen: WOS
dc.identifier.eisbn: 978-1-955917-09-4
dc.identifier.uri: https://doi.org/10.48550/arXiv.2109.04602
dc.identifier.uri: https://repositorio.uc.cl/handle/11534/85921
dc.identifier.wosid: WOS:000855966303012
dc.information.autoruc: Escuela de Ingeniería; Araujo Vasquez, Vladimir Giovanny; S/I; 1081563
dc.information.autoruc: Escuela de Ingeniería; Mendoza Rocha, Marcelo Gabriel; 0000-0002-7969-6041; 1237020
dc.language.iso: en
dc.pagina.final: 3022
dc.pagina.inicio: 3016
dc.publisher: ASSOC COMPUTATIONAL LINGUISTICS-ACL
dc.revista: 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021)
dc.title: Augmenting BERT-style Models with Predictive Coding to Improve Discourse-level Representations
dc.type: conference paper
sipa.codpersvinculados: 1081563
sipa.codpersvinculados: 1237020
sipa.trazabilidad: WOS;2023-01-17