Attention is Turing complete
dc.catalogador: yvc
dc.contributor.author: Pérez, Jorge
dc.contributor.author: Barceló Baeza, Pablo
dc.contributor.author: Marinkovic, Javier
dc.date.accessioned: 2023-11-19T02:34:22Z
dc.date.available: 2023-11-19T02:34:22Z
dc.date.issued: 2021
dc.description.abstract: Alternatives to recurrent neural networks, in particular architectures based on self-attention, are gaining momentum for processing input sequences. In spite of their relevance, the computational properties of such networks have not yet been fully explored. We study the computational power of the Transformer, one of the most paradigmatic architectures exemplifying self-attention. We show that the Transformer with hard attention is Turing complete based exclusively on its capacity to compute and access internal dense representations of the data. Our study also reveals some minimal sets of elements needed to obtain this completeness result.
dc.fechaingreso.objetodigital: 2023-11-19
dc.fuente.origen: PREI
dc.identifier.issn: 1532-4435
dc.identifier.uri: https://repositorio.uc.cl/handle/11534/75358
dc.information.autoruc: Instituto de Ingeniería Matemática y Computacional ; Barceló Baeza, Pablo ; 0000-0003-2293-2653 ; 13516
dc.issue.numero: 75
dc.language.iso: en
dc.nota.acceso: Full text
dc.pagina.final: 35
dc.pagina.inicio: 1
dc.revista: Journal of Machine Learning Research
dc.rights: open access
dc.subject: Transformers
dc.subject: Turing completeness
dc.subject: Self-attention
dc.subject: Neural networks
dc.subject: Arbitrary precision
dc.subject.ddc: 500
dc.subject.dewey: Sciences [es_ES]
dc.title: Attention is Turing complete
dc.type: article
dc.volumen: 22
sipa.codpersvinculados: 13516
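The hard attention named in the abstract above replaces the usual softmax-weighted average with attention concentrated on the score-maximizing positions only, with ties averaged. A minimal NumPy sketch of this mechanism (function and variable names are ours, not from the paper):

```python
import numpy as np

def hard_attention(query, keys, values):
    """Hard attention: instead of a softmax-weighted average over all
    positions, attend only to the position(s) whose key maximizes the
    attention score, averaging the values over ties."""
    scores = keys @ query               # one score per input position
    best = scores == scores.max()       # boolean mask of argmax positions
    return values[best].mean(axis=0)    # average the attended values
```

With softmax attention every position contributes a nonzero weight; here all non-maximal positions contribute exactly zero, which is the discretization that makes the paper's simulation argument go through.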
Files
Original bundle
- Name: TEXTO COMPLETO_Attention is Turing Complete.pdf
- Size: 382.68 KB
- Format: Adobe Portable Document Format