arxiv:2410.01405

On Expressive Power of Looped Transformers: Theoretical Analysis and Enhancement via Timestep Encoding

Published on Oct 2, 2024
Abstract

AI-generated summary: Looped Transformers demonstrate improved parameter efficiency and computational capabilities on reasoning tasks, but their expressive power for function approximation is limited; adding timestep-encoded scaling parameters to each loop improves performance.

Looped Transformers provide advantages in parameter efficiency, computational capabilities, and generalization for reasoning tasks. However, their expressive power with respect to function approximation remains underexplored. In this paper, we establish the approximation rate of Looped Transformers by defining the modulus of continuity for sequence-to-sequence functions. The analysis reveals a limitation specific to the looped architecture and motivates incorporating scaling parameters for each loop, conditioned on a timestep encoding. Experiments validate the theoretical results, showing that increasing the number of loops improves performance, with further gains from the timestep encoding.
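The abstract describes reusing a single Transformer block across loops and conditioning a per-loop scaling parameter on a timestep encoding. The sketch below is one way such conditioning could look in PyTorch; the class name `LoopedTransformer`, the sinusoidal `timestep_encoding`, and the residual-style scaled update are illustrative assumptions, not the paper's exact parameterization.

```python
# A minimal sketch (not the authors' implementation) of a Looped Transformer
# whose per-loop scaling is conditioned on a timestep encoding.
import math
import torch
import torch.nn as nn


def timestep_encoding(t: int, dim: int) -> torch.Tensor:
    """Sinusoidal encoding of the loop index t (an illustrative choice)."""
    half = dim // 2
    freqs = torch.exp(-math.log(10000.0) * torch.arange(half) / half)
    angles = t * freqs
    return torch.cat([torch.sin(angles), torch.cos(angles)])


class LoopedTransformer(nn.Module):
    """One shared Transformer block applied for `n_loops` iterations.

    A small MLP maps the timestep encoding of the current loop index to a
    channel-wise scale applied to the block's update (hypothetical design).
    """

    def __init__(self, d_model: int = 64, n_heads: int = 4, n_loops: int = 8):
        super().__init__()
        self.n_loops = n_loops
        self.d_model = d_model
        # Single block whose weights are reused on every loop iteration.
        self.block = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True,
        )
        # Maps the timestep encoding to a per-loop scaling vector.
        self.scale_mlp = nn.Sequential(
            nn.Linear(d_model, d_model), nn.SiLU(), nn.Linear(d_model, d_model),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for t in range(self.n_loops):
            enc = timestep_encoding(t, self.d_model).to(x.device, x.dtype)
            scale = self.scale_mlp(enc)      # (d_model,)
            update = self.block(x)           # shared weights every loop
            x = x + scale * (update - x)     # timestep-conditioned scaling
        return x


if __name__ == "__main__":
    model = LoopedTransformer()
    tokens = torch.randn(2, 10, 64)          # (batch, sequence, d_model)
    print(model(tokens).shape)               # torch.Size([2, 10, 64])
```

Without the `scale_mlp` branch, every loop applies an identical update; the timestep-conditioned scale is what lets each iteration behave differently while still sharing all block weights.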
