arxiv:2410.09375

Looped ReLU MLPs May Be All You Need as Practical Programmable Computers

Published on Oct 12, 2024

AI-generated summary

A 23-layer ReLU-MLP with looping can function as a universal programmable computer, demonstrating that simple neural network modules possess greater computational expressiveness than previously understood.

Abstract

Previous work has demonstrated that attention mechanisms are Turing complete, and more recently that a looped 9-layer Transformer can function as a universal programmable computer. In contrast, the multi-layer perceptron with ReLU activation (ReLU-MLP), one of the most fundamental components of neural networks, is known to be expressive: a two-layer network is a universal approximator given an exponentially large number of hidden neurons. However, it has remained unclear whether a ReLU-MLP can be made into a universal programmable computer using a practical number of weights. In this work, we provide an affirmative answer: a looped 23-layer ReLU-MLP is capable of performing the basic necessary operations, functioning as a programmable computer more efficiently and effectively than a looped Transformer. This indicates that simple modules have stronger expressive power than previously expected, and that this power has not been fully explored. Our work provides insights into the mechanisms of neural networks and demonstrates that complex tasks, such as functioning as a programmable computer, do not necessarily require advanced architectures like Transformers.
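To make the looping idea concrete, below is a minimal, hand-weighted PyTorch sketch, not the paper's 23-layer construction: a fixed one-hidden-layer ReLU MLP whose repeated application executes a toy program (a counter that saturates at a cap). The task, the cap value, and all dimensions here are illustrative assumptions.

```python
import torch
import torch.nn as nn

# One "instruction": s <- min(s + 1, CAP), built from a single hidden
# ReLU layer. For s >= 0:
#   relu(s + 1) - relu(s + 1 - CAP) = min(s + 1, CAP)
CAP = 5.0  # illustrative saturation value, not from the paper

lin1 = nn.Linear(1, 2)
lin2 = nn.Linear(2, 1)
with torch.no_grad():
    lin1.weight.copy_(torch.tensor([[1.0], [1.0]]))  # both hidden units read s
    lin1.bias.copy_(torch.tensor([1.0, 1.0 - CAP]))  # offsets: +1 and +1-CAP
    lin2.weight.copy_(torch.tensor([[1.0, -1.0]]))   # subtract the overflow unit
    lin2.bias.zero_()

step = nn.Sequential(lin1, nn.ReLU(), lin2)  # fixed weights, no training

# Looping supplies the computation: each pass through the same MLP
# executes one step of the "program".
s = torch.tensor([[0.0]])
with torch.no_grad():
    for t in range(8):
        s = step(s)
        print(t, s.item())  # 1.0, 2.0, 3.0, 4.0, 5.0, 5.0, 5.0, 5.0
```

The paper's construction scales this pattern up: per the abstract, looping a fixed 23-layer ReLU-MLP suffices for the basic operations of a programmable computer, with the loop rather than the depth supplying the unbounded computation.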
