Papers
arxiv:2512.09834

Transpiling quantum circuits by a transformers-based algorithm

Published on Mar 5

Abstract

A transformer model is developed to transpile quantum circuits from the QASM format to the gate set native to ion-trap hardware, with high accuracy and polynomial scalability.

AI-generated summary

Transformers have gained popularity in machine learning thanks to their success in natural language processing: they process text efficiently, capture long-range dependencies in the data, and perform next-word prediction. Gate-based quantum computing, on the other hand, controls a register of qubits in the quantum hardware by applying a sequence of gates, a process that can be read as a low-level, text-based programming language. We develop a transformer model capable of transpiling quantum circuits from the QASM standard to native gate sets suited to a specific target quantum hardware, in our case the gate set of IonQ's trapped-ion quantum computers. The feasibility of the translation for circuits of up to five qubits is demonstrated, with a fraction of correctly transpiled target circuits of at least 99.98%. We also prove that, in the worst-case scenario, the complexity of the transformer model grows polynomially with the depth of the register and the length of the circuit, allowing models with a larger number of parameters to be trained efficiently on HPC infrastructures.
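
The summary frames transpilation as sequence-to-sequence translation over gate tokens. The sketch below is an illustration only, not the paper's model or tokenization: it maps toy QASM tokens to tokens of an ion-trap-style native gate set (GPI, GPI2, MS, the native gates of IonQ hardware) with a standard PyTorch encoder-decoder transformer. The vocabularies, the example circuit, and all hyperparameters are invented for the example, and positional encodings are omitted for brevity.

# Minimal sketch (assumed setup, not the authors' code): transpilation as
# sequence-to-sequence translation over gate tokens.
import torch
import torch.nn as nn

# Toy vocabularies; a real tokenizer would also discretize rotation angles.
SRC_VOCAB = ["<pad>", "<bos>", "<eos>", "h", "cx", "rz", "q0", "q1", "q2", "q3", "q4", "pi/2"]
TGT_VOCAB = ["<pad>", "<bos>", "<eos>", "gpi", "gpi2", "ms", "q0", "q1", "q2", "q3", "q4", "pi/2"]
src_id = {t: i for i, t in enumerate(SRC_VOCAB)}
tgt_id = {t: i for i, t in enumerate(TGT_VOCAB)}

def encode(tokens, table):
    """Map a flattened gate-token sequence to ids, wrapped in <bos>/<eos>."""
    return torch.tensor([[table["<bos>"], *(table[t] for t in tokens), table["<eos>"]]])

class CircuitTranspiler(nn.Module):
    """Encoder-decoder transformer over source (QASM) and target (native) gate tokens."""
    def __init__(self, d_model=64, nhead=4, layers=2):
        super().__init__()
        self.src_emb = nn.Embedding(len(SRC_VOCAB), d_model)
        self.tgt_emb = nn.Embedding(len(TGT_VOCAB), d_model)
        self.transformer = nn.Transformer(d_model, nhead, layers, layers, batch_first=True)
        self.head = nn.Linear(d_model, len(TGT_VOCAB))

    def forward(self, src, tgt):
        # Causal mask so the decoder only attends to already emitted target gates.
        mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        out = self.transformer(self.src_emb(src), self.tgt_emb(tgt), tgt_mask=mask)
        return self.head(out)

# Source circuit "h q0; cx q0 q1;" flattened into tokens; the target sequence is
# purely illustrative, not a verified native-gate decomposition.
src = encode(["h", "q0", "cx", "q0", "q1"], src_id)
tgt = encode(["gpi2", "q0", "ms", "q0", "q1"], tgt_id)
model = CircuitTranspiler()
logits = model(src, tgt[:, :-1])                       # teacher forcing
loss = nn.CrossEntropyLoss(ignore_index=tgt_id["<pad>"])(
    logits.reshape(-1, len(TGT_VOCAB)), tgt[:, 1:].reshape(-1))
print(loss.item())

In this framing, the worst-case polynomial scaling claimed in the summary corresponds to sequence lengths that grow with the register depth and circuit length, so training cost grows with standard transformer sequence-length behavior.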
