NLP

CORE MACHINE LEARNING

CODE TRANSLATION WITH COMPILER REPRESENTATIONS

February 25, 2023

Abstract

In this paper, we leverage low-level compiler intermediate representations (IR) to improve code translation. Traditional transpilers rely on syntactic information and handcrafted rules, which limits their applicability and produces unnatural-looking code. Applying neural machine translation (NMT) approaches to code has successfully broadened the set of programs on which one can get a natural-looking translation. However, they treat the code as sequences of text tokens, and still do not differentiate well enough between similar pieces of code that have different semantics in different languages. The consequence is low-quality translation, reducing the practicality of NMT, and stressing the need for an approach that significantly increases its accuracy. Here we propose to augment code translation with IRs, specifically LLVM IR, with results on the C++, Java, Rust, and Go languages. Our method improves upon the state of the art for unsupervised code translation, increasing the number of correct translations by 11% on average, and by up to 79% for the Java → Rust pair with greedy decoding. With beam search, it increases the number of correct translations by 5.5% on average. We extend previous test sets for code translation by adding hundreds of Go and Rust functions. Additionally, we train models with high performance on the problem of IR decompilation, generating programming source code from IR, and study the use of IRs as an intermediary pivot for translation.
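As an illustration of why LLVM IR is a useful shared signal (this is a sketch, not an excerpt from the paper's data), the minimal C++ function below compiles to IR that abstracts away most surface syntax; equivalent functions in Rust, or in the other studied languages via LLVM-based front ends, lower to IR of the same shape. The IR shown is an approximation of what `clang++ -S -emit-llvm -O1` emits; the exact output varies with compiler version and flags.

// add.cpp -- hypothetical example, not taken from the paper's dataset.
int add(int a, int b) {
    return a + b;
}

// Approximate LLVM IR for add() as emitted by `clang++ -S -emit-llvm -O1`
// (function attributes and metadata omitted; exact names vary by compiler version):
//
//   define i32 @_Z3addii(i32 %a, i32 %b) {
//     %sum = add nsw i32 %a, %b
//     ret i32 %sum
//   }

In the approach described above, such IRs give the model a language-agnostic view of the computation, helping it align code snippets that read differently but compute the same thing.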


AUTHORS

Written by

Marc Szafraniec

Baptiste Rozière

Hugh Leather

François Charton

Patrick Labatut

Gabriel Synnaeve

Publisher

ICLR

