PLDI 2025
Mon 16 - Fri 20 June 2025 Seoul, South Korea

This program is tentative and subject to change.

Wed 18 Jun 2025 17:00 - 17:20 at Cosmos, Violet & Tulip - Machine Learning

Domain-specific languages (DSLs) for machine learning are revolutionizing the speed and efficiency of machine learning workloads, as they give users easy access to high-performance compiler optimizations and accelerators. To take advantage of these capabilities, however, a user must first translate their legacy code from the language it is currently written in into the new DSL. Several recent works have identified this problem of automatically lifting code into DSLs and propose program synthesis as a solution. However, synthesis is expensive and struggles to scale without carefully designed, hard-wired heuristics. In this paper, we present an approach to lifting that combines enumerative synthesis with a Large Language Model used to automatically learn the domain-specific heuristics for program lifting, in the form of a probabilistic grammar. Our approach outperforms the state-of-the-art tools in this area, despite using only learned heuristics.
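To illustrate the core idea of guiding enumerative synthesis with a probabilistic grammar, the following is a minimal, self-contained sketch. It is not the paper's implementation: the grammar, its hand-set production weights (which, in the paper's setting, an LLM would learn), and the toy arithmetic language are all hypothetical. Partial programs are expanded in best-first order by grammar probability until one matches the input-output examples.

```python
import heapq
import itertools
import math

# Hypothetical probabilistic grammar over a toy expression language.
# Each production maps to a probability; here the weights are hand-set,
# standing in for the heuristics an LLM would learn.
GRAMMAR = {
    "add": 0.4,   # e1 + e2
    "mul": 0.3,   # e1 * e2
    "x":   0.2,   # the input variable
    "one": 0.1,   # the constant 1
}

def evaluate(expr, x):
    """Evaluate a complete expression (a nested tuple) on input x."""
    op = expr[0]
    if op == "x":
        return x
    if op == "one":
        return 1
    a, b = evaluate(expr[1], x), evaluate(expr[2], x)
    return a + b if op == "add" else a * b

def find_hole(expr, path=()):
    """Return the path to the leftmost hole ("?"), or None if complete."""
    if expr[0] == "?":
        return path
    for i, sub in enumerate(expr[1:], start=1):
        found = find_hole(sub, path + (i,))
        if found is not None:
            return found
    return None

def fill(expr, path, child):
    """Return a copy of expr with the hole at `path` replaced by `child`."""
    if not path:
        return child
    i = path[0]
    return expr[:i] + (fill(expr[i], path[1:], child),) + expr[i + 1:]

def synthesize(examples, max_steps=10000):
    """Best-first enumeration: expand partial programs in order of
    descending grammar probability (tracked as summed negative log-prob)."""
    counter = itertools.count()  # tie-breaker so the heap never compares exprs
    heap = [(0.0, next(counter), ("?",))]
    while heap and max_steps > 0:
        max_steps -= 1
        cost, _, expr = heapq.heappop(heap)
        hole = find_hole(expr)
        if hole is None:
            if all(evaluate(expr, x) == y for x, y in examples):
                return expr  # first complete program consistent with examples
            continue
        for prod, p in GRAMMAR.items():
            arity = 2 if prod in ("add", "mul") else 0
            child = (prod,) + (("?",),) * arity
            heapq.heappush(
                heap, (cost - math.log(p), next(counter), fill(expr, hole, child))
            )
    return None

# "Lift" the squaring function from input-output examples alone.
prog = synthesize([(2, 4), (3, 9), (4, 16)])
```

Because high-probability productions incur a lower cost, the search reaches likely programs first; a grammar tuned to the target DSL thus prunes the effective search space without any hard-wired heuristics.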


Wed 18 Jun

Displayed time zone: Seoul

16:00 - 17:20
16:00
20m
Talk
Type-Constrained Code Generation with Language Models
PLDI Research Papers
Niels Mündler ETH Zurich, Jingxuan He University of California at Berkeley, Hao Wang University of California at Berkeley, Koushik Sen University of California at Berkeley, Dawn Song University of California at Berkeley, Martin Vechev ETH Zurich
DOI Pre-print
16:20
20m
Talk
Reductive Analysis with Compiler-Guided Large Language Models for Input-Centric Code Optimizations
PLDI Research Papers
Xiangwei Wang North Carolina State University, Xinning Hui North Carolina State University, Chunhua Liao Lawrence Livermore National Laboratory, Xipeng Shen North Carolina State University
DOI
16:40
20m
Talk
Scalable, Validated Code Translation of Entire Projects using Large Language Models
PLDI Research Papers
Hanliang Zhang University of Bristol, Cristina David University of Bristol, Meng Wang University of Bristol, Brandon Paulsen Amazon, Daniel Kroening Amazon
DOI
17:00
20m
Talk
Guided Tensor Lifting
PLDI Research Papers
Yixuan Li University of Edinburgh, José Wesley De Souza Magalhães University of Edinburgh, Alexander Brauckmann University of Edinburgh, Michael F. P. O'Boyle University of Edinburgh, Elizabeth Polgreen University of Edinburgh
DOI