Reductive Analysis with Compiler-Guided Large Language Models for Input-Centric Code Optimizations
This program is tentative and subject to change.
Input-centric program optimization aims to optimize code by exploiting the relations between program inputs and program behaviors. Despite its promise, a long-standing barrier to its adoption is the difficulty of automatically identifying the critical features of complex inputs. This paper introduces a novel technique, reductive analysis through compiler-guided Large Language Models (LLMs), which solves the problem through a synergy between compilers and LLMs. It uses a reductive approach to overcome the scalability and other limitations of LLMs in program code analysis. The solution, for the first time, automates the identification of critical input features without heavy instrumentation or profiling, cutting the time needed for input identification by 44× (or 450× with local LLMs), from 9.6 hours down to 13 minutes (with remote LLMs) or 77 seconds (with local LLMs) on average, which makes it practical to integrate input characterization into the program compilation workflow. Optimizations based on the identified input features match or exceed the results of previous profiling-based methods, yielding 92.6% accuracy in selecting the appropriate adaptive OpenMP parallelization decisions and a 20–30% performance improvement in serverless computing while reducing resource usage by 50–60%.
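To illustrate the input-centric idea the abstract refers to, the sketch below (not taken from the paper; the function, feature, and threshold are illustrative assumptions) shows how a critical input feature, here the array length n, can drive an adaptive OpenMP parallelization decision at run time:

/* Illustrative sketch only -- not code from the paper.
 * A critical input feature (array length n) drives an adaptive
 * OpenMP parallelization decision: small inputs run serially,
 * large inputs are parallelized. Compile with -fopenmp.
 * PARALLEL_THRESHOLD is a hypothetical, made-up cutoff. */
#include <stddef.h>
#include <omp.h>

#define PARALLEL_THRESHOLD 100000

void scale(double *a, size_t n, double factor) {
    if (n < PARALLEL_THRESHOLD) {
        /* Small input: thread start-up cost would dominate, stay serial. */
        for (size_t i = 0; i < n; i++)
            a[i] *= factor;
    } else {
        /* Large input: the parallel loop pays off. */
        #pragma omp parallel for
        for (size_t i = 0; i < n; i++)
            a[i] *= factor;
    }
}

Identifying which input feature (here, n) actually predicts the profitable decision is the step the paper automates with compiler-guided LLMs instead of profiling.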
Wed 18 Jun | Displayed time zone: Seoul
16:00 - 17:20 | Machine Learning (PLDI Research Papers) | Cosmos, Violet & Tulip | Chair(s): Feras Saad (Carnegie Mellon University)
16:00 (20m, Talk) | Type-Constrained Code Generation with Language Models | PLDI Research Papers | Niels Mündler (ETH Zurich), Jingxuan He (University of California at Berkeley), Hao Wang (University of California at Berkeley), Koushik Sen (University of California at Berkeley), Dawn Song (University of California at Berkeley), Martin Vechev (ETH Zurich) | DOI, Pre-print
16:20 (20m, Talk) | Reductive Analysis with Compiler-Guided Large Language Models for Input-Centric Code Optimizations | PLDI Research Papers | Xiangwei Wang (North Carolina State University), Xinning Hui (North Carolina State University), Chunhua Liao (Lawrence Livermore National Laboratory), Xipeng Shen (North Carolina State University) | DOI
16:40 (20m, Talk) | Scalable, Validated Code Translation of Entire Projects using Large Language Models | PLDI Research Papers | Hanliang Zhang (University of Bristol), Cristina David (University of Bristol), Meng Wang (University of Bristol), Brandon Paulsen (Amazon), Daniel Kroening (Amazon) | DOI
17:00 (20m, Talk) | Guided Tensor Lifting | PLDI Research Papers | Yixuan Li (University of Edinburgh), José Wesley De Souza Magalhães (University of Edinburgh), Alexander Brauckmann (University of Edinburgh), Michael F. P. O'Boyle (University of Edinburgh), Elizabeth Polgreen (University of Edinburgh) | DOI