PLDI 2025
Mon 16 - Fri 20 June 2025 Seoul, South Korea
Tue 17 Jun 2025 14:40 - 15:00 at Violet - Compiler Technology and Auto-Tuning Chair(s): Yunho Oh

The growing complexity of deep learning models necessitates specialized hardware and software optimizations, particularly for deep learning accelerators.
While machine learning-based autotuning methods have emerged as a promising solution to reduce manual effort, both template-based and template-free approaches suffer from prolonged tuning times because they profile invalid configurations that can cause runtime errors.
To address this issue, we propose ML2Tuner, a multi-level machine learning-guided autotuning technique designed to improve efficiency and robustness.
ML2Tuner introduces two key ideas: (1) a validity prediction model to filter out invalid configurations prior to profiling, and (2) an advanced performance prediction model that leverages hidden features extracted during the compilation process.
Experimental results on an extended VTA accelerator demonstrate that ML2Tuner achieves equivalent performance improvements using only 12.3% of the samples required by a TVM-like approach and reduces invalid profiling attempts by an average of 60.8%, highlighting its potential to enhance autotuning performance by filtering out invalid configurations.
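The two-level idea in the abstract — prune predicted-invalid configurations before profiling, then rank the survivors with a performance predictor — can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the function name, the config encoding, and the two stand-in model lambdas are all hypothetical.

```python
# Hypothetical sketch of a multi-level filter in the spirit of ML2Tuner:
# level 1 drops configurations a validity model predicts will fail at
# runtime (so they are never profiled), and level 2 ranks the remaining
# ones by predicted latency so only the most promising are measured.

def select_configs_to_profile(candidates, is_valid_pred, perf_pred, top_k):
    """Return the top_k candidates predicted both valid and fastest."""
    # Level 1: filter out configurations the validity model rejects,
    # avoiding profiling runs that would end in runtime errors.
    valid = [c for c in candidates if is_valid_pred(c)]
    # Level 2: rank survivors by predicted latency (lower is better).
    ranked = sorted(valid, key=perf_pred)
    return ranked[:top_k]

# Toy stand-ins for the learned models: a config is a (tile, unroll) pair.
candidates = [(t, u) for t in (1, 2, 4, 8) for u in (1, 2, 4)]
is_valid = lambda c: c[0] * c[1] <= 16             # e.g. a resource limit
predicted_latency = lambda c: 100 / (c[0] * c[1])  # bigger tiles -> faster

best = select_configs_to_profile(candidates, is_valid, predicted_latency, top_k=3)
```

Only the `top_k` surviving configurations would then be profiled on the accelerator, which is how pruning invalid candidates translates into fewer wasted measurement runs.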

Tue 17 Jun

Displayed time zone: Seoul

14:00 - 15:20
Compiler Technology and Auto-Tuning (LCTES) at Violet
Chair(s): Yunho Oh Korea University
14:00
20m
Talk
JetCert: A Self-Adaptive Compilation Framework for Fast and Safe Code Execution (Recorded)
LCTES
Arman Cham Heidari (Shahid Beheshti University), Mehran Alidoost Nia (Shahid Beheshti University)
14:20
20m
Talk
GroupTuner: Efficient Group-Aware Compiler Auto-tuning
LCTES
Bingyu Gao (Peking University), Mengyu Yao (Peking University), Ziming Wang (Peking University), Dong Liu (ZTE), Ding Li (Peking University), Xiangqun Chen (Peking University), Yao Guo (Peking University)
14:40
20m
Talk
Multi-level Machine Learning-Guided Autotuning for Efficient Code Generation on a Deep Learning Accelerator
LCTES
JooHyoung Cha (Korea University of Science and Technology), Munyoung Lee (ETRI), Jinse Kwon (ETRI), Jemin Lee (ETRI), Yongin Kwon (ETRI)
15:00
20m
Talk
DSP-MLIR: A Domain-Specific Language and MLIR Dialect for Digital Signal Processing
LCTES
Abhinav Kumar (Arizona State University), Atharva Khedkar (Arizona State University), Hwisoo So (Yonsei University), Megan Kuo (Arizona State University), Ameya Gurjar (Arizona State University), Partha Biswas (MathWorks), Aviral Shrivastava (Arizona State University)