Multi-level Machine Learning-Guided Autotuning for Efficient Code Generation on a Deep Learning Accelerator
The growing complexity of deep learning models necessitates specialized hardware and software optimizations, particularly for deep learning accelerators.
While machine learning-based autotuning methods have emerged as a promising way to reduce manual effort, both template-based and template-free approaches suffer from prolonged tuning times because they profile invalid configurations, which can cause runtime errors.
To address this issue, we propose ML2Tuner, a multi-level machine learning-guided autotuning technique designed to improve efficiency and robustness.
ML2Tuner introduces two key ideas: (1) a validity prediction model to filter out invalid configurations prior to profiling, and (2) an advanced performance prediction model that leverages hidden features extracted during the compilation process.
Experimental results on an extended VTA accelerator demonstrate that ML2Tuner achieves equivalent performance improvements using only 12.3% of the samples required by a TVM-like approach and reduces invalid profiling attempts by an average of 60.8%, highlighting its potential to enhance autotuning performance by filtering out invalid configurations.
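The two-stage idea in the abstract can be sketched as a filtered tuning loop: a validity model rejects configurations before costly hardware profiling, and a performance model ranks the survivors so only the most promising ones are measured. The sketch below is illustrative only; the function names, the toy validity rule, and the analytic cost model are assumptions, whereas ML2Tuner itself trains both models (the performance model on hidden features extracted during compilation).

```python
import random

# Hypothetical sketch of ML2Tuner's two-stage filtering, NOT the paper's code.
# A configuration here is a (tile, unroll) pair; real accelerator search
# spaces are far larger and the two predictors below are learned models.

def predict_valid(cfg):
    """Stand-in validity model: ML2Tuner uses a trained classifier instead."""
    tile, unroll = cfg
    # Toy rule: configs whose footprint exceeds a scratchpad budget are invalid
    # and would crash at runtime if profiled.
    return tile * unroll <= 64

def predict_perf(cfg):
    """Stand-in performance model: returns predicted latency (lower is better)."""
    tile, unroll = cfg
    return 1.0 / (tile * unroll)

def profile(cfg):
    """Costly on-device measurement; reached only by configs passing the filter."""
    tile, unroll = cfg
    return 1.0 / (tile * unroll) + random.uniform(0.0, 0.01)

def tune(search_space, budget):
    # Stage 1: drop configurations predicted to be invalid before profiling.
    candidates = [c for c in search_space if predict_valid(c)]
    # Stage 2: rank survivors by predicted performance; profile only the top few.
    candidates.sort(key=predict_perf)
    measured = [(profile(c), c) for c in candidates[:budget]]
    return min(measured)

space = [(t, u) for t in (1, 2, 4, 8, 16) for u in (1, 2, 4, 8)]
best_cost, best_cfg = tune(space, budget=5)
```

Profiling only `budget` pre-ranked, pre-validated candidates is what lets this style of tuner match an unfiltered search with a small fraction of the samples: invalid configurations never reach the device, and the measurement budget is spent on configurations the predictor already favors.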
Tue 17 Jun, 14:00 - 15:20 (times in Seoul time)

14:00 (20m, Talk): JetCert: A Self-Adaptive Compilation Framework for Fast and Safe Code Execution. LCTES (recorded).

14:20 (20m, Talk): Grouptuner: Efficient Group-Aware Compiler Auto-tuning. LCTES. Bingyu Gao (Peking University), Mengyu Yao (Peking University), Ziming Wang (Peking University), Dong Liu (ZTE), Ding Li (Peking University), Xiangqun Chen (Peking University), Yao Guo (Peking University).

14:40 (20m, Talk): Multi-level Machine Learning-Guided Autotuning for Efficient Code Generation on a Deep Learning Accelerator. LCTES. JooHyoung Cha (Korea University of Science and Technology), Munyoung Lee (ETRI), Jinse Kwon (ETRI), Jemin Lee (ETRI), Yongin Kwon (ETRI).

15:00 (20m, Talk): DSP-MLIR: A Domain-Specific Language and MLIR Dialect for Digital Signal Processing. LCTES. Abhinav Kumar (Arizona State University), Atharva Khedkar (Arizona State University), Hwisoo So (Yonsei University), Megan Kuo (Arizona State University), Ameya Gurjar (Arizona State University), Partha Biswas (MathWorks), Aviral Shrivastava (Arizona State University).