PLDI 2025
Mon 16 - Fri 20 June 2025 Seoul, South Korea

This program is tentative and subject to change.

Mon 16 Jun 2025 16:40 - 17:00 at Violet - SOAP 4

Quantization replaces the original data types used to represent the weights of neural networks with less resource-intensive data types. Much work has been done on quantization; however, existing methods that provide guarantees simply give error bounds on the difference between the original and reduced-precision networks.

In this article, we propose a new quantization technique that, instead of guaranteeing error bounds, finds the minimum precision required to maintain dominance (independently of a given set of formats). This means that, whatever the scores of each class are, we guarantee that the dominant class remains the same in the original and quantized networks.

Our method is static, and the proposed quantization holds for all inputs. Technically, we use existing theorems that give error bounds on dot products, and we derive an optimization problem whose solution gives the reduced precision. Experimental results are presented.
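The core idea can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it uses the standard worst-case forward error bound for a floating-point dot product, |fl(x·w) − x·w| ≤ γ_n · Σ|x_i w_i| with γ_n = n·u/(1 − n·u) and unit roundoff u = 2^(−p), and searches for the smallest mantissa width p at which the top class's score margin over every other class exceeds the combined error bounds, so the dominant class provably cannot change. The function names and the simple linear search are assumptions made for this sketch.

```python
# Hypothetical sketch of dominance-preserving precision search for a
# linear classifier with weight matrix W (one row per class) and input x.
import numpy as np

def dot_error_bound(x, w, p):
    """Worst-case rounding error of the dot product x.w when computed
    with p mantissa bits (classic gamma_n bound, u = 2**-p)."""
    n = len(x)
    u = 2.0 ** (-p)
    if n * u >= 1.0:          # bound is vacuous at this precision
        return float("inf")
    gamma = n * u / (1.0 - n * u)
    return gamma * float(np.sum(np.abs(x * w)))

def min_bits_for_dominance(W, x, max_p=53):
    """Smallest mantissa width p such that the dominant class's margin
    over every other class exceeds the sum of both dot-product bounds."""
    scores = W @ x
    top = int(np.argmax(scores))
    for p in range(2, max_p + 1):
        if all(
            scores[top] - scores[c]
            > dot_error_bound(x, W[top], p) + dot_error_bound(x, W[c], p)
            for c in range(len(scores)) if c != top
        ):
            return p
    return max_p
```

For a fixed input this returns the fewest mantissa bits at which the argmax is guaranteed unchanged; the paper's static analysis quantifies over all inputs, which this per-input sketch does not attempt.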


Mon 16 Jun

Displayed time zone: Seoul change

15:40 - 17:00
SOAP 4 (SOAP) at Violet
15:40
60m
Keynote
TBD
SOAP
Charles Zhang Hong Kong University of Science and Technology
16:40
20m
Talk
Towards Bit-Level Dominance Preserving Quantization of Neural Classifiers
SOAP
Dorra Ben Khalifa University of Toulouse - ENAC, Matthieu Martel Université de Perpignan Via Domitia
16:40
20m
Talk
Optimizing Type Migration for C-to-Rust Translation: A Data Flow Graph Approach
SOAP
Qingxiao Xu, Jeff Huang Texas A&M University
16:40
20m
Day closing
Closing and Best Presentation Award
SOAP
Kihong Heo KAIST, Luca Negrini Ca’ Foscari University of Venice