Accelerating the Static Analysis of Neural Networks by Batch Representation of Abstract Values
Deep neural networks (DNNs) are increasingly used in mission-critical systems. However, their use poses significant safety challenges and requires rigorous verification to ensure correct behavior in all scenarios. Formal methods such as abstract interpretation are commonly used, but their scalability becomes problematic for DNNs with many parameters. Relational abstract domains, such as affine forms, improve verification accuracy by capturing linear relationships between variables, but they also make the verification process computationally intensive. To address this challenge, this paper proposes a novel approach called batch representation, which processes abstract values in batches rather than individually. By exploiting the linear independence of noise symbols and using deep learning frameworks such as PyTorch, the proposed method significantly improves computational efficiency and scalability. Experimental results show that using GPUs and CPU clusters reduces computation time by about 80%, demonstrating the effectiveness of the approach on large-scale verification tasks.
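The core idea can be illustrated with affine forms, where each variable is a center plus a linear combination of noise symbols in [-1, 1]. Storing all noise coefficients of a layer's inputs in one matrix lets an affine layer be propagated with a single matrix product rather than one abstract value at a time. The sketch below uses NumPy for self-containedness (the paper targets PyTorch, where the same operations run on GPU); the function names `affine_layer` and `concretize` are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

# Batched affine forms: variable i is  x_i = c[i] + sum_k A[i, k] * eps_k,
# with each noise symbol eps_k ranging over [-1, 1].

def affine_layer(c, A, W, b):
    """Abstract transformer for the affine layer y = W @ x + b.

    Centers and all noise coefficients are updated with two matrix
    products, i.e. one batched operation per layer instead of a loop
    over individual abstract values. (Illustrative sketch.)
    """
    return W @ c + b, W @ A

def concretize(c, A):
    """Interval hull [c - r, c + r], with radius r = sum_k |A[:, k]|."""
    r = np.abs(A).sum(axis=1)
    return c - r, c + r

# Two inputs, each in [-1, 1], tracked by independent noise symbols.
c = np.zeros(2)
A = np.eye(2)
W = np.array([[1.0, 1.0], [1.0, -1.0]])
b = np.zeros(2)

c2, A2 = affine_layer(c, A, W, b)
lo, hi = concretize(c2, A2)  # each output ranges over [-2, 2]
```

Because the noise symbols are linearly independent, the relational information (matrix `A`) survives the layer exactly; only nonlinear activations require further abstraction.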
Tue 17 Jun (displayed time zone: Seoul)
10:30 - 12:00

10:30 | 30m Talk | Gate Fusion Is Map Fusion | ARRAY | DOI
11:00 | 30m Talk | Array Programming on GPUs: Challenges and Opportunities | ARRAY | Xinyi Li, Mark Baranowski, Harvey Dam, Ganesh Gopalakrishnan (University of Utah) | DOI
11:30 | 30m Talk | Accelerating the Static Analysis of Neural Networks by Batch Representation of Abstract Values | ARRAY | Guillaume Berthelot (Numalis), Arnault Ioualalen (Numalis), Matthieu Martel (University of Perpignan; Numalis) | DOI