An MILP Encoding for Efficient Verification of Quantized Deep Neural Networks
- Resource Type
- Periodical
- Authors
- Mistry, S.; Saha, I.; Biswas, S.
- Source
- IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 41(11):4445–4456, Nov. 2022
- Subject
- Components, Circuits, Devices and Systems
Computing and Processing
Neural networks
Arithmetic
Encoding
Deep learning
Benchmark testing
Neurons
Semantics
Fixed-point arithmetic
formal verification
mixed-integer linear programming (MILP)
quantized neural networks
- Language
- English
- ISSN
- 0278-0070
1937-4151
Quantized deep neural networks (DNNs) have the potential to find wide application in safety-critical cyber–physical systems implemented on processors that support only integer arithmetic. The key challenge is to ensure that the network operates correctly despite its approximated computation. To address this verification challenge formally, we present a methodology for encoding the verification problem as a mixed-integer linear programming (MILP) problem. Our encoding is based on the bit-precise semantics of quantized neural networks, which ensures the soundness of our method. We implement our verification methodology using the Gurobi MILP solver and evaluate it on several widely used DNN benchmarks. We compare our method with state-of-the-art bit-vector encodings, which it outperforms by an order of magnitude in most cases. These experimental results establish our MILP-based verification technique as a powerful tool for developing formally verified safety-critical systems with quantized DNNs as components.
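The abstract's core idea — reasoning over the bit-precise, integer-only semantics of a quantized network — can be illustrated with a small sketch. The function below is a hypothetical helper, not the authors' code: it evaluates one quantized neuron exactly as an integer-only processor would, and the comments indicate how each step would be expressed as MILP constraints (linear equalities for the multiply-accumulate, a binary variable with big-M constraints for the clamp). The quantization scheme assumed here (right-shift requantization, unsigned 8-bit output) is a common convention chosen for illustration and is not taken from the paper.

```python
def quantized_neuron(xs, ws, bias, shift, qmin=0, qmax=255):
    """Bit-precise integer semantics of one quantized neuron (illustrative).

    xs, ws : integer inputs and weights; bias : integer accumulator bias;
    shift  : requantization right-shift. Clamping to [qmin, qmax] fuses the
    ReLU with requantization, as is typical in integer-only inference.
    """
    # Linear part: exact integer multiply-accumulate. In an MILP encoding
    # this is a single linear constraint: acc == sum_i w_i * x_i + bias.
    acc = sum(w * x for w, x in zip(ws, xs)) + bias

    # Requantization by arithmetic right shift (fixed-point scaling).
    # An exact encoding introduces an integer variable q and remainder r:
    #   acc == q * 2**shift + r,  0 <= r <= 2**shift - 1   (all linear).
    q = acc >> shift

    # Clamp: piecewise-linear, so an MILP encoding adds one binary variable
    # per breakpoint with big-M constraints, e.g. for out = max(qmin, q):
    #   out >= qmin,  out >= q,
    #   out <= qmin + M*d,  out <= q + M*(1 - d),   d in {0, 1}.
    return max(qmin, min(qmax, q))
```

For example, `quantized_neuron([2, 3], [1, 1], 0, 0)` evaluates to 5, while an accumulator that exceeds `qmax` after shifting is clamped to 255 — it is exactly these clamp breakpoints that force binary variables (and hence the "I" in MILP) into the encoding.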