A QKeras Neural Network Zoo for Deeply Quantized Imaging
- Resource Type
- Conference
- Authors
- Loro, Francesco; Pau, Danilo; Tomaselli, Valeria
- Source
- 2021 IEEE 6th International Forum on Research and Technology for Society and Industry (RTSI), pp. 165-170, Sep. 2021
- Subject
- Aerospace
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Fields, Waves and Electromagnetics
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Training
Industries
Deep learning
Quantization (signal)
Power demand
Neural networks
Imaging
network zoo
deeply quantized neural networks
deep learning frameworks
Larq
QKeras
- Language
- ISSN
- 2687-6817
Neural network zoos are quite common in the literature and are particularly useful for demonstrating the potential of a deep learning framework by providing the Artificial Intelligence community with examples of its use. Unfortunately, most of them use FP32 (32-bit floating-point) or INT8 (8-bit integer) precision for activations and weights. Communities such as TinyML are paying increasing attention to memory and energy savings, targeting power consumption in the mW range and below, and therefore to Deeply Quantized Neural Networks (DQNNs). Two frameworks, QKeras and Larq, are gaining momentum for defining and training DQNNs. To the best of our knowledge, the only available DQNN zoo is the one provided by the Larq framework. In this work we developed a new QKeras zoo and compared its accuracy against the available Larq zoo. To avoid costly re-training, we show how to re-use the weights from the Larq zoo. The resulting zoo comprises ten networks and matches the performance of the Larq zoo on seven of them. Our work will be made publicly available through a GitHub repository.
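The weight re-use described in the abstract rests on the fact that frameworks such as Larq and QKeras store full-precision latent weights in their checkpoints and apply a quantizer only when the layer runs. The sketch below illustrates this idea with plain NumPy: the `binarize` function is a hypothetical stand-in for a deterministic sign quantizer (as in binary DQNNs), not the actual Larq or QKeras API, and the arrays are made-up examples.

```python
import numpy as np

def binarize(w):
    """Deterministic sign binarization, as used by many binary DQNNs:
    latent FP32 weights are mapped to {-1, +1} at inference time.
    Zero is conventionally mapped to +1 here."""
    return np.where(w >= 0.0, 1.0, -1.0)

# Latent full-precision weights, as a checkpoint might store them.
latent = np.array([[0.7, -0.3],
                   [-0.05, 1.2]])

# Because both frameworks quantize from the same latent values,
# copying `latent` layer-by-layer between them (in matching layer
# order) yields the same binarized weights, avoiding re-training.
print(binarize(latent))  # [[ 1. -1.] [-1.  1.]]
```

In practice the transfer amounts to reading each layer's latent tensors from the Larq model and assigning them to the corresponding QKeras layer, provided the two models use equivalent quantizers and layer ordering.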