Approximate Computing and the Efficient Machine Learning Expedition
- Resource Type
- Conference
- Authors
- Henkel, Jorg; Li, Hai; Raghunathan, Anand; Tahoori, Mehdi B.; Venkataramani, Swagath; Yang, Xiaoxuan; Zervakis, Georgios
- Source
- 2022 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), pp. 1-9, Oct. 2022
- Subject
- Components, Circuits, Devices and Systems
- Computing and Processing
- Engineering Profession
- General Topics for Engineers
- Signal Processing and Analysis
- Design automation
- Costs
- Computational modeling
- Approximate computing
- Taxonomy
- Prototypes
- Machine learning
- Approximate Computing
- In-memory
- Machine Learning
- Precision Scaling
- Printed Electronics
- Pruning
- Quantization
- Transformers
- Language
- ISSN
- 1558-2434
Approximate computing (AxC) has long been accepted as a design alternative for efficient system implementation at the cost of relaxed accuracy requirements. Despite AxC research activity across various application domains, AxC thrived over the past decade when it was applied to Machine Learning (ML). The inherently approximate nature of ML models, together with the increased computational overheads of ML applications (which corresponding approximations effectively mitigated), led to a perfect match and a fruitful synergy. AxC for AI/ML has transcended academic prototypes. In this work, we highlight the synergistic nature of AxC and ML and elucidate the impact of AxC on the design of efficient ML systems. To that end, we present an overview and taxonomy of AxC for ML and use two descriptive application scenarios to demonstrate how AxC boosts the efficiency of ML systems.