I/O Constraints Optimization using Machine Learning
- Resource Type
- Conference
- Authors
- C, Lekshmi; Khatri, Anmol; Saha, Sourav; Gupta, Shivangi; Yadav, Raj; Bazaz, Rakshit
- Source
2022 IEEE 35th International System-on-Chip Conference (SOCC), pp. 1-6, Sep. 2022
- Subject
- Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems; Computing and Processing; Engineering Profession; Signal Processing and Analysis
- Keywords
- Constraint optimization; Databases; Machine learning; Feature extraction; System-on-chip; Delays; Convergence; Constraints; Budgeting; I/O delay prediction; Supervised learning; RFR; XGB; regression
- Language
- ISSN
- 2164-1706
Hierarchical designs require high-quality I/O constraints for predictable execution of partitions. Complex SOC sub-systems typically take around 10-12 convergence loops to stabilize I/O constraints as collaterals evolve. The proposed solution uses ML to achieve optimal I/O constraints from the early stages. An ML model is trained on features extracted from a top-down timing model alongside delays extracted from an implemented database. Cell and RC delays are trained separately, and a composite I/O budget is ascertained. This approach has been successfully deployed in complex SOC sub-systems at sub-7nm nodes and has achieved 30% faster full-chip timing convergence, with prediction error within 6% of cycle time.
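The separate training of cell and RC delay models, with the composite budget taken as their sum, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: it uses scikit-learn's RandomForestRegressor as a stand-in for the paper's RFR model, and the feature names and synthetic data are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for features extracted from a top-down timing model
# (e.g. fanout, net length, stage count) -- illustrative only.
X = rng.random((200, 3))
cell_delay = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.01, 200)
rc_delay = 1.5 * X[:, 1] + rng.normal(0, 0.01, 200)

# Cell and RC delays are trained as two separate supervised regressors.
cell_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, cell_delay)
rc_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, rc_delay)

def io_budget(features):
    """Composite I/O budget = predicted cell delay + predicted RC delay."""
    f = np.atleast_2d(features)
    return cell_model.predict(f) + rc_model.predict(f)

print(io_budget([0.4, 0.6, 0.2]))
```

In practice the predicted budgets would seed the partition I/O constraints at the start of implementation, replacing the early iterations of the convergence loop described in the abstract.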