The reliability of Very Large Scale Integration (VLSI) circuits is crucial to modern electronic devices. VLSI circuits, which contain millions of transistors, are vulnerable to a variety of reliability issues such as electromigration (EM), time-dependent dielectric breakdown (TDDB), negative bias temperature instability (NBTI), and hot carrier injection (HCI). These issues can lead to circuit failure and shorten the lifetime of electronic devices. This dissertation consolidates several works into a study that mitigates aging effects in VLSI circuits from multiple aspects. First, because the current in the on-chip power grid is largely unidirectional, its aging is dominated by EM; this dissertation therefore proposes an EM-aware IR drop prediction method for the on-chip power grid, accelerated by machine learning (ML)
techniques, together with a matching fast EM-aware IR drop fixing method for the on-chip power grid.
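As a rough illustration of the ML-accelerated prediction idea, the following Python sketch trains an off-the-shelf regressor to map per-node power-grid features to IR drop values. The feature set, the synthetic data, and the choice of a random-forest model are illustrative assumptions only, not the framework actually developed in this dissertation.

```python
# Hypothetical sketch: ML-accelerated EM-aware IR drop prediction.
# Feature names, synthetic data, and the random-forest regressor are
# illustrative assumptions, not the dissertation's actual model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n_nodes = 5000

# Per-node features of the power grid (all synthetic here):
#   current density driving EM degradation, effective resistance to the
#   nearest power pad, geometric distance to that pad, and the EM-induced
#   resistance increase expected at the target lifetime.
X = np.column_stack([
    rng.uniform(0.1, 2.0, n_nodes),   # current density (MA/cm^2)
    rng.uniform(0.5, 10.0, n_nodes),  # effective resistance (ohm)
    rng.uniform(10, 500, n_nodes),    # distance to pad (um)
    rng.uniform(0.0, 1.0, n_nodes),   # EM-induced resistance increase (ohm)
])
# Synthetic "golden" IR drop; in a real flow this comes from a power grid solver.
y = 0.02 * X[:, 0] * (X[:, 1] + X[:, 3]) + 1e-4 * X[:, 2] + rng.normal(0, 0.002, n_nodes)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print(f"MAE of predicted IR drop: {mean_absolute_error(y_te, pred) * 1e3:.3f} mV")
```

Once trained on solver-generated data, such a model can estimate IR drop for new grid configurations without rerunning the full physics-based analysis, which is the source of the speedup discussed next.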
Traditionally, VLSI reliability analysis and prediction have been performed with physics-based models and simulators. These models, however, are computationally intensive and time-consuming to run. This dissertation presents a framework in which machine learning techniques are adopted to model full-chip EM-aware IR drop, together with a derived fast power grid sizing strategy, yielding a significant speedup over the existing analytical method. Second, with the continued scaling of transistor sizes, the two dominant aging effects on arithmetic circuits and their transistors are NBTI and HCI. These effects have various impacts, such as threshold voltage drift in MOS transistors, reduced device lifetime, and degraded circuit speed. Hence, this dissertation adopts stochastic computing techniques to enable an efficient trade-off among accuracy, latency, power, and area for error-tolerant applications.
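To illustrate the accuracy-latency trade-off inherent to stochastic computing, the following Python sketch multiplies two values encoded as unipolar random bitstreams using a logical AND: longer bitstreams cost more latency but give more accurate results. The encoding and stream lengths here are illustrative assumptions, not the specific circuits studied in this dissertation.

```python
# Minimal sketch of stochastic computing multiplication (illustrative only;
# the dissertation's actual circuits and encodings may differ).
import numpy as np

def to_bitstream(p, length, rng):
    """Encode a probability p in [0, 1] as a unipolar random bitstream."""
    return (rng.random(length) < p).astype(np.uint8)

def sc_multiply(a, b, length, rng):
    """Multiply two values in [0, 1]: an AND gate on independent bitstreams."""
    sa = to_bitstream(a, length, rng)
    sb = to_bitstream(b, length, rng)
    return np.mean(sa & sb)  # fraction of 1s estimates a * b

rng = np.random.default_rng(42)
a, b = 0.6, 0.7  # exact product = 0.42
for length in (64, 1024, 16384):
    est = sc_multiply(a, b, length, rng)
    print(f"stream length {length:6d}: estimate {est:.4f}, error {abs(est - a*b):.4f}")
```

Because arithmetic reduces to simple logic gates operating on bitstreams, stochastic implementations can trade stream length (latency and accuracy) against gate count (power and area), which is the trade-off exploited for error-tolerant applications.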