Maybe you dream about taking your electronic designs to manufacturing without first validating their operation. All of those simulation steps take so much effort, and time-to-market pressures are relentless. But, of course, you’d never dare do it.
What if you had an effective way to streamline the simulation-based verification process? Machine learning could be your answer, says Elyse Rosenbaum, professor of electrical and computer engineering at the University of Illinois at Urbana-Champaign. Rosenbaum also directs the Center for Advanced Electronics through Machine Learning (CAEML), which aims to promote faster, more accurate model-based design; more accurate simulation-based design verification; and computationally efficient system-level analysis of manufacturing variations.
Rosenbaum gave a DesignCon keynote on Feb. 1 on “Machine Learning: An Enabling Technique for Electronics Modeling and Design Optimization.” She made the case that under the right circumstances, machine learning can enhance electronic design automation (EDA).
Modern EDA, noted Rosenbaum, has its limits, namely:
- It hasn’t eliminated design respins
- Many of the failures observed during qualification testing directly result from insufficient modeling capability (and variability can’t be modeled accurately or in a computationally efficient way)
- Simulation-based design optimization has demonstrated limited success; often, it’s slow and impractical because of the large number of design variables involved
- Behavioral models address some of the flaws of current simulation capabilities. However, the industry has lacked a general, systematic method for generating these models
According to Rosenbaum, that’s because of these challenges:
- The high dimensionality of the input space and limited knowledge of the correlations among the input parameters
- Variability in the physical attributes of system components and subsystems
- Difficulty of sampling and representing highly nonlinear response surfaces
- Lack of a priori information about electromagnetic or other interactions between components
Machine Learning to the Rescue
This is where machine learning can come into play. According to Rosenbaum, the first use of the phrase “machine learning” appeared in 1959 in the IBM Journal of Research and Development, in the paper “Some Studies in Machine Learning Using the Game of Checkers.” Over the years, popular culture has shared its interpretations (remember the Star Trek episode where Captain Kirk is told to sit back and let the machine do the work?). In Rosenbaum’s view, machine learning is the application of statistical learning theory to construct accurate predictors. Given enough training data, the methodology holds up even when the functional relationship between inputs and outputs is very complex and/or subject to stochastic effects.
“EDA is a better application for machine learning than many of the applications for which machine learning is used today,” said Rosenbaum. Using machine learning to extract models that support EDA can make engineers’ professional lives better by enabling design optimization and shorter time to market.
There are a variety of machine learning algorithms and analyses that can be applied in an EDA flow; linear regression, logistic regression, neural networks, kernel methods, and supervised and unsupervised learning are some examples. Consider how machine learning can support thermal design optimization for 3D-ICs, which are prone to self-heating. Rosenbaum explained that her colleagues at Georgia Tech performed the first 3D thermal simulation of a power delivery network, then used the resulting thermal profile in a circuit simulation to measure clock skew. They had to repeat the simulation process over and over until they identified a design that met the clock-skew requirements. Using a statistical learning method called Bayesian optimization, the researchers reduced the number of simulations needed. The only stipulation, Rosenbaum noted, is that you have to model the function accurately near its minimum.
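The talk did not include implementation details, but the core idea of surrogate-assisted optimization can be sketched in a few lines. Bayesian optimization proper fits a Gaussian-process surrogate and picks new sample points with an acquisition function; the toy below substitutes the simplest possible surrogate, a parabola through the three lowest-cost samples, to show the same loop structure: fit a cheap model near the minimum, let the model propose the next expensive simulation. The cost function and all numbers are hypothetical stand-ins, not from the keynote.

```python
def expensive_simulation(x):
    """Stand-in for one slow, coupled thermal/circuit simulation (hypothetical)."""
    return (x - 0.3) ** 2 + 0.05

def parabola_vertex(p1, p2, p3):
    """Vertex (minimum) of the parabola through three (x, y) samples."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    num = (x2 - x1) ** 2 * (y2 - y3) - (x2 - x3) ** 2 * (y2 - y1)
    den = (x2 - x1) * (y2 - y3) - (x2 - x3) * (y2 - y1)
    return x2 - 0.5 * num / den

def surrogate_minimize(f, x_init, iters=8, tol=1e-6):
    """Minimize an expensive f with as few evaluations as possible."""
    samples = {x: f(x) for x in x_init}
    for _ in range(iters):
        # Fit the local surrogate through the three lowest-cost samples so far;
        # per the talk, the model only needs to be accurate near the minimum.
        best3 = sorted(sorted(samples.items(), key=lambda p: p[1])[:3])
        x_new = parabola_vertex(*best3)
        if min(abs(x_new - x) for x in samples) < tol:
            break  # surrogate proposes a point we already evaluated: converged
        samples[x_new] = f(x_new)  # one more expensive simulation
    return min(samples.items(), key=lambda p: p[1])
```

Starting from three coarse samples, `surrogate_minimize(expensive_simulation, [0.0, 0.5, 1.0])` locates the minimum near x = 0.3 after only a handful of extra evaluations, rather than sweeping the whole design range.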
Another example Rosenbaum discussed involves surrogate model-based circuit design, in which the data determines the model structure. In RFIC design, it’s customary to include tuning knobs. With SPICE simulation alone, however, it’s difficult to identify the optimal knob settings because the number of simulations required would be too large. Rosenbaum’s colleagues at North Carolina State built a surrogate model of an RFIC device and, through this method, identified a design, with three particular tuning-knob settings, that met their performance specs.
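The NC State surrogate model itself wasn't described in detail, but the workflow can be illustrated with a minimal sketch, assuming three discrete tuning knobs and a polynomial surrogate: run the expensive simulator on a structured subset of knob combinations, fit a cheap regression model to those results, then sweep the model, not the simulator, over the full design space. `spice_sim` here is a hypothetical cost function standing in for real SPICE runs.

```python
import itertools

def spice_sim(k1, k2, k3):
    """Stand-in for a slow SPICE run; hypothetical cost to minimize."""
    return (k1 - 2) ** 2 + (k2 - 1) ** 2 + (k3 - 3) ** 2

def features(k1, k2, k3):
    """Quadratic feature vector for the surrogate regression."""
    return [1.0, k1, k1 * k1, k2, k2 * k2, k3, k3 * k3]

def solve(A, b):
    """Gaussian elimination with partial pivoting for A w = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):
        w[i] = (M[i][n] - sum(M[i][j] * w[j] for j in range(i + 1, n))) / M[i][i]
    return w

levels = range(5)
# Simulate only a structured subset (25 of 125 knob combinations)
train = [k for k in itertools.product(levels, repeat=3)
         if (k[0] + 2 * k[1] + 3 * k[2]) % 5 == 0]
X = [features(*k) for k in train]
y = [spice_sim(*k) for k in train]

# Least-squares fit: solve the normal equations (X^T X) w = X^T y
n = len(X[0])
XtX = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
w = solve(XtX, Xty)

def surrogate(k):
    """Cheap prediction of the simulator's cost for any knob combination."""
    return sum(wi * fi for wi, fi in zip(w, features(*k)))

# Sweep the cheap surrogate, not the simulator, over every combination
best = min(itertools.product(levels, repeat=3), key=surrogate)
```

Because the surrogate is nearly free to evaluate, the exhaustive sweep over all 125 combinations costs almost nothing, while only 25 "simulations" were ever run; the real work would use a more expressive data-driven model, but the division of labor is the same.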
A recurrent neural network (RNN) can also be useful in EDA because it can approximate any system that is representable by a state-space model, a class that covers many circuits and devices. In one example, an RNN used to model a commercial buffer chip simulated 12X faster than a transistor-level model run in HSPICE, Rosenbaum noted.
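To make the state-space connection concrete: a discrete-time state-space model updates a hidden state as x[k+1] = A·x[k] + B·u[k], and a basic RNN cell computes exactly that form wrapped in a nonlinearity. The scalar sketch below, with hypothetical coefficients, shows that with an identity activation one RNN step is a linear state-space update, and with tanh it can approximate nonlinear circuit behavior.

```python
import math

def rnn_step(h, u, w_h, w_u, bias, act=math.tanh):
    """One RNN cell update: new state from old state h and input u.
    With act = identity this is the linear state-space recursion
    x[k+1] = w_h * x[k] + w_u * u[k]."""
    return act(w_h * h + w_u * u + bias)

def simulate(u_seq, w_h, w_u, bias=0.0, act=math.tanh):
    """Roll the cell over an input waveform; read out the state directly."""
    h, ys = 0.0, []
    for u in u_seq:
        h = rnn_step(h, u, w_h, w_u, bias, act)
        ys.append(h)
    return ys
```

For example, `simulate([1, 1, 1], 0.5, 1.0, act=lambda z: z)` reproduces the linear recursion h ← 0.5·h + u step by step, giving [1.0, 1.5, 1.75]; a trained RNN model of a buffer chip does the same kind of cheap state propagation in place of a full transistor-level simulation.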
Physical vs. Empirical Models
One of Rosenbaum’s main takeaways was that, even though machine learning offers clear benefits to chip design, there’s also a time and place for it. “Let’s acknowledge that physical models have value,” said Rosenbaum. “Remember, the empirical model is always only going to be an approximation of the true input/output function. You shouldn’t use an empirical model if you can easily derive the physical model on your own.”
Clearly, machine learning brings benefits to the chip design process, as well as to ICs themselves. Maxim certainly understands the value—many of its technologies apply machine learning techniques. The company is even hosting a machine learning competition for its employees.