Noise wave modeling of microwave transistors based on neural networks

The noise modeling of microwave FETs based on the noise-wave representation of the transistor intrinsic circuit is considered. Frequency-dependent noise-wave temperatures are introduced as empirical model parameters and modeled by neural networks. In this way, online optimization in a circuit simulator is replaced by offline training of neural networks. An example of noise modeling for a specific transistor is presented. © 2004 Wiley Periodicals, Inc. Microwave Opt Technol Lett 41: 294–297, 2004; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/mop.20120
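The central idea, offline training of a neural network that maps frequency to noise-wave temperatures, can be illustrated with a minimal sketch. The network architecture, training data, and temperature curves below are illustrative assumptions, not the model or measured data from the paper; a small one-hidden-layer network is trained on synthetic frequency-dependent temperatures `Ta` and `Tb`:

```python
import numpy as np

# Hypothetical sketch: offline training of a small neural network that maps
# normalized frequency to two noise-wave temperatures (Ta, Tb).
# The temperature curves below are synthetic placeholders, not measured data.

rng = np.random.default_rng(0)

# Synthetic training set: frequency in GHz, normalized to [0, 1]
f = np.linspace(1.0, 18.0, 40)[:, None]
x = (f - f.min()) / (f.max() - f.min())
# Assumed smooth frequency dependence of the temperatures (kelvin)
y = np.hstack([300 + 40 * x + 15 * np.sin(3 * x),
               500 + 80 * x ** 2])
y_scale = y.max(axis=0)
t = y / y_scale  # scale targets to ~[0, 1] for stable training

# One hidden layer with tanh activation, two linear outputs
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.1
_, p0 = forward(x)
loss0 = np.mean((p0 - t) ** 2)  # loss before training

# Full-batch gradient descent on the mean-squared error
for _ in range(5000):
    h, p = forward(x)
    g = 2 * (p - t) / len(x)       # dL/dp for MSE
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)   # backprop through tanh
    gW1 = x.T @ gh; gb1 = gh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, p = forward(x)
loss = np.mean((p - t) ** 2)
print(f"MSE before: {loss0:.4f}  after: {loss:.6f}")
```

Once trained, evaluating the network in a circuit simulator is a cheap feed-forward pass, which is what moves the optimization cost from online simulation to offline training.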