CCD on-chip amplifiers: noise performance versus MOS transistor dimensions

The effect of changes in the channel width, the channel length, and the bias current of the detection-node MOS transistors in charge-coupled-device (CCD) on-chip amplifiers is studied. A novel approach to noise optimization is presented, and criteria for choosing the optimum gate dimensions are established both in theory and in practice. A new parameter, the noise electron density (in electrons²/Hz), is found to be a more suitable parameter for characterizing noise performance. It is shown that, in a well-designed CCD on-chip amplifier, the noise electron density is determined solely by the product of the equivalent gate noise of the detection-node MOS transistor and the total capacitance C₁ of the detection node. The noise performance is very insensitive to a change in channel width by a factor of two from the optimum value, but it is sensitive to changes in channel length and bias current. The optimum is valid for every type of signal processing.
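As a hedged illustration of the relation above (a sketch, not taken verbatim from the paper: the symbols S_v for the equivalent gate noise voltage spectral density and q for the electron charge are assumptions), one electron on the detection node produces a voltage step q/C₁, so a gate-referred noise voltage maps to an equivalent number of electrons:

% Sketch: converting equivalent gate noise to noise electron density.
% Assumed symbols (not stated in the abstract): S_v(f) = equivalent gate
% noise voltage spectral density (V^2/Hz) of the detection-node MOS
% transistor; C_1 = total detection-node capacitance; q = electron charge.
\[
  N_e(f) \;=\; \left(\frac{C_1}{q}\right)^{\!2} S_v(f)
  \qquad \left[\text{electrons}^2/\text{Hz}\right]
\]
% Equivalently, in amplitude form with v_n(f) = \sqrt{S_v(f)} in
% V/\sqrt{Hz}:  n_e(f) = (C_1/q)\, v_n(f)  in electrons/\sqrt{Hz},
% i.e. the product of the equivalent gate noise and C_1 referred to above.

Under these assumptions the sketch also makes the design trade-off visible: lowering either the transistor's gate-referred noise or the detection-node capacitance C₁ reduces the noise electron density directly.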