In today's high-speed designs, equalization (EQ) is a key enabler for interfaces such as PCIe and DDR to achieve higher data rates, lower power consumption, and smaller form factors. Because a system may contain multiple equalizers, each with very fine granularity, efficiently finding the optimal EQ setting is a challenging problem for signal integrity engineers even for a linear time-invariant (LTI) system. Moreover, the circuit implementation of an equalizer such as a continuous-time linear equalizer (CTLE) can be highly nonlinear, and the strong push for low-power design has inevitably introduced nonlinearity into drivers as well. Without the capability to simulate nonlinear elements and to perform EQ optimization efficiently, designers have been forced to make LTI assumptions in their simulations and to add guardband, at extra cost, to cover the uncertainty in the simulation predictions. This practice has eliminated many potentially valuable designs and added cost and manufacturing complexity to existing ones. In this paper, we propose a new EQ optimization algorithm under the framework of the algorithm developed in [1]. It avoids brute-force simulation of all EQ settings while preserving nonlinear effects. The proposed method has been validated on a DDR5 example.
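To illustrate why a brute-force sweep of all EQ settings quickly becomes impractical, the minimal sketch below counts the simulations an exhaustive search would require. The knob names and code ranges are hypothetical assumptions for illustration only, not values from this paper or from any specific standard:

```python
# Hypothetical EQ knobs for illustration only: a TX FFE with pre/post-cursor
# taps, an RX CTLE peaking code, and one DFE tap. The code ranges below are
# assumptions for this sketch, not values from the paper.
eq_knobs = {
    "tx_ffe_pre":   16,  # pre-cursor tap codes
    "tx_ffe_post":  16,  # post-cursor tap codes
    "rx_ctle_peak": 32,  # CTLE peaking codes
    "rx_dfe_tap1":  32,  # DFE tap-1 codes
}

# A brute-force search must run one full channel simulation per combination
# of knob codes, so the total count is the product of the range sizes.
n_settings = 1
for n_codes in eq_knobs.values():
    n_settings *= n_codes

print(n_settings)  # 16 * 16 * 32 * 32 = 262144 simulations
```

Even at these modest (assumed) granularities, over a quarter-million transient simulations would be needed, which is why an optimization algorithm that avoids enumerating every setting is valuable.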