Resource Optimization for Circuit Simulation using Machine Learning

Memory and central processing units (CPUs) are the primary computing resources for any circuit simulation job. The speed, efficiency, and performance of these jobs depend on how the resources in the server farm are leveraged and optimized. However, depending on the user's selection as well as the simulator's architecture, these resources may be over- or underutilized: a starved job can become stuck and eventually die, while an over-provisioned job occupies more than its fair share and impacts other jobs in the queue, since only finite resources are available in the farm. This paper proposes an approach that leverages machine learning techniques to forecast and estimate the optimal allocation of memory and computing resources for circuit simulation jobs.
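To make the forecasting idea concrete, the sketch below fits a regression model that predicts a job's peak memory from job-level features and pads the prediction with a safety margin to form the memory request. The feature names (circuit elements, time steps, process corners), the synthetic data, and the 20% headroom factor are illustrative assumptions, not values from the paper; a plain least-squares fit stands in for whatever model the approach would actually use.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Assumed job features: netlist size, transient steps, process corners
X = np.column_stack([
    rng.integers(1_000, 1_000_000, n),   # circuit elements
    rng.integers(100, 100_000, n),       # simulation time steps
    rng.integers(1, 64, n),              # process corners
])

# Synthetic "peak memory" in GB, loosely tied to the features plus noise
y = (0.5 + 2e-6 * X[:, 0] + 1e-5 * X[:, 1]
     + 0.05 * X[:, 2] + rng.normal(0, 0.2, n))

# Fit a linear model with an intercept via least squares
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

# Goodness of fit (coefficient of determination)
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)

# Predicted peak plus headroom becomes the memory request,
# trading a little waste for protection against out-of-memory kills
requests = pred * 1.2
print(round(r2, 2))
```

In practice the same pattern would apply to CPU-count estimation, with the margin tuned against the observed cost of a kill versus the cost of idle reservation.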