Assessment of Voltage Dip Staging for Low Voltage Systems

The voltage dip is one of the power quality problems that has affected distribution networks since the beginning of the last decade. Although dips are attributed to a wide range of causes, short circuits are their most common source. This paper presents a case study investigating voltage dips in a low voltage (LV) distribution network. Changing the fault position along adjacent LV feeders causes a systematic variation in the voltage dip level measured at different busbars. The concept of relative dip is introduced. The impact of load type, including its contribution to the short-circuit current and to the voltage dip, is discussed. The simulation results show that the dip impact on equipment can be classified according to fault location. The paper also addresses the need to correlate dip characteristics with protection system behaviour in order to improve the voltage profile and prevent severe dips.
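
As a point of reference not drawn from the paper itself, the dependence of dip level on fault position is commonly explained with the voltage-divider model of a radial system, in which the retained voltage at the monitored busbar during the fault is

V_{\text{dip}} = \frac{Z_F}{Z_S + Z_F}\, E

where E is the pre-fault voltage, Z_S the source impedance seen from the busbar, and Z_F the impedance between the busbar and the fault; the symbols follow common power quality notation rather than the paper's own. Faults closer to the monitored busbar (smaller Z_F) give a lower retained voltage and hence a deeper dip, which is consistent with the systematic variation with fault position described in the abstract.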