As a variant of the finite mixture model (FMM), the finite inverted Dirichlet mixture model (IDMM) faces the same conventional challenges, such as selecting the appropriate number of mixture components for the observed data. To ease these issues, we propose a variational inference framework for learning the IDMM, which has proved to be an efficient tool for modeling vectors with positive elements. Compared with the conventional expectation-maximization (EM) algorithm commonly used for learning FMMs, the proposed approach effectively prevents over-fitting. Furthermore, it determines the number of mixture components and estimates the model parameters simultaneously. Experimental results on both synthetic data and a real-world object detection task confirm that the proposed method achieves significant improvements in flexibility and efficiency.
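For orientation, the density underlying such a model can be written in a standard form; the notation below (number of components $M$, mixing weights $\pi_j$, and component parameters $\vec{\alpha}_j$) is illustrative and not necessarily the paper's own symbols:
\[
p(\vec{X} \mid \vec{\pi}, \vec{\alpha}) = \sum_{j=1}^{M} \pi_j \, \mathrm{iDir}(\vec{X} \mid \vec{\alpha}_j),
\]
where each component is the inverted Dirichlet density over a $D$-dimensional vector with positive elements:
\[
\mathrm{iDir}(\vec{X} \mid \vec{\alpha}) = \frac{\Gamma\!\left(\sum_{d=1}^{D+1}\alpha_d\right)}{\prod_{d=1}^{D+1}\Gamma(\alpha_d)} \prod_{d=1}^{D} X_d^{\alpha_d - 1} \left(1 + \sum_{d=1}^{D} X_d\right)^{-\sum_{d=1}^{D+1}\alpha_d}, \qquad X_d > 0.
\]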