Finite mixture models based on the symmetric Gaussian distribution have been applied widely in data analysis. However, not all data encountered in real-world applications can safely be assumed to follow a symmetric Gaussian form. This paper presents a new mixture model for analyzing positive non-Gaussian data that includes the inverted Beta mixture model (IBMM) as a special case. The advantage of the proposed model is that the number of mixture components, and hence of model parameters, is not fixed in advance and can in principle be infinite. Consequently, the model complexity adapts to the size of the data. On the basis of the recently proposed extended variational inference (EVI) framework, we derive a closed-form solution for approximating the posterior distributions. The performance and effectiveness of the proposed model are demonstrated on real data from two challenging applications, namely image classification and object detection.
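For reference only, and not in the paper's own notation, a minimal sketch of the component form underlying the IBMM is the standard univariate inverted Beta (Beta prime) density with generic shape parameters $u_m, v_m$ and mixing weights $\pi_m$; the proposed model differs from this finite sketch in that the number of components $M$ need not be fixed in advance:
\begin{equation*}
  % Minimal sketch (generic notation, not the paper's): finite inverted Beta mixture
  p(x) \;=\; \sum_{m=1}^{M} \pi_m \,
  \frac{\Gamma(u_m + v_m)}{\Gamma(u_m)\,\Gamma(v_m)}\,
  x^{\,u_m - 1}\,(1 + x)^{-(u_m + v_m)},
  \qquad x > 0,\quad \sum_{m=1}^{M} \pi_m = 1 .
\end{equation*}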