Convolutional Neural Network (CNN) models achieve dominant performance across a wide range of domains, and CNN architectures come in numerous topologies of varying sizes. The central challenge in this field is to design the right CNN architecture for a specific problem, attaining high accuracy while keeping the computing resources needed for training low. Our proposed automatic design approach addresses this challenge. The suggested framework is based on a genetic algorithm that evolves a population of CNN models to find the best-fitted architecture. Unlike related work, our method focuses on generating lightweight architectures, characterized by a low number of parameters, while preserving high validation accuracy through an ensemble learning technique. The framework is designed to run on limited computing resources, making it feasible to deploy in a variety of environments. Four popular benchmark image datasets are used to validate the proposed framework, and it is compared with peer competitors based on various methods in terms of validation accuracy, number of model parameters, number of Graphics Processing Units (GPUs) used, and GPU days taken to complete the search. Our experimental results show superiority mainly in terms of GPU days, with relatively high validation accuracy and a low number of parameters in the discovered models.
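The evolutionary loop described above can be sketched in a minimal, hypothetical form. In this sketch each genome encodes a CNN as a list of convolutional-layer widths; the fitness function is a placeholder that rewards capacity while penalizing parameter count, standing in for the real framework's combination of ensemble validation accuracy and a lightweight-architecture objective. All names, layer choices, and constants here are illustrative assumptions, not the paper's actual encoding.

```python
import random

random.seed(0)

LAYER_CHOICES = [16, 32, 64, 128]   # candidate filter counts per conv layer (assumed)
MAX_DEPTH = 6                        # assumed upper bound on network depth

def random_genome():
    """Sample a random architecture encoding."""
    depth = random.randint(2, MAX_DEPTH)
    return [random.choice(LAYER_CHOICES) for _ in range(depth)]

def param_count(genome):
    """Rough 3x3-conv parameter estimate (placeholder for building a real model)."""
    total, prev = 0, 3               # 3 input channels
    for width in genome:
        total += prev * width * 9 + width   # weights + biases
        prev = width
    return total

def fitness(genome):
    # Placeholder objective: reward capacity, penalize parameter count.
    # The real framework would train the candidate and use validation accuracy.
    acc_proxy = sum(genome) / (MAX_DEPTH * max(LAYER_CHOICES))
    return acc_proxy - 1e-6 * param_count(genome)

def crossover(a, b):
    """One-point crossover between two parent encodings."""
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.2):
    """Randomly resample each layer width with probability `rate`."""
    return [random.choice(LAYER_CHOICES) if random.random() < rate else w
            for w in genome]

def evolve(pop_size=20, generations=10):
    """Evolve a population and return the best-fitted architecture found."""
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
print(best, param_count(best))
```

The parameter-count penalty in the fitness function is what steers the search toward lightweight models: two candidates with similar accuracy proxies are ranked by how few parameters they use.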