This study presents a deep learning method to model turbulent flow and predict its velocity in the following period. The data are datasets extracted from turbulent flow simulated in the laboratory, with Taylor microscale Reynolds numbers in the range of 90 < Rλ < …. Across the two trained models and two distinct training-data ratios, the prediction error and the R² score lie in the ranges 0.001–0.002 and 0.9839–0.9873, respectively. Using GPUs speeds up the LSTM considerably compared with running without GPUs.
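Because the abstract only names the approach, the sketch below shows one minimal way such a next-step velocity forecaster could look: a single-layer LSTM trained on sliding windows of a (here synthetic) velocity signal, split by a training-data ratio and scored with MSE and R², and moved to a GPU when one is available. Everything in it (the synthetic series, the window length of 32, the 70/30 split, and the hyperparameters) is an illustrative assumption, not the authors' implementation.

```python
# Minimal, hypothetical sketch (not the paper's code): train an LSTM to
# predict the next velocity sample from a short window of past samples.
import numpy as np
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for a measured velocity time series (assumption: a 1-D signal).
rng = np.random.default_rng(0)
velocity = np.cumsum(rng.normal(size=5000)).astype(np.float32)
velocity = (velocity - velocity.mean()) / velocity.std()

window = 32  # past samples used to predict the next one (assumed value)
X = np.stack([velocity[i:i + window] for i in range(len(velocity) - window)])
y = velocity[window:]

split = int(0.7 * len(X))  # one possible "training data ratio" (assumed 70/30)
X_train = torch.from_numpy(X[:split]).unsqueeze(-1).to(device)
y_train = torch.from_numpy(y[:split]).to(device)
X_test = torch.from_numpy(X[split:]).unsqueeze(-1).to(device)
y_test = torch.from_numpy(y[split:]).to(device)

class VelocityLSTM(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)                      # (batch, window, hidden)
        return self.head(out[:, -1]).squeeze(-1)   # last step -> next value

model = VelocityLSTM().to(device)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):  # short full-batch run, purely for illustration
    model.train()
    opt.zero_grad()
    loss = loss_fn(model(X_train), y_train)
    loss.backward()
    opt.step()

# Evaluate with MSE and the R² score, the kinds of metrics the abstract cites.
model.eval()
with torch.no_grad():
    pred = model(X_test)
    mse = loss_fn(pred, y_test).item()
    ss_res = torch.sum((y_test - pred) ** 2)
    ss_tot = torch.sum((y_test - y_test.mean()) ** 2)
    r2 = (1 - ss_res / ss_tot).item()
print(f"MSE={mse:.4f}  R2={r2:.4f}")
```

Placing the model and tensors on `device` is the only change needed to switch between CPU and GPU execution, which is where the GPU speedup noted above would come from in a setup like this.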