This review is the first to offer a systematic presentation of a new approach to the study of queueing systems and networks. The approach combines traditional methods of queueing theory with machine learning algorithms. Its applicability is described and justified in detail using the example of combining simulation with artificial neural networks. The analysis of the published work leads us to conclude that machine learning methods are highly effective and promising for further research, and that this approach may warrant recognition as an independent direction for solving complex problems of queueing theory.
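As a purely illustrative sketch of the kind of combination discussed (not an implementation drawn from any surveyed work), the following assumes a discrete-event M/M/1 simulation used to generate training data for a small neural-network surrogate that predicts the mean waiting time from the traffic intensity; the network architecture and all names are hypothetical:

```python
import math
import random

random.seed(1)

def simulate_mm1_wait(lam, mu, n_customers=4000):
    """Estimate the mean waiting time in an M/M/1 queue via Lindley's recursion."""
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        a = random.expovariate(lam)  # interarrival time
        s = random.expovariate(mu)   # service time
        w = max(0.0, w + s - a)      # Lindley: W_{n+1} = max(0, W_n + S_n - A_{n+1})
        total += w
    return total / n_customers

# Training data: traffic intensity rho -> simulated mean wait (service rate mu = 1).
rhos = [0.1 + 0.05 * i for i in range(13)]  # 0.10 .. 0.70
data = [(r, simulate_mm1_wait(r, 1.0)) for r in rhos]

# A minimal one-hidden-layer network trained by plain SGD; it stands in for
# whatever machine learning model a particular study would actually use.
H = 8
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    return sum(w2[i] * h[i] for i in range(H)) + b2, h

losses = []
lr = 0.02
for epoch in range(2000):
    sq = 0.0
    for x, y in data:
        pred, h = forward(x)
        err = pred - y
        sq += err * err
        for i in range(H):
            grad_h = err * w2[i] * (1.0 - h[i] ** 2)  # backprop through tanh
            w2[i] -= lr * err * h[i]
            w1[i] -= lr * grad_h * x
            b1[i] -= lr * grad_h
        b2 -= lr * err
    losses.append(sq / len(data))

pred_mid, _ = forward(0.5)  # surrogate estimate of the mean wait at rho = 0.5
```

With mu = 1 the analytic mean wait is rho/(1 - rho), so the surrogate can be checked against theory; the practical appeal is that, once trained, the network replaces many expensive simulation runs with a near-instant prediction.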