The aim of this research is to derive new, more effective equations for adjusting the parameters of a neural network that offer an advantage over the formulas of the backpropagation method. To this end, the objective function of neural network training is modified by expanding the abstract ideal output of each network layer into a Taylor series. The expansion is performed at a point close to the current output of the layer under consideration. The gradients of the modified objective function are computed with respect to the difference between the ideal and actual values of the network parameters. As a result, the obtained weight and bias adjustment equations for each layer depend not only on the output of the previous layer but also on its derivatives. Compared to the backpropagation method, they reduce the offline training time of the neural network. The derived formulas reduce to those of the backpropagation method, which confirms the correctness of the derivation. The efficiency of the new training equations is demonstrated on the MNIST handwritten character recognition task. For multilayer structures with different activation functions and numbers of layers, the new formulas achieve the same recognition accuracy as conventional backpropagation, but faster.
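For illustration only, one possible first-order form of such an expansion is sketched below. The notation is assumed here rather than taken from the paper: $y^{(l)} = f\!\left(z^{(l)}\right)$ with $z^{(l)} = W^{(l)} y^{(l-1)} + b^{(l)}$ denotes the actual output of layer $l$, while $\hat{y}^{(l)}$, $\hat{W}^{(l)}$, $\hat{b}^{(l)}$ denote the ideal output and parameters, and $f'$ is the activation derivative. This is a sketch of the general idea, not the paper's exact derivation:

\begin{align}
  z^{(l)} &= W^{(l)} y^{(l-1)} + b^{(l)}, \qquad y^{(l)} = f\!\left(z^{(l)}\right), \\
  \hat{y}^{(l)} &\approx y^{(l)} + f'\!\left(z^{(l)}\right) \odot
      \left[\left(\hat{W}^{(l)} - W^{(l)}\right) y^{(l-1)}
            + \left(\hat{b}^{(l)} - b^{(l)}\right)\right].
\end{align}

Under this approximation, the gradient of $\tfrac{1}{2}\bigl\lVert \hat{y}^{(l)} - y^{(l)} \bigr\rVert^{2}$ with respect to the parameter differences $\hat{W}^{(l)} - W^{(l)}$ and $\hat{b}^{(l)} - b^{(l)}$ contains both the previous-layer output $y^{(l-1)}$ and the derivative term $f'\!\left(z^{(l)}\right)$, which is consistent with the dependence of the adjustment equations described above.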