A minimization problem for the mathematical expectation of a convex loss function over a given convex compact set X ⊂ R^N is treated. It is assumed that the oracle sequentially returns stochastic subgradients of the loss function at the current points, with a uniformly bounded second moment. The aim is to modify the well-known mirror descent method, proposed by A.S. Nemirovsky and D.B. Yudin in 1979 as an extension of the standard gradient method. First, the idea of the new so-called method of Inertial Mirror Descent (IMD) is demonstrated on the example of a deterministic optimization problem in R^N with continuous time. In particular, in the Euclidean case the heavy-ball method is recovered; it is noted that the new method uses no additional averaging of points. Then, a discrete IMD algorithm is described, and an upper bound on the objective-function error (i.e., on the difference between the current mean losses and their minimum) is proved.
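As a rough illustration of the inertial idea in the Euclidean continuous-time case (all notation below is an assumption made for this sketch and is not taken from the paper): a heavy-ball-type method replaces the first-order gradient flow

\[ \dot{x}(t) = -\gamma(t)\,\nabla f\bigl(x(t)\bigr) \]

by a second-order dynamics with damping,

\[ \ddot{x}(t) + a(t)\,\dot{x}(t) + b(t)\,\nabla f\bigl(x(t)\bigr) = 0, \]

where f denotes the objective and a(t), b(t) > 0 are illustrative damping and gain coefficients. The inertia built into the trajectory plays a role similar to that of averaging the iterates in standard mirror descent, which is consistent with the remark above that IMD requires no additional averaging of points.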