The problem of identifying the input-output mapping of multidimensional discrete-time stochastic systems using an information-theoretic criterion is considered. As the information-theoretic measure of dependence underlying the statistical linearization criterion, a measure based on the symmetric Kullback-Leibler divergence is proposed, in contrast to the usual Kullback-Leibler divergence, which is known to be asymmetric. A measure of stochastic dependence is constructed from this symmetric divergence, and the statistical linearization criterion requires pairwise coincidence of this dependence measure computed, on the one hand, for each component of the output vector of the system and each component of its input vector and, on the other hand, for the corresponding components of the output vector of the linearized model and the input vector of the system. This approach yields explicit analytical expressions for the weighting coefficients of the linearized model, which is at the same time fully equivalent to the nonlinear system under study in the sense that the corresponding information-theoretic characteristics coincide.
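As an illustrative sketch (the notation below is assumed for exposition and is not taken from the paper), the symmetric Kullback-Leibler divergence of two densities p and q, the induced dependence measure between an input component u_i and an output component y_j, and the resulting linearization condition can be written as

\[
J(p, q) \;=\; D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, p)
\;=\; \int \bigl(p(x) - q(x)\bigr)\,\ln\frac{p(x)}{q(x)}\,dx,
\]
\[
I_J(u_i, y_j) \;=\; J\bigl(p_{u_i y_j},\; p_{u_i}\, p_{y_j}\bigr),
\qquad
I_J(u_i, y_j) \;=\; I_J(u_i, \tilde{y}_j) \quad \text{for all } i, j,
\]

where \(p_{u_i y_j}\) is the joint density of the pair, \(p_{u_i} p_{y_j}\) is the product of the marginals, and \(\tilde{y}_j\) denotes the j-th output component of the linearized model.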