Within the information-theoretic approach to system identification, a divergence measure and the corresponding Hellinger-Tsallis mutual information are introduced, based on the properties of the Tsallis divergence and the Hellinger distance for a pair of probability distributions, for use in statistical linearization problems. The introduced measure plays a dual role: as mutual information, i.e. a measure of dependence between random vectors, it serves as a criterion for the statistical linearization of multidimensional stochastic systems; as a measure of divergence between probability distributions, it serves as an anisotropic norm of the input process, quantifying how well the observed data match the assumptions of the original problem statement.
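The combined Hellinger-Tsallis measure itself is specific to this work, but the two classical quantities it builds on are standard. As context, a minimal sketch of the Tsallis divergence and the Hellinger distance for discrete distributions might look as follows (the function names and the example distributions are illustrative, not from the original):

```python
import numpy as np

def tsallis_divergence(p, r, q):
    """Tsallis divergence D_q(p || r) for discrete distributions.

    D_q(p || r) = (1 / (q - 1)) * (sum_i p_i**q * r_i**(1 - q) - 1),
    which recovers the Kullback-Leibler divergence in the limit q -> 1.
    """
    p, r = np.asarray(p, dtype=float), np.asarray(r, dtype=float)
    return (np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0)

def hellinger_distance(p, r):
    """Hellinger distance H(p, r) = (1/sqrt(2)) * ||sqrt(p) - sqrt(r)||_2."""
    p, r = np.asarray(p, dtype=float), np.asarray(r, dtype=float)
    return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(r)) ** 2)) / np.sqrt(2.0)

# Two example probability distributions over three outcomes.
p = np.array([0.5, 0.3, 0.2])
r = np.array([0.4, 0.4, 0.2])

print(tsallis_divergence(p, r, q=0.5))
print(hellinger_distance(p, r))
```

Both quantities vanish when the two distributions coincide and are strictly positive otherwise, which is the property that lets them double as dependence measures when applied to a joint distribution and the product of its marginals.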