In system identification problems, information theory provides a powerful tool for constructing solutions, primarily as a basis for the final mathematical formalization of identification criteria. This orientation naturally leads to the use of corresponding measures of dependence between random variables, above all those based on mutual information. This does not, however, restrict the consideration to Shannon's approach alone, i.e., to the Shannon entropy. The present paper proposes applying the (symmetric) Tsallis mutual information as a basis for constructing an information-theoretic criterion within an input/output system identification problem. Being a consistent measure of dependence between random variables in the sense of A.N. Kolmogorov's terminology (a measure that vanishes if and only if the random variables are stochastically independent), applied here to the system and model output variables, the resulting identification criterion, on the one hand, accounts for system identifiability and, on the other hand, enables fast and efficient computations, providing a wide range of applications.
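As an illustrative sketch (not the paper's construction, whose symmetric criterion may differ), a Tsallis-based dependence measure for discrete distributions can be formed from the Tsallis entropy S_q(p) = (1 - Σ p_i^q)/(q - 1) using the non-additive combination I_q(X;Y) = S_q(X) + S_q(Y) + (1 - q) S_q(X) S_q(Y) - S_q(X,Y), which vanishes when X and Y are stochastically independent. The function names and the estimation from a joint probability table are assumptions for illustration only:

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    # Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1);
    # recovers the Shannon entropy in the limit q -> 1.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: terms with p_i = 0 contribute nothing
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_mutual_information(joint, q=2.0):
    # Illustrative non-additive Tsallis dependence measure
    #   I_q(X;Y) = S_q(X) + S_q(Y) + (1-q) S_q(X) S_q(Y) - S_q(X,Y),
    # chosen so that I_q = 0 for independent X, Y (since for a product
    # distribution S_q(X,Y) = S_q(X) + S_q(Y) + (1-q) S_q(X) S_q(Y)).
    # This is a sketch, not the paper's exact symmetric criterion.
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()       # normalize the joint table
    px = joint.sum(axis=1)            # marginal of X (rows)
    py = joint.sum(axis=0)            # marginal of Y (columns)
    sx, sy = tsallis_entropy(px, q), tsallis_entropy(py, q)
    sxy = tsallis_entropy(joint.ravel(), q)
    return sx + sy + (1.0 - q) * sx * sy - sxy

# Independent X, Y: product joint -> measure is zero
indep = np.outer([0.5, 0.5], [0.5, 0.5])
print(tsallis_mutual_information(indep, q=2.0))   # ~0.0

# Fully dependent X = Y: diagonal joint -> strictly positive
dep = np.array([[0.5, 0.0], [0.0, 0.5]])
print(tsallis_mutual_information(dep, q=2.0))     # ~0.25
```

In an identification setting, `joint` would be an empirical joint histogram of the system output and the model output; minimizing the model-parameter dependence of such a criterion (or maximizing the dependence between system and model outputs, depending on the formulation) is what makes fast histogram-based computation attractive.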