Author(s): 

Чернышев К.Р., Жарко Е.Ф.

Number of authors: 

2

Publication details

Publication type: 

Article in a journal/collection

Title: 

An Information-Theoretic Approach to System Identification with Applying Tsallis Entropy

Electronic publication: 

Yes

ISBN/ISSN: 

2405-8963

DOI: 

10.1016/j.ifacol.2018.07.124

Source title: 

IFAC-PapersOnLine

Volume designation and number: 

Vol. 51, No. 6

City: 

Amsterdam

Publisher: 

Elsevier

Year of publication: 

2018

Pages: 

24-29

Abstract

In system identification problems, information theory is a powerful tool for constructing solutions, primarily as a basis for the final mathematical formalization of identification criteria. Such an orientation naturally leads to corresponding measures of dependence between random variables, above all those based on mutual information. This does not, however, restrict the consideration to Shannon approaches alone, i.e. to Shannon entropy. The present paper proposes the (symmetric) Tsallis mutual information as the basis for constructing an information-theoretic criterion in an input/output system identification problem. Being a consistent measure of dependence of random variables in the sense of A.N. Kolmogorov's terminology (a measure that vanishes if and only if the random variables are stochastically independent), applied to the system and model output variables, the corresponding identification criterion allows, on the one hand, the identifiability of the system to be taken into account and, on the other hand, enables fast and efficient computations for solving the problem, providing a wide range of applications.
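
As background to the abstract (this sketch is not taken from the paper; the authors' specific symmetric Tsallis mutual information is constructed in the article itself), the standard Tsallis entropy of order q and one common Tsallis-divergence-based dependence measure can be written in LaTeX notation as

  S_q(X) = \frac{1}{q-1}\Bigl(1 - \sum_x p_X(x)^q\Bigr), \qquad q > 0,\ q \neq 1,

  I_q(X;Y) = D_q\bigl(p_{XY}\,\|\,p_X p_Y\bigr)
           = \frac{1}{q-1}\Bigl(\sum_{x,y} p_{XY}(x,y)^q \bigl(p_X(x)\,p_Y(y)\bigr)^{1-q} - 1\Bigr) \;\ge\; 0.

For q > 0, q \neq 1, the quantity I_q(X;Y) vanishes if and only if p_{XY} = p_X p_Y, i.e. if and only if X and Y are stochastically independent, which is exactly the consistency property in Kolmogorov's sense mentioned in the abstract; both expressions reduce to the Shannon entropy and Shannon mutual information as q \to 1.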

Bibliographic reference: 

Чернышев К.Р., Жарко Е.Ф. An Information-Theoretic Approach to System Identification with Applying Tsallis Entropy // IFAC-PapersOnLine. 2018. Vol. 51, No. 6. P. 24-29.