This paper presents a matrix-inequality approach to the fault estimation problem for discrete-time linear systems affected by random disturbances. The input disturbance is assumed to be a correlated stationary Gaussian sequence with bounded mean anisotropy. The quality criterion is the anisotropic norm of the system, which quantifies the gain from input disturbances with bounded mean anisotropy to the output. The aim of the paper is to derive a numerically efficient procedure for designing an anisotropic fault detection filter with a guaranteed disturbance attenuation level. A numerical example is provided.
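For orientation, the anisotropic norm mentioned above is commonly defined in the anisotropy-based control literature as a worst-case power-norm gain. The sketch below uses notation assumed for illustration (system $F$, input sequence $W$, mean anisotropy functional $\bar{\mathbf{A}}(W)$, anisotropy level $a$), not notation taken from the paper itself:

```latex
% Hedged sketch: a standard form of the anisotropic norm definition
% (notation assumed for illustration, not drawn from this paper).
\[
  \vvvert F \vvvert_{a}
  \;=\;
  \sup_{\bar{\mathbf{A}}(W)\,\le\, a}
  \frac{\|FW\|_{\mathcal{P}}}{\|W\|_{\mathcal{P}}},
\]
where $\|\cdot\|_{\mathcal{P}}$ denotes the power norm of a stationary
random sequence and the supremum is taken over stationary Gaussian
inputs $W$ whose mean anisotropy does not exceed $a$.
```

With $a = 0$ the admissible inputs reduce to white noise and the norm recovers a scaled $\mathcal{H}_2$-type gain, while as $a \to \infty$ it approaches the $\mathcal{H}_\infty$ norm, which is why this criterion interpolates between the two classical settings.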