Sample entropy (and its predecessor, approximate entropy) is a complexity measure that quantifies the rate of new information generation and is applicable to short time series; it has been widely applied to physiological signal analysis since it was proposed. However, sample entropy has two shortcomings. First, it is easily affected by non-stationary sudden noise, because the tolerance used in its calculation is set proportional to the standard deviation. Second, it is not independent of the probability distribution, so it does not purely characterize the rate of new information generation. To solve these two problems, an improved method named equiprobable symbolization sample entropy is proposed in this paper. Through equiprobable symbolization, the effects of both non-stationary sudden noise and the probability distribution are eliminated. Moreover, since the equiprobable partition is generally non-uniform, the method also breaks through the linear constraints of classic sample entropy. The method is validated on simulations of three typical noises with different time correlations and new information generation rates, and is then applied to electroencephalography (EEG) analysis. Results show that the method can discriminate between two different attention levels from EEG segments as short as 1.25 s, without removing any artifacts. The method is therefore of great significance for EEG biofeedback, where strong real-time performance is usually required.
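The abstract does not spell out the algorithm, but the core idea it describes can be sketched as follows: rank-transform the series into equally populated symbol bins (so the amplitude distribution and isolated spikes no longer matter), then compute sample entropy on the symbol sequence with exact matching. The function names, the number of symbols, and the exact-match criterion below are illustrative assumptions, not the paper's definitive implementation.

```python
import numpy as np

def equiprobable_symbols(x, n_symbols=4):
    """Map each sample to a symbol in 0..n_symbols-1 so that every
    symbol occurs (almost) equally often, via rank ordering.
    A sudden spike only shifts ranks slightly, so the symbolization
    is robust to non-stationary outliers and independent of the
    amplitude distribution (illustrative choice of n_symbols)."""
    ranks = np.argsort(np.argsort(x))        # rank 0..N-1 of each sample
    return (ranks * n_symbols) // len(x)     # equal-count (equiprobable) bins

def symbolic_sampen(x, m=2, n_symbols=4):
    """Sample entropy -ln(A/B) of the symbolized series, where B counts
    matching templates of length m and A of length m+1; the tolerance of
    classic sample entropy is replaced by exact symbol matching."""
    s = equiprobable_symbols(np.asarray(x, dtype=float), n_symbols)
    N = len(s)

    def count_matches(k):
        templates = np.array([s[i:i + k] for i in range(N - k)])
        count = 0
        for i in range(len(templates)):
            # self-matches are excluded, as in standard sample entropy
            count += np.sum(np.all(templates[i + 1:] == templates[i], axis=1))
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    if A == 0 or B == 0:
        return np.inf  # undefined for too-short or too-regular series
    return -np.log(A / B)
```

Under this sketch, a white-noise series (high new-information rate) should yield a larger value than a strongly time-correlated series such as a random walk, which is the kind of discrimination the simulated-noise validation relies on.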