Paper | 16 December 1989
Sample Estimators For Entropic Measures Of Mutual Information
R. C. McCarty
Abstract
A nonlinear, entropic measure of mutual information (statistical dependence),

I(X_1, \dots, X_n) = \int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty} \log\!\left[ \frac{f(x_1, \dots, x_n)}{\prod_{i=1}^{n} f_i(x_i)} \right] dF(x_1, \dots, x_n) \ge 0,

was proposed in 1966 by Blachman for a set of continuous random variables (X_1, \dots, X_n), -\infty < X_i < +\infty, i = 1, \dots, n, n \ge 2, with continuous joint distribution F(x_1, \dots, x_n) and marginal densities f_i(x_i).
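The integral above is the Kullback-Leibler divergence between the joint density and the product of its marginals, so it is zero exactly when the variables are independent. As a hedged illustration only (not the estimator derived in this paper), a generic plug-in sample estimate for the bivariate case can be sketched by binning the data and replacing the densities with empirical cell frequencies; the function name and binning scheme below are assumptions for the sketch:

```python
import math
import random

def mutual_information_hist(xs, ys, bins=10):
    """Histogram plug-in estimate of I(X; Y) in nats.

    A generic sketch of a sample estimator for the entropic mutual
    information integral; it is not the estimator proposed by McCarty.
    """
    n = len(xs)
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    # Map a value to its bin index, clamping the maximum into the last bin.
    def bx(v): return min(int((v - xmin) / (xmax - xmin + 1e-12) * bins), bins - 1)
    def by(v): return min(int((v - ymin) / (ymax - ymin + 1e-12) * bins), bins - 1)
    joint = {}
    px = [0] * bins
    py = [0] * bins
    for x, y in zip(xs, ys):
        i, j = bx(x), by(y)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] += 1
        py[j] += 1
    # Sum p(i,j) * log[ p(i,j) / (p(i) p(j)) ] over occupied cells;
    # this is a KL divergence, hence always nonnegative.
    mi = 0.0
    for (i, j), c in joint.items():
        pij = c / n
        mi += pij * math.log(pij / ((px[i] / n) * (py[j] / n)))
    return mi

# Synthetic check: a dependent pair should score higher than an independent one.
random.seed(0)
x = [random.gauss(0, 1) for _ in range(5000)]
y = [xi + 0.5 * random.gauss(0, 1) for xi in x]   # strongly dependent on x
z = [random.gauss(0, 1) for _ in range(5000)]     # independent of x

mi_xy = mutual_information_hist(x, y)
mi_xz = mutual_information_hist(x, z)
```

Plug-in estimators of this kind are biased upward for independent data (the estimate is strictly positive with probability one), which is one reason more careful sample estimators, such as those studied in the paper, are of interest.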
© (1989) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
R. C. McCarty, "Sample Estimators For Entropic Measures Of Mutual Information", Proc. SPIE 0977, Real-Time Signal Processing XI, (16 December 1989); https://doi.org/10.1117/12.948566
KEYWORDS: Statistical analysis, Signal processing, Head, Mathematics, Printing, Signal attenuation, Fourier transforms