MEASURING TIME-FREQUENCY INFORMATION CONTENT USING THE RÉNYI ENTROPIES

Richard G. Baraniuk (1), Patrick Flandrin (2), Augustus J. E. M. Janssen (3), and Olivier Michel (2)

(1) Department of Electrical and Computer Engineering, Rice University, P.O. Box 1892, Houston, TX 77251-1892, USA. E-mail: richb@rice.edu
(2) Laboratoire de Physique (URA 1325 CNRS), École Normale Supérieure de Lyon, 46 allée d'Italie, 69364 Lyon Cedex 07, France. E-mail: flandrin@physique.ens-lyon.fr, omichel@physique.ens-lyon.fr
(3) Philips Research Laboratories, Prof. Holstlaan 4, 5656 AA Eindhoven, The Netherlands. E-mail: janssena@natlab.research.philips.com

Submitted to IEEE Transactions on Information Theory, August 1998

Abstract: The generalized entropies of Rényi inspire new measures for estimating signal information and complexity in the time-frequency plane. When applied to a time-frequency representation (TFR) from Cohen's class or the affine class, the Rényi entropies conform closely to the notion of complexity that we use when visually inspecting time-frequency images. These measures possess several additional interesting and useful properties, such as component counting and invariance to cross-components and transformations, that make them natural for time-frequency analysis. This paper comprises a detailed study of the properties and several potential applications of the Rényi entropies, with emphasis on the mathematical foundations for quadratic TFRs. In particular, for the Wigner distribution, we establish that there exist signals for which the measures are not well defined.
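As an informal illustration of the measure discussed in the abstract, the sketch below computes the order-alpha Rényi entropy of a nonnegative TFR. The spectrogram is used here as a simple stand-in TFR (it avoids the sign issues the Wigner distribution raises), and the test signals, window length, and the choice alpha = 3 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def renyi_entropy(tfr, alpha=3):
    """Order-alpha Renyi entropy (in bits) of a nonnegative TFR,
    H_alpha = (1/(1-alpha)) * log2(sum C^alpha), with C normalized to unit sum."""
    C = np.asarray(tfr, dtype=float)
    C = C / C.sum()
    return np.log2(np.sum(C ** alpha)) / (1.0 - alpha)

def spectrogram(x, nwin=64, hop=16):
    """Magnitude-squared STFT with a Hann window (a simple nonnegative TFR)."""
    win = np.hanning(nwin)
    frames = [np.abs(np.fft.rfft(win * x[i:i + nwin])) ** 2
              for i in range(0, len(x) - nwin + 1, hop)]
    return np.array(frames).T

# Illustrative signals: one tone vs. two well-separated tones.
t = np.arange(1024) / 1024.0
one = np.cos(2 * np.pi * 100 * t)
two = one + np.cos(2 * np.pi * 300 * t)

H1 = renyi_entropy(spectrogram(one))
H2 = renyi_entropy(spectrogram(two))
```

For two ideally separated equal-energy components, the entropy rises by roughly log2(2) = 1 bit over the single-component case, which reflects the component-counting behavior mentioned in the abstract.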