Bibliography
N. Tishby, F. C. Pereira, and W. Bialek: "The Information Bottleneck Method". Proceedings of the 37th Annual Allerton Conference on Communication, Control, and Computing, Sep 1999, pp. 368–377
G. Chechik, A. Globerson, N. Tishby, and Y. Weiss: "Information Bottleneck for Gaussian Variables". Journal of Machine Learning Research 6, Jan 2005, pp. 165–188
F. Creutzig, H. Sprekeler: "Predictive Coding and the Slowness Principle: An Information-Theoretic Approach". Neural Computation 20(4), 2008, pp. 1026–1041
F. Creutzig, A. Globerson, N. Tishby: "Past-Future Information Bottleneck in Dynamical Systems". Physical Review E 79, 041925, 2009
N. Tishby, N. Slonim: "Data Clustering by Markovian Relaxation and the Information Bottleneck Method". Neural Information Processing Systems (NIPS) 2000, pp. 640–646
B. W. Silverman: "Density Estimation for Statistics and Data Analysis". Chapman and Hall, 1986
N. Slonim, N. Tishby: "Document Clustering Using Word Clusters via the Information Bottleneck Method". SIGIR 2000, pp. 208–215
Y. Weiss: "Segmentation Using Eigenvectors: A Unifying View". Proceedings of the IEEE International Conference on Computer Vision, 1999, pp. 975–982
D. J. Miller, A. V. Rao, K. Rose, A. Gersho: "An Information-Theoretic Learning Algorithm for Neural Network Classification". NIPS 1995, pp. 591–597
P. Harremoës and N. Tishby: "The Information Bottleneck Revisited or How to Choose a Good Distortion Measure". Proceedings of the IEEE International Symposium on Information Theory (ISIT), 2007