Matrix Chernoff Bound
Rudolf Ahlswede and Andreas Winter introduced a Chernoff bound for matrix-valued random variables (Ahlswede & Winter 2003).
If $M$ is distributed according to some distribution over $d \times d$ matrices with zero mean, and if $M_1, \ldots, M_t$ are independent copies of $M$, then for any $\varepsilon > 0$,

$$\Pr\left( \left\| \frac{1}{t} \sum_{i=1}^{t} M_i \right\| > \varepsilon \right) \le d \, \exp\left( -C \frac{\varepsilon^2 t}{\gamma^2} \right),$$

where $\|M\| \le \gamma$ holds almost surely and $C > 0$ is an absolute constant.
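Rearranging the bound, making the failure probability at most $\delta$ requires $t \ge (\gamma^2 / (C\varepsilon^2)) \ln(d/\delta)$ samples. The sketch below computes this sample count; the value $C = 1$ is an illustrative placeholder, since the bound only asserts that some absolute constant exists:

```python
import math

def samples_needed(eps, gamma, d, delta, C=1.0):
    """Smallest t such that d * exp(-C * eps**2 * t / gamma**2) <= delta.

    C stands in for the absolute constant in the bound; C = 1.0 is an
    illustrative placeholder, not the constant's actual value.
    """
    return math.ceil(gamma**2 / (C * eps**2) * math.log(d / delta))

# Doubling the dimension d adds only about (gamma/eps)**2 * ln(2) extra
# samples, reflecting the logarithmic dependence on d.
t1 = samples_needed(eps=0.1, gamma=1.0, d=100, delta=0.01)
t2 = samples_needed(eps=0.1, gamma=1.0, d=200, delta=0.01)
```

Here doubling $d$ from 100 to 200 raises the sample count only additively, not multiplicatively.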
Notice that the number of samples in the inequality depends logarithmically on $d$. In general, unfortunately, such a dependency is inevitable: take for example a diagonal random sign matrix of dimension $d \times d$. The operator norm of the sum of $t$ independent samples is precisely the maximum deviation among $d$ independent random walks of length $t$. In order to achieve a fixed bound on the maximum deviation with constant probability, it is easy to see that $t$ should grow logarithmically with $d$ in this scenario.
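The random-sign example can be simulated directly: since every sample is diagonal, only the diagonal of the running sum needs to be stored, and its entries are exactly $d$ independent $\pm 1$ random walks. A minimal sketch (the dimension, sample count, and seed are arbitrary illustrative choices):

```python
import random

random.seed(1)
d, t = 32, 500  # illustrative dimension and number of samples

# Draw t independent diagonal random-sign matrices and accumulate their sum.
# Off-diagonal entries are always zero, so only the diagonal is tracked.
diag_sum = [0] * d
for _ in range(t):
    signs = [random.choice((-1, 1)) for _ in range(d)]  # one sample M_i
    diag_sum = [s + x for s, x in zip(diag_sum, signs)]

# For a diagonal matrix the operator (spectral) norm is the largest
# absolute diagonal entry, i.e. the maximum deviation among the d
# independent +/-1 random walks of length t on the diagonal.
op_norm = max(abs(x) for x in diag_sum)
print(op_norm)
```

Each walk position after $t$ steps has the same parity as $t$ and lies in $[-t, t]$, which the simulation respects by construction.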
The following theorem can be obtained by assuming $M$ has low rank, in order to avoid the dependency on the dimension $d$.