
entropy package: how to compute mutual information?

2 messages · Sam Steingold

#
suppose I have two factor vectors:
 x <- as.factor(c("a","b","a","c","b","c"))
 y <- as.factor(c("b","a","a","c","c","b"))
Using
 library(entropy)
I can compute their entropies:
 entropy(table(x))
[1] 1.098612
but it is not clear how to compute their mutual information directly.
I can compute the joint entropy as
  entropy(table(paste(x,y,sep="")))
[1] 1.791759
and then the mutual information is H(x) + H(y) - H(x,y) =
1.098612 + 1.098612 - 1.791759 = 0.405465

but I was wondering whether there was a better way (without creating a
fresh factor vector and a fresh factor class, both of which are
immediately discarded).
#
this can be simplified to entropy(table(x, y)): table(x, y) builds the joint contingency table directly, so no pasted factor vector is needed.
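Putting the whole thread together, here is a minimal sketch in R. It assumes the entropy package from CRAN, which (in addition to entropy()) provides mi.plugin() for computing the mutual information directly from a two-dimensional contingency table:

```r
# Mutual information of two factor vectors with the entropy package.
library(entropy)

x <- as.factor(c("a","b","a","c","b","c"))
y <- as.factor(c("b","a","a","c","c","b"))

# table(x, y) is the joint contingency table; entropy() on it
# gives the joint entropy H(x,y) without building a pasted factor.
h.xy <- entropy(table(x, y))

# Mutual information via the identity I(x;y) = H(x) + H(y) - H(x,y):
mi <- entropy(table(x)) + entropy(table(y)) - h.xy

# mi.plugin() computes the same quantity in one call:
mi.plugin(table(x, y))
```

For these vectors both routes give 2*log(3) - log(6) = log(1.5), about 0.405, matching the hand computation above.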