I am always fascinated by good programming techniques, and R contains many very good examples I have been learning from.
Since I am using some functions from the package "wmtsa", I thought I could borrow the elegant example in the documentation of the function "wavBestBasis" to compute the entropy of the wavelet transform coefficients.
Below I have pasted the excerpt from the "wmtsa" on-line documentation that implements the entropy calculation:
  ## define an entropy cost functional
  "entropy" <- function(x) {
    iz <- which(x == 0)
    z  <- -x^2 * log(x^2)
    if (length(iz))
      z[iz] <- 0
    sum(z)
  }
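For a quick numeric check of what this cost functional returns on raw (unnormalized) coefficients, here is a small Python sketch of the same formula (the helper name `wmtsa_cost` is mine; natural log, with the 0*log(0) term treated as 0, as in the R excerpt above):

```python
import math

def wmtsa_cost(x):
    """-sum(x^2 * log(x^2)), with the 0*log(0) convention of the R excerpt."""
    # Skipping zero coefficients implements z[iz] <- 0 from the R code.
    return -sum(xi**2 * math.log(xi**2) for xi in x if xi != 0)

print(wmtsa_cost(range(1, 11)))  # large negative value, approx -1552.4955
```

Note that on raw coefficients this is not a Shannon entropy in the probabilistic sense: the x^2 values are not constrained to sum to 1, and the result can be far outside [0, log n].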
To the best of my recollection, Shannon's entropy operates on probabilities, which requires a normalization step.
I have written simple code that implements Shannon's entropy:

  # Shannon's entropy
  # input: vector x
  pi <- x^2 / sum(x^2)   # normalize squared coefficients to probabilities
  -sum(pi * log(pi))     # Shannon's entropy (natural log)
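The normalization step can be checked numerically; here is a small Python sketch of this Shannon entropy (the name `shannon_entropy` is mine, natural log as in the R code):

```python
import math

def shannon_entropy(x):
    """Normalize squared coefficients to probabilities, then -sum(p * log(p))."""
    total = sum(xi**2 for xi in x)
    # Zero coefficients contribute 0 to the entropy (0 * log(0) convention).
    probs = [xi**2 / total for xi in x if xi != 0]
    return -sum(p * math.log(p) for p in probs)

print(shannon_entropy(range(1, 11)))  # approx 1.920788
```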
I have tested both entropy realizations on a simple case:

  > x <- 1:10
  > x
   [1]  1  2  3  4  5  6  7  8  9 10
  # ENTROPY FROM WMTSA EXAMPLE
  [1] 1.920788
Needless to say, when I calculate the entropy of the wavelet coefficients in order to select the best wavelet family on the basis of the "entropy" cost function, I get different results depending on which entropy implementation I use ...
Your comments and thoughts are very welcome.
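One way to reconcile the two (my observation, not from the wmtsa documentation): the definitions coincide once the coefficients are first normalized to unit energy, i.e. the wmtsa cost applied to x / sqrt(sum(x^2)) equals the normalized Shannon entropy of x. A Python sketch of that identity (both function names are mine):

```python
import math

def wmtsa_cost(x):
    # -sum(x^2 * log(x^2)), zeros skipped, as in the wmtsa example
    return -sum(xi**2 * math.log(xi**2) for xi in x if xi != 0)

def shannon_entropy(x):
    # -sum(p * log(p)) with p = x^2 / sum(x^2)
    total = sum(xi**2 for xi in x)
    return -sum((xi**2 / total) * math.log(xi**2 / total)
                for xi in x if xi != 0)

x = list(range(1, 11))
norm = math.sqrt(sum(xi**2 for xi in x))           # sqrt(385)
unit = [xi / norm for xi in x]                     # unit-energy coefficients
print(abs(wmtsa_cost(unit) - shannon_entropy(x)))  # essentially zero
```

This is because the squared entries of the unit-energy vector are exactly the probabilities p = x^2 / sum(x^2), so the two formulas compute the same sum.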
Your email has gone to hundreds of people all over the world, many of
whom read their mail over quite slow connections, especially in less
developed countries. Sending all of them half a megabyte of unwanted
mail is not very polite.
[mail content deleted]
Michael Dewey
http://www.aghmed.fsnet.co.uk