
Quick partitioning

2 messages · Anna Oganyan, Brian Ripley

Hello,
I am quite new to R, and I have one problem:
I have large d-dimensional data sets (d = 2, 3, 6, 10). I would like to
divide the d-dimensional space into n (n may be 10, but preferably a
larger number, for example 20) equally sized d-dimensional hypercubes
and count how many data points fall in each cube. Is there any way to do
this quickly, I mean, in a reasonable time? Actually, I want to get some
rough idea of the underlying densities of these data and compare them.
Thanks a lot!
Anna
On Thu, 25 Aug 2005, Anna Oganyan wrote:

How do you divide a 10-dimensional space into 10 hypercubes? You would
need at least some of the dimensions to be left undivided.

The general idea is easy: apply cut() to each dimension, so your
dimensions become factors; then table() will produce the counts. That
will be quick enough for millions of points.
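
A minimal sketch of the cut()/table() approach, with made-up uniform
data (the data, dimensionality, and number of breaks are assumptions
for illustration):

```r
set.seed(1)
d <- 3                                  # number of dimensions
n.breaks <- 10                          # intervals per dimension
x <- matrix(runif(1e5 * d), ncol = d)   # 100,000 example points

# cut() each column into n.breaks equal-width intervals,
# turning every dimension into a factor
f <- lapply(seq_len(d), function(j) cut(x[, j], breaks = n.breaks))

# table() over the list of factors gives the count in each cell
counts <- do.call(table, f)

dim(counts)   # 10 x 10 x 10 array of cell counts
sum(counts)   # 100000: every point falls in exactly one cell
```

Note that the number of cells grows as n.breaks^d, so for d = 10 a
dense table of counts quickly becomes infeasible to store; that is
why the number of divided dimensions has to be limited.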