Message-ID: <430E3E62.2040407@niss.org>
Date: 2005-08-25T21:55:46Z
From: Anna Oganyan
Subject: Quick partitioning

Hello,
I am quite new to R, and I have a problem:
I have large d-dimensional data sets (d = 2, 3, 6, 10). I would like to
divide the d-dimensional space into n (n may be 10, but ideally some larger
number, for example 20) equally sized d-dimensional hypercubes and count how
many data points fall in each cube. Is there any way to do this quickly, I
mean, in a reasonable time? Essentially, I want to get a rough idea
of the underlying densities of these data sets and compare them.
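One quick way to do this in R is to bin each coordinate separately and then count only the occupied cells, e.g. with `table()`; since for d = 10 and n = 20 the full grid (n^d cells) cannot be enumerated, storing only occupied cells keeps it feasible. A minimal sketch (the function name and the use of base-R `sweep`/`table` are illustrative choices, not anything from the original message):

```r
# Sketch: n bins per axis; each point is mapped to an integer bin index
# in every dimension, the d indices are pasted into one key, and table()
# counts points per occupied hypercube.
count_hypercubes <- function(x, n = 20) {
  x <- as.matrix(x)
  rng <- apply(x, 2, range)                      # per-column min/max
  # rescale each column to [0, 1], then take bin index 0 .. n-1
  scaled <- sweep(sweep(x, 2, rng[1, ], "-"),
                  2, rng[2, ] - rng[1, ], "/")
  idx <- floor(n * scaled)
  idx[idx == n] <- n - 1                         # points on the upper edge
  key <- apply(idx, 1, paste, collapse = ",")    # one key per point
  table(key)                                     # counts, occupied cells only
}

set.seed(1)
counts <- count_hypercubes(matrix(rnorm(3000), ncol = 3), n = 10)
head(sort(counts, decreasing = TRUE))
```

This runs in time roughly linear in the number of points, so it should be fast even for large data sets; the resulting counts can be normalized into rough density estimates for comparison across data sets.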
Thanks a lot!
Anna