marginist
Hi,
Just wondering how big a data set R can deal with, in terms of the number of observations and the number of variables (or fields).
As we all know, Excel can only handle around 65,536 rows and 256 columns, and for SAS I haven't found any documented limit so far.
What about R? Does anybody have experience with large data sets in R?
Thanks!
rtist
The problem is how much memory the data set needs, how much memory you have, and how much memory the OS allows you to allocate, not the number of rows or columns.
marginist
Thanks!
So if memory isn't a problem, then the data set should be fine in R, no matter which package you use to analyze it. Correct me if I'm wrong.
rtist
[quote]Quoting post #2 by marginist, posted 2006-12-22 02:10:
Thanks!
So if the memory isn't a problem, then the data set should be ok in R no matter what package you are using to analyze it. Correct me if I am wrong.[/quote]
Usually, memory is a problem, and the OS limit is another.
As for each individual package, it depends on the space complexity of the algorithm.
For a quick check, just simulate the same amount of data and see whether you have enough memory to hold it:
# memory.limit(4000)  # on Windows, raise R's memory limit to ~4 GB
tmp <- rnorm(65535 * 50)  # simulate a 65,535 x 50 numeric data set
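As a rough sketch of the same idea (assuming the data are stored as R numeric vectors, i.e. doubles at 8 bytes per element; the variable names here are just for illustration), you can estimate the footprint before allocating anything:

```r
# Each numeric (double) element takes 8 bytes in R.
n_rows <- 65535
n_cols <- 50
approx_mb <- n_rows * n_cols * 8 / 2^20  # about 25 MB for the raw data
approx_mb

# Confirm with an actual allocation; object.size() reports the real
# footprint, which includes a small amount of per-object overhead.
tmp <- rnorm(n_rows * n_cols)
print(object.size(tmp), units = "Mb")
```

Keep in mind that many analyses need several times the size of the raw data in working memory (copies made by model-fitting functions, intermediate matrices, and so on), so the raw footprint is only a lower bound.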
marginist
Thanks!