How to run R models on large data sets?


Suppose I have a large data set (~10 GB) and I want to run a support vector machine or a linear model on it. When I run these functions I typically get the error message: 'Error: cannot allocate vector of size 308.4 Mb'. What is the best way to deal with this? Is creating random subsets and running the models on the individual subsets a better approach?
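One way to sketch the subset idea: read the file in chunks, fit a linear model on each chunk, and average the coefficients. This is an approximation under assumptions (rows are in random order, the model is linear), not equivalent to fitting on the full data. The file name `big_data.csv` and the helper `fit_in_chunks` below are hypothetical, for illustration only, using only base R.

```r
# Sketch: fit lm() chunk by chunk and average the coefficients.
# Assumes the CSV rows are effectively in random order; otherwise
# shuffle the file first, or sample random subsets instead.
fit_in_chunks <- function(file, formula, chunk_size = 1e5) {
  header <- read.csv(file, nrows = 1)   # grab column names once
  coefs <- NULL
  n_chunks <- 0
  skip <- 1                             # skip the header line
  repeat {
    # read.csv() errors once 'skip' passes the end of the file,
    # which we use as the loop's stopping condition
    chunk <- tryCatch(
      read.csv(file, skip = skip, nrows = chunk_size,
               header = FALSE, col.names = names(header)),
      error = function(e) NULL)
    if (is.null(chunk) || nrow(chunk) == 0) break
    fit <- lm(formula, data = chunk)    # fit on this chunk only
    coefs <- if (is.null(coefs)) coef(fit) else coefs + coef(fit)
    n_chunks <- n_chunks + 1
    skip <- skip + chunk_size
  }
  coefs / n_chunks                      # average across chunks
}
```

For linear models specifically, CRAN packages such as `biglm` fit the model incrementally over chunks (via `update()`) and give the exact full-data coefficients, which is usually preferable to averaging subset fits.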

