How to run R models on large data sets?


Suppose I have a large data set (~10 GB) and want to run a support vector machine or a linear model on it. When I run these functions I typically get an error message: 'Error: cannot allocate vector of size 308.4 Mb'. What is the best way to deal with this? Is creating random subsets and running the models on the individual subsets a better approach? A sketch of both ideas follows.
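Here is a minimal sketch of the two approaches I have in mind, assuming the data live in a file "bigdata.csv" with a response column y and predictors x1 and x2 (all names hypothetical): chunk-wise fitting of the linear model with the biglm package, and training the SVM on a random subset that fits in memory.

    ## A minimal sketch, not a definitive implementation.
    library(biglm)   # chunk-wise linear models
    library(e1071)   # svm()

    chunk_size <- 100000
    con <- file("bigdata.csv", open = "r")

    ## --- Linear model: fit the first chunk, then update with later chunks ---
    chunk     <- read.csv(con, nrows = chunk_size)
    col_names <- names(chunk)
    fit <- biglm(y ~ x1 + x2, data = chunk)

    repeat {
      chunk <- tryCatch(
        read.csv(con, nrows = chunk_size, header = FALSE, col.names = col_names),
        error = function(e) NULL   # read.csv errors when no lines are left
      )
      if (is.null(chunk) || nrow(chunk) == 0) break
      fit <- update(fit, chunk)    # biglm's update() folds in the new rows
    }
    close(con)
    summary(fit)

    ## --- SVM: no streaming equivalent here, so train on a random subset ---
    ## "bigdata_sample.csv" is a hypothetical pre-drawn sample of the rows.
    dat <- read.csv("bigdata_sample.csv")
    sub <- dat[sample(nrow(dat), min(50000, nrow(dat))), ]
    fit_svm <- svm(y ~ x1 + x2, data = sub)

The linear-model part keeps only one chunk in memory at a time, so memory use stays roughly constant regardless of file size; the SVM part is the random-subset idea from the question, since standard SVM implementations scale poorly with the number of rows.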

