Error: cannot allocate vector of size 88.1 MB

These days, while running my model-training code, I kept getting the message "Error: cannot allocate vector of size 88.1MB", which only tells me that the space R tried to allocate was not available.
Here are some of the answers I looked up:
1. This is a characteristic of R, and there are several workarounds:
1) Upgrade to R 3.3.0 or above; its memory management and matrix computation are much better. Calculations that crash on R 3.2.5 run fine on R 3.3.0 and above.
2) Load one of the R disk-cache packages (search for one).
3) Add memory-cleanup commands at appropriate points in the code (see the sketch after this list).
4) Run the job in multiple threads.
5) Adding memory helps only so much: R 3.2.5 could crash a server with 44 cores and 512 GB of memory, so the code itself needs to be optimized.
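As an illustration of point 3), here is a minimal sketch of freeing memory inside a script with rm() and gc(); the large intermediate object here is hypothetical:

big_tmp <- matrix(rnorm(1e7), ncol = 100)  # hypothetical large intermediate result
result  <- colSums(big_tmp)                # keep only the small summary we need

rm(big_tmp)  # drop the large object from the workspace
gc()         # run garbage collection so R can return the memory to the system

# gc() also reports current and peak memory usage ("used" / "max used"),
# which helps locate the memory-hungry steps in a script.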
2. Sometimes adding memory chips still cannot meet the demands of a large data volume, so a parallel computing strategy is used. If the data cannot be read in one pass, the filematrix package can be used to read it from the hard disk in several chunks, although this is much slower.
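Here is a minimal sketch of the chunked-read idea, assuming the filematrix package is installed; the file name, dimensions and chunk size are made up for illustration:

library(filematrix)

# Create a file-backed matrix on disk instead of in RAM.
fm <- fm.create("big_data_fm", nrow = 1e5, ncol = 500, type = "double")

# Fill and process it in column chunks, so only one chunk is in memory at a time.
chunk <- 50
col_means <- numeric(ncol(fm))
for (start in seq(1, ncol(fm), by = chunk)) {
  cols <- start:min(start + chunk - 1, ncol(fm))
  fm[, cols] <- matrix(rnorm(nrow(fm) * length(cols)), nrow = nrow(fm))
  col_means[cols] <- colMeans(fm[, cols])
}

close(fm)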
3. Find the relevant parameter in R: there is a setting (under Preferences or similar) where the maximum memory allocation can be raised.
4. Install the package called bigmemory. It rebuilds the classes used for large data sets and is close to the state of the art for handling large data (including tens of gigabytes).
Link: https://cran.r-project.org/web/packages/bigmemory/
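A minimal sketch of the file-backed matrix idea from the bigmemory package; the dimensions and file names here are hypothetical:

library(bigmemory)

# A file-backed big.matrix keeps the data on disk and maps it in on demand,
# so the object can be larger than the available RAM.
x <- filebacked.big.matrix(nrow = 1e6, ncol = 10, type = "double",
                           backingfile = "big_x.bin",
                           descriptorfile = "big_x.desc")

# Fill one column at a time to keep the in-memory footprint small.
for (j in 1:ncol(x)) {
  x[, j] <- rnorm(nrow(x))
}

head(x[, 1])  # ordinary matrix-style indexing works

# Another R session (or a parallel worker) can attach the same data
# through the descriptor file instead of re-reading it.
y <- attach.big.matrix("big_x.desc")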
5. The bigmemory package works well. Two other options, mapReduce and RHIPE (which uses Hadoop), can also handle large data sets.
6. Guidance from a forum expert (http://bbs.pinggu.org/thread-3682816-1-1.html): "cannot allocate a vector" is the typical symptom of data that is too big to be read into memory.
There are three methods:
1. Upgrade the hardware.
2. Improve the algorithm.
3. Raise the upper limit of memory that the operating system allocates to R:
- memory.size() or memory.size(FALSE) shows the memory currently used by the workspace;
- memory.size(TRUE) shows the memory currently allocated to R;
- memory.limit() shows the current memory limit;
- object.size() shows how much memory an individual object occupies.
If the current memory limit is not sufficient, you can raise it to a newLimit with memory.limit(newLimit). Note that in 32-bit R the limit is capped at 4 GB, and a single process cannot use more than 4 GB (an address-width limit); in that case, consider switching to a 64-bit version.
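A short sketch of the calls above (note that memory.size() and memory.limit() only work in R on Windows; the new limit value is just an example):

memory.size()        # memory currently used by the R workspace, in MB
memory.size(TRUE)    # memory currently allocated to R by the OS, in MB
memory.limit()       # current memory limit, in MB

memory.limit(size = 16000)  # raise the limit to 16000 MB (increases only)

x <- rnorm(1e6)                        # example object
object.size(x)                         # size of a single object, in bytes
print(object.size(x), units = "Mb")    # the same, formatted in megabytes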
For more detail, see this article, which is very good: https://blog.csdn.net/sinat_26917383/article/details/51114265
1. http://jliblog.com/archives/276
2. http://cos.name/wp-content/uploads/2011/05/01-Li-Jian-HPC.pdf
3. R high-performance computing and parallel computing: http://cran.r-project.org/web/views/HighPerformanceComputing.html
If you run into this problem, try the corresponding solutions above; they work pretty well ~
Original post: https://www.cnblogs.com/babyfei/p/9565143.html
