I'm trying to write a very large chunk of data (the result of JSON.stringify) to a file. The write must be synchronous, because I want to save the data during the process 'exit' event. However, when using fs.writeFileSync(), Node throws with this message:
FATAL ERROR: JS Allocation failed - process out of memory
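A minimal repro of that attempt looks roughly like this (hugeObject is a placeholder for the real data):

```js
const fs = require('fs');

process.on('exit', () => {
  // JSON.stringify has to build the whole payload as one string before
  // anything is written, which is itself a very large allocation.
  fs.writeFileSync('dump.json', JSON.stringify(hugeObject));
});
```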
I've also tried fs.createWriteStream(), but it does not seem to work: that approach outputs a 0-byte file with a medium amount of data, and throws the same error with a large one.
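A simplified sketch of this kind of stream-based attempt, with hugeObject again standing in for the real data:

```js
const fs = require('fs');

process.on('exit', () => {
  const stream = fs.createWriteStream('dump.json');
  stream.write(JSON.stringify(hugeObject)); // still builds one giant string first
  stream.end();
  // Stream writes are flushed asynchronously, but 'exit' handlers only get to
  // run synchronous work, so the queued data never reliably reaches the file.
});
```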
I think what you need is a 'pump', something like http://elegantcode.com/2011/04/06/taking-baby-steps-with-node-js-pumping-data-between-streams/. That will relieve your kernel buffers so they do not have to hold all of the text at once.
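As a rough sketch of the same idea with current Node APIs: stream.pipeline does the pause/resume bookkeeping that the article shows by hand, and the generator here (chunksOf and someBigObject are placeholders) assumes you can produce the data piece by piece instead of as one big string:

```js
const fs = require('fs');
const { Readable, pipeline } = require('stream');

// Hypothetical generator: yields the serialized data piece by piece instead
// of building one giant string up front.
function* chunksOf(obj) {
  for (const [key, value] of Object.entries(obj)) {
    yield JSON.stringify([key, value]) + '\n';
  }
}

pipeline(
  Readable.from(chunksOf(someBigObject)), // someBigObject is a placeholder
  fs.createWriteStream('dump.ndjson'),
  (err) => {
    if (err) console.error('pipeline failed:', err);
  }
);
```

Note this is still asynchronous, so it has to run before the process is actually exiting; it will not complete inside the 'exit' handler itself.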
My problem was in JSON.stringify: the data was too big to fit into a single JavaScript string.
I solved the issue by serializing one property per line (so there are X calls to JSON.stringify, where X is the number of properties in my object). My deserializer rebuilds the object by adding each property back, along the lines of the sketch below.
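A minimal sketch of that approach, assuming a flat object and using placeholder names (saveSync, loadSync, dump.txt):

```js
const fs = require('fs');

function saveSync(obj, path) {
  fs.writeFileSync(path, ''); // create/truncate the file
  for (const key of Object.keys(obj)) {
    // One JSON.stringify call per property keeps each string small.
    fs.appendFileSync(path, JSON.stringify([key, obj[key]]) + '\n');
  }
}

function loadSync(path) {
  const obj = {};
  for (const line of fs.readFileSync(path, 'utf8').split('\n')) {
    if (!line) continue;
    const [key, value] = JSON.parse(line);
    obj[key] = value; // rebuild the object property by property
  }
  return obj;
}
```

Since every call here is synchronous, it can still run inside the process 'exit' handler.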