Insert a large number of entries into MongoDB with Node.js

asked by wesbos

So I have a file called pins.txt containing "pins" that I need to import into MongoDB. The data is newline-delimited:

A4DS24SD2
DF234SDF2
HFFGHFG45
JDRSDFG35
...

There are about 70,000 pins that I need to import.

Each pin also has a default used status of false.
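A minimal Mongoose model along these lines would capture that (a sketch only; the used field name is assumed):

  mongoose = require 'mongoose'

  pinSchema = new mongoose.Schema
    pinid: { type: String, required: true }
    used:  { type: Boolean, default: false }

  Pin = mongoose.model 'Pin', pinSchema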

So, to import such a huge number of pins, I've cooked this up using Node.js/Mongoose (which my app is built with):

  fs = require 'fs'

  # Pin is the Mongoose model defined elsewhere in the app
  fs.readFile './data/pins-test.txt', 'utf8', (err, data) ->
    return console.log err if err
    codes = data.split '\n'
    codes.forEach (code) ->
      pin = new Pin()
      pin.pinid = code
      pin.save()

This works great when testing with a few hundred pins, but my machine runs out of memory when I try the 70,000-pin file, giving the error:

FATAL ERROR: CALL_AND_RETRY_2 Allocation failed - process out of memory

So, my question is: what is the best way to import this much data? Should I be using async.js to process these one at a time?
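By "one at a time" I mean something along these lines, using async.eachLimit to throttle how many saves are in flight (an untested sketch; the limit of 10 is arbitrary):

  async = require 'async'
  fs = require 'fs'

  # save a single pin, reporting back to async when done
  savePin = (code, done) ->
    pin = new Pin()
    pin.pinid = code
    pin.save done

  fs.readFile './data/pins-test.txt', 'utf8', (err, data) ->
    return console.log err if err
    # drop blank lines, then keep at most 10 saves in flight instead of firing 70,000 at once
    codes = data.split('\n').filter (c) -> c.length
    async.eachLimit codes, 10, savePin, (err) ->
      if err then console.log err else console.log 'done importing pins'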

node.js mongodb express mongoose database

Answers

answered 5 years ago Alex #1

The problem with fs.readFile is that the whole file is loaded into memory before the callback is fired.

You could use the buffered-reader package to process the file line by line instead.

Something like this should work (note... untested!)

reader = require "buffered-reader"
DataReader = reader.DataReader

...

  i = 0
  new DataReader("./data/pins1.txt", { encoding: "utf8" })
    .on "error", (error) ->
        console.log "error: " + error
    .on "line", (line) ->
        console.log i, line
        i++
        # save each pin as its line is read, instead of holding the whole file in memory
        pin = new Pin()
        pin.pinid = line
        pin.save()
    .on "end", ->
        console.log "EOF"
    .read()
