Post-processing Hadoop data and storing it in MongoDB

I am using the Timothy npm package to run map-reduce jobs from Node.js. I am able to run the simple word-count example and produce its output. Can anyone help me understand how to store the output generated by Hadoop in MongoDB for post-processing?

require('timothy')
  .configure({
    hadoopHome: "/usr/local/hadoop",
    config: './hadoop.xml',
    input: "/user/loremipsum.txt",
    output: "/user/processed_" + (new Date().getTime()),
    name: "Timothy Word Count Example",
    cmdenv: "var=",
    "mapred.map.tasks": 10
  })

This is my configuration for connecting to Hadoop. The processed output is stored in HDFS under the /user directory.
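To make it concrete, this is the kind of post-processing step I have in mind (just a rough sketch, not something I have working yet): stream the job's part files back out of HDFS with the hadoop CLI and insert each word/count pair into MongoDB with the official mongodb Node.js driver. The database name ("wordcount"), collection name ("counts"), the connection URL, and the processed_* path below are placeholders, and I am assuming the streaming output is tab-separated key/value lines.

const { spawn } = require('child_process');
const readline = require('readline');
const { MongoClient } = require('mongodb');

async function loadIntoMongo(outputDir) {
  // Placeholder connection string, database and collection names.
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const counts = client.db('wordcount').collection('counts');

  // Stream every part file of the job output out of HDFS via the hadoop CLI.
  const cat = spawn('hadoop', ['fs', '-cat', outputDir + '/part-*']);
  const rl = readline.createInterface({ input: cat.stdout });

  const docs = [];
  for await (const line of rl) {
    // Hadoop streaming output is "key<TAB>value" per line.
    const [word, count] = line.split('\t');
    if (word) docs.push({ word: word, count: Number(count) });
  }

  if (docs.length) await counts.insertMany(docs);
  await client.close();
}

// Placeholder path; the real directory name contains the generated timestamp.
loadIntoMongo('/user/processed_1234567890').catch(console.error);

Is something like this the right approach, or is there a more direct way to wire the Hadoop output into MongoDB?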