I'm designing and implementing a Node.js API for accessing an IBM mainframe from Ubuntu via the IBM 3270 protocol, using the x3270 tool. The Node.js process spawns an s3270 process and uses its stdin, stdout, and stderr to communicate with the mainframe.
I've implemented the following interface:
var hs = require('./hs');
var session = hs.createSession(opts);
session.on('error', function(err) {
    console.log('ERROR: %s', err.message);
});
session.on('connect', function() {
    console.log('** connected');
    session.send('TRANS');
});
session.on('response', function(res) {
    console.log(res);
    session.disconnect();
});
session.on('disconnect', function() {
    console.log('** disconnected');
    session.close();
});
session.on('close', function() {
    console.log('** closed');
});
session.connect();
Everything is working very well.
The problem is the following. I would like to use the Q promise library so that client code using my API is better organized, and also to offer a Node.js-style API of the form session.send(trans, function(err, res) {...}). I can't figure out how to implement the send function so that it accepts a callback.
Generalizing my question: when designing a Node.js-style API, which should I implement first?

1. send(trans), which emits events, and then use it to implement send('trans', function(err, res) {...}), OR
2. send('trans', function(err, res) {...}) first (I don't know how), and then implement the events on top of it?

What I'm looking for are the general workflow and design principles for designing a Node.js-style API that can also be consumed by the Q promise library.
As I understand it, there are two approaches to designing an async API for Node.js:

1. EventEmitter
2. deferreds from the Q library: var d = Q.defer();, return d.promise;, and d.resolve();

I implemented my API with the promise-based approach using the Q library only in order to keep my code better organized. Furthermore, the Q library has functions such as Q.nfcall(), Q.nfapply(), and Q.nfbind() to convert a callback-based Node.js API into a promise-based equivalent.
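To illustrate the conversion direction: once a Node-style send(trans, cb) exists, Q can wrap it with Q.nfcall or Q.ninvoke. The sketch below spells out what that wrapping does, using a native Promise as a stand-in so it runs without Q installed; fakeSend is a hypothetical stand-in for the real session.send.

```javascript
// fakeSend simulates a Node-style callback API, standing in for
// the real session.send(trans, cb).
function fakeSend(trans, cb) {
  // Simulates the async round trip to s3270.
  process.nextTick(function() {
    cb(null, 'screen for ' + trans);
  });
}

// Roughly what Q.nfcall(fakeSend, trans) does, spelled out with a
// native Promise: resolve on (null, res), reject on (err).
function sendAsync(trans) {
  return new Promise(function(resolve, reject) {
    fakeSend(trans, function(err, res) {
      if (err) reject(err);
      else resolve(res);
    });
  });
}

// Usage: promise-style consumption of the callback-based API.
sendAsync('TRANS').then(function(res) {
  console.log(res);
});
```

This is why the Node-style callback signature (err first, result second) is worth implementing: it is the shape that Q.nfcall, Q.nfapply, and Q.nfbind expect, so promise support comes essentially for free once the callback API exists.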