I am working on a command-line application that takes an array of file names, runs transform operations on them (text files, spreadsheets, etc.; the payloads have to be rewritten as JSON objects), and sends the results to an API endpoint (api.example.com). I am considering reading the files sequentially and piping each result to an instance of the http or request module, but I have no idea where to start. Are there any alternatives, or a strategy you have used to solve a similar problem?
Any algorithm, or a pointer to an article or a similar question here on SO, would be highly appreciated. Thanks.
Update 1: I found a thread in this Google Group that may help: https://groups.google.com/forum/#!topic/nodejs/_42VJGc9xJ4
To keep track of the final solution:
var fs = require('fs');
var request = require('request');

var file = fs.createReadStream(path)
  .pipe(request.put({url: url, headers: {'Content-Length': fileSize}}, function(err, res, body) {
    if (err) {
      console.log('error', err);
    } else {
      console.log('status', res.statusCode);
      if (res.statusCode === 200) {
        console.log('success');
      }
    }
  }));
The remaining problem is how to make this work for "n" files when "n" is large, say 100 text files or more.
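One way I am considering for the many-files case is to cap how many uploads run at once, rather than firing all "n" streams simultaneously. Below is a minimal concurrency-limiter sketch, assuming each upload is wrapped as a function of the form task(callback); the limit value is arbitrary, and libraries such as async (eachLimit) offer the same idea ready-made:

```javascript
// Run an array of task(callback) functions, at most `limit` at a time.
// Calls done(err) on first error, or done(null) when all tasks finish.
function runLimited(tasks, limit, done) {
  var running = 0, index = 0, finished = 0, failed = false;
  function launch() {
    while (running < limit && index < tasks.length) {
      running++;
      tasks[index++](function (err) {
        running--;
        finished++;
        if (failed) return;
        if (err) { failed = true; return done(err); }
        if (finished === tasks.length) return done(null);
        launch();
      });
    }
  }
  if (tasks.length === 0) return done(null);
  launch();
}
```

Each upload from the snippet above would become one task, so only a handful of read streams and HTTP requests are open at any moment.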