Node.js Streams: Is there a way to convert or wrap a fs write stream to a Transform stream?


With a Node HTTP server I'm trying to pipe the request read stream to the response write stream through some intermediary transforms, one of which is a file-system write.



The pipeline looks like this, with non-pertinent code removed for simplicity:


function handler (req, res) {
  req.pipe(jsonParse())
    .pipe(addTimeStamp())
    .pipe(jsonStringify())
    .pipe(saveToFs('saved.json'))
    .pipe(res);
}



The custom Transform streams are pretty straightforward, but I have no elegant way of writing saveToFs. It looks like this:


function saveToFs (filename) {
  const write$ = fs.createWriteStream(filename);
  write$.on('open', () => console.log('opened'));
  write$.on('close', () => console.log('closed'));

  const T = new Transform();
  T._transform = function (chunk, encoding, cb) {
    write$.write(chunk);
    cb(null, chunk);
  };
  return T;
}



The idea is simply to pipe the data to the write stream and then pass it through to the response stream, but fs.createWriteStream(<file.name>) only returns a writable stream, which makes this approach difficult.





Right now this code has two problems that I can see: the write stream never fires a close event (memory leak?), and I would like the data to pass through the file system write before returning data to the response stream instead of essentially multicasting to two sinks.



Any suggestions, or pointers to fundamental things I've missed, would be greatly appreciated.




1 Answer



What you should do is save the stream returned by the .pipe() call before saveToFs, and then pipe that stream both to a file and to res.



function handler(req, res) {
  const transformed = req.pipe(jsonParse())
    .pipe(addTimeStamp())
    .pipe(jsonStringify());

  transformed.pipe(fs.createWriteStream('saved.json'));
  transformed.pipe(res);
}



To sum it up, you can pipe the same readable stream (transformed) to multiple writable streams.
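
If you specifically want a single stream that sits inside the .pipe() chain, as the question's title asks, something like the following rough sketch might work. It is not part of the approach above, just an illustration reusing the question's saveToFs/write$ names: it forwards each chunk to the file stream, waits for the write before passing the chunk on, and ends the file stream when the input ends, so 'finish'/'close' actually fire.

const fs = require('fs');
const { Transform } = require('stream');

function saveToFs (filename) {
  const write$ = fs.createWriteStream(filename);

  return new Transform({
    transform (chunk, encoding, cb) {
      // Wait for the file write before passing the chunk downstream,
      // so the file stream's backpressure is respected.
      write$.write(chunk, err => cb(err, chunk));
    },
    flush (cb) {
      // Ending the inner stream is what makes its 'finish'/'close' events fire.
      write$.end(cb);
    }
  });
}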





"And I would like the data to pass through the file system write before returning data to the response stream instead of essentially multicasting to two sinks."



Use the { end: false } option when piping to res.




transformed.pipe(res, { end: false });



And then call res.end() when the file is written or whenever you want.
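
Putting the two suggestions together, a sketch of the handler might look like this (reusing the question's transform names and 'saved.json'; file$ is just a local name for the write stream):

function handler (req, res) {
  const transformed = req.pipe(jsonParse())
    .pipe(addTimeStamp())
    .pipe(jsonStringify());

  const file$ = fs.createWriteStream('saved.json');

  transformed.pipe(file$);
  transformed.pipe(res, { end: false });

  // End the response only once the file has been fully written.
  file$.on('finish', () => res.end());
}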

