
Getting Started with Express - Using Streams

6:59 Node.js lesson by

Streams are a big part of Node's famous non-blocking I/O, and Express lets you take full advantage of them. This lesson demonstrates how to integrate streams into your server to improve performance and simplify your code.


Ram

This lesson is really informative.
Thanks !

Thomas

How do you suggest handling errors that occur while piping?

Thank you for the great videos - Thomas

Ben

Hey Thomas,

I think you'd just set up error handling the way you would any time you're using streams, and then make sure you send an appropriate error status with the response.

HTH,
Ben


One of the ways that Node implements its famous non-blocking I/O is through the use of streams. Today, we're going to look at how we can use streams in Express.

For a quick stream demo for anyone that's not familiar with them, we'll use the fs, or file system, module that's built into Node. Then we're going to reference our input file, which is just the users.json that we've been using here in the app.

And then, we're going to be writing to a file called "out.json," which you can see over here is just empty. The first thing we need to do is create a read stream, so we are going to say fs.createReadStream, and we're going to pass it that input file.

That's going to be a readable stream, meaning data can be read out of it. The next thing we'll do is create a write stream, which is a stream that can have data written to it. Now, the way that we tie these two things together is by using the pipe method.

We're going to say, "Readable, take your data and pipe it into that writable stream." Read it in, push it out. Now, if we go to the terminal and run this streams.js file, you can see that it does, in fact, populate our out.json file. We can run it again, just so you can see it's very quick, and it is transferring all of the data to that new file.

How does this apply to a server? We have this route that we defined earlier, /data/:username, where we pull in the user's data and then write it out as JSON to the browser. But if we look at this getUser function in our helpers file, we can see that it's actually using a blocking operation.

fs.readFileSync is a synchronous call, which means we don't have to define and use a callback, but it also means that nothing else can happen while that read operation is running. That's going to halt everything else in the single-threaded Node process.

And so, doing something like that in a server is usually not a good idea. We did it for convenience, but in a server that may have a lot of users connecting, that's going to reduce the amount of load that your server can handle. A better way to do this is to use streams.

We're going to go and modify this route handler: we're going to get rid of the call to that blocking function, and we are just going to create another readable stream with fs.createReadStream. In this case, we're just going to construct a path to the file, using that username that's in the URL.

Go into the users directory, take the username itself, and add the .json extension, and there we have a readable stream of that JSON file. Now, before, we were writing to an output file, but that's not what we need to do this time. This time, we need to send it back to the browser.

The way that we do that is we can actually pipe directly to the response object. Express has made the response object writable, and we can pipe to it just like we would that write stream to the file. And so, if we save that and go refresh, you can see that we still get the same data out.

We also get it nicely formatted, because it is pulling the data in raw, not changing anything: the formatting in the file is reflected in the browser. And so there, in one or two lines of code, we have a route that no longer blocks and pipes the same information to the browser.

Now, that's a very simple and straightforward use case, but you can also do some other really cool things with other stream libraries. If we go and install a package called JSONStream, that's going to let us do some more interesting things.

We will go ahead and get rid of those helpers, since we are not using them anymore, and we will require that JSONStream package that we just installed.

Then, we are going to come down here and define a new route, and we'll just call this one /users/by/:gender. Now we'll create a readable stream that is just using that users.json again. We're going to read in that file, and then we are going to pipe that file to JSONStream.

We're going to call the .parse method, and we're going to pass it a star, which just means read in everything in the file. JSONStream allows you to filter the results if you want, but we're going to read in everything from the file, and essentially, JSONStream parses things into objects before it calls your callback.

Our callback function here is going to get a user object. Each user object in that file will be passed to it. Let's go ahead and grab that gender URL parameter, and we're actually going to filter the results here by gender. We'll say, "If the user's gender is equal to the gender that was passed in the URL, then we'll go ahead and return that user."

If we don't return that user, we're essentially just filtering it out of our results. We're dealing with objects right now, so we need to turn them back into a stream of text, because that's what we need to pipe to the browser. We're going to call pipe again, and then we're going to call JSONStream.stringify.

We're going to pass in some strings here, and those are just telling JSONStream how to format the output string. And then finally, we will just pipe it back to our response, like we did before.

We now have this new route that's going to pull a URL variable. It's going to read in this file, send it through our filter function, turn the results back into a string, and then send it out to the browser.

If we go ahead and start our dev server back up and go back to the browser, we can check out this new URL. So, /users/by/female, and you can see we get all of our female users back. We'll change it to male, just to make sure that it's working, and it is, in fact, giving us back all the male users in that case.

Now, the data is not quite fitting in there, and I also want to show some more of the flexibility that this gives you. So, let's get back to the file. Maybe we're not interested in the whole user coming back, and we just want the name. If we say return user.name, then we can send that back, and we will just get the name sub-object of each user back in the browser.

And so here, in about four lines of code, we've created a dynamic, non-blocking route that allows us to parse and send back data as we wish.
