Node

User Created Playlist (8 lessons)

Manage Memory and Garbage Collection in Node.js (PRO)
Add and Remove EventEmitters in Node.js (PRO)
Scraping Dynamic JavaScript Websites with Nightmare (PRO)
Web Scraping with Pagination and Advanced Selectors (PRO)
Use Streams in Express (PRO)
Understand Node.js Buffers
Creating Demo APIs with json-server
Using npm run to launch local scripts (PRO)

Manage Memory and Garbage Collection in Node.js

8:40 node PRO

In this lesson, you will learn how to view and interpret the garbage collection activity of your Node.js app via the console. You will also learn how to take heap dump snapshots and analyze them with Chrome Developer Tools to identify possible memory leaks. A sample app with a known memory leak is provided as part of the walkthrough to illustrate how to use these tools in your own environments.
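
The lesson uses a heapdump tool together with Chrome DevTools; as a rough companion sketch (not the lesson's exact code), Node's built-in --trace-gc flag and v8.writeHeapSnapshot() cover similar ground. The file name leaky-app.js and the leak itself are invented for illustration.

// leaky-app.js -- a hypothetical app with a deliberate leak.
// Run with `node --trace-gc leaky-app.js` to print each GC pass to the console.
const v8 = require('v8');

const leaked = [];                        // references kept forever = leak
setInterval(() => {
  leaked.push(Buffer.alloc(1024 * 100));  // grow by ~100 KB per tick
  const { heapUsed } = process.memoryUsage();
  console.log('heapUsed:', Math.round(heapUsed / 1024 / 1024), 'MB');
}, 100);

// Take a snapshot on demand (e.g. `kill -USR2 <pid>` on POSIX systems) and open
// the resulting .heapsnapshot file in Chrome DevTools > Memory.
// v8.writeHeapSnapshot() requires Node 11.13 or newer.
process.on('SIGUSR2', () => {
  console.log('wrote', v8.writeHeapSnapshot());
});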

Add and Remove EventEmitters in Node.js

6:00 node PRO

In this lesson, you will learn what an EventEmitter is and how it works. We start with a simple example, creating an instance of the EventEmitter class, then expand on it by adding listeners and emitting events to trigger them. You will learn how to view the listeners attached to an emitter object, as well as how to remove them and understand what the EventEmitter memory leak warning means. We wrap everything up by examining the http server class to illustrate how Node.js uses EventEmitters in many places for core features.
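
A minimal sketch of the core EventEmitter operations described above; the event name 'order' and the listener are invented for illustration.

const EventEmitter = require('events');

const emitter = new EventEmitter();

function onOrder(item) {
  console.log('order received:', item);
}

emitter.on('order', onOrder);                 // register a listener
emitter.emit('order', 'coffee');              // -> "order received: coffee"

console.log(emitter.listeners('order'));      // inspect registered listeners
console.log(emitter.listenerCount('order'));  // 1

emitter.removeListener('order', onOrder);     // or emitter.off('order', onOrder)
emitter.emit('order', 'tea');                 // no listener left, nothing happens

// Attaching more than 10 listeners to a single event triggers the
// "possible EventEmitter memory leak detected" warning; if that is
// intentional, raise the threshold with emitter.setMaxListeners(20).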

Scraping Dynamic JavaScript Websites with Nightmare

2:43 node PRO

Many websites have more than just simple static content. Scraping dynamic content that is rendered by JavaScript requires a real browser. This video demonstrates how to use Nightmare (a wrapper around Electron) to load a URL and scrape dynamic data.
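
A minimal sketch of the approach, not the video's exact code; the target URL and selector are placeholders.

const Nightmare = require('nightmare');

const nightmare = Nightmare({ show: true }); // show: true opens the Electron window

nightmare
  .goto('https://example.com')               // placeholder URL
  .wait('h1')                                // wait for the dynamic content to render
  .evaluate(() => document.querySelector('h1').innerText)
  .end()
  .then(title => console.log('scraped:', title))
  .catch(err => console.error('scrape failed:', err));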

Web Scraping with Pagination and Advanced Selectors

3:29 node PRO

When web scraping, you'll often want to get more than just one page of data. Xray supports pagination by finding the "next" or "more" button on each page and cycling through each new page until it can no longer find that link. This lesson demonstrates how to paginate, as well as how to use more advanced selectors when links are difficult to scrape.
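
A minimal sketch of pagination with x-ray, not the lesson's exact code; the URL, scope, and selectors are invented for illustration.

const Xray = require('x-ray');
const x = Xray();

// Collect post titles and links, following the ".next" link for up to 3 pages.
x('https://blog.example.com', '.post', [{
  title: 'h2',             // text of the h2 inside each .post
  url: 'h2 a@href'         // @href pulls the attribute instead of the text
}])
  .paginate('.next@href')  // selector for the "next page" link
  .limit(3)                // stop after three pages
  ((err, posts) => {
    if (err) return console.error(err);
    console.log(posts);
  });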

Use Streams in Express

6:59 node PRO

Streams are a big part of Node's famous non-blocking I/O, and Express lets you take full advantage of them. This lesson demonstrates how to integrate streams into your server to improve performance and simplify your code.

We will look at .createReadStream, .createWriteStream, and .pipe to read from and write to streams.
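
A minimal sketch of streaming in Express routes, not the lesson's exact code; the route paths and file names are placeholders.

const express = require('express');
const fs = require('fs');

const app = express();

// Stream a large file to the client instead of buffering it all in memory.
app.get('/report', (req, res) => {
  const stream = fs.createReadStream('report.csv'); // placeholder file
  stream.on('error', err => res.status(500).end(err.message));
  stream.pipe(res); // res is a writable stream, so .pipe just works
});

// Streaming also works on the way in: pipe the request body straight to disk.
app.post('/upload', (req, res) => {
  req.pipe(fs.createWriteStream('upload.bin'))
     .on('finish', () => res.end('saved'));
});

app.listen(3000, () => console.log('listening on 3000'));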

Understand Node.js Buffers

10:30 node

In this lesson, we cover the Node.js Buffer object in detail. Not only will you learn that the Buffer object is a reference to a memory space outside of the V8 engine, but you will also learn practical methods to access it, modify it, and convert it to standard JavaScript objects that can be used by your code. Examples and discussion are provided for using the toString() method, determining the byte length of a buffer, writing to a buffer, avoiding truncated data, comparing buffers for equality using both compare() and equals(), and creating buffer views using slice().
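
A short sketch of the Buffer methods mentioned above; the values are illustrative, not taken from the lesson.

const buf = Buffer.from('Hello, Node.js');

console.log(buf);                         // <Buffer 48 65 6c 6c 6f ...>
console.log(buf.toString());              // "Hello, Node.js"
console.log(buf.toString('hex'));         // hex representation of the same bytes
console.log(Buffer.byteLength('héllo'));  // 6 - bytes, not characters

// Writing past the end of a buffer silently truncates the data.
const small = Buffer.alloc(5);
small.write('truncated!');                // only "trunc" fits
console.log(small.toString());            // "trunc"

// Comparing buffers.
const a = Buffer.from('abc');
const b = Buffer.from('abc');
console.log(a.equals(b));                 // true
console.log(Buffer.compare(a, b));        // 0 (useful for sorting)

// slice() returns a view over the same memory; copy() makes a real copy.
const view = a.slice(0, 2);
const copy = Buffer.alloc(2);
a.copy(copy, 0, 0, 2);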

Creating Demo APIs with json-server

6:02 node

json-server makes it extremely easy to set up robust JSON APIs to use for demos and proofs of concept. John walks you through the process of using pre-built JSON files for a server, and how to generate larger datasets using lodash and faker.
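
A minimal sketch of the idea, assuming the classic lodash and faker APIs from the era of this lesson; the file name, field names, and record count are made up.

// generate-db.js -- hypothetical script that writes db.json for json-server.
const fs = require('fs');
const _ = require('lodash');
const faker = require('faker');

const data = {
  users: _.times(100, i => ({
    id: i,
    name: faker.name.findName(),
    email: faker.internet.email()
  }))
};

fs.writeFileSync('db.json', JSON.stringify(data, null, 2));
// Then serve it: npx json-server --watch db.json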

Using npm run to launch local scripts

2:11 node PRO

npm run allows you to configure scripts inside your package.json file that can access locally installed node packages. If you're comfortable with this technique, you can also run grunt, gulp, or other build tools by customizing your scripts and saving them inside your package.json file. With this approach, when a developer starts a new project with your package.json, they can simply run npm install and then npm run yourscript without having to install any node packages globally.
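
A minimal package.json sketch with hypothetical script names, assuming gulp is installed as a local devDependency:

{
  "name": "my-app",
  "scripts": {
    "build": "gulp build",
    "start": "node server.js"
  },
  "devDependencies": {
    "gulp": "^3.9.0"
  }
}

npm run prepends ./node_modules/.bin to PATH while a script runs, so "gulp build" resolves to the locally installed binary; a teammate only needs npm install followed by npm run build, with no global installs.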
