What is Node.js?
An environment for running JavaScript code outside of the browser.
Built on Chrome's V8 JavaScript engine.
V8 compiles our JavaScript code to machine code.
V8 itself is written in C++.
Node.js vs JS in Browser
Browser
DOM
Window
Interactive Apps
No Filesystem
Fragmentation (many different browsers and versions to support)
ES6 Modules
Node
No DOM
No Window: Unlike the browser, we can't use the window object and its properties
Server-Side App
Filesystem
Versions (we choose which Node.js version runs our code)
CommonJS
Node.js Features
Globals in Node
GLOBALS - NO WINDOW!!!!
__dirname - path to current directory
__filename - file name
require - function to use modules (CommonJS)
module - info about current module (file)
process - info about env where the program is being executed
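A minimal sketch that logs a few of these globals (output will differ from machine to machine; the file name is hypothetical):
// globals.js
console.log(__dirname);        // absolute path to the current directory
console.log(__filename);       // absolute path to the current file
console.log(module.exports);   // what this file currently exports ({})
console.log(process.argv);     // command-line arguments the program was started with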
REPL
Read-Evaluate-Print Loop: an interactive shell for evaluating Node.js code, started by running node with no arguments.
CLI
Command Line Interface.
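A few common commands, assuming Node.js is installed and on the PATH:
node --version   # print the installed Node.js version
node app.js      # run a file (CLI)
node             # start the REPL; exit with .exit or Ctrl+C twice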
Node Modules
CommonJS, every file is a module (by default)
Modules - Encapsulated Code (only share the minimum)
If a module invokes a function at its top level, then simply requiring that module in another file will run that function, even though the importing file never calls it explicitly.
Example: node-express-course/07-mind-grenade.js
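A minimal sketch of that idea (hypothetical file names, in the spirit of 07-mind-grenade.js):
// mind-grenade.js - exports nothing, but its top-level code still runs
const num1 = 5;
const num2 = 10;
function addValues() {
  console.log(`the sum is : ${num1 + num2}`);
}
addValues(); // invoked at the top level of the module

// app.js - requiring the module is enough to trigger the log above
// require('./mind-grenade');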
💡 Note: Node.js's original module system is CommonJS (which uses require and module.exports).
Since Node.js was created, the ECMAScript module system (which uses import and export) has become standard, and Node.js has added support for it.
Node.js will treat .cjs files as CommonJS modules and .mjs files as ECMAScript modules. It will treat .js files as whatever the default module system for the project is (which is CommonJS unless package.json says "type": "module").
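A small sketch contrasting the two systems (hypothetical file names):
// math.cjs - CommonJS
const add = (a, b) => a + b;
module.exports = { add };
// usage elsewhere: const { add } = require('./math.cjs');

// math.mjs - ECMAScript module
// export const add = (a, b) => a + b;
// usage elsewhere: import { add } from './math.mjs';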
Inbuilt Node Modules
1. OS Module
const os = require('os');
// info about current user
const user = os.userInfo();
console.log(user);
// method returns the system uptime in seconds
console.log(`The System Uptime is ${os.uptime()} seconds`);
const currentOS = {
name: os.type(),
release: os.release(),
totalMem: os.totalmem(),
freeMem: os.freemem(),
};
console.log(currentOS);
2. Path Module
const path = require('path');
// platform-specific path separator ('/' on POSIX, '\' on Windows)
console.log(path.sep);
// join segments into a single normalized path
const filePath = path.join('/content/', 'subfolder', 'test.txt');
console.log(filePath);
// last portion of the path (the file name)
const base = path.basename(filePath);
console.log(base);
// build an absolute path from the current directory
const absolute = path.resolve(__dirname, 'content', 'subfolder', 'test.txt');
console.log(absolute);
3. Fs Module (sync)
const { readFileSync, writeFileSync } = require('fs');
console.log('start');
// each call blocks until the file has been read
const first = readFileSync('./content/first.txt', 'utf8');
const second = readFileSync('./content/second.txt', 'utf8');
// flag 'a' appends to the file instead of overwriting it
writeFileSync(
  './content/result-sync.txt',
  `Here is the result : ${first}, ${second}`,
  { flag: 'a' }
);
console.log('done with this task');
console.log('starting the next one');
4. Fs Module (async)
const { readFile, writeFile } = require('fs');
console.log('start');
readFile('./content/first.txt', 'utf8', (err, result) => {
if (err) {
console.log(err);
return;
}
const first = result;
readFile('./content/second.txt', 'utf8', (err, result) => {
if (err) {
console.log(err);
return;
}
const second = result;
writeFile(
'./content/result-async.txt',
`Here is the result : ${first}, ${second}`,
(err, result) => {
if (err) {
console.log(err);
return;
}
console.log('done with this task');
}
);
});
});
console.log('starting next task');
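As a hedged alternative to the nested callbacks above, the same flow can be written with the promise-based API from fs/promises (available in modern Node.js versions):
const { readFile, writeFile } = require('fs/promises');

const start = async () => {
  try {
    const first = await readFile('./content/first.txt', 'utf8');
    const second = await readFile('./content/second.txt', 'utf8');
    await writeFile(
      './content/result-async.txt',
      `Here is the result : ${first}, ${second}`
    );
    console.log('done with this task');
  } catch (err) {
    console.log(err);
  }
};

start();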
💡 Note: Multithreading in Node
How To Use Multithreading in Node.js | DigitalOcean
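A minimal worker_threads sketch (one possible approach; the file name and workload are made up for illustration):
// worker-demo.js - move CPU-heavy work off the main thread
const { Worker, isMainThread, parentPort } = require('worker_threads');

if (isMainThread) {
  const worker = new Worker(__filename); // run this same file as a worker
  worker.on('message', (msg) => console.log(`from worker: ${msg}`));
  console.log('main thread stays free for other work');
} else {
  let total = 0;
  for (let i = 0; i < 1e8; i++) total += i; // heavy loop runs in the worker
  parentPort.postMessage(total);
}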
5. HTTP Module
const http = require('http');
const server = http.createServer((req, res) => {
console.log(req);
if (req.url === '/') {
res.end('Welcome to our home page');
return;
}
if (req.url === '/about') {
res.end('Here is our short history');
return;
}
res.end(`
<h1>Oops!</h1>
<p>We can't seem to find the page you are looking for</p>
<a href="/">back home</a>
`);
});
server.listen(5000, () => {
console.log('Server is up and running at Port 5000');
});
NPM
npm is the Node.js package manager; it comes bundled with the Node.js installation.
Package.json
package.json is a JSON file that contains information about a Node.js project, including its name, version, description, entry point, dependencies, and more. It is used by Node.js's package manager, npm, to install and manage the packages that the project depends on. The file can be created by running npm init in the project's root directory, and it can be edited manually or with the npm command line tool. It should be committed to version control along with the project's source code.
Run npm init -y to create the package.json file with default answers to all the prompts.
{
"name": "tutorial",
"version": "1.0.0",
"description": "",
"main": "1-intro.js",
"scripts": {
"start": "nodemon app.js"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"lodash": "^4.17.20"
},
"devDependencies": {
"nodemon": "^2.0.7"
}
}
To install a dev dependency, run one of:
npm i {packageName} -D
# or
npm i {packageName} --save-dev
To install a global dependency:
npm install -g {package}
We can write custom scripts in the package.json file:
"scripts": {
  "start": "node app.js",
  "dev": "nodemon app.js"
}
We can run these scripts with npm start or npm run dev (only a few well-known names such as start and test can be run directly; custom script names need npm run before them).
To uninstall a node module, run npm uninstall {package}.
Package-lock.json
package-lock.json is an automatically generated file that describes the exact tree of installed packages in a project. This file serves as a single source of truth for the installed dependencies and their versions. It is used by npm to resolve dependencies and ensure that all packages are installed at the right version. It should not be modified manually and should be committed to version control along with package.json.
A package version number follows the major.minor.patch pattern: the first number marks a major (potentially breaking) change, the second marks a minor change that stays backward compatible within the same major version, and the last marks small patches (bug fixes). For example, the ^4.17.20 range in package.json allows any version from 4.17.20 up to, but not including, 5.0.0.
Advanced Topics
Event Loop
Node.js is based on an event-driven architecture. This means that it is designed to handle I/O operations and callbacks efficiently. The event loop is the mechanism that Node.js uses to handle asynchronous I/O operations.
When an I/O operation is performed, such as reading data from a file or making an HTTP request, Node.js registers a callback function to be executed when the operation is complete. The event loop is responsible for monitoring the completion of I/O operations and executing the corresponding callback functions.
The event loop works by processing a queue of events, each of which corresponds to an I/O operation and its associated callback function. When an I/O operation is completed, the corresponding callback function is added to the queue. The event loop then processes the queue of events, executing each callback function in turn.
This approach allows Node.js to handle a large number of concurrent I/O operations without blocking the main thread of execution. Instead, the event loop ensures that callbacks are executed asynchronously, letting Node.js make progress on many I/O operations at once even though JavaScript itself runs on a single thread.
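A tiny sketch of this ordering: the synchronous code runs first, and the timer callback is picked up by the event loop afterwards, even with a 0 ms delay:
console.log('first');

setTimeout(() => {
  console.log('third - the callback runs later, via the event loop');
}, 0);

console.log('second');
// output order: first, second, third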
Blocking Code
const http = require('http');
const server = http.createServer((req, res) => {
if (req.url === '/') {
res.end('Home Page');
return;
}
if (req.url === '/about') {
// BLOCKING CODE !!!
for (let i = 0; i < 1000; i++) {
for (let j = 0; j < 1000; j++) {
console.log(`${i}, ${j}`);
}
}
res.end('About Page');
return;
}
res.end('Error Page');
});
server.listen(5000, () => {
console.log('Server is up and running at Port 5000');
});
Blocking code can negatively impact the performance of your application, causing delays and slowing down the responsiveness of your server. It is important to avoid blocking code and use asynchronous operations instead.
Streams
Streams are objects that let you read data from a source or write data to a destination in a continuous manner. There are four types of streams in Node.js:
Readable: Used for reading operations.
Writable: Used for writing operations.
Duplex: Used for both reading and writing operations.
Transform: A type of duplex stream where the output is computed based on input.
Reading from a Stream
const { createReadStream } = require('fs');
const stream = createReadStream('./content/big.txt', {
  // read in chunks of up to 90000 bytes (the default is 64 KB)
  highWaterMark: 90000,
  encoding: 'utf8',
});
stream.on('data', (result) => {
  console.log(result);
});
stream.on('error', (err) => {
  console.log(err);
});
Writing to a Stream
const { createWriteStream } = require('fs');
const stream = createWriteStream('./content/big.txt');
for (let i = 0; i < 10000; i++) {
stream.write(`Hello world ${i}\n`);
}
stream.end();
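A hedged sketch that combines the two ideas: piping a readable file stream straight into an HTTP response (this assumes the big.txt file created above exists):
const http = require('http');
const { createReadStream } = require('fs');

const server = http.createServer((req, res) => {
  // send the file chunk by chunk instead of loading it all into memory
  const fileStream = createReadStream('./content/big.txt', 'utf8');
  fileStream.on('error', (err) => {
    res.end(err.message);
  });
  fileStream.pipe(res); // res is a writable stream
});

server.listen(5000);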
💡 Note: Streams are a powerful way to handle large amounts of data efficiently, especially when working with files or network operations. By using streams, you can process data as it becomes available, rather than waiting for the entire data set to be loaded into memory.