<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Om Vaja</title>
    <description>The latest articles on Forem by Om Vaja (@om_vaja).</description>
    <link>https://forem.com/om_vaja</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1760870%2F6902df81-6a67-439b-a691-de472046bc9d.jpg</url>
      <title>Forem: Om Vaja</title>
      <link>https://forem.com/om_vaja</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/om_vaja"/>
    <language>en</language>
    <item>
      <title>Node JS - The Event Loop</title>
      <dc:creator>Om Vaja</dc:creator>
      <pubDate>Tue, 19 Nov 2024 10:42:00 +0000</pubDate>
      <link>https://forem.com/om_vaja/node-js-the-event-loop-35bk</link>
      <guid>https://forem.com/om_vaja/node-js-the-event-loop-35bk</guid>
      <description>&lt;p&gt;We have discussed why Node JS is single-threaded and also multi-threaded in our article called "&lt;a href="https://dev.to/om_vaja/node-js-internals-2igh"&gt;Node Internals&lt;/a&gt;". It’ll give you a solid foundation on Node’s architecture and set the stage for understanding the magic of the Event Loop!&lt;/p&gt;

&lt;p&gt;Node.js can be considered single-threaded because of the Event Loop. But what is the Event Loop?&lt;/p&gt;

&lt;p&gt;I always start with a restaurant analogy, because it makes the technical details easier to grasp.&lt;/p&gt;

&lt;p&gt;In this restaurant, the main chef takes orders from the order list and hands them to a team of assistants. When the food is ready, the chef serves it. If a VIP customer arrives, the chef prioritizes that order.&lt;/p&gt;

&lt;p&gt;If we map this analogy onto the Node JS Event Loop, we can say that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The chef is the Event Loop, which manages tasks and delegates work.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The team of assistants is the worker threads or the OS, which execute the tasks delegated to them.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The order list is the task queue, where tasks wait for their turn.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The VIP customer is a microtask, which has high priority and is completed before regular tasks.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To understand the Event Loop, we first have to understand the difference between microtasks and macrotasks.&lt;/p&gt;

&lt;h2&gt;
  
  
  Microtask
&lt;/h2&gt;

&lt;p&gt;Microtasks are high-priority tasks that are executed after the currently executing JavaScript code completes, but before the Event Loop moves to its next phase.&lt;/p&gt;

&lt;p&gt;Example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;process.nextTick&lt;/li&gt;
&lt;li&gt;Promises (.then, .catch, .finally)&lt;/li&gt;
&lt;li&gt;queueMicrotask&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Macrotask
&lt;/h2&gt;

&lt;p&gt;These are lower-priority tasks queued for execution at a later phase in the Event Loop.&lt;/p&gt;

&lt;p&gt;Example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;setTimeout&lt;/li&gt;
&lt;li&gt;setInterval&lt;/li&gt;
&lt;li&gt;setImmediate&lt;/li&gt;
&lt;li&gt;I/O operations&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Event Loop
&lt;/h2&gt;

&lt;p&gt;When we run asynchronous tasks in Node.js, the Event Loop is at the heart of everything.&lt;/p&gt;

&lt;p&gt;Thanks to the Event Loop, Node.js can perform non-blocking I/O operations efficiently. It achieves this by delegating time-consuming tasks to the operating system or worker threads. Once the tasks are completed, their callbacks are processed in an organized manner, ensuring smooth execution without blocking the main thread.&lt;/p&gt;

&lt;p&gt;This is the magic that allows Node.js to handle multiple tasks concurrently while still being single-threaded.&lt;/p&gt;

&lt;h2&gt;
  
  
  Phases
&lt;/h2&gt;

&lt;p&gt;There are six phases in the Event Loop, and each phase has its own queue, which holds specific types of tasks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Timers Phase&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this phase, timer-related callbacks such as setTimeout and setInterval are handled.&lt;/p&gt;

&lt;p&gt;Node.js checks the timer queue for callbacks whose delay has expired.&lt;/p&gt;

&lt;p&gt;If a timer delay is met, its callback is added to this queue for execution.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;console.log('Start');

setTimeout(() =&amp;gt; {
  console.log('Timer 1 executed after 1 second');
}, 1000);

setTimeout(() =&amp;gt; {
  console.log('Timer 2 executed after 0.5 seconds');
}, 500);

let count = 0;
const intervalId = setInterval(() =&amp;gt; {
  console.log('Interval callback executed');
  count++;

  if (count === 3) {
    clearInterval(intervalId);
    console.log('Interval cleared');
  }
}, 1000);

console.log('End');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Start
End
Timer 2 executed after 0.5 seconds
Timer 1 executed after 1 second
Interval callback executed
Interval callback executed
Interval callback executed
Interval cleared
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2. I/O Callbacks Phase&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This phase's purpose is to execute callbacks for completed I/O (Input/Output) operations, such as reading or writing files, querying databases, handling network requests, and other asynchronous I/O tasks.&lt;/p&gt;

&lt;p&gt;When an asynchronous I/O operation (like reading a file using fs.readFile) is started in Node.js, the operation is delegated to the operating system or worker threads. These I/O tasks are executed outside the main thread in a non-blocking manner. Once a task is completed, its callback function is triggered to process the results.&lt;/p&gt;

&lt;p&gt;The I/O Callbacks Phase is where these callbacks are queued for execution once the operation finishes.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');

console.log('Start');

fs.readFile('example.txt', 'utf8', (err, data) =&amp;gt; {
  if (err) {
    console.log('Error reading file:', err);
    return;
  }
  console.log('File contents:', data);
});

console.log('Middle');

setTimeout(() =&amp;gt; {
  console.log('Simulated network request completed');
}, 0);

console.log('End');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Start
Middle
End
Simulated network request completed
File contents: (contents of the example.txt file)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;3. Idle Phase&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this phase, no user-defined work is performed. Instead, the Event Loop prepares for the phases that follow; only internal bookkeeping happens here.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Poll Phase&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Poll Phase checks whether there are pending I/O events (like network activity or file system events) that need to be processed. It will immediately execute the callbacks associated with these events.&lt;/p&gt;

&lt;p&gt;If no I/O events are pending, the Poll Phase can enter a blocking state.&lt;/p&gt;

&lt;p&gt;In this waiting state, Node.js simply waits for new I/O events to arrive. Counterintuitively, this wait is exactly what keeps Node.js non-blocking: instead of busy-looping, the main thread sleeps until new I/O events trigger callback executions, and it remains free for other tasks in the meantime.&lt;/p&gt;

&lt;p&gt;Any callbacks for completed I/O operations (such as fs.readFile, HTTP requests, or database queries) are executed during this phase. These I/O operations may have been initiated in previous phases (like the Timers Phase or I/O Callbacks Phase) and are now completed.&lt;/p&gt;

&lt;p&gt;If there are timers set with setTimeout or setInterval, Node.js checks whether any of them have expired. Expired timer callbacks are queued, but they are not processed here: they run when the Event Loop wraps around to the Timers Phase on its next iteration.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');
const https = require('https');

console.log('Start');

fs.readFile('file1.txt', 'utf8', (err, data) =&amp;gt; {
  if (err) {
    console.log('Error reading file1:', err);
    return;
  }
  console.log('File1 content:', data);
});

fs.readFile('file2.txt', 'utf8', (err, data) =&amp;gt; {
  if (err) {
    console.log('Error reading file2:', err);
    return;
  }
  console.log('File2 content:', data);
});

https.get('https://jsonplaceholder.typicode.com/todos/1', (response) =&amp;gt; {
  let data = '';
  response.on('data', (chunk) =&amp;gt; {
    data += chunk;
  });
  response.on('end', () =&amp;gt; {
    console.log('HTTP Response:', data);
  });
});

console.log('End');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Start
End
File1 content: (contents of file1.txt)
File2 content: (contents of file2.txt)
HTTP Response: (JSON data from the HTTP request)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;5. Check Phase&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This phase runs after the Poll Phase has completed its tasks. It mainly handles the execution of setImmediate callbacks, which are scheduled to run immediately after the I/O events have been processed in the Poll Phase.&lt;/p&gt;

&lt;p&gt;setImmediate callbacks are often used when you want to perform an action after the current Event Loop cycle, such as ensuring that a task runs once the current batch of I/O events has been processed.&lt;/p&gt;

&lt;p&gt;Within an I/O cycle, the Check Phase takes priority over the Timers Phase (which handles setTimeout and setInterval): a setImmediate scheduled inside an I/O callback will always execute before a zero-delay timer, because the Check Phase follows the Poll Phase directly, before the Event Loop wraps back to the Timers Phase.&lt;/p&gt;

&lt;p&gt;Note that this guarantee only holds inside an I/O callback. When setImmediate and setTimeout(fn, 0) are both scheduled from the top level of a script, their relative order is not deterministic and can vary between runs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');

console.log('Start');

fs.readFile('somefile.txt', 'utf8', (err, data) =&amp;gt; {
  if (err) {
    console.error(err);
    return;
  }
  console.log('File content:', data);
});

setImmediate(() =&amp;gt; {
  console.log('Immediate callback executed');
});

setTimeout(() =&amp;gt; {
  console.log('Timeout callback executed');
}, 0);

console.log('End');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output (since both callbacks are scheduled from the top level here, the relative order of the immediate and timeout lines can vary between runs):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Start
End
Immediate callback executed
Timeout callback executed
File content: (contents of the file)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;6. Close Phase&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Close Callbacks Phase typically executes when an application needs to clean up before exiting or shutting down.&lt;/p&gt;

&lt;p&gt;This phase deals with events and tasks that need to be executed once a system resource, like a network socket or file handle, is no longer needed.&lt;/p&gt;

&lt;p&gt;Without this phase, an application might leave open file handles, network connections, or other resources, potentially leading to memory leaks, data corruption, or other issues.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const http = require('http');

const server = http.createServer((req, res) =&amp;gt; {
  res.write('Hello, world!');
  res.end();
});

server.listen(3000, () =&amp;gt; {
  console.log('Server is listening on port 3000');
});

setTimeout(() =&amp;gt; {
  console.log('Closing the server...');
  server.close(() =&amp;gt; {
    console.log('Server closed!');
  });
}, 5000);

process.on('exit', (code) =&amp;gt; {
  console.log('Process is exiting with code', code);
});

console.log('End of the script');

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;End of the script
Server is listening on port 3000
Closing the server...
Server closed!
Process is exiting with code 0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;There is one more special phase in the Event Loop of Node JS.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Microtask Queue&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;process.nextTick() and promise callbacks are executed from special queues that sit outside the Event Loop's regular phases.&lt;/p&gt;

&lt;p&gt;process.nextTick() schedules a callback to be executed immediately after the current operation completes, but before the event loop continues to the next phase.&lt;/p&gt;

&lt;p&gt;process.nextTick() is not part of any phase in the event loop. Instead, it has its own internal queue that gets executed right after the currently executing synchronous code and before any phase in the event loop is entered.&lt;/p&gt;

&lt;p&gt;It's executed after the current operation but before I/O, setTimeout, or other tasks scheduled in the event loop.&lt;/p&gt;

&lt;p&gt;Promises have lower priority than process.nextTick() and are processed after all process.nextTick() callbacks.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;console.log('Start');

process.nextTick(() =&amp;gt; {
  console.log('NextTick callback');
});

Promise.resolve().then(() =&amp;gt; {
  console.log('Promise callback');
});

setTimeout(() =&amp;gt; {
  console.log('setTimeout callback');
}, 0);

console.log('End');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Start
End
NextTick callback
Promise callback
setTimeout callback
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, You have an overall idea of how the Event Loop works.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Here is one question for you; you can give your answer in the comments.&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');

console.log('Start');

setImmediate(() =&amp;gt; {
  console.log('setImmediate callback');
});

process.nextTick(() =&amp;gt; {
  console.log('NextTick callback');
});

setTimeout(() =&amp;gt; {
  console.log('setTimeout callback');
}, 0);

Promise.resolve().then(() =&amp;gt; {
  console.log('Promise callback');
});

fs.readFile(__filename, () =&amp;gt; {
  console.log('File read callback');
});

console.log('End');

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Thank you.&lt;/p&gt;

&lt;p&gt;Waiting for your answer.&lt;/p&gt;

</description>
      <category>node</category>
      <category>javascript</category>
      <category>backend</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Clustering and Worker Threads - Node JS</title>
      <dc:creator>Om Vaja</dc:creator>
      <pubDate>Mon, 18 Nov 2024 17:44:21 +0000</pubDate>
      <link>https://forem.com/om_vaja/clustering-and-worker-threads-node-js-1gjd</link>
      <guid>https://forem.com/om_vaja/clustering-and-worker-threads-node-js-1gjd</guid>
      <description>&lt;p&gt;In a previous article "&lt;a href="https://dev.to/om_vaja/node-js-internals-2igh"&gt; Node JS Internals &lt;/a&gt;" we discussed Node JS's internal architecture and why we should not increase the Thread Pool size to handle multiple requests concurrently. As mentioned there, scalability and performance are not tied to the Thread Pool size.&lt;/p&gt;

&lt;p&gt;For scalability and high performance, we can use clustering and worker threads.&lt;/p&gt;

&lt;h2&gt;
  
  
  Clustering
&lt;/h2&gt;

&lt;p&gt;Let's say you are at a grand wedding with thousands of guests. There is one kitchen, and a single cook is preparing the food for all these guests. Sounds impractical, right? You are not utilizing the kitchen's full resources if you have only one cook.&lt;/p&gt;

&lt;p&gt;This is exactly what happens in a Node JS application running on a multicore CPU when only one core is used to handle all the requests. So even though our machine has the power of multiple cores, without clustering our application runs on just one core, and that single core is responsible for handling all the work.&lt;/p&gt;

&lt;p&gt;When multiple cooks are working in your kitchen, that's clustering.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Clustering is a technique that is used to enable the single Node JS application to utilize multiple CPU cores effectively.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To implement clustering, you use the built-in cluster module from Node JS:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const cluster = require('cluster');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By using this cluster module, you can create multiple instances of your Node JS application. These instances are called workers. All workers share the same server port and handle incoming requests concurrently.&lt;/p&gt;

&lt;p&gt;There are two types of processes in the cluster architecture.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Master Process:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Master process is like the main cook in the kitchen who manages workers. It initializes the application, sets up the clustering environment, and also delegates tasks to worker processes. It does not directly handle application requests. &lt;/p&gt;

&lt;p&gt;What does the Master process do?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Creates multiple worker processes using the cluster.fork() method. It can also restart workers if they crash or exit unexpectedly, by listening for their exit event and forking again.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It makes sure that incoming requests are distributed across all worker processes. By default, the master itself load-balances connections round-robin on every platform except Windows, where scheduling is left to the operating system.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It enables communication between workers via IPC(Inter-Process Communication).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2. Worker Processes:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Worker processes are instances of the Node JS application created by the master process. Each one runs independently on a separate CPU core and handles incoming requests.&lt;/p&gt;

&lt;p&gt;Worker processes cannot communicate with each other directly; they communicate via the master.&lt;/p&gt;

&lt;p&gt;A worker process handles incoming requests and performs tasks such as database queries, computations, or any other application logic.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  console.log(`Master ${process.pid} is running`);

  // Fork workers
    cluster.fork();
    cluster.fork();
    cluster.fork();
    cluster.fork();

} else {
  console.log(`Worker ${process.pid} is running`);
  // Worker logic (e.g., server setup) goes here
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, we first check whether this is the master process; if it is, we create the worker processes.&lt;/p&gt;

&lt;p&gt;In this code, each call to cluster.fork() creates one worker process.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;But, this is not an ideal way of creating a worker process.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Suppose you hardcode 4 worker processes but your system has only two cores: the extra processes will just compete for CPU time.&lt;/p&gt;

&lt;p&gt;To solve this problem, instead of hardcoding the number of workers, first find out how many CPU cores are available and then create that many worker processes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  console.log(`Master ${process.pid} is running`);
  const numCPUs = os.cpus().length;

  // Fork workers
  for (let i = 0; i &amp;lt; numCPUs; i++) {
    cluster.fork();
  }
} else {
  console.log(`Worker ${process.pid} is running`);
  // Worker logic (e.g., server setup) goes here
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I am using a dual-core system, so the output will look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Master 12345 is running
Worker 12346 is running
Worker 12347 is running
Worker 12348 is running
Worker 12349 is running
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Now you might ask: if this is a dual-core CPU, why were 4 worker processes created?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is because os.cpus() reports logical cores, and there are 4 of them: my CPU supports hyper-threading (Simultaneous Multi-Threading, SMT), so each physical core exposes two logical cores.&lt;/p&gt;

&lt;h2&gt;
  
  
  Worker Threads
&lt;/h2&gt;

&lt;p&gt;In a restaurant, the waiter takes an order and hands it to the team of cooks, because cooking takes time. Meanwhile, if table cleaning or any other waiter-related work comes up, the waiter handles it. When the order is ready, the cook hands the food back to the waiter, who serves it to the customer.&lt;/p&gt;

&lt;p&gt;The same pattern applies to worker threads. If a computationally expensive task, like large-scale data processing, a complex calculation, or a heavy algorithm, comes in, the main thread delegates it to a worker thread. The worker performs the task, not the main thread.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why is this helpful?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We know the Node JS Event Loop is single-threaded, so if heavy computational work runs on the main thread, the Event Loop gets blocked. With worker threads, those heavy tasks run on separate threads instead of the main thread, so the Event Loop never gets blocked.&lt;/p&gt;

&lt;p&gt;Worker threads can communicate with the main thread via a message-passing system, and data can be sent between threads using structured cloning (deep copy).&lt;/p&gt;

&lt;p&gt;Now, let's mimic how worker threads operate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;main.js (Main Thread)&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { Worker } = require('worker_threads');

function startWorker() {
  const worker = new Worker('./worker.js'); // Create a worker using worker.js

  // Listen for messages from the worker
  worker.on('message', (message) =&amp;gt; {
    console.log('Message from worker:', message);
  });

  // Handle errors in the worker
  worker.on('error', (error) =&amp;gt; {
    console.error('Worker error:', error);
  });

  // Handle worker exit
  worker.on('exit', (code) =&amp;gt; {
    console.log(`Worker exited with code ${code}`);
  });

  // Send a message to the worker
  worker.postMessage({ num: 100 });
}

startWorker();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;worker.js (Worker Thread)&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { parentPort } = require('worker_threads'); // Access parentPort to communicate with the main thread

// Listen for messages from the main thread
parentPort.on('message', (data) =&amp;gt; {
  console.log('Received data from main thread:', data);

  // Perform some processing
  const result = { squared: data.num * data.num };

  // Send the result back to the main thread
  parentPort.postMessage(result);
});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If data contains large structures, it will be deeply cloned and passed over, which might have some performance overhead.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Working of code&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The Worker class is used to spawn new threads.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;You can send data to the worker using worker.postMessage and listen for messages with worker.on('message', callback).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the worker thread, parentPort is the primary interface to communicate with the main thread.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;You can listen for messages from the main thread (parentPort.on('message')) and send messages back using parentPort.postMessage.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The output will be:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Received data from main thread: { num: 100 }
Message from worker: { squared: 10000 }
Worker exited with code 0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Now, you might have another question: why don't we create hundreds of worker threads?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The reason is that if you create more threads than the number of cores, the threads will compete for CPU time, leading to context switching, which is expensive and reduces overall performance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;When should you use clustering, worker threads, or both in Node.js?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. When to use Worker threads?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CPU-Bound tasks:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Tasks that involve heavy computation, such as image/video processing, data compression or encryption, machine-learning inference, and scientific calculations.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Shared Memory is required:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You need to share data efficiently between threads without duplicating it.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Single-core usage:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If your application needs to scale only within a single process but still requires parallelism for CPU-intensive tasks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. When to use clustering?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I/O bound:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Tasks that involve handling a high number of client requests, such as web servers, chat applications, and APIs. Clustering helps scale horizontally by distributing requests across all CPU cores.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Isolated memory:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Your application doesn’t need to share a lot of data between processes.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Multi-Core Utilization:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You want to utilize all available cores by spawning multiple Node.js processes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. When to use both clustering and worker threads?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I/O-Bound + CPU-Bound Tasks:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The application handles HTTP requests but offloads computationally intensive tasks. Example: A web server processes file uploads and performs image resizing or video transcoding.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;High Scalability:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You need both process-level and thread-level parallelism for high throughput. In an e-commerce site, for example, clustering ensures multiple processes handle incoming requests, while worker threads process background tasks like generating personalized recommendations.&lt;/p&gt;

&lt;p&gt;Thank You.&lt;/p&gt;

&lt;p&gt;Feel free to ask the question or give any suggestions.&lt;/p&gt;

&lt;p&gt;If you found this informative, give it a like.&lt;/p&gt;

</description>
      <category>node</category>
      <category>backend</category>
      <category>javascript</category>
      <category>clustering</category>
    </item>
    <item>
      <title>Node JS Internals</title>
      <dc:creator>Om Vaja</dc:creator>
      <pubDate>Sat, 16 Nov 2024 16:15:53 +0000</pubDate>
      <link>https://forem.com/om_vaja/node-js-internals-2igh</link>
      <guid>https://forem.com/om_vaja/node-js-internals-2igh</guid>
      <description>&lt;p&gt;Suppose you go to a restaurant where a single chef promises, "I can cook for hundreds of people at the same time and none of you will go hungry." Sounds impossible, right? You can think of this single chef as Node JS, which manages all these orders and still serves food to every customer.&lt;/p&gt;

&lt;p&gt;Whenever you ask someone "What is Node JS?", you almost always get the answer: "Node JS is a runtime used to run JavaScript outside of the browser environment".&lt;/p&gt;

&lt;p&gt;But what does runtime mean? A runtime environment is the software infrastructure in which code written in a specific programming language is executed. It provides the tools, libraries, and features needed to run code, handle errors, manage memory, and interact with the underlying operating system or hardware.&lt;/p&gt;

&lt;p&gt;Node JS has all of these.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Google V8 Engine to run the code.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Core libraries and APIs such as fs, crypto, http, etc.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Infrastructure like Libuv and the Event Loop to support asynchronous and non-blocking I/O operations.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So now we know why Node JS is called a runtime.&lt;/p&gt;

&lt;p&gt;This runtime is built on two independent dependencies, &lt;strong&gt;V8&lt;/strong&gt; and &lt;strong&gt;libuv&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;V8&lt;/strong&gt; is an engine that is also used in Google Chrome and it is developed and managed by Google. In Node JS it executes the JavaScript code. When we run the command &lt;strong&gt;node index.js&lt;/strong&gt; then Node JS passes this code to the V8 engine. V8 processes this code, executes it, and provides the result. For example, if your code logs "Hello, World!" to the console, V8 handles the actual execution that makes this happen.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;libuv&lt;/strong&gt; library is written in C and gives Node JS access to the operating system for functionality such as networking, I/O operations, or time-related operations. It works as a bridge between Node JS and the operating system.&lt;/p&gt;

&lt;p&gt;libuv handles the following operations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;File system operations: Reading or writing files (fs.readFile, fs.writeFile).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Networking: Handling HTTP requests, sockets, or connecting to servers.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Timers: Managing functions like setTimeout or setInterval.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Tasks like file reading are handled by the Libuv thread pool, timers by Libuv’s timer system, and network calls by OS-level APIs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Is Node JS single-threaded?
&lt;/h2&gt;

&lt;p&gt;Look at the following example.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');
const path = require('path');

const filePath = path.join(__dirname, 'file.txt');

const readFileWithTiming = (index) =&amp;gt; {
  const start = Date.now();
  fs.readFile(filePath, 'utf8', (err, data) =&amp;gt; {
    if (err) {
      console.error(`Error reading the file for task ${index}:`, err);
      return;
    }
    const end = Date.now();
    console.log(`Task ${index} completed in ${end - start}ms`);
  });
};

const startOverall = Date.now();
for (let i = 1; i &amp;lt;= 4; i++) {
  readFileWithTiming(i);
}

process.on('exit', () =&amp;gt; {
  const endOverall = Date.now();
  console.log(`Total execution time: ${endOverall - startOverall}ms`);
});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We read the same file four times and log how long each read takes.&lt;/p&gt;

&lt;p&gt;This code produces output like the following.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Task 1 completed in 50ms
Task 2 completed in 51ms
Task 3 completed in 52ms
Task 4 completed in 53ms
Total execution time: 54ms
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We can see that all four reads complete at around the 50 ms mark. If Node JS is single-threaded, how do all of these read operations finish at the same time?&lt;/p&gt;

&lt;p&gt;The answer is that libuv uses a thread pool: a group of worker threads. By default, the pool size is 4, which means libuv can process 4 such requests at once.&lt;/p&gt;

&lt;p&gt;Consider another scenario where, instead of reading the file 4 times, we read it 6 times.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');
const path = require('path');

const filePath = path.join(__dirname, 'file.txt');

const readFileWithTiming = (index) =&amp;gt; {
  const start = Date.now();
  fs.readFile(filePath, 'utf8', (err, data) =&amp;gt; {
    if (err) {
      console.error(`Error reading the file for task ${index}:`, err);
      return;
    }
    const end = Date.now();
    console.log(`Task ${index} completed in ${end - start}ms`);
  });
};

const startOverall = Date.now();
for (let i = 1; i &amp;lt;= 6; i++) {
  readFileWithTiming(i);
}

process.on('exit', () =&amp;gt; {
  const endOverall = Date.now();
  console.log(`Total execution time: ${endOverall - startOverall}ms`);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output will look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Task 1 completed in 50ms
Task 2 completed in 51ms
Task 3 completed in 52ms
Task 4 completed in 53ms
Task 5 completed in 101ms
Task 6 completed in 102ms
Total execution time: 103ms
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9uoxf4b8jb1zmdajpec5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9uoxf4b8jb1zmdajpec5.png" alt="Image description" width="800" height="1079"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Suppose read operations 1 and 2 complete and threads 1 and 2 become free.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1hbr0uyjgl16c6sggkwa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1hbr0uyjgl16c6sggkwa.png" alt="Image description" width="800" height="1079"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can see that the first four reads take almost the same time, but the 5th and 6th reads take almost double the time of the first four.&lt;/p&gt;

&lt;p&gt;This happens because the default thread pool size is 4, so four read operations are handled at the same time. When we then issue the 5th and 6th reads, libuv has to wait because every thread is busy. As soon as one of the four threads finishes, the 5th read operation is handed to it, and the same happens for the 6th. That is why the last two reads take longer.&lt;/p&gt;

&lt;p&gt;So, Node JS is not single-threaded.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;But, why do some people refer to it as single-threaded?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is because the main event loop is single-threaded. This thread is responsible for executing Node JS code, including handling asynchronous callbacks and coordinating tasks. It does not directly handle blocking operations like file I/O.&lt;/p&gt;

&lt;p&gt;The code execution flow looks like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Synchronous Code (V8):&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Node.js executes all synchronous (blocking) code line by line using the V8 JavaScript engine.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Async Tasks Delegated:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Asynchronous operations like fs.readFile, setTimeout, or http requests are sent to the Libuv library or other subsystems (e.g., OS).&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Task Execution:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Tasks like file reading are handled by the Libuv thread pool, timers by Libuv’s timer system, and network calls by OS-level APIs.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Callback Queued:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once an async task is complete, its associated callback is sent to the event loop's queue.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Event Loop Executes Callbacks:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The event loop picks up callbacks from the queue and executes them one by one, ensuring non-blocking execution.&lt;/p&gt;

&lt;p&gt;You can change the thread pool size by setting the &lt;strong&gt;UV_THREADPOOL_SIZE&lt;/strong&gt; environment variable, for example &lt;strong&gt;process.env.UV_THREADPOOL_SIZE = 8&lt;/strong&gt; at the very top of your script. This only takes effect if it is set before the pool is first used, and it does not work on every platform (on Windows it must be set in the shell before running &lt;strong&gt;node index.js&lt;/strong&gt;).&lt;/p&gt;

&lt;p&gt;Now you might think: if we set a high number of threads, we should be able to handle a high number of requests as well. That seems intuitive.&lt;/p&gt;

&lt;p&gt;But the opposite is true.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;If we increase the number of threads beyond a certain limit, it will slow down your code execution.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Look at the following example.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fs = require('fs');
const path = require('path');

// Set UV_THREADPOOL_SIZE to 100 (a high value) for this example
process.env.UV_THREADPOOL_SIZE = 100;

const filePath = path.join(__dirname, 'largeFile.txt');

// Function to simulate reading multiple files asynchronously
const readFileWithTiming = (index) =&amp;gt; {
  const start = Date.now();
  fs.readFile(filePath, 'utf8', (err, data) =&amp;gt; {
    if (err) {
      console.error(`Error reading the file for task ${index}:`, err);
      return;
    }
    const end = Date.now();
    console.log(`Task ${index} completed in ${end - start}ms`);
  });
};

const startOverall = Date.now();
for (let i = 1; i &amp;lt;= 10; i++) {
  readFileWithTiming(i);
}

process.on('exit', () =&amp;gt; {
  const endOverall = Date.now();
  console.log(`Total execution time: ${endOverall - startOverall}ms`);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;/p&gt;

&lt;p&gt;With High Thread Pool Size (100 threads)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Task 1 completed in 100ms
Task 2 completed in 98ms
Task 3 completed in 105ms
Task 4 completed in 95ms
Task 5 completed in 120ms
Task 6 completed in 130ms
Task 7 completed in 135ms
Task 8 completed in 140ms
Task 9 completed in 125ms
Task 10 completed in 150ms
Total execution time: 700ms
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, the following output is when we set the thread pool size as 4 (default size).&lt;/p&gt;

&lt;p&gt;With Default Thread Pool Size (4 threads)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Task 1 completed in 100ms
Task 2 completed in 98ms
Task 3 completed in 105ms
Task 4 completed in 95ms
Task 5 completed in 100ms
Task 6 completed in 98ms
Task 7 completed in 102ms
Task 8 completed in 104ms
Task 9 completed in 106ms
Task 10 completed in 99ms
Total execution time: 600ms
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can see a 100 ms difference in total execution time: 600 ms with the default pool of 4 threads versus 700 ms with a pool of 100 threads. The smaller pool is actually faster here.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why doesn't a higher number of threads mean more tasks processed concurrently?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first reason is that each thread has its own stack and resource requirements. Increasing the thread count therefore eventually exhausts memory or CPU resources.&lt;/p&gt;

&lt;p&gt;The second reason is that Operating systems have to schedule threads. If there are too many threads, the OS will spend a lot of time switching between them (context switching), which adds overhead and slows down the performance instead of improving it.&lt;/p&gt;

&lt;p&gt;So scalability and high performance are not about increasing the thread pool size. They come from using the right architecture, such as clustering, understanding the nature of your tasks (I/O-bound vs CPU-bound), and understanding how Node.js's event-driven model works.&lt;/p&gt;

&lt;p&gt;Thank You for reading.&lt;/p&gt;

</description>
      <category>node</category>
      <category>backend</category>
      <category>javascript</category>
      <category>architecture</category>
    </item>
    <item>
      <title>REST Api</title>
      <dc:creator>Om Vaja</dc:creator>
      <pubDate>Fri, 04 Oct 2024 10:37:49 +0000</pubDate>
      <link>https://forem.com/om_vaja/rest-api-1pjg</link>
      <guid>https://forem.com/om_vaja/rest-api-1pjg</guid>
      <description>&lt;h2&gt;
  
  
  What is REST API?
&lt;/h2&gt;

&lt;p&gt;REST stands for Representational State Transfer. To understand a REST API, we first have to understand what the words "Representational" and "State Transfer" mean.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Representational: At the end of the day, the client and server send or receive some data. These data or resources are represented in a format such as JSON, XML, or HTML. For example, if a client requests information about a user, the server might return a JSON representation of the user's data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;State Transfer: When a client requests a resource from the server, the server sends the current state of that resource. The current state of the resource means up-to-date data. The client can then modify that resource by sending updates, and the server will save the new state.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  REST API Constraints
&lt;/h2&gt;

&lt;p&gt;REST is an architectural style for designing networked applications. Following its constraints helps increase scalability, performance, and maintainability.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Client-Server Architecture:&lt;/strong&gt;&lt;br&gt;
Client-server architecture means the client sends the request to the server and the server responds to this request. The client is responsible for the user interface and the server is responsible for application logic and data handling.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Stateless:&lt;/strong&gt;&lt;br&gt;
There are two types of communication. 1. Stateless and 2. stateful&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Stateless means the server is not storing any information about the client or any previous requests. Each request is treated as an independent request.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Stateful means the server stores the information about the client across multiple requests.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A REST API should be stateless: each request from the client to the server must contain all the information necessary to understand and process it. If the client wants to get user data, it sends a GET request that includes the user id; if it wants to update the user data, it sends a PUT or PATCH request that again includes the user id. Each request is independent, and no data should be shared between requests on the server.&lt;/p&gt;
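&lt;p&gt;Statelessness can be sketched as a handler that looks only at the incoming request and never at server-side session memory. This is a minimal sketch; the request shape, field names, and responses are made up for illustration.&lt;/p&gt;

```javascript
// Sketch: a stateless handler. All context comes from the request
// itself; nothing is remembered from previous requests.
// The request shape and responses are hypothetical.
function handleGetUser(request) {
  // Credentials must arrive with every request.
  if (!request.headers.authorization) {
    return { status: 401, body: { error: 'missing credentials' } };
  }
  // The target resource must be identified in the request too.
  if (!request.params.userId) {
    return { status: 400, body: { error: 'missing user id' } };
  }
  return { status: 200, body: { id: request.params.userId, name: 'Om' } };
}
```

Because the handler needs no session store, any server instance can answer any request, which is what makes stateless APIs easy to scale horizontally.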

&lt;p&gt;3. &lt;strong&gt;Cacheability&lt;/strong&gt;:&lt;br&gt;
Responses can include information that tells the client whether the response can be cached and for how long, making repeated data access more efficient.&lt;/p&gt;

&lt;p&gt;When the server sends a response, it can include headers like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Cache-Control: max-age=600&lt;/strong&gt;: tells the client that the response can be cached for 600 seconds (10 minutes).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;ETag: "abc123"&lt;/strong&gt;: a unique identifier for the version of the resource. The client can use it to check whether the data has changed before asking the server again.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;4. &lt;strong&gt;Layered System&lt;/strong&gt;:&lt;br&gt;
The architecture can consist of multiple layers (such as security, caching, and load balancing) between the client and server. Each layer operates independently, unaware of other layers.&lt;/p&gt;

&lt;p&gt;A proxy or load balancer might be placed between the client and the server to distribute incoming requests without the client being aware of it.&lt;/p&gt;

&lt;p&gt;5. &lt;strong&gt;Uniform Interface&lt;/strong&gt;:&lt;br&gt;
A uniform interface means there is a standardized way for the client and the server to communicate.&lt;/p&gt;

&lt;p&gt;There are four principles:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Resource Identification:&lt;br&gt;
Each resource should be identified uniquely by a URI, such as '/users/123' or '/products/44'.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Manipulation of Resources Through Representations:&lt;br&gt;
Clients send representations of resources when they want to create or modify them, and the server responds with representations when they retrieve or request resources.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;REST uses standard HTTP methods (like GET, POST, PUT, DELETE) to perform actions on resources.&lt;/p&gt;

&lt;p&gt;GET: Retrieve data.&lt;br&gt;
POST: Create new data.&lt;br&gt;
PUT: Update existing data.&lt;br&gt;
DELETE: Remove data.&lt;/p&gt;

&lt;p&gt;3. Self-descriptive Messages:&lt;br&gt;
Every message between the client and server contains all the information needed to understand it. This includes status codes like 200 OK or 404 Not Found and the format, like JSON or XML.&lt;/p&gt;

&lt;p&gt;4. Hypermedia as the Engine of Application State (HATEOAS):&lt;br&gt;
HATEOAS means the server includes links in its responses, helping the client discover what it can do next.&lt;/p&gt;

&lt;p&gt;For example: if you want to see a user's followers, follow this link.&lt;/p&gt;

&lt;p&gt;If you fetch GitHub's users API, you will see the following response.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkt9511aw0wz8jxm5iea.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkt9511aw0wz8jxm5iea.png" alt="Image description" width="800" height="76"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;and if you check a single user's data, you will see links that guide the client to the user's followers and other related information.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdpzsu9n1dxkeim3fknb3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdpzsu9n1dxkeim3fknb3.png" alt="Image description" width="580" height="55"&gt;&lt;/a&gt;&lt;/p&gt;
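&lt;p&gt;The idea can be sketched as a plain response body that carries links. The field names below are illustrative, loosely modeled on GitHub's style, and &lt;strong&gt;api.example.com&lt;/strong&gt; is a hypothetical API root.&lt;/p&gt;

```javascript
// Sketch of a HATEOAS-style response body: the representation carries
// links that tell the client what it can do next. Field names are
// illustrative, loosely modeled on GitHub's API.
const baseUrl = 'https://api.example.com'; // hypothetical API root

function userResource(user) {
  return {
    id: user.id,
    login: user.login,
    // Hypermedia links: the client discovers the next actions from
    // here instead of hard-coding URL patterns.
    url: `${baseUrl}/users/${user.login}`,
    followers_url: `${baseUrl}/users/${user.login}/followers`,
    repos_url: `${baseUrl}/users/${user.login}/repos`,
  };
}

console.log(JSON.stringify(userResource({ id: 123, login: 'om_vaja' }), null, 2));
```

Because the links travel with the data, the server can reorganize its URL structure without breaking clients that follow the links.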

&lt;p&gt;In short, REST APIs are a powerful way to structure communication between clients and servers. By following principles like statelessness and a uniform interface, and features like HATEOAS, you can design scalable and flexible systems.&lt;/p&gt;

&lt;p&gt;Thank You.&lt;/p&gt;

</description>
      <category>restapi</category>
      <category>api</category>
      <category>backend</category>
      <category>backenddevelopment</category>
    </item>
  </channel>
</rss>
