<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Alahira Jeffrey Calvin</title>
    <description>The latest articles on Forem by Alahira Jeffrey Calvin (@alahirajeffrey).</description>
    <link>https://forem.com/alahirajeffrey</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1030201%2F514db994-0c8b-4e07-acd5-2120b4bc6058.jpg</url>
      <title>Forem: Alahira Jeffrey Calvin</title>
      <link>https://forem.com/alahirajeffrey</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/alahirajeffrey"/>
    <language>en</language>
    <item>
      <title>How To Implement Caching in Express Using Redis and Docker-Compose</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Fri, 17 May 2024 19:04:27 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/how-to-implement-caching-in-express-using-redis-and-docker-compose-3fc1</link>
      <guid>https://forem.com/alahirajeffrey/how-to-implement-caching-in-express-using-redis-and-docker-compose-3fc1</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;As the number of users of an application increases, the load on that application increases. This often leads to an increase in latency, which is the amount of time the application takes to return a response to a user. This is particularly prevalent in applications that persist data, as every request made by a client has to be processed by the servers, followed by appropriate queries to a database. The higher the number of requests, the more work the servers and databases have to do, leading to increased response times.&lt;/p&gt;

&lt;p&gt;Caching helps reduce the latency of applications by storing frequently accessed data in memory. This ensures that not every request ends up as a network call to the database, as servers can simply access data from the cache if it was stored there prior to the request. In this article, we will explore how to implement caching in an Express application using Redis.&lt;/p&gt;

&lt;h2&gt;
  
  
  Redis
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Redis&lt;/strong&gt; (&lt;strong&gt;REmote DIctionary Server&lt;/strong&gt;) is an open-source, in-memory key-value data store known for its speed, versatility, and robust caching capabilities. It supports various data types such as strings, hashes, lists, sets, and more.&lt;/p&gt;

&lt;h2&gt;
  
  
  Redis: Other Use Cases
&lt;/h2&gt;

&lt;p&gt;Aside from its use as a database cache, Redis also has other use cases such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Session Storage&lt;/strong&gt;: Sessions can be used in web applications to maintain state and store user-specific data such as recent actions, personal information, and more across multiple requests. Redis's in-memory nature makes it an excellent choice for storing session data, providing fast read and write operations.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Pub/Sub Messaging System&lt;/strong&gt;: A messaging system allows for real-time communication between different parts of an application or even between different applications. Redis can act as a message broker and allows for the creation of a channel where publishers can send messages which are later consumed by consumers for further processing.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Rate Limiting&lt;/strong&gt;: It is often important to control the rate of requests made by clients to an API service in order to protect that API from abuse and attacks. Redis can be used to log the number of times a user makes a request to an API and when the number of requests exceeds a certain threshold, an error can be sent to the user. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
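&lt;p&gt;To make the rate-limiting idea concrete, below is a minimal fixed-window limiter sketch. Note the assumption here: a plain in-memory Map stands in for Redis so the logic runs on its own; in a real deployment the same counter-per-window pattern would be backed by Redis's INCR and EXPIRE commands.&lt;/p&gt;

```javascript
// Minimal fixed-window rate limiter sketch.
// A plain Map stands in for Redis here; with Redis the same pattern
// would use INCR on a per-user key plus EXPIRE to reset the window.
const WINDOW_MS = 60000; // length of one window in milliseconds
const MAX_REQUESTS = 5;  // allowed requests per window

const counters = new Map();

function isAllowed(userId, now = Date.now()) {
  // all requests in the same window share one counter key
  const windowStart = Math.floor(now / WINDOW_MS);
  const key = `${userId}:${windowStart}`;

  const count = (counters.get(key) || 0) + 1; // the INCR equivalent
  counters.set(key, count);

  // reject once the counter passes the threshold
  if (count > MAX_REQUESTS) return false;
  return true;
}
```

&lt;p&gt;Once the limit is exceeded within a window, the caller would typically respond with HTTP 429 until the next window begins.&lt;/p&gt;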

&lt;h2&gt;
  
  
  Setting up Redis as a Cache in Express
&lt;/h2&gt;

&lt;p&gt;To follow along with this tutorial, you will need Node.js, Docker, and Docker Compose installed on your computer, as well as a basic understanding of each of them. You'll also need a basic understanding of the Express framework.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Open your terminal and create a folder for the application using&lt;br&gt;
&lt;code&gt;mkdir express-redis-cache&lt;/code&gt; or whatever you want to name your application. Navigate to the folder in the terminal by typing &lt;code&gt;cd express-redis-cache&lt;/code&gt;. Initialize an npm project using &lt;code&gt;npm init -y&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We are going to run Redis in a Docker container, so we'll create a Docker Compose file to hold the configuration. Create one at the root of the folder using the command &lt;code&gt;touch compose.yml&lt;/code&gt;, then copy and paste the code below.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: "3.8"

services:
  redis:
    image: redis
    ports:
      - 6379:6379
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;All we are doing here is specifying the Docker Compose version on the first line, and under the &lt;code&gt;services&lt;/code&gt; section, listing the services (containers) that make up the application. Each service represents a container that will be managed by Docker Compose. In this file, we list Redis as a service and map port 6379 on our machine to port 6379 in the Redis container. To start the Redis container, we simply run &lt;code&gt;docker compose up -d&lt;/code&gt;. The &lt;code&gt;-d&lt;/code&gt; flag runs the container in detached mode, i.e. in the background, so it keeps running after we close the terminal.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;In the next step, we install our dependencies by running &lt;code&gt;npm install axios express redis nodemon&lt;/code&gt;. Ideally, we would install nodemon as a dev dependency, but we'll ignore that distinction here.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After installation, open the package.json file and add the scripts to run the application. Your scripts section should look like the image below.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfnt6y223xqtehx1n2c3.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdfnt6y223xqtehx1n2c3.PNG" alt="package.json script section"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We use nodemon so that whenever we make a change, the express server is restarted with our changes automatically and as such, we do not have to manually restart the server every time for our changes to take effect.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Next, create the app.js file by running the command &lt;code&gt;touch app.js&lt;/code&gt; in the terminal. Open the file and follow along.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;First, we import the necessary packages as below.&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const express = require("express");
const { createClient } = require("redis");
const axios = require("axios");
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We'll be making use of axios to make requests to &lt;code&gt;https://swapi.dev/api&lt;/code&gt;, an API that has information on various elements of Star Wars. The goal of this tutorial is to build an application that makes a request to the API above and caches the data. Subsequent requests for that particular data are then served from the cache instead of the API.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy and paste the code below after importing the appropriate packages. &lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const app = express();
const port = 3000;
const startWarsUrl = "https://swapi.dev/api";
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Here we are simply creating an instance of the Express application, defining the port on which the application will listen, and saving the URL of the API in a constant.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After doing the above, we instantiate the Redis client and then, following the best practices in the Redis documentation, listen for the error and connection events.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// connect to redis db on localhost
const redisClient = createClient();

// listen for errors
redisClient.on("error", (err) =&amp;gt; {
  console.log("redis client error", err);
});

// listen for successful connection
redisClient.connect().then(() =&amp;gt; {
  console.log("redis connected successfully");
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;The function below is responsible for fetching data from the API. It takes a parameter named &lt;code&gt;dataToFetch&lt;/code&gt;, and on the first line of the function, you can see the valid values in the form of a comment. You can navigate to the URL for more information.&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const fetchStarWarsData = async (dataToFetch) =&amp;gt; {
  // data to fetch can be people, planets, films, species, vehicles and starships
  const response = await axios.get(`${startWarsUrl}/${dataToFetch}`);

  return response.data;
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;We then create an endpoint that accepts a parameter and uses the fetchStarWarsData function to fetch the data from the API if it is not already cached in Redis.&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app.get("/star-wars/:dataToFetch", async (req, res) =&amp;gt; {
  try {
    const dataToFetch = req.params.dataToFetch;

    // check if data is cached and return cached data
    const cacheResult = await redisClient.get(dataToFetch);
    if (cacheResult) {
      const parsedResult = JSON.parse(cacheResult);

      return res.status(200).json({ isCached: true, data: parsedResult });
    }

    // fetch data from api and cache if data is not already cached
    const result = await fetchStarWarsData(dataToFetch);
    await redisClient.set(dataToFetch, JSON.stringify(result), {
      EX: 300,
      NX: true,
    });

    return res.status(200).json({ isCached: false, data: result });
  } catch (error) {
    return res.status(500).json({ message: error.message });
  }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The &lt;code&gt;EX&lt;/code&gt; option sets the time, in seconds, that the data should live in the cache, while the &lt;code&gt;NX&lt;/code&gt; option, when set to true, tells Redis to write the key only if it does not already exist.&lt;/p&gt;
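&lt;p&gt;The effect of these two options can be sketched with a tiny in-memory stand-in for SET. To be clear, this is not the redis client itself, only an illustration of the semantics: &lt;code&gt;EX&lt;/code&gt; attaches an expiry, and &lt;code&gt;NX&lt;/code&gt; makes the write a no-op when a live value already exists.&lt;/p&gt;

```javascript
// Toy in-memory illustration of the { EX, NX } options to SET.
// This is NOT the redis client, just a sketch of the semantics.
const store = new Map();

function set(key, value, { EX, NX } = {}, now = Date.now()) {
  const existing = store.get(key);
  const alive = existing !== undefined && existing.expiresAt > now;

  // NX: refuse to write when a live (non-expired) value already exists
  if (NX && alive) return false;

  store.set(key, {
    value,
    // EX is a time-to-live in seconds; Infinity means "never expires"
    expiresAt: EX ? now + EX * 1000 : Infinity,
  });
  return true;
}

function get(key, now = Date.now()) {
  const entry = store.get(key);
  // expired or missing entries behave as if they were never set
  if (entry === undefined || now >= entry.expiresAt) return null;
  return entry.value;
}
```

&lt;p&gt;With &lt;code&gt;EX: 300&lt;/code&gt;, a cached entry disappears after five minutes, at which point the next request falls through to the API and re-populates the cache.&lt;/p&gt;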

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Finally, all we have to do is listen for connections using the code below.&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app.listen(port, () =&amp;gt; {
  console.log(`server is listening on port ${port}`);
});
&lt;/code&gt;&lt;/pre&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We then run the command &lt;code&gt;npm run dev&lt;/code&gt; to run the app using the nodemon package, or &lt;code&gt;npm run start&lt;/code&gt; to run it using node.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To test the application, open Postman or any API testing tool of your choice and make a &lt;code&gt;GET&lt;/code&gt; request to &lt;code&gt;localhost:3000/star-wars/planets&lt;/code&gt;, &lt;code&gt;localhost:3000/star-wars/films&lt;/code&gt;, or whatever data you want to fetch. You can check the API itself to see the available resources.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Your first response should look like the one below.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F65c07brblifjhvui3z86.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F65c07brblifjhvui3z86.PNG" alt="Uncached response"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Make the same request again and you should get a response like the one below.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frlx4et3c9s2nwpbigqkd.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frlx4et3c9s2nwpbigqkd.PNG" alt="Cached response"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Notice the &lt;code&gt;isCached&lt;/code&gt; field in both of the returned responses. In the first response, it was false because we made a request to the external API; in the second, it was true because we got the result from the Redis cache.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Overall, your code should look like the one below:&lt;/p&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const express = require("express");&lt;br&gt;
const { createClient } = require("redis");&lt;br&gt;
const axios = require("axios");

&lt;p&gt;const app = express();&lt;br&gt;
const port = 3000;&lt;br&gt;
const startWarsUrl = "&lt;a href="https://swapi.dev/api" rel="noopener noreferrer"&gt;https://swapi.dev/api&lt;/a&gt;";&lt;/p&gt;

&lt;p&gt;// connect to redis db on localhost&lt;br&gt;
const redisClient = createClient();&lt;/p&gt;

&lt;p&gt;// listen for errors&lt;br&gt;
redisClient.on("err", (err) =&amp;gt; {&lt;br&gt;
  console.log("redis client error", err);&lt;br&gt;
});&lt;/p&gt;

&lt;p&gt;// listen for successful connection&lt;br&gt;
redisClient.connect().then(() =&amp;gt; {&lt;br&gt;
  console.log("redis connected successfully");&lt;br&gt;
});&lt;/p&gt;

&lt;p&gt;const fetchStarWarsData = async (dataToFetch) =&amp;gt; {&lt;br&gt;
  // data to fetch can be people, planet, films, species, vehicles and startships&lt;br&gt;
  const response = await axios.get(&lt;code&gt;${startWarsUrl}/${dataToFetch}&lt;/code&gt;);&lt;/p&gt;

&lt;p&gt;return response.data;&lt;br&gt;
};&lt;/p&gt;

&lt;p&gt;app.get("/star-wars/:dataToFetch", async (req, res) =&amp;gt; {&lt;br&gt;
  try {&lt;br&gt;
    const dataToFetch = req.params.dataToFetch;&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// check if data is cached and return cached data
const cacheResult = await redisClient.get(dataToFetch);
if (cacheResult) {
  const parsedResult = JSON.parse(cacheResult);

  return res.status(200).json({ isCached: true, data: parsedResult });
}

// fetch data from api and cache if data is not already cached
result = await fetchStarWarsData(dataToFetch);
await redisClient.set(dataToFetch, JSON.stringify(result), {
  EX: 300,
  NX: true,
});

return res.status(200).json({ isCached: false, data: result });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;} catch (error) {&lt;br&gt;
    return res.status(500).json({ message: error.message });&lt;br&gt;
  }&lt;br&gt;
});&lt;/p&gt;

&lt;p&gt;app.listen(port, () =&amp;gt; {&lt;br&gt;
  console.log(&lt;code&gt;server is listening on port ${port}&lt;/code&gt;);&lt;br&gt;
});&lt;br&gt;
&lt;/p&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this article, we built an application using Node.js, Express, Docker Compose, and Redis to demonstrate how to fetch data from an API, cache it, and return the cached data as a response to the client. By implementing caching, subsequent requests for the same data are served from the cache instead of making repeated API calls, significantly improving response times and reducing server load. While this tutorial covers the basics, the principles discussed here provide a foundation for more advanced caching strategies and optimizations in your applications.&lt;/p&gt;

&lt;p&gt;If you enjoyed this article, kindly leave a like. It would be much appreciated. Thanks...&lt;/p&gt;

</description>
      <category>redis</category>
      <category>express</category>
      <category>backenddevelopment</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>From Skeptic to Believer: My Docker Story</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Wed, 13 Mar 2024 08:29:08 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/from-skeptic-to-believer-my-docker-story-1hof</link>
      <guid>https://forem.com/alahirajeffrey/from-skeptic-to-believer-my-docker-story-1hof</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;The first time I heard about Docker was about two years ago. Since then, I have read various articles and watched various tutorials on how to use Docker, but never really felt the need to install or use it. However, I started a new job as a backend developer a month ago, and Docker became an essential part of my workflow, so I had no option but to download and set it up.&lt;/p&gt;

&lt;p&gt;In this article, I will share my brief experience with Docker. I will also delve into key Docker concepts, such as images and containers, and highlight some of the most useful Docker commands I've come across and used so far.&lt;/p&gt;

&lt;h2&gt;
  
  
  My Experience with Docker So Far
&lt;/h2&gt;

&lt;p&gt;Aside from the initial hurdle of setting up the Windows Subsystem for Linux (WSL), as well as a few issues with authentication on Docker Hub, using Docker has been quite pleasant. So pleasant that I no longer have any database installed locally. A week into playing around with Docker, I uninstalled Postgres, MongoDB, MySQL, and Neo4j. Whenever I need a database running locally now, I just create a container from one of the several images I have downloaded.&lt;/p&gt;

&lt;p&gt;Before I go further, I would also like to say that one of the major issues I had was distinguishing between Docker images and Docker containers. However, an explanation I read on the GeeksforGeeks website pretty much cleared it up.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;The concept of Image and Container is like a class and object, in which an object is an instance of a class, and a class is the blueprint of the object.&lt;/code&gt; &lt;/p&gt;

&lt;p&gt;I know it is a little bit more complicated than that, but this simple explanation gave me the foundation I needed to understand the difference between both concepts.&lt;/p&gt;
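&lt;p&gt;That analogy maps directly onto code: one class (the image) is a blueprint, and each instance created from it (a container) is an independent copy with its own state. The class name below is made up purely for illustration.&lt;/p&gt;

```javascript
// The image/container relationship as a class/object relationship:
// the class is the blueprint (image); each instance is an
// independent running copy with its own state (container).
class PostgresImage {
  constructor(name) {
    this.name = name;    // per-container configuration
    this.running = true; // per-container state
  }
  stop() {
    this.running = false; // stopping one container leaves the others untouched
  }
}

// two "containers" created from the same "image"
const containerA = new PostgresImage("db-a");
const containerB = new PostgresImage("db-b");
containerA.stop();
```

&lt;p&gt;Just as stopping one object's state does not affect another instance, stopping or deleting a container never modifies the image it was created from.&lt;/p&gt;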

&lt;h2&gt;
  
  
  Other Important Docker Concepts
&lt;/h2&gt;

&lt;p&gt;Some other important docker concepts include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Dockerfile&lt;/strong&gt;: A Dockerfile is a text document with instructions on how to build a Docker image. It is like a recipe that defines all the steps needed to create the image, such as installing dependencies, setting up environment variables, copying files, and configuring the container. Below is an example Dockerfile for a Python application that installs dependencies, exposes port 8080, sets an environment variable, and runs the application when the container starts.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Use a base image
FROM python:3.9-slim

# Set the working directory in the container to /app
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install dependencies from requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Expose port 8080
EXPOSE 8080

# Define environment variable
ENV ENVIRONMENT production

# Run the application when the container launches
CMD ["python", "app.py"]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Docker Daemon&lt;/strong&gt;: This is the background process responsible for managing Docker objects such as images, containers, networks, and volumes. The Docker daemon exposes an API that allows users and client applications to issue commands and interact with Docker.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  My Most Used Docker Commands
&lt;/h2&gt;

&lt;p&gt;So we have come to arguably the most important part of this article. Here I am going to list my most used Docker commands.&lt;/p&gt;

&lt;p&gt;Side Note: This is not an exhaustive list.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker --version&lt;/code&gt;: Displays the current version of Docker installed on your system. It's what I use to quickly check that Docker is installed and the CLI is working.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker pull &amp;lt;image&amp;gt;&lt;/code&gt;: Downloads a Docker image to your local machine e.g. &lt;code&gt;docker pull mongo&lt;/code&gt; pulls the latest official mongo image from docker hub.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker image ls&lt;/code&gt;: Lists all Docker images currently stored on your local machine.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker ps&lt;/code&gt;: Lists all running Docker containers.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker ps -a&lt;/code&gt;: Lists all Docker containers, including those that are not currently running.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker start &amp;lt;container_name&amp;gt;&lt;/code&gt; or &lt;code&gt;docker start &amp;lt;container_id&amp;gt;&lt;/code&gt;:  Starts a stopped Docker container with the specified container name or ID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker stop &amp;lt;container_name&amp;gt;&lt;/code&gt; or &lt;code&gt;docker stop &amp;lt;container_id&amp;gt;&lt;/code&gt;:  Stops a running Docker container with the specified container name or ID.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker run --name postgres_db -p 5432:5432 -e POSTGRES_PASSWORD=password -d postgres&lt;/code&gt;: Creates and starts a new Docker container named "postgres_db" running PostgreSQL, exposes port 5432 on the host machine, sets the environment variable POSTGRES_PASSWORD to "password", and runs the container in detached mode (-d). Note that the &lt;code&gt;-d&lt;/code&gt; flag must come before the image name.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;NB:&lt;/strong&gt; Running a container in detached mode means that the container runs in the background, and the terminal prompt is returned to you immediately after starting the container. This allows you to continue using the terminal for other tasks while the container runs in the background.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;docker run --name mongo_db -p 27017:27017 -d mongo&lt;/code&gt;: Creates and starts a new Docker container named "mongo_db" running MongoDB, exposes port 27017 on the host machine, and runs the container in detached mode (-d).&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In conclusion, my journey with Docker has been nothing short of interesting. What initially seemed like a daunting tool to learn has quickly become an essential part of my development workflow. &lt;/p&gt;

&lt;p&gt;As I continue to explore and expand my knowledge of Docker, I am excited about the possibilities it presents. I look forward to incorporating Docker into more projects and leveraging its capabilities to build robust and scalable applications.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>containers</category>
    </item>
    <item>
      <title>Exploring Neo4j: A Beginners Guide to Graph Databases</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Sat, 20 Jan 2024 21:46:37 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/exploring-neo4j-a-beginners-guide-to-graph-databases-24n3</link>
      <guid>https://forem.com/alahirajeffrey/exploring-neo4j-a-beginners-guide-to-graph-databases-24n3</guid>
      <description>&lt;h2&gt;
  
  
  Introduction to Graph Databases
&lt;/h2&gt;

&lt;p&gt;Graph databases are a type of NoSQL database that represent and store data in a graph structure, i.e. as nodes, relationships, and properties, as opposed to standard SQL databases, which store data in tables as rows and columns, and other NoSQL databases, which store data as key-value pairs or documents.&lt;/p&gt;

&lt;p&gt;This simply means that in graph databases, data is stored without being restricted to a predefined model or structure, as is the case with other forms of databases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Graph Database Concepts
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Nodes&lt;/strong&gt;: these are used to represent entities or objects in a graph. These entities often hold a number of key-value pairs, or properties, which contain information about the entity. Nodes can be used to represent a person, place, or thing. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Relationships&lt;/strong&gt;: these help to define connections between nodes and show how various nodes are related to each other.  In graph databases, relationships are first-class citizens and are as important as nodes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Properties&lt;/strong&gt;: these are the key-value pairs associated with nodes or relationships between nodes. They help store additional information and context about entities or connections in a graph.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Below is the image of a graph with three nodes (A, B, and C) which have three relationships between one another (the arrows).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkxfvqdx2x67f7r0gu73.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkxfvqdx2x67f7r0gu73.jpg" alt="Graph with 3 nodes and 3 relationships" width="455" height="414"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Graph Databases
&lt;/h2&gt;

&lt;p&gt;Choosing a graph database over other forms of databases depends on a variety of factors such as the type of data as well as specific requirements of the application being built. However, here are some reasons for using graph databases:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Strength in Relationship&lt;/strong&gt;: Graph databases excel at representing and querying relationships between entities. With traditional databases, relationship queries are often very expensive and time-consuming. Therefore, If a data model involves complex and interconnected relationships, a graph database would be a natural fit.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Flexibility&lt;/strong&gt;: Graph databases often have a schema-less design therefore allowing for flexible data modeling. This flexibility is beneficial as it allows for easy modification of nodes and relationships without the need for a rigid schema.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Scalability&lt;/strong&gt;: Many graph databases support horizontal scalability. This means as your dataset grows, you can distribute the graph across multiple servers, ensuring performance and scalability for connected data scenarios.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Common Use Cases of Graph Databases
&lt;/h2&gt;

&lt;p&gt;Graph databases are particularly suitable for specific use cases such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Social networks&lt;/strong&gt;: Graph databases are well-suited for modeling and querying social relationships within a network. In a social network, individuals (nodes) are connected by various relationships such as friendships, and follows. Graph databases with their efficiency in traversing and analyzing relationships can make it easier to retrieve information about a user's friends, connections, and overall network structure.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Fraud detection&lt;/strong&gt;: Graph databases can help in uncovering patterns and anomalies within interconnected data. In fraud detection, transactions or entities can be represented as nodes, and relationships between them can indicate various interactions. Graph algorithms can identify suspicious patterns, such as unusual connections, multiple accounts linked to a single entity, or unexpected transaction flows. This makes graph databases effective in detecting fraudulent activities that might be challenging for traditional databases to identify.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Recommendation engines&lt;/strong&gt;: Graph databases can help in building recommendation engines because they can model complex relationships between users and items. Nodes can represent users and items, and relationships can signify user preferences, purchases, or interactions. Graph-based recommendation engines can then leverage these relationships to provide personalized suggestions, uncovering connections between users with similar preferences or behaviors.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Network analysis&lt;/strong&gt;: Graph databases are essential tools for analyzing and understanding complex networks, which can represent a variety of systems, including computer networks, transportation networks, or organizational structures. Nodes can represent entities (e.g., computers, locations, employees), and relationships can denote connections or interactions between these entities. Analyzing the structure of the graph can reveal insights such as central nodes, clusters, or potential bottlenecks.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In summary, if your application's primary focus is on relationships, and it benefits from an adaptable, schema-less design, a graph database would be a good choice. &lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started with graph databases
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;There is a plethora of graph databases; however, one of the most popular is &lt;a href="https://github.com/neo4j/neo4j" rel="noopener noreferrer"&gt;neo4j&lt;/a&gt;. You can start working with neo4j by following the instructions in the &lt;a href="https://github.com/neo4j/neo4j" rel="noopener noreferrer"&gt;link&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Key Features of Neo4j:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Cypher Query Language&lt;/strong&gt;: Neo4j uses Cypher, a declarative query language specifically designed for graph databases. It simplifies expressing complex graph patterns. Cypher's syntax is designed to be human-readable and expressive, making queries easy to write and understand. Whether you're retrieving data, creating nodes and relationships, or performing complex graph traversals, Cypher provides a powerful and intuitive language for interacting with graph databases.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;ACID Compliance&lt;/strong&gt;: Neo4j ensures Atomicity, Consistency, Isolation, and Durability, making it suitable for transactional applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Scalability&lt;/strong&gt;: Neo4j can scale horizontally to accommodate growing datasets and increasing workloads.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To illustrate the concepts learnt, let's create a simple graph modeling social relationships:&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating Nodes
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE (:Profile {first_name: 'Jeffrey',last_name:'Calvin',  age: 30})
CREATE (:Profile {first_name: 'Daniel',last_name:'Alahira',  age: 25})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Cypher query above creates two nodes representing profiles with first names (Jeffrey and Daniel), last names (Calvin and Alahira), and ages (30 and 25).&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating Relationships
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;MATCH (a:Profile {first_name: 'Jeffrey'}), (b:Profile {first_name:'Daniel'})
CREATE (a)-[:BROTHERS]-&amp;gt;(b)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Cypher query above establishes a "BROTHERS" relationship between the two nodes created earlier. Note that relationships in Neo4j are directed: this one points from the Jeffrey node to the Daniel node.&lt;/p&gt;

&lt;h2&gt;
  
  
  Querying the Graph
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;MATCH (a:Profile {first_name: 'Jeffrey'})-[:BROTHERS]-&amp;gt;(brother)
RETURN brother.first_name,brother.last_name
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Cypher query above finds all brothers of the profile with the first name Jeffrey by following its outgoing "BROTHERS" relationships.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Neo4j provides a powerful platform for handling interconnected data through its graph database capabilities. Whether you're building social networks, detecting fraud, or fine-tuning recommendation engines, Neo4j lets you work naturally with highly connected data.&lt;/p&gt;

</description>
      <category>database</category>
    </item>
    <item>
      <title>Step by Step Guide to Authentication with JSON Web Tokens (JWT) with express and passport</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Wed, 04 Oct 2023 11:42:36 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/how-to-implement-json-web-authentication-jwt-authentication-with-passport-in-express-39jg</link>
      <guid>https://forem.com/alahirajeffrey/how-to-implement-json-web-authentication-jwt-authentication-with-passport-in-express-39jg</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Authentication is a crucial part of web development and one of the ways developers secure an application. Though authentication and authorization are often used interchangeably, they are not the same thing. Authentication is the process of determining who a user is, while authorization is the process of determining whether a user has access to the particular resource they are requesting.&lt;/p&gt;

&lt;p&gt;JSON Web Tokens have become a popular choice for implementing authentication in web applications due to their simplicity, ease of implementation, and effectiveness.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before we begin, ensure you have Node.js and npm (Node Package Manager), MongoDB, and Postman installed locally. A little knowledge of the tools listed above is also required.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project setup
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Start by creating a directory for the project. Open your terminal and type the command &lt;code&gt;mkdir passport-jwt-express-auth&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Navigate to the directory using the command &lt;code&gt;cd passport-jwt-express-auth&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Initialize a node project using the command &lt;code&gt;npm init -y&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install the dependencies by running &lt;code&gt;npm install express jsonwebtoken nodemon passport passport-jwt bcrypt dotenv mongoose&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;bcrypt&lt;/code&gt; would be used to hash user passwords, &lt;code&gt;jsonwebtoken&lt;/code&gt; to sign tokens, &lt;code&gt;passport-jwt&lt;/code&gt; to retrieve and verify tokens, &lt;code&gt;nodemon&lt;/code&gt; to automatically restart the server during development, &lt;code&gt;dotenv&lt;/code&gt; to access environment variables, &lt;code&gt;mongoose&lt;/code&gt; as the MongoDB ODM, and &lt;code&gt;express&lt;/code&gt; to create the server.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a &lt;code&gt;.env&lt;/code&gt; file at the root of the project to hold the project environment variables. The main environment variables we would use in the project would be &lt;code&gt;JWT_SECRET&lt;/code&gt;, &lt;code&gt;PORT&lt;/code&gt;, and &lt;code&gt;MONGO_URI&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;JWT_SECRET&lt;/code&gt; would hold the secret which our application would use to sign tokens, &lt;code&gt;PORT&lt;/code&gt; would hold the port number in which we would serve our application, &lt;code&gt;MONGO_URI&lt;/code&gt; would hold the link to the Mongodb database.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the &lt;code&gt;package.json&lt;/code&gt; file and add the following. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;"type": "module"&lt;/code&gt; This would enable you to use es6 imports in the project. Next, add the following under the scripts section&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"start:dev": "nodemon src/app.js",
"start:prod": "node src/app.js"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These scripts would be used to start the express server.&lt;/p&gt;
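&lt;p&gt;For reference, the &lt;code&gt;.env&lt;/code&gt; file created earlier might look like the following (the values below are placeholders; replace them with your own):&lt;/p&gt;

```
JWT_SECRET=replace-with-a-long-random-string
PORT=3000
MONGO_URI=mongodb://127.0.0.1:27017/passport-jwt-express-auth
```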

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create a folder to hold the source code using &lt;code&gt;mkdir src&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Navigate to the folder using the command &lt;code&gt;cd src&lt;/code&gt; and create four files and name them &lt;code&gt;app.js&lt;/code&gt;, &lt;code&gt;passport.js&lt;/code&gt;, &lt;code&gt;model.js&lt;/code&gt;, and &lt;code&gt;auth.js&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The &lt;code&gt;auth.js&lt;/code&gt; file would hold the authentication routes, the &lt;code&gt;passport.js&lt;/code&gt; file would hold the configurations for passport, the &lt;code&gt;model.js&lt;/code&gt; file would hold the user model and the &lt;code&gt;app.js&lt;/code&gt; file would hold the code to bootstrap the server.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Setting up the express server
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Open the &lt;code&gt;app.js&lt;/code&gt; file and type the code below.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import express from "express";
import authRouter from "./auth.js";
import dotenv from 'dotenv';

// configure dotenv to access environment variables
dotenv.config();

const PORT = process.env.PORT || 3000;

// setup express server
const server = express();

server.use(express.json());
server.use("/api/v1/auth", authRouter);

// listen for connections
server.listen(PORT, () =&amp;gt; {
  console.log(`server is listening on port ${PORT}`);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code above sets up the express server and attaches the authentication routes (we have yet to create them) to the server. &lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up the database and schema
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Open the &lt;code&gt;model.js&lt;/code&gt; file and create the user schema. The schema establishes the fields and types of data to be stored in the Mongodb database. Create the user schema by typing the code below.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const mongoose = require('mongoose');
const Schema = mongoose.Schema;

const UserSchema = new Schema({
  email: {
    type: String,
    required: true,
    unique: true
  },
  password: {
    type: String,
    required: true
  }
});

const UserModel = mongoose.model('user', UserSchema);

module.exports = UserModel;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The schema above specifies that each document has an email and a password field, both of type String and both required. The &lt;code&gt;unique: true&lt;/code&gt; property on the email field ensures that the database does not store duplicate emails. Note that since we set &lt;code&gt;"type": "module"&lt;/code&gt; earlier, the file uses ES6 imports and exports. Mongoose takes the schema and converts it to a model that will be used to perform CRUD operations later on.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the &lt;code&gt;app.js&lt;/code&gt; file and update it with the code below to enable the server to connect to the Mongodb database when the server starts up.&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import mongoose from "mongoose";

// MongoDB connection uri
const MONGO_URI =
  process.env.MONGO_URI ||
  "mongodb://127.0.0.1:27017/passport-jwt-express-auth";

// connect to MongoDB
mongoose
  .connect(MONGO_URI, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  })
  .then(() =&amp;gt; {
    console.log("successfully connected to mongodb");
  })
  .catch((err) =&amp;gt; {
    console.log(err);
  });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;We first imported the Mongoose library and created the MONGO_URI variable to hold the connection string of our MongoDB database, read from the &lt;code&gt;MONGO_URI&lt;/code&gt; environment variable (with a local default).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We then used the connection string to connect to the database. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Configuring passport and JWT
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Open the passport.js file and type the following code
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { ExtractJwt, Strategy } from "passport-jwt";
import passport from "passport";
const UserModel = require("./model.js");

const opts = {
  jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
  secretOrKey: "secret",
};

passport.use(
  new Strategy(opts, async (payload, done) =&amp;gt; {
    try {
      const user = UserModel.findById(payload.id);
      if (user) return done(null, user);
    } catch (error) {
      return done(error);
    }
  })
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code above uses passport and the &lt;code&gt;passport-jwt&lt;/code&gt; strategy to extract the JSON Web Token from the request header and verify it using the JWT secret read from the environment variables. If the token is valid, the user ID embedded in the token is used to look up and return the user's details from the database. &lt;/p&gt;

&lt;h2&gt;
  
  
  Implementing authentication routes
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;We would implement the register route first. It is important to note that we would not be using best practices as it is out of the scope of this post.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import express from "express";
import UserModel from "./model.js";

const authRouter = express.Router();

authRouter.post("/register", async (req, res, next) =&amp;gt; {
  try {
    const user = await UserModel.create({
      email: req.body.email,
      password: req.body.password,
    });

    return res.status(201).json({
      message: "user created",
      user: { email: user.email, id: user._id },
    });
  } catch (error) {
    console.log(error);
  }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the register route, we simply create the user and return a response that contains a message, the user's email, and id. It is important to note that in a real-world app, the password would be hashed and the input validated. In the next step, we'll create the login route.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;authRouter.post("/login", async (req, res, next) =&amp;gt; {
  try {
    //check if user exists
    const userExists = await UserModel.findOne({ email: req.body.email });
    if (!userExists)
      return res.status(400).json({ message: "user does not exist" });

    // check if password is correct
    if (userExists.password !== req.body.password)
      return res.status(400).json({ message: "incorrect password" });

    // generate access token
    const accessToken = jwt
      .sign(
        {
          id: userExists._id,
        },
        "secret",
        { expiresIn: "1d" }
      )

    return res
      .status(200)
      .json({ message: "user logged in", accessToken: accessToken });
  } catch (error) {
    console.log(error);
    next(error);
  }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the login route, we first check whether the user exists and then whether the password is correct. After that, we sign an access token using the user's id and a secret key stored in the .env file, and we specify how long the token remains valid. In the next code snippet, we create a route to return the user's profile. This is the route that will be protected using JWT.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;authRouter.get("/profile", async (req, res, next) =&amp;gt; {
  try {
    // check if user exists
    const userExists = await UserModel.findOne({ email: req.body.email });
    if (!userExists)
      return res.status(400).json({ message: "user does not exist" });

    return res
      .status(200)
      .json({ userId: userExists._id, email: userExists.email });
  } catch (error) {
    console.log(error);
    next(error);
  }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this route, we simply check whether the user exists using the supplied email and then return the user's id and email.&lt;/p&gt;

&lt;h2&gt;
  
  
  Protecting routes
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;To protect the route to get a user's profile, we first import passport and the passport strategy implemented earlier on into the routes file.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import passport from "passport";
import "./passport.js";
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;In the next step, we use the passport package to guard the profile route so that only authenticated users can access it. We update the route by adding the line &lt;code&gt;passport.authenticate("jwt", { session: false }),&lt;/code&gt;. Your code should look like the snippet below:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;authRouter.get(
  "/profile",
  passport.authenticate("jwt", { session: false }),
  async (req, res, next) =&amp;gt; {
   // leave as before
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Testing
&lt;/h2&gt;

&lt;p&gt;The next step would involve testing the routes in the project.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Start the server by running the command &lt;code&gt;npm run start:dev&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Use Postman to create a user by sending a request to &lt;code&gt;localhost:3000/api/v1/auth/register&lt;/code&gt;. Change the port number if yours is different.&lt;/li&gt;
&lt;li&gt;Log in by sending your email and password to &lt;code&gt;localhost:3000/api/v1/auth/login&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Try accessing the profile route without a &lt;code&gt;Bearer&lt;/code&gt; token and an error is returned. However, if you include the correct access token (provided when you log in) in the Authorization header, the route returns the user's id and email.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this tutorial, we learned the importance of authentication and how to implement a basic authentication system using JSON Web Tokens (JWT) and Passport in a Node.js application.&lt;/p&gt;

&lt;p&gt;We started by setting up the project, installing necessary dependencies, and configuring our Express server to handle authentication routes. We also established a MongoDB database schema for user data and integrated Passport for JWT authentication.&lt;/p&gt;

&lt;p&gt;It is important to note that while this tutorial provides a solid foundation, real-world authentication systems require additional features such as password hashing, input validation, error handling, and more comprehensive testing. Building upon this foundation, you can explore advanced authentication practices and further enhance the security of your web applications.&lt;/p&gt;

&lt;p&gt;Feel free to refer to the provided &lt;a href="https://github.com/alahirajeffrey/passport-jwt-express-auth" rel="noopener noreferrer"&gt;code&lt;/a&gt; and adapt it to your specific project needs.&lt;/p&gt;

</description>
      <category>express</category>
      <category>security</category>
    </item>
    <item>
      <title>Strategies for Deploying Web Applications to Production</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Tue, 29 Aug 2023 07:47:49 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/application-deployment-strategies-5cha</link>
      <guid>https://forem.com/alahirajeffrey/application-deployment-strategies-5cha</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;There are a variety of ways and techniques by which web applications are deployed to production. These techniques are called deployment strategies.&lt;/p&gt;

&lt;p&gt;The choice of deployment strategy can make or break a business as it has a direct impact on user experience. It is therefore imperative to pick a strategy that fits the needs of the application.&lt;/p&gt;

&lt;p&gt;There are a plethora of deployment strategies, some of the more common techniques include:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Continuous Deployment&lt;/strong&gt;: In this strategy, new versions of an application are immediately deployed to production as soon as they pass automated testing. This method allows for the latest code changes to be available to end users. This method is suitable for businesses that develop new features constantly and want to get them to production quickly. It however requires robust testing to work properly. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Blue/Green Deployment&lt;/strong&gt;: Here, two versions of an application are deployed in two separate environments. The blue environment contains the stable version of the application, while the green environment contains the new version. Requests are initially routed to the blue environment until the green environment has been tested and found to be stable. Thereafter, requests are routed to the green environment at the load balancer level. With this strategy, the impact of a failed or unstable deployment is minimal; however, it is expensive, as it requires double the resources, and rollouts can be slow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Canary deployment&lt;/strong&gt;: This deployment technique involves releasing the updated version of an application in increments i.e. to a subset of users at a time and then gradually increasing the number of users until all users are using the new version. This method is often used when testing is minimal and developers have little confidence in the stability of the new application.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Recreate deployment&lt;/strong&gt;: Here, the old application is simply shut down and the new application is then deployed in its place. This strategy is straightforward however, there will always be downtime while the old application is shut down and the new one is deployed. This can significantly impact user experience. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Shadow deployment&lt;/strong&gt;: In this strategy, both old and new applications are deployed. However, only output from the stable application is sent to users. The requests sent to the stable application are copied and sent to the new application. This allows the developers to monitor and compare the outputs and results from both applications. While this method allows the developers to test a new application live, it can be expensive and complex to set up. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ramped/Rolling Deployment&lt;/strong&gt;: Much like the Canary strategy, the new application is deployed gradually. However, unlike the Canary method, where the new version is released to a subset of users, here it is rolled out to a subset of server instances at a time. &lt;/p&gt;

&lt;p&gt;There are more strategies used by companies including phased deployment,  A/B testing, ring deployment, and feature flags among others and all have their merits and demerits. &lt;/p&gt;

&lt;p&gt;If I were to pick a strategy, I'd pick the recreate deployment strategy if my company were still a young startup, and use monitoring to find the times when my users were least active to redeploy. Otherwise, I'd go for a blue/green or continuous deployment strategy.&lt;/p&gt;

&lt;p&gt;However, the choice of deployment strategies to use is often dependent on a variety of factors such as resources available, the complexity of the project and end users among others. &lt;/p&gt;

&lt;p&gt;What strategy would you pick and why?&lt;/p&gt;

</description>
      <category>devops</category>
      <category>backend</category>
    </item>
    <item>
      <title>Scaling Nodejs Applications: An Introduction</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Mon, 28 Aug 2023 17:12:51 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/scaling-nodejs-applications-an-introduction-n39</link>
      <guid>https://forem.com/alahirajeffrey/scaling-nodejs-applications-an-introduction-n39</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Scaling is an important aspect of web development: as an application grows, its user base, traffic, and number of requests increase. It is essential to ensure your web application can handle the increased load without degrading response times and, by extension, user experience. This article introduces steps and strategies developers can take to scale Nodejs applications, and by extension applications built with other technologies, languages, and frameworks, to meet increased demand.&lt;/p&gt;

&lt;h2&gt;
  
  
  Vertical Scaling
&lt;/h2&gt;

&lt;p&gt;This is referred to as scaling up and is one of the simplest scaling methods to implement: it involves increasing the resources available to a server, such as CPU, memory, and storage. It is quite easy to do, as most cloud providers allow users to vertically scale their servers with a few clicks. It is important to note, however, that there are hardware constraints that limit how far a single server can be scaled.  &lt;/p&gt;

&lt;h2&gt;
  
  
  Horizontal Scaling
&lt;/h2&gt;

&lt;p&gt;This is referred to as scaling out and involves increasing the number of servers responsible for hosting an application. Requests from users are then distributed across multiple servers ensuring that no single server is overwhelmed. It is a little bit more complex than vertical scaling as this method involves the use of a load balancer to distribute the load. &lt;/p&gt;

&lt;h2&gt;
  
  
  Load Balancing
&lt;/h2&gt;

&lt;p&gt;This method is often associated with horizontal scaling and involves setting up a server that distributes the load or incoming requests among a group of servers. There are several methods or algorithms load balancers use to distribute load.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Round Robin: This is the simplest and most commonly used method. Here requests are distributed across various servers in a cyclical or rotated manner.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Least Connections: In this method, requests are routed to the server with the least number of active connections.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Weighted Round Robin: In this method, requests are routed to servers based on weights. Servers with higher weights receive more requests than others. This is especially important when one of the servers has more resources than other servers.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
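&lt;p&gt;To make the round-robin method concrete, here is a minimal sketch in JavaScript (the server names are hypothetical; real load balancers such as Nginx implement this for you):&lt;/p&gt;

```javascript
// Minimal round-robin selector: hands out servers in a repeating cycle
function createRoundRobin(servers) {
  let index = 0;
  return function next() {
    const server = servers[index];
    index = (index + 1) % servers.length; // wrap around after the last server
    return server;
  };
}

const next = createRoundRobin(["app-1", "app-2", "app-3"]);
console.log(next()); // app-1
console.log(next()); // app-2
console.log(next()); // app-3
console.log(next()); // app-1 — the cycle repeats
```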

&lt;p&gt;Nginx is a popular web server that is often used for load balancing though it has numerous other applications.&lt;/p&gt;

&lt;h2&gt;
  
  
  Database Caching
&lt;/h2&gt;

&lt;p&gt;Database caching is a method used to reduce the number of queries sent to a database, thereby increasing performance and scalability. It works by storing frequently accessed data in memory. This reduces the number of queries a server makes to a database, i.e. the server queries the cache instead.&lt;/p&gt;

&lt;p&gt;Redis and Memcached are popular caching solutions for Nodejs. &lt;/p&gt;
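&lt;p&gt;The core idea behind a cache can be sketched in a few lines of JavaScript: an in-memory map with a time-to-live per entry. This is only an illustration; Redis and Memcached add persistence options, eviction policies, and the ability to share the cache across servers:&lt;/p&gt;

```javascript
// Tiny in-memory cache with a per-entry time-to-live (in milliseconds)
class TTLCache {
  constructor() {
    this.store = new Map();
  }
  set(key, value, ttlMs) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }
  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined; // never cached
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // expired: evict lazily and report a miss
      return undefined;
    }
    return entry.value;
  }
}

// On a cache hit the database query is skipped entirely
const cache = new TTLCache();
cache.set("user:1", { name: "Jeffrey" }, 60000);
console.log(cache.get("user:1").name); // Jeffrey
```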

&lt;h2&gt;
  
  
  Database Query Optimization
&lt;/h2&gt;

&lt;p&gt;Optimizing the queries made by web servers to a database is also a good way of improving the ability of an application to scale. There are several ways database queries can be optimized:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Reducing the number of database queries made for each request.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Data indexing: This is more suitable for web applications where read queries vastly outnumber write queries as even though database indexing improves read queries, it increases the time required for write queries to complete.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Making use of database read replicas: Here other databases are added. These databases contain snapshots of the main database and read requests are distributed among the various read replicas. The main database is usually the only database that receives write requests to maintain consistency.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Appropriate use of PATCH and PUT requests: A lot of developers believe both PATCH and PUT requests are the same. However, this is not true. While PUT requests are used to modify an entire resource, PATCH requests are used to partially modify a resource, and thus PATCH requests have smaller payloads.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Reducing API payload: Most API payloads are quite small; however, there are times when a user needs to SELECT all resources that match certain criteria, which can result in thousands or even millions of records. In such cases, it is often more performant to compress or otherwise reduce the size of the payload.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Employing a Microservice Architecture
&lt;/h2&gt;

&lt;p&gt;Using a microservice architecture involves breaking an application into smaller and more manageable components or services that can be handled independently. These services often focus on specific functionality e.g. in a social media platform, there can be individual services for authentication, user profiles, posting, messaging, handling user feeds, comments, etc. This ensures that even if one of these services fails or encounters a downtime, the remaining services work well. It also allows for deploying individual services in a way that does not affect other services, thereby improving overall scalability. It is important to note that this method also increases the complexities involved with managing the application as a whole.&lt;/p&gt;

&lt;h2&gt;
  
  
  Asynchronous Programming
&lt;/h2&gt;

&lt;p&gt;Nodejs uses an event-driven, non-blocking I/O model, which makes it well-suited for building scalable applications. Techniques such as callbacks, Promises, and async/await allow developers to write asynchronous, non-blocking code in Nodejs. &lt;/p&gt;
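&lt;p&gt;As a small illustration of the non-blocking style (the timer stands in for a database or network call):&lt;/p&gt;

```javascript
// Simulate an I/O operation that resolves after a short delay
const fetchUser = (id) =>
  new Promise((resolve) =>
    setTimeout(() => resolve({ id, name: "Jeffrey" }), 10)
  );

async function main() {
  // await suspends this function, but the event loop stays free
  // to handle other requests while the "I/O" is in flight
  const user = await fetchUser(1);
  console.log(user.name); // Jeffrey
}

main();
```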

&lt;h2&gt;
  
  
  Stateless Authentication
&lt;/h2&gt;

&lt;p&gt;Stateless authentication also known as token-based authentication is an authentication method where the server does not store details of a user's session. The server instead generates a unique token for a client. The client subsequently uses that token to authenticate itself and the server validates the token upon each request.&lt;/p&gt;

&lt;p&gt;This process reduces the workload on a server as the server does not need to store the session data which can lead to increased memory and processing overhead.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;There are various methods of scaling an application, and no single method is better than the others; a mix of several is often required to scale an application properly. It is the job of the developers to determine the best method or methods and implement them appropriately.&lt;/p&gt;

</description>
      <category>node</category>
      <category>systemdesign</category>
      <category>backend</category>
    </item>
    <item>
      <title>Designing a Realtime Tracking Application for Arewaport</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Sat, 29 Jul 2023 10:42:24 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/designing-a-realtime-tracking-application-for-arewaport-6cn</link>
      <guid>https://forem.com/alahirajeffrey/designing-a-realtime-tracking-application-for-arewaport-6cn</guid>
      <description>&lt;h2&gt;
  
  
  Problem Statement
&lt;/h2&gt;

&lt;p&gt;Arewaport is a hypothetical transportation company with its headquarters in northern Nigeria. Customers have been complaining about drivers taking detours during interstate trips. The management at Arewaport would like you to design and build an application that would enable them to keep track of the location of their vehicles and drivers during trips. The system should also allow users to book and make online payments for their journeys.&lt;/p&gt;

&lt;h2&gt;
  
  
  Goals and Objectives
&lt;/h2&gt;

&lt;p&gt;The goal of this post is to lay out my thought process as well as how I would go about designing and building the project. &lt;/p&gt;

&lt;h2&gt;
  
  
  Functional Requirement
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Management should be able to track the location of riders.&lt;/li&gt;
&lt;li&gt;Users should be able to book journeys. &lt;/li&gt;
&lt;li&gt;Users should be able to make payments online.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Non-Functional Requirements
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The system should be scalable and efficient.&lt;/li&gt;
&lt;li&gt;The system should have high availability.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Choice of Frameworks, Tools and Technologies
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Backend framework: &lt;code&gt;Nestjs&lt;/code&gt;: due to its first-class TypeScript support and the fact that I enjoy working with it.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Mobile framework: &lt;code&gt;react-native&lt;/code&gt;: I already know a little React, so using React Native seems like a natural progression.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Database: &lt;code&gt;Postgres&lt;/code&gt;: Postgres because it is the default database I use for projects and because it is highly extensible. Since the data is not highly relational, a NoSQL database such as &lt;code&gt;Mongodb&lt;/code&gt; would also work fine.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Data Model Design
&lt;/h2&gt;

&lt;p&gt;The following tables would be used to reflect our requirements:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Driver: this would contain the information of the driver and would include fields such as &lt;code&gt;id&lt;/code&gt;, &lt;code&gt;name&lt;/code&gt;, &lt;code&gt;mobile_number&lt;/code&gt;, &lt;code&gt;password&lt;/code&gt;, &lt;code&gt;email&lt;/code&gt;,&lt;code&gt;date_of_birth&lt;/code&gt;, &lt;code&gt;created_at&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Vehicle: this would hold information about the vehicle being used by a driver during a journey and would include fields such as &lt;code&gt;id&lt;/code&gt;, &lt;code&gt;registration_number&lt;/code&gt;, &lt;code&gt;vehicle_model&lt;/code&gt;, &lt;code&gt;date_bought&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;User: this would hold user related information and would include fields such as &lt;code&gt;id&lt;/code&gt;, &lt;code&gt;name&lt;/code&gt;, &lt;code&gt;mobile_number&lt;/code&gt;, &lt;code&gt;email&lt;/code&gt;, &lt;code&gt;password&lt;/code&gt;, &lt;code&gt;created_at&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Journey: this would hold information about each journey made by Arewaport drivers. It would also hold the current location of the driver/vehicle and would store information such as &lt;code&gt;id&lt;/code&gt;, &lt;code&gt;vehicle_id&lt;/code&gt;, &lt;code&gt;driver_id&lt;/code&gt;, &lt;code&gt;status&lt;/code&gt;, &lt;code&gt;start_point&lt;/code&gt;, &lt;code&gt;destination&lt;/code&gt;, &lt;code&gt;time_started&lt;/code&gt;, &lt;code&gt;time_completed&lt;/code&gt;, &lt;code&gt;passengers&lt;/code&gt;, &lt;code&gt;current_location&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Payment: this table would hold payment related information such as &lt;code&gt;id&lt;/code&gt;, &lt;code&gt;journey_id&lt;/code&gt;, &lt;code&gt;amount&lt;/code&gt;, &lt;code&gt;payer_id&lt;/code&gt;, &lt;code&gt;payment_method&lt;/code&gt;, &lt;code&gt;payer_name&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
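&lt;p&gt;To make the model more concrete, the tables above could be mirrored as backend types. The sketch below is illustrative only; the &lt;code&gt;JourneyStatus&lt;/code&gt; values are assumptions, as the design only names a &lt;code&gt;status&lt;/code&gt; field.&lt;/p&gt;

```typescript
// Sketch of the Journey and Payment tables as TypeScript types.
// NOTE: the JourneyStatus values are assumed; the design only names a `status` field.
type JourneyStatus = "PENDING" | "IN_PROGRESS" | "COMPLETED";

interface Journey {
  id: string;
  vehicleId: string;
  driverId: string;
  status: JourneyStatus;
  startPoint: string;
  destination: string;
  timeStarted: Date;
  timeCompleted?: Date; // unset while the journey is in progress
  passengers: number;
  currentLocation: string; // e.g. a "lat,lng" pair reported by the driver's phone
}

interface Payment {
  id: string;
  journeyId: string;
  amount: number;
  payerId: string;
  paymentMethod: string;
  payerName: string;
}

// Example record (illustrative values)
const journey: Journey = {
  id: "j1",
  vehicleId: "v1",
  driverId: "d1",
  status: "IN_PROGRESS",
  startPoint: "Kaduna",
  destination: "Abuja",
  timeStarted: new Date(),
  passengers: 12,
  currentLocation: "9.05,7.49",
};
```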

&lt;h2&gt;
  
  
  API Design
&lt;/h2&gt;

&lt;p&gt;We would focus mainly on the most important endpoints.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Register user: This endpoint would handle user registration.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;registerUser(name: string, mobileNumber: string, email: string, password: string)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Register driver: This endpoint would handle driver registration. Only admins would be allowed to register a driver. A Nestjs guard would be used to ensure only admins can access this endpoint.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;registerDriver(name: string, mobileNumber: string, dateOfBirth: Date)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Login drivers: this endpoint would be used to log drivers in.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;loginDriver(email: string, password: string)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Login users: this endpoint would be used to log users in.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;loginUser(email: string, password: string)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Start a journey: this endpoint would be used by drivers to signify that they have started a journey.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;startJourney(driverId: string, startPoint: string, destination: string, vehicleId: string, numberOfPassengers: number)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;End a journey: this endpoint would be used by drivers to signify the end of a journey
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;endJourney(driverId: string, journeyId: string)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Book a journey: this endpoint would be used by users to make online bookings.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bookAJourney(userId: string, journeyId: string)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Make a payment: this endpoint would provide the functionality to allow users to make payments online.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;makePayment(userId: string, amount: decimal, journeyId: string)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Driver Tracking
&lt;/h2&gt;

&lt;p&gt;Driver tracking would be implemented with the aid of a websocket connection between the client (the driver's mobile phone) and the backend server. The client would be responsible for emitting events with a payload containing its location and a timestamp. The frequency of the emitted events would be configurable, such that longer trips would have longer intervals between events while shorter trips would have shorter ones.&lt;/p&gt;

&lt;p&gt;Because cell reception can be poor during transit, the payloads to be sent to the server can be stored in a FIFO (First In, First Out) queue, which would be flushed to the server once there is cell reception again.&lt;/p&gt;

&lt;p&gt;The server would actively listen for these events and, once an event is received, save the details of its accompanying payload to the database.&lt;/p&gt;
&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;I intend to use a monolithic architecture as opposed to microservice architecture. The major reason for this is that at this stage, I do not see the need for the added complexities that using a microservice architecture brings. I also believe that for now, the major scaling needs for the project would be vertical as opposed to horizontal. &lt;/p&gt;

&lt;p&gt;If for whatever reason there were a need to change the architecture in the future, I would divide the system into the following core services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;user service: this would handle operations regarding user authentication, authorization, and information.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;core service: this service would handle the core operations of the system and would be responsible for booking journeys and keeping track of drivers' locations.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;notification service: this service would be used to send push notifications as well as email notifications to users and drivers.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;payment service: this service would handle payment-related activities.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Choice of Messaging System
&lt;/h2&gt;

&lt;p&gt;If I were to shift to a microservice architecture sometime in the future, RabbitMQ would be my messaging system of choice as opposed to Kafka. This is mainly because I already have experience using RabbitMQ. The second reason is that RabbitMQ is easier to set up and get working, unlike Kafka, which is a little more complicated. The only reason I would pick Kafka over RabbitMQ is if I needed to retain, reread, or process and analyze the messages sent to the different queues by the services.&lt;/p&gt;

&lt;p&gt;This post was inspired by Karan Pratap Singh's system design course. If you want to learn more about system design, check it out below:&lt;/p&gt;
&lt;div class="ltag__link"&gt;
  &lt;a href="/karanpratapsingh" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F350819%2F371eca60-9d4d-4caa-9d8d-1c5e85dfcb92.jpg" alt="karanpratapsingh"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="/karanpratapsingh/system-design-the-complete-course-10fo" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;System Design: The complete course&lt;/h2&gt;
      &lt;h3&gt;Karan Pratap Singh ・ Aug 16 '22&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#distributedsystems&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#architecture&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#tutorial&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;



</description>
      <category>systemdesign</category>
      <category>backenddevelopment</category>
    </item>
    <item>
      <title>Understanding the Fundamentals of Postgres Architecture</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Tue, 18 Jul 2023 23:13:06 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/understanding-the-fundamentals-of-postgresql-architecture-4kjk</link>
      <guid>https://forem.com/alahirajeffrey/understanding-the-fundamentals-of-postgresql-architecture-4kjk</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;PostgreSQL, often referred to as Postgres, is a powerful and widely-used open-source relational database system with over 35 years of active development.&lt;/p&gt;

&lt;p&gt;In this post, we will discuss the fundamentals of the architecture of &lt;a href="https://www.postgresql.org/" rel="noopener noreferrer"&gt;postgres&lt;/a&gt;, exploring its key components and their interactions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;PostgreSQL is a client/server relational database management system with a multi-process architecture that runs on a single host.&lt;/p&gt;

&lt;p&gt;Don't worry if you do not understand what the statement above means. We will take it in bits and discuss each part individually.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Client-Server Model&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Postgres uses a client/server model, just as in modern web development. The clients connect to the server, which is responsible for running the database. The client is simply the application that provides a query for the database server to run, and as such it could be anything from a web server to a desktop application or even another instance of a Postgres database. The client and server communicate via a TCP/IP network connection. TCP/IP stands for Transmission Control Protocol/Internet Protocol and is a suite of communication protocols used by network devices to communicate and share data on the Internet.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Multi-Process Architecture&lt;/strong&gt;&lt;br&gt;
When we say Postgres uses a multi-process architecture, we simply mean that to handle multiple client connections, the database server creates (forks) a new process for each connection. That process is then responsible for serving that client only. Each process manages its own memory space and resources, ensuring isolation and security.&lt;/p&gt;

&lt;p&gt;Postgres has several process types, which include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Postmaster (Daemon) Process&lt;/li&gt;
&lt;li&gt;Backend Process&lt;/li&gt;
&lt;li&gt;Client Process&lt;/li&gt;
&lt;li&gt;Checkpointer Process&lt;/li&gt;
&lt;li&gt;Writer Processes&lt;/li&gt;
&lt;li&gt;Background Process&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Postmaster (Daemon) Process: this is the first process that starts when Postgres is started. It is the job of the postmaster process to perform recovery, initialize shared memory, manage incoming client connections, and fork new backend processes to handle those connections.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Backend Process: this process is responsible for handling user queries and transmitting the result to the user.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Client Process: this is the process that is responsible for handling user connections. It is often forked by the postmaster process and dedicated to a particular client whenever a connection is made.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Writer Process: this process is responsible for asynchronously flushing modified data pages from memory to disk which improves performance.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Checkpointer Process: this process is responsible for creating checkpoints, after which dirty buffers are flushed to disk.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Background Process: these are a collection of processes responsible for running tasks such as automatic vacuuming to reclaim disk space, analyzing statistics for query optimization, and managing replication.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Memory Architecture
&lt;/h2&gt;

&lt;p&gt;Postgres memory can be classified into two broad categories:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Local Memory&lt;/strong&gt;: this is memory allocated by each backend process for its own use. It is allocated by the backend process to enable it to process queries and can only be accessed by the backend process that owns it. Local memory can be divided into the following sub-areas:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Work Mem: this is used for sorting tuples by ORDER BY and DISTINCT operations, and for joining tables.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Maintenance work mem: this is used for some kinds of maintenance operations in Postgres.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Temp buffers: this is used for storing temporary files and tables.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Shared Memory&lt;/strong&gt;: this category of memory is allocated by the Postgres server when it starts up and is used by all the running Postgres processes. This allows processes to communicate with each other efficiently and share data without having to constantly read and write to disk. This area is further divided into several fixed-sized sub-areas.&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Shared Buffers: this area acts as a cache for frequently accessed data pages. It holds copies of data pages from disk in memory, reducing disk I/O and improving query performance. Pages that have been modified in memory but not yet flushed to disk are called dirty pages.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;WAL Buffer: WAL which stands for Write-Ahead Logging is a technique used by Postgres to guarantee the durability of database modifications. It ensures that changes are first written to a transaction log, known as the WAL, before being applied to the actual data files on disk. The WAL buffer is responsible for storing these changes until they are written to disk. This mechanism enables the recovery of data in the event of a system crash or failure. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Commit Log: the commit log is a component of the transaction log where information about the state of all transactions is recorded. There are four transaction states used by Postgres &lt;code&gt;IN_PROGRESS&lt;/code&gt;, &lt;code&gt;COMMITTED&lt;/code&gt;, &lt;code&gt;ABORTED&lt;/code&gt;, and &lt;code&gt;SUB-COMMITTED&lt;/code&gt;. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
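&lt;p&gt;Both memory categories are tunable in &lt;code&gt;postgresql.conf&lt;/code&gt;. The settings below are real Postgres parameters, but the values shown are purely illustrative, not recommendations:&lt;/p&gt;

```ini
# Shared memory (allocated once at server start, used by all processes)
shared_buffers = 256MB        # cache of data pages shared by all processes
wal_buffers = 16MB            # buffer for WAL records before they reach disk

# Local memory (allocated per backend process)
work_mem = 8MB                # per-operation memory for sorts, DISTINCT, joins
maintenance_work_mem = 128MB  # VACUUM, CREATE INDEX and other maintenance tasks
temp_buffers = 16MB           # per-session buffers for temporary tables
```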

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqr2di4mk3pd9jcs2v50.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqr2di4mk3pd9jcs2v50.png" alt="Postgres memory architecture" width="584" height="490"&gt;&lt;/a&gt;&lt;br&gt;
Image source: Severalnines&lt;/p&gt;

&lt;h2&gt;
  
  
  Query Processing
&lt;/h2&gt;

&lt;p&gt;When a query is executed in Postgres, it goes through several stages to be processed and produce the desired results: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Parsing: in this step, the parser analyzes the syntax of the query and creates what is called a parse tree, an internal representation of the query built from the plain SQL text.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Query Rewriting: in this step, the parse tree is transformed and optimized by the PostgreSQL query optimizer.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Planning: here, the query planner generates an execution plan, which specifies the most effective and efficient sequence of operations needed to retrieve or modify the data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Execution: in this step, the execution engine carries out the execution plan by accessing the tables and indexes in the order specified by the plan.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
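&lt;p&gt;The toy pipeline below illustrates the shape of that parse → plan → execute flow. It is in no way how Postgres is actually implemented (there is no rewriting or cost-based optimization here); it only shows how each stage consumes the previous stage's output.&lt;/p&gt;

```typescript
// A drastically simplified parse -> plan -> execute pipeline.
// Illustrative only; Postgres's real stages are far more sophisticated.
type ParseTree = { select: string[]; from: string };
type Plan = { scan: string; project: string[] };
type Row = Record<string, unknown>;

// "Parsing": turn SQL text into an internal representation (the parse tree).
function parse(sql: string): ParseTree {
  const m = /^SELECT (.+) FROM (\w+)$/i.exec(sql.trim());
  if (!m) throw new Error("syntax error");
  return { select: m[1].split(",").map((s) => s.trim()), from: m[2] };
}

// "Planning": decide how to retrieve the data (here: always a full scan).
function plan(tree: ParseTree): Plan {
  return { scan: tree.from, project: tree.select };
}

// "Execution": walk the plan against the stored tables.
function execute(p: Plan, tables: Record<string, Row[]>): Row[] {
  return tables[p.scan].map((row) =>
    Object.fromEntries(p.project.map((col) => [col, row[col]]))
  );
}
```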

&lt;p&gt;In this post, we reviewed the fundamentals of the architecture of Postgres. While this post is not exhaustive, it provides a basis from which you can dive deeper. &lt;/p&gt;

</description>
      <category>postgres</category>
      <category>architecture</category>
      <category>database</category>
    </item>
    <item>
      <title>How to Build a GraphQL API with Express and TypeScript</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Sun, 16 Jul 2023 15:10:50 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/how-to-build-a-graphql-api-with-express-and-typescript-471f</link>
      <guid>https://forem.com/alahirajeffrey/how-to-build-a-graphql-api-with-express-and-typescript-471f</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In a previous post titled &lt;a href="https://dev.to/alahirajeffrey/graphql-an-introduction-2plm"&gt;GraphQL: An Introduction&lt;/a&gt;, we delved into what GraphQL was and touched on how the GraphQL architecture differed from the REST (Representational State Transfer) architecture, various GraphQL concepts as well as situations where it was appropriate to use GraphQL as opposed to REST. &lt;/p&gt;

&lt;p&gt;In this post, we are going to learn how to build GraphQL APIs with Express and TypeScript by building a simple barebones API that allows users to share information about their pets.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before we begin, make sure you have Node.js and npm (Node Package Manager) installed locally. Additionally, basic knowledge of JavaScript, Node.js, and TypeScript would be beneficial.&lt;/p&gt;

&lt;h2&gt;
  
  
  Initializing the Project
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Create a new directory for your project and navigate to it using your terminal.&lt;/li&gt;
&lt;li&gt;Initialize an npm project by running &lt;code&gt;npm init -y&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Install the required dependencies by running the command &lt;code&gt;npm install express express-graphql graphql graphql-tools&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Install the dev dependencies by running &lt;code&gt;npm install -D typescript ts-node @types/node @types/express @types/graphql nodemon&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Configuring Typescript
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Create a &lt;code&gt;tsconfig.json&lt;/code&gt; file in the root directory of your project and paste in the configuration below.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "compilerOptions": {
    "target": "ES2019",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "resolveJsonModule": true,
    "outDir": "./dist"
  },
  "include": ["src"],
  "exclude": ["node_modules"]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Setting up the Project
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Create a directory in the root directory of your project and name it &lt;code&gt;src&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Create five files in the &lt;code&gt;src&lt;/code&gt; directory and name them: &lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;server.ts&lt;/code&gt;: this file would hold and export the code for our Express GraphQL server.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;index.ts&lt;/code&gt;: this file would be responsible for importing and running the Express GraphQL server which would be imported from the &lt;code&gt;./server.ts&lt;/code&gt; file. &lt;/li&gt;
&lt;li&gt;
&lt;code&gt;database.ts&lt;/code&gt;: this file would hold our mock database as we would not be using a real database for the sake of simplicity.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;resolvers.ts&lt;/code&gt;: this file would hold the resolvers. Resolvers are functions that are responsible for generating responses for GraphQL queries.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;schema.ts&lt;/code&gt;: This file would hold our GraphQL schema including all the available types, mutations, and queries.&lt;/li&gt;
&lt;/ul&gt;
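&lt;p&gt;The contents of &lt;code&gt;database.ts&lt;/code&gt; are not shown in this post, but the resolvers later import a &lt;code&gt;pets&lt;/code&gt; array from it. A minimal mock consistent with that usage might look like this (the seed record is purely illustrative):&lt;/p&gt;

```typescript
// src/database.ts -- an in-memory stand-in for a real database.
// The seed record below is illustrative; any object matching the
// Pet shape (id, name, age, pictureUri, ownerName) works.
const pets = [
  {
    id: "52599910-16fa-4744-a3fa-7900a8b70185",
    name: "lubindos",
    age: 10,
    pictureUri: "pictureUri",
    ownerName: "jeffrey",
  },
];

export default pets;
```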

&lt;ol&gt;
&lt;li&gt;Open the &lt;code&gt;server.ts&lt;/code&gt; file and set up the Express server as shown below
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import express from "express";

const server = express();

export default server;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Update the &lt;code&gt;index.ts&lt;/code&gt; file with the following code
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import server from './server';

const PORT = 3000;

server.listen(PORT, () =&amp;gt; {
  console.log(`Server is running on localhost:${PORT}`);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;Open the &lt;code&gt;package.json&lt;/code&gt; file, which holds the project's metadata and is located at the root of your project directory, and update its scripts section as shown below
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"scripts": {
  "start": "node dist/index.js",
  "start:dev": "nodemon --watch src --exec ts-node src/index.ts"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Upon completion of the steps above, running the command &lt;code&gt;npm run start:dev&lt;/code&gt; will start the project using nodemon, which automatically restarts the server whenever changes are made and saved. You should see something like the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5b2eqnh8m545rq0nh1v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5b2eqnh8m545rq0nh1v.png" alt="nodemon setup" width="800" height="418"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Defining the GraphQL schema
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;code&gt;schema.ts&lt;/code&gt; file you created earlier and type the code below
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { buildSchema } from "graphql";

const schema = buildSchema(`
   type Pet {
    id: ID!
    name: String!
    age: Int!
    pictureUri: String
    ownerName: String!
  }

  type Query {
        getPets: [Pet]
        getPet(id: ID!): Pet
    }

  type Mutation {
        createPet(name: String!, age: Int!, pictureUri: String, ownerName: String!): Pet!
        updatePet(id: ID!, name: String, age: Int, pictureUri: String, ownerName: String): Pet!
        deletePet(id: ID!): ID!
    }
`);

export default schema;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's break this code down a little bit:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The schema variable in the code is responsible for holding our GraphQL schema.&lt;/li&gt;
&lt;li&gt;We created a pet schema that has 5 fields: an &lt;code&gt;id&lt;/code&gt; field of type &lt;code&gt;ID&lt;/code&gt;, a &lt;code&gt;name&lt;/code&gt; field of type &lt;code&gt;String&lt;/code&gt;, an &lt;code&gt;age&lt;/code&gt; field of type &lt;code&gt;Int&lt;/code&gt;, a &lt;code&gt;pictureUri&lt;/code&gt; field of type &lt;code&gt;String&lt;/code&gt;, and an &lt;code&gt;ownerName&lt;/code&gt; field of type &lt;code&gt;String&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Next, we created two queries: a &lt;code&gt;getPets&lt;/code&gt; query, which returns an array of &lt;code&gt;Pet&lt;/code&gt; objects, and a &lt;code&gt;getPet&lt;/code&gt; query, which takes an &lt;code&gt;id&lt;/code&gt; as an argument and returns a single &lt;code&gt;Pet&lt;/code&gt; object.&lt;/li&gt;
&lt;li&gt;We also created a &lt;code&gt;createPet&lt;/code&gt; mutation, which returns the created pet object; an &lt;code&gt;updatePet&lt;/code&gt; mutation, which returns the updated pet object; and a &lt;code&gt;deletePet&lt;/code&gt; mutation, which returns the id of the deleted pet.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It is important to note that the &lt;code&gt;!&lt;/code&gt; symbol used on several fields in the GraphQL schema marks a field as non-nullable, i.e. required.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating the resolvers
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;code&gt;resolvers.ts&lt;/code&gt; file and write the following code
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import pets from "./database";
import { randomUUID } from "crypto";

type Pet = {
  id: string;
  name: string;
  age: number;
  pictureUri: string;
  ownerName: string;
};

const getPet = (args: { id: string }): Pet | undefined =&amp;gt; {
  return pets.find((pet) =&amp;gt; pet.id === args.id);
};

const getPets = (): Pet[] =&amp;gt; {
  return pets;
};

const createPet = (args: {
  name: string;
  age: number;
  pictureUri: string;
  ownerName: string;
}): Pet =&amp;gt; {
  // generate a random uuid for the pet object
  const generatedId = randomUUID().toString();
  // create pet object and save
  const pet = { id: generatedId, ...args };
  pets.push(pet);
  return pet;
};

const updatePet = (args: {
  id: string;
  name?: string;
  age?: number;
  pictureUri?: string;
  ownerName?: string;
}): Pet =&amp;gt; {
  // loop through pets array and get object of pet
  const index = pets.findIndex((pet) =&amp;gt; pet.id === args.id);
  const pet = pets[index];

  // update each field that is passed as an argument
  if (args.age) pet.age = args.age;
  if (args.name) pet.name = args.name;
  if (args.pictureUri) pet.pictureUri = args.pictureUri;
  if (args.ownerName) pet.ownerName = args.ownerName;

  return pet;
};

const deletePet = (args: { id: string }): string =&amp;gt; {
  // find the index of the pet with the given id and remove it
  const index = pets.findIndex((pet) =&amp;gt; pet.id === args.id);
  if (index !== -1) {
    pets.splice(index, 1);
  }

  return args.id;
};

export const root = {
  getPet,
  getPets,
  createPet,
  updatePet,
  deletePet,
};

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;First and foremost, we created a Pet type to enable type checking in typescript. After that, we created five resolvers for the queries and mutations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;getPet&lt;/code&gt;: this resolver returns a response for the &lt;code&gt;getPet&lt;/code&gt; query and it takes an &lt;code&gt;id&lt;/code&gt; argument of type &lt;code&gt;string&lt;/code&gt;and returns a pet object if it exists.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;getPets&lt;/code&gt;: this resolver gets all the pet objects that are currently in the pets array. It is responsible for returning a response for the &lt;code&gt;getPets&lt;/code&gt; query.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;createPet&lt;/code&gt;: this is responsible for creating a pet and requires the &lt;code&gt;name&lt;/code&gt;, &lt;code&gt;pictureUri&lt;/code&gt;, &lt;code&gt;ownerName&lt;/code&gt; and &lt;code&gt;age&lt;/code&gt; arguments, all of which are &lt;code&gt;string&lt;/code&gt; types except for &lt;code&gt;age&lt;/code&gt;, which is a number. The &lt;code&gt;id&lt;/code&gt; is generated by the resolver itself.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;updatePet&lt;/code&gt;: this is responsible for updating a pet and takes a required &lt;code&gt;id&lt;/code&gt; argument of type &lt;code&gt;string&lt;/code&gt; as well as optional &lt;code&gt;name&lt;/code&gt;, &lt;code&gt;pictureUri&lt;/code&gt;, &lt;code&gt;ownerName&lt;/code&gt; and &lt;code&gt;age&lt;/code&gt; arguments.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;deletePet&lt;/code&gt;: this takes a required &lt;code&gt;id&lt;/code&gt; argument of type &lt;code&gt;string&lt;/code&gt;, deletes the pet object, and returns the pet id.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Update Code
&lt;/h2&gt;

&lt;p&gt;After writing the schema and resolvers, the next step is to update the server to make use of them. Ensure your &lt;code&gt;server.ts&lt;/code&gt; file looks like the one below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import express from "express";
import { graphqlHTTP } from "express-graphql";
import schema from "./schema";
import { root } from "./resolvers";

const server = express();

// setup graphql
server.use(
  "/graphql",
  graphqlHTTP({
    schema: schema,
    rootValue: root,
    graphiql: true,
  })
);

export default server;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Testing the API
&lt;/h2&gt;

&lt;p&gt;If you have reached this stage without problems, congratulations. Now all that's left is to test the API. We do this by running the server: simply type the command &lt;code&gt;npm run start:dev&lt;/code&gt; into your terminal and navigate to &lt;code&gt;http://localhost:3000/graphql&lt;/code&gt; in your browser. You should see the GraphiQL interface as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tjngdrfsaqr4h51nki0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2tjngdrfsaqr4h51nki0.png" alt="graphiql" width="800" height="449"&gt;&lt;/a&gt;  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a pet by typing the code below into the playground
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mutation{createPet(name:"lubindos",age: 10, ownerName:"jeffrey",pictureUri:"pictureUri"){
  name,
  ownerName,
  pictureUri,
  age,
  id
}}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Update a pet
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mutation{updatePet(id:"081d0b95-8421-4cb0-8089-c5b8642731ae",name:"lubindos junior",pictureUri:"pictureUri"){
  name,
  ownerName,
  pictureUri,
  age, 
  id
}}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Delete a pet
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mutation{deletePet(id:"cf31ec68-ada2-46c0-b4a2-9dedd14d2176")}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Get a pet
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  getPet(id:"52599910-16fa-4744-a3fa-7900a8b70185"){
    id,
    age,
    name,
    ownerName,
    pictureUri
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Get all pets
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  getPets{
    id,
    age,
    name,
    ownerName,
    pictureUri
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;NB&lt;/strong&gt;: Don't forget to replace the ids and other fields in the mutations with values from your own data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this tutorial, we dove a little deeper into how to create a GraphQL API with Express and TypeScript by building a simple API. In the next tutorial, we will use the NestJS framework along with a Postgres database and an ORM (Object Relational Mapper) to explore more real-world use cases.&lt;/p&gt;

&lt;p&gt;Link to code on github: &lt;a href="https://github.com/alahirajeffrey/pets-api" rel="noopener noreferrer"&gt;pets-graphql-api&lt;/a&gt;&lt;/p&gt;

</description>
      <category>graphql</category>
      <category>typescript</category>
      <category>express</category>
      <category>api</category>
    </item>
    <item>
      <title>GraphQL: An Introduction</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Fri, 14 Jul 2023 15:59:56 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/graphql-an-introduction-2plm</link>
      <guid>https://forem.com/alahirajeffrey/graphql-an-introduction-2plm</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;GraphQL stands for Graph Query Language and is defined by the official documentation as "a query language for your API, and a server-side runtime for executing queries using a type system you define for your data". The official documentation further states that "it is an API standard that provides a more efficient, powerful and flexible alternative to REST (Representational State Transfer)".&lt;/p&gt;

&lt;p&gt;In this article, we will delve into the world of GraphQL, exploring its fundamental concepts, comparing it with REST, and understanding when it's advantageous to choose GraphQL over REST. By the end, you'll have a solid understanding of GraphQL's capabilities and how it can enhance your API development experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is GraphQL?
&lt;/h2&gt;

&lt;p&gt;GraphQL is an open-source query language and runtime for APIs developed by Facebook. Unlike query languages such as SQL (Structured Query Language), which are used to query databases (such as a Postgres database) and other information systems, GraphQL allows clients to request data from a server in a structured way.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is REST?
&lt;/h2&gt;

&lt;p&gt;REST is a software architectural style for sharing data between different systems. It is often used to create REST APIs (Application Programming Interfaces) that follow the principles of REST which are a set of rules that define how resources are identified, accessed, and manipulated. These principles include Stateless Communication, Uniform Interfaces, Client-Server Separation and Cacheability.&lt;/p&gt;

&lt;h2&gt;
  
  
  GraphQL vs REST
&lt;/h2&gt;

&lt;p&gt;There are advantages and disadvantages to using both GraphQL and REST, and even though they solve similar problems, i.e. communication between systems, they go about it in different ways. The major differences lie in their architecture and in the way requests are made.  &lt;/p&gt;

&lt;p&gt;When requesting data from a REST server, the client requests the data it needs from a specific endpoint. For example, suppose you were building a blogging platform that allowed users to create profiles and write blogs. If a user needed to view his/her profile, a GET request would be made to a specific endpoint such as &lt;code&gt;localhost:5000/api/v1/profile/:profileId&lt;/code&gt;. However, if that user needed to fetch every blog he or she had ever written, a GET request would be made to another endpoint such as &lt;code&gt;localhost:5000/api/v1/blog/:profileId&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;In GraphQL, on the other hand, a client queries the server for any data it needs through a single endpoint. This allows the client to precisely specify the data it needs and receive it in a single request, thereby mitigating the issues of over-fetching and under-fetching data that often occur with REST APIs.&lt;/p&gt;
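&lt;p&gt;To make the contrast concrete, the two REST calls from the blogging example above could be collapsed into a single GraphQL query. This is only a sketch; the &lt;code&gt;profile&lt;/code&gt; field and its sub-fields are hypothetical and would depend on the schema the server actually defines.&lt;/p&gt;

```graphql
# One request fetches the profile and all of its blogs together,
# and only the fields the client asks for are returned.
query {
  profile(id: "some-profile-id") {
    name
    email
    blogs {
      id
      title
    }
  }
}
```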

&lt;h2&gt;
  
  
  Important Concepts in GraphQL
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GraphQL Schemas:&lt;/strong&gt; At the core of GraphQL lies the schema. It serves as the blueprint for communication between the client and the server, specifying the data types, the queries and mutations the client can make, and the relationships between types. GraphQL schemas provide clear documentation, which reduces errors caused by mismatched data between client and server. Below are examples of GraphQL schema types, queries, and mutations.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;type Author {
  id: ID!
  name: String!
  email: String!
  blogs: [Blog!]!
}

type Blog {
  id: ID!
  title: String!
  content: String!
  author: Author!
}


type Query {
  blogs: [Blog!]!
  blog(id: ID!): Blog
  authors: [Author!]!
  author(id: ID!): Author
}

type Mutation {
  createBlog(title: String!, content: String!, authorId: ID!): Blog!
  updateBlog(id: ID!, title: String, content: String): Blog!
  deleteBlog(id: ID!): Boolean!
  createAuthor(name: String!, email: String!): Author!
  updateAuthor(id: ID!, name: String, email: String): Author!
  deleteAuthor(id: ID!): Boolean!
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GraphQL Queries:&lt;/strong&gt; A query is a GraphQL operation that allows a client to request data from a GraphQL server. It can be thought of as a GET request. Queries are well structured in a hierarchical model that allows the client the flexibility to specify the exact data it needs thereby eliminating the need for fixed endpoints as is the case in REST APIs. Below are examples of GraphQL queries.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- Fetch all blogs with the author's name and id
query {
  blogs {
    id
    title
    content
    author {
      id
      name
    }
  }
}

# Fetch a single blog with the author's id and name by blog id
query {
  blog(id: "123") {
    id
    title
    content
    author {
      id
      name
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GraphQL Mutations:&lt;/strong&gt; While queries are used to fetch data from the server, mutations are used to create or modify data on the server. They allow the client to specify the exact data it wants to change. Mutations can be thought of as the POST, PUT, PATCH, and DELETE requests used in REST APIs. Below are examples of GraphQL mutations.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- Create blog
mutation {
  createBlog(title: "New Blog", content: "This is the content.", authorId: "456") {
    id
    title
    content
    author {
      id
      name
    }
  }
}

# Create author
mutation {
  createAuthor(name: "John Doe", email: "johndoe@example.com") {
    id
    name
    email
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  When to use GraphQL
&lt;/h2&gt;

&lt;p&gt;The choice between GraphQL and REST depends on various factors, and even though many developers still default to REST, there are times when GraphQL is the better option.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In cases where you would want the client to control the exact amount of data it needs to avoid over-fetching and under-fetching of data, GraphQL could be the better option.&lt;/li&gt;
&lt;li&gt;GraphQL can alleviate bandwidth concerns for constrained devices such as smartwatches and mobile phones, which should not have to download large amounts of data. In such cases, GraphQL's ability to let the client specify exactly the data it requires can make it the better option.&lt;/li&gt;
&lt;li&gt;GraphQL can improve the performance of an application by reducing the number of network requests needed, since all the required data can be fetched in a single request. &lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this article, we've explored the fundamentals of GraphQL and compared it with the traditional REST approach. GraphQL provides a more efficient, flexible, and client-centric way of communicating with APIs. By allowing clients to request only the necessary data, GraphQL mitigates issues like over-fetching and under-fetching, leading to optimized network usage and improved performance. &lt;/p&gt;

&lt;p&gt;To learn how to implement a GraphQL API using Express, click &lt;a href="https://dev.to/alahirajeffrey/how-to-build-a-graphql-api-with-express-and-typescript-471f"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>graphql</category>
      <category>api</category>
    </item>
    <item>
      <title>A NodeJS newbie's guide to understanding NPM and NPX</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Wed, 14 Jun 2023 14:52:00 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/a-nodejs-newbies-guide-to-understanding-npm-and-npx-34j6</link>
      <guid>https://forem.com/alahirajeffrey/a-nodejs-newbies-guide-to-understanding-npm-and-npx-34j6</guid>
      <description>&lt;p&gt;Understanding the difference between npm and npx and the way they are used is very important for any software engineer working with Node.js. This is because these tools play crucial roles in the development workflow of anyone working with Node.js, allowing developers to manage dependencies and execute packages efficiently.&lt;/p&gt;

&lt;p&gt;In this short and concise article, we'll shed some light on the difference between npm and npx. Permit me to start with a short personal story from not too long ago. I was interviewing for a software engineering internship, and during the interview I was asked the difference between npm and npx. Even though my answer was not entirely wrong, it wasn't quite right either; the interviewer more or less guided me to an answer that could be considered only slightly correct.&lt;/p&gt;

&lt;p&gt;Immediately after the interview, I decided to do my own research, figure out the actual differences between npm and npx, and write a little about it, just in case it would be of use to someone else. Before I start, I must say that this article is in no way comprehensive, and there are plenty of articles out there that do more justice to the topic. However, I think it can serve as a gentle introduction for newbies like me. &lt;/p&gt;

&lt;p&gt;For starters, npm stands for Node Package Manager and is used to download and install Node packages locally, while npx stands for Node Package Execute and is used to execute Node packages. &lt;/p&gt;

&lt;p&gt;Npm is a powerful tool in its own right and serves as the default package manager for Node.js. I like to think that npm is to Node.js what pip is to Python. Its primary function is to help developers install, manage, and distribute JavaScript packages and libraries. With npm, a developer can easily download and incorporate external packages made by other developers into a project without having to reinvent the wheel. This saves a lot of time and effort and encourages cooperation within the developer community. By specifying dependencies in a package.json file, npm can fetch and install the required packages locally within a project.&lt;/p&gt;
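&lt;p&gt;As an illustration, a package.json might declare its dependencies like the sketch below; the project name, package names, and versions here are just examples, not part of any real project.&lt;/p&gt;

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.18.2",
    "dotenv": "^16.0.3"
  },
  "devDependencies": {
    "nodemon": "^3.0.1"
  }
}
```

Running &lt;code&gt;npm install&lt;/code&gt; in a folder containing this file fetches the listed packages into the local node_modules folder.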

&lt;p&gt;An example of using npm would be installing the create-react-app package, which is used to set up React locally, with the &lt;code&gt;npm install create-react-app&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;On the other hand, npx is a command-line tool that comes bundled with npm starting from version 5.2.0. While npm excels at package installation and management, npx shines at package execution. With npx, you can run packages without explicitly installing them first. This is particularly useful when you need to execute one-off or infrequently used packages, as it eliminates the need for a separate installation step.&lt;/p&gt;

&lt;p&gt;An example would be running &lt;code&gt;npx create-react-app&lt;/code&gt;, which first checks whether you have the package installed and executes it if so; otherwise, it downloads the package, executes it, and then removes it. &lt;/p&gt;

&lt;p&gt;In summary, the key difference between npm and npx is that npm is primarily used for package installation and management, while npx is used to execute packages, even ones that are not installed.&lt;/p&gt;

&lt;p&gt;In conclusion, understanding the differences between npm and npx is vital for software engineers working with Node.js. By grasping these differences, developers can streamline their development workflow and optimize their productivity. &lt;/p&gt;

</description>
      <category>node</category>
      <category>npm</category>
      <category>npx</category>
    </item>
    <item>
      <title>How to setup an express typescript project</title>
      <dc:creator>Alahira Jeffrey Calvin</dc:creator>
      <pubDate>Thu, 01 Jun 2023 08:24:20 +0000</pubDate>
      <link>https://forem.com/alahirajeffrey/how-to-setup-an-express-typescript-project-4a5i</link>
      <guid>https://forem.com/alahirajeffrey/how-to-setup-an-express-typescript-project-4a5i</guid>
      <description>&lt;p&gt;I have tried setting up an express typescript project on a few occasions, and on each occasion, I end up having to go through several articles to solve that problem. Setting up an express typescript project can be a bit challenging, especially if you are not familiar with either express or typescript.&lt;br&gt;
This is one of the reasons I prefer nest over express as nest comes with typescript setup and is configured automatically upon initialization.&lt;/p&gt;

&lt;p&gt;My solution to this is to create an article that I could always refer to when I am starting a new express typescript project. However, I'd be even happier if this article becomes of use to someone else who shares the same problem. &lt;/p&gt;

&lt;p&gt;Before we begin, I am going to assume you have Node.js and TypeScript installed locally. To find out, simply open up your terminal or command prompt and type the commands &lt;code&gt;tsc --version&lt;/code&gt; and &lt;code&gt;node --version&lt;/code&gt;. If you have them installed, your terminal or command prompt should return the versions of TypeScript and Node installed, just like in the picture below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7hkq4p57paj1wkzsqyso.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7hkq4p57paj1wkzsqyso.png" alt="check if node and typecript are installed"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If your terminal does not resemble the picture above, simply install Node.js and TypeScript; a quick search will show you how. To start the project, navigate to where you want the project to live and create a folder. You can then initialize a Node project by running the command &lt;code&gt;npm init -y&lt;/code&gt;. You should see an output like the one below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0w1d7qtdyumjveo4zwye.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0w1d7qtdyumjveo4zwye.png" alt="initialize node"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The package.json file is going to contain the metadata for your Node project. After that, all that is left is to initialize&lt;br&gt;
TypeScript, which can be done by running the &lt;code&gt;tsc --init&lt;/code&gt; command. It returns the output below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqpwv063a82u320ml6o6a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqpwv063a82u320ml6o6a.png" alt="initialize typescript"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The tsconfig.json file contains all the configuration settings for the TypeScript compiler. Feel free to browse through&lt;br&gt;
it and edit as you see fit. More often than not, the default configuration is more than enough. &lt;/p&gt;
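&lt;p&gt;For reference, a minimal tsconfig.json for this kind of project might look like the sketch below; the exact options are a matter of preference, and &lt;code&gt;tsc --init&lt;/code&gt; generates a more heavily commented version for you.&lt;/p&gt;

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "rootDir": "./src",
    "outDir": "./dist",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true
  }
}
```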

&lt;p&gt;Next, we need to install a few dependencies:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;express is the framework for building web applications, &lt;/li&gt;
&lt;li&gt;dotenv is used to access environment variables, &lt;/li&gt;
&lt;li&gt;nodemon automatically restarts the server when code changes are detected during development, &lt;/li&gt;
&lt;li&gt;ts-node allows us to run TypeScript code directly without compiling.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Install express and dotenv by running &lt;code&gt;npm install express dotenv&lt;/code&gt;, &lt;br&gt;
and install nodemon and ts-node with &lt;code&gt;npm install nodemon ts-node -D&lt;/code&gt;. The &lt;code&gt;-D&lt;/code&gt; flag specifies that nodemon and ts-node be installed as development dependencies, as they are only used during development.&lt;/p&gt;

&lt;p&gt;After installation is complete, open up the package.json file and add the following scripts under the scripts section&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

"start": "node dist/index.js",
"dev": "nodemon --watch src --exec ts-node src/index.ts",
"build": "tsc -p"


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You should have something like the image below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5lduftkvwkp8p90i4l7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5lduftkvwkp8p90i4l7.png" alt="npm scripts"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The "build" script tells typescript to build and compile the project into javascript, the "start" script contains &lt;br&gt;
instructions to run the compiled javascript code, and the "dev" script is used to start the server in development mode using &lt;br&gt;
nodemon and ts-node. It automatically starts the server after each save.&lt;/p&gt;

&lt;p&gt;If you have successfully done all of the above without any errors, then congratulations. You can now start building web applications with both express and typescript.&lt;/p&gt;

</description>
      <category>express</category>
      <category>typescript</category>
      <category>node</category>
    </item>
  </channel>
</rss>
