<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Uyi Oboite</title>
    <description>The latest articles on Forem by Uyi Oboite (@praisephs).</description>
    <link>https://forem.com/praisephs</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1730233%2F45511b47-e558-4c40-b7fc-3c603da0638b.png</url>
      <title>Forem: Uyi Oboite</title>
      <link>https://forem.com/praisephs</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/praisephs"/>
    <language>en</language>
    <item>
      <title>Deploying Nginx on Azure Ubuntu VM: My Experience</title>
      <dc:creator>Uyi Oboite</dc:creator>
      <pubDate>Thu, 30 Jan 2025 16:03:53 +0000</pubDate>
      <link>https://forem.com/praisephs/deploying-nginx-on-azure-ubuntu-vm-my-experience-3102</link>
      <guid>https://forem.com/praisephs/deploying-nginx-on-azure-ubuntu-vm-my-experience-3102</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Deploying Nginx on an Azure virtual machine (VM) was a great learning experience that strengthened my understanding of cloud infrastructure, Linux server management, and networking configurations. This blog post details my approach, challenges encountered, solutions applied, and how this task aligns with my professional goals.&lt;/p&gt;

&lt;p&gt;Nginx is a web server: it serves web content, handles HTTP requests, and can act as a reverse proxy. A reverse proxy is a server that sits between clients and backend web servers, intercepting incoming requests, forwarding them to the appropriate backend server, and returning the response to the client.&lt;/p&gt;

&lt;p&gt;Also note that Nginx, like any web server, needs an operating system to run, which is why I am deploying it on an Azure Ubuntu VM.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;My Approach&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Here is a step-by-step breakdown of how I completed the task:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Creating an Azure Resource Group and Virtual Machine&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first step was to set up an Azure resource group to keep all project-related metadata organized. Then, I created an Ubuntu-based VM to host my Nginx installation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fglzu0pmtma9cwxi1e2uo.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fglzu0pmtma9cwxi1e2uo.JPG" alt="Image description" width="800" height="414"&gt;&lt;/a&gt;&lt;/p&gt;
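&lt;p&gt;For reference, the same setup can also be done from the Azure CLI. The sketch below uses placeholder names (nginx-demo-rg, nginx-vm, eastus), not the exact values used in this project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az group create --name nginx-demo-rg --location eastus
az vm create \
  --resource-group nginx-demo-rg \
  --name nginx-vm \
  --image Ubuntu2204 \
  --admin-username azureuser \
  --generate-ssh-keys
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;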

&lt;p&gt;&lt;strong&gt;2. Connecting to the Azure VM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;After deploying the VM, I used the following command to log in to my Azure account from the Ubuntu terminal: &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az login
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fplkhc3ca1my7xgpzxt4e.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fplkhc3ca1my7xgpzxt4e.JPG" alt="Image description" width="800" height="453"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once authenticated, I established an SSH connection to the VM using:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh username@&amp;lt;VM_IP_Address&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhx84oufnktramr5njo34.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhx84oufnktramr5njo34.JPG" alt="Image description" width="800" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Updating System Packages&lt;/strong&gt;&lt;br&gt;
To ensure that my system was up to date, I ran:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt update &amp;amp;&amp;amp; sudo apt upgrade -y
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The -y flag automatically answers yes to the confirmation prompts, so the upgrade runs unattended.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Installing Nginx&lt;/strong&gt;&lt;br&gt;
I installed Nginx using:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo apt install nginx -y
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;5. Starting and Enabling Nginx&lt;/strong&gt;&lt;br&gt;
Once installed, I started the Nginx service and enabled it to start on boot:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo systemctl start nginx
sudo systemctl enable nginx
sudo systemctl status nginx
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;A successful installation was confirmed when the status output displayed &lt;strong&gt;active (running)&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvof4wd4clpvvxdvphii4.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvof4wd4clpvvxdvphii4.JPG" alt="Image description" width="800" height="291"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Verifying the Nginx Installation&lt;/strong&gt;&lt;br&gt;
To verify that Nginx was running correctly, I opened my browser and navigated to:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;http://&amp;lt;VM_IP_Address&amp;gt;/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2haygows779dxh11ava8.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2haygows779dxh11ava8.JPG" alt="Image description" width="800" height="323"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I was greeted with the default Nginx welcome page, confirming that the server was working.&lt;/p&gt;
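&lt;p&gt;The same check can be done from a terminal with curl; a response header such as &lt;strong&gt;Server: nginx&lt;/strong&gt; confirms that Nginx is answering:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -I http://&amp;lt;VM_IP_Address&amp;gt;/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;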

&lt;p&gt;&lt;strong&gt;7. Customizing the Default Nginx Page&lt;/strong&gt;&lt;br&gt;
To replace the default Nginx welcome page, I edited the HTML file located at /var/www/html/index.html:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo vim /var/www/html/index.html
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I replaced the default content with:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;h1&amp;gt;Welcome to DevOps Stage 0 - Uyiosa Praise Oboite/Uyi&amp;lt;/h1&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then, I saved the changes and restarted Nginx to apply them:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo systemctl restart nginx
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;To confirm the update, I reloaded the browser at http://&amp;lt;VM_IP_Address&amp;gt;/ and saw my custom message displayed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3x7ircg6uusqdkq2s3r4.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3x7ircg6uusqdkq2s3r4.JPG" alt="Image description" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Challenges and Resolutions&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. SSH Connection Failure on Port 22&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Issue: Initially, I couldn't SSH into the VM because the connection timed out.&lt;/p&gt;

&lt;p&gt;Resolution: I updated the VM's Network Security Group (NSG) inbound rules to allow traffic on port 22.&lt;/p&gt;
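&lt;p&gt;One way to add such a rule from the Azure CLI is sketched below; the resource group and VM names are placeholders:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az vm open-port --resource-group &amp;lt;Resource_Group&amp;gt; --name &amp;lt;VM_Name&amp;gt; --port 22 --priority 100
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;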

&lt;p&gt;&lt;strong&gt;2. Permission Issues When Editing the Nginx HTML Page&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Issue: I encountered a permission error when trying to edit /var/www/html/index.html.&lt;/p&gt;

&lt;p&gt;Resolution: I used superuser privileges (sudo) before editing the file:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo vim /var/www/html/index.html
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Key Takeaways&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Completing this task reinforced my understanding of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Cloud infrastructure management using Azure.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Linux system administration (package management, service control, and file editing).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Networking configurations (SSH and NSG rules).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Web server deployment using Nginx.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This knowledge directly supports my career aspirations in DevOps and Cloud Engineering, providing foundational skills for automating infrastructure and managing cloud-hosted applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://hng.tech/hire/devops-engineers" rel="noopener noreferrer"&gt;DevOps Engineers&lt;/a&gt;&lt;br&gt;
&lt;a href="https://hng.tech/hire/cloud-engineers" rel="noopener noreferrer"&gt;Cloud Engineers&lt;/a&gt;&lt;br&gt;
&lt;a href="https://hng.tech/hire/site-reliability-engineers" rel="noopener noreferrer"&gt;Site Reliability Engineers&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Author&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Uyiosa Praise Oboite&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://www.linkedin.com/in/uyiosa-praise-oboite-141352224/" rel="noopener noreferrer"&gt;LinkedIn Profile&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Real-Time Monitoring Dashboard Project Using FastAPI and Azure</title>
      <dc:creator>Uyi Oboite</dc:creator>
      <pubDate>Thu, 19 Sep 2024 15:26:08 +0000</pubDate>
      <link>https://forem.com/praisephs/real-time-monitoring-dashboard-project-using-fastapi-and-azure-l3o</link>
      <guid>https://forem.com/praisephs/real-time-monitoring-dashboard-project-using-fastapi-and-azure-l3o</guid>
      <description>&lt;p&gt;&lt;strong&gt;Project Overview&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This application monitors system metrics (CPU, memory, and disk usage) in real-time and provides visual representations via an HTML interface. The project was developed using FastAPI for the backend and Jinja2 for dynamic HTML rendering. The application is hosted on Azure App Service, making it scalable and easily accessible from anywhere.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Technologies Used&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;FastAPI:&lt;/strong&gt; Lightweight web framework used to build APIs quickly and efficiently.&lt;br&gt;
&lt;strong&gt;Uvicorn:&lt;/strong&gt; Lightning-fast ASGI server for serving the FastAPI application.&lt;br&gt;
&lt;strong&gt;Jinja2:&lt;/strong&gt; Template engine for rendering HTML dynamically based on system metrics.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;File Structure&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. monitoring_app.py:&lt;/strong&gt; &lt;br&gt;
The main FastAPI application that handles system metrics and renders the HTML templates.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This is the main Python file where the FastAPI application is defined.&lt;/li&gt;
&lt;li&gt;It contains the endpoints for retrieving system metrics (CPU, memory, and disk usage) using the psutil library.&lt;/li&gt;
&lt;li&gt;The application uses the Jinja2 templating engine to render HTML pages dynamically, displaying the system metrics in a clean format.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkypu7g4unjjfi3smeh71.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkypu7g4unjjfi3smeh71.JPG" alt="Image description" width="800" height="566"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0a2y2l7m05z0b7t9svbf.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0a2y2l7m05z0b7t9svbf.JPG" alt="Image description" width="771" height="596"&gt;&lt;/a&gt;&lt;/p&gt;
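&lt;p&gt;As a rough sketch (not the exact code from the project; the routes follow the endpoints listed later, and the template file names come from the templates/ section), monitoring_app.py is structured along these lines:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Minimal sketch of monitoring_app.py
import psutil
from fastapi import FastAPI, Request
from fastapi.templating import Jinja2Templates

app = FastAPI()
templates = Jinja2Templates(directory="templates")

@app.get("/metrics/cpu")
def cpu_usage(request: Request):
    # psutil.cpu_percent samples CPU utilisation over the given interval
    return templates.TemplateResponse(
        "cpu_usage.html",
        {"request": request, "cpu_percent": psutil.cpu_percent(interval=1)},
    )

@app.get("/metrics/memory")
def memory_usage(request: Request):
    mem = psutil.virtual_memory()  # total, available, percent, ...
    return templates.TemplateResponse(
        "memory_usage.html", {"request": request, "memory": mem}
    )

@app.get("/metrics/disk")
def disk_usage(request: Request):
    disk = psutil.disk_usage("/")  # total, used, free, percent
    return templates.TemplateResponse(
        "disk_usage.html", {"request": request, "disk": disk}
    )
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;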

&lt;p&gt;&lt;strong&gt;2. requirements.txt:&lt;/strong&gt; &lt;br&gt;
This file lists all the dependencies needed to run the project. Key dependencies include FastAPI for building the web framework, Uvicorn for running the app, psutil for retrieving system metrics, and Jinja2 for rendering HTML templates.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa18typucjxayt93r3did.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa18typucjxayt93r3did.JPG" alt="Image description" width="506" height="296"&gt;&lt;/a&gt;&lt;/p&gt;
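&lt;p&gt;A representative requirements.txt for this stack would look something like this (unpinned here for brevity; the project may pin versions):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;fastapi
uvicorn
jinja2
psutil
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;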

&lt;p&gt;&lt;strong&gt;3. templates/:&lt;/strong&gt; &lt;br&gt;
This directory contains the HTML files used to display system metrics.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;home.html:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;This is the template displayed when a user visits the root endpoint /.&lt;/li&gt;
&lt;li&gt;It can include a welcome message and provide navigation to the 
metric pages (CPU, memory, disk).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwzis9dww51yqpbrnlelz.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwzis9dww51yqpbrnlelz.JPG" alt="Image description" width="540" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;cpu_usage.html:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Displays the current CPU usage in a styled, centered format.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm36i5rqaggr4wld871of.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm36i5rqaggr4wld871of.JPG" alt="Image description" width="581" height="680"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;memory_usage.html&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Displays the current memory usage (total, available, and percentage used)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;disk_usage.html&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Displays the disk usage (total, used, free, and percentage used).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fliq38vhqcqfcvyclrrer.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fliq38vhqcqfcvyclrrer.JPG" alt="Image description" width="672" height="678"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxut1ul1avddxfeu8vg2n.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxut1ul1avddxfeu8vg2n.JPG" alt="Image description" width="599" height="678"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How It Works&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;FastAPI for the Backend: The backend is built with FastAPI to serve system metrics through different endpoints (/metrics/cpu, /metrics/memory, /metrics/disk).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Jinja2 for Rendering: The system metrics are passed to the Jinja2 templates, which dynamically generate HTML pages. For example, the CPU usage is rendered with the current percentage centered and styled using CSS.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure Deployment: The application is deployed on Azure App Service, providing a highly available and scalable environment.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Azure Deployment Process&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The following steps outline how I deployed my FastAPI Monitoring Application to Azure App Service, making use of the Azure CLI to automate the process and configure the deployment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Install Necessary Python Packages&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before starting the deployment process, I installed the necessary dependencies, including FastAPI, Uvicorn, Jinja2, and psutil.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzximsny4k77o9dazczcl.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzximsny4k77o9dazczcl.JPG" alt="Image description" width="800" height="71"&gt;&lt;/a&gt;&lt;/p&gt;
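&lt;p&gt;A typical way to install these with pip, optionally freezing the result into requirements.txt, is:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install fastapi uvicorn jinja2 psutil
pip freeze &amp;gt; requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;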

&lt;p&gt;&lt;strong&gt;2. Create an Azure Resource Group&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;An Azure Resource Group is a container that holds related resources for an Azure solution. To create the resource group, I used the following command:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fos13lvcufazzewq9x5w5.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fos13lvcufazzewq9x5w5.JPG" alt="Image description" width="800" height="35"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;monitoring-app-rg: This is the name of the resource group.&lt;/li&gt;
&lt;li&gt;eastus: The Azure region where the resources are hosted.&lt;/li&gt;
&lt;/ul&gt;
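&lt;p&gt;Using the names above, the command takes this general form:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az group create --name monitoring-app-rg --location eastus
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;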

&lt;p&gt;&lt;strong&gt;3. Create an Azure App Service Plan&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;An App Service Plan defines the region, size, and scale of the web apps running on Azure. In this step, I created an App Service plan with the B1 pricing tier on a Linux server.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0azarqyf4qrg1cvtkrj.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0azarqyf4qrg1cvtkrj.JPG" alt="Image description" width="800" height="33"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;--is-linux: Specifies that this is a Linux-based service plan.&lt;/li&gt;
&lt;/ul&gt;
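&lt;p&gt;A sketch of the command, assuming a plan name of monitoring-app-plan (the actual name used may differ):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az appservice plan create --name monitoring-app-plan --resource-group monitoring-app-rg --sku B1 --is-linux
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;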

&lt;p&gt;&lt;strong&gt;4. Create an Azure Web App&lt;/strong&gt;&lt;br&gt;
The Web App is the actual hosting service for the application. Here, I created a Python 3.9 Web App within the resource group and plan.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08w4fuit5af0q1mlqr3z.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F08w4fuit5af0q1mlqr3z.JPG" alt="Image description" width="800" height="51"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;--runtime "Python|3.9": Specifies the Python version for the app.&lt;/li&gt;
&lt;/ul&gt;
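&lt;p&gt;A sketch of the command; the app name is a placeholder, and the plan name assumes the one created in the previous step:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az webapp create --resource-group monitoring-app-rg --plan monitoring-app-plan --name &amp;lt;App_Name&amp;gt; --runtime "Python|3.9"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;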

&lt;p&gt;&lt;strong&gt;5. Initialize Git and Commit Code&lt;/strong&gt;&lt;br&gt;
Once the Azure resources were set up, I initialized a Git repository in my local project folder, added all the files, and made an initial commit.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Configure Local Git Deployment to Azure&lt;/strong&gt;&lt;br&gt;
Azure App Service allows you to configure local Git deployment. This step sets up the connection between my local Git repository and the Azure Web App.&lt;/p&gt;
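&lt;p&gt;These two steps can be sketched as follows (the app name is a placeholder):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git init
git add .
git commit -m "Initial commit"
az webapp deployment source config-local-git --name &amp;lt;App_Name&amp;gt; --resource-group monitoring-app-rg
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;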

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwtw0fl4n9wep0figb59q.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwtw0fl4n9wep0figb59q.JPG" alt="Image description" width="800" height="34"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After running this command, Azure provided a Git remote URL that I needed to use to push my local code to the Azure App Service.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5312yuseq3r2itpjr3z.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz5312yuseq3r2itpjr3z.JPG" alt="Image description" width="800" height="113"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7. Add Azure Remote to Git&lt;/strong&gt;&lt;br&gt;
Using the Git URL provided in the previous step, I added the Azure remote to my local Git configuration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp3pj64kp32k18yc07fh2.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp3pj64kp32k18yc07fh2.JPG" alt="Image description" width="800" height="50"&gt;&lt;/a&gt;&lt;/p&gt;
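&lt;p&gt;With the URL from the previous step (shown here as a placeholder), the remote is added like so:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git remote add azure &amp;lt;Azure_Git_Remote_URL&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;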

&lt;p&gt;&lt;strong&gt;8. Get Username and Password from Azure Portal&lt;/strong&gt;&lt;br&gt;
To push the code to Azure, I needed the username and password for the Azure Git repository:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I went to the Azure Portal and navigated to my App Service (monitoring_app).&lt;/li&gt;
&lt;li&gt;From there, I selected the Deployment Center in the left-hand menu.&lt;/li&gt;
&lt;li&gt;Under the User Credentials section, I found the Git Username (typically the name of your app, e.g., @monitoring_app) and generated/reset the Git Password.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;9. Push Code to Azure Remote&lt;/strong&gt;&lt;br&gt;
With the username and password in hand, I was able to push my local code to the Azure Web App using &lt;strong&gt;git push azure&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Upon pushing, I was prompted to enter the username and password that I had retrieved from the Azure Portal. After a successful push, Azure automatically deployed the code.&lt;/p&gt;
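&lt;p&gt;The push itself takes this form (assuming the local branch is named master):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git push azure master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;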

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz4j28dgyvgk63hghuk7n.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz4j28dgyvgk63hghuk7n.JPG" alt="Image description" width="413" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;10. Verify the Deployment&lt;/strong&gt;&lt;br&gt;
After the deployment was successful, I visited my web app's URL to ensure it was running.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5x1vzxxgie0uekz28dd5.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5x1vzxxgie0uekz28dd5.JPG" alt="Image description" width="800" height="273"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1v9lr1n4lhgoo6mwl9cs.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1v9lr1n4lhgoo6mwl9cs.JPG" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frrijonwe63os3wsi7b6t.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frrijonwe63os3wsi7b6t.JPG" alt="Image description" width="800" height="524"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9yz0cwrblusbw8s53v22.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9yz0cwrblusbw8s53v22.JPG" alt="Image description" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The application was live, and the CPU, memory, and disk usage metrics were accessible via the respective endpoints:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;/metrics/cpu&lt;/li&gt;
&lt;li&gt;/metrics/memory&lt;/li&gt;
&lt;li&gt;/metrics/disk&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Author&lt;/strong&gt;&lt;br&gt;
This project was developed by &lt;strong&gt;Uyiosa Praise Oboite&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;GitHub Repository&lt;br&gt;
You can find the source code for this project here: &lt;a href="https://github.com/praisephs/azure_monitoring.git" rel="noopener noreferrer"&gt;Git Repository&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Building Devopsfetch for Server Information Retrieval and Monitoring</title>
      <dc:creator>Uyi Oboite</dc:creator>
      <pubDate>Wed, 24 Jul 2024 19:44:07 +0000</pubDate>
      <link>https://forem.com/praisephs/building-devopsfetch-for-server-information-retrieval-and-monitoring-1hcg</link>
      <guid>https://forem.com/praisephs/building-devopsfetch-for-server-information-retrieval-and-monitoring-1hcg</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Devopsfetch is a Bash script that provides detailed information about system components, including active ports, Docker containers, Nginx configurations, user logins, and system activities. The script is used directly in the terminal, offering command-line access to monitor and retrieve system information, making it an essential tool for system administrators and DevOps engineers.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Creating the Script&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Navigate to Your Project Directory:&lt;br&gt;
First, you need to navigate to the directory where you want to create your script. You can do this using the cd command in the terminal.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd6ewgaz2ab5dpy52zt0s.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd6ewgaz2ab5dpy52zt0s.JPG" alt="Image description" width="800" height="74"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create the Script File:&lt;br&gt;
Use the touch command to create an empty file. This command will simply create the file without opening it for editing&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu76mq7mg6mhnzsbgc6f0.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu76mq7mg6mhnzsbgc6f0.JPG" alt="Image description" width="800" height="141"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Edit the File:&lt;br&gt;
Open the file with a text editor of your choice, such as nano or vim. This example uses vim, as shown below&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flr8qkj652o1e23zj6zid.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flr8qkj652o1e23zj6zid.JPG" alt="Image description" width="800" height="100"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Write the Script:&lt;br&gt;
Copy and paste the script content into the editor&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkvkcdkngvsvevpdaokoj.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkvkcdkngvsvevpdaokoj.JPG" alt="Image description" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Save and Exit the Editor:&lt;br&gt;
If you’re using vim, follow the steps below:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Enter Insert Mode:&lt;br&gt;
Press i to enter insert mode, where you can start typing or paste your script content.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Paste or Write the Script:&lt;br&gt;
Paste or write your script content while in insert mode.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Save and Exit:&lt;br&gt;
After entering the script content, press Esc to exit insert mode, then type :wq and press Enter. The :wq command writes (saves) the file and then quits vim. If you only want to save the file without exiting vim, use :w instead; to quit without saving, use :q!&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
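&lt;p&gt;Putting the steps above together, a terminal session might look like this (the ~/devopsfetch project directory is an assumed example, not a path from the post):&lt;/p&gt;

```shell
# Sketch of steps 1-5 above; the ~/devopsfetch project directory is an assumed example.
mkdir -p "$HOME/devopsfetch"      # create the project directory if it does not exist
cd "$HOME/devopsfetch"            # step 1: navigate to the project directory
touch devopsfetch.sh              # step 2: create the empty script file
# steps 3-5: open the file, paste the script content, then save and exit, e.g.
#   vim devopsfetch.sh            (press i, paste the content, then Esc and :wq)
chmod +x devopsfetch.sh           # make the script executable for later use
```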

&lt;p&gt;&lt;a href="https://github.com/praisephs/server_info_retrieval_monitoring/blob/main/devopsfetch.sh" rel="noopener noreferrer"&gt;Click this link for the content of the devopsfetch script&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sample Information Retrieval:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Display all active ports and services (-p or --port)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ex62f8j78z6a59cl6rl.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ex62f8j78z6a59cl6rl.JPG" alt="Image description" width="682" height="442"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provide detailed information about a specific port (-p)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzu87a6zfobhkxxhsvdj3.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzu87a6zfobhkxxhsvdj3.JPG" alt="Image description" width="708" height="601"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;List all Docker images and containers (-d or --docker)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flw1eypeuef9z8lhwfk21.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flw1eypeuef9z8lhwfk21.JPG" alt="Image description" width="800" height="294"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provide detailed information about a specific container &lt;br&gt;
(-d container_name)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft85wksybzh9llbyaw7v6.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft85wksybzh9llbyaw7v6.JPG" alt="Image description" width="800" height="382"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Display all Nginx domains and their ports (-n or --nginx)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F711vm5g3b0czja9az4qc.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F711vm5g3b0czja9az4qc.JPG" alt="Image description" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provide detailed configuration information for a specific domain &lt;br&gt;
(-n domain)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9cpq626ds701odomqnqb.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9cpq626ds701odomqnqb.JPG" alt="Image description" width="770" height="641"&gt;&lt;/a&gt; &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;List all users and their last login times (-u or --users)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5ksp9d62dguvx0g9t2p.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5ksp9d62dguvx0g9t2p.JPG" alt="Image description" width="672" height="340"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provide detailed information about a specific user (-u username)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbn6mschslmudaoqve4ek.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbn6mschslmudaoqve4ek.JPG" alt="Image description" width="722" height="413"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Display activities within a specified time range (-t or --time)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxrpexkyjivox2iqddziv.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxrpexkyjivox2iqddziv.JPG" alt="Image description" width="800" height="167"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
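&lt;p&gt;The exact implementation lives in the linked repository; a minimal sketch of how the option dispatch for the flags above could be structured (the handler commands are illustrative stand-ins, not the script's actual code):&lt;/p&gt;

```shell
#!/usr/bin/env bash
# Minimal sketch of devopsfetch-style option dispatch. Option names match the post;
# the handler commands are illustrative stand-ins for the real script's logic.
usage() {
  echo "Usage: devopsfetch [-p [port]] [-d [container]] [-n [domain]] [-u [user]] [-t range]"
}

devopsfetch() {
  case "$1" in
    -p|--port)
      if [ -n "$2" ]; then ss -tulnp 2>/dev/null | grep ":$2 "   # details for one port
      else ss -tuln 2>/dev/null; fi ;;                           # all active ports
    -d|--docker)
      if [ -n "$2" ]; then docker inspect "$2"                   # one container's details
      else docker ps -a && docker images; fi ;;                  # all containers and images
    -n|--nginx)
      if [ -n "$2" ]; then cat "/etc/nginx/sites-available/$2"   # one domain's config
      else grep -r server_name /etc/nginx/sites-available/ 2>/dev/null; fi ;;
    -u|--users)
      if [ -n "$2" ]; then last "$2"                             # one user's logins
      else lastlog; fi ;;                                        # all users' last logins
    -t|--time)
      journalctl --since "$2" ;;                                 # activities in a time range
    *)
      usage; return 1 ;;
  esac
}
```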

&lt;p&gt;&lt;strong&gt;Automating Dependency Installation and System Monitoring&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The install_dependencies.sh Script&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This script automates the installation and setup of essential system dependencies for a server, ensuring that Docker, Nginx, and logrotate are installed and properly configured to start on system boot&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Script Overview&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Update and Install Dependencies&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Updates the package lists for the Ubuntu system.&lt;/li&gt;
&lt;li&gt;Installs Docker, Nginx, and logrotate if they are not already installed.&lt;/li&gt;
&lt;/ul&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enable and Start Services&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enables Docker and Nginx services to start automatically on system boot.&lt;/li&gt;
&lt;li&gt;Starts Docker and Nginx services immediately after installation.&lt;/li&gt;
&lt;/ul&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Script Breakdown&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Update and Install Dependencies&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Updates the package lists for the Ubuntu system.
This command updates the local package index with the latest changes made in the repositories. It ensures that you install the latest versions of the software packages&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flcsuol4x6ennb0mhf9al.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flcsuol4x6ennb0mhf9al.JPG" alt="Image description" width="548" height="96"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Installing Docker, Nginx, and Logrotate
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2gefbgkx91e1zxzcxlsu.JPG" alt="Image description" width="630" height="53"&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;docker.io: The package for Docker, which is used to manage containers.&lt;/li&gt;
&lt;li&gt;nginx: The package for Nginx, which is a web server and reverse proxy server.&lt;/li&gt;
&lt;li&gt;logrotate: The package for managing and rotating log files.&lt;/li&gt;
&lt;li&gt;The -y flag automatically answers "yes" to any prompts during installation&lt;/li&gt;
&lt;/ul&gt;

&lt;ol start="3"&gt;
&lt;li&gt;Enabling and Starting Docker Service
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhil3mlyzzac8srck010s.JPG" alt="Image description" width="509" height="112"&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;systemctl enable docker: Configures Docker to start automatically when the system boots.&lt;/li&gt;
&lt;li&gt;systemctl start docker: Starts the Docker service immediately.&lt;/li&gt;
&lt;/ul&gt;

&lt;ol start="4"&gt;
&lt;li&gt;&lt;p&gt;Enabling and Starting Nginx Service&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fli5astzb27u6x7fmgzfl.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fli5astzb27u6x7fmgzfl.JPG" alt="Image description" width="470" height="129"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ensuring Logrotate is Installed&lt;br&gt;
Installs logrotate to manage and rotate log files, ensuring that logs do not consume excessive disk space&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F444jjgiitxjm43azkcyw.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F444jjgiitxjm43azkcyw.JPG" alt="Image description" width="449" height="88"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
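&lt;p&gt;Taken together, the snippets in the breakdown above amount to something like the following sketch (the exact install_dependencies.sh is linked below; this is a reconstruction, not the script itself):&lt;/p&gt;

```shell
#!/usr/bin/env bash
# Sketch of install_dependencies.sh reconstructed from the breakdown above.
# Wrapped in a function so nothing runs until it is explicitly called.
install_dependencies() {
  sudo apt-get update                                # refresh the local package index
  sudo apt-get install -y docker.io nginx logrotate  # -y auto-confirms the prompts

  sudo systemctl enable docker                       # start Docker automatically on boot
  sudo systemctl start docker                        # start Docker immediately

  sudo systemctl enable nginx                        # start Nginx automatically on boot
  sudo systemctl start nginx                         # start Nginx immediately
}
```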

&lt;p&gt;&lt;strong&gt;Usage&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Make the Script Executable&lt;br&gt;
Before running the script, ensure it has executable permissions&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyjcakuelsd6gzaoobkwy.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyjcakuelsd6gzaoobkwy.JPG" alt="Image description" width="800" height="108"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run the Script&lt;br&gt;
Execute the script to perform the installation and setup&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpmypw12mvs9h7t5k7dgk.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpmypw12mvs9h7t5k7dgk.JPG" alt="Image description" width="800" height="107"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://github.com/praisephs/server_info_retrieval_monitoring/blob/main/install_dependencies.sh" rel="noopener noreferrer"&gt;install_dependencies script&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The devopsfetch_monitor.sh Script&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The devopsfetch_monitor.sh script is designed to continuously monitor and log various system metrics and configurations. It collects information such as system details, CPU and memory usage, disk usage, active users, recent logins, open ports, and Nginx domain information. The collected data is then logged into a specified log file&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3nn4p666d2n21qwcqkuw.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3nn4p666d2n21qwcqkuw.JPG" alt="Image description" width="800" height="541"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setup&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Location of the Script:&lt;br&gt;
The script is located at&lt;br&gt;
/home/praisephs/server_monitoring/devopsfetch_monitor.sh&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw94pmyz01ntg5jhzjjxz.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw94pmyz01ntg5jhzjjxz.JPG" alt="Image description" width="800" height="137"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Log File Configuration:&lt;br&gt;
The log file where the monitoring data is stored is specified in the script&lt;br&gt;
LOG_FILE="/home/praisephs/server_monitoring/devopsfetch_logs/devopsfetch.log"&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbsf28elffqcg4i988h9a.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbsf28elffqcg4i988h9a.JPG" alt="Image description" width="800" height="71"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;How It Works&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Initialization:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The script starts by defining the log file location and the format for timestamps:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqj5qr9c4hf7q3h0fn6dp.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqj5qr9c4hf7q3h0fn6dp.JPG" alt="Image description" width="800" height="101"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;It logs the start of the monitoring process:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy46k1m4lj38szv56enne.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy46k1m4lj38szv56enne.JPG" alt="Image description" width="740" height="83"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Continuous Monitoring Loop:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;The script enters an infinite loop, where it performs the following actions at each iteration:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;System Information Collection: It logs a timestamp and then collects various system metrics:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;System Information: General system information is collected using the uname -a command.&lt;/li&gt;
&lt;li&gt;CPU and Memory Usage: Top processes by CPU and memory usage are listed using the top command.&lt;/li&gt;
&lt;li&gt;Disk Usage: Disk space usage for all mounted filesystems is collected using the df -h command.&lt;/li&gt;
&lt;li&gt;Memory Status: Memory usage and availability are logged using the free -h command.&lt;/li&gt;
&lt;li&gt;Active Users: Currently logged-in users are listed using the who command.&lt;/li&gt;
&lt;li&gt;Recent User Logins: Recent login attempts are logged using the last -n 5 command.&lt;/li&gt;
&lt;li&gt;Open Ports: A list of open network ports is generated using the ss -tuln command.&lt;/li&gt;
&lt;li&gt;Nginx Domain Information: The script extracts domain information from Nginx configuration files located in /etc/nginx/sites-available/.&lt;/li&gt;
&lt;/ul&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Logging the Data: All collected data is appended to the specified log file in a well-structured format&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Delay Between Iterations: The script waits for an hour before starting the next data collection cycle&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmh9wuspbsv6kmhmiju85.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmh9wuspbsv6kmhmiju85.JPG" alt="Image description" width="687" height="133"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
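&lt;p&gt;One collection cycle of the loop described above might be sketched as follows (the /tmp log path here is a placeholder for illustration; the actual script uses the LOG_FILE path shown earlier and repeats the cycle hourly):&lt;/p&gt;

```shell
#!/usr/bin/env bash
# Sketch of one devopsfetch_monitor.sh collection cycle. The real script points
# LOG_FILE at devopsfetch_logs/devopsfetch.log and loops with `sleep 3600`.
LOG_FILE="${LOG_FILE:-/tmp/devopsfetch.log}"

collect_metrics() {
  {
    echo "=== $(date '+%Y-%m-%d %H:%M:%S') ==="
    echo "--- System information ---"; uname -a
    echo "--- Top processes ---";      top -b -n 1 2>/dev/null | head -n 15
    echo "--- Disk usage ---";         df -h
    echo "--- Memory status ---";      free -h 2>/dev/null
    echo "--- Active users ---";       who
    echo "--- Recent logins ---";      last -n 5 2>/dev/null
    echo "--- Open ports ---";         ss -tuln 2>/dev/null
    echo
  } >> "$LOG_FILE"   # append the whole block to the log file
}

collect_metrics   # the full script wraps this in: while true; do collect_metrics; sleep 3600; done
```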

&lt;p&gt;&lt;a href="https://github.com/praisephs/server_info_retrieval_monitoring/blob/main/devopsfetch_monitor.sh" rel="noopener noreferrer"&gt;Link to devopsfetch_monitor.sh&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Log Rotation&lt;/strong&gt;&lt;br&gt;
To manage the size of the log file and ensure the system does not run out of disk space, a manual log rotation script manual_log_rotate.sh is used.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Location of the Log Rotation Script&lt;br&gt;
The log rotation script is located at:&lt;br&gt;
/home/praisephs/server_monitoring/manual_log_rotate.sh&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg157s58pmsnc67ydp7e7.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg157s58pmsnc67ydp7e7.JPG" alt="Image description" width="800" height="149"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;How It Works&lt;br&gt;
The log rotation script checks the size of the log file and rotates it if it exceeds a specified size&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Log File and Maximum Size Configuration&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu9h6v77y0w83cu4w79yq.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu9h6v77y0w83cu4w79yq.JPG" alt="Image description" width="714" height="153"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Log Size Check and Rotation
The script checks if the log file size exceeds MAX_SIZE. If it does, the log file is renamed with a timestamp, and a new log file is created
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdz1wqpvkk50dbw5b9bw2.JPG" alt="Image description" width="706" height="257"&gt;
&lt;/li&gt;
&lt;/ul&gt;
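&lt;p&gt;The size check and rotation described above can be sketched like this (the threshold and the /tmp log path are illustrative assumptions; the real values are configured in the linked manual_log_rotate.sh):&lt;/p&gt;

```shell
#!/usr/bin/env bash
# Sketch of manual_log_rotate.sh's rotation logic. The threshold and log path
# here are illustrative defaults, not the script's exact configuration.
LOG_FILE="${LOG_FILE:-/tmp/devopsfetch.log}"
MAX_SIZE="${MAX_SIZE:-10485760}"   # assumed threshold: 10 MB in bytes

rotate_log() {
  [ -f "$LOG_FILE" ] || return 0                       # nothing to rotate yet
  size=$(stat -c%s "$LOG_FILE" 2>/dev/null || echo 0)  # current file size in bytes
  if [ "$size" -gt "$MAX_SIZE" ]; then
    mv "$LOG_FILE" "$LOG_FILE.$(date '+%Y%m%d%H%M%S')" # archive with a timestamp
    touch "$LOG_FILE"                                  # start a fresh, empty log
  fi
}
```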

&lt;p&gt;The log rotation script should be scheduled to run periodically using a cron job to ensure the log file does not grow too large&lt;/p&gt;

&lt;p&gt;In simple terms, a cron job is a scheduled task that runs automatically at specific intervals on a Unix-based system (like Linux). You can think of it as a timer that triggers a particular command or script to execute at regular times, such as daily, weekly, or every hour&lt;/p&gt;
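&lt;p&gt;As an example, a crontab entry that runs the rotation script at the top of every hour could look like the line below; the five leading fields are minute, hour, day of month, month, and day of week:&lt;/p&gt;

```shell
# Illustrative crontab line: run manual_log_rotate.sh at minute 0 of every hour.
# Fields: minute hour day-of-month month day-of-week, then the command to run.
CRON_LINE='0 * * * * /home/praisephs/server_monitoring/manual_log_rotate.sh'
echo "$CRON_LINE"   # add a line like this via `crontab -e` to schedule the job
```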

&lt;p&gt;&lt;strong&gt;Usage&lt;/strong&gt;&lt;br&gt;
Starting the Monitoring Script&lt;br&gt;
To start the monitoring, run the script using:&lt;br&gt;
sudo /home/praisephs/server_monitoring/devopsfetch_monitor.sh&lt;/p&gt;

&lt;p&gt;NB: &lt;em&gt;Ensure that the script has execute permissions and is run with appropriate permissions to access all required system information&lt;/em&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Starting the Log Rotation Script&lt;br&gt;
The log rotation script can be run manually or scheduled to run periodically:&lt;br&gt;
sudo /home/praisephs/server_monitoring/manual_log_rotate.sh&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Log File Access&lt;br&gt;
The log file containing the monitoring data is located at:&lt;br&gt;
/home/praisephs/server_monitoring/devopsfetch_logs/devopsfetch.log&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Users can access and review this log file to analyze system metrics and activity&lt;/p&gt;

&lt;p&gt;To print the status of system monitoring, use:&lt;br&gt;
cat /home/praisephs/server_monitoring/devopsfetch_logs/devopsfetch.log&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjx2wdwrhonsz2b7tumz5.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjx2wdwrhonsz2b7tumz5.JPG" alt="Image description" width="800" height="594"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Confirmation of System Monitoring&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4p3h4jw77km959vhvdwv.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4p3h4jw77km959vhvdwv.JPG" alt="Image description" width="800" height="90"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Seeing both devopsfetch.log and devopsfetch.log.1.gz in the devopsfetch_logs directory indicates that system monitoring is working and data archiving is in place&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;devopsfetch.log: This is the current active log file where new log entries are appended&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;devopsfetch.log.1.gz: This is a compressed archive of the previous log file. The .gz extension indicates that it's been compressed using gzip&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The presence of the archived log file (devopsfetch.log.1.gz) shows that log rotation is working, and old log data is being archived properly to save space and manage log files effectively.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Managing Users in PraisePHS Microsystems Ltd with a Bash Script</title>
      <dc:creator>Uyi Oboite</dc:creator>
      <pubDate>Thu, 04 Jul 2024 15:49:45 +0000</pubDate>
      <link>https://forem.com/praisephs/managing-users-in-praisephs-microsystems-ltd-with-a-bash-script-26jo</link>
      <guid>https://forem.com/praisephs/managing-users-in-praisephs-microsystems-ltd-with-a-bash-script-26jo</guid>
      <description>&lt;p&gt;In any corporate environment, managing users efficiently and securely is paramount. PraisePHS Microsystems Ltd. employs a systematic approach to user management using a Bash script. This article will explain how the provided Bash script functions, ensuring smooth and secure user creation and management.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Overview&lt;/strong&gt;&lt;br&gt;
The script takes a file containing user and group information as input, creates users, assigns them to specified groups, sets passwords, and logs the process. The users.txt file, for example, provides the necessary user and group details in a semicolon-separated format.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bash Script Breakdown&lt;/strong&gt;&lt;br&gt;
Here's a step-by-step explanation of the script:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The script starts by defining the necessary files: the input file containing user information, the log file for logging actions, and the password file for storing generated passwords.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkvizzlcx494iaxbidx7x.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkvizzlcx494iaxbidx7x.JPG" alt="Image description" width="771" height="168"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;This function generates a random password using OpenSSL's rand function, ensuring a strong and unique password for each user.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgn4nj94eluzycwl0h4ew.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgn4nj94eluzycwl0h4ew.JPG" alt="Image description" width="593" height="109"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;This section ensures the existence and correct permissions of the log and password files. If these files or directories do not exist, the script creates them and sets appropriate permissions to ensure security&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3tf92urazug22jk0wkx.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3tf92urazug22jk0wkx.JPG" alt="Image description" width="689" height="345"&gt;&lt;/a&gt;&lt;/p&gt;
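&lt;p&gt;The password-generation step can be sketched as follows (the byte count is an assumed example; the script's actual parameters are shown in the images):&lt;/p&gt;

```shell
# Sketch of the random-password helper: `openssl rand` produces random bytes,
# base64-encoded so the result is printable. 12 bytes is an assumed length.
generate_password() {
  openssl rand -base64 12
}

password=$(generate_password)
echo "$password"
```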

&lt;ol start="4"&gt;
&lt;li&gt;The script reads each line of the input file, removes any carriage returns, trims whitespace, and skips empty lines to ensure clean data processing.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ktx2m9zf3s610o3h6s6.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ktx2m9zf3s610o3h6s6.JPG" alt="Image description" width="715" height="212"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;User Existence Check: The script checks if a user already exists. If they do, it logs this information.&lt;/li&gt;
&lt;li&gt;User Creation: If the user does not exist, the script generates a password, creates the user with a home directory, sets the password, and logs these actions.&lt;/li&gt;
&lt;li&gt;Group Management: The script adds the user to specified groups. If a group does not exist, it creates the group before adding the user.&lt;/li&gt;
&lt;li&gt;Permissions and Ownership: Finally, the script sets the permissions and ownership
for the user's home directory to ensure security.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbrzr1iye8s2la75xni4f.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbrzr1iye8s2la75xni4f.JPG" alt="Image description" width="800" height="494"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Example Input File: users.txt&lt;br&gt;
The users.txt file contains user information in the format username; group1,group2&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7mc8z5djs5yyxh2n3oe.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7mc8z5djs5yyxh2n3oe.JPG" alt="Image description" width="497" height="120"&gt;&lt;/a&gt;&lt;/p&gt;
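&lt;p&gt;The per-line processing described above can be sketched like this (the user and group names are illustrative; the real script also creates the account, sets the password, and logs each action):&lt;/p&gt;

```shell
#!/usr/bin/env bash
# Sketch of parsing one users.txt line in the `username; group1,group2` format.
# The echo stands in for the real useradd/groupadd/usermod actions, which need root.
process_line() {
  line="$(printf '%s' "$1" | tr -d '\r')"   # remove any carriage returns
  line="$(echo "$line" | xargs)"            # trim surrounding whitespace
  [ -z "$line" ] && return 0                # skip empty lines
  user="$(echo "${line%%;*}" | xargs)"      # text before the semicolon
  grps="$(echo "${line#*;}" | xargs)"       # comma-separated groups after it
  echo "would create user '$user' in groups '$grps'"
  # Real actions (root required), roughly:
  #   id "$user" >/dev/null 2>&1 || useradd -m "$user"
  #   for g in $(echo "$grps" | tr ',' ' '); do
  #     getent group "$g" >/dev/null || groupadd "$g"
  #     usermod -aG "$g" "$user"
  #   done
}

process_line "light; sudo,dev"   # → would create user 'light' in groups 'sudo,dev'
```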

&lt;p&gt;For more information about the HNG internship, visit the &lt;a href="https://hng.tech/internship" rel="noopener noreferrer"&gt;HNG Internship page&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you are looking to hire talented interns, check out the &lt;a href="https://hng.tech/hire" rel="noopener noreferrer"&gt;HNG Hire page&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloudcomputing</category>
      <category>devops</category>
      <category>bash</category>
      <category>ubuntu</category>
    </item>
  </channel>
</rss>
