<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Rishab Kumar</title>
    <description>The latest articles on Forem by Rishab Kumar (@rishabk7).</description>
    <link>https://forem.com/rishabk7</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F280274%2Feeacd0c0-e78b-4886-a4da-5df864aa78a3.jpg</url>
      <title>Forem: Rishab Kumar</title>
      <link>https://forem.com/rishabk7</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/rishabk7"/>
    <language>en</language>
    <item>
      <title>The Cloud Resume Challenge - Beginner Cloud Project</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Fri, 08 Mar 2024 02:59:10 +0000</pubDate>
      <link>https://forem.com/rishabk7/the-cloud-resume-challenge-3il7</link>
      <guid>https://forem.com/rishabk7/the-cloud-resume-challenge-3il7</guid>
      <description>&lt;h2&gt;
  
  
  A beginner cloud project - The Cloud Resume API
&lt;/h2&gt;

&lt;p&gt;Want to impress potential employers and gain hands-on cloud experience at the same time? The Cloud Resume API Challenge is the perfect way to start!  By building a cloud-based API for your resume, you'll showcase your cloud computing skills while creating something that has real-world value. And the best part – you don't need to be a cloud guru to get started.&lt;/p&gt;

&lt;h2&gt;
  
  
  Challenge Requirements:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;NoSQL Database&lt;/strong&gt;: These flexible databases, like DynamoDB (AWS), Firestore (GCP), and Cosmos DB (Azure), are a great fit for storing resume data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Serverless Functions&lt;/strong&gt;: This technology lets you run code without worrying about managing servers. It's the heart of your resume API!&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CI/CD (GitHub Actions)&lt;/strong&gt;: This tool automates the process of updating your API. Every time you change your code, GitHub Actions will repackage and redeploy it for you.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd1xg9ual3fm8pmoj805v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd1xg9ual3fm8pmoj805v.png" alt="Cloud Resume API Architecture" width="800" height="514"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step-by-Step Guide
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Pick Your Cloud Playground&lt;/strong&gt;: The Cloud Resume API Challenge supports AWS, Google Cloud Platform (GCP), or Microsoft Azure. Pick a provider that you like best!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Design Your Data&lt;/strong&gt;:  Think about what information your resume API will include.  Start with the basics like your name, contact details, experience, and skills. You can always expand later!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Store your Resume Data&lt;/strong&gt;: Use the NoSQL Database within your selected cloud provider to store your resume data in JSON.&lt;/p&gt;
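&lt;p&gt;For instance, a minimal resume document might look like the sketch below. The exact fields and values are illustrative, not part of the challenge spec; start small and grow it:&lt;/p&gt;

```json
{
  "name": "Jane Doe",
  "contact": { "email": "jane@example.com", "location": "Toronto, CA" },
  "skills": ["Python", "AWS", "Terraform"],
  "experience": [
    { "title": "Cloud Engineer", "company": "Example Corp", "years": "2022-2024" }
  ]
}
```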

&lt;p&gt;&lt;strong&gt;Time to Code&lt;/strong&gt;:  Your core component will be a serverless function. This function is responsible for fetching your resume data from the database and crafting a response for anyone who uses your API.&lt;/p&gt;
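&lt;p&gt;As a rough sketch of what that function could look like, here is an AWS Lambda-style handler in Python. The response shape follows the Lambda proxy format; the injectable &lt;code&gt;fetch&lt;/code&gt; parameter is my own addition so the handler can be tried locally without a database:&lt;/p&gt;

```python
import json

def build_response(resume):
    # Wrap a resume dict in an HTTP-style response (AWS Lambda proxy format).
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(resume),
    }

def lambda_handler(event, context, fetch=None):
    # In a real deployment, `fetch` would read the resume from DynamoDB,
    # Firestore, or Cosmos DB; it is injectable here so the handler can be
    # exercised locally without any cloud resources.
    resume = fetch() if fetch else {}
    return build_response(resume)
```

Calling `lambda_handler` with a stubbed `fetch` returns the resume as structured JSON, which is exactly what the challenge asks your deployed endpoint to do.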

&lt;p&gt;&lt;strong&gt;Deploy to Cloud&lt;/strong&gt;: Deployment is the process that makes your API accessible over the internet.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Test Drive Your API&lt;/strong&gt;: Once your API is deployed, you (or anyone else!) can send a request and see your resume displayed in a structured JSON format.&lt;/p&gt;

&lt;h2&gt;
  
  
  Tips for Beginners
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Start Simple: Your initial API doesn't have to be fancy. Get the core functions working, then add features over time.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Power of Community: Visit the &lt;a href="https://cloudresumeapi.dev"&gt;Cloud Resume API website&lt;/a&gt; to see examples from others who have taken on this challenge. You're not alone!&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Ready to build your Cloud Resume API? My YouTube video has the detailed steps of the challenge.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/iZq8aaGMpjM"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Share Your Success&lt;/strong&gt;: Did you conquer the challenge? I'd love to hear about it! Share your work and any questions in the comments below.&lt;br&gt;
Let's get started on your cloud journey!&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>aws</category>
      <category>azure</category>
      <category>gcp</category>
    </item>
    <item>
      <title>Deploying Grafana to Azure’s Web Apps for Containers</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Mon, 21 Aug 2023 15:41:17 +0000</pubDate>
      <link>https://forem.com/rishabk7/deploying-grafana-to-azures-web-apps-for-containers-119p</link>
      <guid>https://forem.com/rishabk7/deploying-grafana-to-azures-web-apps-for-containers-119p</guid>
      <description>&lt;p&gt;Hello Cloud adventurers, I have been on a learning journey this year with containerization, specifically docker. Continuing the series of blogs, I wanted to try deploying Grafana to Azure but as a docker container. I know there are different ways and different services within Azure that you can use to deploy a container, but I will be using &lt;a href="https://azure.microsoft.com/en-ca/products/app-service/web"&gt;Azure Web Apps.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Deploying Grafana to Azure's Web Apps for Containers is a straightforward process:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create a storage account&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create an Azure Files share&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Initialize an empty SQLite database&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Web Apps for Containers App Service&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Mount the Azure Files share&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Set an environment variable&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's get started.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create a Storage Account
&lt;/h2&gt;

&lt;p&gt;Begin by creating an Azure Storage account. This is where we will create the File Share later. You might ask, why do we need a File Share?&lt;/p&gt;

&lt;p&gt;If we deployed Grafana as a plain container, the SQLite database (and plugins) would be lost as soon as the container restarts. Hence, we will be using the File Share as persistent storage for our Grafana database.&lt;/p&gt;

&lt;p&gt;I have created &lt;code&gt;grafana-rg&lt;/code&gt; as the resource group where all the resources for this blog post will be deployed. For the Storage Account, make sure you use a globally unique name; I went with &lt;code&gt;Standard&lt;/code&gt; performance and &lt;code&gt;Geo-Redundant Storage (GRS)&lt;/code&gt;, and left everything else at the defaults.&lt;/p&gt;
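&lt;p&gt;If you prefer the command line, the same resources can be created with the Azure CLI (assuming you are logged in; the storage account name below is a placeholder and must be globally unique):&lt;/p&gt;

```shell
# Resource group to hold everything for this post
az group create --name grafana-rg --location eastus

# Storage account: Standard performance, geo-redundant (GRS)
az storage account create \
  --name mygrafanastorage123 \
  --resource-group grafana-rg \
  --sku Standard_GRS
```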

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lnhr1afL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692279763590/e530bd26-dc26-42c9-b98b-26dae27c92b4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lnhr1afL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692279763590/e530bd26-dc26-42c9-b98b-26dae27c92b4.png" alt="Creating a Storage Account in Azure" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Create an Azure Files Share
&lt;/h2&gt;

&lt;p&gt;Navigate to your newly created storage account and scroll down to &lt;code&gt;File Shares&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--1HcPYP13--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692281095634/14c01326-0e5e-4407-9904-2d5e7dc2633d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--1HcPYP13--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692281095634/14c01326-0e5e-4407-9904-2d5e7dc2633d.png" alt="Azure Storage Account with File Shares highlighted on the Configuration Blade." width="800" height="407"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on the + File share button.&lt;/p&gt;

&lt;p&gt;For the Name, I chose &lt;code&gt;grafana-storage&lt;/code&gt;, but you can choose whatever you like. I went with the &lt;code&gt;Transaction optimized&lt;/code&gt; tier and left everything else at the defaults.&lt;/p&gt;
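&lt;p&gt;The same file share can be created from the Azure CLI (the storage account name is a placeholder for whatever you chose earlier):&lt;/p&gt;

```shell
# Create the file share; Transaction optimized is the access tier used above
az storage share-rm create \
  --storage-account mygrafanastorage123 \
  --name grafana-storage \
  --access-tier TransactionOptimized
```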

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oGZygc6o--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692281117633/994fc10c-0f49-4295-99bc-b1b5e69f20cf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oGZygc6o--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692281117633/994fc10c-0f49-4295-99bc-b1b5e69f20cf.png" alt="Creating new File Share with Transaction optimized Tier" width="800" height="517"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Initialize an empty SQLite database
&lt;/h2&gt;

&lt;p&gt;I found out the hard way that if we let Grafana create the database, it will run into a database lock error. To overcome this, we're going to create a SQLite database manually and then, using the Azure CLI, upload the database to the file share.&lt;/p&gt;

&lt;p&gt;On a Windows machine, you can run the following commands.&lt;/p&gt;

&lt;p&gt;💡 This assumes you've already installed &lt;a href="https://chocolatey.org/install"&gt;choco&lt;/a&gt;. If you are on macOS, SQLite comes preinstalled, so you can skip the first command. You'll also need the &lt;a href="https://learn.microsoft.com/en-us/cli/azure/install-azure-cli"&gt;Azure CLI&lt;/a&gt; installed and authenticated with your Azure account.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;choco install sqlite


sqlite3 grafana.db 'PRAGMA journal_mode=wal;'

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, let's copy &lt;code&gt;grafana.db&lt;/code&gt; to our File Share using the &lt;code&gt;az storage copy&lt;/code&gt; command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;az login
az storage copy -s .\grafana.db -d https://&amp;lt;storage_account&amp;gt;.file.core.windows.net/&amp;lt;file_share&amp;gt; --subscription &amp;lt;subscription_name&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace &lt;code&gt;&amp;lt;storage_account&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;file_share&amp;gt;&lt;/code&gt;, and &lt;code&gt;&amp;lt;subscription_name&amp;gt;&lt;/code&gt; with the appropriate values: the name of your storage account, the name of your file share, and your subscription's ID or name.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m2c71O25--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282141394/deb8fe0f-0e48-4c53-a2a8-95956c8029ab.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m2c71O25--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282141394/deb8fe0f-0e48-4c53-a2a8-95956c8029ab.png" alt="AZ Copy command output, copied the grafana.db over to Azure File Share" width="800" height="167"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  App Service
&lt;/h2&gt;

&lt;p&gt;Create a new Azure web application that publishes a Docker Container and choose Linux for the OS. Choose a pre-existing Linux App Service Plan or create a new one.&lt;/p&gt;

&lt;p&gt;There is a Free F1 Plan, but I went with Basic B1.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7SPOg-wX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282217748/65f7c12d-138a-4671-8cfb-84fb68904dcc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7SPOg-wX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282217748/65f7c12d-138a-4671-8cfb-84fb68904dcc.png" alt="Creating a Web App, first settings page with Docker Container and Pricing plan as Basic B1." width="800" height="565"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the Docker options, we're going to leave the defaults for the moment; we'll update them later once we have our environment fully configured. Leave all other options at their defaults.&lt;/p&gt;

&lt;p&gt;💡 Make sure &lt;code&gt;Enable Public Access&lt;/code&gt; is checked.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pNWx1jhW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282334460/a243fa3c-bc2b-41fc-b3cb-81a1387a5bd1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pNWx1jhW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282334460/a243fa3c-bc2b-41fc-b3cb-81a1387a5bd1.png" alt="Using the default settings for Web App Container under Docker tab" width="800" height="495"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the Azure Web App has been successfully configured, you can access it at the App Service URL. You should see the default nginx welcome page.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pIlbIc_6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282346811/68700be8-aa70-4620-b04c-c6ee69a99196.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pIlbIc_6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282346811/68700be8-aa70-4620-b04c-c6ee69a99196.png" alt="Azure Web App page with Default Domain and Configuration highlighted" width="800" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we will configure the Web App container to use the Azure File Share as storage.&lt;/p&gt;

&lt;h2&gt;
  
  
  Mount the Azure Files share
&lt;/h2&gt;

&lt;p&gt;In the Azure portal, on the App Services blade, click on &lt;code&gt;Configuration&lt;/code&gt;, then &lt;code&gt;Path mappings&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m-sbnYY1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282622040/0b2e3ece-3d5c-4fe1-846a-0117f2c97339.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m-sbnYY1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282622040/0b2e3ece-3d5c-4fe1-846a-0117f2c97339.png" alt="Azure Web App Configuration window, with Path mappings highlighted" width="800" height="477"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now click on &lt;code&gt;+ New Azure Storage Mount&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CKYqB2_V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282647645/f666631a-e042-4d50-830e-806f3044be63.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CKYqB2_V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282647645/f666631a-e042-4d50-830e-806f3044be63.png" alt="Azure Web App configuration: adding New Azure Storage Mount" width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Name the storage mount whatever you'd prefer and choose the &lt;em&gt;Storage Account&lt;/em&gt; you created before, &lt;code&gt;Azure Files&lt;/code&gt; for &lt;em&gt;Storage Type&lt;/em&gt;, and the Azure File share created before for the &lt;em&gt;Storage Container&lt;/em&gt;. Finally, and most importantly, for the &lt;em&gt;Mount path&lt;/em&gt;, type in &lt;code&gt;/var/lib/grafana&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xiZqsH5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282653075/599ff125-67ee-46b9-b58a-2bc267f17028.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xiZqsH5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692282653075/599ff125-67ee-46b9-b58a-2bc267f17028.png" alt="New Azure Storage Mount for Azure Web App with all the parameters filled in" width="787" height="529"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;OK&lt;/code&gt; then &lt;code&gt;Save&lt;/code&gt; at the top.&lt;/p&gt;

&lt;p&gt;We just attached our Azure File share to our container, and it will be mounted at &lt;code&gt;/var/lib/grafana&lt;/code&gt;, which is Grafana's default path for the database and its plugins. This means that, should our container be restarted for any reason, none of our settings will be lost.&lt;/p&gt;
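&lt;p&gt;The same mount can be added with the Azure CLI; the web app name and the &lt;code&gt;STORAGE_KEY&lt;/code&gt; variable below are placeholders for your own values:&lt;/p&gt;

```shell
# Mount the Azure Files share into the container at Grafana's data path
az webapp config storage-account add \
  --resource-group grafana-rg \
  --name my-grafana-webapp \
  --custom-id grafana-mount \
  --storage-type AzureFiles \
  --account-name mygrafanastorage123 \
  --share-name grafana-storage \
  --access-key "$STORAGE_KEY" \
  --mount-path /var/lib/grafana
```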

&lt;h2&gt;
  
  
  &lt;strong&gt;Set an Environment Variable&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Now we have Grafana storing the database in our &lt;code&gt;/var/lib/grafana&lt;/code&gt; mount path. However, it will not use the &lt;code&gt;PRAGMA&lt;/code&gt; flag that we set earlier; without it, we will experience &lt;a href="https://github.com/grafana/grafana/issues/16638"&gt;database locking&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Grafana allows us to override default configuration values by leveraging &lt;strong&gt;environment variables&lt;/strong&gt;. So, we're going to use Grafana's default database connection string, but append the necessary flag.&lt;/p&gt;

&lt;p&gt;You should still be on the &lt;code&gt;Configuration&lt;/code&gt; blade of your Web App from the previous step.&lt;/p&gt;

&lt;p&gt;Now, choose &lt;code&gt;Application settings&lt;/code&gt; and click &lt;code&gt;+ New application setting&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Rb9qjWRS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692283877904/1a9cb91f-4ecf-440f-a1e4-96b1a6f97f07.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Rb9qjWRS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692283877904/1a9cb91f-4ecf-440f-a1e4-96b1a6f97f07.png" alt="Azure Web App Configuration page with Application Settings highlighted" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For the setting, enter the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Name: GF_DATABASE_URL
Value: sqlite3:///var/lib/grafana/grafana.db?cache=private&amp;amp;mode=rwc&amp;amp;_journal_mode=WAL

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Leave the &lt;code&gt;Deployment slot setting&lt;/code&gt; unchecked.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BQbpTcyb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284016752/3478b7da-09bb-46b9-994d-f3f5963a3bfd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BQbpTcyb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284016752/3478b7da-09bb-46b9-994d-f3f5963a3bfd.png" alt="Adding new application setting for Azure Web App" width="800" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure you click &lt;code&gt;Save&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hz40poZH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284045144/4c2eadd5-65e9-4a88-820e-b3d49ff70307.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hz40poZH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284045144/4c2eadd5-65e9-4a88-820e-b3d49ff70307.png" alt="Azure Web App Configuration page with Save button highlighted" width="800" height="226"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Awesome, now we have the environment configured, but we still need to update our container's image source.&lt;/p&gt;

&lt;p&gt;In the App Service blade, under &lt;code&gt;Deployment Center&lt;/code&gt;, choose &lt;code&gt;Settings&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--o4rWyNjl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284123943/44f79ab4-cc35-4be2-ae70-a76b7c0262ab.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--o4rWyNjl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284123943/44f79ab4-cc35-4be2-ae70-a76b7c0262ab.png" alt="Azure Web App Deployment Center page: changing the container settings" width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, the &lt;code&gt;Full Image Name and Tag&lt;/code&gt; will be set to &lt;code&gt;nginx&lt;/code&gt; (which was set when we created the App Service).&lt;/p&gt;

&lt;p&gt;We want to use a Grafana container image. I have my own Docker image for Grafana that I will use, but you can use the official one provided by Grafana.&lt;/p&gt;

&lt;p&gt;Curious how to create your own Docker image? &lt;a href="https://youtu.be/tvIcZZBvnOk"&gt;Watch this video on how I did it.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--J6D4Y-GG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284234451/3cdab5b0-7e3f-41a8-9ec2-cb349ac0e9c9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--J6D4Y-GG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284234451/3cdab5b0-7e3f-41a8-9ec2-cb349ac0e9c9.png" alt="Docker Hub with my own grafana-container image" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Change the registry source to &lt;code&gt;Docker Hub&lt;/code&gt;, and the image to &lt;code&gt;grafana/grafana&lt;/code&gt; if you want to use the official &lt;a href="https://hub.docker.com/r/grafana/grafana"&gt;Grafana Image&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Click &lt;code&gt;Save&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5pO1Fn7V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284241428/f76ffe28-30b8-4c3e-a10f-8d51d2a6a86d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5pO1Fn7V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284241428/f76ffe28-30b8-4c3e-a10f-8d51d2a6a86d.png" alt="Azure Web App Deployment Center Container settings filled in with appropriate values for my grafana image" width="800" height="745"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It may take a moment for Azure to pick up the changes once you submit them. You can click &lt;code&gt;Refresh&lt;/code&gt; a few times and check the logs to ensure the container was deployed correctly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---XEQWrtr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284479916/3125a882-7c4c-42ef-a885-7f09b60a5dad.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---XEQWrtr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284479916/3125a882-7c4c-42ef-a885-7f09b60a5dad.png" alt="Azure Web App portal with Refresh and Logs button highlighted" width="800" height="310"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now let's visit the Web App URL to see the changes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tt7IaGCu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284537794/f16d8122-1156-4333-8ae2-2a9af3231142.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tt7IaGCu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1692284537794/f16d8122-1156-4333-8ae2-2a9af3231142.png" alt="Grafana Login screen when accessing the Azure Web App URL" width="800" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Conclusion&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;We did it! We now have Grafana deployed to Azure Web Apps for Containers with an Azure File share as storage, making it redundant, resilient to data loss on container restarts, and able to scale to support the necessary load.&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>devops</category>
      <category>azure</category>
    </item>
    <item>
      <title>Understanding Docker - as an 11 year old</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Thu, 08 Jun 2023 13:14:45 +0000</pubDate>
      <link>https://forem.com/rishabk7/understanding-docker-as-an-11-year-old-2n5p</link>
      <guid>https://forem.com/rishabk7/understanding-docker-as-an-11-year-old-2n5p</guid>
      <description>&lt;p&gt;Hello, cloud explorers!&lt;/p&gt;

&lt;p&gt;Have you ever played with LEGO blocks? Even if you haven't, you probably know what they are: those colorful, versatile bricks that let your imagination run wild. We all love to build castles, robots, or spacecraft with them, right? Now, what if I told you there's something quite similar in the world of computers, and it's called Docker!&lt;/p&gt;

&lt;p&gt;Hold up! You might be thinking, "What does &lt;strong&gt;Docker&lt;/strong&gt; have to do with my LEGO blocks?" Well, let's embark on a journey to unravel this mystery!&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In our day-to-day life, we interact with many applications on our smartphones, tablets, or computers: games, educational apps, and a lot more. But have you ever wondered how these applications are created? They're built by software developers using different tools and languages. It's like constructing a LEGO masterpiece, but instead of physical blocks, they use blocks of code!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fsijwLSk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://images.unsplash.com/photo-1585366119957-e9730b6d0f60%3Fixlib%3Drb-4.0.3%26ixid%3DM3wxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8fA%253D%253D%26auto%3Dformat%26fit%3Dcrop%26w%3D2371%26q%3D80" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fsijwLSk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://images.unsplash.com/photo-1585366119957-e9730b6d0f60%3Fixlib%3Drb-4.0.3%26ixid%3DM3wxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8fA%253D%253D%26auto%3Dformat%26fit%3Dcrop%26w%3D2371%26q%3D80" alt="white and black lego toy by danielkcheung on Unsplash" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, there's a challenge. Just like your LEGO creation might need a specific type of brick, these applications also need certain elements to work properly. When these applications are moved from one place to another (like from a developer's computer to a server in the cloud), they might not work correctly because some essential elements are missing or incompatible. It's like trying to fit your LEGO masterpiece into different rooms with different settings: some rooms might not have enough space or the right table for your masterpiece.&lt;/p&gt;

&lt;p&gt;This is where Docker, the hero of our story, comes into play!&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding Docker
&lt;/h2&gt;

&lt;p&gt;Docker is like a magical toolbox that provides a perfect environment (a room) for your application (LEGO masterpiece) to run. This toolbox is portable and can be carried anywhere. It ensures that no matter where you take it, the application will run just as intended, just like your LEGO masterpiece remains intact inside the box.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Iw194GXZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1685990306532/f4b3635e-3ea4-49df-894c-5f13a36896fa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Iw194GXZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1685990306532/f4b3635e-3ea4-49df-894c-5f13a36896fa.png" alt="Docker Diagram" width="800" height="896"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This magical toolbox is what we call a &lt;strong&gt;Docker Container&lt;/strong&gt;. In technical terms, a Docker Container is a lightweight, standalone package that includes everything an application needs to run: code, runtime, libraries, and system tools. No matter where you move this container, the application will always run without any issues!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HS12Dbvi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1685990813473/2ceed61f-e375-47e7-a892-f0ffef7de9a9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HS12Dbvi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1685990813473/2ceed61f-e375-47e7-a892-f0ffef7de9a9.png" alt="Docker Container with Docker File and Image" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, Docker also has something called &lt;strong&gt;Docker Images&lt;/strong&gt;. They're like the instruction manual that comes with your LEGO sets, telling you what pieces you need and how to put them together. In the same way, Docker Images provide the blueprint for creating Docker Containers.&lt;/p&gt;
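&lt;p&gt;To make the "instruction manual" idea concrete, here is a tiny, hypothetical Dockerfile for a small Python app (the file name &lt;code&gt;app.py&lt;/code&gt; is made up for the example):&lt;/p&gt;

```dockerfile
# Start from a base image: the "baseplate" our LEGO build sits on
FROM python:3.12-slim

# Copy our application code into the image
COPY app.py /app/app.py

# Tell Docker what to run when the container starts
CMD ["python", "/app/app.py"]
```

Building this file with `docker build` produces a Docker Image, and running that image gives you a Docker Container.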

&lt;h2&gt;
  
  
  Wrapping Up Our Adventure
&lt;/h2&gt;

&lt;p&gt;So there it is, folks! Docker, in simple terms, is a fantastic tool that helps create and deliver software applications conveniently and reliably, just like transporting a LEGO masterpiece safely in a magic toolbox.&lt;/p&gt;

&lt;p&gt;Although Docker sounds a bit technical, just remember that even the most complex things can be understood when related to something fun and familiar. Today, it's LEGO blocks and Docker, tomorrow it could be something else!&lt;/p&gt;

&lt;p&gt;I am Rishab; feel free to reach out to me &lt;a href="https://twitter.com/rishabk7"&gt;@rishabk7&lt;/a&gt; on Twitter or &lt;a href="https://instagram.com/rishabincloud"&gt;@rishabincloud&lt;/a&gt; on IG. Also, I wrote this &lt;a href="https://blog.rishabkumar.com/docker-cheat-sheet"&gt;cheat sheet for Docker&lt;/a&gt; that you might find helpful.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>devops</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Ultimate Docker Cheat Sheet</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Tue, 30 May 2023 14:45:23 +0000</pubDate>
      <link>https://forem.com/rishabk7/ultimate-docker-cheat-sheet-3j8l</link>
      <guid>https://forem.com/rishabk7/ultimate-docker-cheat-sheet-3j8l</guid>
      <description>&lt;p&gt;Last year, I started learning more about containerization, which meant gaining some skills with Docker, an &lt;a href="https://github.com/docker/docker"&gt;open-source project&lt;/a&gt; for automating the deployment of applications as portable, self-sufficient containers.&lt;br&gt;&lt;br&gt;
If you use Docker, you are well aware of how effective it can be in streamlining and improving development procedures. However, the numerous commands and Dockerfile instructions can sometimes feel overwhelming, especially if you're new to the platform. That's why I've put together this Docker cheat sheet to help you keep track of the most common commands.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Docker Command Line Interface (CLI) Commands&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  General Commands
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker version&lt;/code&gt;: Need to check which Docker version you're running? This command will provide all the information.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker info&lt;/code&gt;: If you're looking for system-wide information related to Docker, this command is your go-to.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker help &amp;lt;command&amp;gt;&lt;/code&gt;: Are you uncertain about a specific command? Add the command after &lt;code&gt;docker help&lt;/code&gt; to get detailed information.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Image Commands
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker images&lt;/code&gt;: This command will provide a list of all the images present on your system.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker pull &amp;lt;image&amp;gt;&lt;/code&gt;: This command allows you to pull an image from a registry.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker rmi &amp;lt;image&amp;gt;&lt;/code&gt;: Use this command to remove one or more images from your system.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Container Commands
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker ps&lt;/code&gt;: List all running containers with this command.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker ps -a&lt;/code&gt;: List all containers, including stopped ones.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker run &amp;lt;image&amp;gt;&lt;/code&gt;: Use this command to create and start a new container from an image, pulling the image first if it is not available locally.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker stop &amp;lt;container&amp;gt;&lt;/code&gt;: Stop a running container.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker rm &amp;lt;container&amp;gt;&lt;/code&gt;: To remove one or more containers from your system.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Dockerfile Commands
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker build -t &amp;lt;tag&amp;gt; .&lt;/code&gt;: This command lets you build an image from a Dockerfile in the current directory and tag it with the given name.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker tag &amp;lt;image&amp;gt; &amp;lt;tag&amp;gt;&lt;/code&gt;: You can create an additional tag (a local or registry name) that refers to an existing image with this command.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Docker Compose Commands
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker-compose up&lt;/code&gt;: This command builds, (re)creates, starts, and attaches to containers for a service.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker-compose down&lt;/code&gt;: If you want to stop and remove containers, networks, images, and volumes, use this command.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;docker-compose build&lt;/code&gt;: This command is used to build or rebuild services.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
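&lt;p&gt;These commands operate on a &lt;code&gt;docker-compose.yml&lt;/code&gt; file in the current directory. A minimal example (the service name and ports are placeholders) might look like this:&lt;/p&gt;

```yaml
# docker-compose.yml: one service built from the local Dockerfile
services:
  web:
    build: .            # build the image from the Dockerfile in this directory
    ports:
      - "8080:80"       # map host port 8080 to container port 80
```

&lt;p&gt;With this file in place, &lt;code&gt;docker-compose up&lt;/code&gt; builds the image if needed and starts the &lt;code&gt;web&lt;/code&gt; container, and &lt;code&gt;docker-compose down&lt;/code&gt; stops and removes it.&lt;/p&gt;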

&lt;h2&gt;
  
  
  Dockerfile Instructions
&lt;/h2&gt;

&lt;p&gt;Dockerfile instructions are used to assemble a Docker image. Here are some of the essentials:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;FROM&lt;/code&gt;: This sets the base image for subsequent instructions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;RUN&lt;/code&gt;: This allows you to execute commands in a new layer on top of the current image and commit the results.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;CMD&lt;/code&gt;: This specifies the command to run when a container is launched.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;EXPOSE&lt;/code&gt;: You can inform Docker that the container listens on the specified network ports at runtime with this instruction.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;ENV&lt;/code&gt;: Set environment variables using this instruction.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;ADD/COPY&lt;/code&gt;: These instructions let you copy new files, directories, or remote file URLs and add them to the image filesystem.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;ENTRYPOINT&lt;/code&gt;: Configure a container that will run as an executable with this instruction.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;VOLUME&lt;/code&gt;: This creates a mount point and marks it as holding externally mounted volumes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;USER&lt;/code&gt;: This sets the user name or UID used when running the image and for any following RUN, CMD, and ENTRYPOINT instructions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;WORKDIR&lt;/code&gt;: This sets the working directory for any following RUN, CMD, ENTRYPOINT, COPY, and ADD instructions.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
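&lt;p&gt;Putting several of these instructions together, a typical Dockerfile might look like the sketch below (the file names and port are assumptions for a hypothetical Python web app):&lt;/p&gt;

```dockerfile
# Base image for all subsequent instructions
FROM python:3.12-slim

# Environment variable baked into the image
ENV APP_ENV=production

# Working directory for the instructions that follow
WORKDIR /app

# Install dependencies in their own layer so they are cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application code
COPY . .

# Document the port the application listens on
EXPOSE 8000

# Default command when a container starts
CMD ["python", "app.py"]
```

&lt;p&gt;Build it with &lt;code&gt;docker build -t my-app .&lt;/code&gt; and start a container with &lt;code&gt;docker run -p 8000:8000 my-app&lt;/code&gt;.&lt;/p&gt;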

&lt;p&gt;Lastly, it is always good practice to clean up and remove unused Docker resources. Docker provides a clean-up command for this: &lt;code&gt;docker system prune&lt;/code&gt;. However, use this command with caution, as it removes all stopped containers, unused networks, dangling images, and build cache.&lt;br&gt;&lt;br&gt;
For more in-depth information about Docker CLI commands and Dockerfile instructions, refer to the &lt;a href="https://docs.docker.com/"&gt;&lt;strong&gt;official Docker documentation&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This cheat sheet should serve as a handy reference guide whether you're a Docker newbie or a seasoned professional. I also made a PDF version which you can &lt;a href="https://www.buymeacoffee.com/rishabincloud/e/139731"&gt;&lt;strong&gt;download here.&lt;/strong&gt;&lt;/a&gt;&lt;br&gt;&lt;br&gt;
Feel free to reach out to me if you have any questions. I am &lt;a href="https://twitter.com/rishabk7"&gt;@rishabk7&lt;/a&gt; on Twitter, and you can also find me on &lt;a href="https://linkedin.com/in/rishabkumar7"&gt;LinkedIn.&lt;/a&gt;&lt;/p&gt;

</description>
      <category>docker</category>
      <category>devops</category>
      <category>container</category>
    </item>
    <item>
      <title>How I passed the AWS DevOps Engineer Professional Exam</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Mon, 15 May 2023 19:14:30 +0000</pubDate>
      <link>https://forem.com/aws-builders/how-i-passed-the-aws-devops-engineer-professional-exam-21ic</link>
      <guid>https://forem.com/aws-builders/how-i-passed-the-aws-devops-engineer-professional-exam-21ic</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In this blog post, I'll share my experience taking the AWS DevOps Pro Exam, how I prepared for it (or, honestly, didn't), and some recommendations for resources to help you pass the exam.&lt;/p&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;p&gt;Like the &lt;a href="https://blog.rishabkumar.com/how-i-passed-azure-az-400-devops-engineer-exam" rel="noopener noreferrer"&gt;Azure DevOps Expert Exam (AZ-400)&lt;/a&gt;, I didn't have any specific preparation for the AWS DevOps PRO Exam either. I had a coupon expiring and the exam version was retiring, so I decided to wing it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://twitter.com/rishabk7/status/1633101885820268544" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fquv1volp7ifyfzqb2zko.png" alt="Tweet About winging the AWS DevOps Engineer Pro Exam" width="800" height="669"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That being said, I do have one year of DevOps engineering experience using TeamCity, CloudFormation, Terraform, and Azure DevOps, which helped me understand DevOps principles and where AWS tools fit in the process. However, I wasn't familiar with AWS-specific tooling, such as CodeCommit, CodePipeline, and CodeBuild.&lt;/p&gt;

&lt;h2&gt;
  
  
  My Exam Experience
&lt;/h2&gt;

&lt;p&gt;I took the exam in person, via Pearson VUE, at a nearby college rather than taking it remotely from home. The exam took almost two and a half hours of the three hours allotted, and I found it challenging to focus after the 30-35th question due to a lack of breakfast and a bad headache.&lt;/p&gt;

&lt;p&gt;The questions were lengthier compared to the Azure DevOps Expert Exam (AZ-400), requiring more reading and remembering the context of the questions. Nevertheless, I passed the exam with a score of 756.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp73vjw6qes0flw1llivd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp73vjw6qes0flw1llivd.png" alt="AWS DevOps Engineer Professional Exam Report" width="800" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Recommendations for Exam Preparation
&lt;/h2&gt;

&lt;p&gt;I suggest familiarizing yourself with all the DevOps tools offered by AWS, such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;CodeCommit, CodeBuild, CodeDeploy, CodePipeline&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CloudFormation&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Elastic Beanstalk&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;SSM and OpsWorks&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CloudTrail, CloudWatch Logs, monitoring, AWS Config, and Amazon Inspector, which also came up in several questions&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Understanding the concepts of &lt;strong&gt;fault tolerance&lt;/strong&gt;, &lt;strong&gt;disaster recovery&lt;/strong&gt;, and &lt;strong&gt;high availability&lt;/strong&gt; is essential. If you have a strong understanding of the &lt;strong&gt;Software Development Lifecycle&lt;/strong&gt; (SDLC) and have worked within a DevOps or Cloud team, you should be well-prepared for the exam.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;p&gt;Though I didn't use any resources for my preparation, I recommend checking out Stephane Maarek's AWS certification courses on Udemy.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.udemy.com/share/101WpU3@IsZWka_V744pFTkUKJyG6jr5J5QDyL__a_b2eUrEQowa0q811H3Psd2D11YwSasl/" rel="noopener noreferrer"&gt;AWS Certified DevOps Engineer Professional 2023 - DOP-C02&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For practice exams, the Tutorials Dojo practice exams by Jon Bonso are a great option.&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
      &lt;div class="c-embed__cover"&gt;
        &lt;a href="https://portal.tutorialsdojo.com/product/aws-certified-devops-engineer-professional-practice-exams/" class="c-link s:max-w-50 align-middle" rel="noopener noreferrer"&gt;
          &lt;img alt="" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ftd-portal-cdn.tutorialsdojo.com%2Fwp-content%2Fuploads%2F2022%2F06%2FAWS-Certified-DevOps-Engineer-Professional-Practice-Exams-DOP-C01-Course-Image.jpg" height="457" class="m-0" width="800"&gt;
        &lt;/a&gt;
      &lt;/div&gt;
    &lt;div class="c-embed__body"&gt;
      &lt;h2 class="fs-xl lh-tight"&gt;
        &lt;a href="https://portal.tutorialsdojo.com/product/aws-certified-devops-engineer-professional-practice-exams/" rel="noopener noreferrer" class="c-link"&gt;
          AWS Certified DevOps Engineer Professional Practice Exams 2025
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;p class="truncate-at-3"&gt;
          Be an AWS DevOps Engineer! 150 AWS Certified DevOps Engineer Professional (DOP-C02) Practice Exam Questions w/ Complete Explanations, References, and Cheat Sheets — all with UNLIMITED ACCESS FOR 1 YEAR so you can study at your own pace, anytime, anywhere. This practice test course comes with four training modes: (1) Timed Mode, (2) Review Mode, (3) Section-Based Tests, and (4) Final Test. Plus Bonus Flashcards to help you pass your AWS Certified DevOps Engineer Professional exam on your first try!
        &lt;/p&gt;
      &lt;div class="color-secondary fs-s flex items-center"&gt;
          &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Ftd-portal-cdn.tutorialsdojo.com%2Fwp-content%2Fuploads%2F2020%2F01%2Fcropped-tutorialsdojo_favicon-BLUE-2-32x32.png" width="32" height="32"&gt;
        portal.tutorialsdojo.com
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;While I didn't prepare extensively for the AWS DevOps Pro Exam, my experience in the field and understanding of DevOps principles helped me pass. Remember that hands-on experience is &lt;strong&gt;invaluable&lt;/strong&gt; when preparing for an exam like this. Good luck with your preparation, and I hope you find these insights helpful!&lt;/p&gt;

&lt;p&gt;Bonus: If you are a video person, check out my &lt;a href="https://youtube.com/@rishabkumar7" rel="noopener noreferrer"&gt;YouTube Channel&lt;/a&gt;, where I talk about Cloud, DevOps and tech.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/xtIZ3AyaRK4"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Feel free to reach out to me on &lt;a href="https://twitter.com/rishabk7" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt; or &lt;a href="https://linkedin.com/in/rishabkumar7" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, if you have any questions.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>certification</category>
    </item>
    <item>
      <title>How I Passed the Microsoft Azure AZ-400 DevOps Engineer Expert Exam</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Mon, 08 May 2023 19:01:14 +0000</pubDate>
      <link>https://forem.com/rishabk7/how-i-passed-the-microsoft-azure-az-400-devops-engineer-expert-exam-5h2c</link>
      <guid>https://forem.com/rishabk7/how-i-passed-the-microsoft-azure-az-400-devops-engineer-expert-exam-5h2c</guid>
      <description>&lt;p&gt;I recently passed the Microsoft Azure AZ-400 DevOps Engineer Expert exam &lt;strong&gt;without any preparation&lt;/strong&gt; , which is designed to test your knowledge and skills in DevOps practices: continuous integration and deployment, infrastructure as code, and monitoring and logging. In this blog post, I want to share my experience of taking the exam without much preparation and how I managed to pass it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exam Objective
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Configure processes and communications&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Design and implement source control&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Design and implement build and release pipelines&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Develop a security and compliance plan&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Implement an instrumentation strategy&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prerequisite
&lt;/h2&gt;

&lt;p&gt;In order to become a Microsoft Certified: DevOps Engineer Expert, you need either &lt;a href="https://learn.microsoft.com/en-us/certifications/azure-administrator/"&gt;Azure Administrator Associate AZ-104&lt;/a&gt; or &lt;a href="https://learn.microsoft.com/en-us/certifications/azure-developer/"&gt;Azure Developer Associate AZ-204.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7IXivyEW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1683563433732/5cb4184d-cf8a-43c9-ad77-e5c2ec294a25.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7IXivyEW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1683563433732/5cb4184d-cf8a-43c9-ad77-e5c2ec294a25.png" alt="" width="800" height="413"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Preparation
&lt;/h2&gt;

&lt;p&gt;First of all, I have to admit that I got lucky to have a voucher for the exam, which I received after participating in the &lt;strong&gt;Microsoft Ignite Cloud Skills Challenge&lt;/strong&gt;. Otherwise, the exam fee is around &lt;strong&gt;$165&lt;/strong&gt;, which is not cheap. I booked the exam last year in November, for February 15th (this year), which was the last day to use the voucher. I did not prepare for the exam because I forgot that I had it scheduled. Two days before the exam, I received a reminder email from Pearson, the exam provider, and I thought I could not learn all the topics in such a short time.&lt;/p&gt;

&lt;p&gt;I was surprised and nervous at the same time, as I did not expect the exam to be that close. I looked at the exam outline and realized that it was extensive, covering various aspects of DevOps practices and Azure services. However, I remembered that I had already gone through the Microsoft Learn modules when I participated in the cloud skills challenge back in November. That gave me some confidence as I had some background knowledge of the topics covered in the exam.&lt;/p&gt;

&lt;p&gt;I also had &lt;strong&gt;one year of experience&lt;/strong&gt; working as a DevOps Engineer with Azure DevOps as the primary tool for pipelines and continuous integration and deployment. That gave me a good understanding of how Azure services work together and how to configure them for different scenarios. Although I had not worked with Azure DevOps for some time, I could still remember the dashboard and some of the workflows I had set up.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exam Experience
&lt;/h2&gt;

&lt;p&gt;When I started the exam, I realized that some questions were challenging and required careful analysis and understanding of the scenarios. One area where I struggled was the different version control systems, such as Perforce and Apache Subversion, which I had not worked with before.&lt;/p&gt;

&lt;p&gt;However, I managed to pass the exam with a score of &lt;strong&gt;710 out of 1000&lt;/strong&gt;, which was just above the passing score of 700. I did the worst in the &lt;strong&gt;Develop a Security and Compliance Plan&lt;/strong&gt; section, which I had not practiced before. I did the best in the &lt;strong&gt;Implement an Instrumentation Strategy&lt;/strong&gt; section, which I was comfortable with because of my experience.&lt;/p&gt;

&lt;p&gt;Although I managed to pass the exam without much preparation, I would not recommend anyone to &lt;strong&gt;take that risk&lt;/strong&gt;. It is better to study and revise the topics covered in the exam, especially if you are not familiar with Azure services and DevOps practices. Also, make sure you practice the exam questions and understand the scenarios presented. A year of experience working with Azure DevOps and Azure services gave me some advantages, but that might not be the case for everyone.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Resources&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Microsoft Learn is your &lt;strong&gt;best friend&lt;/strong&gt; for any Azure certification.&lt;/p&gt;

&lt;p&gt;Go through the &lt;a href="https://learn.microsoft.com/en-us/certifications/devops-engineer/"&gt;AZ-400 Microsoft Learn module&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you are looking for a video-based course, I don't have specific recommendations but highly suggest watching this playlist by John Savill:&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
      &lt;div class="c-embed__cover"&gt;
        &lt;a href="https://www.youtube.com/playlist?list=PLlVtbbG169nFr8RzQ4GIxUEznpNR53ERq" class="c-link s:max-w-50 align-middle" rel="noopener noreferrer"&gt;
          &lt;img alt="" src="https://res.cloudinary.com/practicaldev/image/fetch/s--4AOg5Nls--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://i.ytimg.com/vi/R74bm8IGu2M/hqdefault.jpg%3Fsqp%3D-oaymwEWCKgBEF5IWvKriqkDCQgBFQAAiEIYAQ%3D%3D%26rs%3DAOn4CLAEEJK0L0vnoOplad9WpFDxdpXviQ%26days_since_epoch%3D19486" height="94" class="m-0" width="168"&gt;
        &lt;/a&gt;
      &lt;/div&gt;
    &lt;div class="c-embed__body"&gt;
      &lt;h2 class="fs-xl lh-tight"&gt;
        &lt;a href="https://www.youtube.com/playlist?list=PLlVtbbG169nFr8RzQ4GIxUEznpNR53ERq" rel="noopener noreferrer" class="c-link"&gt;
          
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;p class="truncate-at-3"&gt;
          Make sure to check out the repo for all the code, artifacts and additional recommended videos. https://github.com/johnthebrit/DevOpsMC
        &lt;/p&gt;
      &lt;div class="color-secondary fs-s flex items-center"&gt;
          &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://res.cloudinary.com/practicaldev/image/fetch/s--VL8TsHpK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://www.youtube.com/s/desktop/994ec74a/img/favicon.ico" width="16" height="16"&gt;
        youtube.com
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;If you are a video person, I also have a YouTube video talking about my experience with the exam.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/QCFChF-V24s"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;In conclusion, passing the Microsoft Azure AZ-400 DevOps Engineer Expert exam requires a good understanding of Azure services and DevOps practices. With proper preparation and practice, anyone can pass the exam and earn the certification. If you have any questions about the exam or the certification, feel free to leave a comment below.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>devops</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Deploying a static website to AWS with Pulumi</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Tue, 02 May 2023 12:00:42 +0000</pubDate>
      <link>https://forem.com/aws-builders/deploying-a-static-website-to-aws-with-pulumi-14n1</link>
      <guid>https://forem.com/aws-builders/deploying-a-static-website-to-aws-with-pulumi-14n1</guid>
      <description>&lt;p&gt;Deploying a static website to the cloud has never been easier, thanks to Infrastructure as Code (IaC) tools like Pulumi. If you're like me, a developer who has used Terraform for your IaC needs in the past, Pulumi offers an alternative that allows you to write code in your preferred programming language (TypeScript/JavaScript, Python, Go, .NET, and Java) to provision and manage cloud infrastructure. In this blog post, we will walk through the steps to deploy a static website to Amazon Web Services (AWS) using Pulumi, ps: this is my first time trying it out.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Pulumi
&lt;/h2&gt;

&lt;p&gt;Since I am doing this demo on macOS, installation is easy with Homebrew:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;brew install pulumi/tap/pulumi

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;For Linux, here is the install script:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl -fsSL https://get.pulumi.com | sh

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;If you are on Windows, you can download the MSI &lt;a href="https://www.pulumi.com/docs/get-started/install/"&gt;here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After installation, here is the list of available commands:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@Rishabs-MacBook-Pro ~ pulumi
Usage:
  pulumi [command]

Available Commands:
  about           Print information about the Pulumi environment.
  cancel          Cancel a stack's currently running update, if any
  config          Manage configuration
  console         Opens the current stack in the Pulumi Console
  convert         Convert Pulumi programs from a supported source program into other supported languages
  destroy         Destroy all existing resources in the stack
  gen-completion  Generate completion scripts for the Pulumi CLI
  help            Help about any command
  import          Import resources into an existing stack
  login           Log in to the Pulumi Cloud
  logout          Log out of the Pulumi Cloud
  logs            Show aggregated resource logs for a stack
  new             Create a new Pulumi project
  org             Manage Organization configuration
  package         Work with Pulumi packages
  plugin          Manage language and resource provider plugins
  policy          Manage resource policies
  preview         Show a preview of updates to a stack's resources
  refresh         Refresh the resources in a stack
  schema          Analyze package schemas
  stack           Manage stacks
  state           Edit the current stack's state
  up              Create or update the resources in a stack
  version         Print Pulumi's version number
  watch           Continuously update the resources in a stack
  whoami          Display the current logged-in user

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Other Requirements
&lt;/h2&gt;

&lt;p&gt;Make sure you have AWS CLI configured. You can read more on how to download and configure AWS CLI &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html"&gt;here.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And for the static website, I am using my terminal-portfolio as an example.&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--A9-wwsHG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/rishabkumar7"&gt;
        rishabkumar7
      &lt;/a&gt; / &lt;a href="https://github.com/rishabkumar7/terminal-portfolio"&gt;
        terminal-portfolio
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;h1&gt;
terminal-portfolio&lt;/h1&gt;
&lt;/div&gt;

  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/rishabkumar7/terminal-portfolio"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;



&lt;h2&gt;
  
  
  Deploying the site
&lt;/h2&gt;

&lt;p&gt;Now, let's create a new directory for our project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir static-website &amp;amp;&amp;amp; static-website

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We'll be using &lt;code&gt;pulumi new&lt;/code&gt; to initialize a new Pulumi project in my favorite programming language, Python.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pulumi new static-website-aws-python

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Go through the prompts to configure the project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SEyDj0Ru--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742160004/280b68ee-3a58-46ae-abac-6f77cdbafa1e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SEyDj0Ru--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742160004/280b68ee-3a58-46ae-abac-6f77cdbafa1e.png" alt="Configuring the Pulumi project with CLI" width="800" height="369"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It will ask you to either paste your access token or log in using your browser.&lt;/p&gt;

&lt;p&gt;If you haven't created a Pulumi account, go ahead and hit &lt;code&gt;enter&lt;/code&gt;; it will launch the browser and take you to the sign-in page, where you can sign up for a Pulumi account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--odzKI0RY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742281084/f9db2aad-c0e0-4030-9baa-a825b21417f3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--odzKI0RY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742281084/f9db2aad-c0e0-4030-9baa-a825b21417f3.png" alt="Sign up for the Pulumi Account" width="800" height="555"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After the account creation is complete, go back to your terminal, and you'll see that the authentication was successful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HZVrR3tJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742403128/a45445ae-914a-4ee6-b0dd-113da6c30329.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HZVrR3tJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742403128/a45445ae-914a-4ee6-b0dd-113da6c30329.png" alt="Pulumi CLI will configure and ask for basic project info" width="800" height="411"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's go through the project creation setup. After login is complete, it will prompt you with project configurations:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7WYiXSUQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742471639/c32200a0-ba54-4216-be8d-7c51063b10ba.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7WYiXSUQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742471639/c32200a0-ba54-4216-be8d-7c51063b10ba.png" alt="Project configurations and settings" width="800" height="460"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lU1YnMvO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742541913/ca50d02c-74b3-4b15-8f9d-17ae80b6dcf2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lU1YnMvO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742541913/ca50d02c-74b3-4b15-8f9d-17ae80b6dcf2.png" alt="Pulumi Finished setting up the project" width="800" height="460"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now you have a finished project that's ready to be deployed, configured with the most common settings.&lt;/p&gt;

&lt;p&gt;Also, let's inspect the code in &lt;code&gt;__main__.py&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import pulumi
import pulumi_aws as aws
import pulumi_synced_folder as synced_folder

# Import the program's configuration settings.
config = pulumi.Config()
path = config.get("path") or "./www"
index_document = config.get("indexDocument") or "index.html"
error_document = config.get("errorDocument") or "error.html"

# Create an S3 bucket and configure it as a website.
bucket = aws.s3.Bucket(
    "bucket",
    website=aws.s3.BucketWebsiteArgs(
        index_document=index_document,
        error_document=error_document,
    ),
)

# Set ownership controls for the new bucket
ownership_controls = aws.s3.BucketOwnershipControls(
    "ownership-controls",
    bucket=bucket.bucket,
    rule=aws.s3.BucketOwnershipControlsRuleArgs(
        object_ownership="ObjectWriter",
    )
)

# Configure public ACL block on the new bucket
public_access_block = aws.s3.BucketPublicAccessBlock(
    "public-access-block",
    bucket=bucket.bucket,
    block_public_acls=False,
)

# Use a synced folder to manage the files of the website.
bucket_folder = synced_folder.S3BucketFolder(
    "bucket-folder",
    acl="public-read",
    bucket_name=bucket.bucket,
    path=path,
    opts=pulumi.ResourceOptions(depends_on=[
        ownership_controls,
        public_access_block
    ])
)

# Create a CloudFront CDN to distribute and cache the website.
cdn = aws.cloudfront.Distribution(
    "cdn",
    enabled=True,
    origins=[
        aws.cloudfront.DistributionOriginArgs(
            origin_id=bucket.arn,
            domain_name=bucket.website_endpoint,
            custom_origin_config=aws.cloudfront.DistributionOriginCustomOriginConfigArgs(
                origin_protocol_policy="http-only",
                http_port=80,
                https_port=443,
                origin_ssl_protocols=["TLSv1.2"],
            ),
        )
    ],
    default_cache_behavior=aws.cloudfront.DistributionDefaultCacheBehaviorArgs(
        target_origin_id=bucket.arn,
        viewer_protocol_policy="redirect-to-https",
        allowed_methods=[
            "GET",
            "HEAD",
            "OPTIONS",
        ],
        cached_methods=[
            "GET",
            "HEAD",
            "OPTIONS",
        ],
        default_ttl=600,
        max_ttl=600,
        min_ttl=600,
        forwarded_values=aws.cloudfront.DistributionDefaultCacheBehaviorForwardedValuesArgs(
            query_string=True,
            cookies=aws.cloudfront.DistributionDefaultCacheBehaviorForwardedValuesCookiesArgs(
                forward="all",
            ),
        ),
    ),
    price_class="PriceClass_100",
    custom_error_responses=[
        aws.cloudfront.DistributionCustomErrorResponseArgs(
            error_code=404,
            response_code=404,
            response_page_path=f"/{error_document}",
        )
    ],
    restrictions=aws.cloudfront.DistributionRestrictionsArgs(
        geo_restriction=aws.cloudfront.DistributionRestrictionsGeoRestrictionArgs(
            restriction_type="none",
        ),
    ),
    viewer_certificate=aws.cloudfront.DistributionViewerCertificateArgs(
        cloudfront_default_certificate=True,
    ),
)

# Export the URLs and hostnames of the bucket and distribution.
pulumi.export("originURL", pulumi.Output.concat("http://", bucket.website_endpoint))
pulumi.export("originHostname", bucket.website_endpoint)
pulumi.export("cdnURL", pulumi.Output.concat("https://", cdn.domain_name))
pulumi.export("cdnHostname", cdn.domain_name)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So the template requires no additional configuration. Once the new project is created, you can deploy it immediately with &lt;a href="https://www.pulumi.com/docs/reference/cli/pulumi_up"&gt;&lt;code&gt;pulumi up&lt;/code&gt;&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PmMmwOH2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742572075/16c83a7b-99af-4388-b981-77a409f3b68a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PmMmwOH2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742572075/16c83a7b-99af-4388-b981-77a409f3b68a.png" alt="Pulumi will look at the code and identify resources to be deployed" width="800" height="588"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You'll be prompted to confirm the update; type &lt;code&gt;yes&lt;/code&gt; and hit &lt;code&gt;enter&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OBXN14m---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742663919/404c69aa-ba00-463d-9045-caa0b14b048c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OBXN14m---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742663919/404c69aa-ba00-463d-9045-caa0b14b048c.png" alt="Pulumi starts deploying the resources" width="800" height="619"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see, it created 8 resources and gave us the following outputs:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Config&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;cdnHostname&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The provider-assigned hostname of the CloudFront CDN. Useful for creating &lt;code&gt;CNAME&lt;/code&gt; records to associate custom domains.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;cdnURL&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The fully-qualified HTTPS URL of the CloudFront CDN.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;originHostname&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The provider-assigned hostname of the S3 bucket.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;originURL&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The fully-qualified HTTP URL of the S3 bucket endpoint.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Let's check out our static website by visiting the &lt;code&gt;cdnURL&lt;/code&gt;, which will look something like this - &lt;a href="https://d2384wrx9ddsro.cloudfront.net/"&gt;https://d2384wrx9ddsro.cloudfront.net/&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ctReq5KQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742932248/f440105d-a7ef-4621-a6ae-7ca1452b7c72.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ctReq5KQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742932248/f440105d-a7ef-4621-a6ae-7ca1452b7c72.png" alt="My static site served by S3 bucket" width="800" height="332"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Aye! We have a static website running on AWS!&lt;/p&gt;

&lt;h2&gt;
  
  
  Customizing the site
&lt;/h2&gt;

&lt;p&gt;To customize the website to be my &lt;code&gt;terminal portfolio&lt;/code&gt;, I am going to clone the GitHub repository into our &lt;code&gt;static-website&lt;/code&gt; directory.&lt;/p&gt;

&lt;p&gt;So, this is what my directory structure looks like now:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XOqpJhOd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682743104712/ca0bf118-dc8e-4bfa-b29b-d5aa8da3833f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XOqpJhOd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682743104712/ca0bf118-dc8e-4bfa-b29b-d5aa8da3833f.png" alt="My directory structure for the project with the new folder" width="800" height="320"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And then, using &lt;a href="https://www.pulumi.com/docs/reference/cli/pulumi_config_set"&gt;&lt;code&gt;pulumi config set&lt;/code&gt;&lt;/a&gt;, I am going to point the &lt;code&gt;path&lt;/code&gt; setting to the &lt;code&gt;terminal-portfolio&lt;/code&gt; folder instead of the &lt;code&gt;www&lt;/code&gt; folder:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pulumi config set path terminal-portfolio

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And then let's deploy the changes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pulumi up

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4oFvrJ0P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682743164107/88497ed1-bd99-484e-b6b7-67543985ab9b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4oFvrJ0P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682743164107/88497ed1-bd99-484e-b6b7-67543985ab9b.png" alt="Pulumi detects the changes to the config and deploys them" width="800" height="660"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can see that the new changes have been deployed, but when you navigate to the CDN URL, it might still show the old "Hello, World!" website. That's due to caching: by default, the generated program configures the CloudFront CDN to cache files for 600 seconds (10 minutes).&lt;/p&gt;

&lt;p&gt;But we can change that!&lt;/p&gt;

&lt;p&gt;Let's look at &lt;code&gt;__main__.py&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;default_cache_behavior=aws.cloudfront.DistributionDefaultCacheBehaviorArgs(
        target_origin_id=bucket.arn,
        viewer_protocol_policy="redirect-to-https",
        allowed_methods=[
            "GET",
            "HEAD",
            "OPTIONS",
        ],
        cached_methods=[
            "GET",
            "HEAD",
            "OPTIONS",
        ],
        default_ttl=600,
        max_ttl=600,
        min_ttl=600,

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can change those values to your desired settings.&lt;/p&gt;
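&lt;p&gt;For example, to make changes show up faster during development you might lower all three TTLs. A minimal sketch of the adjusted values, shown as a plain Python dict for readability (in &lt;code&gt;__main__.py&lt;/code&gt; they would be the &lt;code&gt;default_ttl&lt;/code&gt;, &lt;code&gt;max_ttl&lt;/code&gt;, and &lt;code&gt;min_ttl&lt;/code&gt; keyword arguments to &lt;code&gt;DistributionDefaultCacheBehaviorArgs&lt;/code&gt;):&lt;/p&gt;

```python
# Hypothetical development-time cache TTLs (in seconds). In __main__.py these
# would be passed as default_ttl=, max_ttl=, and min_ttl= keyword arguments to
# aws.cloudfront.DistributionDefaultCacheBehaviorArgs; they are shown here as a
# plain dict for readability.
dev_cache_ttls = {
    "default_ttl": 60,  # serve cached copies for at most one minute
    "max_ttl": 60,
    "min_ttl": 0,       # let origin Cache-Control headers shorten it further
}
print(dev_cache_ttls)
```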

&lt;p&gt;Let's check my terminal-portfolio again by visiting the CDN URL - &lt;a href="https://d2384wrx9ddsro.cloudfront.net/"&gt;https://d2384wrx9ddsro.cloudfront.net&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QIy6IB1a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742091909/37448cd6-c564-4061-a162-c9f49c59e95c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QIy6IB1a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://cdn.hashnode.com/res/hashnode/image/upload/v1682742091909/37448cd6-c564-4061-a162-c9f49c59e95c.png" alt="My terminal-portfolio website is now being served by S3" width="800" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Voila!&lt;/p&gt;

&lt;p&gt;In conclusion, Pulumi is a powerful tool for deploying and managing cloud infrastructure with ease. Whether you're new to Pulumi or have used it before, this blog post has shown you how to use it to deploy a static website to AWS.&lt;/p&gt;

&lt;p&gt;Follow me here on &lt;a href="https://dev.to/rishabk7"&gt;dev.to&lt;/a&gt; or &lt;a href="https://twitter.com/rishabk7"&gt;Twitter&lt;/a&gt;/&lt;a href="https://linkedin.com/in/rishabkumar7"&gt;LinkedIn&lt;/a&gt; to stay up-to-date with my latest blog posts and tech tutorials.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>cloud</category>
      <category>iac</category>
    </item>
    <item>
      <title>AWS CLI Cheat Sheet: Quick Reference Guide for Cloud Developers</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Fri, 10 Mar 2023 19:55:34 +0000</pubDate>
      <link>https://forem.com/rishabk7/aws-cli-cheat-sheet-quick-reference-guide-for-cloud-developers-1kfg</link>
      <guid>https://forem.com/rishabk7/aws-cli-cheat-sheet-quick-reference-guide-for-cloud-developers-1kfg</guid>
      <description>&lt;p&gt;With cloud computing becoming more popular and AWS being one of the leading cloud providers, it's essential for developers to understand how to use the AWS Command Line Interface (CLI).&lt;/p&gt;

&lt;p&gt;The command line interface (CLI) is a powerful tool that allows developers to manage AWS resources and services from the command line, and it can greatly improve your workflow. However, with so many commands and options available, getting started can be overwhelming for beginners. This is where my AWS CLI Cheat Sheet comes into play: it provides a concise yet comprehensive reference covering the most commonly used AWS CLI commands for services like EC2, IAM, and S3. Whether you're new to AWS or an experienced developer looking to improve your workflow, this cheat sheet will help.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installation
&lt;/h2&gt;

&lt;p&gt;First, you will need to install the AWS CLI on your machine. Below are the instructions for installing the latest version of the AWS CLI:&lt;/p&gt;

&lt;h3&gt;
  
  
  Linux:
&lt;/h3&gt;

&lt;p&gt;x86&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Linux Arm&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl "https://awscli.amazonaws.com/awscli-exe-linux-aarch64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  MacOS:
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl "https://awscli.amazonaws.com/AWSCLIV2.pkg" -o "AWSCLIV2.pkg"
sudo installer -pkg AWSCLIV2.pkg -target /

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Windows:
&lt;/h3&gt;

&lt;p&gt;Download and run the AWS CLI MSI installer for Windows (64-bit)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://awscli.amazonaws.com/AWSCLIV2.msi"&gt;https://awscli.amazonaws.com/AWSCLIV2.msi&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Alternatively, you can run the &lt;code&gt;msiexec&lt;/code&gt; command to run the MSI installer.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;msiexec.exe /i https://awscli.amazonaws.com/AWSCLIV2.msi

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that we have the AWS CLI installed, let's cover the commands to configure it with our AWS account.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configuring AWS CLI
&lt;/h2&gt;

&lt;p&gt;In order to authenticate to our AWS account, you will need to generate an Access key and secret access key for an IAM user.&lt;/p&gt;

&lt;p&gt;You can refer to the AWS documentation on how to create access keys for IAM users - &lt;a href="https://aws.amazon.com/premiumsupport/knowledge-center/create-access-key/"&gt;Create an AWS access key&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The AWS CLI stores our information in a &lt;em&gt;profile&lt;/em&gt; named &lt;code&gt;default&lt;/code&gt; in the &lt;code&gt;credentials&lt;/code&gt; file. By default, the information in this profile is used when you run an AWS CLI command that doesn't explicitly specify a profile to use.&lt;/p&gt;

&lt;p&gt;The following example shows how you can configure the AWS CLI. Replace them with your own values:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: ca-central-1
Default output format [None]: json

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
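&lt;p&gt;Under the hood, &lt;code&gt;aws configure&lt;/code&gt; writes two plain-text files in your home directory: &lt;code&gt;~/.aws/credentials&lt;/code&gt; and &lt;code&gt;~/.aws/config&lt;/code&gt;. With the placeholder values above, they look roughly like this:&lt;/p&gt;

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

# ~/.aws/config
[default]
region = ca-central-1
output = json
```

&lt;p&gt;Additional named profiles can be stored alongside &lt;code&gt;default&lt;/code&gt; and selected per-command with the &lt;code&gt;--profile&lt;/code&gt; flag.&lt;/p&gt;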



&lt;p&gt;We now have the AWS CLI configured to work with our AWS account. Let's go over the commands to interact with the AWS EC2, IAM and S3 services.&lt;/p&gt;

&lt;h2&gt;
  
  
  AWS EC2
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Keypairs
&lt;/h3&gt;

&lt;p&gt;list all keypairs:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 describe-key-pairs&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;create a keypair:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 create-key-pair --key-name &amp;lt;key-name&amp;gt; --output text&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;create a new local private/public keypair, using RSA 4096-bit:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ssh-keygen -t rsa -b 4096&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;import an existing keypair:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 import-key-pair --key-name keyname_test --public-key-material file:///home/rkumar/id_rsa.pub&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;delete a keypair:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 delete-key-pair --key-name &amp;lt;key-name&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Images
&lt;/h3&gt;

&lt;p&gt;list all private AMIs, with ImageId and Name tags:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 describe-images --filter "Name=is-public,Values=false" --query 'Images[].[ImageId, Name]' --output text

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;delete an AMI, by ImageId:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 deregister-image --image-id ami-00000000&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Instances
&lt;/h3&gt;

&lt;p&gt;list all instances (running, and not running):&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 describe-instances&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list all instances running:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 describe-instances --filters Name=instance-state-name,Values=running&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;create a new instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 run-instances --image-id ami-a0b1234 --instance-type t2.micro --security-group-ids sg-00000000 --dry-run

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;terminate an instance:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 terminate-instances --instance-ids &amp;lt;instance_id&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list status of all instances:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 describe-instance-status&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list status of a specific instance:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 describe-instance-status --instance-ids &amp;lt;instance_id&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list all running instances, with Name tag and Public IP Address:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 describe-instances --filters Name=instance-state-name,Values=running --query 'Reservations[].Instances[].[PublicIpAddress, Tags[?Key==Name].Value | [0] ]' --output text

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
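&lt;p&gt;The &lt;code&gt;--query&lt;/code&gt; option uses the JMESPath language. To make the expression above less opaque, here is a rough plain-Python equivalent, run against a trimmed, made-up &lt;code&gt;describe-instances&lt;/code&gt; response (the sample data is illustrative, not real output):&lt;/p&gt;

```python
# A trimmed, illustrative shape of an `aws ec2 describe-instances` response.
response = {
    "Reservations": [
        {"Instances": [{"PublicIpAddress": "203.0.113.10",
                        "Tags": [{"Key": "Name", "Value": "web-1"}]}]},
        {"Instances": [{"PublicIpAddress": "203.0.113.11",
                        "Tags": [{"Key": "Env", "Value": "prod"},
                                 {"Key": "Name", "Value": "web-2"}]}]},
    ]
}

# Equivalent of the JMESPath query:
#   Reservations[].Instances[].[PublicIpAddress, Tags[?Key==`Name`].Value | [0]]
rows = []
for reservation in response["Reservations"]:
    for instance in reservation["Instances"]:
        # Pull the first tag whose Key is "Name", if any.
        name = next((tag["Value"] for tag in instance.get("Tags", [])
                     if tag["Key"] == "Name"), None)
        rows.append((instance.get("PublicIpAddress"), name))

print(rows)  # [('203.0.113.10', 'web-1'), ('203.0.113.11', 'web-2')]
```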



&lt;h3&gt;
  
  
  Security Groups
&lt;/h3&gt;

&lt;p&gt;list all security groups:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 describe-security-groups&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;create a security group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 create-security-group --vpc-id vpc-1a2b3c4d --group-name web-access --description "web access"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list details about a security group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 describe-security-groups --group-id sg-0000000&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;open port 80, for everyone:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 authorize-security-group-ingress --group-id sg-0000000 --protocol tcp --port 80 --cidr 0.0.0.0/0&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;get my public ip:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;my_ipaddress=$(dig +short myip.opendns.com @resolver1.opendns.com); echo $my_ipaddress&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;open port 22, just for my ip:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 authorize-security-group-ingress --group-id sg-0000000 --protocol tcp --port 22 --cidr $my_ipaddress/32&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;remove a firewall rule from a group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 revoke-security-group-ingress --group-id sg-0000000 --protocol tcp --port 80 --cidr 0.0.0.0/0&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;delete a security group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws ec2 delete-security-group --group-id sg-00000000&lt;/code&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  AWS IAM
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Users
&lt;/h3&gt;

&lt;p&gt;list all users' info:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-users&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list all users' usernames:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-users --output text | cut -f 6&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list current user's info:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam get-user&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list current user's access keys:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-access-keys&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;create a new user:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws iam create-user --user-name UserName

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;create multiple new users, from a file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;allUsers=$(cat ./user-names.txt) for userName in $allUsers; do aws iam create-user --user-name $userName done

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;list all users:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-users --no-paginate&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;get a specific user's info:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam get-user --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;delete one user:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam delete-user --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;delete all users:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;allUsers=$(aws iam list-users --output text | cut -f 6);
allUsers=$(cat ./user-names.txt) for userName in $allUsers; do aws iam delete-user 
--user-name $userName done

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Access Keys
&lt;/h3&gt;

&lt;p&gt;list all access keys:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-access-keys&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list access keys of a specific user:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-access-keys --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;create a new access key:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam create-access-key --user-name UserName --output text | tee UserName.txt&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list last access time of an access key:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam get-access-key-last-used --access-key-id AKSZZRR7RKZY4EXAMPLE&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;deactivate an access key:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam update-access-key --access-key-id AKIZNAA7RKZY4EXAMPLE --status Inactive --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;delete an access key:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam delete-access-key --access-key-id AKIZNAA7RKZY4EXAMPLE --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Group and Policies
&lt;/h3&gt;

&lt;p&gt;list all groups:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-groups&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;create a group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam create-group --group-name GroupName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;delete a group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam delete-group --group-name GroupName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list all policies:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-policies&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;get a specific policy:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam get-policy --policy-arn &amp;lt;policy-arn&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list all users, groups, and roles, for a given policy:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-entities-for-policy --policy-arn &amp;lt;policy-arn&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list policies, for a given group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-attached-group-policies --group-name GroupName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;add a policy to a group:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws iam attach-group-policy --group-name GroupName --policy-arn arn:aws:iam::aws:policy/AdministratorAccess

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;add a user to a group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam add-user-to-group --group-name GroupName --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list users, for a given group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam get-group --group-name GroupName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list groups, for a given user:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam list-groups-for-user --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;remove a user from a group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam remove-user-from-group --group-name GroupName --user-name UserName&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;remove a policy from a group:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws iam detach-group-policy --group-name GroupName --policy-arn arn:aws:iam::aws:policy/AdministratorAccess&lt;/code&gt;&lt;/p&gt;





&lt;h2&gt;
  
  
  AWS S3
&lt;/h2&gt;

&lt;p&gt;list buckets:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 ls&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;list bucket content:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 ls s3://&amp;lt;bucketName&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;make bucket:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 mb s3://&amp;lt;bucketName&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;remove empty bucket:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 rb s3://&amp;lt;bucketName&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;copy to bucket:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 cp &amp;lt;object&amp;gt; s3://&amp;lt;bucketName&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;copy from bucket:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 cp s3://&amp;lt;bucketName&amp;gt;/&amp;lt;object&amp;gt; &amp;lt;destination&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;move object:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 mv s3://&amp;lt;bucketName&amp;gt;/&amp;lt;object&amp;gt; &amp;lt;destination&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;sync objects:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 sync &amp;lt;local&amp;gt; s3://&amp;lt;bucketName&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;remove an object:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws s3 rm s3://&amp;lt;bucketName&amp;gt;/&amp;lt;object&amp;gt;&lt;/code&gt;&lt;/p&gt;




&lt;p&gt;You can download the PDF version of the AWS CLI cheat sheets here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.buymeacoffee.com/rishabincloud/e/122221"&gt;AWS EC2&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Hope you liked this post, feel free to reach out to me on &lt;a href="https://twitter.com/rishabk7"&gt;Twitter&lt;/a&gt; or &lt;a href="https://linkedin.com/in/rishabkumar7"&gt;LinkedIn&lt;/a&gt;.&lt;br&gt;&lt;br&gt;
Happy Coding!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>What is the Cloud Resume Challenge?</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Fri, 10 Feb 2023 15:55:32 +0000</pubDate>
      <link>https://forem.com/aws-builders/what-is-the-cloud-resume-challenge-ma5</link>
      <guid>https://forem.com/aws-builders/what-is-the-cloud-resume-challenge-ma5</guid>
      <description>&lt;p&gt;In this blog post, I'll talk about the AWS Cloud Resume Challenge and my experience with it. I had just finished my &lt;a href="https://blog.rishabkumar.com/a-cloud-guru-azure-resume-challenge-2021" rel="noopener noreferrer"&gt;Azure Cloud Resume Challenge&lt;/a&gt; and was eager to get started on the AWS version. Despite the fact that I had finished a portfolio website hosted on S3 with CloudFront, it was not a complete version of the Cloud Resume Challenge. The challenge didn't exist at the time, and my portfolio lacked some of the principles mentioned in it, such as a database aspect and Infrastructure as Code (IAC).&lt;/p&gt;

&lt;p&gt;So I decided to make my own version of the Cloud Resume Challenge, complete with accompanying videos. The challenge's website is called &lt;a href="https://cloudresumechallenge.dev/" rel="noopener noreferrer"&gt;Cloud Resume Challenge dot dev&lt;/a&gt;, and the AWS version is available there.&lt;/p&gt;

&lt;h2&gt;
  
  
  AWS Cloud Resume Challenge
&lt;/h2&gt;

&lt;p&gt;These are the steps that are part of the challenge:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Certification&lt;/li&gt;
&lt;li&gt;HTML&lt;/li&gt;
&lt;li&gt;CSS&lt;/li&gt;
&lt;li&gt;Static Website&lt;/li&gt;
&lt;li&gt;HTTPS&lt;/li&gt;
&lt;li&gt;DNS&lt;/li&gt;
&lt;li&gt;Javascript&lt;/li&gt;
&lt;li&gt;Database&lt;/li&gt;
&lt;li&gt;API&lt;/li&gt;
&lt;li&gt;Python&lt;/li&gt;
&lt;li&gt;Tests&lt;/li&gt;
&lt;li&gt;Infrastructure as Code&lt;/li&gt;
&lt;li&gt;Source Control&lt;/li&gt;
&lt;li&gt;CI/CD (Back end)&lt;/li&gt;
&lt;li&gt;CI/CD (Front end)&lt;/li&gt;
&lt;li&gt;Blog post&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fib64z1u0xs01twl70r27.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fib64z1u0xs01twl70r27.png" alt="AWS Cloud Resume Challenge"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Certification
&lt;/h3&gt;

&lt;p&gt;The first step in completing the challenge is to work towards a certification, which I already hold. Because I have four AWS Cloud certifications, I can skip this step.&lt;/p&gt;

&lt;h3&gt;
  
  
  HTML &amp;amp; CSS
&lt;/h3&gt;

&lt;p&gt;The next step is to use HTML, CSS, and JavaScript to create a static website for your resume. This website should be an HTML website rather than a Word document. Forrest, the creator of the Cloud Resume Challenge, demonstrates how this website should look.&lt;/p&gt;

&lt;p&gt;Back in 2018, I built a similar portfolio website. You can find free templates online by searching for "HTML5 portfolio free templates"; html5up.net is a recommended source, with a variety of templates that can be downloaded and customised to your liking. I used a template, changed the theme to purple (my favourite colour), and added my own personal information to the website.&lt;/p&gt;

&lt;p&gt;To complete this step, you will need to familiarize yourself with HTML and CSS, as most of the code is already written for you. You simply need to edit the HTML files with your own information.&lt;/p&gt;

&lt;h3&gt;
  
  
  S3 and CloudFront
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fym4nhm7kkf7ifg1ldqiz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fym4nhm7kkf7ifg1ldqiz.png" alt="AWS Cloud Resume Challenge"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next step is to host your static website in an S3 bucket and enable HTTPS using CloudFront, AWS's CDN service. In a later video, I'll show you how to create the S3 bucket, upload your static website, configure HTTPS, and get a free domain or buy one from Namecheap.&lt;/p&gt;
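&lt;p&gt;As a rough sketch, the S3 part of that setup can be done with the AWS CLI. The bucket name, region, and local folder below are placeholders, and this deliberately omits the ACM certificate and DNS alias a custom HTTPS domain also needs:&lt;/p&gt;

```shell
# Placeholders: my-resume-site, us-east-1, ./site
aws s3 mb s3://my-resume-site --region us-east-1

# Turn on static website hosting, then upload the site files
aws s3 website s3://my-resume-site \
  --index-document index.html --error-document error.html
aws s3 sync ./site s3://my-resume-site

# Put CloudFront in front of the bucket to serve it over HTTPS
aws cloudfront create-distribution \
  --origin-domain-name my-resume-site.s3-website-us-east-1.amazonaws.com
```

&lt;p&gt;The challenge itself leaves the exact tooling up to you; the console or Terraform work just as well for this step.&lt;/p&gt;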

&lt;h3&gt;
  
  
  What's Next?
&lt;/h3&gt;

&lt;p&gt;The steps that follow include creating a DynamoDB database, writing Python Lambda functions, and using Terraform for Infrastructure as Code (IaC). I'll also be using GitHub for source control and setting up Continuous Integration and Continuous Deployment (CI/CD) for my website's frontend and backend.&lt;/p&gt;
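&lt;p&gt;To make those backend steps concrete, here is a minimal sketch of what the visitor-counter Lambda logic could look like. This is illustrative only, not the challenge's required implementation: the in-memory dict stands in for the DynamoDB table, and the names (increment_views, build_response, the views field) are my own assumptions.&lt;/p&gt;

```python
import json

# In-memory stand-in for the DynamoDB table, so the logic runs without AWS.
_COUNTS = {}


def increment_views(counts, key):
    # Models DynamoDB's atomic counter update (UpdateItem with an ADD action).
    counts[key] = counts.get(key, 0) + 1
    return counts[key]


def build_response(count):
    # The response shape an API Gateway / Lambda proxy integration expects.
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"views": count}),
    }


def handler(event, context):
    # A real handler would call boto3's Table.update_item here instead.
    total = increment_views(_COUNTS, "resume")
    return build_response(total)
```

&lt;p&gt;The site's JavaScript then just fetches this endpoint and renders the views value from the JSON body.&lt;/p&gt;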

&lt;p&gt;Finally, I'm excited to share my experience with the AWS Cloud Resume Challenge and will be posting weekly updates on my progress. Subscribe to &lt;a href="https://www.youtube.com/channel/UCtLwBE6ZNXnQdQp5o36BUxA" rel="noopener noreferrer"&gt;my channel&lt;/a&gt; to stay up to date, and join me in the next video as I begin building my website.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/NNKzYhvqq5w"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>resume</category>
      <category>challenge</category>
    </item>
    <item>
      <title>Dev Retro 2022 - The year I changed 2 jobs and left my dream company</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Thu, 05 Jan 2023 18:35:00 +0000</pubDate>
      <link>https://forem.com/rishabk7/dev-retro-2022-the-year-i-changed-2-jobs-and-left-my-dream-company-2emc</link>
      <guid>https://forem.com/rishabk7/dev-retro-2022-the-year-i-changed-2-jobs-and-left-my-dream-company-2emc</guid>
      <description>&lt;h2&gt;
  
  
  New Beginnings
&lt;/h2&gt;

&lt;p&gt;At the start of 2022, I finally left the company I had been working for since I started my tech career in Tech Support. It was hard to say farewell to a company where I had amazing teammates, peers, and mentors who supported me throughout my journey from Tech Support to Cloud Engineer to DevOps Engineer.&lt;/p&gt;

&lt;p&gt;I started at Google as a Technical Solutions Specialist in January 2022; working at Google had been a dream of mine.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/DY2gJhJaQfI"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Ending my time at Google after four months was a hard decision, but it was better for my mental health! More on why I left Google:&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/M9DvAms4klw"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;After quitting Google, I was on a job search and was looking forward to entering the DevRel space. And after a few interviews at different companies, I joined Twilio as a Staff Developer Evangelist.&lt;/p&gt;

&lt;p&gt;Check out my intro blog post on Twilio's blog.&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
      &lt;div class="c-embed__cover"&gt;
        &lt;a href="https://www.twilio.com/en-us/blog/introducing-rishab-kumar" class="c-link s:max-w-50 align-middle" rel="noopener noreferrer"&gt;
          &lt;img alt="" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.twilio.com%2Fcontent%2Fdam%2Ftwilio-com%2Fcore-assets%2Fsocial%2Ftwilio-blog-default-ogimage.png" height="418" class="m-0" width="800"&gt;
        &lt;/a&gt;
      &lt;/div&gt;
    &lt;div class="c-embed__body"&gt;
      &lt;h2 class="fs-xl lh-tight"&gt;
        &lt;a href="https://www.twilio.com/en-us/blog/introducing-rishab-kumar" rel="noopener noreferrer" class="c-link"&gt;
          Introducing Developer Evangelist Rishab Kumar | Twilio
        &lt;/a&gt;
      &lt;/h2&gt;
      &lt;div class="color-secondary fs-s flex items-center"&gt;
          &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.twilio.com%2Fcontent%2Fdam%2Ftwilio-com%2Fcore-assets%2Fsocial%2Ffavicon-32x32.png" width="32" height="32"&gt;
        twilio.com
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;


&lt;h2&gt;
  
  
  Working in DevRel
&lt;/h2&gt;

&lt;p&gt;Once my new job started, I found that working in DevRel had its own challenges.&lt;/p&gt;

&lt;p&gt;The first one was speaking at large events, which I hadn't done before. My first event was Collision in Toronto, just four weeks after joining Twilio, where I presented on Twilio's SMS API in Python.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhntpzzkovxcotwugfi2l.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhntpzzkovxcotwugfi2l.jpeg" alt="Me presenting at Collision, my first tech demo at in-person event" width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In total, I spoke at 5 different in-person events in the following cities:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Toronto, ON&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Boston, MA&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Manhattan, NY&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Rock Hill, NY&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Attending AWS re:Invent had been on my list since 2018, and Twilio made it possible. I was able to attend the conference this year in Las Vegas and met so many amazing community members.&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1598828702879846403-987" src="https://platform.twitter.com/embed/Tweet.html?id=1598828702879846403"&gt;
&lt;/iframe&gt;


&lt;/p&gt;

&lt;h2&gt;
  
  
  Learning and Teaching
&lt;/h2&gt;

&lt;p&gt;This year I spent more time teaching than learning, a first for me. I mentored at 3 hackathons, provided 1-on-1 mentorship to various students and professionals as time permitted, and, in what I think was my biggest milestone this year, taught Cloud Computing at the college I graduated from 4 years ago!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fssdno3zeu29q2o8sehpl.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fssdno3zeu29q2o8sehpl.jpeg" alt="My student ID and Staff badge at the same college, student in 2018 and staff in 2022" width="800" height="598"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Apart from this, I also contributed to Learn to Cloud, a free open-source guide to help you get into the cloud. Speaking of &lt;em&gt;Learn to Cloud&lt;/em&gt;, I also recorded a cloud podcast for it with my friend Gwyn.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/y8J4fKRCnTU"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  Community
&lt;/h2&gt;

&lt;p&gt;As you can probably tell, I produced a good amount of content for the cloud and DevOps community this year: 52 YouTube videos, 12 podcast episodes, and 7 technical articles in total.&lt;/p&gt;

&lt;p&gt;Hoping to keep this momentum and provide more value to the community!&lt;/p&gt;

&lt;p&gt;All this hard work did pay off as some amazing people nominated me for the &lt;a href="https://betakit.com/canadas-2022-developer-30-under-30/" rel="noopener noreferrer"&gt;Canada Developer 30 under 30 award.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3825q2b489pb2a0pkq4.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3825q2b489pb2a0pkq4.jpeg" alt="Me holding the Canada Developer 30 Under 30 award" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  2023
&lt;/h2&gt;

&lt;p&gt;I am looking forward to building some amazing connections in the community in 2023 and to continuing to contribute.&lt;/p&gt;

&lt;p&gt;If you enjoyed this article and want to stay up-to-date with the latest content, be sure to subscribe to &lt;a href="https://rishabkumar.substack.com" rel="noopener noreferrer"&gt;my newsletter&lt;/a&gt;. You'll receive updates about my latest articles, as well as helpful tips and resources to help you in your journey in tech.&lt;/p&gt;

</description>
      <category>dev</category>
      <category>journey</category>
      <category>2022</category>
    </item>
    <item>
      <title>Are Cloud Certifications worth it?</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Sun, 06 Nov 2022 15:09:36 +0000</pubDate>
      <link>https://forem.com/rishabk7/are-cloud-certifications-worth-it-423j</link>
      <guid>https://forem.com/rishabk7/are-cloud-certifications-worth-it-423j</guid>
<description>&lt;p&gt;The most valuable cloud engineers and developers in many established organizations don't necessarily hold loads of certifications. Instead, they bring extensive IT infrastructure experience, hands-on experience with the cloud, and a habit of self-taught learning. There are two sides to cloud certifications, and I want to cover both.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pros
&lt;/h2&gt;

&lt;p&gt;I'll start with the pros. Certifications in IT, cloud or otherwise, have been a long-running debate, and like everything they have upsides and downsides. One of the great benefits of certifications is that they give you a structured learning path for understanding a particular cloud provider or technology. Take the AWS Certified Developer Associate certification as an example: it walks you through all the tooling available in AWS for developers, such as AWS CodeBuild, CodePipeline, and version control, and how you might use each.&lt;/p&gt;

&lt;p&gt;Similarly, the DevOps Engineer certifications in both Azure and AWS give you a learning path through the tools a DevOps engineer needs on each platform. You learn about Azure DevOps, how to build CI/CD pipelines, and how to use version control in Azure DevOps. So for a structured learning path, I think certifications are great!&lt;/p&gt;

&lt;p&gt;There's also one very important point: cloud certifications have been a very good way for people to transform their careers, especially compared to traditional degrees, since they are a lot more affordable. Just like me, thousands of others without computer science degrees were able to pivot into the cloud, or start their careers in it, through these certifications.&lt;/p&gt;

&lt;p&gt;Now, I want to talk about cons.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cons
&lt;/h2&gt;

&lt;p&gt;The first is theoretical knowledge. I've seen a lot of exam dumps shared on the Internet, which defeats the entire purpose of learning a new cloud provider or skill: you're just doing it for the sake of passing the exam, and that doesn't hold any real value. You won't be able to answer tough questions in interviews if you don't understand the underlying technology. A lot of cloud certifications right now are based on theoretical knowledge and can be cracked through exam dumps or practice exams alone.&lt;/p&gt;

&lt;p&gt;Another con is lack of experience: you can pass cloud certifications without ever having worked in the cloud, even though some certifications recommend six months of hands-on experience (which I think you should have), and that lack of in-field experience sometimes creates dissatisfaction among hiring teams. You can offset this with the right projects, though: build two or three solid cloud projects using different services, and be ready to explain why you chose those services and what decisions you made when designing the architecture. Do that, and I think this con can be avoided.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;So I would say certifications are good for structured learning paths. Even if you don't sit the exams, going through the learning path gives you a proper structure: which service to start with and which fundamentals you need. The con is that people have been cracking certifications without really knowing the underlying fundamentals, just by grinding practice questions, and that defeats the purpose of upskilling to get into the cloud. That's my take on whether certifications in IT or cloud are worth it.&lt;/p&gt;

&lt;p&gt;If you find this article helpful/insightful or have any questions for me, I am available &lt;a href="//twitter.com/rishabk7"&gt;@rishabk7&lt;/a&gt; on Twitter.&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>career</category>
      <category>certifications</category>
      <category>aws</category>
    </item>
    <item>
      <title>AZ-104 Study Guide: Azure Administrator</title>
      <dc:creator>Rishab Kumar</dc:creator>
      <pubDate>Mon, 07 Jun 2021 15:05:33 +0000</pubDate>
      <link>https://forem.com/rishabk7/az-104-study-guide-azure-administrator-afm</link>
      <guid>https://forem.com/rishabk7/az-104-study-guide-azure-administrator-afm</guid>
      <description>&lt;p&gt;Hello amazing people 👋&lt;/p&gt;

&lt;p&gt;Hope you all are safe out there. I recently sat my AZ-104 exam and am glad to share that I passed 🥳.&lt;br&gt;
I took around 4 weeks to prepare for the exam and, to be honest, I was not that confident about it.&lt;br&gt;
I had sat some fundamentals certifications for Azure but didn't have much practical experience, so doing some hands-on labs was the way to go. Here is a project I did that helped with the preparation, along with some Microsoft Azure labs:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.rishabkumar.com/a-cloud-guru-azure-resume-challenge-2021" rel="noopener noreferrer"&gt;Acloudguru Azure Resume Challenge 2021&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/MicrosoftLearning" rel="noopener noreferrer"&gt;
        MicrosoftLearning
      &lt;/a&gt; / &lt;a href="https://github.com/MicrosoftLearning/AZ-104-MicrosoftAzureAdministrator" rel="noopener noreferrer"&gt;
        AZ-104-MicrosoftAzureAdministrator
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      AZ-104 Microsoft Azure Administrator
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;AZ-104: Microsoft Azure Administrator&lt;/h1&gt;
&lt;/div&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Welcome&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;This repository is for instructors teaching Microsoft courses. If you are in class, please ask your instructor for assistance.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;&lt;a href="https://microsoftlearning.github.io/AZ-104-MicrosoftAzureAdministrator/" rel="nofollow noopener noreferrer"&gt;Link to labs (HTML format)&lt;/a&gt;&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Are you a MCT?&lt;/strong&gt; - Have a look at our &lt;a href="https://microsoftlearning.github.io/MCT-User-Guide/" rel="nofollow noopener noreferrer"&gt;GitHub User Guide for MCTs&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;To preview this course in a self-paced format, see our &lt;strong&gt;&lt;a href="https://mslabs.cloudguides.com/guides/AZ-104%20Exam%20Guide%20-%20Microsoft%20Azure%20Administrator" rel="nofollow noopener noreferrer"&gt;interactive lab simulations&lt;/a&gt;&lt;/strong&gt;. You may find slight differences between the interactive simulations and the hosted labs, but the core concepts and ideas being demonstrated are the same.&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Security Issue - April 2023&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;Effective immediately, the Admin password will be removed from the JSON template parameter files. This means students will have to provide a password when the template is deployed. This effects Labs 4, 5, 6, 7, 10 and 11.  The lab instructions will be changed to reflect this change.&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;What are we doing?&lt;/h2&gt;

&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;To support this course, we will need to make frequent updates…&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/MicrosoftLearning/AZ-104-MicrosoftAzureAdministrator" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;Also, I realized that I have sat 11 certification exams in the last 12 months.&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1397158152806744067-489" src="https://platform.twitter.com/embed/Tweet.html?id=1397158152806744067"&gt;
&lt;/iframe&gt;


&lt;/p&gt;

&lt;p&gt;I know a lot of these are fundamentals and associate level, but I am progressing, so this year my focus is on some expert/pro certifications.&lt;/p&gt;

&lt;h3&gt;
  
  
  Here is my AZ-104 Microsoft Azure Administrator Exam Study Guide
&lt;/h3&gt;

&lt;p&gt;The high-level view of the skills measured in the exam according to Microsoft:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Candidates for this exam should have subject matter expertise implementing, managing, and monitoring an organization’s Microsoft Azure environment.&lt;/p&gt;

&lt;p&gt;A candidate for this exam should have at least six months of hands-on experience administering Azure, along with a strong understanding of core Azure services, Azure workloads, security, and governance. In addition, this role should have experience using PowerShell, Azure CLI, Azure portal, and Azure Resource Manager templates.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;Manage Azure identities and governance (15-20%)&lt;/li&gt;
&lt;li&gt;Implement and manage storage (15-20%)&lt;/li&gt;
&lt;li&gt;Deploy and manage Azure compute resources (20-25%)&lt;/li&gt;
&lt;li&gt;Configure and manage virtual networking (25-30%)&lt;/li&gt;
&lt;li&gt;Monitor and back up Azure resources (10-15%)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Free resources to prepare:
&lt;/h3&gt;

&lt;p&gt;Azure Administrator Certification (AZ-104) - Full Course by &lt;a href="https://twitter.com" rel="noopener noreferrer"&gt;Andrew Brown&lt;/a&gt; on freeCodeCamp.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/10PbGbTUSAg"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Microsoft Learn path:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/learn/paths/az-104-administrator-prerequisites/" rel="noopener noreferrer"&gt;AZ-104: Prerequisites for Azure administrators&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/learn/paths/az-104-manage-identities-governance/" rel="noopener noreferrer"&gt;AZ-104: Manage identities and governance in Azure&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/learn/paths/az-104-manage-storage/" rel="noopener noreferrer"&gt;AZ-104: Implement and manage storage in Azure&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/learn/paths/az-104-manage-compute-resources/" rel="noopener noreferrer"&gt;AZ-104: Deploy and manage Azure compute resources&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/learn/paths/az-104-manage-virtual-networks/" rel="noopener noreferrer"&gt;AZ-104: Configure and manage virtual networks for Azure administrators&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/learn/paths/az-104-monitor-backup-resources/" rel="noopener noreferrer"&gt;AZ-104: Monitor and back up Azure resources&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Resources I used:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;ACloud Guru: &lt;a href="https://acloud.guru/overview/160303d7-6947-4fbc-9d19-fa304849f92e?_ga=2.112388748.944750847.1622233593-805091676.1622233593" rel="noopener noreferrer"&gt;AZ-104 Microsoft Azure Administrator Certification Prep&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.whizlabs.com/microsoft-azure-certification-az-104/practice-tests/" rel="noopener noreferrer"&gt;Whizlabs Practice Tests&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The experience:
&lt;/h3&gt;

&lt;p&gt;To be honest, I found the exam a bit more difficult than the AWS associate-level exams, but that could be because I have been working with AWS for almost 2 years, which is not the case with Azure. I am still glad that I passed, and I'm looking forward to the AZ-400 DevOps Engineer Expert.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>certification</category>
      <category>cloud</category>
      <category>cloudskills</category>
    </item>
  </channel>
</rss>
