<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Akande Bolaji</title>
    <description>The latest articles on Forem by Akande Bolaji (@therealbolaji).</description>
    <link>https://forem.com/therealbolaji</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F407901%2F429a0117-7599-4b03-84ab-9a22606ff01e.jpg</url>
      <title>Forem: Akande Bolaji</title>
      <link>https://forem.com/therealbolaji</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/therealbolaji"/>
    <language>en</language>
    <item>
      <title>Creating a Serverless Resume with Visitor Counter in Azure</title>
      <dc:creator>Akande Bolaji</dc:creator>
      <pubDate>Fri, 11 Sep 2020 09:47:06 +0000</pubDate>
      <link>https://forem.com/therealbolaji/creating-a-serverless-resume-with-visitor-counter-in-azure-3f78</link>
      <guid>https://forem.com/therealbolaji/creating-a-serverless-resume-with-visitor-counter-in-azure-3f78</guid>
      <description>&lt;p&gt;This article is part of &lt;a href="https://aka.ms/ServerlessSeptember2020" rel="noopener noreferrer"&gt;#ServerlessSeptember&lt;/a&gt;. You'll find other helpful articles, detailed tutorials, and videos in this all-things-Serverless content collection. New articles from community members and cloud advocates are published every week from Monday to Thursday through September. &lt;/p&gt;

&lt;p&gt;Find out more about how Microsoft Azure enables your Serverless functions at &lt;a href="https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=servsept20-devto-cxaall" rel="noopener noreferrer"&gt;https://docs.microsoft.com/azure/azure-functions/&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;Introduction&lt;/h1&gt;

&lt;p&gt;The idea for this article comes from the &lt;a href="https://cloudresumechallenge.dev/" rel="noopener noreferrer"&gt;Cloud Resume Challenge&lt;/a&gt;. We will build a serverless static resume website with a visitor counter. For that we need a serverless service to store the visitor count (Azure Cosmos DB), a serverless service to retrieve and update the count from our store (Azure Functions), a serverless service to host the static resume website (Azure Storage), and an additional service to cache our static website and make it load faster (Azure CDN). This tutorial is divided into four parts, one for each service.&lt;/p&gt;

&lt;h2&gt;Building our store using Cosmos DB&lt;/h2&gt;

&lt;p&gt;Azure Cosmos DB is a fully managed NoSQL database service, which makes it a perfect choice for our serverless store. Let's get started.&lt;/p&gt;

&lt;h4&gt;Steps&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;From the Azure portal, search for &lt;code&gt;azure cosmos db&lt;/code&gt; and select the matching result&lt;/li&gt;
&lt;li&gt;On the Azure Cosmos DB page, select &lt;code&gt;Create&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fd02qrropsqlcxmggiy8h.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fd02qrropsqlcxmggiy8h.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For Project details, choose your preferred Azure subscription and a resource group for your Cosmos DB account. For Instance details, enter an account name of your choice, which has to be globally unique. Our API is &lt;code&gt;Core (SQL)&lt;/code&gt; since we will use SQL syntax, and Capacity mode is &lt;code&gt;Serverless&lt;/code&gt;. Leave the remaining options at their defaults and select &lt;code&gt;review + create&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1mteo1kvml5accjzvkt3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1mteo1kvml5accjzvkt3.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Once deployment is complete, select &lt;code&gt;Go To Resource&lt;/code&gt; and choose &lt;code&gt;Data Explorer&lt;/code&gt;. On the Data Explorer page, select &lt;code&gt;New Container&lt;/code&gt;. Enter a &lt;code&gt;database id&lt;/code&gt;, &lt;code&gt;container id&lt;/code&gt;, and &lt;code&gt;partition key&lt;/code&gt; of your choice.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fr3hju8xo3q4kz1734bar.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fr3hju8xo3q4kz1734bar.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;After creating the database, select it from the &lt;code&gt;SQL API&lt;/code&gt; list and select &lt;code&gt;New Item&lt;/code&gt;. Enter &lt;code&gt;id&lt;/code&gt; as &lt;code&gt;home&lt;/code&gt; (or whatever you would like to call your resume home page) and &lt;code&gt;count&lt;/code&gt; as zero (&lt;code&gt;0&lt;/code&gt;). Click &lt;code&gt;save&lt;/code&gt;, and our store is ready for use.&lt;/li&gt;
&lt;/ul&gt;
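&lt;p&gt;The stored item ends up looking roughly like this (the &lt;code&gt;id&lt;/code&gt; value is whatever you chose for your page; Cosmos DB adds its own system properties on save):&lt;/p&gt;

```json
{
  "id": "home",
  "count": 0
}
```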

&lt;h2&gt;Building our serverless function using Azure Functions&lt;/h2&gt;

&lt;p&gt;Azure Functions is a serverless compute service that lets you run event-triggered code without explicitly provisioning or managing infrastructure, which makes it a great choice for our use case.&lt;/p&gt;

&lt;h4&gt;Steps&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;From the Azure portal, search for &lt;code&gt;Function App&lt;/code&gt; and select the matching result&lt;/li&gt;
&lt;li&gt;On the Function App page, select &lt;code&gt;Create&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5a7m5uh2aqisogkhlv42.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5a7m5uh2aqisogkhlv42.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For Project details, choose your preferred Azure subscription and a resource group for your Function App. For Instance details, enter a globally unique Function App name. Select &lt;code&gt;Code&lt;/code&gt; for Publish and &lt;code&gt;Node.js&lt;/code&gt; as the Runtime stack, leave the remaining options at their defaults, and select &lt;code&gt;Review + Create&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flezbecn0h1ft7q1zxih9.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flezbecn0h1ft7q1zxih9.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Once deployment is complete, select &lt;code&gt;Go to Resource&lt;/code&gt; and select &lt;code&gt;Functions&lt;/code&gt; on the resource page.&lt;/li&gt;
&lt;li&gt;Select &lt;code&gt;Add&lt;/code&gt;, then select &lt;code&gt;HTTP trigger&lt;/code&gt; from the New Function popup. Leave the details at their defaults and select &lt;code&gt;Create Function&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;We would like to update the visitor count and retrieve the updated value all from our function. This is possible with integrations. Select &lt;code&gt;Integration&lt;/code&gt; from the function page.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4rlio4yp7my4y0k868ba.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4rlio4yp7my4y0k868ba.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;code&gt;Add Input&lt;/code&gt; to add an integration that gets the visitor count for a specific page. Binding Type is &lt;code&gt;Azure Cosmos DB&lt;/code&gt;; Database name should be the name of our Cosmos DB database and Collection name the name of our Cosmos DB container, both created earlier. Create a new Cosmos DB account connection using your Cosmos DB account. Add a SQL query to get a specific page's count: &lt;code&gt;SELECT * FROM c WHERE c.id = {id}&lt;/code&gt;. This query selects the item whose &lt;code&gt;id&lt;/code&gt; equals the &lt;code&gt;id&lt;/code&gt; query parameter of the request. Select &lt;code&gt;Ok&lt;/code&gt; to create the input integration.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fufbtinwvf18ce3hn7gxw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fufbtinwvf18ce3hn7gxw.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;code&gt;Add Output&lt;/code&gt; to add an integration that updates the visitor count for a specific page. Binding Type is &lt;code&gt;Azure Cosmos DB&lt;/code&gt;; Database name should be the name of the Cosmos DB database created earlier and Collection name the name of the Cosmos DB container. Select &lt;code&gt;Ok&lt;/code&gt; to create the output integration.&lt;/li&gt;
&lt;/ul&gt;
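&lt;p&gt;Behind the scenes, the portal records these integrations in the function's &lt;code&gt;function.json&lt;/code&gt;. A rough sketch of what the bindings could look like — the names &lt;code&gt;inputDocument&lt;/code&gt;, &lt;code&gt;outputDocument&lt;/code&gt;, and &lt;code&gt;CosmosDbConnection&lt;/code&gt; are the portal defaults, and the database and collection names are placeholders; yours will match what you entered:&lt;/p&gt;

```json
{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get"]
    },
    {
      "type": "cosmosDB",
      "direction": "in",
      "name": "inputDocument",
      "databaseName": "resume-db",
      "collectionName": "counters",
      "sqlQuery": "SELECT * FROM c WHERE c.id = {id}",
      "connectionStringSetting": "CosmosDbConnection"
    },
    {
      "type": "cosmosDB",
      "direction": "out",
      "name": "outputDocument",
      "databaseName": "resume-db",
      "collectionName": "counters",
      "connectionStringSetting": "CosmosDbConnection"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
```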

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F02ow6430flznoj0j0l8v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F02ow6430flznoj0j0l8v.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;code&gt;Code + Test&lt;/code&gt;; your function should look something like this:&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = async function (context, req, data) {
  // `data` is the result of the Cosmos DB input binding's SQL query
  if (!data || data.length === 0) {
    context.res = { status: 404, body: "Page not found" };
    return;
  }
  // Point the output binding at the item and increment its count;
  // the updated document is written back to Cosmos DB when the function returns
  context.bindings.outputDocument = data[0];
  context.bindings.outputDocument.count += 1;
  context.res = { body: data[0].count };
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The third parameter &lt;code&gt;data&lt;/code&gt; of our async function holds the items returned from Cosmos DB by the SQL query in the input integration we created earlier. An alternative is &lt;br&gt;
&lt;code&gt;data = context.bindings.inputDocument&lt;/code&gt;, where &lt;code&gt;inputDocument&lt;/code&gt; is the name of the input integration. &lt;br&gt;
To update the visitor count in the database, we get a reference to the item through the input binding, assign it to the output binding, and increment the count by one before returning the result. Because the output binding and the response body reference the same object, the incremented count is both written back to Cosmos DB and reflected in the returned data.&lt;/p&gt;
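&lt;p&gt;You can see this shared-reference behavior in isolation by invoking a pared-down version of the handler with a stubbed context object. The mock context and sample data below are hypothetical test scaffolding, not part of the Azure Functions runtime:&lt;/p&gt;

```javascript
// Pared-down version of the handler from the Code + Test editor
const counter = async function (context, req, data) {
  context.bindings.outputDocument = data[0];
  context.bindings.outputDocument.count += 1;
  context.res = { body: data[0].count };
};

// Minimal stand-in for the Azure Functions context object
const context = { bindings: {} };
// What the input binding's SQL query would return for ?id=home
const data = [{ id: "home", count: 41 }];

counter(context, { query: { id: "home" } }, data);

// The response body and the output document share the same object,
// so the incremented count appears in both places
console.log(context.res.body); // 42
console.log(context.bindings.outputDocument.count); // 42
```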

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fipwhw7t4zi4ydc6gfcaj.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fipwhw7t4zi4ydc6gfcaj.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Let's test that our function works using the &lt;code&gt;test/run&lt;/code&gt; feature. We make a GET request to our endpoint with a query parameter &lt;code&gt;id&lt;/code&gt; set to &lt;code&gt;home&lt;/code&gt;, since the item we want to fetch from Cosmos DB was saved with an &lt;code&gt;id&lt;/code&gt; of &lt;code&gt;home&lt;/code&gt;. Copy your Function App endpoint URL so you can call it from your resume web page.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Hosting our static resume on Azure Storage&lt;/h2&gt;

&lt;p&gt;I will assume we can all build a simple resume using HTML. You can use any frontend framework, but I used plain HTML, CSS, and JavaScript for mine. I won't go into the details of building the resume, so feel free to clone mine &lt;a href="https://github.com/akandeBolaji/cloud-resume-frontend" rel="noopener noreferrer"&gt;here&lt;/a&gt;. The important part is that your JavaScript must fetch the visitor count from the Function App endpoint and display the result on the page. Let's go ahead and upload our resume to Azure Storage.&lt;/p&gt;
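&lt;p&gt;A minimal sketch of that fetch call — the endpoint URL and the &lt;code&gt;visitor-count&lt;/code&gt; element id are placeholders; substitute your own Function App URL and markup:&lt;/p&gt;

```javascript
// Hypothetical endpoint: replace with your own Function App URL
const FUNCTION_ENDPOINT = "https://YOUR-FUNCTION-APP.azurewebsites.net/api/YOUR-FUNCTION";

// Build the request URL with the page id the Cosmos DB item was saved under
function countUrl(endpoint, pageId) {
  return endpoint + "?id=" + encodeURIComponent(pageId);
}

// In the browser, fetch the count and display it
// (assumes an element with id "visitor-count" exists on the page)
if (typeof document !== "undefined") {
  fetch(countUrl(FUNCTION_ENDPOINT, "home"))
    .then((res) => res.text())
    .then((count) => {
      document.getElementById("visitor-count").textContent = count;
    })
    .catch((err) => console.error("Could not load visitor count", err));
}
```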

&lt;h4&gt;Steps&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;From the Azure portal, search for &lt;code&gt;storage accounts&lt;/code&gt; and select the matching result&lt;/li&gt;
&lt;li&gt;On the Storage accounts page, select &lt;code&gt;add&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;For Project details, choose your preferred Azure subscription and a resource group for your storage account. For Instance details, enter a globally unique storage account name. Leave the remaining options at their defaults and select &lt;code&gt;Review + Create&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Once deployment is complete, select &lt;code&gt;Go to Resource&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fykn2owslaf38qfmu78py.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fykn2owslaf38qfmu78py.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select &lt;code&gt;Static website&lt;/code&gt;. On the static website page, select &lt;code&gt;Enabled&lt;/code&gt;, as it's disabled by default. Enter your resume website's index page and error page and click &lt;code&gt;save&lt;/code&gt;. You can now see the URL of your static website, but it's empty since we haven't uploaded our files yet. Select the &lt;code&gt;$web&lt;/code&gt; container to upload them.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fk31zkiwu8i8tj3j5zblk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fk31zkiwu8i8tj3j5zblk.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Upload your files and make sure your JavaScript uses the Function App endpoint to retrieve the visitor count.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fm1dyjdkqsiql9uf5x5rr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fm1dyjdkqsiql9uf5x5rr.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;We need to update our Function App to allow requests from our static website URL; otherwise we won't be able to get the visitor count. Go to the Function App you created and search for &lt;code&gt;CORS&lt;/code&gt;. Add your static website URL to the list of allowed origins and make sure to enable &lt;code&gt;Access-Control-Allow-Credentials&lt;/code&gt;. Everything should work fine now.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fg9lrpagpthy5c5vvbd5k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fg9lrpagpthy5c5vvbd5k.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Using Azure CDN to cache our resume and reduce load times&lt;/h2&gt;

&lt;p&gt;Azure Content Delivery Network (CDN) lets you reduce load times, save bandwidth, and improve responsiveness.&lt;/p&gt;

&lt;h4&gt;Steps&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Go to Storage accounts and select the storage account for your static website.&lt;/li&gt;
&lt;li&gt;Search for CDN and select &lt;code&gt;create new CDN endpoint&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fx0r7322vsn2i97io9z13.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fx0r7322vsn2i97io9z13.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose &lt;code&gt;create new&lt;/code&gt; and enter a name for your CDN profile. Select &lt;code&gt;Standard Microsoft&lt;/code&gt; as the pricing tier. Enter a globally unique CDN endpoint name, select the storage account as the origin, and click &lt;code&gt;create&lt;/code&gt;. Wait a few minutes while your content is deployed to the CDN. Don't forget to add your CDN endpoint URL to the list of allowed origins in your Function App.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffe9daaoko3itpmyfa9wi.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffe9daaoko3itpmyfa9wi.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's our final result.&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;We have successfully hosted our resume on Azure Storage, using Azure CDN for caching, Azure Cosmos DB to store the visitor count, and a Function App to retrieve it. A good follow-up exercise would be to add a custom domain to the CDN endpoint.&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>azure</category>
      <category>cosmosdb</category>
      <category>cdn</category>
    </item>
    <item>
      <title>Cloud Resume Challenge</title>
      <dc:creator>Akande Bolaji</dc:creator>
      <pubDate>Sat, 13 Jun 2020 14:31:49 +0000</pubDate>
      <link>https://forem.com/therealbolaji/cloud-resume-challenge-4kfb</link>
      <guid>https://forem.com/therealbolaji/cloud-resume-challenge-4kfb</guid>
      <description>&lt;p&gt;I started the cloud resume challenge as a way to have hands-on experience with the cloud. I have three years experience working as a Fullstack developer and I am excited to pursue a new challenge in cloud. The cloud resume challenge is an initiative by Forrest Brazeal to help new people to break into the Cloud industry, as many are losing their jobs because of the coronavirus pandemic. It took me a week to accomplish and below are some useful azure technology and skills I learnt while doing the challenge.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The first task, creating my resume page using HTML, CSS, and JavaScript, was quite easy. I got a resume template on CodePen and tweaked it to my taste. The JavaScript part involved making an API call with fetch to my API Gateway to get and display the visitor count. At this point, I was using a dummy API endpoint.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The second task was to upload my resume to an Amazon S3 bucket. I used the Amazon console to upload my static website and set the bucket access to public, and I had to repeat this process multiple times after making changes to my HTML. You'll read in later tasks how I made this process effortless.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The third task was to serve the S3 bucket over HTTPS for security using CloudFront and point the CloudFront distribution to a custom domain. I obtained a domain name through Route 53, created an SSL certificate using AWS Certificate Manager, and pointed it to my CloudFront distribution.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;My next task was to persist the visitor count to a database. I used DynamoDB for this. I used an update-item query that returns the item after updating it, which saves cost in the sense that a single DynamoDB query and a single API endpoint were enough to both update and get the visitor count.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The next task was to create a Lambda function that would communicate with my DynamoDB table. I had to use Python for this, since it was part of the challenge requirements. Python was quite easy to grasp, and in a few hours I was able to create a Lambda function with the proper IAM roles and policies to access DynamoDB.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The next tasks were the most challenging and the parts I enjoyed working on the most in this challenge. I created a SAM (Serverless Application Model) template to handle creating the DynamoDB table, the Lambda function and its tests, and the API Gateway integration. I used the AWS CLI for this and was able to test my functions locally using Docker. I wrote unit tests for my Lambda function and verified locally that it works with DynamoDB using the moto mocking library.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I then used GitHub Actions to automate building and testing my SAM template after each push to my private backend repository. If the build and tests succeed, the workflow goes ahead and deploys the SAM template to AWS.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I repeated the same process to automate uploading my static frontend code to my S3 bucket. I used GitHub Actions for this as well and invalidated my CloudFront cache in the workflow. I made sure to store my AWS credentials in GitHub Secrets for security reasons.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The resume project was complete, and it was time to prepare for my AWS Certified Cloud Practitioner exam, as this was part of the challenge requirements. I prepared for the exam and passed within two weeks using the official study guide from AWS, practice questions from &lt;a href="https://acloud.guru"&gt;https://acloud.guru&lt;/a&gt;, and the practical experience gained from the tasks above.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;When this challenge is over, I will write a more detailed article on the technical side of this project.&lt;/p&gt;

&lt;p&gt;You can check out the finished product at &lt;a href="https://bolajiakande.com"&gt;https://bolajiakande.com&lt;/a&gt;. &lt;br&gt;
You can also view my AWS certification here: &lt;a href="https://www.youracclaim.com/badges/27320586-7c1d-46c3-9773-69dff3ce206e/public_url"&gt;https://www.youracclaim.com/badges/27320586-7c1d-46c3-9773-69dff3ce206e/public_url&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Overall, this was a great learning experience for me, and I'm really happy to have participated in this challenge. I would advise both intermediate and new cloud engineers to try it out.&lt;br&gt;
Here is the link: &lt;a href="https://cloudresumechallenge.dev/"&gt;https://cloudresumechallenge.dev/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
