<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: saifmomin</title>
    <description>The latest articles on Forem by saifmomin (@saifmomin).</description>
    <link>https://forem.com/saifmomin</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F324489%2F107df319-6ce9-4158-a05c-f98f348a633f.jpg</url>
      <title>Forem: saifmomin</title>
      <link>https://forem.com/saifmomin</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/saifmomin"/>
    <language>en</language>
    <item>
      <title>Code better with Amazon CodeWhisperer</title>
      <dc:creator>saifmomin</dc:creator>
      <pubDate>Fri, 22 Sep 2023 13:24:04 +0000</pubDate>
      <link>https://forem.com/saifmomin/code-better-with-amazon-codewhisperer-42nm</link>
      <guid>https://forem.com/saifmomin/code-better-with-amazon-codewhisperer-42nm</guid>
      <description>&lt;p&gt;There is a lot of buzz around Generative AI these days. CodeWhisperer is a Generative AI product from AWS that is powered by a Foundation Model trained on billions of lines of code. It speeds up software development by providing contextual code suggestions in real time. &lt;/p&gt;

&lt;p&gt;This blog discusses how AI code generator tools like  CodeWhisperer could change the way software is developed.&lt;/p&gt;

&lt;h2&gt;CodeWhisperer - The Code Buddy&lt;/h2&gt;

&lt;p&gt;CodeWhisperer is your personal Coding Assistant. With CodeWhisperer, you code faster and spend less time writing boilerplate code. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Quick Highlights&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CodeWhisperer currently supports 15 popular programming languages and VS Code &amp;amp; JetBrains IDEs. &lt;/li&gt;
&lt;li&gt;Support for JupyterLab is also available.&lt;/li&gt;
&lt;li&gt;It is integrated with AWS services like SageMaker Studio, AWS Lambda and Cloud9. &lt;/li&gt;
&lt;li&gt;Offered in two tiers: Individual (Free) and Professional (Priced).&lt;/li&gt;
&lt;li&gt;To get started, you need an IDE, the AWS Toolkit extension and an authentication method (AWS Builder ID, IAM Identity Center, or IAM). &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Code Examples&lt;/strong&gt;&lt;br&gt;
Let's now look at a few illustrations of how CodeWhisperer can boost your coding. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Auto-complete:&lt;/strong&gt; CodeWhisperer will auto-complete your comments or code as you type.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3l5xkjcfurltu3nz1rvy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3l5xkjcfurltu3nz1rvy.png" alt="auto-complete comment"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqujqzx0grpu4k6dqbvst.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqujqzx0grpu4k6dqbvst.png" alt="auto-complete"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Get full function code from comments:&lt;/strong&gt; You can provide comments as prompts to CodeWhisperer and it will implement the function for you.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftasfr8wbbvftsvcy6rgq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftasfr8wbbvftsvcy6rgq.png" alt="full func"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Get full function code from function signature:&lt;/strong&gt; CodeWhisperer can generate the function body from the function signature.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdhnz5wcj76zrl5u74zp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdhnz5wcj76zrl5u74zp.png" alt="signature"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Generate Docstring from Code:&lt;/strong&gt; Type the delimiter (/** */) above the method and CodeWhisperer will generate the docstring for it.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcemtfm5mkr77ff6spv19.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcemtfm5mkr77ff6spv19.png" alt="doctring"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Get line-by-line code recommendations:&lt;/strong&gt; CodeWhisperer can provide line-by-line code suggestions and can eventually complete the entire code block.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9vfs0ywlcamxxdp3llw9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9vfs0ywlcamxxdp3llw9.png" alt="line-by-line"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Write SQL queries:&lt;/strong&gt; CodeWhisperer can suggest Structured Query Language (SQL) code based on comments.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3bp59ti6skqtno8wh2u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3bp59ti6skqtno8wh2u.png" alt="sql"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8bxb6bbrztu6qb6zlk7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8bxb6bbrztu6qb6zlk7.png" alt="sql"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Write shell scripts:&lt;/strong&gt; CodeWhisperer can help you write code for shell scripts.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu98q8n0854stg9rtlx56.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu98q8n0854stg9rtlx56.png" alt="shell script"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzp9noowyhlmf8oy7vvhd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzp9noowyhlmf8oy7vvhd.png" alt="shell script"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Write unit test:&lt;/strong&gt; CodeWhisperer can create unit tests to develop testing scenarios for code.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5qthldp2k42oqtb0ffcs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5qthldp2k42oqtb0ffcs.png" alt="unit test"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Create test data:&lt;/strong&gt; CodeWhisperer can help you generate dummy data - just enter the first object as an example, and CodeWhisperer will continue repeating the pattern.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw7tzcil8c7sye0ypee9k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw7tzcil8c7sye0ypee9k.png" alt="test"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn9czkriga5w5er4xekon.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn9czkriga5w5er4xekon.png" alt="test"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Write Infrastructure as Code:&lt;/strong&gt; CodeWhisperer can help you write IaC code using AWS CDK.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frp0h95pn8uon2dv5vora.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frp0h95pn8uon2dv5vora.png" alt="cdk"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;Write code in Jupyter Notebook with SageMaker Studio:&lt;/strong&gt; Data Scientists can use CodeWhisperer to get code suggestions directly in the Python notebooks in Amazon SageMaker Studio.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjz49g5hv1ig5cskstxhs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjz49g5hv1ig5cskstxhs.png" alt="sage"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;CodeWhisperer - More than a Code Generator&lt;/h2&gt;

&lt;p&gt;Now that we've seen how CodeWhisperer can help you write code faster, let's look at how it's more than just a coding advisor.   &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Helping Developers write Secure Code&lt;/strong&gt;&lt;br&gt;
CodeWhisperer helps you deliver secure code by shifting security left, closer to the developer. It can run security scans on your code directly in the IDE, allowing developers to detect security vulnerabilities and get code suggestions to fix them instantly. CodeWhisperer uses Amazon CodeGuru to perform the security scan.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Inculcating use of Responsible AI&lt;/strong&gt;&lt;br&gt;
Does using the code generated by CodeWhisperer expose organisations to copyright infringement? Fortunately, AWS has a way to mitigate the legal risk associated with using AI-generated code. CodeWhisperer can flag code suggestions with license references (for example, MIT or Apache), ensuring developers don’t inadvertently use licensed code when accepting suggestions. This empowers developers to use AI responsibly and enables organisations to deliver products to their customers without infringing copyright.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Defusing Privacy Concerns&lt;/strong&gt;&lt;br&gt;
When you work with CodeWhisperer, AWS may collect your data for service improvement purposes. So are you putting your organization’s data at risk by using CodeWhisperer? Not really, because CodeWhisperer gives you the control to decide whether or not you want to share your data with AWS. This helps developers and organisations alike to build great software confidently with CodeWhisperer without any privacy concerns.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Transforming the Developer Role&lt;/strong&gt;&lt;br&gt;
The emergence of Generative AI tools like CodeWhisperer creates the need for a new breed of developers - those who can code as well as work effectively with AI. Prompt engineering is an emerging job role in the IT industry, and developers can take the leap and specialise in prompt design, which is about crafting effective prompts to get the desired output from generative AI models. CodeWhisperer can also help developers learn new programming languages more quickly. A Java developer, for example, can easily start writing Python code using CodeWhisperer.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Removing Technology Barriers&lt;/strong&gt;&lt;br&gt;
AI code generators like Amazon CodeWhisperer play an important part in bringing users closer to technology. One of the best things about CodeWhisperer is that it comes with a free tier, making it easily accessible to everyone. &lt;/p&gt;

&lt;h2&gt;Looking Ahead&lt;/h2&gt;

&lt;p&gt;The future of Generative AI is exciting! It will be interesting to see how Amazon CodeWhisperer evolves over time. For instance, it would be nice to see a chat window and code translation capabilities in the product in the future. &lt;/p&gt;

&lt;p&gt;For now, go ahead and witness firsthand how it changes the coding experience.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>generativeai</category>
      <category>codewhisperer</category>
      <category>aws</category>
    </item>
    <item>
      <title>Right Scaling AWS Lambda</title>
      <dc:creator>saifmomin</dc:creator>
      <pubDate>Thu, 01 Sep 2022 05:57:04 +0000</pubDate>
      <link>https://forem.com/saifmomin/right-scaling-aws-lambda-1ndf</link>
      <guid>https://forem.com/saifmomin/right-scaling-aws-lambda-1ndf</guid>
      <description>&lt;p&gt;AWS Lambda is a serverless compute service that revolutionised the serverless movement. Lambda allows you to run your application &lt;em&gt;at scale&lt;/em&gt; without managing servers. &lt;/p&gt;

&lt;p&gt;&lt;em&gt;So how does your Lambda function scale?&lt;/em&gt; Lambda scales out &lt;em&gt;automatically&lt;/em&gt; in response to incoming requests, and it scales down to zero when the traffic stops. Lambda manages the infrastructure and the scaling for you, but it also gives you a few controls over scaling, which in turn help you optimise cost and performance.&lt;/p&gt;

&lt;p&gt;Before we look at how the scaling controls work, let's understand a few concepts first - &lt;em&gt;Execution Environment &amp;amp; Concurrency&lt;/em&gt;. &lt;/p&gt;

&lt;p&gt;Behind the scenes, your Lambda functions run inside an isolated runtime environment which is called an &lt;strong&gt;Execution Environment&lt;/strong&gt;. When your function is invoked, Lambda spins up an instance of the execution environment to run your function code.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;An Execution Environment is a secure and isolated container that runs your function code.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;em&gt;What happens when the requests arrive at a faster rate invoking your function even before it has finished running?&lt;/em&gt; In such a case, Lambda creates additional execution environments, which run the function concurrently. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Thus Lambda &lt;strong&gt;Concurrency&lt;/strong&gt; is the number of execution environments of your function that are active and serving requests at any given time.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--J1o2fUeY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/urj08wuu4pjxv1hau3fq.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--J1o2fUeY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/urj08wuu4pjxv1hau3fq.jpg" alt="exec env" width="880" height="254"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;So how many concurrent execution environments can you scale to?&lt;/em&gt; There is a limit to how many execution environments can be provisioned concurrently: Lambda has a soft limit of 1,000 concurrent execution environments per Region. A function continues to scale until the concurrency limit for the function in the Region is reached. After that, all additional requests fail with a throttling error.&lt;/p&gt;

&lt;p&gt;Note that the common pool (quota limit) is shared among all the functions running in the Region. So your function may be throttled when the pool is consumed by other functions in the Region. &lt;em&gt;How do we mitigate the throttling?&lt;/em&gt; This can be avoided using &lt;strong&gt;Reserved Concurrency&lt;/strong&gt; control.&lt;/p&gt;

&lt;p&gt;You can reserve execution environment capacity from the common pool for your functions using Reserved Concurrency. Reserved Concurrency guarantees the maximum number of concurrent execution environments for the function.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use Reserved concurrency for your business-critical functions to ensure guaranteed scaling.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;For example, if you set Reserved Concurrency to 4, your function is guaranteed to scale to 4 instances (however, beyond that it will be throttled). There are many other reasons to use Reserved concurrency beyond just guaranteed scaling. You can use it to control the cost of Lambda, or you can protect a backend resource from being overwhelmed by your function scale, or you can use it as a kill switch (set the reservation to zero and all invocations to the function stop). &lt;/p&gt;
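&lt;p&gt;As a sketch of how you would set this in practice, the AWS SDK for Python (boto3) exposes the PutFunctionConcurrency API. The function name and Region below are placeholders, not values from this post, and the calls require valid AWS credentials.&lt;/p&gt;

```python
import boto3

# Hypothetical function name and Region, for illustration only.
lambda_client = boto3.client("lambda", region_name="us-east-1")

# Reserve up to 4 concurrent execution environments for this function.
lambda_client.put_function_concurrency(
    FunctionName="my-critical-function",
    ReservedConcurrentExecutions=4,
)

# Kill switch: setting the reservation to zero stops all invocations.
lambda_client.put_function_concurrency(
    FunctionName="my-critical-function",
    ReservedConcurrentExecutions=0,
)

# Remove the reservation, returning the function to the shared pool.
lambda_client.delete_function_concurrency(FunctionName="my-critical-function")
```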

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--10ae6Vfc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c3pkkbrku6aeh2oe6563.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--10ae6Vfc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c3pkkbrku6aeh2oe6563.jpg" alt="resv con" width="880" height="351"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;&lt;em&gt;But... how fast can the function scale?&lt;/em&gt; To understand this we need to know what happens behind the scenes when a Lambda function is invoked.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6dHFaEUr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/frefjoec95tayeab8ez4.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6dHFaEUr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/frefjoec95tayeab8ez4.jpg" alt="func life" width="880" height="266"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When the Lambda service receives a request to run a function, it does a few things - &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;downloads function code &lt;/li&gt;
&lt;li&gt;creates a new execution environment &lt;/li&gt;
&lt;li&gt;runs initialization code&lt;/li&gt;
&lt;li&gt;runs handler code&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So, before your function handler runs, the other steps must complete. The latency this adds is referred to as &lt;strong&gt;Cold Start&lt;/strong&gt;. Cold starts are undesirable because, of course, you want your function to scale fast. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8rGPlYPE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gitoqeif25oy7t1aj981.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8rGPlYPE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gitoqeif25oy7t1aj981.jpg" alt="cold start" width="880" height="306"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;So how do we reduce cold starts?&lt;/strong&gt;&lt;/em&gt; There are a few ways to mitigate the impact of cold starts, like Execution environment reuse, Function warmers and Provisioned concurrency. Other factors like language runtime, memory size and optimised code also help improve startup latency.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Execution environment reuse:&lt;/strong&gt; &lt;em&gt;What happens after the function execution completes?&lt;/em&gt; Instead of destroying the execution environment immediately, the Lambda service retains it for a non-deterministic period. During this time, if another request arrives for the same function, the service may reuse the environment, so the second request typically completes faster. However, you should not depend on environment reuse to reduce cold starts, for several reasons: when your function scales up due to traffic, or when you update its code or configuration, the next invocation results in a new execution environment. Also, AWS runs Lambda in multiple Availability Zones (AZs) for high availability, so a function can be invoked in a different AZ, again resulting in a new execution environment.&lt;/p&gt;
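&lt;p&gt;The practical implication for your own code: do expensive initialisation outside the handler, so invocations that land in a reused environment skip it. A minimal sketch, with the expensive setup simulated by a counter (in real code this would be SDK clients, database connections and the like):&lt;/p&gt;

```python
INIT_COUNT = 0

def expensive_setup():
    """Simulates one-time work such as creating SDK clients or DB connections."""
    global INIT_COUNT
    INIT_COUNT += 1
    return {"client": "ready"}

# Runs once per execution environment, during the initialisation phase.
RESOURCES = expensive_setup()

def handler(event, context):
    # Runs on every invocation; reuses RESOURCES when the environment is warm.
    return {"status": RESOURCES["client"], "inits": INIT_COUNT}

# Two invocations in the same (reused) environment initialise only once.
print(handler({}, None))
print(handler({}, None))
```

&lt;p&gt;Two invocations in the same warm environment run the setup only once; a cold start would run it again.&lt;/p&gt;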

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--pNvb6JKq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8695p3v9rist3e5u4or3.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pNvb6JKq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8695p3v9rist3e5u4or3.jpg" alt="env reuse" width="880" height="259"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Function warmers:&lt;/strong&gt; A simple hack is to use ping to keep the function warm, like this open source &lt;a href="https://github.com/jeremydaly/lambda-warmer"&gt;lambda warmer&lt;/a&gt;. However, this again is not a guaranteed way to reduce cold starts. For example, it does not work if the Lambda service runs your function in another AZ.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Provisioned Concurrency:&lt;/strong&gt; The recommended solution to reduce cold starts is to use &lt;em&gt;Provisioned Concurrency&lt;/em&gt;. The Provisioned Concurrency feature prepares execution environments in advance keeping functions initialised and ready to respond. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use Provisioned Concurrency to solve the cold start issue for your latency-sensitive application.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;For example, if you set Provisioned Concurrency to 4, your function will scale to 4 instances without experiencing fluctuations in latency.&lt;/p&gt;
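&lt;p&gt;In boto3 this corresponds to the PutProvisionedConcurrencyConfig API. The function name and alias below are placeholders; note that Provisioned Concurrency is configured on a published version or alias, not on $LATEST.&lt;/p&gt;

```python
import boto3

lambda_client = boto3.client("lambda", region_name="us-east-1")

# Keep 4 execution environments initialised for the "prod" alias.
# Function and alias names are hypothetical.
lambda_client.put_provisioned_concurrency_config(
    FunctionName="my-latency-sensitive-function",
    Qualifier="prod",  # an alias or version number
    ProvisionedConcurrentExecutions=4,
)
```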

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TgjAD-h4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k75qnca8hzmncc9tcoga.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TgjAD-h4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k75qnca8hzmncc9tcoga.jpg" alt="prov con" width="880" height="303"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Note that Provisioned Concurrency incurs charges: you pay for each warm environment. &lt;em&gt;So is there a way to control the cost of Provisioned Concurrency?&lt;/em&gt; Yes, you can use &lt;strong&gt;Application Auto Scaling&lt;/strong&gt; with Provisioned Concurrency. Application Auto Scaling adds or removes warm environments only when needed, such as when traffic ramps up or during peak usage. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Combine Application Auto Scaling with Provisioned Concurrency to control the cost of Provisioned Concurrency.&lt;/p&gt;
&lt;/blockquote&gt;
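&lt;p&gt;The combination above can be sketched with the Application Auto Scaling API: register the alias's Provisioned Concurrency as a scalable target, then attach a target-tracking policy. Resource names and the 70% utilisation target are illustrative assumptions.&lt;/p&gt;

```python
import boto3

autoscaling = boto3.client("application-autoscaling", region_name="us-east-1")

# Register the alias's Provisioned Concurrency as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="lambda",
    ResourceId="function:my-latency-sensitive-function:prod",
    ScalableDimension="lambda:function:ProvisionedConcurrency",
    MinCapacity=1,
    MaxCapacity=10,
)

# Scale warm environments to track roughly 70% utilisation.
autoscaling.put_scaling_policy(
    PolicyName="pc-utilisation-target",
    ServiceNamespace="lambda",
    ResourceId="function:my-latency-sensitive-function:prod",
    ScalableDimension="lambda:function:ProvisionedConcurrency",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 0.7,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "LambdaProvisionedConcurrencyUtilization"
        },
    },
)
```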

&lt;h2&gt;Closing&lt;/h2&gt;

&lt;p&gt;I hope this post helps you understand how AWS Lambda scales and how concurrency controls can be applied to optimise cost and performance.&lt;/p&gt;

&lt;p&gt;Thank you for reading!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>lambda</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Data Transfer Options in AWS</title>
      <dc:creator>saifmomin</dc:creator>
      <pubDate>Mon, 01 Aug 2022 06:41:28 +0000</pubDate>
      <link>https://forem.com/saifmomin/data-transfer-options-in-aws-1b1e</link>
      <guid>https://forem.com/saifmomin/data-transfer-options-in-aws-1b1e</guid>
      <description>&lt;p&gt;Businesses around the world are increasingly migrating to cloud. Data is an important asset of any business. Thus before moving data to cloud, it is imperative to understand the use case and pick the right tool for data transfer. AWS offers a wide range of services to help you smoothly migrate your data. &lt;/p&gt;

&lt;p&gt;So, with the variety of data transfer services available in AWS, how do you choose the right one? Here, I help you decide through a visual guide. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hV70ie0T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sfsq8gqtxej8vgm1qnsp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hV70ie0T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sfsq8gqtxej8vgm1qnsp.png" alt="AWS Data Transfer Options" width="880" height="956"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;See the full diagram &lt;a href="https://www.xmind.net/m/r54GKa"&gt;here&lt;/a&gt; &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Note: the size of the data and the available network bandwidth are major factors in choosing the right data transfer solution. Refer to the chart for an approximate idea of data transfer speeds (source: Google Cloud).&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Di_78zBL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kwi70t4er0xrqb6alzvv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Di_78zBL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kwi70t4er0xrqb6alzvv.png" alt="Data Transfer Speed Chart" width="880" height="637"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;AWS also supports products from &lt;a href="https://aws.amazon.com/backup-recovery/partner-solutions/"&gt;AWS Storage Competency Partners&lt;/a&gt; to integrate your on-premises data with AWS cloud.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;Closing Thought&lt;/h2&gt;

&lt;p&gt;I hope this post helps you select the right AWS service for your cloud data transfer use case!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>cloudmigration</category>
      <category>datatransfer</category>
    </item>
    <item>
      <title>Awesome AWS Tools - Identity &amp; Access</title>
      <dc:creator>saifmomin</dc:creator>
      <pubDate>Sat, 16 Jul 2022 06:05:57 +0000</pubDate>
      <link>https://forem.com/saifmomin/awesome-aws-tools-identity-access-ad4</link>
      <guid>https://forem.com/saifmomin/awesome-aws-tools-identity-access-ad4</guid>
      <description>&lt;p&gt;Security is built into the core of AWS cloud. AWS offers foundational services like IAM, KMS, Cognito, GuardDuty, Inspector, Macie and many more, to help you meet security requirements in the cloud. In addition, AWS provides a number of tools that can further help you improve security posture and operate confidently on the cloud.&lt;/p&gt;

&lt;p&gt;Here, I discuss each tool very briefly; the idea is to make you aware of these tools so that you can benefit from them. For greater detail, dive into the official AWS pages.&lt;/p&gt;

&lt;p&gt;In this post we will look at the following AWS tools from the Identity &amp;amp; Access domain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS Policy Generator&lt;/li&gt;
&lt;li&gt;IAM Policy Simulator&lt;/li&gt;
&lt;li&gt;Web Identity Federation Playground&lt;/li&gt;
&lt;li&gt;Access Advisor&lt;/li&gt;
&lt;li&gt;IAM Access Analyzer&lt;/li&gt;
&lt;li&gt;Access Analyzer for S3&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;AWS Policy Generator&lt;/h2&gt;

&lt;p&gt;The AWS Policy Generator is a tool that helps you create policies that control access to AWS resources. You can create policy documents for different types of policies, such as IAM policies, S3 Bucket policies, SQS Queue policies, SNS Topic policies and VPC Endpoint policies. The process is quite simple: you first select a policy type, then add permissions for an AWS service, and finally click the Generate Policy button to get a JSON policy document. The Policy Generator is similar to the Visual Editor in the AWS console, which also lets you create and edit policies.&lt;/p&gt;

&lt;p&gt;You can access the AWS Policy Generator using the link here:&lt;br&gt;
&lt;a href="https://awspolicygen.s3.amazonaws.com/policygen.html"&gt;https://awspolicygen.s3.amazonaws.com/policygen.html&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  IAM Policy Simulator
&lt;/h2&gt;

&lt;p&gt;AWS provides you with a Policy Simulator tool that helps you test and troubleshoot policies in your AWS environment. The tool makes it easier for you to test the effects of policies before deploying them into production. You can quickly author new policies or test existing ones without struggling with Access Denied errors. You simply select an IAM entity (user, group or role), choose the policy that you want to evaluate, select an action to simulate, and click the Run Simulation button to see the result. &lt;/p&gt;

&lt;p&gt;You can access the AWS Policy Simulator using the link here (you need to be signed-in to your AWS account):&lt;br&gt;
&lt;a href="https://policysim.aws.amazon.com"&gt;https://policysim.aws.amazon.com&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Web Identity Federation Playground
&lt;/h2&gt;

&lt;p&gt;Web identity federation allows you to access AWS services using a web identity provider such as Amazon, Google, or Facebook. To see how it works, try the Web Identity Federation Playground from AWS. This tool lets you explore the three key steps of web identity federation. First, you authenticate with an identity provider (Amazon, Google, or Facebook). Second, you obtain temporary security credentials. Lastly, you make calls to AWS resources using the obtained credentials. You can see the request and response on the page as you perform each step. &lt;/p&gt;

&lt;p&gt;You can access the Web Identity Federation Playground using the link here:&lt;br&gt;
&lt;a href="https://web-identity-federation-playground.s3.amazonaws.com/index.html"&gt;https://web-identity-federation-playground.s3.amazonaws.com/index.html&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Access Advisor
&lt;/h2&gt;

&lt;p&gt;Access Advisor shows last-accessed information for IAM entities (users, groups, roles, and policies). In the IAM console, Access Advisor shows the services that an IAM identity can access and when those services were last accessed. You can review this data to revoke unused permissions, which helps you adhere to the &lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html#grant-least-privilege"&gt;principle of least privilege&lt;/a&gt;. &lt;/p&gt;

&lt;h2&gt;
  
  
  IAM Access Analyzer
&lt;/h2&gt;

&lt;p&gt;IAM Access Analyzer is a tool that monitors access to your AWS resources. It provides three key capabilities. First, it helps you identify resources that are shared with an external entity. Second, it helps you validate IAM policies against the policy grammar and best practices. You can see Access Analyzer in action while creating or editing a policy in the IAM console, where policy validation findings include security warnings, errors, and suggestions for your policy. Third, it can generate IAM policies based on the access activity of an IAM entity (user or role) recorded in AWS CloudTrail logs. You can see this in the "Generate policy based on CloudTrail events" section on the Permissions tab of an IAM user or role.&lt;/p&gt;

&lt;p&gt;AWS IAM Access Analyzer is powered by Zelkova, which is another AWS tool that uses automated reasoning to analyze policies. You can read more about Zelkova &lt;a href="https://aws.amazon.com/blogs/security/protect-sensitive-data-in-the-cloud-with-automated-reasoning-zelkova/"&gt;here.&lt;/a&gt;&lt;/p&gt;
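
&lt;p&gt;Policy validation can also be invoked programmatically through the Access Analyzer ValidatePolicy API. A minimal sketch with boto3 follows; the policy document is illustrative, and the call is kept behind a flag since it needs credentials.&lt;/p&gt;

```python
import json

# Illustrative identity policy to validate; not from a real account.
policy_document = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
})

RUN_AGAINST_AWS = False  # flip on with real credentials configured
if RUN_AGAINST_AWS:
    import boto3

    analyzer = boto3.client("accessanalyzer")
    findings = analyzer.validate_policy(
        policyDocument=policy_document,
        policyType="IDENTITY_POLICY",
    )
    for finding in findings["findings"]:
        print(finding["findingType"], finding["issueCode"])
```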

&lt;h2&gt;
  
  
  Access Analyzer for S3
&lt;/h2&gt;

&lt;p&gt;Access Analyzer for S3 is a tool from AWS that helps organizations minimize the risk of S3 bucket data leaks. It alerts you to two critical security risks: 1) buckets with public access (buckets that can be accessed by anyone on the internet) and 2) buckets with access from other AWS accounts (buckets that are conditionally shared with other AWS accounts). &lt;/p&gt;

&lt;p&gt;You can even preview and validate access to your S3 bucket before deploying your policy. This helps you validate public and cross-account access to your bucket before you save the policy. To preview access, in the S3 console, open the Edit bucket policy page and draft a policy. Under Preview external access, choose an existing analyzer and then click the Preview button. Access Analyzer generates a preview of findings for access to your bucket. &lt;/p&gt;

&lt;h2&gt;
  
  
  Closing Thoughts
&lt;/h2&gt;

&lt;p&gt;Go check out these cool AWS tools today and run your solutions securely on the cloud!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Resilience of Storage Services on AWS</title>
      <dc:creator>saifmomin</dc:creator>
      <pubDate>Thu, 12 May 2022 05:18:59 +0000</pubDate>
      <link>https://forem.com/saifmomin/resilience-of-storage-services-on-aws-2095</link>
      <guid>https://forem.com/saifmomin/resilience-of-storage-services-on-aws-2095</guid>
<description>&lt;p&gt;AWS has a highly reliable Global Infrastructure that allows you to build secure, scalable, and highly available workloads in the cloud. While building solutions on AWS, it is important to know the data resiliency offered by each storage service. &lt;/p&gt;

&lt;p&gt;In this post I will talk about the resiliency of storage services on AWS. We will look at the following AWS storage services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Instance Store&lt;/li&gt;
&lt;li&gt;Amazon Elastic Block Store (Amazon EBS)&lt;/li&gt;
&lt;li&gt;Amazon Elastic File System (Amazon EFS)&lt;/li&gt;
&lt;li&gt;Amazon FSx for Windows&lt;/li&gt;
&lt;li&gt;Amazon FSx for Lustre&lt;/li&gt;
&lt;li&gt;Amazon Simple Storage Service (Amazon S3)&lt;/li&gt;
&lt;li&gt;AWS Storage Gateway&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Instance Store&lt;/strong&gt;  &lt;/p&gt;

&lt;p&gt;An instance store provides temporary block-level storage for EC2 instances. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Instance Stores do not offer any level of resiliency.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The data in an instance store persists only during the lifetime of its associated instance. The data is lost if the instance stops, hibernates, or terminates, or if the underlying disk drive fails. However, data in the instance store persists across an instance reboot.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Elastic Block Store (Amazon EBS)&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Amazon EBS provides block-level persistent storage volumes attached to EC2 instances. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;EBS volumes are replicated within an Availability Zone (AZ). &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Amazon EBS volumes are designed to be highly available, reliable, and durable. Amazon EBS volume data is replicated across multiple servers in an Availability Zone to prevent the loss of data from the failure of any single component.&lt;/p&gt;

&lt;p&gt;Amazon EBS also has snapshot feature to help support your data resiliency and backup needs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Elastic File System (Amazon EFS)&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Amazon EFS provides a simple, serverless, set-and-forget elastic file system. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Amazon EFS file systems (except for EFS One Zone) are resilient to one or more Availability Zone failures within an AWS Region.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;With EFS Standard storage classes, every EFS file system object (such as directory, file, and link) is redundantly stored across multiple Availability Zones. You can architect your application to failover from one AZ to other AZs in the Region to ensure the highest level of application availability. &lt;/p&gt;

&lt;p&gt;With EFS One Zone storage classes, your data is redundantly stored within a single Availability Zone. &lt;/p&gt;

&lt;p&gt;Mount targets are designed to be highly available within an AZ for all EFS storage classes.&lt;/p&gt;

&lt;p&gt;Amazon EFS is designed to sustain concurrent device failures by quickly detecting and repairing any lost redundancy. &lt;/p&gt;

&lt;p&gt;Amazon EFS also supports resiliency needs using features such as EFS Replication and data backup using AWS Backup or EFS-to-EFS backup. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon FSx for Windows&lt;/strong&gt;  &lt;/p&gt;

&lt;p&gt;Amazon FSx for Windows provides fully managed file storage built on Windows Server.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Single-AZ file systems are designed to be resilient within an Availability Zone by replicating data within a single AZ. Multi-AZ file systems provide redundancy across multiple AZs.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Amazon FSx automatically replicates your data within an Availability Zone to protect it from component failure and automatically replaces infrastructure components in the event of a failure. &lt;/p&gt;

&lt;p&gt;Amazon FSx offers a Multi-AZ option that includes an active and a standby file server in separate AZs. In the event of a failure of the active file server or its AZ, Amazon FSx automatically fails over to the standby so you can resume file system operations without losing availability of your data.&lt;/p&gt;

&lt;p&gt;Amazon FSx also automatically takes highly durable, file-system consistent daily backups to S3 (using Volume Shadow Copy Service), and allows you to take additional backups at any point.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon FSx for Lustre&lt;/strong&gt;   &lt;/p&gt;

&lt;p&gt;Amazon FSx for Lustre provides fully managed shared storage with the scalability and performance of the popular Lustre file system.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Data is replicated within an Availability Zone for persistent file systems.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Amazon FSx for Lustre provides a parallel file system, where data is stored across multiple network file servers to maximize performance and reduce bottlenecks. &lt;/p&gt;

&lt;p&gt;Amazon FSx for Lustre offers a choice of scratch and persistent file systems to accommodate different data processing needs. &lt;/p&gt;

&lt;p&gt;Persistent file systems are ideal for longer-term storage and throughput-focused workloads. In persistent file systems, data is replicated within an AZ, and file servers are replaced automatically if they fail. &lt;/p&gt;

&lt;p&gt;Scratch file systems are ideal for temporary storage and shorter-term processing of data. Data is not replicated and does not persist if a file server fails. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Simple Storage Service (Amazon S3)&lt;/strong&gt;   &lt;/p&gt;

&lt;p&gt;Amazon S3 is an object storage service built to store and retrieve any amount of data from anywhere.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;S3 objects (except for S3 One Zone-IA) are stored redundantly across three or more Availability Zones in an AWS Region.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The S3 Standard, S3 Standard-IA, S3 Intelligent-Tiering, S3 Glacier Instant Retrieval, S3 Glacier Flexible Retrieval, and S3 Glacier Deep Archive storage classes redundantly store objects on multiple devices spanning a minimum of three Availability Zones in an AWS Region. &lt;/p&gt;

&lt;p&gt;The S3 One Zone-IA storage class stores data redundantly across multiple devices within a single Availability Zone. &lt;/p&gt;

&lt;p&gt;Amazon S3 service is designed to sustain concurrent device failures by quickly detecting and repairing any lost redundancy, and it also regularly verifies the integrity of your data using checksums.&lt;/p&gt;

&lt;p&gt;Amazon S3 offers several other features to help support your data resiliency and backup needs such as lifecycle configuration, versioning, object locking and replication.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS Storage Gateway&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;AWS Storage Gateway is a hybrid cloud storage service that provides on-premises applications with access to virtually unlimited cloud storage.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;AWS Storage Gateway provides High Availability on VMware.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Storage Gateway achieves high availability on VMware through a series of continuous health-checks against critical operations of the gateway that is integrated with VMware vSphere High Availability (VMware HA). With this integration, Storage Gateway automatically recovers from most service interruptions in under 60 seconds (whether deployed in an on-premises VMware environment, or in VMware Cloud on AWS). This protects storage workloads against hardware, hypervisor, or network failures, storage errors, as well as software issues that lead to connection timeouts or file-share, volume, or tape unavailability.&lt;/p&gt;

&lt;p&gt;Storage Gateway also supports your data resiliency and backup needs using features such as volume backup, volume cloning, and tape archival.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
I hope this post helps you understand how your data is reliably stored on AWS using the built-in resiliency offered by various AWS Storage Services.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>architecture</category>
      <category>cloud</category>
      <category>resilience</category>
    </item>
    <item>
      <title>Are you Well-Architected?</title>
      <dc:creator>saifmomin</dc:creator>
      <pubDate>Sat, 30 Apr 2022 09:30:07 +0000</pubDate>
      <link>https://forem.com/saifmomin/are-you-well-architected-2dk1</link>
      <guid>https://forem.com/saifmomin/are-you-well-architected-2dk1</guid>
<description>&lt;p&gt;In this post, I will describe the significance of the AWS Well-Architected Framework. You can find more details in the AWS &lt;a href="https://docs.aws.amazon.com/wellarchitected/latest/framework/welcome.html"&gt;whitepaper&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Every architect aims to create robust software systems. There is no silver bullet for creating good architecture, but there are best practices and principles that apply regardless of the technology. AWS Well-Architected is a framework that provides a set of guiding tenets for designing and running successful solutions in the cloud. &lt;/p&gt;

&lt;p&gt;The framework helps you build secure, reliable, efficient, cost-effective, and sustainable workloads in the cloud. It provides a way to consistently measure your architecture against best practices, identify areas for improvement, and address shortcomings, and in the process helps your architecture continuously evolve. The framework can be applied to all kinds of cloud engagements, such as cloud migrations, re-designs of existing workloads, and new workloads. &lt;/p&gt;

&lt;h2&gt;
  
  
  The Framework
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://aws.amazon.com/architecture/well-architected/?wa-lens-whitepapers.sort-by=item.additionalFields.sortDate&amp;amp;wa-lens-whitepapers.sort-order=desc"&gt;AWS Well-Architected Framework&lt;/a&gt; describes design principles and architectural best practices for designing and running high quality solutions in the cloud. The framework includes &lt;strong&gt;general design principles&lt;/strong&gt;, &lt;strong&gt;architecture pillars&lt;/strong&gt;, &lt;strong&gt;domain-specific lenses&lt;/strong&gt;, &lt;strong&gt;hands-on labs&lt;/strong&gt;, and a &lt;strong&gt;well-architected tool&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  General Design Principles
&lt;/h2&gt;

&lt;p&gt;The framework identifies a set of general design principles to produce good architecture in the cloud:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Stop guessing your capacity needs.&lt;/li&gt;
&lt;li&gt;Test systems at production scale.&lt;/li&gt;
&lt;li&gt;Automate to make architectural experimentation easier.&lt;/li&gt;
&lt;li&gt;Allow for evolutionary architectures.&lt;/li&gt;
&lt;li&gt;Drive architectures using data.&lt;/li&gt;
&lt;li&gt;Improve through game days.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Six Pillars
&lt;/h2&gt;

&lt;p&gt;The Well-Architected Framework consists of six pillars - operational excellence, security, reliability, performance efficiency, cost optimization, and sustainability. Each pillar provides a set of &lt;strong&gt;design principles&lt;/strong&gt;, &lt;strong&gt;best practices&lt;/strong&gt;, and &lt;strong&gt;questions&lt;/strong&gt;. These pillars help you produce stable and efficient systems. &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Operational Excellence:&lt;/strong&gt; The operational excellence pillar focuses on running and monitoring workloads in the cloud.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security:&lt;/strong&gt; The security pillar focuses on protecting information, systems and assets. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reliability:&lt;/strong&gt; The reliability pillar focuses on the ability to prevent and quickly recover from failures to meet demands.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Performance Efficiency:&lt;/strong&gt; The performance efficiency pillar focuses on using resources efficiently.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost optimization:&lt;/strong&gt; The cost optimization pillar focuses on avoiding unneeded costs and delivering business value at the lowest price point.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sustainability:&lt;/strong&gt; The sustainability pillar focuses on minimizing the environmental and societal impacts of running cloud workloads. &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I created a mindmap for the six pillars of Well-Architected Framework depicting the design principles, best practices, and questions for each pillar. See the mindmap &lt;a href="https://www.xmind.net/m/mrfnqQ"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5e2hMZU7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qrt50gt41ecruy1nn8wv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5e2hMZU7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qrt50gt41ecruy1nn8wv.png" alt="AWS Well-Architected Pillars" width="880" height="1022"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Lenses
&lt;/h2&gt;

&lt;p&gt;AWS Well-Architected Lenses extend the guidance offered by AWS Well-Architected Framework to specific industry and technology domains, such as SaaS, Serverless, IoT, HPC, financial services, games industry and more. On top of the core framework, there is an added set of best practices and questions targeted for the specific domain.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Labs
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://www.wellarchitectedlabs.com"&gt;AWS Well-Architected Labs&lt;/a&gt; provide you a repository of documentation and code to give you a hands-on experience in building a well-architected solution. The labs provide both beginner and advanced levels of curated solutions for each pillar. &lt;/p&gt;

&lt;h2&gt;
  
  
  The Review
&lt;/h2&gt;

&lt;p&gt;Architecture reviews are ideally conducted to identify areas that could be improved, address critical issues, or pay down technical debt. So how do you assess the wellness of an existing workload on AWS? The &lt;a href="https://aws.amazon.com/well-architected-tool/?whats-new-cards.sort-by=item.additionalFields.postDateTime&amp;amp;whats-new-cards.sort-order=desc"&gt;AWS Well-Architected Tool&lt;/a&gt; helps you with the well-architected review process; it provides a consistent mechanism for you to validate and measure the state of your architecture. It is important to note that the review process is not an audit; instead, it should be treated as a lightweight process based on collaboration and conversation. Nor is the review a one-time event; it should be conducted periodically, such as at every phase of a migration or at important workload milestones. The outcome of the review is a set of improvement areas that must be closed to improve the workload architecture. &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I hope this post helps you understand how the AWS Well-Architected Framework guides you in building well-architected solutions on AWS.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>architecture</category>
      <category>cloud</category>
    </item>
    <item>
      <title>The Cloud Resume Challenge</title>
      <dc:creator>saifmomin</dc:creator>
      <pubDate>Sat, 24 Jul 2021 18:50:58 +0000</pubDate>
      <link>https://forem.com/saifmomin/the-cloud-resume-challenge-b2g</link>
      <guid>https://forem.com/saifmomin/the-cloud-resume-challenge-b2g</guid>
<description>&lt;p&gt;I am exactly a year late in taking the &lt;a href="https://cloudresumechallenge.dev" rel="noopener noreferrer"&gt;Cloud Resume Challenge&lt;/a&gt;; I wish I had seen it earlier. Nevertheless, I did it now for some fun and learning. It may seem simple to get a static website running on the cloud, but when you actually do it, you realize there is enormous scope for learning. You learn how a website runs on the cloud, serverless, and how a web application works end to end, full stack: frontend, CDN, DNS, TLS, gateway, backend, IaC, and automation. In designing the website, I haven't factored in every quality attribute (security, cost, etc.) of every AWS service used, so the output is far from perfect; you may excuse me on that. &lt;/p&gt;

&lt;p&gt;See the short profile in action at &lt;a href="https://saifmomin.net" rel="noopener noreferrer"&gt;https://saifmomin.net&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The code on GitHub repo is here: &lt;a href="https://github.com/saifmomin/crc-frontend" rel="noopener noreferrer"&gt;Frontend&lt;/a&gt; and &lt;a href="https://github.com/saifmomin/crc-backend" rel="noopener noreferrer"&gt;Backend&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This serverless website solution offers the following benefits:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Is durable with Amazon S3 storage&lt;/li&gt;
&lt;li&gt;Is performant by Amazon CloudFront content delivery network&lt;/li&gt;
&lt;li&gt;Is secured by HTTPS and additional security headers&lt;/li&gt;
&lt;li&gt;Is automated and deployed with AWS SAM&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmcsp4lua0v2ogvyb6waz.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmcsp4lua0v2ogvyb6waz.jpg" alt="Serverless Website Architecture"&gt;&lt;/a&gt;Figure: Serverless Website Architecture  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2 HTML:&lt;/strong&gt; First things first - you need to write your resume in HTML. So I created an HTML5 page. The fact that you are reading this blog is because HTML exists. Hypertext is an integral part of the world wide web, along with HTTP and URIs, of course. One of the core aspects of web design is that web pages must be fluid to adapt to the varying sizes of the devices they are viewed on. Content should be treated like liquid. HTML is responsive by default, to an extent, but you need CSS to make it truly fluid! &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3 CSS:&lt;/strong&gt; The resume in plain HTML looks very textual, so I styled it with Cascading Style Sheets (CSS3). CSS allows you to create a Responsive Web Design, the main ingredients of which are fluid grids, fluid media, and media queries. This website is responsive by design; you may experience it on different screen sizes. There are frameworks like Bootstrap that help you create responsive websites; I have used just CSS3, though. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4 Static S3 Website:&lt;/strong&gt; Once you have a resume page in HTML and CSS, it is time to host it on the cloud. I deployed it to Amazon S3 as a static website. With S3, you can architect and host a modern static website without needing a web server! It may look easy to host a static website from an S3 bucket, but you need to evaluate your options thoughtfully. The two choices for hosting with S3 are the REST API endpoint and the website endpoint, and they have their &lt;a href="https://docs.aws.amazon.com/AmazonS3/latest/userguide/WebsiteEndpoints.html#WebsiteRestEndpointDiff" rel="noopener noreferrer"&gt;trade-offs&lt;/a&gt;. I adopted the website-endpoint solution, which uses a custom Referer key: my S3 bucket policy is configured to accept only requests that contain this header. This keeps the S3 bucket configured as a website while blocking direct access to the website endpoint. You can also serve a static website hosted on S3 using the REST endpoint as the origin with access restricted via a CloudFront OAI; this lets you keep your bucket private, but you do not get website redirection support (although redirection may be achieved using Lambda@Edge). You can also host a static website with AWS Amplify, which has built-in CI/CD workflows.&lt;/p&gt;
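
&lt;p&gt;For reference, a Referer-restricted bucket policy of the kind described above can be sketched as follows; the bucket name and secret value are placeholders, not the ones used for this site.&lt;/p&gt;

```python
import json

# Hypothetical shared secret that the CDN is configured to send in a custom
# Referer header; requests without it fall through to the default deny.
SECRET_REFERER = "some-long-random-value"

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowRequestsWithCustomReferer",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::example-website-bucket/*",
            "Condition": {"StringEquals": {"aws:Referer": SECRET_REFERER}},
        }
    ],
}
print(json.dumps(bucket_policy, indent=2))
```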

&lt;p&gt;&lt;strong&gt;Step 5 HTTPS:&lt;/strong&gt; Your website is now in an S3 bucket, ready to be consumed by viewers, but wait - you need to deliver the website content securely to your viewers, and at good speed. That means you need a Content Delivery Network (CDN), and on AWS that is CloudFront. CloudFront has points of presence (PoPs) that deliver content to end users at low latency, and it secures your content at both the network and application levels. &lt;br&gt;
To deliver the website over HTTPS (with a custom domain name - see step 6), I created a free public TLS certificate with AWS Certificate Manager (ACM) and integrated it with CloudFront. Apart from HTTPS support, CloudFront offers many other security features that could be leveraged, like AWS WAF &amp;amp; AWS Shield Standard (defended by default), field-level encryption, signed URLs and signed cookies, geo-restriction, and OAI. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6 DNS:&lt;/strong&gt; Now your website is secured at the edge and can be delivered fast to your viewers through the CDN (CloudFront). But you would not want your end users to access your website with a domain name like d111111abcdef8.cloudfront.net; you can do better by buying a meaningful domain name for your website. I bought the domain on Route53, which created a public hosted zone with name servers. Route53 is also a DNS service that routes your end users to your website by translating the website name into an IP address.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 7 Javascript:&lt;/strong&gt; At this stage you have a fully functional static website built with HTML and CSS. Next, you need to display a visitor count on the website, and for this you need backend machinery (a database to persist the count data, a service to process the data, and a gateway to front the service). To access your backend, you need JavaScript at the frontend. Along with the visitor count, I also wanted to display the live date. By design, JavaScript is synchronous, blocking, and single-threaded. So how do you do this without blocking the main thread? Use Web Workers! I created a web worker that runs a worker thread in the background (note that the Web Worker is not part of the JavaScript engine but a browser API). &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 8 Database:&lt;/strong&gt; Here you create the first layer of your backend stack - a database to persist and update the visitor-counter data. Since you are building your website for internet scale, you need a database that is highly scalable. Amazon DynamoDB is a schema-less, join-less NoSQL database service from AWS that gives you ultra-high performance at hyper-scale! I created a single table with one partition key (website) and one attribute (visits) to persist the visitor count. For real-world projects, you need to design DynamoDB tables carefully or you may end up with a relational-like design. Strive for as few tables as possible, or even go for a single-table design. &lt;br&gt;
DynamoDB has two read/write capacity modes, and the pricing model is based on them - On-Demand and Provisioned. In general, you start with On-Demand and, as usage grows and becomes predictable, you may move to Provisioned mode for some cost savings. I opted for the on-demand capacity mode since the traffic for my new website is unpredictable.&lt;br&gt;
For the sake of simplicity, I made these assumptions for costing -&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;each time a user accesses my website, 1 write of 1 KB and 1 eventually consistent read of 1 KB are performed. Note that DynamoDB uses eventually consistent reads by default, unless you specify otherwise. &lt;/li&gt;
&lt;li&gt;my website receives 1 million reads and 1 million writes per month&lt;/li&gt;
&lt;li&gt;for my region (ap-south-1), the monthly cost would be $0.14 (0.285/2) for reads + $1.42 for writes = approximately $1.56 per month. The first 25 GB stored per month is free, so storage shouldn't add to the cost as my table won't grow that big. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For real-world projects, you would consider other parameters like backups, data transfer, DAX, and Streams when calculating the cost.&lt;/p&gt;
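
&lt;p&gt;The cost arithmetic above can be double-checked in a few lines. The per-million prices are the ap-south-1 on-demand figures assumed in this post; verify current pricing before relying on them.&lt;/p&gt;

```python
# On-demand prices assumed in this post (USD per million request units).
PRICE_PER_MILLION_WRU = 1.42
PRICE_PER_MILLION_RRU = 0.285

monthly_writes = 1_000_000  # 1 KB items: one write consumes 1 WRU
monthly_reads = 1_000_000   # eventually consistent 1 KB read: 0.5 RRU

write_cost = monthly_writes / 1_000_000 * PRICE_PER_MILLION_WRU
read_cost = monthly_reads * 0.5 / 1_000_000 * PRICE_PER_MILLION_RRU
total = write_cost + read_cost
print(f"~${total:.2f} per month")  # ~$1.56 per month
```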

&lt;p&gt;DynamoDB scales well, but how do you achieve concurrency at scale? DynamoDB lets you do this using atomic counters, conditional writes (for business-critical solutions), or perhaps even transactions. I made use of an atomic counter. AWS describes the same visitor-counter use case for atomic counters in its &lt;a href="https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/WorkingWithItems.html#WorkingWithItems.AtomicCounters" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;.&lt;/p&gt;
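
&lt;p&gt;A minimal sketch of the atomic-counter call: an ADD update expression makes DynamoDB apply the increment server-side, avoiding a read-modify-write race. The table name is illustrative; the key and attribute names follow this post.&lt;/p&gt;

```python
def atomic_counter_update(website):
    # UpdateItem parameters for the low-level DynamoDB client; ADD applies
    # the increment atomically on the server side.
    return {
        "TableName": "visitor-count",  # hypothetical table name
        "Key": {"website": {"S": website}},
        "UpdateExpression": "ADD visits :inc",
        "ExpressionAttributeValues": {":inc": {"N": "1"}},
        "ReturnValues": "UPDATED_NEW",
    }

params = atomic_counter_update("saifmomin.net")
# With credentials configured, you would pass these to the client:
#   boto3.client("dynamodb").update_item(**params)
```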

&lt;p&gt;&lt;strong&gt;Step 9 API:&lt;/strong&gt; You have a database table and JavaScript code at the frontend through which you could CRUD your database, but that is never a good practice. We need more layers between the frontend client and the database - a service, and an API to front the service. APIs are programmable interfaces that allow you to access a service running somewhere in the cloud or on the internet. Amazon API Gateway is a service that helps developers build secure and scalable APIs.&lt;br&gt;&lt;br&gt;
AWS offers two RESTful API flavors - REST API and HTTP API (confusingly named!) - and a third type, the WebSocket API. HTTP APIs in Amazon API Gateway are faster, cheaper (us-east-1: $1.00 per million requests vs $3.50 per million for REST API), and simpler to use (with native CORS support, JWT authorizers instead of Lambda authorizers, and auto-deploy to the $default stage, among others); but they lack some features - for example, you can only have a regional API endpoint with an HTTP API. So for real-world projects, you must choose the right API type based on your requirements. See &lt;a href="https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-vs-rest.html" rel="noopener noreferrer"&gt;http-api-vs-rest&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Edge-optimized endpoints leverage CloudFront PoPs, but you can't edit the managed distribution - for example, you cannot add a WAF. With a regional API endpoint, you can put your own CloudFront distribution in front of your API and thus have full control, including adding a WAF. If you have users accessing your API from all over the globe and you do not want to manage a CloudFront distribution for your APIs, go for an edge-optimized API. Though the HTTP API looked promising with its low cost, speed, and dev experience, it only offers a regional endpoint; since my website is meant to be accessed globally and I wanted to stick with the AWS-managed CloudFront distribution, I created a REST API. For throttling, I left the Default Method Throttling on the stage settings. Note that there are two places where you can configure throttling - server-side (stage settings) and per-client (usage plans with API keys).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 10 Python:&lt;/strong&gt; We have JavaScript calling the API. The API now needs to integrate with a backend service - a Lambda function plus a programming language. Talking about Lambda - though the first service AWS launched, SQS, was serverless, it was Lambda that revolutionised the serverless movement. If the cloud has disrupted traditional data centres, then Lambda has disrupted the cloud itself! Lambda natively supports Java, Go, PowerShell, Node.js, C#, Python, and Ruby (at the time of this writing), but you can run any programming language on AWS Lambda using custom runtimes. I created a Lambda function with the Python 3.8 runtime and wrote the code to connect to the DynamoDB table. Important point to remember - make sure the right execution role (with the right IAM policy) is attached to your Lambda so it can access DynamoDB. &lt;/p&gt;
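&lt;p&gt;A hedged sketch of the counter logic - the table and attribute names are illustrative, not from the project. The boto3 call itself is left as a comment so the parameter-building logic stays pure and testable without AWS credentials:&lt;/p&gt;

```python
def build_update_params(table_name, record_id):
    """Build DynamoDB UpdateItem parameters that atomically bump a counter."""
    return {
        "TableName": table_name,
        "Key": {"id": {"S": record_id}},
        # ADD creates the attribute if it is missing, so no separate init write
        "UpdateExpression": "ADD visits :inc",
        "ExpressionAttributeValues": {":inc": {"N": "1"}},
        "ReturnValues": "UPDATED_NEW",
    }

# In the Lambda handler you would then call (needs boto3 and an execution role
# that allows dynamodb:UpdateItem on the table):
#   client = boto3.client("dynamodb")
#   result = client.update_item(**build_update_params("visitors", "site"))
```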

&lt;p&gt;&lt;strong&gt;Step 11 Tests:&lt;/strong&gt; At this stage you have the full backend ready, but wait, it is not complete without tests. To be truly agile, you should have unit tests for your code; Test-Driven Development (TDD) is a practice that can help you with that. I wrote unit tests that exercise my Python code and also mock the AWS environment, using the Python moto library. Moto is a library that allows your tests to easily mock out AWS services.&lt;/p&gt;
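&lt;p&gt;Moto intercepts real boto3 calls and backs them with in-memory AWS services. As a stdlib-only sketch of the same idea (function, table and attribute names are made up), you can stub the DynamoDB client with &lt;code&gt;unittest.mock&lt;/code&gt; and assert on how your code calls it:&lt;/p&gt;

```python
from unittest import mock

def get_visits(client, table_name):
    """Read the visit counter; `client` is a (real or mocked) DynamoDB client."""
    item = client.get_item(TableName=table_name, Key={"id": {"S": "site"}})
    return int(item["Item"]["visits"]["N"])

# Stand-in for boto3.client("dynamodb"); moto would instead intercept real
# boto3 calls and serve them from an in-memory DynamoDB table.
fake = mock.Mock()
fake.get_item.return_value = {"Item": {"visits": {"N": "42"}}}

assert get_visits(fake, "visitors") == 42
fake.get_item.assert_called_once_with(
    TableName="visitors", Key={"id": {"S": "site"}}
)
```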

&lt;p&gt;&lt;strong&gt;Step 12 Infrastructure as Code:&lt;/strong&gt; You now have the backend infrastructure ready along with tests, but you can still do better - turn your infrastructure into code. IaC solves the problem of environment drift, much like containers do. AWS provides different ways to code your infra - SAM and CDK, with CloudFormation at the core. I created a SAM YAML template and provisioned the backend with SAM commands. For some reason that I couldn't figure out, I wasn't able to get the on-demand billing mode (BillingMode: PAY_PER_REQUEST) to work through SAM, so I updated it through the AWS CLI.&lt;/p&gt;
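&lt;p&gt;For illustration (table and attribute names are placeholders), a plain CloudFormation table resource in a SAM template does accept the on-demand billing mode:&lt;/p&gt;

```yaml
Resources:
  VisitorTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: visitors
      BillingMode: PAY_PER_REQUEST   # on-demand capacity, no provisioned RCU/WCU
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
```

&lt;p&gt;And the CLI fallback for an existing table is &lt;code&gt;aws dynamodb update-table --table-name visitors --billing-mode PAY_PER_REQUEST&lt;/code&gt;.&lt;/p&gt;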

&lt;p&gt;&lt;strong&gt;Step 13 Source Control:&lt;/strong&gt; Now we have our frontend, backend and IaC code. The code needs to be managed in a repo; pushing code often to a repo for build and test is what gives you continuous integration. I created a private repo on GitHub. AWS has CodeCommit for source control. AWS CodeCommit is a fully managed source control service that hosts private Git repositories and makes it easy for teams to collaborate on code in a secure and highly scalable ecosystem. If you are working in the AWS ecosystem, CodeCommit could be the ideal choice as it integrates very well with other AWS services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 14 CI/CD (Back end):&lt;/strong&gt; Once the source code repo is set up on GitHub, we should automate further. Automation is powerful: it saves time and helps reduce human error. GitHub Actions allows you to set up automated CI/CD workflows directly from the GitHub repo. I used the sample Python application CI workflow from GitHub Actions and created a workflow with two jobs - a test job and a deploy job. When the tests pass, the SAM application is packaged and deployed to AWS. Important warning - do NOT keep AWS credentials in your code! On GitHub Actions, use GitHub secrets; on AWS, use IAM. AWS has CodePipeline to orchestrate a CI/CD pipeline. One important advantage of CodePipeline is that authentication is handled with IAM roles instead of access keys for IAM users, so there are no access keys to manage. With GitHub Actions, you must store an IAM user's access keys in GitHub secrets.&lt;/p&gt;
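&lt;p&gt;The two-job shape described above can be sketched as a workflow like the following - the branch, file and version names are illustrative, not the project's actual workflow:&lt;/p&gt;

```yaml
name: backend
on:
  push:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.8"
      - run: pip install -r requirements.txt
      - run: python -m pytest
  deploy:
    needs: test               # deploy runs only when the test job succeeds
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: sam build &amp;&amp; sam deploy --no-confirm-changeset
        env:
          # Never hard-code credentials; store them as GitHub secrets
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```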

&lt;p&gt;&lt;strong&gt;Step 15 CI/CD (Front end):&lt;/strong&gt; Finally, we have our backend code with IaC and CI/CD. The last step is to create a repo for the frontend code and automate it with CI/CD. I created a simple workflow that uploads the frontend code to the S3 bucket and invalidates the CloudFront cache as well. The first 1,000 CloudFront invalidation paths per month are free. There is another way to remove a file from CloudFront edge caches before it expires: use a version identifier in file names or folder names. Note: using versioned file or folder names is not the same as S3 object versioning.&lt;br&gt;
See &lt;a href="https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/Invalidation.html#Invalidation_Expiration" rel="noopener noreferrer"&gt;this&lt;/a&gt;.&lt;/p&gt;
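&lt;p&gt;The frontend workflow can be sketched like this - the bucket name, site folder and secret names are placeholders:&lt;/p&gt;

```yaml
name: frontend
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_DEFAULT_REGION: us-east-1
    steps:
      - uses: actions/checkout@v4
      # Upload the site, removing objects that no longer exist locally
      - run: aws s3 sync ./site s3://my-resume-bucket --delete
      # Evict stale copies from edge caches (first 1,000 paths/month are free)
      - run: aws cloudfront create-invalidation --distribution-id ${{ secrets.CF_DISTRIBUTION_ID }} --paths "/*"
```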

&lt;p&gt;&lt;strong&gt;Step 16 Blog post:&lt;/strong&gt; Expressing your experience through writing is a good way to communicate. I have tried to share my experience through this blog!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1 Certification:&lt;/strong&gt; I did not fulfill the condition of taking the Cloud Practitioner certification; I hold other AWS certifications, so I hope that is okay.&lt;/p&gt;

&lt;p&gt;Hope this post helps you understand how to host a static website on AWS.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>serverless</category>
      <category>architecture</category>
    </item>
  </channel>
</rss>
