<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Katoria H.</title>
    <description>The latest articles on Forem by Katoria H. (@khenry).</description>
    <link>https://forem.com/khenry</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1085869%2Fc2be97fb-fe5c-4da1-bdff-8ccad795b2bd.jpeg</url>
      <title>Forem: Katoria H.</title>
      <link>https://forem.com/khenry</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/khenry"/>
    <language>en</language>
    <item>
      <title>Part 2: Pipeline fun with AWS</title>
      <dc:creator>Katoria H.</dc:creator>
      <pubDate>Thu, 17 Aug 2023 21:35:53 +0000</pubDate>
      <link>https://forem.com/aws-builders/part-2-pipeline-fun-with-aws-26cl</link>
      <guid>https://forem.com/aws-builders/part-2-pipeline-fun-with-aws-26cl</guid>
      <description>&lt;h2&gt;
  
  
  &lt;strong&gt;STEP 3&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Welcome back techies to Part 2 of Pipelines Galore! We are going to be diving into the creation of our code pipeline, retrieving our region ID &amp;amp; IAM creds, along with taking it up a notch with additional validation testing. If you have not reviewed &lt;a href="https://dev.to/khenry/part-1-pipeline-fun-with-aws-18ll"&gt;Part 1&lt;/a&gt; of this tutorial, please take the time to do so now so that you’re able to follow along as we wrap things up!&lt;/p&gt;

&lt;p&gt;So we’ve verified that we have successfully created our Secret via Secrets Manager, as well as generated a net-new KMS Customer-managed key. To ensure we have the correct region ID and IAM role (these will be needed for the pipeline), let’s run the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure get region
aws iam get-role --role-name &amp;lt;rolename&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SZ6Y6Aqe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d27on6uq1yyb5yi31wth.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SZ6Y6Aqe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d27on6uq1yyb5yi31wth.png" alt="Image description" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Please note that my role also includes a policy that has permissions for Elastic Beanstalk, CloudWatch, SQS, and X-Ray&lt;/strong&gt;&lt;/p&gt;
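&lt;p&gt;If you plan to script the pipeline creation, those two values can be captured into shell variables for reuse. This is just a sketch; the role name is a placeholder:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Capture the current region and the role ARN for later use in pipeline.json
REGION=$(aws configure get region)
ROLE_ARN=$(aws iam get-role --role-name &amp;lt;rolename&amp;gt; --query 'Role.Arn' --output text)
echo "$REGION $ROLE_ARN"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;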

&lt;p&gt;(1) We will now begin with the creation of our Pipeline, which will &lt;strong&gt;ONLY&lt;/strong&gt; include a Source &amp;amp; Deploy phase, though production pipelines &lt;strong&gt;&lt;em&gt;will normally have a Build and Test phase&lt;/em&gt;&lt;/strong&gt; included as well. There are two ways to create your pipeline: use a JSON file with the specific parameters and run the &lt;code&gt;aws codepipeline create-pipeline --cli-input-json file://pipeline.json&lt;/code&gt; command once your file is ready, OR use the Console, as shown below:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the Console, navigate to CodePipeline and select “Create Pipeline”&lt;/li&gt;
&lt;li&gt;Give your pipeline a name and select an existing role or create a new role&lt;/li&gt;
&lt;li&gt;For the Source stage, select your specific provider&lt;/li&gt;
&lt;li&gt;SKIP the Build stage&lt;/li&gt;
&lt;li&gt;For the Deploy stage, add your Elastic Beanstalk details&lt;/li&gt;
&lt;li&gt;Click “Create Pipeline”; the pipeline should take a few minutes to generate&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vXmQgGyy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ap107aw04rc2izj116vu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vXmQgGyy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ap107aw04rc2izj116vu.png" alt="Image description" width="800" height="738"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IYwiZNeG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6nhmcwdxxqyf8r9crooe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IYwiZNeG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6nhmcwdxxqyf8r9crooe.png" alt="Image description" width="800" height="649"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7SRpcjoD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m6rj4lkxtuw0v8r0ig44.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7SRpcjoD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/m6rj4lkxtuw0v8r0ig44.png" alt="Image description" width="800" height="671"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;If you'd like to test this out using the CLI instead, feel free to use this JSON sample to serve as a guide for your pipeline&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "pipeline": {
      "name": "yourpipelinename",
      "roleArn": "arn:aws:iam::1234567890:role/youriamrole",
      "artifactStore": {
        "type": "S3",
        "location": "yourbucketlocation"
      },
      "stages": [
        {
          "name": "Source",
          "actions": [
            {
              "name": "SourceAction",
              "actionTypeId": {
                "category": "Source",
                "owner": "AWS",
                "provider": "yourprovider",
                "version": "1"
              },
              "configuration": {
                "RepositoryName": "yourrepo",
                "BranchName": "master"
              },
              "outputArtifacts": [
                {
                  "name": "yourinput"
                }
              ],
              "runOrder": 1
            }
          ]
        },
        {
          "name": "Deploy",
          "actions": [
            {
              "name": "Deploy",
              "actionTypeId": {
                "category": "Deploy",
                "owner": "AWS",
                "provider": "ElasticBeanstalk",
                "version": "1"
              },
              "configuration": {
                "ApplicationName": "yourinput",
                "EnvironmentName": "yourinput"
              },
              "inputArtifacts": [
                {
                  "name": "yourinput"
                }
              ],
              "runOrder": 1
            }
          ]
        }
      ]
    }
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;(2) We can verify that our pipeline and deployment were successful by visiting CodePipeline &amp;amp; our Elastic Beanstalk domain, as seen below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fhxk70lN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pf16qfn28aqgol2hwjwq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fhxk70lN--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pf16qfn28aqgol2hwjwq.png" alt="Image description" width="800" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VWilaeSe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6wg447yky0zx9ae9at2o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VWilaeSe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6wg447yky0zx9ae9at2o.png" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--76XHcJZf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i0v2oervofn2u8c1q9gy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--76XHcJZf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i0v2oervofn2u8c1q9gy.png" alt="Image description" width="800" height="287"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(3) Now, we can take this up a notch by modifying one of the files created earlier, committing a new change, and then verifying that we can reach our previous Elastic Beanstalk domain, in which we should see the newly created changes. We’ll now push our newly updated files to the repo by using the following git commands as before:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git add .
git commit -m "Commit message"
git push origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rWeyme-L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jaye2awqk41gjlqyod19.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rWeyme-L--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jaye2awqk41gjlqyod19.png" alt="Image description" width="800" height="229"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4xKnTLxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rq6c5frkgp6v2ezp0nvc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4xKnTLxi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rq6c5frkgp6v2ezp0nvc.png" alt="Image description" width="800" height="699"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(4) If you check the screenshots below, you'll notice that our changes to the index.html file were successful, and we also have some neat metrics that are shown with CloudWatch as well:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f3puhv05--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ilbrivwntsl9icy9hjxp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f3puhv05--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ilbrivwntsl9icy9hjxp.png" alt="Image description" width="800" height="290"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nPQq2vrT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cvhjv23xtvpfvfruz0cx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nPQq2vrT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cvhjv23xtvpfvfruz0cx.png" alt="Image description" width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And that just about wraps it up for this two-part tutorial! I hope you have enjoyed creating your very first pipeline if you're net-new to AWS CodePipeline. Remember, there are &lt;em&gt;&lt;strong&gt;multiple&lt;/strong&gt;&lt;/em&gt; ways that you can create a pipeline, and this tutorial just walked through one approach. Stay tuned for more tutorials that are coming your way! Follow me on &lt;a href="https://www.linkedin.com/in/katoria-henry-2018/"&gt;LinkedIn&lt;/a&gt; to be on the lookout for new blogs.&lt;/p&gt;

</description>
      <category>cicd</category>
      <category>automation</category>
      <category>devtools</category>
      <category>codedeploy</category>
    </item>
    <item>
      <title>Part 1: Pipeline fun with AWS</title>
      <dc:creator>Katoria H.</dc:creator>
      <pubDate>Thu, 17 Aug 2023 21:35:27 +0000</pubDate>
      <link>https://forem.com/aws-builders/part-1-pipeline-fun-with-aws-18ll</link>
      <guid>https://forem.com/aws-builders/part-1-pipeline-fun-with-aws-18ll</guid>
      <description>&lt;p&gt;Hello techies and future techies! We are back with a new tutorial leveraging AWS DevTools. As the name suggests, DevTools, or Developer Tools, are tools that are most commonly used by individuals operating in the DevOps space. There are so many tools that you can choose from if you’re looking to simplify your overall development life cycle, automate CI/CD pipelines, and of course, enhance and improve developer productivity. For this tutorial, we will be leveraging AWS CodeCommit, Elastic Beanstalk, Secrets Manager, KMS, IAM, and CodePipeline. Let’s dive into some of these services and explain their use cases!&lt;/p&gt;

&lt;p&gt;When you hear of AWS CodeCommit, I’m pretty sure that you may immediately think of GitHub, which is very similar in nature. CodeCommit is a fully managed source code control service that enables you to host secure and scalable Git repositories. A fully managed AWS service is one that AWS manages end-to-end, such as the infrastructure and resources required to deliver the service. CodeCommit provides version control for your application code, allowing teams to collaborate on software development projects efficiently. Some of the key features include the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Git-based&lt;/strong&gt;: CodeCommit supports the popular Git version control system, making it easy to integrate with existing Git workflows and tools.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Private Repositories&lt;/strong&gt;: CodeCommit allows you to create private repositories to protect sensitive code from unauthorized access.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration&lt;/strong&gt;: It seamlessly integrates with other AWS services, including CodeBuild, CodeDeploy, and CodePipeline, facilitating a streamlined development and deployment process.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Like many, I automatically think of a service like “Vault” whenever I hear of “Secrets Manager”. AWS Secrets Manager is a fully managed service for securely storing, retrieving, and managing sensitive information such as passwords, API keys, and database credentials. It helps centralize and rotate secrets, ensuring secure access to resources. Some of the key features include the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Secret Storage&lt;/strong&gt;: Secrets Manager stores secrets securely in the AWS Cloud, encrypting them at rest and in transit.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automated Rotation&lt;/strong&gt;: It offers automatic rotation of secrets, reducing the risk of exposure and simplifying the management of credentials. &lt;strong&gt;You should ALWAYS consider key rotation when operating in a Production environment.&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration&lt;/strong&gt;: Secrets Manager can be easily integrated with other AWS services and applications, ensuring secure access to resources without exposing sensitive information.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Elastic Beanstalk is pretty cool as it abstracts the underlying infrastructure, allowing you to focus on your application. It handles provisioning, scaling, and load balancing, making deployment and management easier. Some of the key features include the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Easy Rollbacks&lt;/strong&gt;: If a deployment doesn't go as planned, Elastic Beanstalk allows you to quickly roll back to the previous version&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Custom Domains &amp;amp; SSL&lt;/strong&gt;: You can easily map custom domain names to your Elastic Beanstalk environment and configure SSL certificates for secure communication.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;2 Environment Types&lt;/strong&gt;: Elastic Beanstalk offers two environment types: Web Server and Worker. Web Server environments are optimized for web applications, while Worker environments are suitable for background tasks or processing jobs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And finally, we have AWS CodePipeline, which is also a fully managed continuous integration and continuous delivery (CI/CD) service that automates the end-to-end software release process. It allows you to define, model, and visualize the different stages of your release pipeline. Key features of CodePipeline can be found below:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pipeline Automation&lt;/strong&gt;: CodePipeline automates the build, test, and deployment stages of your application, enabling continuous integration and delivery of software changes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flexibility&lt;/strong&gt;: It supports a wide range of integration options with third-party tools and services, enabling you to customize your CI/CD workflow.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Visual Workflow&lt;/strong&gt;: CodePipeline provides a graphical interface to visualize and monitor the stages of your release process, making it easy to identify bottlenecks and issues.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The following prerequisites are highly recommended for this tutorial:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;An AWS Account&lt;/li&gt;
&lt;li&gt;AWS CLI Configured&lt;/li&gt;
&lt;li&gt;GitHub Account&lt;/li&gt;
&lt;li&gt;Familiarity with Git commands&lt;/li&gt;
&lt;li&gt;Familiarity with JSON&lt;/li&gt;
&lt;/ol&gt;
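&lt;p&gt;Before diving in, you can quickly confirm that your CLI is configured and authenticated. A minimal check, assuming your credentials are already set up:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Verify your identity and default region
aws sts get-caller-identity
aws configure get region
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;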

&lt;p&gt;Let’s jump into creating a DevSecOps pipeline on AWS using the AWS CLI. &lt;em&gt;Please note that this tutorial may slightly differ from the steps taken in a production environment setup.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;STEP 1&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;(1) The very first step that we want to initiate is to create a net-new application via the Console for Elastic Beanstalk (this can also be done via the CLI if you prefer). Navigate to Elastic Beanstalk, and do the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create Application &lt;/li&gt;
&lt;li&gt;Select ‘Web Server Environment’ under Environment Tier&lt;/li&gt;
&lt;li&gt;Name your App&lt;/li&gt;
&lt;li&gt;Select the PHP Platform &lt;/li&gt;
&lt;li&gt;Leave the Application Code as ‘Sample Code’&lt;/li&gt;
&lt;li&gt;Select ‘Single Instance’ under Configuration Presets&lt;/li&gt;
&lt;li&gt;Select Next&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Configure your service access, networking, instance scaling, monitoring and logging to your preferences&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nAa0lOzC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kcak36hoc09oim7uudw4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nAa0lOzC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kcak36hoc09oim7uudw4.png" alt="Image description" width="800" height="471"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(2) Secondly, we’re going to create a repository for AWS CodeCommit, by running the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws codecommit create-repository --repository-name &amp;lt;yourreponame&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PV52ZTcj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fnwpdxrhpydzs48ijris.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PV52ZTcj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fnwpdxrhpydzs48ijris.png" alt="Image description" width="800" height="188"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(3) Next, we need to clone the repo created above, and push our newly created files to the repo, using the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone &amp;lt;repoURL&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Be sure to change into the repo directory once cloned&lt;/strong&gt;&lt;/p&gt;
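&lt;p&gt;If you need the clone URL again, it can be retrieved from CodeCommit (a sketch; substitute your repo name):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Print the HTTPS clone URL for the repo created above
aws codecommit get-repository --repository-name &amp;lt;yourreponame&amp;gt; \
  --query 'repositoryMetadata.cloneUrlHttp' --output text
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;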

&lt;p&gt;(4) To add our app files, we’re going to be locally cloning the repo found &lt;a href="https://github.com/aws-samples/aws-codepipeline-s3-codedeploy-linux/tree/master"&gt;here&lt;/a&gt;. Now, it’s totally up to you if you’d like to modify those files on your end. Once done, you should see multiple files, such as what I have shown here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PFuLMgba--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dqm8f6h646r1mw7wxo37.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PFuLMgba--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dqm8f6h646r1mw7wxo37.png" alt="Image description" width="800" height="466"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before pushing your newly created files to the repo created above, &lt;strong&gt;confirm&lt;/strong&gt; that the correct &lt;strong&gt;IAM CodeCommit &amp;amp; Git permissions&lt;/strong&gt; are attached to your particular user role. Without these permissions, you will not be able to proceed to the next steps of pushing your content to the repo.&lt;/p&gt;

&lt;p&gt;(5) We’ll now push our newly created file to the repo by using the following git commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git add .
git commit -m "Initial commit"
git push origin master
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--usu1N9aa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x0u1e042myhq8k1wp6yx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--usu1N9aa--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x0u1e042myhq8k1wp6yx.png" alt="Image description" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can verify that the newly created files have been successfully pushed to the repo by visiting AWS CodeCommit in the Console. You should also see a newly created S3 bucket, EC2 instance, CloudWatch Metrics, Auto Scaling Group, and so forth.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--v0e6F0kv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r2mqv4rahgmuluilwl5c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--v0e6F0kv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r2mqv4rahgmuluilwl5c.png" alt="Image description" width="800" height="299"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YVZpQM5W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/12pgh7wozxtjsssuv7nl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YVZpQM5W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/12pgh7wozxtjsssuv7nl.png" alt="Image description" width="800" height="132"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;STEP 2&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;(1) Because we will also be storing artifacts for the build, we will need to generate a new Secrets Manager secret, followed by our KMS key:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws secretsmanager create-secret \
  --name &amp;lt;name&amp;gt; \
  --description "Database credentials for my application" \
  --secret-string '{"username": "username", "password": "password"}' \
  --query 'ARN' --output text

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ARHIyQct--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/noygadjrflp02a7zijnq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ARHIyQct--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/noygadjrflp02a7zijnq.png" alt="Image description" width="800" height="115"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(2) Now that we have successfully created our secret, let’s execute the code below to generate a net-new customer-managed KMS key. Please note that an AWS-managed key is provided by default when using CodePipeline, along with an S3 Bucket. If generated correctly, the output should be a 36-character key ID (a UUID):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws kms create-key \ 
--description "&amp;lt;description&amp;gt;” \
--query 'KeyMetadata.KeyId' --output text
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kASViAps--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3wa7a7dzey8z9k0ceivm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kASViAps--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3wa7a7dzey8z9k0ceivm.png" alt="Image description" width="800" height="97"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(3) And finally, we need to assign the KMS key to the secret from Secrets Manager, using the command below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws secretsmanager update-secret \
  --secret-id &amp;lt;yourinfo&amp;gt; \
  --kms-key-id $kmsKeyId
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mOBnjGHr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kg49e9vnfm6o8l4angtp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mOBnjGHr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kg49e9vnfm6o8l4angtp.png" alt="Image description" width="800" height="139"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;(4) Be sure that you've added server-side encryption to your S3 bucket generated by Elastic Beanstalk using the KMS key that you created above:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--O1uvbblr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eiazelg3m5yz0qe84inz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--O1uvbblr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eiazelg3m5yz0qe84inz.png" alt="Image description" width="800" height="243"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Okay…before I get too winded here, stay tuned for &lt;a href="https://dev.to/aws-builders/part-2-pipeline-fun-with-aws-26cl"&gt;Part 2&lt;/a&gt; of this tutorial in which we dive into retrieving the region ID, IAM creds, and start building out our pipeline! See ya there!&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>elasticbeanstalk</category>
      <category>codepipeline</category>
      <category>aws</category>
    </item>
    <item>
      <title>AWS Certified SAA Study Guide: Services &amp; DevTools To Know</title>
      <dc:creator>Katoria H.</dc:creator>
      <pubDate>Sat, 08 Jul 2023 01:25:36 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-certified-saa-study-guide-services-tools-you-must-absolutely-understand-1f43</link>
      <guid>https://forem.com/aws-builders/aws-certified-saa-study-guide-services-tools-you-must-absolutely-understand-1f43</guid>
      <description>&lt;p&gt;Hey there AWS Enthusiasts! This blog is long overdue, but as promised, I wanted to walk you all through some very helpful tips if you’re preparing for the AWS Certified Solutions Architect - Associate Exam. This comprehensive study guide will provide helpful tips on areas such as High Availability, Fault Tolerance, Encryption, Resilience, and Security, as well as Compute, Networking, Storage, Database AWS services, and AWS deployment and management services. &lt;/p&gt;

&lt;p&gt;Whether you have prior AWS experience or you're new to the platform, following this guide and dedicating time to studying and practicing will greatly enhance your chances of success. Remember to supplement your studies with official AWS documentation, whitepapers, and practical hands-on experience. Good luck with your exam preparation!&lt;/p&gt;

&lt;p&gt;I’ll start off by being 100% honest here and mention that I did not study for the FULL four (4) weeks that I originally allotted, which is NOT what you should do! Instead, I studied for roughly 2 weeks, and this is primarily because of my experience with AWS and having the confidence to just go for it, and it worked in my favor - this time! However, I wanted to give you all a thorough breakdown of what my original study plan resembled, in hopes that this may help those who are struggling as to where to start! Soooo….here we go!&lt;/p&gt;

&lt;p&gt;BEFORE you do anything, be sure to download the AWS SAA OFFICIAL Exam Guide found &lt;a href="https://d1.awsstatic.com/training-and-certification/docs-sa-assoc/AWS-Certified-Solutions-Architect-Associate_Exam-Guide.pdf"&gt;here&lt;/a&gt;!!!&lt;/p&gt;

&lt;p&gt;For starters, &lt;em&gt;my&lt;/em&gt; general rule when studying for certifications is to ONLY use a total of three (3) resources (whitepapers are considered a stand-alone doc to me, and so the several that I read were not included in my total count) as to not overwhelm myself with a ton of info. The resources that I leveraged are as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;a href="https://learn.cantrill.io/"&gt;Adrian Cantrill’s SAA Course&lt;/a&gt; (as a refresher)&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://tutorialsdojo.com/"&gt;Tutorials Dojo Practice Exams&lt;/a&gt; (Review Mode &amp;amp; Section-Based Quizzes x2)&lt;/li&gt;
&lt;li&gt;&lt;a href="https://digitalcloud.training/category/aws-cheat-sheets/aws-solutions-architect-associate/"&gt;Neal Davis’s AWS SAA Cheat Sheet ONLY&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Part 1 - For the Newbies&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Now, if you’re totally a newbie to AWS and you’re stuck as to where to start, check out this comprehensive guide below, which is broken down into 10-12 weeks of study time. If you also intend to leverage practice exams, I would recommend taking at least 6-8 of them, each twice. The Tutorials Dojo exams will challenge you for sure, but they are absolutely WORTH it!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 1-2&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;AWS Fundamentals&lt;/strong&gt;&lt;/em&gt;: Start by learning the basics of AWS services, including EC2, S3, IAM, VPC, and RDS. Understand the core concepts and get hands-on experience through AWS &lt;a href="https://www.wellarchitectedlabs.com/"&gt;Well-Architected Labs&lt;/a&gt;. As a general note, labs that are categorized as ‘100’ are introductory, and highly recommended if you lack practical hands-on experience. &lt;/p&gt;
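&lt;p&gt;If you set up the AWS CLI for the labs, a few read-only commands are a low-risk way to confirm your credentials and defaults are working before you build anything (nothing below modifies your account):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws sts get-caller-identity
aws s3 ls
aws ec2 describe-instances --output table
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;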

&lt;p&gt;&lt;strong&gt;Week 3-4&lt;/strong&gt;: &lt;strong&gt;Optional, but recommended&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Certified Cloud Practitioner&lt;/em&gt;&lt;/strong&gt;: Prepare for and pass the AWS Certified Cloud Practitioner exam. This certification will provide a super solid foundation for further AWS knowledge, and will cover the basis for what you will be diving into starting in Week 5 of your learning path. If you already have the certification or choose not to attempt the cert, you should definitely try to focus on completing more labs and continuing with practice exams in &lt;strong&gt;review mode&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 5-6&lt;/strong&gt;:&lt;br&gt;
Focus on understanding core services and their features, and ways that they would be used in a production environment. If necessary, revisit the level 100 labs that may have been challenging for you. At this point, you should be starting the section-based quizzes if leveraging Tutorials Dojo.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 7-8&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Security&lt;/em&gt;&lt;/strong&gt;: I would highly recommend that you take it up a notch with the labs and focus on levels '200-300', which are intermediate. At this point in your learning journey, you should be diving into IAM, AWS Organizations, AWS Key Management Service (KMS), AWS Web Application Firewall (WAF), AWS Shield, AWS Secrets Manager, etc. You need to thoroughly understand best practices for securing AWS resources and when to use certain technologies within your environment when designing for resilience. As a side note, definitely understand what it means for an architecture to be “loosely” coupled. You should also know the key differences between &lt;strong&gt;&lt;em&gt;monolithic architectures vs microservices&lt;/em&gt;&lt;/strong&gt; - You’ll thank me later 😀! The exam will absolutely test your knowledge on this!&lt;/p&gt;
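&lt;p&gt;To get hands-on with KMS and Secrets Manager, a minimal sketch might look like the following (the key description, secret name, and secret value are placeholders; delete the resources afterward to avoid charges):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create a customer-managed KMS key (the key ID is returned in the output)
aws kms create-key --description "Study lab key"

# Store a secret in Secrets Manager
aws secretsmanager create-secret --name lab/db-password --secret-string 'example-password'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;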

&lt;p&gt;&lt;strong&gt;Breaking Down Resilient Architectures Even Further&lt;/strong&gt;:&lt;br&gt;
Let’s side-track here and talk about the importance of designing for high availability, scalability, and resilience, all whilst ensuring your workloads are secure. Take a look at this 3-Tier Architecture that I created for AWS for one of my previous tutorials. &lt;strong&gt;What could I have done differently? What might I be missing? How would you design the same or similar architecture as a Solutions Architect?&lt;/strong&gt; These are thoughts that should be flowing through your brain as you’re studying for the cert, and so you MUST be familiar and comfortable with architectural design:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IbzIw0xJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vzob0mob9tba2kszgrai.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IbzIw0xJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vzob0mob9tba2kszgrai.png" alt="Image description" width="800" height="714"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 9-10&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Compute Services&lt;/em&gt;&lt;/strong&gt;: Take a deeper dive on Amazon EC2, AWS Lambda, and AWS Batch. Understand instance types, scaling options, serverless computing, and containerization with AWS. I’ll repeat this once more - UNDERSTAND SCALING OPTIONS! Regarding containerization, you should definitely understand AWS ECS &amp;amp; ECR, and use cases for leveraging AWS Fargate. While you’re at it, &lt;a href="https://www.youtube.com/watch?v=8BtmzVG_GdI"&gt;take a look&lt;/a&gt; at understanding the basics of what a container is, and dive into a super-simple &lt;a href="https://www.docker.com/101-tutorial/"&gt;Docker tutorial&lt;/a&gt; as it will HELP in understanding containerization concepts!&lt;/p&gt;
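&lt;p&gt;For quick serverless practice from the CLI, you can list your Lambda functions and invoke one directly (the function name below is a placeholder; the response payload lands in response.json):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws lambda list-functions
aws lambda invoke --function-name my-function response.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;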

&lt;p&gt;&lt;strong&gt;Networking Services&lt;/strong&gt;: Some of the services to keep in mind for networking are Amazon VPC (think about subnetting, elastic IPs, security group rules, etc), AWS Direct Connect, Amazon Route 53, and AWS Global Accelerator. Understand networking concepts, traffic routing, and hybrid connectivity. I can’t emphasize this enough: you absolutely need to understand networking and the OSI model in general. If you have a general understanding of each layer of the &lt;strong&gt;OSI model&lt;/strong&gt;, it will be much easier for you to apply cloud networking concepts to your studying. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 11-12&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Storage and Database Services&lt;/em&gt;&lt;/strong&gt;: I can tell you all right now that understanding S3 is a game changer! You need to understand encryption techniques, storage options, access control, and so forth. You CANNOT skip the basics of S3. You have to learn this service in detail! You should also keep in mind other services such as Amazon EBS, Amazon DynamoDB, &amp;amp; Amazon RDS (take a deep dive on &lt;strong&gt;when&lt;/strong&gt; an organization should leverage RDS and consider cost optimization as well as other tradeoffs). Understand data durability, backups, and replication. Let me repeat that last line…UNDERSTAND backups and replication - This means you need to understand the basics of Disaster Recovery, failovers, which AWS services provide the best optimization for data restoration, etc. A VERY helpful resource can be found &lt;a href="https://docs.aws.amazon.com/whitepapers/latest/disaster-recovery-workloads-on-aws/disaster-recovery-options-in-the-cloud.html"&gt;here&lt;/a&gt;.&lt;/p&gt;
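&lt;p&gt;As a small hands-on sketch for the S3 basics (the bucket name is a placeholder), enabling versioning is a one-liner, and it’s also a prerequisite for cross-region replication:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3api put-bucket-versioning --bucket my-bucket --versioning-configuration Status=Enabled
aws s3api get-bucket-versioning --bucket my-bucket
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;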

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Management Services&lt;/em&gt;&lt;/strong&gt;: You should be practicing with AWS CloudFormation, AWS CloudWatch, AWS Systems Manager, and AWS Trusted Advisor at a minimum at this point during your studies. Understand infrastructure as code, monitoring, and management tools. You should also continue working on the labs. Challenge yourself to work on the &lt;strong&gt;level 400&lt;/strong&gt; (Advanced) labs only during these few weeks. Trust me, they will be worth it to get deeper hands-on experience with development tools!&lt;/p&gt;
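&lt;p&gt;For CloudFormation and CloudWatch practice, two commands worth knowing are template validation and creating a basic alarm (the template path, alarm name, and threshold below are examples, not from a real workload):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation validate-template --template-body file://template.yaml

aws cloudwatch put-metric-alarm --alarm-name cpu-high --namespace AWS/EC2 --metric-name CPUUtilization --statistic Average --period 300 --threshold 80 --comparison-operator GreaterThanThreshold --evaluation-periods 2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;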

&lt;p&gt;&lt;strong&gt;Review and Practice&lt;/strong&gt;: Recap all the topics covered in the previous weeks and take multiple practice exams. Challenge yourself to sit in a timed, 2-hour setting when taking the practice exams. You should start timing yourself around Week 8 during your learning journey to mentally prepare for the actual exam setting! &lt;/p&gt;
&lt;h2&gt;
  
  
  &lt;strong&gt;Part 2 - For those with AWS hands-on experience&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Now, if you’re someone with some (or extensive) hands-on AWS experience, I would recommend that you take a look at this 4-6 week condensed guide below, as this may work best for you:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 1&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Well-Architected Framework&lt;/em&gt;&lt;/strong&gt;: Review the six pillars of the Well-Architected Framework in-depth. You should have a thorough understanding of each pillar at this point.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Global Infrastructure&lt;/em&gt;&lt;/strong&gt;: Take a deeper dive on AWS Regions, Availability Zones (AZs), and edge locations. Understand the concepts of fault tolerance, high availability, and data durability. You should also be practicing daily with the exams and Well-Architected labs, level 400 (Advanced). Dive deeper into performance efficiency and reliability from the Well-Architected Framework.&lt;/p&gt;
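&lt;p&gt;A quick way to internalize the Region/AZ model is to query it yourself (both commands are read-only; the region is an example):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 describe-regions
aws ec2 describe-availability-zones --region us-east-1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;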

&lt;p&gt;&lt;em&gt;&lt;strong&gt;AWS Identity and Access Management (IAM)&lt;/strong&gt;&lt;/em&gt;: You most likely have a strong understanding of IAM, including users, groups, roles, policies, and permissions. Take it up a notch and think about IAM concepts being applied at the Organizational level or for users that require least privilege. It should be much easier for you to practice creating IAM users and managing access control. You should also try to do this via the CLI and not just within the AWS Management Console, such as the example below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws iam create-group --group-name Admins
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
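&lt;p&gt;Building on that, you can attach a managed policy to the group and add a user to it (the user name is a placeholder; in a real account you’d pick a least-privilege policy rather than AdministratorAccess):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws iam attach-group-policy --group-name Admins --policy-arn arn:aws:iam::aws:policy/AdministratorAccess
aws iam create-user --user-name jdoe
aws iam add-user-to-group --group-name Admins --user-name jdoe
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;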



&lt;p&gt;&lt;strong&gt;Week 2&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;AWS Elastic Compute Cloud (EC2)&lt;/strong&gt;&lt;/em&gt;: Dive deeper into EC2 and understand best practices for storage options and pricing models. Cost Optimization should be reviewed heavily during this week, along with comparisons for S3 vs EFS, vs FSx (at a minimum). Understand how to configure EC2 instances for high availability and fault tolerance via the CLI.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Amazon Virtual Private Cloud (VPC)&lt;/em&gt;&lt;/strong&gt;: Review VPC components, such as subnets, route tables, security groups, and network ACLs. Learn how to design secure and scalable VPC architectures. You should be heavily focused on Operational Excellence, Security, &amp;amp; Reliability from the AWS Well Architected Framework. &lt;/p&gt;
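&lt;p&gt;To practice the VPC building blocks from the CLI, a minimal sketch looks like this (the CIDR blocks, VPC ID, and AZ are examples; use the VPC ID returned by the first command in the second):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 create-vpc --cidr-block 10.0.0.0/16
aws ec2 create-subnet --vpc-id vpc-0abc123 --cidr-block 10.0.1.0/24 --availability-zone us-east-1a
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;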

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Amazon Simple Storage Service (S3)&lt;/em&gt;&lt;/strong&gt;: Explore S3 features, including bucket policies, versioning, encryption, and cross-region replication. Understand S3 security and data consistency models. Check out the AWS CLI example below of what it looks like to specify SSE (Server Side Encryption) when uploading objects for an S3 bucket:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3api put-object --bucket DOC-EXAMPLE-BUCKET1 --key object-key-name --server-side-encryption AES256 --body file-path
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Week 3&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Auto Scaling&lt;/em&gt;&lt;/strong&gt;: Familiarize yourself with Auto Scaling concepts and learn how to create and configure Auto Scaling groups. You should have deep knowledge of auto scaling plans, launch configurations, instance sizing/capacity thresholds, and a general understanding of how Auto Scaling enhances availability and scalability.&lt;/p&gt;
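&lt;p&gt;Creating an Auto Scaling group from the CLI ties several of these concepts together (the launch template name and subnet IDs are placeholders; note that the group spans two subnets in different AZs for availability):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws autoscaling create-auto-scaling-group --auto-scaling-group-name web-asg --launch-template LaunchTemplateName=web-template --min-size 2 --max-size 6 --desired-capacity 2 --vpc-zone-identifier "subnet-0aaa,subnet-0bbb"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;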

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Elastic Load Balancing (ELB)&lt;/em&gt;&lt;/strong&gt;: Study the different types of ELBs, including Classic Load Balancer, Application Load Balancer, and Network Load Balancer. NOTE: You absolutely need to understand the load balancing algorithms AWS uses (for example, round robin vs least outstanding requests for ALBs). If you do not understand application security, scalability, availability, and performance, you will have a tough time deciphering which load balancer option may be best when reviewing the exam questions. You need to thoroughly understand how to configure and optimize load balancing. &lt;/p&gt;
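&lt;p&gt;For hands-on ELB practice, creating an Application Load Balancer across two subnets is a good starting point (the name and subnet IDs are placeholders; swap the type for network to compare an NLB):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws elbv2 create-load-balancer --name web-alb --type application --subnets subnet-0aaa subnet-0bbb
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;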

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Relational Database Service (RDS)&lt;/em&gt;&lt;/strong&gt;: Become more familiar with RDS database engines, Multi-AZ deployments, backup and restore options, and read replicas. Learn about encryption at rest and in transit. As a refresher, take a look at &lt;a href="https://wellarchitectedlabs.com/reliability/200_labs/200_bidirectional_replication_for_s3/"&gt;this&lt;/a&gt; level 200 Well Architected Lab related to &lt;strong&gt;bi-directional cross-region replication&lt;/strong&gt;. Regarding encryption, think deeply about when and what to encrypt. Think about encryption responsibilities. Think about encryption do’s and don'ts. I would highly recommend that you review how to encrypt an RDS instance, and that you understand the purpose of KMS keys (think about customer managed vs AWS managed keys for RDS).&lt;/p&gt;
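&lt;p&gt;As a sketch of encryption at rest for RDS (the identifiers, password, and KMS key alias are placeholders), note that encryption must be chosen at creation time; you can’t simply toggle it on for an existing instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws rds create-db-instance --db-instance-identifier lab-db --db-instance-class db.t3.micro --engine mysql --master-username admin --master-user-password 'example-password' --allocated-storage 20 --multi-az --storage-encrypted --kms-key-id alias/my-cmk
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;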

&lt;p&gt;&lt;strong&gt;Week 4&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Route 53&lt;/em&gt;&lt;/strong&gt;: The fundamentals of Route 53 should be top of mind if you have experience with creating websites that use customized endpoints, etc. You must understand DNS management, routing policies, health checks, and domain registration at a minimum. Think about various scenarios for designing resilient and highly available architectures using Route 53.&lt;/p&gt;
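&lt;p&gt;Health checks are a good CLI exercise for Route 53 (the domain, path, and thresholds below are examples; the caller reference just needs to be a unique string):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws route53 create-health-check --caller-reference my-check-001 --health-check-config Type=HTTP,FullyQualifiedDomainName=example.com,Port=80,ResourcePath=/health,RequestInterval=30,FailureThreshold=3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;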

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS Lambda&lt;/em&gt;&lt;/strong&gt;: Study serverless computing and Lambda functions. Learn how to create and configure Lambda functions, and understand their integration with other AWS services. I would also add in the purpose of APIs when working with Lambda functions. Practicing with Lambda functions, SNS, &amp;amp; SQS will be 100% beneficial when it comes time for the exam!&lt;/p&gt;
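&lt;p&gt;To see the Lambda + SNS integration end to end, a rough sketch is below (the ARNs are placeholders; in practice the function also needs a resource policy allowing SNS to invoke it, added via aws lambda add-permission):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws sns create-topic --name lab-topic
aws sns subscribe --topic-arn arn:aws:sns:us-east-1:123456789012:lab-topic --protocol lambda --notification-endpoint arn:aws:lambda:us-east-1:123456789012:function:my-function
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;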

&lt;p&gt;&lt;strong&gt;&lt;em&gt;AWS CloudFormation&lt;/em&gt;&lt;/strong&gt;: Dive deeper into infrastructure as code and learn how to use CloudFormation templates to automate resource provisioning and management. You shouldn’t just focus on the speed that CloudFormation provides when provisioning resources. You should also understand the management and governance aspects of using it - &lt;strong&gt;How does it help with lowering costs? Does it help with innovation? How does it contribute to operational excellence?&lt;/strong&gt;&lt;/p&gt;
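&lt;p&gt;A typical CloudFormation workflow from the CLI is deploy, then inspect the stack (the template path and stack name are examples; the capabilities flag is only needed when the template creates IAM resources):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation deploy --template-file template.yaml --stack-name lab-stack --capabilities CAPABILITY_NAMED_IAM
aws cloudformation describe-stacks --stack-name lab-stack
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;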

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; You should also be taking at least 1-2 practice exams per day (at a minimum) in EXAM Mode. Because you have general or extensive AWS experience, you should be challenging yourself for the full 4-6 weeks when taking the practice exams. No shortcuts!&lt;/p&gt;

&lt;p&gt;I genuinely hope that this guide has been helpful for those that are looking to take the AWS Certified Solutions Architect - Associate exam. Remember, experience trumps everything, so if you’re not actually practicing in the labs as suggested or maybe writing blogs about your learning journey, the exam WILL NOT be a cake walk. Practice, then practice some more, and oh yes, don’t forget to practice!! You got this 🎉!&lt;/p&gt;

&lt;p&gt;Resources:&lt;br&gt;
&lt;a href="https://d1.awsstatic.com/training-and-certification/docs-sa-assoc/AWS-Certified-Solutions-Architect-Associate_Exam-Guide.pdf"&gt;Official AWS SAA Exam Guide&lt;/a&gt;&lt;br&gt;
&lt;a href="https://learn.cantrill.io/"&gt;Adrian Cantrill’s SAA Course&lt;/a&gt;&lt;br&gt;
&lt;a href="https://tutorialsdojo.com/"&gt;Tutorials Dojo AWS SAA Practice Exams &lt;/a&gt;&lt;br&gt;
&lt;a href="https://digitalcloud.training/category/aws-cheat-sheets/aws-solutions-architect-associate/"&gt;Neal Davis’s AWS SAA Cheat Sheet &lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.wellarchitectedlabs.com/"&gt;AWS Well-Architected Labs&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.youtube.com/watch?v=8BtmzVG_GdI"&gt;What is a Container?&lt;/a&gt; by F5 Dev Central&lt;br&gt;
&lt;a href="https://www.docker.com/101-tutorial/"&gt;Docker 101 Tutorial&lt;/a&gt; by Docker&lt;br&gt;
&lt;a href="https://wellarchitectedlabs.com/reliability/200_labs/200_bidirectional_replication_for_s3/"&gt;Well Architected Lab&lt;/a&gt;: Bi-Directional and Cross Regional Replication&lt;/p&gt;

</description>
      <category>aws</category>
      <category>development</category>
      <category>tooling</category>
      <category>solutionsarchitect</category>
    </item>
  </channel>
</rss>
