<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Adrian Mudzwiti </title>
    <description>The latest articles on Forem by Adrian Mudzwiti  (@adrianm).</description>
    <link>https://forem.com/adrianm</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F458747%2F4b50a7b2-79fe-4fdb-bbc2-2f60110f27c1.jpeg</url>
      <title>Forem: Adrian Mudzwiti </title>
      <link>https://forem.com/adrianm</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/adrianm"/>
    <language>en</language>
    <item>
      <title>Serverless FastAPI Deployment: Actions Speak Louder Than Words</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Sun, 30 Nov 2025 12:54:11 +0000</pubDate>
      <link>https://forem.com/aws-builders/serverless-fastapi-deployment-actions-speak-louder-than-words-1k8i</link>
      <guid>https://forem.com/aws-builders/serverless-fastapi-deployment-actions-speak-louder-than-words-1k8i</guid>
      <description>&lt;p&gt;The final chapter of the Serverless FastAPI app tetralogy has arrived, we started with developing our app locally, then we wrote tests and in the last chapter we used native tooling and services within AWS to secure our app from bad actors.&lt;/p&gt;

&lt;p&gt;We've reached a fork in the road: we can continue to manually deploy our app by running commands locally, or we can take a more traditional approach and automatically test and deploy our app using a CI/CD pipeline.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;I initially wanted to use Azure Pipelines, that's what I have been using at work daily for the past 6 years. I appreciate Azure DevOps, lovely platform.&lt;/p&gt;

&lt;p&gt;- Cristiano Ronaldo voice (Infamous rant about nothing changing at Man Utd)&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;To change things up, I then thought: why not use GitHub Actions? The infrastructure and application code already live in GitHub.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;GitHub Actions is a continuous integration and continuous delivery (CI/CD) platform that allows you to automate your build, test, and deployment pipeline. You can create workflows that build and test every pull request to your repository, or deploy merged pull requests to production.&lt;/p&gt;

&lt;p&gt;GitHub Actions goes beyond just DevOps and lets you run workflows when other events happen in your repository. For example, you can run a workflow to automatically add the appropriate labels whenever someone creates a new issue in your repository.&lt;/p&gt;

&lt;p&gt;- GitHub&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h1&gt;
  
  
  Adding GitHub as an identity provider
&lt;/h1&gt;

&lt;p&gt;We have identified GitHub Actions as the platform for our CI/CD pipeline. Before we begin, we need to set up an authenticated connection between GitHub and AWS. We will achieve this by setting up GitHub as an identity provider within &lt;strong&gt;AWS IAM&lt;/strong&gt;.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Navigate towards &lt;strong&gt;IAM&lt;/strong&gt; in the AWS management console, select &lt;strong&gt;Identity providers&lt;/strong&gt; under &lt;strong&gt;Access management&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Add provider&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;OpenID Connect&lt;/strong&gt; under &lt;strong&gt;Provider type&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Enter &lt;strong&gt;&lt;a href="https://token.actions.githubusercontent.com" rel="noopener noreferrer"&gt;https://token.actions.githubusercontent.com&lt;/a&gt;&lt;/strong&gt; under &lt;strong&gt;Provider URL&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Enter &lt;strong&gt;sts.amazonaws.com&lt;/strong&gt; under &lt;strong&gt;Audience&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Add provider&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Creating a custom IAM policy
&lt;/h1&gt;

&lt;p&gt;Next up is defining the permissions for an IAM role. We'll go for an approach that best aligns with the principle of least privilege.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Select &lt;strong&gt;Policies&lt;/strong&gt; under &lt;strong&gt;Access management&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Create policy&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Policy editor&lt;/strong&gt;, select the &lt;strong&gt;JSON&lt;/strong&gt; button.&lt;/li&gt;
&lt;li&gt;Replace the contents within the editor with the below JSON:


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Next&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Enter &lt;strong&gt;GitHubActionsDeploymentPolicy&lt;/strong&gt; under the &lt;strong&gt;Policy name&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Create policy&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;
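&lt;p&gt;The exact policy JSON for step 4 depends on what your stack deploys. Since the CDK deploys by assuming the roles created at bootstrap time, a least-privilege sketch only needs &lt;strong&gt;sts:AssumeRole&lt;/strong&gt; on those roles; the account ID below is a placeholder you must replace with your own:&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AssumeCDKBootstrapRoles",
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::111111111111:role/cdk-*"
    }
  ]
}
```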

&lt;h1&gt;
  
  
  Creating an IAM role
&lt;/h1&gt;

&lt;blockquote&gt;
&lt;p&gt;(IAM) roles are entities you create and assign specific permissions to that allow trusted identities such as workforce identities and applications to perform actions in AWS. When your trusted identities assume IAM roles, they are granted only the permissions scoped by those IAM roles. Using IAM roles is a security best practice because roles provide temporary credentials that do not need to be rotated.&lt;/p&gt;

&lt;p&gt;- Amazon Web Services&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ol&gt;
&lt;li&gt;Select &lt;strong&gt;Roles&lt;/strong&gt; under &lt;strong&gt;Access management&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Create role&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Web identity&lt;/strong&gt; under &lt;strong&gt;Trusted entity&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;token.actions.githubusercontent.com&lt;/strong&gt; as the identity provider under &lt;strong&gt;Web identity&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;sts.amazonaws.com&lt;/strong&gt; as the audience.&lt;/li&gt;
&lt;li&gt;The input box for &lt;strong&gt;GitHub organization&lt;/strong&gt; also supports personal GitHub accounts, feel free to enter your GitHub username.&lt;/li&gt;
&lt;li&gt;Enter the name of the GitHub repository you are using for this app. Select &lt;strong&gt;Next&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Search for and select the policy we created earlier ("GitHubActionsDeploymentPolicy"). Select &lt;strong&gt;Next&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Provide a name for the role, e.g. &lt;strong&gt;GitHub-Actions-Assume-Role&lt;/strong&gt;.
&lt;/li&gt;
&lt;li&gt;Scroll to the bottom of the page and select &lt;strong&gt;Create role&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Copy the ARN of the role to your clipboard; you'll need it later.&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Creating our first GitHub Actions workflow
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;Navigate towards your GitHub repo.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;settings&lt;/strong&gt;, select &lt;strong&gt;secrets and variables&lt;/strong&gt;, select &lt;strong&gt;Actions&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;New repository secret&lt;/strong&gt;, enter &lt;strong&gt;ROLE_TO_ASSUME&lt;/strong&gt; as the name and enter the role ARN you copied earlier as the &lt;strong&gt;Secret&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Open your project in Visual Studio Code (or your IDE of choice) and create a new branch.&lt;/li&gt;
&lt;li&gt;Create the following directories and files inside the project directory:


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;/li&gt;
&lt;li&gt;Edit the &lt;strong&gt;deploy.yml&lt;/strong&gt; file and add the following code:


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;/li&gt;
&lt;/ol&gt;
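&lt;p&gt;The files live under the standard &lt;strong&gt;.github/workflows&lt;/strong&gt; directory at the repo root. A sketch of what &lt;strong&gt;deploy.yml&lt;/strong&gt; can look like; the Python version, region, dependency install steps and &lt;strong&gt;iac&lt;/strong&gt; working directory are assumptions to adjust for your project:&lt;/p&gt;

```yaml
name: Deploy Player FC API

on:
  pull_request:
    branches: [main]
  push:
    branches: [main]

permissions:
  id-token: write   # required for OIDC authentication to AWS
  contents: read

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install -r requirements.txt
      - run: pytest

  deploy:
    runs-on: ubuntu-latest
    needs: test   # only deploy if the tests pass
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.ROLE_TO_ASSUME }}
          aws-region: af-south-1
      - run: pip install -r requirements.txt
      - run: npm install -g aws-cdk
      - run: cdk deploy --require-approval never
        working-directory: iac
```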

&lt;p&gt;At a high level, the &lt;strong&gt;deploy.yml&lt;/strong&gt; file contains two jobs: one to run tests and one to deploy the app. The &lt;strong&gt;Deploy&lt;/strong&gt; job runs only if the preceding &lt;strong&gt;Test&lt;/strong&gt; job passes. The &lt;strong&gt;role-to-assume&lt;/strong&gt; input references the ARN we stored as a repository secret earlier on.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Commit the changes and publish the new branch.&lt;/li&gt;
&lt;li&gt;Navigate towards your repo in GitHub, select the &lt;strong&gt;Pull requests&lt;/strong&gt; tab, select &lt;strong&gt;Compare &amp;amp; pull request&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Create pull request&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Navigate towards the &lt;strong&gt;Actions&lt;/strong&gt; tab; you should see the workflow running. You can review the logs to follow the progress. If the run succeeds, you'll see two endpoint URLs.&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Destroying the CDK stack via GitHub Actions
&lt;/h1&gt;

&lt;p&gt;We've seen how our app is deployed using the &lt;strong&gt;deploy.yml&lt;/strong&gt; workflow. We'll now create a workflow to delete the CDK stack; it prompts the user for confirmation before initiating the stack deletion.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Navigate back towards your IDE, select the &lt;strong&gt;destroy.yml&lt;/strong&gt; file and add the following code:


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;/li&gt;
&lt;li&gt;Commit the changes and push the changes.&lt;/li&gt;
&lt;li&gt;Navigate towards the &lt;strong&gt;Actions&lt;/strong&gt; tab in your browser. You should currently see the &lt;strong&gt;Deploy Player FC API&lt;/strong&gt; workflow that previously ran; navigate to the main &lt;strong&gt;Actions&lt;/strong&gt; page to see all workflows.&lt;/li&gt;
&lt;li&gt;You'll see the newly created &lt;strong&gt;Destroy Player FC CDK Stack&lt;/strong&gt; workflow, select the workflow, select the &lt;strong&gt;Run workflow&lt;/strong&gt; dropdown, enter &lt;strong&gt;y&lt;/strong&gt; to initiate the destruction of the CDK stack.&lt;/li&gt;
&lt;/ol&gt;
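&lt;p&gt;A sketch of what &lt;strong&gt;destroy.yml&lt;/strong&gt; can look like, using a manually triggered &lt;strong&gt;workflow_dispatch&lt;/strong&gt; input for the confirmation prompt; region and the &lt;strong&gt;iac&lt;/strong&gt; working directory are assumptions:&lt;/p&gt;

```yaml
name: Destroy Player FC CDK Stack

on:
  workflow_dispatch:
    inputs:
      confirm:
        description: "Type y to confirm stack deletion"
        required: true

permissions:
  id-token: write
  contents: read

jobs:
  destroy:
    runs-on: ubuntu-latest
    # Only run when the user explicitly confirms with "y"
    if: ${{ github.event.inputs.confirm == 'y' }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.ROLE_TO_ASSUME }}
          aws-region: af-south-1
      - run: pip install -r requirements.txt
      - run: npm install -g aws-cdk
      - run: cdk destroy --force
        working-directory: iac
```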

&lt;p&gt;We now have a complete CI/CD pipeline that automatically tests our FastAPI app and deploys it to AWS. You can use this foundation to extend the workflows further or add additional deployment stages.&lt;/p&gt;

</description>
      <category>awscdk</category>
      <category>githubactions</category>
      <category>fastapi</category>
      <category>python</category>
    </item>
    <item>
      <title>Serverless FastAPI Security: Unlocked Doors Invite Unwanted Guests</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Thu, 24 Jul 2025 20:28:27 +0000</pubDate>
      <link>https://forem.com/aws-builders/serverless-fastapi-security-unlocked-doors-invite-unwanted-guests-5fnl</link>
      <guid>https://forem.com/aws-builders/serverless-fastapi-security-unlocked-doors-invite-unwanted-guests-5fnl</guid>
      <description>&lt;p&gt;In &lt;a href="https://dev.to/aws-builders/serverless-fastapi-development-building-player-fc-api-on-aws-3735"&gt;part 1&lt;/a&gt; and &lt;a href="https://dev.to/aws-builders/serverless-fastapi-testing-use-moto-and-just-mock-it-2p35"&gt;part 2&lt;/a&gt; of our &lt;strong&gt;Serverless FastAPI&lt;/strong&gt; series, we covered the development and testing aspects of our FastAPI app. Now we’ll shift our attention to security.&lt;/p&gt;

&lt;p&gt;Security shouldn’t be an afterthought. However, the theme of this series has been to get you up and running in a beginner-friendly manner, whilst also exposing you to shortcomings and approaches in an organic way.&lt;/p&gt;

&lt;p&gt;Up until now, our &lt;em&gt;Lambda Function URL&lt;/em&gt; has been publicly accessible, giving every opportunist, every bad actor and their collective an open pass at an unprotected API, provided they can find the endpoint URL.&lt;/p&gt;

&lt;p&gt;The thought of an unprotected &lt;em&gt;API&lt;/em&gt; in 2025 is starting to sound like a bad low-budget movie that went straight to DVD back in the 2000s. As the saying goes in the UK, it’s not looking good brav.&lt;/p&gt;

&lt;p&gt;We’ll need to secure our &lt;em&gt;API&lt;/em&gt; so that it’s not abused or compromised. AWS provides several complementary security services that we can layer together to lock down access to our API.&lt;/p&gt;

&lt;p&gt;We’ll implement security in two phases:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Phase 1: AWS IAM Authentication&lt;/strong&gt; First, we’ll secure our existing Lambda Function URL by changing the authentication type from &lt;em&gt;None&lt;/em&gt; to &lt;em&gt;AWS_IAM&lt;/em&gt;, which restricts access to authenticated IAM principals only.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Phase 2: OAuth 2.0 with API Gateway&lt;/strong&gt; Then, we’ll implement a more sophisticated OAuth 2.0 flow using API Gateway and Amazon Cognito, suitable for client applications that need programmatic access.&lt;/p&gt;

&lt;h2&gt;
  
  
  Components that we’ll leverage for full OAuth 2.0 implementation:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Cognito:&lt;/strong&gt; Handles user authentication and authorization for your web and mobile apps.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Resource Server:&lt;/strong&gt; Defines API scopes that control what operations clients can perform.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API Gateway:&lt;/strong&gt; HTTP API that intercepts requests, validates JWT tokens and forwards authorized requests to Lambda.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JWT Authorizer:&lt;/strong&gt; Built-in API Gateway component that validates JWT tokens against Cognito’s issuer and audience claims.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Below is a high-level sequence diagram that illustrates the OAuth 2.0 flow a client needs to complete to interact with our fully secured API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp2h2mxlb6hdn9x0uel8w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp2h2mxlb6hdn9x0uel8w.png" alt="OAuth 2.0 client credentials flow"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Revisiting a familiar CDK stack
&lt;/h2&gt;

&lt;p&gt;From the root of the project, activate the virtual environment, navigate towards the &lt;strong&gt;iac&lt;/strong&gt; directory and open the &lt;strong&gt;iac_stack.py&lt;/strong&gt; file.&lt;/p&gt;

&lt;p&gt;We need to add a few imports for constructs that we will leverage as we build our security layer. Ensure that the imports look like the below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
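&lt;p&gt;A sketch of the import block, assuming aws-cdk-lib v2.109 or later (where the HTTP API constructs graduated out of the alpha module); your stack may already import some of these:&lt;/p&gt;

```python
from aws_cdk import (
    Stack,
    RemovalPolicy,
    aws_lambda as _lambda,
    aws_cognito as cognito,
    aws_apigatewayv2 as apigwv2,
)
from aws_cdk.aws_apigatewayv2_authorizers import HttpJwtAuthorizer
from aws_cdk.aws_apigatewayv2_integrations import HttpLambdaIntegration
from constructs import Construct
```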


&lt;h2&gt;
  
  
  Updating the Lambda Function URL authentication
&lt;/h2&gt;

&lt;p&gt;There’s beauty in simplicity. Let’s start by securing our API; we can achieve this by changing the authentication type from &lt;em&gt;None&lt;/em&gt; to &lt;em&gt;AWS_IAM&lt;/em&gt;.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
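&lt;p&gt;A minimal sketch of the change inside the stack; &lt;strong&gt;lambda_function&lt;/strong&gt; is a stand-in for however your Lambda construct is named, and the usual &lt;strong&gt;aws_lambda as _lambda&lt;/strong&gt; import alias is assumed:&lt;/p&gt;

```python
# Require SigV4-signed (IAM-authenticated) requests instead of anonymous access
function_url = lambda_function.add_function_url(
    auth_type=_lambda.FunctionUrlAuthType.AWS_IAM,
)
```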


&lt;p&gt;Deploy this change with &lt;code&gt;cdk deploy&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Now, if we attempt to access our API endpoint via a browser or a REST API client like Postman, we’ll get a &lt;strong&gt;403&lt;/strong&gt; (Forbidden) response. Try it out!&lt;/p&gt;

&lt;h2&gt;
  
  
  Building the identity foundation with Amazon Cognito
&lt;/h2&gt;

&lt;p&gt;We are going to create an &lt;strong&gt;Amazon Cognito User Pool&lt;/strong&gt;, which is an OpenID Connect (OIDC) identity provider (IdP). Think of this as a guest list at an event, if you’re on the list, you’re allowed entry.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
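&lt;p&gt;A sketch of the user pool construct; the construct ID matches the pool name we look for later in the console, and the removal policy is a dev-only convenience:&lt;/p&gt;

```python
# The "guest list": a Cognito User Pool acting as our OIDC identity provider
user_pool = cognito.UserPool(
    self,
    "PlayerFCUserPool",
    user_pool_name="PlayerFCUserPool",
    removal_policy=RemovalPolicy.DESTROY,  # dev only: delete the pool with the stack
)
```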


&lt;h2&gt;
  
  
  Creating our authentication domain
&lt;/h2&gt;

&lt;p&gt;Next, we’ll create a &lt;strong&gt;User Pool Domain&lt;/strong&gt;, if a &lt;strong&gt;User Pool&lt;/strong&gt; is the guest list, then a &lt;strong&gt;User Pool domain&lt;/strong&gt; is like the venue address where guests check in. It’s the specific location where your application(s) will send users to authenticate.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
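&lt;p&gt;A sketch of the domain, reusing the user pool created in the previous section (assumed to be bound to &lt;strong&gt;user_pool&lt;/strong&gt;). The prefix must be unique within the region; &lt;strong&gt;playerfc&lt;/strong&gt; is an assumption that matches the token URL used later in this post:&lt;/p&gt;

```python
# The "venue address": a Cognito-hosted domain where clients request tokens
user_pool_domain = user_pool.add_domain(
    "PlayerFCDomain",
    cognito_domain=cognito.CognitoDomainOptions(domain_prefix="playerfc"),
)
```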


&lt;h2&gt;
  
  
  Setting access boundaries with the use of scopes
&lt;/h2&gt;

&lt;p&gt;Next up is enforcing boundaries with the use of &lt;strong&gt;scopes&lt;/strong&gt;. We have our guest list (User Pool) and our venue (User Pool Domain), but venues often have restricted areas, like a VVIP section cordoned off for elite members of society. That’s exactly what scopes do.&lt;/p&gt;

&lt;p&gt;Scopes define the permission level that clients can request, in our case, full read/write access to the API.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
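&lt;p&gt;A sketch of the scope definition; the scope name &lt;strong&gt;api.readwrite&lt;/strong&gt; is an assumption, chosen to line up with the scope requested in the token call later in this post:&lt;/p&gt;

```python
# One scope granting full read/write access to the API
readwrite_scope = cognito.ResourceServerScope(
    scope_name="api.readwrite",
    scope_description="Full read/write access to the Player FC API",
)
```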


&lt;h2&gt;
  
  
  Creating a Resource Server
&lt;/h2&gt;

&lt;p&gt;Time to create the &lt;strong&gt;Resource Server&lt;/strong&gt;. This is like officially registering your event with the venue’s management system. It tells the venue ‘this is our event, these are our VIP areas, and here are the access rules’, essentially linking your API to the permission system.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
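&lt;p&gt;A sketch of the resource server, attaching the scope defined in the previous section (assumed to be bound to &lt;strong&gt;readwrite_scope&lt;/strong&gt;); the &lt;strong&gt;players&lt;/strong&gt; identifier is an assumption:&lt;/p&gt;

```python
# Register the API with the permission system and attach our scope
resource_server = user_pool.add_resource_server(
    "PlayerFCResourceServer",
    identifier="players",  # clients request the scope as "players/api.readwrite"
    scopes=[readwrite_scope],
)
```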


&lt;h2&gt;
  
  
  Setting up machine-to-machine access
&lt;/h2&gt;

&lt;p&gt;We’re creating machine-to-machine access with a &lt;strong&gt;User Pool Client&lt;/strong&gt;. Think of this as backstage crew passes, these aren’t for the audience, but for the technical staff, sound engineers or equipment operators who need to access different areas to keep the event running.&lt;/p&gt;

&lt;p&gt;These credentials allow applications to authenticate themselves and request the access they need.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
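&lt;p&gt;A sketch of the app client, reusing the resource server and scope from the previous sections (assumed bound to &lt;strong&gt;resource_server&lt;/strong&gt; and &lt;strong&gt;readwrite_scope&lt;/strong&gt;); the client name matches the one we look for in the console later:&lt;/p&gt;

```python
# "Backstage crew passes": an app client for the OAuth 2.0 client credentials flow
m2m_client = user_pool.add_client(
    "PlayerFCM2MClient",
    user_pool_client_name="PlayerFC-M2M-Client",
    generate_secret=True,  # the client authenticates with its ID and secret
    o_auth=cognito.OAuthSettings(
        flows=cognito.OAuthFlows(client_credentials=True),
        scopes=[cognito.OAuthScope.resource_server(resource_server, readwrite_scope)],
    ),
)
```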


&lt;h2&gt;
  
  
  Building the fortress entrance
&lt;/h2&gt;

&lt;p&gt;Now we’re building the fortress entrance with &lt;strong&gt;API Gateway&lt;/strong&gt;. Think of this as constructing the main security checkpoint at your event’s entrance, the single point where every guest must pass through.&lt;/p&gt;

&lt;p&gt;It’s the official gateway that intercepts everyone trying to enter, checks their credentials and either grants or denies access to the venue.&lt;/p&gt;

&lt;p&gt;Scroll to the bottom of the file and add the below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
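&lt;p&gt;A sketch of the HTTP API; the name matches the API we export the OpenAPI definition from later, and the default stage CDK creates for us serves as the public entrance:&lt;/p&gt;

```python
# The single security checkpoint every request must pass through
http_api = apigwv2.HttpApi(
    self,
    "PlayerFCHttpApi",
    api_name="PlayerFCHttpApi",
)
```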


&lt;h2&gt;
  
  
  Creating a JWT Authorizer
&lt;/h2&gt;

&lt;p&gt;Now we’re creating the &lt;strong&gt;JWT Authorizer&lt;/strong&gt;. Think of this as installing a high-tech security scanner at your entrance checkpoint.&lt;/p&gt;

&lt;p&gt;It automatically reads and validates the special security codes on each access pass, checking that they’re genuine, haven’t expired and were issued by the right authority. No human guard is needed to verify every detail.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
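&lt;p&gt;A sketch of the authorizer, reusing the user pool and app client from the previous sections (assumed bound to &lt;strong&gt;user_pool&lt;/strong&gt; and &lt;strong&gt;m2m_client&lt;/strong&gt;):&lt;/p&gt;

```python
# The "scanner": validates each JWT's issuer, audience and expiry
jwt_authorizer = HttpJwtAuthorizer(
    "PlayerFCJwtAuthorizer",
    jwt_issuer=user_pool.user_pool_provider_url,
    jwt_audience=[m2m_client.user_pool_client_id],
)
```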


&lt;h2&gt;
  
  
  Building Protected API Routes
&lt;/h2&gt;

&lt;p&gt;Now we’re building protected &lt;strong&gt;API routes&lt;/strong&gt;. Think of this as setting up specific security checkpoints for different areas of your venue, one for the main hall, another for VIP lounges and separate ones for backstage areas.&lt;/p&gt;

&lt;p&gt;Each checkpoint knows exactly what credentials to check and directs validated guests to the right location.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;This route protects the main players endpoint for adding and retrieving players, applying JWT authorization to GET and POST operations.&lt;/p&gt;
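&lt;p&gt;A sketch of this route; the path, integration ID and &lt;strong&gt;lambda_function&lt;/strong&gt; variable are assumptions to match your stack:&lt;/p&gt;

```python
# Checkpoint for the collection endpoint: list and create players
http_api.add_routes(
    path="/players",
    methods=[apigwv2.HttpMethod.GET, apigwv2.HttpMethod.POST],
    integration=HttpLambdaIntegration("PlayersIntegration", lambda_function),
    authorizer=jwt_authorizer,
)
```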


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Protects the individual player retrieval endpoint with JWT token validation.&lt;/p&gt;
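&lt;p&gt;A sketch of the single-player route; the &lt;strong&gt;player_id&lt;/strong&gt; path parameter name is an assumption and should match your FastAPI route:&lt;/p&gt;

```python
# Checkpoint for retrieving a single player by ID
http_api.add_routes(
    path="/players/{player_id}",
    methods=[apigwv2.HttpMethod.GET],
    integration=HttpLambdaIntegration("PlayerByIdIntegration", lambda_function),
    authorizer=jwt_authorizer,
)
```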


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Protects player update and removal endpoints (PATCH/DELETE) with JWT authorization.&lt;/p&gt;
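&lt;p&gt;A sketch of the mutation route, under the same naming assumptions as the routes above:&lt;/p&gt;

```python
# Checkpoint for updating and removing a player
http_api.add_routes(
    path="/players/{player_id}",
    methods=[apigwv2.HttpMethod.PATCH, apigwv2.HttpMethod.DELETE],
    integration=HttpLambdaIntegration("PlayerMutationIntegration", lambda_function),
    authorizer=jwt_authorizer,
)
```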

&lt;p&gt;Run &lt;code&gt;cdk deploy&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Testing our API using a REST API client
&lt;/h2&gt;

&lt;p&gt;We’re going to test whether our API still works with all the changes we have made. For that we’ll need a tool that lets us exercise the API; feel free to use any tool you’re comfortable with. I’ll use Postman from here on out.&lt;/p&gt;

&lt;p&gt;Let’s retrieve the app credentials that the CDK created for us. Open a browser and log in to the AWS Management Console.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Search for &lt;strong&gt;Cognito&lt;/strong&gt; in the main search bar, select the service.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;PlayerFCUserPool&lt;/strong&gt; under &lt;strong&gt;User Pools&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;App clients&lt;/strong&gt; under &lt;strong&gt;Applications&lt;/strong&gt; from the left-hand side of the window.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;PlayerFC-M2M-Client&lt;/strong&gt;, you should see a &lt;strong&gt;Client ID&lt;/strong&gt; and a hidden &lt;strong&gt;Client secret&lt;/strong&gt;, leave the browser window open. We’ll revisit this page shortly.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Creating an environment &amp;amp; variables in Postman
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open Postman, select &lt;strong&gt;Environments&lt;/strong&gt; from the sidebar.&lt;/li&gt;
&lt;li&gt;Select the &lt;strong&gt;+&lt;/strong&gt; button to create a new environment.&lt;/li&gt;
&lt;li&gt;Enter a name for the environment, perhaps &lt;strong&gt;PlayerFC&lt;/strong&gt; shall suffice.&lt;/li&gt;
&lt;li&gt;Create a new variable called &lt;strong&gt;client_id&lt;/strong&gt;, set the type to secret, copy and paste the client id value from the App client in Amazon Cognito and set as the current value.&lt;/li&gt;
&lt;li&gt;Create another variable called &lt;strong&gt;client_secret&lt;/strong&gt;, set the type to secret, copy and paste the client secret value from the App client in Amazon Cognito and set as the current value.&lt;/li&gt;
&lt;li&gt;Create the last variable, called &lt;strong&gt;access_token&lt;/strong&gt;, set the type to secret and leave the initial value blank.&lt;/li&gt;
&lt;li&gt;Select 💾 &lt;strong&gt;Save&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn28fbp35bclua7qd32zs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn28fbp35bclua7qd32zs.png" alt="Postman environment variables"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Exporting &amp;amp; Importing OpenAPI definition
&lt;/h2&gt;

&lt;p&gt;Instead of manually creating the requests in Postman, we’ll head over to API Gateway, export the OpenAPI definition and import the definition into Postman.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Search for &lt;strong&gt;API Gateway&lt;/strong&gt; in the AWS Management console. Select the service.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;PlayerFCHttpApi&lt;/strong&gt; under APIs, select &lt;strong&gt;Export&lt;/strong&gt; under &lt;strong&gt;Develop&lt;/strong&gt; from the left sidebar.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Latest configuration&lt;/strong&gt; from the &lt;strong&gt;Source&lt;/strong&gt; drop-down list.&lt;/li&gt;
&lt;li&gt;Ensure the toggle is set to on for &lt;strong&gt;Include API Gateway extensions&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;JSON&lt;/strong&gt; under &lt;strong&gt;Output format&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Click the &lt;strong&gt;Download&lt;/strong&gt; button.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdzsho904cp6prvjgx4p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdzsho904cp6prvjgx4p.png" alt="Export OpenAPI definition from API Gateway"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Navigate back to Postman and follow the below steps to import the OpenAPI definition:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Select &lt;strong&gt;Collections&lt;/strong&gt; from the left sidebar.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Import&lt;/strong&gt;, drag and drop the OpenAPI definition we downloaded earlier. Alternatively, select &lt;strong&gt;files&lt;/strong&gt;, select the OpenAPI definition.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Postman Collection&lt;/strong&gt;, select &lt;strong&gt;Import&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frlewgqx2rxbk4isq09tf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frlewgqx2rxbk4isq09tf.png" alt="Postman - Import API definition"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Generating an access token
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; Add a request under the &lt;strong&gt;players&lt;/strong&gt; collection.&lt;/li&gt;
&lt;li&gt; Enter &lt;strong&gt;Generate access token&lt;/strong&gt; as the name for this request.&lt;/li&gt;
&lt;li&gt; Select &lt;strong&gt;POST&lt;/strong&gt; as the method.&lt;/li&gt;
&lt;li&gt; Enter &lt;em&gt;https://playerfc.auth.af-south-1.amazoncognito.com/oauth2/token&lt;/em&gt; in the URL input box.&lt;/li&gt;
&lt;li&gt; Select the &lt;strong&gt;Authorization&lt;/strong&gt; tab and ensure the &lt;strong&gt;Auth Type&lt;/strong&gt; is set to &lt;strong&gt;No Auth&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt; Select the &lt;strong&gt;Body&lt;/strong&gt; tab, select &lt;strong&gt;x-www-form-urlencoded&lt;/strong&gt;, select &lt;strong&gt;Bulk Edit&lt;/strong&gt;, copy and paste the below:&lt;/li&gt;

&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4boykuvwbwko2wmxmfnr.png" alt="x-www-form-urlencoded properties"&gt;

&lt;li&gt; Select the &lt;strong&gt;Scripts&lt;/strong&gt; tab and paste the below JavaScript under &lt;strong&gt;Post-response&lt;/strong&gt;; this will automatically populate the &lt;strong&gt;access_token&lt;/strong&gt; environment variable whenever you execute the &lt;strong&gt;Generate access token&lt;/strong&gt; request:&lt;/li&gt;

&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0gwn8xqn1238zs1e95bg.png" alt="JavaScript to update access_token environment variable"&gt;


&lt;li&gt; Select 💾 &lt;strong&gt;Save&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt; If you execute the request, an access token is returned and set as an environment variable.&lt;/li&gt;


&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fllteeh4iq1pcwfz2rwh2.png" alt="Generating an access token"&gt;
&lt;/ol&gt;
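&lt;p&gt;The steps above can be sketched as follows; this script runs inside Postman’s sandbox (so it relies on the &lt;strong&gt;pm&lt;/strong&gt; API), and the scope name is an assumption that must match your Cognito resource server and scope:&lt;/p&gt;

```javascript
// Request body for step 6 (x-www-form-urlencoded, Bulk Edit format):
//
//   grant_type:client_credentials
//   client_id:{{client_id}}
//   client_secret:{{client_secret}}
//   scope:players/api.readwrite
//
// Post-response script for step 7: persist the returned token so the
// other requests in the collection can reference {{access_token}}
const body = pm.response.json();
pm.environment.set("access_token", body.access_token);
```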

&lt;p&gt;Now that we have an access token, let’s add a player. In the last blog it was Christopher Nkunku; keeping it within the French national team, let’s add Kylian Mbappé.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Select the &lt;strong&gt;Post /players&lt;/strong&gt; request.&lt;/li&gt;
&lt;li&gt; Select the &lt;strong&gt;Authorization&lt;/strong&gt; tab and set the empty token field to &lt;strong&gt;{{access_token}}&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt; Select the &lt;strong&gt;Body&lt;/strong&gt; tab, select &lt;strong&gt;raw&lt;/strong&gt;, copy and paste the below JSON payload into the request body:&lt;/li&gt;

&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;li&gt; Select &lt;strong&gt;Send&lt;/strong&gt;.&lt;/li&gt;

&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8vkac9pl60pite4sh27t.png" alt="Adding a player"&gt;

&lt;/ol&gt;
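&lt;p&gt;The request body from step 3 can look like the below; the field names are illustrative and should match the Player model you defined in part 1:&lt;/p&gt;

```json
{
  "name": "Kylian Mbappé",
  "country": "France",
  "club": "Real Madrid",
  "position": "Forward"
}
```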



&lt;p&gt;Feel free to test the other endpoints, don’t forget to set the &lt;strong&gt;access_token&lt;/strong&gt; environment variable under the Authorization tab for all the other requests.&lt;/p&gt;

&lt;p&gt;If you forget to add the &lt;strong&gt;access_token&lt;/strong&gt; along with the request, you’ll be greeted with a &lt;strong&gt;401&lt;/strong&gt; (Unauthorized) response.&lt;/p&gt;

&lt;p&gt;If you’re looking for the complete code, you’ll find it in my &lt;a href="https://github.com/AdrianM10/player-fc-fastapi-app" rel="noopener noreferrer"&gt;GitHub repo&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;With these security layers in place, your FastAPI app is protected from unauthorized access.&lt;/p&gt;

&lt;p&gt;This project might squeeze another post in the series, only time will tell. Till then, take care.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cdk</category>
      <category>python</category>
      <category>fastapi</category>
    </item>
    <item>
      <title>Serverless FastAPI Testing: Use Moto and Just Mock It!</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Tue, 01 Jul 2025 20:02:27 +0000</pubDate>
      <link>https://forem.com/aws-builders/serverless-fastapi-testing-use-moto-and-just-mock-it-2p35</link>
      <guid>https://forem.com/aws-builders/serverless-fastapi-testing-use-moto-and-just-mock-it-2p35</guid>
      <description>&lt;p&gt;In my previous blog post &lt;a href="https://dev.to/aws-builders/serverless-fastapi-development-building-player-fc-api-on-aws-3735"&gt;Serverless FastAPI Development: Building Player FC API on AWS&lt;/a&gt;, we explored creating and deploying a FastAPI application on AWS. In this blog post we’ll take a look at testing our app locally.&lt;/p&gt;

&lt;p&gt;We write tests to prove that our code works as designed. However, since our code interacts with cloud services, it’s somewhat of a challenge to test without actually making API calls that traverse the internet. That is, unless you use Moto.&lt;/p&gt;

&lt;p&gt;Moto is a Python library that mocks AWS services, allowing you to test without making real API calls.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Mock AWS Services?
&lt;/h2&gt;

&lt;p&gt;When it comes to testing applications that interact with cloud services like AWS, mocking becomes essential for a couple of practical reasons.&lt;/p&gt;

&lt;p&gt;First, cloud services cost money. Testing against resources deployed in the cloud isn’t free.&lt;/p&gt;

&lt;p&gt;Secondly, an active &amp;amp; reliable internet connection is required, and it’s not ideal to have your tests bound to the internet. You might find yourself at a conference with slow and limited Wi-Fi, or in a space with public Wi-Fi that shouldn’t be trusted. You could be on a plane or a train; you might even find yourself in a remote area.&lt;/p&gt;

&lt;p&gt;Mocking allows you to run tests locally without incurring additional costs. Everyone loves to save money after all.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting Up Your Test Environment
&lt;/h2&gt;

&lt;p&gt;Some preparation is required before we can run our tests: we need a way for our tests to import the modules that we have written, as well as a way of letting &lt;strong&gt;pytest&lt;/strong&gt; know where these files are located.&lt;/p&gt;

&lt;p&gt;This can be achieved by creating a &lt;strong&gt;conftest.py&lt;/strong&gt; file as well as a &lt;strong&gt;pyproject.toml&lt;/strong&gt; file.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;conftest.py&lt;/strong&gt; file gets the absolute path of the project root directory.&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;pyproject.toml&lt;/strong&gt; file sets the path for our app and our test paths, and silences a deprecation warning for &lt;strong&gt;botocore&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Create these files at your project’s root:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
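&lt;p&gt;The author’s actual files are in the embedded gists. As a rough sketch (not the exact code from the gist), a &lt;strong&gt;conftest.py&lt;/strong&gt; that exposes the project root for imports could look like this:&lt;/p&gt;

```python
# conftest.py — a minimal sketch; assumes this file sits at the project root
import os
import sys

# Get the absolute path of the project root directory and make it importable,
# so tests can do e.g. `from app.main import app`.
ROOT_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, ROOT_DIR)
```

&lt;p&gt;The &lt;strong&gt;pyproject.toml&lt;/strong&gt; counterpart would typically use the &lt;code&gt;[tool.pytest.ini_options]&lt;/code&gt; table to set &lt;code&gt;pythonpath&lt;/code&gt; and &lt;code&gt;testpaths&lt;/code&gt;, plus a &lt;code&gt;filterwarnings&lt;/code&gt; entry to silence the &lt;strong&gt;botocore&lt;/strong&gt; deprecation warning.&lt;/p&gt;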


&lt;h2&gt;
  
  
  Your First Test: The Root Endpoint
&lt;/h2&gt;

&lt;p&gt;Create a directory named &lt;strong&gt;tests&lt;/strong&gt; to be the home for our tests, and within it create a file named &lt;strong&gt;test_player.py&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Let’s create a test for our root endpoint, add the following imports at the top of the file:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Create a &lt;strong&gt;TestClient&lt;/strong&gt; object, passing &lt;strong&gt;app&lt;/strong&gt; as an argument, then add a test function named &lt;strong&gt;test_root&lt;/strong&gt;. See below for the complete code snippet:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Run &lt;code&gt;pytest test_player.py::test_root&lt;/code&gt; in the terminal window. The test should pass.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmoimdjb37f46hdcperdg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmoimdjb37f46hdcperdg.png" alt="Test Root"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Create Pytest Fixtures
&lt;/h2&gt;

&lt;p&gt;We will use &lt;strong&gt;Fixtures&lt;/strong&gt; to provide a defined, reliable and consistent context for our tests. This will include player data, mocked AWS credentials for moto and our mock DynamoDB table.&lt;/p&gt;

&lt;p&gt;Let’s add a couple of fixtures to our code. We will start by creating a fixture that contains a single player’s data; add this code directly below the &lt;strong&gt;client&lt;/strong&gt; object we created earlier:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Now we need a similar fixture representing all players. However, a function containing all this data inline would make the code long; a better approach is to create a separate JSON file and load the data when the function is called.&lt;/p&gt;

&lt;p&gt;Create a file named &lt;strong&gt;players.json&lt;/strong&gt; in the &lt;strong&gt;tests&lt;/strong&gt; directory and populate it with the below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Add the below code to create a fixture that loads all the players’ data from the JSON file when the function is called:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;h2&gt;
  
  
  Mocking AWS credentials and DynamoDB service
&lt;/h2&gt;

&lt;p&gt;Create a fixture that mocks AWS credentials by adding the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;The mocked AWS credentials fixture will be used as an argument for our mock DynamoDB table fixture. Add the below code to create another fixture that mocks the AWS DynamoDB service:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;h2&gt;
  
  
  CRUD Testing Journey
&lt;/h2&gt;

&lt;p&gt;With all the fixtures created, we are now at a stage where we can begin testing the other endpoints that would normally interact with AWS services, albeit mocked in nature.&lt;/p&gt;

&lt;p&gt;We can create a test that creates a player and then retrieves the player data. This function takes the &lt;strong&gt;dynamodb_table&lt;/strong&gt; and &lt;strong&gt;player_data&lt;/strong&gt; fixtures we created earlier as arguments. Add the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Run &lt;code&gt;pytest test_player.py::test_create_and_get_player&lt;/code&gt;, this test too shall pass.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3ax038cv5mjtny0dvu5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3ax038cv5mjtny0dvu5.png" alt="Test Create And Get Player"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Onto the next endpoint: let’s test whether we can get all players. This is achieved by loading the players data from the JSON file, asserting that the expected player names are found, and asserting that a certain player is not found.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Run &lt;code&gt;pytest test_player.py::test_get_all_players&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6tgcyq1egbhlf4ugojkc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6tgcyq1egbhlf4ugojkc.png" alt="Test Get All Players"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We’re on a roll with passing tests at this stage. Let’s test the endpoint for updating a player’s details; the player in question is &lt;strong&gt;Christopher Nkunku&lt;/strong&gt;, who will be transferring to &lt;strong&gt;Bayern Munich&lt;/strong&gt; and taking up the number 10 jersey.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Run &lt;code&gt;pytest test_player.py::test_update_player&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8phqvag9p929sr2xlf5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc8phqvag9p929sr2xlf5.png" alt="Test Update Player"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now let’s create a test for removing a player.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Run &lt;code&gt;pytest test_player.py::test_delete_player&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4uyufjaqu2i45g86z99l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4uyufjaqu2i45g86z99l.png" alt="Test Delete Player"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The final test is an edge case: let’s create a test for removing a non-existent player. A 404 error should be returned since the player does not exist.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Run &lt;code&gt;pytest test_player.py::test_delete_non_existent_player&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6yexlgs1fp8i1sktwqrg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6yexlgs1fp8i1sktwqrg.png" alt="Test Delete Non Existent Player"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Testing locally? Sorted.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Next up&lt;/strong&gt;: Locking down your Lambda Function URL because security isn’t optional. Stay tuned. ⚡️🔐&lt;/p&gt;

&lt;p&gt;I’ll cover that in a future post. Until then, happy testing. ⚡️🐍&lt;/p&gt;

</description>
      <category>python</category>
      <category>fastapi</category>
      <category>pytest</category>
      <category>dynamodb</category>
    </item>
    <item>
      <title>Serverless FastAPI Development: Building Player FC API on AWS</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Sat, 11 Jan 2025 07:00:00 +0000</pubDate>
      <link>https://forem.com/aws-builders/serverless-fastapi-development-building-player-fc-api-on-aws-3735</link>
      <guid>https://forem.com/aws-builders/serverless-fastapi-development-building-player-fc-api-on-aws-3735</guid>
      <description>&lt;p&gt;It's been a while since I've had the opportunity to build something simple, interesting and modern. Towards the backend of 2024 I stumbled across FastAPI and got excited, whilst I've built internal APIs at work before, I hadn't yet created anything public facing.&lt;/p&gt;

&lt;p&gt;Hello FastAPI! &lt;/p&gt;

&lt;p&gt;FastAPI is a modern, powerful framework for building APIs with Python, and it seemed perfect for what I wanted to build: an API for basic football player info. I initially dubbed it "Jugador FC" before settling on "Player FC API".&lt;/p&gt;

&lt;h2&gt;
  
  
  Configuring the Environment
&lt;/h2&gt;

&lt;p&gt;Before you begin, make sure you have the following requirements in place:&lt;/p&gt;

&lt;p&gt;AWS CDK&lt;br&gt;
Docker&lt;br&gt;
Python 3.12.7&lt;/p&gt;
&lt;h2&gt;
  
  
  Creating the Project
&lt;/h2&gt;

&lt;p&gt;Create a directory on your machine and name it &lt;strong&gt;player_fc_fastapi_app&lt;/strong&gt;; within this directory, create the following subdirectories:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;app&lt;/strong&gt;&lt;br&gt;
    Contains all the FastAPI code&lt;br&gt;
&lt;strong&gt;dynamo_db_local&lt;/strong&gt;&lt;br&gt;
    Contains a python script to create a local version of an Amazon DynamoDB Table&lt;br&gt;
&lt;strong&gt;iac&lt;/strong&gt;&lt;br&gt;
    Contains your stack files to create resources in AWS&lt;/p&gt;

&lt;p&gt;To save time, I have provided the commands you can run below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
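&lt;p&gt;The commands in the gist amount to something like the following sketch:&lt;/p&gt;

```shell
# Create the project root and its three subdirectories
mkdir player_fc_fastapi_app
cd player_fc_fastapi_app
mkdir app dynamo_db_local iac
```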


&lt;p&gt;The project directory structure should now look like below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;h2&gt;
  
  
  Setting up the Python environment
&lt;/h2&gt;

&lt;p&gt;After creating the directory structure, create a text file called &lt;code&gt;requirements.txt&lt;/code&gt; and insert the following lines in it:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Once you have created the &lt;code&gt;requirements.txt&lt;/code&gt; file, create a virtual environment and install the dependencies:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;h2&gt;
  
  
  Setting up Amazon DynamoDB Local
&lt;/h2&gt;

&lt;p&gt;Let's begin by setting up a local instance of DynamoDB; this requires &lt;strong&gt;Docker&lt;/strong&gt; to be installed and running.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;It will take a few seconds for the image to be pulled and a container to start. Once done, navigate to the &lt;strong&gt;dynamo_db_local&lt;/strong&gt; directory, create a &lt;code&gt;create_ddb_table.py&lt;/code&gt; file, and populate it with the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;With this code, you can create a table in the local &lt;strong&gt;DynamoDB&lt;/strong&gt; instance. Run the code snippet.&lt;/p&gt;

&lt;h2&gt;
  
  
  FastAPI Development
&lt;/h2&gt;

&lt;p&gt;Now that we have a local instance of &lt;strong&gt;DynamoDB&lt;/strong&gt; up and running, let's begin creating our app. Navigate to the &lt;strong&gt;app&lt;/strong&gt; directory and create two files: &lt;code&gt;main.py&lt;/code&gt; and &lt;code&gt;requirements.txt&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Populate the &lt;code&gt;requirements.txt&lt;/code&gt; with the below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Create the below subdirectories:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;models&lt;/strong&gt;&lt;br&gt;
     Pydantic Player models&lt;br&gt;
&lt;strong&gt;routers&lt;/strong&gt;&lt;br&gt;
     Contains routes&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Let's create a couple of models using &lt;strong&gt;Pydantic&lt;/strong&gt;. We will use the &lt;code&gt;Player&lt;/code&gt; and &lt;code&gt;UpdatePlayer&lt;/code&gt; models to define the data structure of player info we can add or modify.&lt;/p&gt;

&lt;p&gt;Within the &lt;strong&gt;models&lt;/strong&gt; subdirectory, create an empty &lt;code&gt;__init__.py&lt;/code&gt; file and a file named &lt;code&gt;players.py&lt;/code&gt;, and fill the latter with the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
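&lt;p&gt;A sketch of the two models (the field names are assumptions, not the author’s exact schema); making every field optional on &lt;code&gt;UpdatePlayer&lt;/code&gt; is what lets partial updates validate:&lt;/p&gt;

```python
# Sketch of models/players.py with assumed fields.
from typing import Optional

from pydantic import BaseModel

class Player(BaseModel):
    id: str
    name: str
    country: str
    club: str

class UpdatePlayer(BaseModel):
    # All fields optional so a PUT/PATCH body can supply only what changes
    name: Optional[str] = None
    country: Optional[str] = None
    club: Optional[str] = None
```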


&lt;p&gt;Within the &lt;strong&gt;routers&lt;/strong&gt; subdirectory, create an empty &lt;code&gt;__init__.py&lt;/code&gt; file and a file named &lt;code&gt;players.py&lt;/code&gt;, and fill the latter with the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;blockquote&gt;
&lt;p&gt;Creating an empty &lt;code&gt;__init__.py&lt;/code&gt; file turns a folder into a Python package.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Open the &lt;code&gt;main.py&lt;/code&gt; file within the &lt;strong&gt;app&lt;/strong&gt; directory and start populating it with the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;




&lt;h2&gt;
  
  
  Test Drive
&lt;/h2&gt;

&lt;p&gt;Time for a quick test drive. Ensure you are in the &lt;strong&gt;app&lt;/strong&gt; directory and run the below command to start &lt;strong&gt;Uvicorn&lt;/strong&gt;:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Now that our app is up and running, navigate to &lt;strong&gt;&lt;a href="http://127.0.0.1:8000/docs/" rel="noopener noreferrer"&gt;http://127.0.0.1:8000/docs/&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You will see the automatic interactive API documentation with 6 endpoints available:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr1ryw5oha2gel0erkz10.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr1ryw5oha2gel0erkz10.png" alt="FastAPI Swagger Documentation" width="800" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's try adding a player. Select the &lt;strong&gt;POST /players&lt;/strong&gt; endpoint, select the &lt;strong&gt;Try it out&lt;/strong&gt; button, and use the below payload to add the world's best player, "Vinícius Júnior":&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Here's what each API operation looks like in action.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Adding a New Player:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9sbrehmjfh92x3m2xe9l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9sbrehmjfh92x3m2xe9l.png" alt="Add Player" width="800" height="477"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Retrieving All Players:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7qamr9t12xehw25yj4a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7qamr9t12xehw25yj4a.png" alt="Get All Players" width="800" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Updating Player Information:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3183ctl0tvi7rxrihky9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3183ctl0tvi7rxrihky9.png" alt="Update Player" width="800" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Getting Single Player Details:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Famk5u3kw9mz7ze3yrrkt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Famk5u3kw9mz7ze3yrrkt.png" alt="Get Player" width="800" height="466"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Removing a Player:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh4cos3r7yz48nmlw3our.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh4cos3r7yz48nmlw3our.png" alt="Delete Player" width="800" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Deployment using AWS CDK v2
&lt;/h2&gt;

&lt;p&gt;Now that we are comfortable with running and testing our app locally, it's time to deploy our app on AWS. We will use the AWS CDK v2.&lt;/p&gt;

&lt;p&gt;Navigate into the &lt;strong&gt;iac&lt;/strong&gt; directory and run the below command to initialize a CDK project:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Modify the &lt;strong&gt;requirements.txt&lt;/strong&gt; file found in the subdirectory and add the below line:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Let's define a DynamoDB table, a Lambda function and a Lambda function URL. Within the current &lt;strong&gt;iac&lt;/strong&gt; directory, navigate into the nested &lt;strong&gt;iac&lt;/strong&gt; subdirectory. Open the &lt;code&gt;iac_stack.py&lt;/code&gt; file and replace the contents of the CDK stack with the code below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;We have one final step before we initiate the deployment: set the &lt;code&gt;local_development: bool&lt;/code&gt; flag to &lt;strong&gt;False&lt;/strong&gt; in the &lt;code&gt;players.py&lt;/code&gt; file in the &lt;strong&gt;app/routers&lt;/strong&gt; directory.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Activate the virtual environment within the &lt;strong&gt;iac&lt;/strong&gt; directory and install the dependencies with the below commands:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Deploy the app with the &lt;code&gt;cdk deploy&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;Once the deployment is complete, you'll see a function URL in the terminal output; this is your API endpoint on AWS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm421nzpdarqwxv8c5fqx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm421nzpdarqwxv8c5fqx.png" alt="CDK Deploy FastAPI APP" width="800" height="432"&gt;&lt;/a&gt;&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;p&gt;Test all endpoints using the function URL like we did during the local test drive. Once you have added a player, it's time to verify whether our player data has persisted or vanished into the ether.&lt;/p&gt;

&lt;p&gt;To verify everything's working:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Head over to the &lt;strong&gt;AWS Management Console&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Navigate to &lt;strong&gt;DynamoDB&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Find the &lt;strong&gt;Players&lt;/strong&gt; Table&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Explore table items&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You should see your player data in the cloud:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zqpy7eeviunpzt83s9n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zqpy7eeviunpzt83s9n.png" alt="Player FC DynamoDB Table" width="800" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;💡 &lt;strong&gt;Important:&lt;/strong&gt; Don't forget to clean up resources! When no longer needed, you can run the &lt;code&gt;cdk destroy&lt;/code&gt; command to delete all AWS resources that were created.&lt;/p&gt;

&lt;p&gt;That wraps up our journey from local FastAPI development to serverless deployment on AWS.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>fastapi</category>
      <category>python</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Exploring AWS Serverless Deployments with CDK v2: From RSS to X Posts - Part 3 of the Odyssey</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Thu, 01 Aug 2024 17:29:20 +0000</pubDate>
      <link>https://forem.com/aws-builders/exploring-aws-serverless-deployments-with-cdk-v2-from-rss-to-x-posts-part-3-of-the-odyssey-56c3</link>
      <guid>https://forem.com/aws-builders/exploring-aws-serverless-deployments-with-cdk-v2-from-rss-to-x-posts-part-3-of-the-odyssey-56c3</guid>
      <description>&lt;p&gt;Welcome to part 3 of “Exploring AWS Serverless Deployments with CDK v2”. Firstly I’d like to thank you for your patience as there’s been a bit of a gap since part 2. I was deep into studying and working on serverless projects at work which kept me away, but i’m excited to get back on track and continue our exploration.&lt;/p&gt;

&lt;p&gt;In previous posts, we’ve defined our constructs and deployed them to AWS. Today, we’ll focus on an essential practice: testing. Proper testing ensures that our deployments work as expected and can save us from potential issues.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Getting Started With Testing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To get started, you’ll need to add &lt;strong&gt;pytest&lt;/strong&gt; to your project’s dependencies (the main &lt;strong&gt;requirements.txt&lt;/strong&gt; file for our stack), then install them:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install -r requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Within our project, navigate to the &lt;strong&gt;test&lt;/strong&gt; directory, then &lt;strong&gt;unit&lt;/strong&gt;, and open the &lt;strong&gt;test_rss_lambda_ddb_socialshare_stack.py&lt;/strong&gt; file. This auto-generated test file includes an example test.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;p&gt;We don’t have an &lt;strong&gt;SQS&lt;/strong&gt; construct in our stack, but reviewing the example test provides some insight into how to test a construct. Let’s delete the auto-generated example test and create our own.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setting Up The Testing Function&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;First, let’s create a reusable function to get the CloudFormation template from the stack:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;Testing DynamoDB Table Properties&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We’ll start by adding a test to check that the &lt;strong&gt;DynamoDB&lt;/strong&gt; table in our stack has the correct properties. Here’s how to do it:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;Testing Lambda Functions&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, let’s ensure that our stack creates the correct number of Lambda functions and verify their runtime version:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;Running Tests&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To run tests you can execute &lt;strong&gt;pytest&lt;/strong&gt; in the terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pytest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Below is the output you should receive:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;================================================================================= test session starts =================================================================================
platform darwin -- Python 3.12.4, pytest-8.1.1, pluggy-1.4.0
rootdir: /Users/adrian/Developer/Projects/rss-lambda-ddb-socialshare
plugins: typeguard-2.13.3
collected 2 items                                                                                                                                                                     

tests/unit/test_rss_lambda_ddb_socialshare_stack.py ..                                                                                                                          [100%]

================================================================================= 2 passed in 21.86s ==================================================================================
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In Part 3 of our series, we’ve learned how to test our CDK constructs. In the final installment, we will explore how to test Lambda functions locally.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Resources&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.pytest.org/en/stable/" rel="noopener noreferrer"&gt;Pytest&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cdk/v2/guide/testing.html" rel="noopener noreferrer"&gt;Testing constructs&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>cdk</category>
      <category>python</category>
      <category>testing</category>
      <category>aws</category>
    </item>
    <item>
      <title>Exploring AWS Serverless Deployments with CDK v2: From RSS to X Posts - Part 2 of the Odyssey</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Sun, 31 Mar 2024 13:30:00 +0000</pubDate>
      <link>https://forem.com/aws-builders/exploring-aws-serverless-deployments-with-cdk-v2-from-rss-to-x-posts-part-2-of-the-odyssey-1035</link>
      <guid>https://forem.com/aws-builders/exploring-aws-serverless-deployments-with-cdk-v2-from-rss-to-x-posts-part-2-of-the-odyssey-1035</guid>
      <description>&lt;p&gt;In this blog post, we'll continue our exploration of AWS Serverless deployments with CDK v2 by focusing on Lambda functions.&lt;/p&gt;

&lt;p&gt;We'll explore how to create and integrate these functions into our architecture along with a crucial step of granting permissions to resources that are deployed within the stack.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Lambda&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first Lambda function that we will create will periodically query an RSS feed, process the data and store the data in DynamoDB.&lt;/p&gt;

&lt;p&gt;To get started with Lambda we will use the &lt;strong&gt;Amazon Lambda Python Library&lt;/strong&gt;, which provides constructs for Python Lambda functions. This requires Docker to be installed and running.&lt;/p&gt;

&lt;p&gt;Modify the requirements file for the stack as below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;Next we will create a directory for our first Lambda function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir lambda_rss_ddb_func
cd lambda_rss_ddb_func
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Let's create a &lt;strong&gt;lambda_handler.py&lt;/strong&gt; file; this file will contain the code that performs the magic:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;code lambda_handler.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The next file that we will create will be the &lt;strong&gt;requirements&lt;/strong&gt; file for all our Python dependencies:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;code requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;We will only be using the &lt;strong&gt;requests&lt;/strong&gt; library in the Lambda function, so make sure to include &lt;strong&gt;requests&lt;/strong&gt; in the newly created &lt;strong&gt;requirements.txt&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Time to write some code in the &lt;strong&gt;lambda_handler.py&lt;/strong&gt; file. The code extracts the &lt;strong&gt;post id&lt;/strong&gt;, &lt;strong&gt;post title&lt;/strong&gt; and &lt;strong&gt;link&lt;/strong&gt; from a website feed; we will be using the feed from &lt;a href="https://hypebeast.com" rel="noopener noreferrer"&gt;Hypebeast&lt;/a&gt;. Once we have extracted the data we need, it will be inserted into our &lt;strong&gt;DynamoDB Table&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Below is the code you can use to populate your &lt;strong&gt;lambda_handler.py&lt;/strong&gt; file:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
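&lt;p&gt;The extraction half of the handler can be sketched like this (a stand-in, not the gist’s exact code; the real handler fetches the Hypebeast feed with &lt;strong&gt;requests&lt;/strong&gt; and writes the records to DynamoDB with boto3):&lt;/p&gt;

```python
# Sketch: parse an RSS document and pull out each item's id, title and link.
import xml.etree.ElementTree as ET

def extract_posts(rss_xml: str):
    root = ET.fromstring(rss_xml)
    posts = []
    for item in root.iter("item"):
        posts.append({
            "post_id": item.findtext("guid"),
            "post_title": item.findtext("title"),
            "link": item.findtext("link"),
        })
    return posts
```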



&lt;p&gt;Once you have modified your stack to include the construct to create a Lambda function, make sure to change the directory in your terminal to the root folder of the project.&lt;/p&gt;

&lt;p&gt;Let's turn our attention back to our stack to define the Lambda construct, add the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
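&lt;p&gt;Since the gist doesn't render in this feed, below is a hedged sketch of what such a construct might look like inside the stack's &lt;strong&gt;__init__&lt;/strong&gt;. The construct ids, entry path and timeout are illustrative, not the exact values from the post; &lt;strong&gt;PythonFunction&lt;/strong&gt; from the &lt;strong&gt;aws_lambda_python_alpha&lt;/strong&gt; module (linked in the resources) bundles the &lt;strong&gt;requirements.txt&lt;/strong&gt; dependencies using Docker:&lt;/p&gt;

```python
from aws_cdk import Duration, aws_lambda as _lambda
from aws_cdk.aws_lambda_python_alpha import PythonFunction

# Inside RssLambdaDdbSocialshareStack.__init__; ids and values are illustrative
rss_func = PythonFunction(
    self, "LambdaRssDdbFunc",
    entry="lambda_rss_ddb_func",   # folder holding lambda_handler.py and requirements.txt
    index="lambda_handler.py",
    handler="lambda_handler",
    runtime=_lambda.Runtime.PYTHON_3_9,
    timeout=Duration.seconds(30),
)

# Allow the function to write the extracted items to the table
table.grant_write_data(rss_func)
```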


&lt;p&gt;&lt;strong&gt;&lt;em&gt;Amazon EventBridge rule&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To run the Lambda function on a schedule, we can make use of an &lt;strong&gt;Amazon EventBridge rule&lt;/strong&gt; that periodically invokes it. Add the below construct and permissions to the stack:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
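&lt;p&gt;With the gist not shown here, a sketch of such a rule might look like the following; the one-hour rate is an assumption, and &lt;strong&gt;rss_func&lt;/strong&gt; stands in for the Lambda construct defined above:&lt;/p&gt;

```python
from aws_cdk import Duration, aws_events as events, aws_events_targets as targets

# Periodically invoke the RSS-harvesting function (the rate is illustrative)
schedule_rule = events.Rule(
    self, "RssScheduleRule",
    schedule=events.Schedule.rate(Duration.hours(1)),
)
schedule_rule.add_target(targets.LambdaFunction(rss_func))
```

&lt;p&gt;The &lt;strong&gt;LambdaFunction&lt;/strong&gt; target also takes care of granting EventBridge permission to invoke the function.&lt;/p&gt;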


&lt;p&gt;&lt;strong&gt;&lt;em&gt;Another Lambda function&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In a few moments we will create another directory for our second Lambda function, which is invoked when new record(s) are added to our &lt;strong&gt;DynamoDB Table&lt;/strong&gt; and creates a post on X with the &lt;strong&gt;post title&lt;/strong&gt; and &lt;strong&gt;post link&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;SSM Parameter Store&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To access the X API, you'll need to create an X Developer account; I've included the link in the resources section of this post.&lt;/p&gt;

&lt;p&gt;We will need X credentials (consumer key, consumer secret, access token &amp;amp; access token secret).&lt;/p&gt;

&lt;p&gt;These credentials need to be stored somewhere securely; the &lt;strong&gt;SSM Parameter Store&lt;/strong&gt; is a free service that will fulfill this requirement well.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to the &lt;strong&gt;Parameter Store&lt;/strong&gt; under &lt;strong&gt;AWS Systems Manager&lt;/strong&gt; in the &lt;strong&gt;AWS Management Console&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select the &lt;strong&gt;Create parameter&lt;/strong&gt; button.&lt;/li&gt;
&lt;li&gt;In the &lt;strong&gt;Name&lt;/strong&gt; textbox, enter &lt;strong&gt;/x/consumer_key&lt;/strong&gt;, select &lt;strong&gt;SecureString&lt;/strong&gt; under &lt;strong&gt;Type&lt;/strong&gt; and paste your &lt;strong&gt;consumer_key&lt;/strong&gt; in the &lt;strong&gt;Value&lt;/strong&gt; textbox.&lt;/li&gt;
&lt;li&gt;Repeat the above process for the remaining credentials (consumer secret, access token &amp;amp; access token secret).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr42lucq5qvpnb6ypktsn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr42lucq5qvpnb6ypktsn.png" alt="SSM Parameter Store" width="800" height="236"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the &lt;strong&gt;X&lt;/strong&gt; API setup out of the way, let's create that directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir lambda_x_share_func
cd lambda_x_share_func 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Within this directory, create another &lt;strong&gt;lambda_handler.py&lt;/strong&gt; file and a &lt;strong&gt;requirements.txt&lt;/strong&gt; file.&lt;/p&gt;

&lt;p&gt;Below is the code you should insert in the newly created &lt;strong&gt;lambda_handler.py&lt;/strong&gt; file in the &lt;strong&gt;lambda_x_share_func&lt;/strong&gt; directory:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
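&lt;p&gt;The gist is not rendered in this feed; as a rough sketch of the stream-record handling only, where the attribute names &lt;strong&gt;title&lt;/strong&gt; and &lt;strong&gt;link&lt;/strong&gt; are assumptions matching the fields the first function stores:&lt;/p&gt;

```python
def build_post_text(record):
    """Build the X post body from a single DynamoDB Streams record (NEW_IMAGE view)."""
    if record.get("eventName") != "INSERT":
        return None  # ignore MODIFY / REMOVE events
    image = record["dynamodb"]["NewImage"]
    # Stream images arrive in DynamoDB's attribute-value format, e.g. {"S": "..."}
    return f'{image["title"]["S"]} {image["link"]["S"]}'


def lambda_handler(event, context):
    posts = []
    for record in event.get("Records", []):
        text = build_post_text(record)
        if text:
            posts.append(text)  # the real function posts to X via tweepy here instead
    return posts
```

&lt;p&gt;In the real handler, the four X credentials are read from the &lt;strong&gt;Parameter Store&lt;/strong&gt; with boto3 (remember to pass a region when creating the SSM client) and the text is posted with &lt;strong&gt;tweepy&lt;/strong&gt;.&lt;/p&gt;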



&lt;p&gt;You'll need to specify a region when initializing the &lt;strong&gt;ssm_client&lt;/strong&gt;. Early on I noticed that the Lambda function was unable to access the values in the &lt;strong&gt;Parameter Store&lt;/strong&gt;, which was strange, as all the documentation I read seemed to indicate that a Lambda function should be able to access the &lt;strong&gt;Parameter Store&lt;/strong&gt; in the same region.&lt;/p&gt;

&lt;p&gt;In the &lt;strong&gt;requirements.txt&lt;/strong&gt; file make sure to include &lt;strong&gt;tweepy&lt;/strong&gt;, that's the library that we will use in our Lambda function to interact programmatically with &lt;strong&gt;X&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Navigate back to the stack; we will now create a construct for the &lt;strong&gt;LambdaShareFunc&lt;/strong&gt;. Add the below code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;In order for our Lambda function to read the &lt;strong&gt;Parameter Store&lt;/strong&gt; values in &lt;strong&gt;SSM&lt;/strong&gt;, we can create an SSM policy statement; this will grant the Lambda function permission to retrieve the secrets.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
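&lt;p&gt;As a hedged sketch of such a statement (the action list and parameter path are illustrative, and &lt;strong&gt;share_func&lt;/strong&gt; stands in for the &lt;strong&gt;LambdaShareFunc&lt;/strong&gt; construct from the previous step):&lt;/p&gt;

```python
from aws_cdk import aws_iam as iam

# Allow the share function to read the /x/* parameters created earlier
ssm_policy = iam.PolicyStatement(
    actions=["ssm:GetParameter"],
    resources=[f"arn:aws:ssm:{self.region}:{self.account}:parameter/x/*"],
)
share_func.add_to_role_policy(ssm_policy)
```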


&lt;p&gt;We need to allow our Lambda function to act when new items are added to our DynamoDB table. This can be achieved using DynamoDB Streams; let's add another construct.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
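&lt;p&gt;With the gist absent from the feed, a sketch of wiring the stream to the function might look like this; the batch size and starting position are assumptions, and &lt;strong&gt;share_func&lt;/strong&gt; / &lt;strong&gt;table&lt;/strong&gt; stand in for the constructs defined earlier:&lt;/p&gt;

```python
from aws_cdk import aws_lambda as _lambda
from aws_cdk.aws_lambda_event_sources import DynamoEventSource

# Invoke the share function for new records arriving on the table's stream
share_func.add_event_source(
    DynamoEventSource(
        table,
        starting_position=_lambda.StartingPosition.LATEST,
        batch_size=1,  # illustrative: one record per invocation
    )
)
```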


&lt;p&gt;One last code addition to our stack is to enable DynamoDB Streams on our Table. This is achieved by adding &lt;strong&gt;stream=dynamodb.StreamViewType.NEW_IMAGE&lt;/strong&gt; in the table construct:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
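&lt;p&gt;As a sketch of the modified table construct (the construct id and key name are illustrative; only the &lt;strong&gt;stream&lt;/strong&gt; argument is the addition described above):&lt;/p&gt;

```python
from aws_cdk import RemovalPolicy, aws_dynamodb as dynamodb

table = dynamodb.Table(
    self, "RssTable",
    partition_key=dynamodb.Attribute(name="id", type=dynamodb.AttributeType.STRING),
    removal_policy=RemovalPolicy.DESTROY,
    stream=dynamodb.StreamViewType.NEW_IMAGE,  # emit the full new item for every insert
)
```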


&lt;p&gt;We're on the final stretch; ensure that &lt;strong&gt;Docker&lt;/strong&gt; is running.&lt;/p&gt;

&lt;p&gt;We can now deploy the stack using &lt;strong&gt;cdk deploy&lt;/strong&gt;; this will take a few moments.&lt;/p&gt;

&lt;p&gt;Below are screenshots from my terminal window and AWS Management Console of the successful deployment:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Froza530z5mzc5bs66n5o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Froza530z5mzc5bs66n5o.png" alt="CDK Deploy Final" width="800" height="554"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7cwhbk5mhfvi4xx3fizw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7cwhbk5mhfvi4xx3fizw.png" alt="CloudFormation RSS Lambda DynamoDB Social Share Stack" width="800" height="488"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I'll navigate towards an X burner account I created a couple of years ago for testing the X API, below are screenshots with Posts created:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2bk7hpasdqnw3jig0hnl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2bk7hpasdqnw3jig0hnl.png" alt="X Profile Posts" width="800" height="674"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Conclusion&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this blog post, we've delved into integrating Lambda functions into our AWS serverless architecture deployments using CDK v2, granting permissions to newly created constructs and enabling a DynamoDB stream trigger that invokes a Lambda function to create a post on X.&lt;/p&gt;

&lt;p&gt;In an upcoming blog post we will shift our focus to testing constructs and Lambda functions locally.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Resources&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cdk/api/v2/python/aws_cdk.aws_lambda_python_alpha/README.html" rel="noopener noreferrer"&gt;Amazon Lambda Python Library&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.python.org/3/library/xml.etree.elementtree.html" rel="noopener noreferrer"&gt;The ElementTree XML API&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-create-rule-schedule.html" rel="noopener noreferrer"&gt;Amazon EventBridge rule&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://developer.twitter.com/en/docs/platform-overview" rel="noopener noreferrer"&gt;X Developer Platform&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html" rel="noopener noreferrer"&gt;Change data capture for DynamoDB Streams&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.tweepy.org/en/stable/" rel="noopener noreferrer"&gt;Tweepy&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>cdk</category>
      <category>lambda</category>
      <category>dynamodb</category>
      <category>python</category>
    </item>
    <item>
      <title>Exploring AWS Serverless Deployments with CDK v2: From RSS to X Posts - Part 1 of the Odyssey</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Sat, 30 Mar 2024 13:30:00 +0000</pubDate>
      <link>https://forem.com/aws-builders/exploring-aws-serverless-deployments-with-cdk-v2-from-rss-to-x-posts-part-1-of-the-odyssey-3j8o</link>
      <guid>https://forem.com/aws-builders/exploring-aws-serverless-deployments-with-cdk-v2-from-rss-to-x-posts-part-1-of-the-odyssey-3j8o</guid>
      <description>&lt;p&gt;Weekend projects often come and go, but on the rare occasion, one does stand out that is truly worth documenting.&lt;/p&gt;

&lt;p&gt;In this multi-part blog series I invite you to join me on a journey into the realm of AWS serverless architecture deployments with CDK v2.&lt;/p&gt;

&lt;p&gt;Together, we'll embark on a fascinating exploration - from harvesting RSS feeds to crafting X Posts.&lt;/p&gt;

&lt;p&gt;Throughout this series, we'll explore how to set up a serverless solution that automatically gathers information from an RSS feed, extracts the important details from XML and saves them efficiently in a DynamoDB table.&lt;/p&gt;

&lt;p&gt;Plus, we'll see how new entries in our DynamoDB table invoke a Lambda function to create X Posts using DynamoDB Streams.&lt;/p&gt;

&lt;p&gt;In my professional sphere, I predominantly rely on the Azure SDK for Python, with occasional use of Terraform, while my personal projects frequently entail Pulumi, Terraform and once in a blue moon CloudFormation (Shocking, I know). However, for this specific endeavor, we are utilizing the AWS CDK as our conduit.&lt;/p&gt;

&lt;p&gt;AWS CDK is an Infrastructure as Code tool that empowers us to define and manage resources with code, offering a fresh perspective on serverless architecture deployments.&lt;/p&gt;

&lt;p&gt;Before we dive in, let's take a moment to visualize the architecture we are about to construct:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhn4z8ajn1vlgej738ach.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhn4z8ajn1vlgej738ach.png" alt="Reference Architecture" width="800" height="229"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Pre-requisites:&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS CLI&lt;/li&gt;
&lt;li&gt;AWS CDK v2&lt;/li&gt;
&lt;li&gt;Python&lt;/li&gt;
&lt;li&gt;Docker&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Creating a CDK Project&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To get started with creating a new CDK project, you'll need to enter the below commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir rss-lambda-ddb-socialshare
cd rss-lambda-ddb-socialshare
cdk init --language python 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;&lt;em&gt;cdk init&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the &lt;strong&gt;cdk init&lt;/strong&gt; command has executed, you'll need to activate the virtual environment with one of the below commands, the first on macOS/Linux and the second on Windows:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;source .venv/bin/activate  
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.venv\Scripts\activate.bat
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Next, install the Python packages and dependencies required for our stack and constructs by running the below command:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install -r requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Feel free to explore the project directory. You'll notice folders and files have been created for you.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;┣ 📂.venv
┃ ┣ 📂bin
┃ ┣ 📂include
┃ ┣ 📂lib
┃ ┗ 📜pyvenv.cfg
┣ 📂rss_lambda_ddb_socialshare
┃ ┣ 📜__init__.py
┃ ┗ 📜rss_lambda_ddb_socialshare_stack.py
┣ 📂tests
┃ ┣ 📂unit
┃ ┃ ┣ 📜__init__.py
┃ ┃ ┗ 📜test_rss_lambda_ddb_socialshare_stack.py
┃ ┗ 📜__init__.py
┣ 📜.gitignore
┣ 📜app.py
┣ 📜cdk.json
┣ 📜README.md
┣ 📜requirements-dev.txt
┣ 📜requirements.txt
┗ 📜source.bat
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The app.py file is your app's entry point. The code in this file instantiates an instance of the &lt;strong&gt;RssLambdaDdbSocialshareStack&lt;/strong&gt; class from the &lt;strong&gt;rss_lambda_ddb_socialshare/rss_lambda_ddb_socialshare_stack.py&lt;/strong&gt; file.&lt;/p&gt;

&lt;p&gt;The most important file, and the one we care about, is &lt;strong&gt;rss_lambda_ddb_socialshare/rss_lambda_ddb_socialshare_stack.py&lt;/strong&gt;; this is where most of the code to define resources will be added.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
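&lt;p&gt;The gist isn't rendered in this feed; for reference, the scaffold that &lt;strong&gt;cdk init&lt;/strong&gt; generates in that file looks roughly like this (the class name is derived from the project name):&lt;/p&gt;

```python
from aws_cdk import Stack
from constructs import Construct


class RssLambdaDdbSocialshareStack(Stack):

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # The constructs defining our resources will be added here
```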



&lt;p&gt;We will modify the stack to create the below resources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;DynamoDB Table&lt;/li&gt;
&lt;li&gt;2 Lambda Functions&lt;/li&gt;
&lt;li&gt;EventBridge rule&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The first construct we will add to our stack is a DynamoDB Table. You'll need to import &lt;strong&gt;RemovalPolicy&lt;/strong&gt; and &lt;strong&gt;aws_dynamodb as dynamodb&lt;/strong&gt;, then modify the stack; I have included code snippets that you can use to overwrite what is currently in the &lt;strong&gt;rss_lambda_ddb_socialshare/rss_lambda_ddb_socialshare_stack.py&lt;/strong&gt; file.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
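&lt;p&gt;Since the gist doesn't render in this feed, here is a hedged sketch of the table construct inside the stack's &lt;strong&gt;__init__&lt;/strong&gt;; the construct id and key name are illustrative, not the exact values from the post:&lt;/p&gt;

```python
from aws_cdk import RemovalPolicy
from aws_cdk import aws_dynamodb as dynamodb

# Inside RssLambdaDdbSocialshareStack.__init__; id and key name are illustrative
table = dynamodb.Table(
    self, "RssTable",
    partition_key=dynamodb.Attribute(name="id", type=dynamodb.AttributeType.STRING),
    removal_policy=RemovalPolicy.DESTROY,
)
```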


&lt;p&gt;The removal policy we have defined in the &lt;strong&gt;DynamoDB Table&lt;/strong&gt; construct is destructive; for production environments one would use the &lt;strong&gt;RETAIN&lt;/strong&gt; attribute, e.g. &lt;strong&gt;removal_policy=RemovalPolicy.RETAIN&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;cdk synth&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The next command to execute is &lt;strong&gt;cdk synth&lt;/strong&gt;; this will generate a CloudFormation template for our current stack, which will be used for deployment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;cdk bootstrap&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once synthesized, we need to prepare an environment (target AWS account and AWS Region) for deployment. This is done using the &lt;strong&gt;cdk bootstrap&lt;/strong&gt; command. This will provision an S3 bucket to store files as well as create IAM roles that are required to perform deployments. This command only needs to be run once.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdk bootstrap aws://{AWS-ACCOUNT-NUMBER}/{REGION}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4z7exstcvkq59zrgcok.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp4z7exstcvkq59zrgcok.png" alt="CDK Bootstrap" width="800" height="370"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;cdk deploy&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;cdk deploy&lt;/strong&gt; command will deploy our stack to AWS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapk12tgw9qb0qya0ey14.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapk12tgw9qb0qya0ey14.png" alt="CDK Deploy" width="800" height="201"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can verify the deployment was successful by viewing the terminal output; alternatively, log into the &lt;strong&gt;AWS Management Console&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw05jris8sl4443ctwqth.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw05jris8sl4443ctwqth.png" alt="CDK DDB Deploy Success" width="800" height="211"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmyocwp3ra9j7qh2jfloy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmyocwp3ra9j7qh2jfloy.png" alt="DynamoDB Overview" width="800" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Conclusion&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;That concludes &lt;strong&gt;Part 1&lt;/strong&gt;. In an upcoming blog post, we will learn how to create constructs for Lambda functions, grant permissions to constructs within the stack and add the actual code for our Lambda functions that performs the magic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Resources&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cdk/v2/guide/stacks.html" rel="noopener noreferrer"&gt;Stacks&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cdk/v2/guide/constructs.html" rel="noopener noreferrer"&gt;Constructs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cdk/api/v2/python/aws_cdk/RemovalPolicy.html" rel="noopener noreferrer"&gt;RemovalPolicy&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>cdk</category>
      <category>lambda</category>
      <category>dynamodb</category>
      <category>python</category>
    </item>
    <item>
      <title>A tale of invocation - Using AWS Lambda to transfer files from AWS S3 to Azure Blob Storage</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Sat, 25 Mar 2023 19:31:51 +0000</pubDate>
      <link>https://forem.com/aws-builders/a-tale-of-invocation-using-aws-lambda-to-transfer-files-from-aws-s3-to-azure-blob-storage-4ko6</link>
      <guid>https://forem.com/aws-builders/a-tale-of-invocation-using-aws-lambda-to-transfer-files-from-aws-s3-to-azure-blob-storage-4ko6</guid>
      <description>&lt;p&gt;A few days ago I decided to scan a popular freelance site for an interesting problem to solve without the intention of applying for the job, as lady luck would have it I stumbled across an interesting challenge, the client needed to copy data from an S3 bucket to Azure Blob Storage using Power Automate. Seems like a fairly easy task right ?&lt;/p&gt;

&lt;p&gt;I started off with reading the &lt;a href="https://learn.microsoft.com/en-us/connectors/amazons3/" rel="noopener noreferrer"&gt;Microsoft Docs&lt;/a&gt; and reviewing the Amazon S3 Power Platform Connector.&lt;/p&gt;

&lt;p&gt;There is a caveat when it comes to using the connector, as Microsoft explicitly states a known limit: object sizes need to be less than 3.5 MB. As is to be expected with low-code development platforms, there’s always a catch.&lt;/p&gt;

&lt;p&gt;This seemed like a perfect opportunity to build a Lambda function with Python.&lt;/p&gt;

&lt;p&gt;In this post I will show you how to create a Lambda function that is invoked once a file has been uploaded to S3, and how to use Lambda layers to package the libraries the function depends on.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pre-requisites:
&lt;/h2&gt;

&lt;p&gt;• AWS Account&lt;br&gt;
• Amazon S3 Bucket&lt;br&gt;
• Azure Subscription&lt;br&gt;
• Azure Storage Account&lt;br&gt;
• Docker&lt;/p&gt;

&lt;p&gt;This post assumes that you have already created an S3 bucket in AWS and an Azure Storage account with a container. If you’re familiar with Pulumi, I have included Pulumi programs to create the required resources in both clouds in the below repo:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/AdrianM10/lambda-s3-to-blob/tree/main/IaC" rel="noopener noreferrer"&gt;lambda-s3-to-blob&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Step 1: Create an IAM execution role
&lt;/h2&gt;

&lt;p&gt;The Lambda function that we will create in the next step will require permissions to access S3 and write permissions to CloudWatch logs.&lt;/p&gt;

&lt;p&gt;Navigate to the &lt;strong&gt;IAM&lt;/strong&gt; console and select &lt;strong&gt;Roles&lt;/strong&gt; under &lt;strong&gt;Access management&lt;/strong&gt; from the left side-menu, then select the &lt;strong&gt;Create role&lt;/strong&gt; button. On the subsequent page, select &lt;strong&gt;AWS service&lt;/strong&gt; as the &lt;strong&gt;Trusted entity type&lt;/strong&gt; and &lt;strong&gt;Lambda&lt;/strong&gt; as the &lt;strong&gt;Use case&lt;/strong&gt;. On the &lt;strong&gt;Add permissions&lt;/strong&gt; page you’ll need to search for and select the checkbox for the below permissions:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AmazonS3ReadOnlyAccess&lt;br&gt;
AWSLambdaBasicExecutionRole&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Provide a name for the role, e.g. &lt;strong&gt;s3ToAzureBlobRole&lt;/strong&gt;, and finalize the creation of the role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn82h2216utg09dmv0fka.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn82h2216utg09dmv0fka.png" alt="IAM Role Creation" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Step 2: Create a Lambda function
&lt;/h2&gt;

&lt;p&gt;Navigate towards the Lambda console and select the &lt;strong&gt;Create function&lt;/strong&gt; button.&lt;/p&gt;

&lt;p&gt;Select &lt;strong&gt;Author from scratch&lt;/strong&gt;, and under the &lt;strong&gt;Basic information&lt;/strong&gt; section provide a name for the function, e.g. &lt;strong&gt;s3ToAzureBlob&lt;/strong&gt;, and select &lt;strong&gt;Python 3.9&lt;/strong&gt; under the &lt;strong&gt;Runtime&lt;/strong&gt; drop-down.&lt;/p&gt;

&lt;p&gt;Under the &lt;strong&gt;Architecture&lt;/strong&gt; section, if you’re using Apple Silicon like me, make sure to select &lt;strong&gt;arm64&lt;/strong&gt;; if you’re using an Intel-based machine, select &lt;strong&gt;x86_64&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;We’ll need to expand the &lt;strong&gt;Permissions&lt;/strong&gt; section and make sure &lt;strong&gt;Use an existing role&lt;/strong&gt; is selected; from the drop-down, select the role you created earlier (s3ToAzureBlobRole). Select the &lt;strong&gt;Create function&lt;/strong&gt; button.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu560kt2n66383zg6gtml.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu560kt2n66383zg6gtml.png" alt="AWS Lambda Console" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under the code tab, you’ll need to replace the code shown in the console editor with the code shown below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3
import botocore
import os
import tempfile
from azure.storage.blob import BlobServiceClient
from azure.core.exceptions import ClientAuthenticationError, ServiceRequestError

s3 = boto3.resource('s3')

# Credentials for accessing Azure Blob Storage
storage_account_key = os.environ.get('storage_account_key')
storage_account_name = os.environ.get('storage_account_name')
connection_string = os.environ.get('connection_string')
container_name = os.environ.get('container_name')


def upload_to_blob_storage(file_path, file_name):
    """Upload a file from the /tmp folder to Azure storage as a blob"""
    blob_service_client = BlobServiceClient.from_connection_string(connection_string)
    blob_client = blob_service_client.get_blob_client(container=container_name, blob=file_name)

    with open(file_path, "rb") as data:
        blob_client.upload_blob(data, overwrite=True)
        print(f"{file_name} uploaded to Azure Blob!")


def lambda_handler(event, context):
    # /tmp is the only writable location in the Lambda environment
    temp_file_path = tempfile.gettempdir()

    for record in event['Records']:
        # Get bucket and key from the S3 trigger event
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']

        # Join the /tmp folder path and the file name
        file_name = key
        upload_file_path = os.path.join(temp_file_path, file_name)

        try:
            # Download the object from S3 into /tmp
            s3.meta.client.download_file(bucket, key, upload_file_path)
            upload_to_blob_storage(upload_file_path, file_name)

        except FileNotFoundError:
            print(f"The file {key} does not exist")
        except botocore.exceptions.ClientError as error:
            print(error.response['Error']['Code'], error.response['Error']['Message'])
        except (ClientAuthenticationError, ServiceRequestError) as e:
            print(f"Error uploading file: {e}")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save the changes and select the &lt;strong&gt;Deploy&lt;/strong&gt; button.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Create an S3 trigger
&lt;/h2&gt;

&lt;p&gt;A trigger is a service or resource that invokes your function. We will be using S3 as a source that invokes our function each time an object is uploaded to the S3 bucket.&lt;/p&gt;

&lt;p&gt;Under the &lt;strong&gt;Trigger configuration&lt;/strong&gt;, select &lt;strong&gt;S3&lt;/strong&gt; as the source and select the bucket; under &lt;strong&gt;Event type&lt;/strong&gt; select &lt;strong&gt;Put&lt;/strong&gt;. Ensure the checkbox under &lt;strong&gt;Recursive invocation&lt;/strong&gt; is selected, then select the &lt;strong&gt;Add&lt;/strong&gt; button to complete the trigger configuration.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdbs2izur6q6zr20h9h0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdbs2izur6q6zr20h9h0.png" alt="S3 Trigger" width="800" height="861"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 4: Create a Lambda Layer
&lt;/h2&gt;

&lt;p&gt;Lambda layers allow you to package additional libraries that will be used by your function. Our function will make use of Azure libraries (azure-storage-blob, azure-core) to communicate with Azure services from the Python code.&lt;/p&gt;

&lt;p&gt;You’ll need to create the following directory structure locally; it is compatible with Python 3.9 and will be used with Docker to simulate the Lambda environment.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;├── requirements.txt
└── python/
    └── lib/
        ├── python3.9/
        │   └── site-packages/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the requirements.txt file, specify the Azure Storage Blobs client library as well as the Azure Core shared client library for Python by adding the following lines in the file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;azure-storage-blob
azure-core
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Ensure that the Docker daemon is running. You’ll need to install the library dependencies required by the Lambda function to the subfolders created earlier. Enter the below command:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;docker run -v "$PWD":/var/task "public.ecr.aws/sam/build-python3.9" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.9/site-packages/; exit"&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Once the dependencies have been downloaded you’ll need to zip the contents of the python folder.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;zip -r azurelib.zip python &amp;gt; /dev/null&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Navigate to the Lambda console from your browser and expand the side menu by selecting the hamburger menu in the top left corner of the page. Select &lt;strong&gt;Layers&lt;/strong&gt; under &lt;strong&gt;Additional resources&lt;/strong&gt;, then select the &lt;strong&gt;Create layer&lt;/strong&gt; button. You’ll need to provide a name for the layer, upload the zipped file containing the libraries our function requires to interact with Azure and select &lt;strong&gt;Create&lt;/strong&gt; once you’ve provided the mandatory information.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1p2fia1a89h947597wpk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1p2fia1a89h947597wpk.png" alt="Create Lambda Layer" width="800" height="829"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Navigate towards the function you created earlier, scroll towards the bottom of the page and select &lt;strong&gt;Add a layer&lt;/strong&gt; under the &lt;strong&gt;Layers&lt;/strong&gt; section. Select &lt;strong&gt;Custom layers&lt;/strong&gt;, select the layer you created earlier and select the version.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4sabogynuyif83n8m6b9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4sabogynuyif83n8m6b9.png" alt="Add Lambda Layer" width="800" height="656"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 5: Adding environment variables
&lt;/h2&gt;

&lt;p&gt;Select the &lt;strong&gt;Configuration&lt;/strong&gt; tab, select &lt;strong&gt;Environment variables&lt;/strong&gt;, select the &lt;strong&gt;Edit&lt;/strong&gt; button, then select &lt;strong&gt;Add environment variable&lt;/strong&gt;. Populate the key/value pairs with the credentials required for accessing Azure Blob Storage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftv3oove15igfu8945p0k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftv3oove15igfu8945p0k.png" alt="Add Environment Variables" width="800" height="762"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Log in to the Azure portal to retrieve the Access keys for the Storage account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqmcvzgtvzs0pynysedq1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqmcvzgtvzs0pynysedq1.png" alt="Azure Storage Account Credentials" width="800" height="603"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To test that the solution works, upload a file to the S3 bucket, then review the &lt;strong&gt;CloudWatch&lt;/strong&gt; logs for the log events.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffrfy8pmya2dz76qtqbn3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffrfy8pmya2dz76qtqbn3.png" alt="CloudWatch Log Events" width="800" height="393"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can also check if the container within your Azure Storage Account has an uploaded file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft8un7flvql1estgg2q7o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft8un7flvql1estgg2q7o.png" alt="Azure Storage Container Items" width="800" height="221"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Optional
&lt;/h2&gt;

&lt;p&gt;You might want to increase the &lt;strong&gt;Timeout&lt;/strong&gt; to 15 seconds to prevent the function from timing out. You can also increase the &lt;strong&gt;Ephemeral storage&lt;/strong&gt; from 512 MB up to a maximum of 10 GB.&lt;/p&gt;
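&lt;p&gt;Both settings can also be changed in one boto3 call. The sketch below is an assumption-laden example, not part of the original deployment: pass in a boto3 Lambda client and your own function name.&lt;/p&gt;

```python
def bump_limits(lam, function_name, timeout=15, storage_mb=1024):
    """Raise the Timeout and Ephemeral storage of a Lambda function.

    `lam` is a boto3 Lambda client, e.g. boto3.client("lambda");
    `function_name` is whatever you named the function earlier.
    """
    return lam.update_function_configuration(
        FunctionName=function_name,
        Timeout=timeout,                       # seconds; the default is 3
        EphemeralStorage={"Size": storage_mb}  # MB; 512 up to 10240
    )
```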

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Freelance sites are one of the best sources of new project ideas; building for real requirements leads to a more organic way of learning.&lt;/p&gt;

&lt;p&gt;Thanks for reading, I hope you’ve learnt something new.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>lambda</category>
      <category>python</category>
      <category>s3</category>
    </item>
    <item>
      <title>Using AWS WAF to protect your WordPress site hosted on AWS</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Sat, 18 Feb 2023 08:44:49 +0000</pubDate>
      <link>https://forem.com/aws-builders/using-aws-waf-to-protect-your-wordpress-site-hosted-on-aws-38c3</link>
      <guid>https://forem.com/aws-builders/using-aws-waf-to-protect-your-wordpress-site-hosted-on-aws-38c3</guid>
      <description>&lt;p&gt;First blog post of the new year. It’s been a while. Throughout my cloud journey and continuous learning of the AWS platform and its services, I often notice that security is often overlooked when deploying WordPress sites, most tutorials will guide you through the steps for deploying a highly available WordPress site and neglect to show ways in which you can protect your WordPress site once deployed.&lt;/p&gt;

&lt;p&gt;Disclaimer: the AWS Web Application Firewall (WAF) and managed rules highlighted in this blog post are by no means exhaustive, but they can help improve your security posture and better protect your WordPress site. In this blog post we’ll take a look at using specific AWS Managed Rules for WAF.&lt;/p&gt;

&lt;p&gt;Please note that AWS WAF can only be attached to the following resource types:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon CloudFront distributions&lt;br&gt;
Application Load Balancer&lt;br&gt;
Amazon API Gateway&lt;br&gt;
Amazon AppSync&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;AWS WAF can be thought of as a product that inspects traffic between your web application and the internet; it makes use of rules that allow you to block or allow web requests based on conditions that you define.&lt;/p&gt;

&lt;p&gt;We’ll be making use of a &lt;strong&gt;Web ACL&lt;/strong&gt; and a few &lt;strong&gt;AWS Managed Rules&lt;/strong&gt; to protect our WordPress site.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Managed Rules&lt;/strong&gt; contain pre-defined rules that are designed and managed by AWS and AWS Marketplace sellers to protect your web application.&lt;/p&gt;

&lt;p&gt;The AWS Managed Rule Groups that will be used are listed below:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;WordPress application managed rule group&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The WordPress application rule group contains rules that block request patterns associated with the exploitation of vulnerabilities specific to WordPress sites.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SQL database&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The SQL database rule group contains rules to block request patterns associated with exploitation of SQL databases, like SQL injection attacks. This can help prevent remote injection of unauthorized queries.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PHP Application&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The PHP application rule group contains rules that block request patterns associated with the exploitation of vulnerabilities specific to the use of the PHP programming language, including injection of unsafe PHP functions. This can help prevent exploitation of vulnerabilities that permit an attacker to remotely run code or commands for which they are not authorized.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Admin protection&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Admin protection rule group contains rules that allow you to block external access to exposed administrative pages.&lt;/p&gt;



&lt;p&gt;An important concept to understand is &lt;strong&gt;web ACL capacity units&lt;/strong&gt; (WCU). For each rule group you enable, capacity units accumulate and must not exceed 1,500 WCUs; this is a calculation performed by AWS to control the operating resources required to run your rules, rule groups and web ACLs.&lt;/p&gt;

&lt;p&gt;The estimated cost of one Web ACL is $5.00 per month (prorated hourly). Since we will only be using free rule groups, you will only be charged for the duration that the Web ACL exists.&lt;/p&gt;

&lt;p&gt;Let’s get started by creating our first AWS WAF web ACL with managed rules and attaching it to an application load balancer. Later on, we’ll even restrict access to the WordPress admin area so that it’s only accessible from an IP address that we specify.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up an AWS WAF, Creating a Web ACL &amp;amp; Adding AWS Managed Rule groups
&lt;/h2&gt;



&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Sign in to the &lt;strong&gt;AWS Management Console&lt;/strong&gt; and navigate towards to the &lt;strong&gt;AWS WAF console&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Create web ACL&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the Name field, enter a name for the Web ACL and select the resource type you wish to associate this Web ACL with.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Add AWS resources&lt;/strong&gt;, select an existing resource type (CloudFront / Application Load Balancer etc.) and select &lt;strong&gt;Next&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the &lt;strong&gt;Add rules and rule groups&lt;/strong&gt; page, select the &lt;strong&gt;Add rules&lt;/strong&gt; drop-down from the &lt;strong&gt;Rules&lt;/strong&gt; section.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Add managed rule groups&lt;/strong&gt;, on the &lt;strong&gt;Add managed rule groups&lt;/strong&gt; page, expand AWS managed rule groups.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under the &lt;strong&gt;Free rule groups&lt;/strong&gt; section, switch the toggle to on for the below rule groups:&lt;br&gt;
&lt;br&gt;&lt;br&gt;
&lt;em&gt;Admin protection&lt;br&gt;
PHP Application&lt;br&gt;
SQL database&lt;br&gt;
WordPress application&lt;/em&gt;&lt;br&gt;
&lt;br&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Add rules&lt;/strong&gt;. On the Add rules and rule groups page, select &lt;strong&gt;Next&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the &lt;strong&gt;Set rule priority&lt;/strong&gt; page, you can move the rules up and down to change the evaluation order.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the &lt;strong&gt;Configure metrics&lt;/strong&gt; page, ensure that the rules are checked and the &lt;strong&gt;Enable sampled requests&lt;/strong&gt; option is selected.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the &lt;strong&gt;Review and create web ACL&lt;/strong&gt; page, review the settings you have chosen and then select &lt;strong&gt;Create web ACL&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
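&lt;p&gt;The console steps above can also be scripted. Below is a hedged sketch using the boto3 &lt;code&gt;wafv2&lt;/code&gt; API; the web ACL name, metric names and region are placeholders of my own choosing, not values from this walkthrough.&lt;/p&gt;

```python
# The four free AWS managed rule groups enabled in step 7.
MANAGED_GROUPS = [
    "AWSManagedRulesAdminProtectionRuleSet",
    "AWSManagedRulesPHPRuleSet",
    "AWSManagedRulesSQLiRuleSet",
    "AWSManagedRulesWordPressRuleSet",
]


def managed_rule(name, priority):
    """Build one managed rule group entry for a web ACL's Rules list."""
    return {
        "Name": f"AWS-{name}",
        "Priority": priority,  # evaluation order, as set on the rule priority page
        "Statement": {"ManagedRuleGroupStatement": {"VendorName": "AWS", "Name": name}},
        "OverrideAction": {"None": {}},  # keep the group's own rule actions
        "VisibilityConfig": {"SampledRequestsEnabled": True,
                             "CloudWatchMetricsEnabled": True,
                             "MetricName": f"AWS-{name}"},
    }


def create_wordpress_web_acl(name="wordpress-web-acl", region="eu-west-1"):
    import boto3
    wafv2 = boto3.client("wafv2", region_name=region)
    return wafv2.create_web_acl(
        Name=name,
        Scope="REGIONAL",  # REGIONAL for an ALB; CLOUDFRONT (in us-east-1) for a distribution
        DefaultAction={"Allow": {}},
        Rules=[managed_rule(g, i) for i, g in enumerate(MANAGED_GROUPS)],
        VisibilityConfig={"SampledRequestsEnabled": True,
                          "CloudWatchMetricsEnabled": True,
                          "MetricName": name},
    )
```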

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb9tk1x1sxo4nasgkrl4j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb9tk1x1sxo4nasgkrl4j.png" alt="Review Web ACL 1" width="800" height="634"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxzdphl1qmzv3ybd8qz9c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxzdphl1qmzv3ybd8qz9c.png" alt="Review Web ACL 2" width="800" height="816"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating an IP Set
&lt;/h2&gt;



&lt;p&gt;In order to restrict access to the WordPress admin area to an IP address of your choice, you’ll need to first create an &lt;strong&gt;IP Set&lt;/strong&gt;.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Navigate towards &lt;strong&gt;IP Sets&lt;/strong&gt; and select &lt;strong&gt;Create IP Set&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provide a name, choose the AWS region where your other AWS resources have been deployed and provide an IP address.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F17a8nfpo34ucjfj0ykda.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F17a8nfpo34ucjfj0ykda.png" alt="IP Set" width="800" height="226"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Restricting access to WordPress Admin area
&lt;/h2&gt;



&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Navigate towards &lt;strong&gt;Web ACLs&lt;/strong&gt; and select the Web ACL created earlier.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the &lt;strong&gt;Rules&lt;/strong&gt; tab, select the &lt;strong&gt;AWS-AWSManagedRulesAdminProtectionRuleSet&lt;/strong&gt; rule and select &lt;strong&gt;Edit&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the &lt;strong&gt;Override to Block&lt;/strong&gt; option from the drop-down in the Admin protection Rules section.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the &lt;strong&gt;Scope-down statement&lt;/strong&gt; section, ensure the checkbox for &lt;strong&gt;Enable scope-down statement&lt;/strong&gt; is selected.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the &lt;strong&gt;If a request&lt;/strong&gt; drop-down, select the &lt;strong&gt;doesn’t match the statement (NOT)&lt;/strong&gt; option; from the &lt;strong&gt;Inspect&lt;/strong&gt; drop-down, select &lt;strong&gt;Originates from an IP address in&lt;/strong&gt;; from the &lt;strong&gt;IP Set&lt;/strong&gt; drop-down, select the IP set that you created earlier and select the &lt;strong&gt;Source IP address&lt;/strong&gt; radio button.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;Save rule&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
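&lt;p&gt;In API terms, the scope-down statement configured above is simply a NOT statement wrapping an IP set reference. Here is a small sketch of that structure built in Python; the ARN argument is a placeholder for your own IP set's ARN.&lt;/p&gt;

```python
def admin_scope_down(ip_set_arn):
    """Scope-down statement for the Admin protection rule group: the group
    only evaluates requests that do NOT originate from an address in the
    referenced IP set, so everyone else is blocked from the admin pages."""
    return {
        "NotStatement": {
            "Statement": {
                "IPSetReferenceStatement": {"ARN": ip_set_arn}
            }
        }
    }
```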

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbad5n2m5pmcta8314jlb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbad5n2m5pmcta8314jlb.png" alt="Admin protection 1" width="800" height="576"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0qq1b4s8eevlrm9ob7j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0qq1b4s8eevlrm9ob7j.png" alt="Admin protection 2" width="800" height="761"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After editing the &lt;strong&gt;AWS-AWSManagedRulesAdminProtectionRuleSet&lt;/strong&gt; rule, if you attempt to access the WordPress admin area from an IP that’s not listed in the IP Set, you will be greeted with a 403 Forbidden response.&lt;/p&gt;

&lt;p&gt;To test it out, you can attempt to reach your WordPress admin area by using a connection through a VPN or via a mobile device or tablet using cellular data.&lt;/p&gt;

&lt;p&gt;That concludes this blog post on how you can better protect your WordPress site using AWS WAF, Web ACLs, IP Sets and AWS Managed Rules.&lt;/p&gt;

</description>
      <category>vibecoding</category>
    </item>
    <item>
      <title>Connect to your Linux EC2 instance using SSH + Visual Studio Code</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Mon, 12 Sep 2022 05:58:02 +0000</pubDate>
      <link>https://forem.com/aws-builders/connect-to-your-linux-ec2-instance-using-ssh-visual-studio-code-49k5</link>
      <guid>https://forem.com/aws-builders/connect-to-your-linux-ec2-instance-using-ssh-visual-studio-code-49k5</guid>
      <description>&lt;p&gt;In this blog post i'll show you how to configure &lt;strong&gt;VS Code&lt;/strong&gt; to allow remote development for Linux based EC2 instances hosted on AWS.&lt;/p&gt;

&lt;p&gt;On the rare occasion that I need to connect to Linux based EC2 instances, I normally use &lt;a href="https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager.html" rel="noopener noreferrer"&gt;Session Manager&lt;/a&gt; through the &lt;strong&gt;AWS Management Console&lt;/strong&gt; or use the &lt;a href="https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager-working-with-install-plugin.html" rel="noopener noreferrer"&gt;Session Manager plugin for AWS CLI&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Recently I took part in a web development project where the remote development environment ran on an EC2 instance. I had no desire to use the &lt;strong&gt;vi text editor&lt;/strong&gt; (that's too hardcore) and preferred to use &lt;strong&gt;VS Code&lt;/strong&gt; from my MacBook Pro, as it has some pretty useful extensions installed.&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;strong&gt;&lt;em&gt;Pre-requisites:&lt;/em&gt;&lt;/strong&gt;
&lt;/h1&gt;



&lt;p&gt;• &lt;strong&gt;AWS Account&lt;/strong&gt;&lt;br&gt;
• &lt;strong&gt;Visual Studio Code&lt;/strong&gt;&lt;/p&gt;



&lt;p&gt;This blog post assumes the reader has some basic knowledge of &lt;strong&gt;SSH&lt;/strong&gt; and is using macOS or a Linux based OS.&lt;/p&gt;

&lt;p&gt;To get started you'll need an &lt;strong&gt;SSH Config file&lt;/strong&gt;, on macOS it can be located in the &lt;strong&gt;~/.ssh&lt;/strong&gt; directory. I've created a new EC2 instance, new key-pair and moved the private key from the &lt;strong&gt;Downloads&lt;/strong&gt; directory to the &lt;strong&gt;~/.ssh&lt;/strong&gt; directory.&lt;/p&gt;

&lt;p&gt;To modify the &lt;strong&gt;config file&lt;/strong&gt;, use the below commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd ~/.ssh
code config
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's add a new section containing our EC2 details. The section must conform to the below structure:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Host {Friendly Name Used for identification}
    HostName {Public DNS of EC2 instance}
    User {username}
    IdentityFile {location of private key}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
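&lt;p&gt;For example, a filled-in entry might look like the following. The host alias, DNS name and key file are placeholders; the default username depends on the AMI (e.g. &lt;strong&gt;ec2-user&lt;/strong&gt; on Amazon Linux, &lt;strong&gt;ubuntu&lt;/strong&gt; on Ubuntu).&lt;/p&gt;

```
Host my-ec2
    HostName ec2-12-34-56-78.eu-west-1.compute.amazonaws.com
    User ec2-user
    IdentityFile ~/.ssh/my-keypair.pem
```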





&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgvanj26w0vsuc7v4u5jt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgvanj26w0vsuc7v4u5jt.png" alt="SSH Config" width="800" height="223"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next you'll need to install the &lt;a href="https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-ssh" rel="noopener noreferrer"&gt;Remote - SSH&lt;/a&gt; Extension in VS Code. &lt;/p&gt;

&lt;p&gt;Once installed, you'll notice the &lt;strong&gt;Remote Explorer&lt;/strong&gt; icon appear on the &lt;strong&gt;Activity Bar&lt;/strong&gt;, select the icon, this will bring the &lt;strong&gt;Primary Side Bar&lt;/strong&gt; into view. &lt;/p&gt;

&lt;p&gt;From the dropdown, select &lt;strong&gt;SSH Targets&lt;/strong&gt;, a list of SSH Targets will be listed.&lt;/p&gt;

&lt;p&gt;Right-click the &lt;strong&gt;SSH target&lt;/strong&gt;, you'll be presented with 2 options: &lt;strong&gt;Connect to Host in Current Window&lt;/strong&gt; and &lt;strong&gt;Connect to Host in New Window&lt;/strong&gt;.&lt;/p&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5amzjyrn6q7tbv3i5s7m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5amzjyrn6q7tbv3i5s7m.png" alt="Remote Explorer" width="800" height="489"&gt;&lt;/a&gt;&lt;/p&gt;



&lt;p&gt;You will then be prompted to verify the fingerprint. Select &lt;strong&gt;Continue&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapvvqq8y3ha4lucaix5k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fapvvqq8y3ha4lucaix5k.png" alt="SSH Fingerprint" width="800" height="491"&gt;&lt;/a&gt;&lt;/p&gt;



&lt;p&gt;On the bottom left corner of VS Code, the &lt;strong&gt;Status bar&lt;/strong&gt; will indicate the connection status.&lt;/p&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj839nldnn02kmliwjg5l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj839nldnn02kmliwjg5l.png" alt="Session" width="800" height="171"&gt;&lt;/a&gt;&lt;/p&gt;



&lt;p&gt;That concludes this blog post where I showed how to configure &lt;strong&gt;VS Code for Remote Development&lt;/strong&gt; for Linux based EC2 instances hosted on AWS.&lt;/p&gt;

</description>
      <category>vscode</category>
      <category>ssh</category>
      <category>ec2</category>
    </item>
    <item>
      <title>Deploy a WordPress Amazon Lightsail instance using Terraform</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Sun, 15 May 2022 18:30:51 +0000</pubDate>
      <link>https://forem.com/aws-builders/deploy-a-wordpress-amazon-lightsail-instance-using-terraform-169i</link>
      <guid>https://forem.com/aws-builders/deploy-a-wordpress-amazon-lightsail-instance-using-terraform-169i</guid>
      <description>&lt;p&gt;Lately I've been learning about Terraform and getting into the habit of deploying infrastructure as code across different cloud service providers, it’s been a while since I found a use case to provision infrastructure on AWS.&lt;/p&gt;

&lt;p&gt;For inspiration behind this blog post, I took a trip down memory lane to late 2021, when I was going through a WordPress learning phase. I initially used Amazon Lightsail for WordPress hosting, and for some reason I used to click through the GUI to provision each Lightsail instance.&lt;/p&gt;

&lt;p&gt;I had previously looked at the documentation around using CloudFormation but never pursued that avenue for Lightsail. With that in the past, and with new-found confidence in infrastructure as code, Terraform in particular, it was time to solve this mini challenge of yesteryear.&lt;/p&gt;

&lt;p&gt;HashiCorp Terraform is an infrastructure as code tool that lets you define both cloud and on-prem resources in human-readable configuration files that you can version, reuse, and share.&lt;/p&gt;

&lt;p&gt;If you’re new to Terraform it might be worthwhile to read the official documentation or watch an awesome intro video by &lt;a href="https://www.techworld-with-nana.com" rel="noopener noreferrer"&gt;Nana Janashia&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.terraform.io/intro" rel="noopener noreferrer"&gt;What is Terraform | Terraform by HashiCorp &lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=l5k1ai_GBDE" rel="noopener noreferrer"&gt;Terraform explained in 15 mins | Terraform Tutorial for Beginners&lt;/a&gt;&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pre-requisites:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;• AWS Account&lt;br&gt;
• &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html" rel="noopener noreferrer"&gt;Named profile configured&lt;/a&gt;&lt;br&gt;
• &lt;a href="https://learn.hashicorp.com/tutorials/terraform/install-cli?in=onboarding/tfcb-week-2" rel="noopener noreferrer"&gt;Terraform must be installed&lt;/a&gt;&lt;br&gt;
• Visual Studio Code &lt;br&gt;
• &lt;a href="https://marketplace.visualstudio.com/items?itemName=AmazonWebServices.aws-toolkit-vscode" rel="noopener noreferrer"&gt;AWS Toolkit extension for VS Code&lt;/a&gt;&lt;br&gt;
&lt;br&gt;&lt;br&gt;
All the code in this blog can be found in the repo:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/AdrianM10/Amazon-Lightsail-Terraform" rel="noopener noreferrer"&gt;Amazon-Lightsail-Terraform&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you have cloned the repo, you'll need to run &lt;strong&gt;terraform init&lt;/strong&gt;; this command initializes the working directory that contains our Terraform configuration files.&lt;/p&gt;

&lt;p&gt;I have included a &lt;strong&gt;variables.tf&lt;/strong&gt; file to move away from as many hard-coded values in the &lt;strong&gt;main.tf file&lt;/strong&gt; as possible. &lt;/p&gt;

&lt;p&gt;This allows versatility and makes the template reusable for different types of Lightsail blueprints, e.g. LAMP, Node.js, Joomla, GitLab etc.&lt;/p&gt;

&lt;p&gt;Official documentation from AWS on blueprints can be found &lt;a href="https://awscli.amazonaws.com/v2/documentation/api/latest/reference/lightsail/get-blueprints.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;In order to find the available blueprint IDs, type the below command in the AWS CLI:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;aws lightsail get-blueprints&lt;/strong&gt;&lt;/p&gt;
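&lt;p&gt;The same lookup can be done from Python if you prefer. The sketch below is an assumed equivalent using boto3's Lightsail client, with the response parsing split out so it's easy to follow.&lt;/p&gt;

```python
def blueprint_ids(response):
    """Pull the blueprintId field out of a get_blueprints response."""
    return [b["blueprintId"] for b in response.get("blueprints", [])]


def fetch_blueprint_ids(region="eu-west-1"):
    import boto3
    # Uses your configured named profile / credentials, like the CLI does.
    lightsail = boto3.client("lightsail", region_name=region)
    return blueprint_ids(lightsail.get_blueprints())
```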

&lt;p&gt;When you have entered your desired inputs in the &lt;strong&gt;variables.tf&lt;/strong&gt; file, you can apply your configuration using &lt;strong&gt;terraform apply&lt;/strong&gt; and the Amazon Lightsail instance will be provisioned.&lt;/p&gt;

&lt;p&gt;Useful resources:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lightsail_instance" rel="noopener noreferrer"&gt;Resource: aws_lightsail_instance&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://lightsail.aws.amazon.com/ls/docs/en_us/articles/understanding-regions-and-availability-zones-in-amazon-lightsail" rel="noopener noreferrer"&gt;Regions and Availability Zones in Amazon Lightsail&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>terraform</category>
      <category>wordpress</category>
    </item>
    <item>
      <title>Encrypting Attached EBS Volumes</title>
      <dc:creator>Adrian Mudzwiti </dc:creator>
      <pubDate>Tue, 12 Apr 2022 19:41:23 +0000</pubDate>
      <link>https://forem.com/aws-builders/encrypting-attached-ebs-volumes-2bfe</link>
      <guid>https://forem.com/aws-builders/encrypting-attached-ebs-volumes-2bfe</guid>
      <description>&lt;p&gt;Time and time again whenever I'm reviewing an AWS environment I've always noticed that many customers neglect enabling encryption features for data at rest for EC2 volumes and snapshots, RDS and even S3 buckets and their objects.&lt;/p&gt;

&lt;p&gt;As with most cloud deployments it's fairly easy to provision resources and run them for quite some time until you realize that a flag or toggle that could improve your security posture should have been selected at the time of provisioning. &lt;/p&gt;

&lt;p&gt;Once a resource has been provisioned without enabling encryption at rest, the task of enabling encryption becomes a rather serious challenge.&lt;/p&gt;

&lt;p&gt;In this blog post I'll show you how to encrypt EBS volumes that are attached to EC2 instances. This is a fairly manual task that has to be repeated for each attached EBS volume, so plan for downtime... it will take some time.&lt;/p&gt;

&lt;p&gt;AWS has a feature that needs to be configured per region to enable EBS encryption by default. This can be found in the &lt;strong&gt;Account attributes&lt;/strong&gt; tile under the EC2 Dashboard page. Now would be the perfect time to enable this feature for future deployments.&lt;/p&gt;
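&lt;p&gt;If you'd rather flip that regional setting from code, the EC2 API exposes it directly. A minimal sketch, assuming you pass in a boto3 EC2 client for the region you want:&lt;/p&gt;

```python
def enable_default_ebs_encryption(ec2):
    """Enable EBS encryption by default for the client's region and
    return the resulting setting. `ec2` is a boto3 EC2 client,
    e.g. boto3.client("ec2", region_name="eu-west-1")."""
    ec2.enable_ebs_encryption_by_default()
    return ec2.get_ebs_encryption_by_default()["EbsEncryptionByDefault"]
```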

&lt;p&gt;Back to the task at hand, encrypting an EBS volume that is attached to a running EC2 instance has a few steps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Remediation:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1:&lt;/strong&gt; Navigate towards the EC2 console and select &lt;strong&gt;Volumes&lt;/strong&gt; under the &lt;strong&gt;Elastic Block Store&lt;/strong&gt; section.&lt;br&gt;
&lt;strong&gt;Step 2:&lt;/strong&gt; Select an unencrypted volume and then select the &lt;strong&gt;Actions&lt;/strong&gt; button and from the dropdown select &lt;strong&gt;Create snapshot&lt;/strong&gt;.&lt;br&gt;
&lt;strong&gt;Step 3:&lt;/strong&gt; Navigate towards &lt;strong&gt;Snapshots&lt;/strong&gt; on the left-hand side under the &lt;strong&gt;Elastic Block Store&lt;/strong&gt; section. &lt;br&gt;
&lt;strong&gt;Step 4:&lt;/strong&gt; Select the newly created snapshot, select the &lt;strong&gt;Actions&lt;/strong&gt; button then select &lt;strong&gt;Copy snapshot&lt;/strong&gt;. &lt;br&gt;
&lt;strong&gt;Step 5:&lt;/strong&gt; Under encryption, select the checkbox next to &lt;strong&gt;Encrypt this snapshot&lt;/strong&gt; and proceed to select the &lt;strong&gt;Copy snapshot&lt;/strong&gt; button at the bottom of the screen. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F435vclin107eie8nfpbc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F435vclin107eie8nfpbc.png" alt=" " width="800" height="897"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 6:&lt;/strong&gt; Create a volume from the copied snapshot by selecting the snapshot, then select the &lt;strong&gt;Actions&lt;/strong&gt; button and select &lt;strong&gt;Create volume from snapshot&lt;/strong&gt;. On the &lt;strong&gt;Create volume&lt;/strong&gt; page, ensure you select the same &lt;strong&gt;Availability Zone&lt;/strong&gt; your EC2 instance was originally deployed in (open another tab and verify the Availability Zone), then scroll to the bottom of the page and confirm that encryption is enabled.&lt;br&gt;
&lt;strong&gt;Step 7:&lt;/strong&gt; Navigate towards &lt;strong&gt;Volumes&lt;/strong&gt;; you'll see the newly created encrypted volume in an &lt;strong&gt;Available&lt;/strong&gt; state. &lt;br&gt;
&lt;strong&gt;Step 8:&lt;/strong&gt; Navigate back towards the EC2 Dashboard and select &lt;strong&gt;Instances&lt;/strong&gt; under the Resources tile. Select the instance from which you intend to detach the unencrypted EBS volume (and attach the newly encrypted one), select the &lt;strong&gt;Storage&lt;/strong&gt; tab and copy the &lt;strong&gt;Root device name&lt;/strong&gt; to your clipboard. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsunvwfhc50b88hecr7fk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsunvwfhc50b88hecr7fk.png" alt=" " width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;
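&lt;p&gt;If you'd rather not copy the &lt;strong&gt;Root device name&lt;/strong&gt; from the console, it can also be fetched with the AWS CLI's &lt;code&gt;--query&lt;/code&gt; option. A minimal sketch that only assembles and prints the command; the instance ID is a placeholder:&lt;/p&gt;

```python
# Sketch: fetch the Root device name from step 8 via the AWS CLI
# instead of copying it from the console's Storage tab.
# The instance ID below is a placeholder -- substitute your own.
instance_id = "i-0123456789abcdef0"

cmd = (
    f"aws ec2 describe-instances --instance-ids {instance_id} "
    "--query 'Reservations[0].Instances[0].RootDeviceName' --output text"
)
print(cmd)
```

&lt;p&gt;Running the printed command (with real credentials and a real instance ID) returns something like &lt;code&gt;/dev/xvda&lt;/code&gt;, which is exactly the value you'll need again in step 11.&lt;/p&gt;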

&lt;p&gt;&lt;strong&gt;Step 9:&lt;/strong&gt; Stop the running instance, then navigate to &lt;strong&gt;Volumes&lt;/strong&gt; under the &lt;strong&gt;Elastic Block Store&lt;/strong&gt; section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhf49z9mbxd01aw4fgsd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhf49z9mbxd01aw4fgsd.png" alt=" " width="800" height="95"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 10:&lt;/strong&gt; Select the unencrypted volume, then select the &lt;strong&gt;Actions&lt;/strong&gt; button and from the dropdown select &lt;strong&gt;Detach volume&lt;/strong&gt;.&lt;br&gt;
&lt;strong&gt;Step 11:&lt;/strong&gt; Now select the encrypted volume, select the &lt;strong&gt;Actions&lt;/strong&gt; button and from the dropdown select &lt;strong&gt;Attach volume&lt;/strong&gt;. On the attach volume page, select your instance, paste the &lt;strong&gt;Root device name&lt;/strong&gt; from step 8 into the &lt;strong&gt;Device name&lt;/strong&gt; textbox, then select the &lt;strong&gt;Attach volume&lt;/strong&gt; button at the bottom of the screen.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85qv7dkseq8tzpyppnjw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85qv7dkseq8tzpyppnjw.png" alt=" " width="800" height="783"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 12:&lt;/strong&gt; The final step is to start your instance. &lt;/p&gt;

&lt;p&gt;In 12 steps I've shown you how to encrypt an EBS volume that is attached to an EC2 instance. If you only have a couple of EBS volumes this shouldn't take long; just make a note of each &lt;strong&gt;Root device name&lt;/strong&gt;. &lt;/p&gt;
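&lt;p&gt;If you're facing more than a handful of volumes, clicking through the console twelve times per volume gets old fast; the steps above map cleanly onto AWS CLI commands. Here's a minimal Python sketch that assembles that command sequence; every ID, the Availability Zone, the region and the device name are placeholder assumptions:&lt;/p&gt;

```python
# Sketch: the console walkthrough above expressed as the equivalent
# AWS CLI commands. This only builds and prints the commands -- all
# IDs, the zone, region and device name are placeholders.

def encryption_commands(volume_id, snapshot_id, encrypted_snapshot_id,
                        encrypted_volume_id, instance_id, az, region, device):
    """Return the ordered AWS CLI commands for the snapshot-copy
    encryption workflow."""
    return [
        # Snapshot the unencrypted volume
        f"aws ec2 create-snapshot --volume-id {volume_id}",
        # Steps 4-5: copy the snapshot with encryption enabled
        f"aws ec2 copy-snapshot --source-region {region} "
        f"--source-snapshot-id {snapshot_id} --encrypted",
        # Step 6: create a volume from the encrypted copy, in the
        # instance's Availability Zone
        f"aws ec2 create-volume --snapshot-id {encrypted_snapshot_id} "
        f"--availability-zone {az}",
        # Step 9: stop the instance before swapping volumes
        f"aws ec2 stop-instances --instance-ids {instance_id}",
        # Step 10: detach the unencrypted volume
        f"aws ec2 detach-volume --volume-id {volume_id}",
        # Step 11: attach the encrypted volume at the same device name
        f"aws ec2 attach-volume --volume-id {encrypted_volume_id} "
        f"--instance-id {instance_id} --device {device}",
        # Step 12: start the instance again
        f"aws ec2 start-instances --instance-ids {instance_id}",
    ]

for command in encryption_commands(
        "vol-0aaa", "snap-0bbb", "snap-0ccc", "vol-0ddd",
        "i-0eee", "eu-west-2a", "eu-west-2", "/dev/xvda"):
    print(command)
```

&lt;p&gt;Note that this only prints the commands. Running them for real means substituting valid IDs, and you'd want to wait for each resource to reach the right state between steps (for example &lt;code&gt;aws ec2 wait snapshot-completed&lt;/code&gt; after the copy, and &lt;code&gt;aws ec2 wait instance-stopped&lt;/code&gt; before detaching).&lt;/p&gt;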

&lt;p&gt;I'm having flashbacks of a customer who had over 100 unencrypted volumes in use in production. That brings this blog post to a close. &lt;/p&gt;

</description>
      <category>aws</category>
      <category>ec2</category>
      <category>encryption</category>
      <category>security</category>
    </item>
  </channel>
</rss>
