<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Ashwin Venkatesan</title>
    <description>The latest articles on Forem by Ashwin Venkatesan (@imash24).</description>
    <link>https://forem.com/imash24</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2784402%2F33ee6a66-c85f-4ba6-bc76-8b9dba652081.jpeg</url>
      <title>Forem: Ashwin Venkatesan</title>
      <link>https://forem.com/imash24</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/imash24"/>
    <language>en</language>
    <item>
      <title>Sample b</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Thu, 26 Feb 2026 05:52:10 +0000</pubDate>
      <link>https://forem.com/imash24/sample-b-gab</link>
      <guid>https://forem.com/imash24/sample-b-gab</guid>
      <description>&lt;ul&gt;
&lt;li&gt;Structure Terraform projects using reusable modules (e.g., VPC, compute, database) instead of defining all resources in a single root configuration.&lt;/li&gt;
&lt;li&gt;Organize code with a clear separation between modules/ (reusable logic) and environment folders like dev/, prod/ to maintain clarity.&lt;/li&gt;
&lt;li&gt;Avoid code duplication by applying the DRY principle, using variables to customize modules per environment.&lt;/li&gt;
&lt;li&gt;Ensure each environment has isolated configurations and state to reduce blast radius and prevent accidental production impact.&lt;/li&gt;
&lt;/ul&gt;

&lt;ul&gt;
&lt;li&gt;Always store Terraform state in a remote backend (e.g., S3, Terraform Cloud, Azure Storage) instead of local state to enable secure team collaboration.&lt;/li&gt;
&lt;li&gt;Enable state locking (e.g., DynamoDB with the S3 backend) to prevent concurrent terraform apply operations and avoid race conditions.&lt;/li&gt;
&lt;li&gt;Never commit terraform.tfstate files to version control, as they may contain sensitive data and infrastructure mappings.&lt;/li&gt;
&lt;li&gt;Use separate remote state files for each environment (dev, staging, prod) to ensure isolation and reduce blast radius.&lt;/li&gt;
&lt;li&gt;Protect state integrity through backend versioning and access control (IAM roles, RBAC) to prevent accidental modification or corruption.&lt;/li&gt;
&lt;/ul&gt;
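
&lt;p&gt;One way to keep per-environment isolation honest is to derive state keys mechanically instead of typing them by hand. A minimal Python sketch, assuming a hypothetical env/component/terraform.tfstate layout (one common convention, not an official requirement):&lt;/p&gt;

```python
# Illustrative helper (not Terraform itself): derive an isolated
# remote-state key per environment, rejecting unknown environments
# so a typo cannot silently point at the wrong state file.
def state_key(env, component):
    allowed = {"dev", "staging", "prod"}
    if env not in allowed:
        raise ValueError(f"unknown environment: {env}")
    return f"{env}/{component}/terraform.tfstate"

print(state_key("prod", "vpc"))  # prod/vpc/terraform.tfstate
```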

&lt;ul&gt;
&lt;li&gt;Never hardcode sensitive values (passwords, API keys, access keys) directly in Terraform configuration files, as this exposes credentials in version control and increases security risk.&lt;/li&gt;
&lt;li&gt;Use secure secret management solutions such as environment variables, IAM roles, AWS Secrets Manager, HashiCorp Vault, or CI/CD secret stores to inject sensitive values securely.&lt;/li&gt;
&lt;li&gt;Mark sensitive input variables with sensitive = true to prevent secret values from being displayed in CLI output and logs.&lt;/li&gt;
&lt;li&gt;Ensure Terraform state files are protected, encrypted, and stored in secure remote backends, since state may contain sensitive data.&lt;/li&gt;
&lt;li&gt;Follow the principle of least privilege by granting only the minimum required permissions to Terraform execution roles.&lt;/li&gt;
&lt;/ul&gt;
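
&lt;p&gt;The environment-variable approach can be sketched in a few lines. DB_PASSWORD is a hypothetical name; in practice the value would be injected by a CI/CD secret store, Vault, or AWS Secrets Manager rather than set in code:&lt;/p&gt;

```python
import os

# Sketch of the "no hardcoded secrets" rule: read the value from the
# environment and fail loudly if it is missing, instead of falling
# back to a credential committed to version control.
def load_secret(name):
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"secret {name} is not set")
    return value

os.environ["DB_PASSWORD"] = "example-only"  # stand-in for an injected secret
print(load_secret("DB_PASSWORD"))
```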

&lt;p&gt;Subject: Request for Billing Adjustment – Unexpected Charges&lt;/p&gt;

&lt;p&gt;Hello AWS Support Team,&lt;/p&gt;

&lt;p&gt;I am a student currently learning AWS and working on a personal project (POC). &lt;/p&gt;

&lt;p&gt;Recently, I noticed an unexpected charge of approximately INR 9000 in my account. This was unintentional and happened due to my lack of experience in managing AWS resources properly.&lt;/p&gt;

&lt;p&gt;As soon as I realized this, I immediately stopped and terminated all running resources to prevent further charges.&lt;/p&gt;

&lt;p&gt;I kindly request that you consider a one-time billing adjustment or waiver of this amount. As a student, this cost is quite significant for me, and I am actively learning and planning to continue using AWS responsibly in the future.&lt;/p&gt;

&lt;p&gt;I truly appreciate your support and understanding.&lt;/p&gt;

&lt;p&gt;Thank you,&lt;br&gt;
[Your Name]&lt;/p&gt;

</description>
    </item>
    <item>
      <title>12 Simple AWS Tips That Actually Help in Real Projects</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Sun, 08 Feb 2026 14:38:39 +0000</pubDate>
      <link>https://forem.com/imash24/12-simple-aws-tips-that-actually-help-in-real-projects-33i6</link>
      <guid>https://forem.com/imash24/12-simple-aws-tips-that-actually-help-in-real-projects-33i6</guid>
      <description>&lt;p&gt;These aren’t advanced tricks or new services.&lt;br&gt;
Just small habits that make AWS clearer, safer, and cheaper over time.&lt;/p&gt;

&lt;p&gt;1️⃣ Name every resource properly&lt;/p&gt;

&lt;p&gt;Random IDs are fine for AWS — not for humans.&lt;br&gt;
A simple naming format like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;env-service-purpose
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
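
&lt;p&gt;keeps resources identifiable at a glance. A tiny illustrative helper (not an AWS API) that enforces the convention:&lt;/p&gt;

```python
# Illustrative helper enforcing the env-service-purpose convention:
# lowercase parts joined with hyphens, rejecting empty or odd input.
def resource_name(env, service, purpose):
    parts = [env, service, purpose]
    for part in parts:
        if not part or not part.replace("-", "").isalnum():
            raise ValueError(f"invalid name part: {part!r}")
    return "-".join(part.lower() for part in parts)

print(resource_name("prod", "api", "logs"))  # prod-api-logs
```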



&lt;p&gt;2️⃣ Check the Billing dashboard daily&lt;/p&gt;

&lt;p&gt;Even if you’re a beginner.&lt;/p&gt;

&lt;p&gt;Just open:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Current month spend&lt;/li&gt;
&lt;li&gt;Service-wise cost&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This removes fear around AWS bills and builds cost awareness early.&lt;/p&gt;

&lt;p&gt;3️⃣ Delete unused resources immediately&lt;/p&gt;

&lt;p&gt;Stopped using something? Delete it now.&lt;/p&gt;

&lt;p&gt;“Later” usually means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Forgotten EC2s&lt;/li&gt;
&lt;li&gt;Idle Load Balancers&lt;/li&gt;
&lt;li&gt;Surprise charges&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;4️⃣ Security Groups are stateful (important)&lt;/p&gt;

&lt;p&gt;If inbound traffic is allowed, return traffic is allowed automatically.&lt;/p&gt;

&lt;p&gt;Knowing this prevents:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Extra outbound rules&lt;/li&gt;
&lt;li&gt;Unnecessary open ports&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;5️⃣ Most AWS issues are IAM issues&lt;/p&gt;

&lt;p&gt;Before blaming the service, always check:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Role attached?&lt;/li&gt;
&lt;li&gt;Correct policy?&lt;/li&gt;
&lt;li&gt;Correct resource?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Permissions fail more often than AWS itself.&lt;/p&gt;

&lt;p&gt;6️⃣ Start with the simplest architecture&lt;/p&gt;

&lt;p&gt;Don’t jump straight to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Multi-AZ&lt;/li&gt;
&lt;li&gt;Auto Scaling&lt;/li&gt;
&lt;li&gt;Complex networking&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Build simple → understand → then scale.&lt;/p&gt;

&lt;p&gt;7️⃣ Think in flows, not services&lt;/p&gt;

&lt;p&gt;Instead of memorising services, think:&lt;/p&gt;

&lt;p&gt;Request → Load balancer → App → Database → Response&lt;/p&gt;

&lt;p&gt;This helps in:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Debugging&lt;/li&gt;
&lt;li&gt;Architecture design&lt;/li&gt;
&lt;li&gt;Interviews&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;8️⃣ Always know what is public and what is private&lt;/p&gt;

&lt;p&gt;Ask yourself:&lt;/p&gt;

&lt;p&gt;Is this resource internet-facing?&lt;/p&gt;

&lt;p&gt;Who can access it?&lt;/p&gt;

&lt;p&gt;Public exposure is the #1 beginner mistake.&lt;/p&gt;

&lt;p&gt;9️⃣ Tags are not optional&lt;/p&gt;

&lt;p&gt;Even simple tags like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Project
Environment
Owner
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
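
&lt;p&gt;make cost allocation, ownership, and cleanup far easier. A minimal sketch of merging such defaults into per-resource tags (all values here are illustrative):&lt;/p&gt;

```python
# Baseline tags every resource should carry; resource-specific tags
# are merged on top and win on conflict. Values are illustrative.
DEFAULT_TAGS = {"Project": "my-project", "Environment": "dev", "Owner": "imash24"}

def with_default_tags(extra):
    tags = dict(DEFAULT_TAGS)
    tags.update(extra)  # resource-specific tags override the defaults
    return tags

print(with_default_tags({"Name": "dev-api-logs"}))
```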



&lt;p&gt;🔟 Don’t overuse free tier blindly&lt;/p&gt;

&lt;p&gt;Free tier ≠ free forever.&lt;/p&gt;

&lt;p&gt;Some services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Expire after 12 months&lt;/li&gt;
&lt;li&gt;Have usage limits&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Always check the fine print.&lt;/p&gt;

&lt;p&gt;1️⃣1️⃣ Logs are your best friend&lt;/p&gt;

&lt;p&gt;Before guessing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Check CloudWatch logs&lt;/li&gt;
&lt;li&gt;Check metrics&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS usually tells you what is wrong — if you look.&lt;/p&gt;

&lt;p&gt;1️⃣2️⃣ Consistency matters more than speed&lt;/p&gt;

&lt;p&gt;You don’t need to learn everything at once.&lt;/p&gt;

&lt;p&gt;Small, consistent AWS usage beats binge-learning services.&lt;/p&gt;

&lt;p&gt;Final thought&lt;/p&gt;

&lt;p&gt;AWS mastery doesn’t come from knowing all services.&lt;/p&gt;

&lt;p&gt;It comes from:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Clear thinking&lt;/li&gt;
&lt;li&gt;Good habits&lt;/li&gt;
&lt;li&gt;Real usage&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The rest follows naturally.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>devops</category>
      <category>beginners</category>
    </item>
    <item>
      <title>What AWS Certifications Don’t Teach You — Lessons from Real Cloud Projects</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Wed, 24 Dec 2025 07:17:14 +0000</pubDate>
      <link>https://forem.com/imash24/what-aws-certifications-dont-teach-you-lessons-from-real-cloud-projects-2pl4</link>
      <guid>https://forem.com/imash24/what-aws-certifications-dont-teach-you-lessons-from-real-cloud-projects-2pl4</guid>
      <description>&lt;p&gt;I used to think AWS was about &lt;strong&gt;collecting certifications&lt;/strong&gt; and drawing clean architectures.&lt;/p&gt;

&lt;p&gt;Then I built real projects.&lt;/p&gt;

&lt;p&gt;And everything changed.&lt;/p&gt;

&lt;p&gt;Certifications helped me understand the landscape — the names, the possibilities, the patterns. They gave me &lt;strong&gt;breadth&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Projects gave me something completely different — &lt;strong&gt;the ability to touch AWS, mess up configurations, stare at logs, question my life choices, fix them, redeploy,&lt;/strong&gt; and finally see it work. That gave me &lt;strong&gt;depth&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Most importantly, projects gave me confidence through failure, not success.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The AWS services I actually worked with&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Compute &amp;amp; Execution&lt;/strong&gt;: EC2, AWS Lambda&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Containers&lt;/strong&gt;: ECS (Fargate), ECR&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Networking&lt;/strong&gt;: VPC, ALB, Route 53, Security Groups&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CDN &amp;amp; Storage&lt;/strong&gt;: CloudFront, S3&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CI/CD&lt;/strong&gt;: CodePipeline, CodeBuild, CodeDeploy&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Observability&lt;/strong&gt;: CloudWatch (Logs, Metrics, Alarms), AWS X-Ray&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI Layer&lt;/strong&gt;: Amazon Bedrock (Llama 3), Amazon SNS for alerts&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These weren’t services I memorized for an exam — these were services I argued with until they started working.&lt;/p&gt;

&lt;h2&gt;
  
  
  Common failures that humbled me (and what they taught me)
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;CloudFront + S3 showing 403&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I assumed S3 had to be public so CloudFront could read it. Wrong.&lt;br&gt;
The fix was learning how CloudFront OAC signs requests: S3 should trust CloudFront, not the world.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;ALB Target Group Unhealthy&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I thought my app was broken. It wasn’t.&lt;br&gt;
The security group only allowed my IP, not the ALB.&lt;br&gt;
Lesson learned: AWS debugging is 50% IAM + 50% networking + 0% magic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;ECS task stopped immediately&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I blamed Fargate. Turns out I misconfigured the container port and IAM execution role.&lt;br&gt;
Fixing it taught me the difference between task role vs execution role — something the exam never forced me to truly internalize.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lambda timeout in VPC&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I enabled VPC thinking it improves security. It did — by killing internet access.&lt;br&gt;
Then I learned why NAT Gateway matters for outbound, and why CloudWatch logs don’t magically appear without proper routes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;CodePipeline deploy failures&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is where IAM introduced itself personally.&lt;br&gt;
&lt;strong&gt;Missing s3:GetObject&lt;/strong&gt;, missing deploy permissions, missing log creation access — each retry taught me how to write minimal, correct IAM policies instead of admin-access panic mode.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Minute errors that cost maximum time&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;One missing slash in health check path&lt;/li&gt;
&lt;li&gt;One port mismatch between ALB and container&lt;/li&gt;
&lt;li&gt;One wrong IAM action&lt;/li&gt;
&lt;li&gt;One region mismatch in a CLI command&lt;/li&gt;
&lt;/ul&gt;
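
&lt;p&gt;Mistakes like these are cheap to catch before deploying. A toy sanity check over hypothetical config values (illustrative helpers, not an AWS API):&lt;/p&gt;

```python
# Toy pre-deploy checks for the small mistakes listed above.
def health_path_ok(path):
    # ALB health check paths must start with a slash
    return path.startswith("/")

def ports_match(alb_target_port, container_port):
    # the ALB target group port must equal the container port
    return alb_target_port == container_port

assert health_path_ok("/health")
assert not health_path_ok("health")      # the missing slash
assert not ports_match(8080, 3000)       # the port mismatch
print("sanity checks passed")
```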

&lt;p&gt;These tiny things didn’t just teach AWS — they taught me patience, precision, and why real engineers obsess over logs instead of hype.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How I handled working with multiple services together&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;I didn’t learn AWS by reading 100 blogs. I learned by&lt;/strong&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Building something small&lt;/li&gt;
&lt;li&gt;Watching it fail in a confusing new way&lt;/li&gt;
&lt;li&gt;Reading AWS docs like a detective&lt;/li&gt;
&lt;li&gt;Digging through CloudWatch logs&lt;/li&gt;
&lt;li&gt;Tracing requests in X-Ray&lt;/li&gt;
&lt;li&gt;Fixing, redeploying, verifying&lt;/li&gt;
&lt;li&gt;Breaking again on purpose to understand it better&lt;/li&gt;
&lt;li&gt;Writing about it publicly so someone else can suffer less than me&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That loop is my real AWS course.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What I wish someone told me when I started:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Certifications make you aware, projects make you capable&lt;/li&gt;
&lt;li&gt;AWS feels easy until you open IAM or VPC&lt;/li&gt;
&lt;li&gt;Debugging is a skill, not a side quest&lt;/li&gt;
&lt;li&gt;The cloud rewards builders, not memorizers&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Precision &amp;gt; ambition, logs &amp;gt; assumptions&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You learn faster when you deploy and fail publicly than when you pass privately.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final takeaway&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;AWS didn’t become clear to me when I passed exams.&lt;br&gt;
It became clear when I failed deployments, fixed them, and realized:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;“I can build anything on AWS… as long as I expect it to break first.”&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;And honestly? That’s a comforting thought.&lt;/p&gt;

&lt;p&gt;Because now, when something fails, I don’t panic.&lt;br&gt;
I open CloudWatch. I open X-Ray. I open IAM.&lt;br&gt;
And I start solving.&lt;/p&gt;

&lt;p&gt;Not because I know everything,&lt;br&gt;
but because &lt;strong&gt;I know how to figure it out&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If my errors can save someone even 30 minutes of debugging, this post has already succeeded.&lt;/p&gt;

&lt;p&gt;Certifications gave me the map.&lt;br&gt;
Projects taught me the terrain.&lt;br&gt;
Failures trained me for the real world.&lt;/p&gt;

&lt;p&gt;And I’m still learning — in public — one broken deployment at a time.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>serverless</category>
      <category>community</category>
    </item>
    <item>
      <title>Building a Serverless AI Fitness Coach on AWS Using Bedrock (Llama 3), Lambda &amp; CloudFront</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Sat, 29 Nov 2025 08:10:05 +0000</pubDate>
      <link>https://forem.com/imash24/building-a-serverless-ai-fitness-coach-on-aws-using-bedrock-llama-3-lambda-cloudfront-3bf0</link>
      <guid>https://forem.com/imash24/building-a-serverless-ai-fitness-coach-on-aws-using-bedrock-llama-3-lambda-cloudfront-3bf0</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbpf1y2aqwyhrzxr845np.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbpf1y2aqwyhrzxr845np.png" alt=" " width="800" height="493"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Most fitness and calorie-tracking apps today start free… until they quietly push you into &lt;strong&gt;₹400–₹600/month subscriptions&lt;/strong&gt;.&lt;br&gt;
And honestly, for something as simple as calorie estimation and basic meal suggestions, that always felt unnecessary to me.&lt;/p&gt;

&lt;p&gt;So instead of paying for another premium plan, I decided to build my own &lt;strong&gt;AI-powered Fitness Coach&lt;/strong&gt; using AWS services.&lt;/p&gt;

&lt;p&gt;This project is fully serverless, extremely low-cost, and powered by Amazon Bedrock with the Llama 3 model.&lt;br&gt;
You can enter whatever meal you had, and the app will immediately give you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Estimated calories&lt;/li&gt;
&lt;li&gt;Simple diet improvement tips&lt;/li&gt;
&lt;li&gt;Balanced meal suggestions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And the best part?&lt;br&gt;
The whole thing runs for &lt;strong&gt;less than the price of one tea per month&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In this article, I’ll walk through the entire build with screenshots — from creating the DynamoDB table, IAM role, Lambda backend, Bedrock integration, API Gateway setup, and finally hosting the UI using S3 + CloudFront.&lt;/p&gt;

&lt;p&gt;If you're learning AWS, Bedrock, or serverless architecture, this is a great hands-on project to try.&lt;/p&gt;

&lt;p&gt;Let’s start.&lt;/p&gt;
&lt;h2&gt;
  
  
  1. Creating the DynamoDB Table
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zjfkrdmo1h3hyzne936.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zjfkrdmo1h3hyzne936.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I started the project by setting up the DynamoDB table that will store all the user meal history and AI responses.&lt;/p&gt;

&lt;p&gt;In the first screenshot, you can see me in the DynamoDB → Tables → Create Table page.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Table name&lt;/strong&gt;: fitness-coach-history&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Partition key&lt;/strong&gt;: userId (String)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sort key&lt;/strong&gt;: timestamp (String)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This structure lets us store multiple meal entries per user in chronological order.&lt;br&gt;
Once these two fields are added, the table setup is almost done.&lt;/p&gt;
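
&lt;p&gt;For reference, an item written against this key schema looks like the following sketch, which mirrors what the Lambda code later in the post stores (the meal text and calorie reply are illustrative):&lt;/p&gt;

```python
import time

# Sketch of an item matching the table's key schema: userId is the
# partition key and timestamp the sort key, both stored as DynamoDB
# String ("S") attributes, so entries sort chronologically per user.
def build_item(user_id, prompt, response):
    return {
        "userId": {"S": user_id},
        "timestamp": {"S": str(int(time.time()))},
        "prompt": {"S": prompt},
        "response": {"S": response},
    }

item = build_item("defaultUser", "2 rotis and dal", "Estimated: ~350 kcal")
print(item["userId"])
```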

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjto5jo21w78afs7v3fgq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjto5jo21w78afs7v3fgq.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the second screenshot, you can see that the table was successfully created.&lt;br&gt;
No extra configurations here — just a clean table ready for Lambda to write into.&lt;/p&gt;
&lt;h2&gt;
  
  
  2. Creating the IAM Role for Lambda
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg931pp31hawzbyki6jj4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg931pp31hawzbyki6jj4.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With DynamoDB ready, I moved on to IAM to create a role that my Lambda function will use.&lt;/p&gt;

&lt;p&gt;Photo 3 shows me inside the &lt;strong&gt;IAM&lt;/strong&gt; → Roles section, clicking &lt;strong&gt;Create Role&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Here I selected:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Trusted entity type&lt;/strong&gt;: AWS service&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use case&lt;/strong&gt;: Lambda&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffdn73c23s25a6ga04cym.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffdn73c23s25a6ga04cym.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This ensures Lambda can assume this role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdpch4uu0it0x3d94cysu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdpch4uu0it0x3d94cysu.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, I attached the &lt;strong&gt;AWSLambdaBasicExecutionRole policy&lt;/strong&gt;, which allows Lambda to write logs to CloudWatch.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdaybcljolbwnq8u46t1k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdaybcljolbwnq8u46t1k.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the above, you can see the review screen where I named the role:&lt;br&gt;
&lt;strong&gt;LambdaBedrockFitnessRole&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once I created the role, it was ready for adding custom permissions.&lt;/p&gt;
&lt;h2&gt;
  
  
  3. Adding Inline Policy for DynamoDB &amp;amp; Bedrock Access
&lt;/h2&gt;

&lt;p&gt;Next, I needed the Lambda function to access both DynamoDB and Amazon Bedrock.&lt;br&gt;
So I added an inline policy to the IAM role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffo7snaqc3f11eoczepwm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffo7snaqc3f11eoczepwm.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This screenshot shows the &lt;strong&gt;“Create Inline Policy”&lt;/strong&gt; screen for this role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpxgz12fvdneeo9llbhys.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpxgz12fvdneeo9llbhys.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, I added a JSON policy that gives:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;dynamodb:PutItem&lt;/li&gt;
&lt;li&gt;dynamodb:GetItem&lt;/li&gt;
&lt;li&gt;dynamodb:Query&lt;/li&gt;
&lt;li&gt;Bedrock invoke model permissions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The only part that needs to be customised is the Resource ARN, where you replace:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;region → ap-south-1&lt;/li&gt;
&lt;li&gt;account ID → your AWS account ID&lt;/li&gt;
&lt;li&gt;table name → fitness-coach-history&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here is the policy you can copy and paste.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DynamoDBAccess",
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem",
        "dynamodb:GetItem",
        "dynamodb:Query"
      ],
      "Resource": "arn:aws:dynamodb:ap-south-1:YOUR-ACCOUNT-ID:table/fitness-coach-history"
    },
    {
      "Sid": "BedrockInvokeModel",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
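
&lt;p&gt;If it helps, the Resource ARN in the policy above can be assembled programmatically; YOUR-ACCOUNT-ID stays a placeholder until you substitute your own account ID:&lt;/p&gt;

```python
# Build the DynamoDB table ARN used in the policy above from its
# three customisable parts: region, account ID, and table name.
def dynamodb_table_arn(region, account_id, table):
    return f"arn:aws:dynamodb:{region}:{account_id}:table/{table}"

print(dynamodb_table_arn("ap-south-1", "YOUR-ACCOUNT-ID", "fitness-coach-history"))
```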



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5satfdk4g4aeajik0frp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5satfdk4g4aeajik0frp.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Name your policy and review the changes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcb1xhk8tw1vcyvk9a7qm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcb1xhk8tw1vcyvk9a7qm.png" alt=" " width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we have created the policy and successfully attached it to this role.&lt;/p&gt;

&lt;p&gt;At this point, DynamoDB and IAM setup is fully complete.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Creating the Lambda Function
&lt;/h2&gt;

&lt;p&gt;With IAM ready, the next step was to build the Lambda function that connects everything together — &lt;strong&gt;DynamoDB, Bedrock, and our frontend&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0ous38w8awqorwth7h2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0ous38w8awqorwth7h2.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Function name&lt;/strong&gt;: FitnessCoachLambda&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Runtime&lt;/strong&gt;: Python 3.14&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Execution role&lt;/strong&gt;: the IAM role we created earlier (LambdaBedrockFitnessRole)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Everything else was left as default.&lt;br&gt;
This Lambda function will become the core engine of the application.&lt;/p&gt;
&lt;h2&gt;
  
  
  5. Adding Environment Variables
&lt;/h2&gt;

&lt;p&gt;Before writing the logic, I added two environment variables for cleaner configuration:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3elq4ekuvyrnzni0hd3r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3elq4ekuvyrnzni0hd3r.png" alt=" " width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;TABLE_NAME&lt;/strong&gt; → fitness-coach-history&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MODEL_ID&lt;/strong&gt; → meta.llama3-8b-instruct-v1:0&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These variables help keep the code neat and allow us to change model or table names without modifying the function itself.&lt;/p&gt;

&lt;p&gt;I also grabbed the model ID directly from the Bedrock console (Llama 3 8B Instruct); the screenshot below shows exactly where to copy it from.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyh8svhna2at58cmxwhfr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyh8svhna2at58cmxwhfr.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  6. Writing the Lambda Code
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd7pm4ne0sxjn4wt4hlpc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd7pm4ne0sxjn4wt4hlpc.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Lambda function is responsible for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Receiving the user’s meal input&lt;/li&gt;
&lt;li&gt;Building a prompt for Amazon Bedrock&lt;/li&gt;
&lt;li&gt;Invoking the Llama 3 model&lt;/li&gt;
&lt;li&gt;Storing the response in DynamoDB&lt;/li&gt;
&lt;li&gt;Returning the AI-generated advice back to the user&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once I pasted the Python code (shown in full below), I deployed the function directly from the console.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import boto3
import time
import os
from datetime import datetime

# Clients for Bedrock and DynamoDB
bedrock = boto3.client("bedrock-runtime")
dynamodb = boto3.client("dynamodb")

TABLE = os.environ["TABLE_NAME"]
MODEL_ID = os.environ["MODEL_ID"]

def lambda_handler(event, context):
    # 1) Parse HTTP request body
    body_str = event.get("body", "{}")
    try:
        body = json.loads(body_str)
    except json.JSONDecodeError:
        body = {}

    prompt = body.get("prompt")
    user_id = body.get("userId", "defaultUser")

    if not prompt:
        return {
            "statusCode": 400,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"error": "prompt required"})
        }

    # 2) Build Llama 3 prompt (instruction style)
    system_message = (
        "You are a strict but friendly fitness coach. "
        "First line MUST be: 'Estimated: ~XXX kcal' if the user describes food. "
        "Then give simple, practical advice and 2-3 improvements. "
        "If the user asks workout questions, give sets, reps, and rest time. "
        "Avoid medical claims."
    )

    llama_prompt = (
        "&amp;lt;|begin_of_text|&amp;gt;"
        "&amp;lt;|start_header_id|&amp;gt;system&amp;lt;|end_header_id|&amp;gt;\n"
        f"{system_message}\n"
        "&amp;lt;|start_header_id|&amp;gt;user&amp;lt;|end_header_id|&amp;gt;\n"
        f"{prompt}\n"
        "&amp;lt;|start_header_id|&amp;gt;assistant&amp;lt;|end_header_id|&amp;gt;\n"
    )

    request_body = {
        "prompt": llama_prompt,
        "max_gen_len": 300,
        "temperature": 0.7
    }

    # 3) Call Llama 3 on Bedrock
    response = bedrock.invoke_model(
        modelId=MODEL_ID,
        contentType="application/json",
        body=json.dumps(request_body)
    )

    payload = json.loads(response["body"].read())

    # Llama 3 Instruct returns 'generation'
    ai_output = payload.get("generation")

    if not ai_output:
        ai_output = "Model did not return output. Please try again."

    # 4) Save to DynamoDB
    timestamp = str(int(time.time()))
    date_str = datetime.utcnow().strftime("%Y-%m-%d")

    dynamodb.put_item(
        TableName=TABLE,
        Item={
            "userId": {"S": user_id},
            "timestamp": {"S": timestamp},
            "prompt": {"S": prompt},
            "response": {"S": ai_output},
            "date": {"S": date_str}
        }
    )

    # 5) Return JSON response
    return {
        "statusCode": 200,
        "headers": {
            "Access-Control-Allow-Origin": "*",
            "Content-Type": "application/json"
        },
        "body": json.dumps({"fitness_coach_reply": ai_output})
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
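&lt;p&gt;Because the request parsing is plain Python, it can be sanity-checked locally before deploying. Here is a small sketch (no AWS calls; the sample event is made up) that mirrors the handler's body parsing:&lt;/p&gt;

```python
import json

def parse_body(event):
    """Mirrors the handler's request parsing (local sketch, no AWS calls)."""
    try:
        body = json.loads(event.get("body", "{}"))
    except (json.JSONDecodeError, TypeError):
        body = {}
    return body.get("prompt"), body.get("userId", "defaultUser")

# Simulate the event shape API Gateway delivers to Lambda
event = {"body": json.dumps({"prompt": "I ate 2 eggs and toast"})}
prompt, user_id = parse_body(event)
print(prompt)    # I ate 2 eggs and toast
print(user_id)   # defaultUser

# A malformed body falls back to an empty dict instead of crashing
print(parse_body({"body": "not json"}))  # (None, 'defaultUser')
```

Running this locally confirms the 400 path triggers only when `prompt` is genuinely missing.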



&lt;h2&gt;
  
  
  7. Testing the Lambda Function
&lt;/h2&gt;

&lt;p&gt;After deploying the code, I ran a simple test event to make sure everything was working properly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1j32hxj48nbk5nbxr3nk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1j32hxj48nbk5nbxr3nk.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The test returned&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;HTTP 200 OK&lt;/li&gt;
&lt;li&gt;AI-generated calorie estimation&lt;/li&gt;
&lt;li&gt;Meal suggestions&lt;/li&gt;
&lt;li&gt;No errors in CloudWatch logs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This confirmed that Bedrock access, DynamoDB writes, and the Lambda logic were all functioning end-to-end.&lt;/p&gt;

&lt;h2&gt;
  
  
  8. Verifying the Database Entry
&lt;/h2&gt;

&lt;p&gt;Next, I switched to DynamoDB to make sure the data was actually being logged.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv6ldgjy63ykukslc1rir.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv6ldgjy63ykukslc1rir.png" alt=" " width="800" height="428"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Inside the “Explore items” section, I found new entries created by the test:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;userId&lt;/li&gt;
&lt;li&gt;timestamp&lt;/li&gt;
&lt;li&gt;prompt&lt;/li&gt;
&lt;li&gt;AI response&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This validated that the Lambda-to-DynamoDB integration was correct.&lt;/p&gt;

&lt;h2&gt;
  
  
  9. Creating the API Gateway Endpoint
&lt;/h2&gt;

&lt;p&gt;With Lambda working perfectly, the next step was to make it accessible from the frontend.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7nju8sy3g5w6liz47ktb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7nju8sy3g5w6liz47ktb.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I used an HTTP API (not a REST API) since it's faster and cheaper for a simple Lambda proxy integration like this.&lt;/p&gt;

&lt;p&gt;Inside the API Gateway console:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Created a new HTTP API&lt;/li&gt;
&lt;li&gt;Selected Lambda as the integration type&lt;/li&gt;
&lt;li&gt;Chose the region and selected my Lambda function&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is the simplest and most efficient way to build a lightweight serverless API endpoint.&lt;/p&gt;

&lt;h2&gt;
  
  
  10. Configuring the Route
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8bmcztle9f4l63421ke2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8bmcztle9f4l63421ke2.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the integration was connected, I defined a route:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Method&lt;/strong&gt;: POST&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Path&lt;/strong&gt;: /ask&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration target&lt;/strong&gt;: our Lambda function (FitnessCoachLambda)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This route will be used by the HTML frontend to send the meal details and receive AI-generated feedback.&lt;/p&gt;
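&lt;p&gt;For readers who prefer scripting the console steps, the same HTTP API and route can be sketched with boto3's "quick create" parameters. The name and Lambda ARN below are placeholders, not values from this project:&lt;/p&gt;

```python
# Parameters for apigatewayv2 create_api (sketch; the ARN is a placeholder)
create_api_params = {
    "Name": "fitness-coach-api",
    "ProtocolType": "HTTP",
    # Quick create: Target auto-creates the Lambda proxy integration
    "Target": "arn:aws:lambda:ap-south-1:123456789012:function:FitnessCoachLambda",
    "RouteKey": "POST /ask",   # HTTP APIs combine method and path in one key
}
# boto3.client("apigatewayv2").create_api(**create_api_params) would create it
# and return an ApiEndpoint (the Invoke URL base)
print(create_api_params["RouteKey"])
```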

&lt;h2&gt;
  
  
  11. Deploying the API
&lt;/h2&gt;

&lt;p&gt;After reviewing all the configurations, I created the API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq45fz5uayfjuduja3p9m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq45fz5uayfjuduja3p9m.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The moment the API was deployed, API Gateway generated an &lt;strong&gt;Invoke URL&lt;/strong&gt; — this URL is what the JavaScript inside the frontend will call whenever a user enters their meal.&lt;/p&gt;

&lt;p&gt;This completed the entire backend:&lt;br&gt;
&lt;strong&gt;DynamoDB → IAM → Lambda → Bedrock → API Gateway&lt;/strong&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  12. Testing the API with Postman
&lt;/h2&gt;

&lt;p&gt;After deploying the API, the next step was validating whether the endpoint works outside Lambda.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3i04a5kv3i01tw2x9cg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3i04a5kv3i01tw2x9cg.png" alt=" " width="800" height="538"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I tested it using Postman:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Selected &lt;strong&gt;POST&lt;/strong&gt; method&lt;/li&gt;
&lt;li&gt;Pasted the Invoke URL from API Gateway&lt;/li&gt;
&lt;li&gt;Added a JSON body such as:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "prompt": "Give me a push pull workout plan",
  "userId": "ashwin"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Added a header:
&lt;strong&gt;Content-Type&lt;/strong&gt;: application/json&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once executed, Postman returned a &lt;strong&gt;200 OK&lt;/strong&gt; along with the AI-generated workout plan.&lt;br&gt;
This confirmed that the API Gateway → Lambda → Bedrock flow was working fully end-to-end.&lt;/p&gt;
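&lt;p&gt;The same request Postman sends can be reproduced from Python's standard library. The Invoke URL below is a placeholder for your own:&lt;/p&gt;

```python
import json
import urllib.request

INVOKE_URL = "https://example.execute-api.ap-south-1.amazonaws.com/ask"  # placeholder

payload = {"prompt": "Give me a push pull workout plan", "userId": "ashwin"}
req = urllib.request.Request(
    INVOKE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it and return the JSON reply;
# shown unexecuted here since the URL is a placeholder.
print(req.get_method())  # POST
```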
&lt;h2&gt;
  
  
  13. Enabling CORS on API Gateway
&lt;/h2&gt;

&lt;p&gt;To allow the frontend to call the API successfully, I enabled CORS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdzczwxom3nkms4o2iee5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdzczwxom3nkms4o2iee5.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The configuration included:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access-Control-Allow-Origin → *&lt;/li&gt;
&lt;li&gt;Access-Control-Allow-Headers → content-type&lt;/li&gt;
&lt;li&gt;Access-Control-Allow-Methods → POST&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This ensures the browser doesn’t block requests when the HTML file tries to call the API.&lt;/p&gt;
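&lt;p&gt;If you ever need to script this instead of using the console, the same three settings map onto API Gateway's CORS configuration object. A minimal sketch (the API ID would be your own):&lt;/p&gt;

```python
# CORS settings matching the console configuration above (sketch)
cors_config = {
    "AllowOrigins": ["*"],
    "AllowHeaders": ["content-type"],
    "AllowMethods": ["POST"],
}
# boto3.client("apigatewayv2").update_api(ApiId="your-api-id",
#     CorsConfiguration=cors_config) would apply it to the HTTP API
print(cors_config["AllowMethods"])
```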
&lt;h2&gt;
  
  
  14. Creating the S3 Bucket for the Frontend UI
&lt;/h2&gt;

&lt;p&gt;With the backend ready, I moved on to hosting the user interface.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxbxjw4ckuwlm5slny4oe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxbxjw4ckuwlm5slny4oe.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bucket name: ai-fitness-coach-2025-ui&lt;/li&gt;
&lt;li&gt;Bucket type: General Purpose&lt;/li&gt;
&lt;li&gt;Standard settings for ACL and ownership&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This bucket will store the index.html file that users interact with.&lt;/p&gt;
&lt;h2&gt;
  
  
  15. Uploading the Frontend File
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4z5arcjbvu9qq810a2j2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4z5arcjbvu9qq810a2j2.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Inside the bucket, I uploaded the &lt;strong&gt;index.html&lt;/strong&gt; file.&lt;br&gt;
This file contains the entire UI along with the JavaScript that calls the backend API.&lt;/p&gt;

&lt;p&gt;Once uploaded, it immediately appeared in the object list of the bucket.&lt;/p&gt;

&lt;h2&gt;
  
  
  16. Enabling Static Website Hosting
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ytlr361lllyqi2qgvtw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ytlr361lllyqi2qgvtw.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, I enabled &lt;strong&gt;Static Website Hosting&lt;/strong&gt; in the Properties tab.&lt;/p&gt;

&lt;p&gt;This option turns the S3 bucket into a simple website server.&lt;br&gt;
After enabling, AWS provided a &lt;strong&gt;bucket website endpoint&lt;/strong&gt;, which looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;http://ai-fitness-coach-2025-ui.s3-website.ap-south-1.amazonaws.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Opening this URL displayed the HTML UI exactly as expected.&lt;/p&gt;

&lt;h2&gt;
  
  
  17. Adding the Bucket Policy
&lt;/h2&gt;

&lt;p&gt;To make the website accessible publicly, I added a bucket policy allowing read access to all objects:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::ai-fitness-coach-2025-ui/*"
    }
  ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the policy was applied, the UI became reachable for anyone using the S3 website endpoint.&lt;/p&gt;
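&lt;p&gt;If you prefer the SDK route, the same policy can be attached with boto3; json.dumps turns the dict into the policy string S3 expects:&lt;/p&gt;

```python
import json

bucket = "ai-fitness-coach-2025-ui"
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{bucket}/*",
    }],
}
policy_json = json.dumps(policy)
# boto3.client("s3").put_bucket_policy(Bucket=bucket, Policy=policy_json)
# would apply it (Block Public Access must be disabled on the bucket first)
print(policy_json)
```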

&lt;h2&gt;
  
  
  18. Testing the UI
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7w1u5yrymbgewm0edff.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7w1u5yrymbgewm0edff.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At this point, the static website was fully accessible.&lt;br&gt;
Opening the S3 endpoint displayed the UI where users can enter their meals and send the request to the backend API.&lt;/p&gt;

&lt;p&gt;The UI was functional, but to optimize performance and reduce latency worldwide, I integrated CloudFront next.&lt;/p&gt;

&lt;h2&gt;
  
  
  19. Creating the CloudFront Distribution
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb0x3w8plyjkhieox6a8d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb0x3w8plyjkhieox6a8d.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The final step in the frontend setup was creating a CloudFront distribution.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;During the setup:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I selected the S3 static website endpoint as the origin.&lt;/li&gt;
&lt;li&gt;Kept all recommended/default settings, since CloudFront already optimizes caching, TTLs, and routing for S3 origins.&lt;/li&gt;
&lt;li&gt;No custom behaviors or policies were required for this project.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This simple configuration is enough to get a production-quality CDN in front of the static UI.&lt;/p&gt;
&lt;h2&gt;
  
  
  21. Reviewing and Deploying the Distribution
&lt;/h2&gt;

&lt;p&gt;The review page showed the full configuration — S3 origin, default cache behavior, protocol settings, and standard CloudFront defaults.&lt;/p&gt;

&lt;p&gt;Everything looked good, so I created the distribution.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjgcsck5bx1ad3m7ntebo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjgcsck5bx1ad3m7ntebo.png" alt=" " width="800" height="428"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After a few minutes, CloudFront assigned a global CDN URL, something like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://dxxxxxxxxxxx.cloudfront.net/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  22. Accessing the UI Through CloudFront
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqcnuhw6j0ocdawhuaeez.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqcnuhw6j0ocdawhuaeez.png" alt=" " width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the distribution finished deploying, I opened the CloudFront URL — and the UI loaded instantly.&lt;/p&gt;

&lt;p&gt;The HTML page, JavaScript, and API integration all worked exactly as expected.&lt;br&gt;
Submitting a meal entry triggered the Lambda function, which invoked Amazon Bedrock, stored the result in DynamoDB, and returned the AI-generated feedback right on the UI.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;This completed the full serverless pipeline&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Frontend: CloudFront + S3&lt;/li&gt;
&lt;li&gt;Backend: API Gateway + Lambda&lt;/li&gt;
&lt;li&gt;AI: Amazon Bedrock (Llama 3)&lt;/li&gt;
&lt;li&gt;Database: DynamoDB&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Everything was running smoothly, globally accessible, and extremely cost-efficient.&lt;/p&gt;

&lt;h2&gt;
  
  
  ⭐ Conclusion
&lt;/h2&gt;

&lt;p&gt;This project started with a simple idea — build a personal AI Fitness Coach without paying monthly subscription fees.&lt;br&gt;
By combining AWS serverless services with Amazon Bedrock, it turned into a fully working end-to-end application that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;estimates calories&lt;/li&gt;
&lt;li&gt;gives personalised meal suggestions&lt;/li&gt;
&lt;li&gt;stores user history&lt;/li&gt;
&lt;li&gt;serves a fast UI through CloudFront&lt;/li&gt;
&lt;li&gt;runs at a fraction of traditional app costs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The best part is that the entire architecture is scalable, low-maintenance, and suitable for real production workloads with very minimal cost.&lt;br&gt;
If you’re learning AWS, this project is a great hands-on example of integrating multiple cloud services into a single workflow.&lt;/p&gt;

&lt;p&gt;I’ve included the full source code and steps so you can try it yourself or build on top of it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;⭐ GitHub Repository&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Full code, Lambda function, HTML UI, architecture details, and deployment notes are available here:&lt;/p&gt;

&lt;p&gt;👉 &lt;a href="https://github.com/Imash24/aws-ai-nutrition-coach" rel="noopener noreferrer"&gt;https://github.com/Imash24/aws-ai-nutrition-coach&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>devops</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Deploying A Flask App on AWS ECS with Real-time CloudWatch Monitoring</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Tue, 07 Oct 2025 14:35:16 +0000</pubDate>
      <link>https://forem.com/imash24/deploying-a-flask-app-on-aws-ecs-with-real-time-cloudwatch-monitoring-52cb</link>
      <guid>https://forem.com/imash24/deploying-a-flask-app-on-aws-ecs-with-real-time-cloudwatch-monitoring-52cb</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1oibfeeth580ylf1kwi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl1oibfeeth580ylf1kwi.png" alt=" " width="702" height="520"&gt;&lt;/a&gt;&lt;br&gt;
🚀 Want to see your &lt;strong&gt;Flask app live on AWS ECS within 15 minutes&lt;/strong&gt; — with real-time monitoring dashboards? 🔥&lt;/p&gt;

&lt;p&gt;In this walkthrough, we’ll take a &lt;strong&gt;Dockerized Flask application&lt;/strong&gt; (already pushed to Amazon ECR; you could also use Docker Hub) and deploy it on AWS ECS (Fargate).&lt;/p&gt;

&lt;p&gt;Once deployed, we’ll integrate &lt;strong&gt;Amazon CloudWatch&lt;/strong&gt; to monitor key metrics like CPU and memory utilization, and even visualize them through a &lt;strong&gt;custom CloudWatch Dashboard.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;By the end of this tutorial, you’ll have a &lt;strong&gt;fully managed, scalable, and monitored Flask app&lt;/strong&gt; running seamlessly on AWS.&lt;/p&gt;

&lt;p&gt;⚙️ &lt;strong&gt;Tech Stack &amp;amp; AWS Services Used&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before we dive in, here’s a quick look at the tools and services powering this project:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Flask&lt;/strong&gt; – Python-based lightweight web framework for the application.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Docker&lt;/strong&gt; – To containerize the Flask app.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon ECR&lt;/strong&gt; (Elastic Container Registry) – To store and manage Docker images.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon ECS&lt;/strong&gt; (Elastic Container Service) – To deploy and run the containerized Flask app.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS Fargate&lt;/strong&gt; – Serverless compute engine for running containers without managing EC2 instances.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon CloudWatch&lt;/strong&gt; – To collect metrics, monitor container performance, and visualize data using dashboards.&lt;/li&gt;
&lt;/ul&gt;
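&lt;p&gt;This walkthrough assumes the image is already built and pushed to ECR. For reference, a Dockerfile for a Flask app like this typically looks similar to the following (a sketch; the file names and Python version are assumptions, not this project's exact files):&lt;/p&gt;

```dockerfile
# Sketch of a typical Flask Dockerfile (file names are illustrative)
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
```

The `EXPOSE 5000` line matches the port we will map in the ECS task definition later.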

&lt;h2&gt;
  
  
  Let's Get Started:
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Create an ECS Cluster&lt;/strong&gt;&lt;br&gt;
1. Navigate to Amazon ECS in the AWS Console.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6qcvo95nz5h8xl05xxxu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6qcvo95nz5h8xl05xxxu.png" alt=" " width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2. Now we need to create a Cluster.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft77z8gpc81xzs2p39sx7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft77z8gpc81xzs2p39sx7.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmwtjqr564mzi0m208wi1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmwtjqr564mzi0m208wi1.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Give your Cluster a name – in my case I'm naming it &lt;strong&gt;ECS-FLASK-APP-Cluster&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We don't need to touch the other settings; leave them as default and click &lt;strong&gt;Create&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Wait a few minutes until the cluster is created, after which you will see a success message.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkt7rt8zgsjr4o3287wqu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkt7rt8zgsjr4o3287wqu.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Create a Task Definition&lt;/strong&gt;&lt;br&gt;
Now our Cluster is ready, so we need to create a &lt;strong&gt;Task Definition&lt;/strong&gt;&lt;br&gt;
-- it is basically a blueprint that tells ECS which container to run, how much CPU and memory to allocate, and what kind of networking to use.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the ECS Console, go to &lt;strong&gt;Task Definitions&lt;/strong&gt; --&amp;gt; Click &lt;strong&gt;Create new Task definition&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbjurl1aad7yk429steen.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbjurl1aad7yk429steen.png" alt=" " width="800" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Give a name for the task definition family.&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next, choose the Launch type as &lt;strong&gt;Fargate&lt;/strong&gt; (a serverless compute engine for containers, so we don't need to manage the servers ourselves).&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzfmi96en9sxss8rs83po.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzfmi96en9sxss8rs83po.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Under the Container section, click &lt;strong&gt;Add container&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enter the name for your container.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the &lt;strong&gt;Image URI&lt;/strong&gt; field, paste the image URL from your Amazon ECR repository(you can also use external container repositories like DockerHub, I'm going with ECR).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the port &lt;strong&gt;5000&lt;/strong&gt; as our Flask app is exposed on port 5000.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Allocate the required CPU/memory based on your app's needs (I'm going with 0.5 vCPU, i.e. 512 CPU units, and 1 GB memory).&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fktibw91oatlao30q80er.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fktibw91oatlao30q80er.png" alt=" " width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click Add, wait a few minutes, and save your Task Definition.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs03d8gx65uu4uwxy9akf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs03d8gx65uu4uwxy9akf.png" alt=" " width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
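&lt;p&gt;For reference, the console settings above correspond roughly to these register_task_definition parameters. This is a sketch: the family, container name, and ECR URI are placeholders, and the console also creates a task execution role for pulling from ECR on your behalf:&lt;/p&gt;

```python
# Rough boto3 equivalent of the console steps above (placeholders throughout)
task_def_params = {
    "family": "flask-app-task",
    "requiresCompatibilities": ["FARGATE"],
    "networkMode": "awsvpc",               # required for Fargate tasks
    "cpu": "512",                          # 0.5 vCPU
    "memory": "1024",                      # 1 GB
    "containerDefinitions": [{
        "name": "flask-app",
        "image": "123456789012.dkr.ecr.ap-south-1.amazonaws.com/flask-app:latest",
        "portMappings": [{"containerPort": 5000, "protocol": "tcp"}],
    }],
}
# boto3.client("ecs").register_task_definition(**task_def_params) would register it
print(task_def_params["cpu"], task_def_params["memory"])
```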

&lt;p&gt;&lt;strong&gt;STEP 3: Create an ECS Service&lt;/strong&gt;&lt;br&gt;
Now that our Task Definition is ready, it's time to create a &lt;strong&gt;Service&lt;/strong&gt;. A service tells ECS how many copies (tasks) of our container to run and replaces them automatically if any fail.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Go to your &lt;strong&gt;ECS Cluster&lt;/strong&gt; --&amp;gt; click &lt;strong&gt;Create Service&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhkjoviucwf5383d3z9fu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhkjoviucwf5383d3z9fu.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select your Task Definition and Revision (the one you created in the previous step).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Give the service a name.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fauwm4j9an6fys5xt1f1g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fauwm4j9an6fys5xt1f1g.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Under the &lt;strong&gt;Launch type&lt;/strong&gt;, choose &lt;strong&gt;Fargate.&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgrmonogbfn41kuz56epq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgrmonogbfn41kuz56epq.png" alt=" " width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Under Networking, choose your VPC and subnets, and enable Auto-assign public IP if you want to access the app publicly.&lt;/li&gt;
&lt;li&gt;Create a &lt;strong&gt;Security Group&lt;/strong&gt; and allow inbound traffic on &lt;strong&gt;port 5000&lt;/strong&gt;, which is very important.&lt;/li&gt;
&lt;li&gt;Review all configurations, then Click &lt;strong&gt;Create Service.&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;
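&lt;p&gt;The service settings above can also be sketched as create_service parameters. Again, this is illustrative only; the cluster and service names come from this walkthrough, but the subnet and security group IDs are placeholders:&lt;/p&gt;

```python
# Rough boto3 equivalent of the console steps above (IDs are placeholders)
service_params = {
    "cluster": "ECS-FLASK-APP-Cluster",
    "serviceName": "flask-app-service",
    "taskDefinition": "flask-app-task",   # family name, latest revision
    "desiredCount": 1,
    "launchType": "FARGATE",
    "networkConfiguration": {
        "awsvpcConfiguration": {
            "subnets": ["subnet-aaaa1111"],      # placeholder
            "securityGroups": ["sg-bbbb2222"],   # placeholder; must allow port 5000
            "assignPublicIp": "ENABLED",         # needed for direct public access
        }
    },
}
# boto3.client("ecs").create_service(**service_params) would create the service
print(service_params["launchType"])
```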

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fipvgqs1xy000gk3dsros.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fipvgqs1xy000gk3dsros.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Wait a few moments, and the service will be created.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjigca47w2sj2kd1rwr1t.png" alt=" " width="800" height="428"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;🌐 Step 4: Access the Flask App&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the service is up and running, ECS automatically launches your container on AWS Fargate.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Now it’s time to &lt;strong&gt;verify that your Flask app is live&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;In your &lt;strong&gt;ECS Cluster&lt;/strong&gt;, go to the Tasks tab under your service.&lt;/li&gt;
&lt;li&gt;You’ll see one or more &lt;strong&gt;running tasks&lt;/strong&gt; — click on the Task ID.&lt;/li&gt;
&lt;li&gt;Scroll down to the &lt;strong&gt;Networking&lt;/strong&gt; section.&lt;/li&gt;
&lt;li&gt;Under &lt;strong&gt;Network bindings&lt;/strong&gt;, you’ll find the public IP address assigned to your running container.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwc4m8h6n4h2uzatl8cq4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwc4m8h6n4h2uzatl8cq4.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Copy that IP and open it in your browser.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fknp2udo10t8mm721e2iq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fknp2udo10t8mm721e2iq.png" alt=" " width="800" height="452"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;🎉 Congratulations — Your Flask App is Live!&lt;/strong&gt;&lt;br&gt;
Awesome work! Your Flask application is now &lt;strong&gt;successfully deployed on AWS ECS using Fargate&lt;/strong&gt; 🙌&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now that your app is live, let’s move to the next part — &lt;strong&gt;monitoring and observability&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Create a CloudWatch Dashboard&lt;/strong&gt;&lt;br&gt;
To monitor your Flask app’s performance in real-time, we’ll start by &lt;strong&gt;creating a CloudWatch Dashboard&lt;/strong&gt;. This dashboard will give you a single view of all the important metrics for your ECS service and container.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the Amazon CloudWatch Console → click Dashboards → Create dashboard.&lt;/li&gt;
&lt;li&gt;Give your dashboard a name (e.g., ecs-flask-dashboard).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8gzl6mr8fb4gr1se9jz0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8gzl6mr8fb4gr1se9jz0.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Choose a &lt;strong&gt;widget type&lt;/strong&gt; (like Line, Stacked Area, or Number) depending on what metric you want to display.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Add widget&lt;/strong&gt; and select the &lt;strong&gt;ECS / Fargate metrics&lt;/strong&gt; you want to track — for example:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;CPU Utilization&lt;br&gt;
Memory Utilization&lt;/strong&gt;&lt;/p&gt;
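&lt;p&gt;For reference, a dashboard can also be defined as a JSON body through the CloudWatch API. A minimal widget definition for these two metrics might look like the sketch below (the region, cluster, and service names are illustrative; substitute your own):&lt;/p&gt;

```json
{
  "widgets": [
    {
      "type": "metric",
      "properties": {
        "title": "ECS CPU and Memory",
        "view": "timeSeries",
        "region": "ap-south-1",
        "metrics": [
          ["AWS/ECS", "CPUUtilization", "ClusterName", "flask-cluster", "ServiceName", "flask-service"],
          ["AWS/ECS", "MemoryUtilization", "ClusterName", "flask-cluster", "ServiceName", "flask-service"]
        ]
      }
    }
  ]
}
```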

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffua386n5dbk5npqcbbh8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffua386n5dbk5npqcbbh8.png" alt=" " width="800" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;After configuring and customizing the widgets, &lt;strong&gt;save the dashboard&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Here is the dashboard we created; after a few minutes, the graphs begin to plot the incoming metrics.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4853etp5aen3zpz50edu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4853etp5aen3zpz50edu.png" alt=" " width="800" height="389"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43sbyosiksuqojn6rn5s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43sbyosiksuqojn6rn5s.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;💻 &lt;strong&gt;Full Project Code&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can access the complete Flask ECS project here:&lt;br&gt;
GitHub Link: &lt;a href="https://github.com/Imash24/ECS-FLASK-APP" rel="noopener noreferrer"&gt;https://github.com/Imash24/ECS-FLASK-APP&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Congrats! You’ve successfully deployed a Flask application on AWS ECS using Fargate and set up a CloudWatch dashboard to monitor its performance in real time.&lt;/p&gt;

&lt;p&gt;In this walkthrough, you learned how to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use &lt;strong&gt;ECS Fargate&lt;/strong&gt; to run containerized applications without managing servers.&lt;/li&gt;
&lt;li&gt;Deploy a Dockerized Flask app from ECR.&lt;/li&gt;
&lt;li&gt;Set up an ECS Service to keep your app running and scalable.&lt;/li&gt;
&lt;li&gt;Monitor key metrics like &lt;strong&gt;CPU,&lt;/strong&gt; &lt;strong&gt;memory&lt;/strong&gt;, and &lt;strong&gt;network traffic&lt;/strong&gt; using &lt;strong&gt;CloudWatch Dashboards&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This project gives you a hands-on &lt;strong&gt;understanding of containerized deployments on AWS&lt;/strong&gt;, and is a solid step toward mastering DevOps and cloud-native applications.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>aws</category>
      <category>cloud</category>
      <category>docker</category>
    </item>
    <item>
      <title>AWS CI/CD Made Easy: Build, Deploy, Repeat.</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Mon, 08 Sep 2025 09:01:07 +0000</pubDate>
      <link>https://forem.com/imash24/aws-cicd-made-easy-build-deploy-repeat-4fc9</link>
      <guid>https://forem.com/imash24/aws-cicd-made-easy-build-deploy-repeat-4fc9</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdyq5nk5cem91f5dxryw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdyq5nk5cem91f5dxryw.png" alt=" " width="800" height="1200"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🚀 Tired of deploying your app manually every time you push code to GitHub?&lt;/strong&gt;&lt;br&gt;
What if I told you that with just a few clicks, you can automate the entire process — from a GitHub commit to your app running live on an EC2 instance.&lt;/p&gt;

&lt;p&gt;In this tutorial, I’ll walk you through how &lt;strong&gt;AWS Developer Tools&lt;/strong&gt; — CodeBuild, CodeDeploy, and CodePipeline — work together to create a seamless deployment pipeline. No more manual SSH into EC2, no more missed steps — &lt;strong&gt;just commit, build, deploy, repeat.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🔧Prerequisites&lt;/strong&gt;&lt;br&gt;
Before we jump into AWS, here’s what you’ll need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Source Control&lt;/strong&gt;: GitHub (you can also use CodeCommit, but here we’ll stick with GitHub).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Compute:&lt;/strong&gt; EC2 instance. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CI/CD Tools&lt;/strong&gt;: CodePipeline, CodeBuild, CodeDeploy.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;IAM Roles:&lt;/strong&gt; We’ll create service roles for CodePipeline, CodeBuild, and CodeDeploy, and an instance profile for EC2.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sample Application:&lt;/strong&gt; This works with any stack — Node.js, Python, Java, or even a static website.&lt;/li&gt;
&lt;li&gt;👉 If you don’t have one, feel free to fork &lt;strong&gt;my GitHub repo&lt;/strong&gt; and use that as your sample app.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Link to my GitHub repo&lt;/strong&gt;: &lt;a href="https://github.com/Imash24/AWS-CI-CD-DEMO" rel="noopener noreferrer"&gt;https://github.com/Imash24/AWS-CI-CD-DEMO&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Step 1: Create an EC2 Instance
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxoa1zi2szdx8vfn4b1m0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxoa1zi2szdx8vfn4b1m0.png" alt=" " width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the EC2 Console → Launch a new instance.&lt;/li&gt;
&lt;li&gt;Pick Amazon Linux 2 (free-tier eligible).&lt;/li&gt;
&lt;li&gt;Create/attach a security group that allows:&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Port 22&lt;/strong&gt; (SSH) → so you can connect to it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Port 3000&lt;/strong&gt; (or whatever port your app runs on) → so you can actually access your app in the browser.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Once your instance is up, you can connect to it over SSH, or, the easiest way, use &lt;strong&gt;EC2 Instance Connect&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Formjayve8fnunooxyrqi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Formjayve8fnunooxyrqi.png" alt=" " width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After connecting to your instance, we need to install &lt;strong&gt;Node.js&lt;/strong&gt; as well as the &lt;strong&gt;CodeDeploy agent&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;👉 But wait, what’s this &lt;strong&gt;CodeDeploy Agent&lt;/strong&gt; thing?&lt;br&gt;
Think of it as a &lt;strong&gt;messenger&lt;/strong&gt; that lives inside your EC2 instance. When AWS CodeDeploy pushes a deployment, the agent is the one that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Listens for instructions from the CodeDeploy service.&lt;/li&gt;
&lt;li&gt;Pulls the application files (artifacts) from S3 or GitHub.&lt;/li&gt;
&lt;li&gt;Runs the scripts you define in appspec.yml (like install dependencies, restart the app, etc.).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without the agent, your EC2 has no idea that CodeDeploy even exists — so it’s a must-have.&lt;/p&gt;
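&lt;p&gt;For context, here is what an appspec.yml for an EC2/on-premises deployment typically looks like. This is only a sketch: the destination path and hook script names are illustrative, and the actual file in the repo may differ.&lt;/p&gt;

```yaml
version: 0.0
os: linux
files:
  # Copy the whole revision to the app directory on the instance
  - source: /
    destination: /home/ec2-user/app
hooks:
  AfterInstall:
    - location: scripts/install_dependencies.sh   # illustrative script name
      timeout: 300
      runas: ec2-user
  ApplicationStart:
    - location: scripts/start_app.sh              # illustrative script name
      timeout: 300
      runas: ec2-user
```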

&lt;p&gt;Now let’s install Node.js and the CodeDeploy agent.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Update packages
sudo yum update -y  

# Install Node.js and npm
sudo yum install -y nodejs npm  

# Install CodeDeploy Agent
sudo yum install -y ruby wget
cd /home/ec2-user
wget https://aws-codedeploy-&amp;lt;your-region&amp;gt;.s3.&amp;lt;your-region&amp;gt;.amazonaws.com/latest/install
chmod +x ./install
sudo ./install auto
sudo service codedeploy-agent start

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Make sure to replace &lt;strong&gt;&amp;lt;your-region&amp;gt;&lt;/strong&gt; with your region; in my case it is &lt;strong&gt;ap-south-1&lt;/strong&gt;.
&lt;/li&gt;
&lt;/ul&gt;
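&lt;p&gt;If you prefer not to edit the URL by hand, the region substitution can be scripted (ap-south-1 is used here as an example; set REGION to your own region):&lt;/p&gt;

```shell
# Build the CodeDeploy agent installer URL for a given region
REGION="ap-south-1"   # change this to your region
INSTALL_URL="https://aws-codedeploy-${REGION}.s3.${REGION}.amazonaws.com/latest/install"
echo "$INSTALL_URL"
# https://aws-codedeploy-ap-south-1.s3.ap-south-1.amazonaws.com/latest/install
```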

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5sziqj3gl5m1q2xupqfo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5sziqj3gl5m1q2xupqfo.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Verify the CodeDeploy Agent&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that we’ve installed the CodeDeploy agent, let’s make sure it’s actually running.&lt;/p&gt;

&lt;p&gt;Run this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo systemctl status codedeploy-agent.service 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F59goaynxd1vtq50c7tor.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F59goaynxd1vtq50c7tor.png" alt=" " width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 2: Setting up CodeDeploy Application
&lt;/h2&gt;

&lt;p&gt;Now that we have created an EC2 instance and set up the CodeDeploy agent on it, let’s set up the CodeDeploy application.&lt;/p&gt;

&lt;p&gt;A &lt;strong&gt;CodeDeploy Application&lt;/strong&gt; is simply a logical container that represents your app in AWS.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnid0t5vdl1amltm68i8t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnid0t5vdl1amltm68i8t.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;1️⃣ &lt;strong&gt;Create a CodeDeploy Application&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the AWS CodeDeploy Console → Applications → Create application.&lt;/li&gt;
&lt;li&gt;Give it a name (e.g., cicd-demo-app).&lt;/li&gt;
&lt;li&gt;Choose Compute platform → EC2/On-premises.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ibc4ecnqpp99zia07vc.png" alt=" " width="800" height="424"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;2️⃣ &lt;strong&gt;Create a Deployment Group:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Inside your application, &lt;strong&gt;click Create deployment group&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Give it a name (e.g., cicd-app-deployment-grp).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fff2h34s4m7wqzkt252ui.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fff2h34s4m7wqzkt252ui.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Service role:&lt;/strong&gt;&lt;br&gt;
You’ll need a service role for CodeDeploy (something like CodeDeployServiceRole).&lt;/p&gt;

&lt;p&gt;Attach the managed policy: &lt;strong&gt;AWSCodeDeployRole.&lt;/strong&gt;&lt;br&gt;
This allows CodeDeploy to talk to your EC2.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fma889xr9u5mns5dlp0ib.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fma889xr9u5mns5dlp0ib.png" alt=" " width="800" height="376"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let’s create a service role for CodeDeploy:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the IAM Console → Roles → Create role.&lt;/li&gt;
&lt;li&gt;For Trusted entity type, choose AWS service.&lt;/li&gt;
&lt;li&gt;For Use case, select &lt;strong&gt;CodeDeploy&lt;/strong&gt; → CodeDeploy.&lt;/li&gt;
&lt;li&gt;Click Next and attach the managed policy:&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWSCodeDeployRole&lt;/strong&gt; (this gives CodeDeploy the permissions it needs).&lt;/li&gt;
&lt;li&gt;Give the role a name (e.g., CodeDeployServiceRole).&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Create role&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyh6s464d4zuaxf8akyqf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyh6s464d4zuaxf8akyqf.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Environment configuration:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose Amazon EC2 instances.&lt;/li&gt;
&lt;li&gt;Pick your EC2 instance using tags; in my case I tagged it as server.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flxn5onwpkqo0ma5nd9y1.png" alt=" " width="800" height="373"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Deployment settings:&lt;/strong&gt;&lt;br&gt;
Choose “One at a time” for simplicity (safe deployment).&lt;br&gt;
Once done, CodeDeploy knows:&lt;br&gt;
👉 “This application gets deployed to this EC2 instance with these rules.”&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Create a CodeBuild Project&lt;/strong&gt;&lt;br&gt;
CodeDeploy now knows where to deploy our application (the EC2 instance); next, we need CodeBuild to actually build it.&lt;/p&gt;

&lt;p&gt;1️⃣ &lt;strong&gt;Go to CodeBuild Console&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click Create build project.&lt;/li&gt;
&lt;li&gt;Give it a name (e.g., cicd-demo-build).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ppaudmzktn6963ulbxj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ppaudmzktn6963ulbxj.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2️⃣ &lt;strong&gt;Configure Source&lt;/strong&gt;&lt;br&gt;
Source provider: GitHub (connect your own or forked repo), or simply select the public repository option and paste my GitHub repository link.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjosq1bgwy6tc0yjisbkn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjosq1bgwy6tc0yjisbkn.png" alt=" " width="800" height="380"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3️⃣ &lt;strong&gt;Environment&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Environment image: Managed image.&lt;/li&gt;
&lt;li&gt;Operating system: Amazon Linux 2.&lt;/li&gt;
&lt;li&gt;Runtime: Standard.&lt;/li&gt;
&lt;li&gt;Service role: Create a new role (CodeBuild will do this automatically).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;4️⃣ &lt;strong&gt;Buildspec&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;buildspec.yml&lt;/strong&gt; file gives the main instructions to &lt;strong&gt;CodeBuild&lt;/strong&gt;. It tells AWS exactly how to build your application — whether that’s installing dependencies, running tests, or packaging files.&lt;/p&gt;

&lt;p&gt;I’ve already pushed my own &lt;strong&gt;buildspec.yml&lt;/strong&gt; file into my GitHub repo, so &lt;strong&gt;CodeBuild&lt;/strong&gt; will automatically pick it up during the build stage.&lt;/p&gt;

&lt;p&gt;👉 In short, this file is where we define:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How to install dependencies&lt;/li&gt;
&lt;li&gt;How to run build/test steps&lt;/li&gt;
&lt;li&gt;What files should be bundled as artifacts for deployment&lt;/li&gt;
&lt;/ul&gt;
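&lt;p&gt;As a rough sketch, a buildspec.yml for a Node.js app along these lines could look like the following (the phase commands and artifact patterns are illustrative; the actual file lives in the repo):&lt;/p&gt;

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 18        # illustrative runtime version
  build:
    commands:
      - npm install     # install dependencies
artifacts:
  files:
    - '**/*'            # bundle everything for CodeDeploy
```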

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kgo84vr8lxd08qv42ol.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7kgo84vr8lxd08qv42ol.png" alt=" " width="800" height="372"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;CodeBuild Completed Successfully!&lt;/strong&gt;&lt;br&gt;
We’ve set up and completed our CodeBuild project, and our application can now be built automatically using the instructions we defined in the buildspec.yml file. (The YAML handles dependency installation, build, and artifact creation).&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmui22ubvwyuzrps7a076.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmui22ubvwyuzrps7a076.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The next logical step is to orchestrate the entire &lt;strong&gt;CI/CD flow&lt;/strong&gt; — and that’s exactly where &lt;strong&gt;AWS CodePipeline&lt;/strong&gt; comes in.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚀 &lt;strong&gt;What is CodePipeline?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;AWS CodePipeline is a fully managed &lt;strong&gt;CI/CD&lt;/strong&gt; orchestration service that automates the software release process. It connects different stages of delivery — Source → Build → Test → Deploy — into a single continuous pipeline.&lt;/p&gt;

&lt;p&gt;With CodePipeline, every time you push code to your repository:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The pipeline automatically detects the change (via webhook or polling).&lt;/li&gt;
&lt;li&gt;It triggers CodeBuild to build and test the application.&lt;/li&gt;
&lt;li&gt;Once artifacts are ready, it hands them over to CodeDeploy (or another deploy service).&lt;/li&gt;
&lt;li&gt;The application is deployed to the target environment (EC2, ECS, Lambda, etc.).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;1️⃣ &lt;strong&gt;Go to CodePipeline in AWS Console&lt;/strong&gt;&lt;br&gt;
Open the AWS Management Console → Search for CodePipeline → Click Create pipeline and select &lt;strong&gt;Build a custom pipeline&lt;/strong&gt;.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqjqk1cuvr0vyqdedfg0n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqjqk1cuvr0vyqdedfg0n.png" alt=" " width="800" height="428"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2️⃣ &lt;strong&gt;Pipeline Settings&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Pipeline name → cicd-demo-pipeline (you can name it anything you want).&lt;br&gt;
Service role → Select “New service role” (AWS will automatically create a role for CodePipeline with the right trust relationships).&lt;br&gt;
Leave advanced settings as default → Click Next.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5drxybqg42uwcqcg54e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk5drxybqg42uwcqcg54e.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3️⃣ &lt;strong&gt;Add Source Stage&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is where our code comes from (GitHub in this case).&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Source provider → GitHub (v2).&lt;/li&gt;
&lt;li&gt;Connect to GitHub → Authorize your GitHub account with AWS.&lt;/li&gt;
&lt;li&gt;Repository → Select your repo.&lt;/li&gt;
&lt;li&gt;Branch → main (or whatever branch you want to deploy).&lt;/li&gt;
&lt;li&gt;Change detection → Leave as default (CodePipeline uses webhooks to detect new commits).&lt;/li&gt;
&lt;li&gt;Click Next.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3ke5duzji9ak0tb2ri65.png" alt=" " width="800" height="368"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;4️⃣ &lt;strong&gt;Add Build Stage&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now we connect the CodeBuild project we created earlier.&lt;/li&gt;
&lt;li&gt;Build provider → AWS CodeBuild.&lt;/li&gt;
&lt;li&gt;Project name → Select the project you just created.&lt;/li&gt;
&lt;li&gt;Click Next.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5np2fhz5g4scckf82l9g.png" alt=" " width="800" height="369"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;5️⃣ &lt;strong&gt;Add Deploy Stage&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is where the built artifacts are deployed to your EC2 instance using CodeDeploy.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Deploy provider → AWS CodeDeploy.&lt;/li&gt;
&lt;li&gt;Application name → Choose the CodeDeploy application you created.&lt;/li&gt;
&lt;li&gt;Deployment group → Select the deployment group.&lt;/li&gt;
&lt;li&gt;Click Next.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcetxnbwult5kl7j2gbx2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcetxnbwult5kl7j2gbx2.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;6️⃣ &lt;strong&gt;Review &amp;amp; Create&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Double-check everything → Click Create pipeline.&lt;/li&gt;
&lt;li&gt;AWS will immediately trigger the pipeline for the first time.&lt;/li&gt;
&lt;li&gt;You’ll see stages: Source → Build → Deploy with live status updates.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frooiofqzn1myqa47rhmw.png" alt=" " width="800" height="423"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚨 &lt;strong&gt;Something failed! Don’t panic, this one is expected&lt;/strong&gt; 😂&lt;br&gt;
This is expected if your EC2 instance doesn’t yet have the correct &lt;strong&gt;IAM&lt;/strong&gt; &lt;strong&gt;Instance Profile attached&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why does this happen?&lt;/strong&gt;&lt;br&gt;
Because the CodeDeploy agent running inside your EC2 needs permissions to talk to CodeDeploy and pull artifacts from S3. Without that IAM role attached, the agent has no credentials → so the deployment fails.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fckov4xwfqdhua7w06txe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fckov4xwfqdhua7w06txe.png" alt=" " width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;Fix:&lt;/strong&gt; Go to your EC2 instance → attach an IAM role (Instance Profile) with permissions like &lt;strong&gt;AmazonS3ReadOnlyAccess&lt;/strong&gt; and &lt;strong&gt;AWSCodeDeployRole&lt;/strong&gt;. Once added, rerun the deployment and it should succeed.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4x3d4r8gwatcc0ebqyv6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4x3d4r8gwatcc0ebqyv6.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;We have attached &lt;strong&gt;AmazonS3ReadOnlyAccess&lt;/strong&gt; and &lt;strong&gt;AWSCodeDeployRole&lt;/strong&gt;.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftfv977b7wjfh3rqzvmoo.png" alt=" " width="800" height="424"&gt;
&lt;/li&gt;
&lt;/ul&gt;
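&lt;p&gt;If you want to confirm from code that the instance profile really is attached, here is a small boto3 sketch. The helper is pure Python; the commented-out usage at the bottom needs AWS credentials, and the instance ID shown is a placeholder.&lt;/p&gt;

```python
def extract_instance_profile_arn(describe_response):
    """Pull the IAM instance profile ARN out of an EC2 describe_instances
    response, or return None if no profile is attached."""
    for reservation in describe_response.get("Reservations", []):
        for instance in reservation.get("Instances", []):
            profile = instance.get("IamInstanceProfile")
            if profile:
                return profile.get("Arn")
    return None

# Real usage (requires boto3 and AWS credentials; instance ID is a placeholder):
#   import boto3
#   ec2 = boto3.client("ec2")
#   resp = ec2.describe_instances(InstanceIds=["i-0123456789abcdef0"])
#   print(extract_instance_profile_arn(resp) or "No instance profile attached!")
```

&lt;p&gt;If the helper returns None, the CodeDeploy agent on that instance has no credentials and deployments will keep failing.&lt;/p&gt;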

&lt;p&gt;✅ &lt;strong&gt;And there we go — Success!&lt;/strong&gt; 🎉&lt;br&gt;
After attaching the correct IAM Instance Profile to our EC2 and rerunning the pipeline, everything works smoothly.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Source Stage → grabbed the code from GitHub&lt;/li&gt;
&lt;li&gt;Build Stage → CodeBuild installed dependencies and packaged the app&lt;/li&gt;
&lt;li&gt;Deploy Stage → CodeDeploy agent pulled the artifact and deployed it to EC2.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3m4oxz6fby05pqc489s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3m4oxz6fby05pqc489s.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🌐 &lt;strong&gt;Accessing Our Application&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now that our pipeline has successfully deployed, let’s confirm everything is working.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I opened my EC2 public IP in the browser at port 3000 — and boom, our Node.js application is up and running! 🚀&lt;/li&gt;
&lt;li&gt;The best part? I didn’t have to SSH into the server or manually copy files. The whole thing was automated through CodePipeline → CodeBuild → CodeDeploy.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Falpwm713al61eczgm795.png" alt=" " width="800" height="450"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;👉 Next, &lt;strong&gt;let’s test the real magic of CI/CD:&lt;/strong&gt; we’ll make a small manual change in GitHub, push the code, and watch the pipeline automatically pick it up and deploy Version 2 to EC2.&lt;/p&gt;

&lt;p&gt;After making a small change in my code, I pushed it to the &lt;strong&gt;main branch&lt;/strong&gt; on GitHub.&lt;/p&gt;

&lt;p&gt;Within seconds, CodePipeline detected the update, automatically triggered a new run, and started executing the stages again — &lt;strong&gt;Source → Build&lt;/strong&gt; → &lt;strong&gt;Deploy&lt;/strong&gt;.&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1kotwnxjdj8a8vkk8sn9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1kotwnxjdj8a8vkk8sn9.png" alt=" " width="800" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🌟 &lt;strong&gt;Version 2 Deployed Successfully&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;After the pipeline finished its run, I opened the application in the browser again at port 3000, and the changes from Version 2 were live!&lt;/p&gt;

&lt;p&gt;This proves that our CI/CD pipeline is fully functional: every time you push a commit to GitHub, CodePipeline detects it, CodeBuild packages it, and CodeDeploy deploys it — all automatically. No manual intervention, no missed steps.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh5c6suqpikhugwf80n7m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh5c6suqpikhugwf80n7m.png" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;✅ &lt;strong&gt;Conclusion / Key Takeaways&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Automation is powerful&lt;/strong&gt;: Once set up, your pipeline handles every commit, build, and deployment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;CodePipeline&lt;/strong&gt; = orchestrator: It connects Source → Build → Deploy seamlessly.&lt;br&gt;
&lt;strong&gt;CodeBuild&lt;/strong&gt; = builder: Packages and prepares your app artifacts.&lt;br&gt;
&lt;strong&gt;CodeDeploy&lt;/strong&gt; = delivery agent: Pushes the code to EC2 instances and runs lifecycle scripts.&lt;br&gt;
&lt;strong&gt;IAM Roles Matter&lt;/strong&gt;: Both the CodeDeploy service role and EC2 instance profile are critical for permissions.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>tutorial</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Serverless Cost Tracker – Stay Alert. Stay Efficient.</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Sat, 02 Aug 2025 18:30:04 +0000</pubDate>
      <link>https://forem.com/imash24/serverless-cost-tracker-stay-alert-stay-efficient-1e07</link>
      <guid>https://forem.com/imash24/serverless-cost-tracker-stay-alert-stay-efficient-1e07</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7udc8kje00q47wip8j8t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7udc8kje00q47wip8j8t.png" alt=" " width="800" height="488"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;One Lambda function.&lt;/strong&gt; It logs AWS costs daily, sends instant alerts, and feeds beautiful dashboards.&lt;br&gt;
This is &lt;strong&gt;Serverless Cost Intelligence&lt;/strong&gt; — a project I built from scratch using serverless AWS tools to automate cost tracking.&lt;/p&gt;

&lt;p&gt;Alright, enough with the theory; let's build this intelligence system from scratch. Even a beginner can follow these instructions. I have attached screenshots for everything we are going to build.&lt;/p&gt;

&lt;p&gt;🚀 &lt;strong&gt;Tech Stack &amp;amp; AWS Services Used&lt;/strong&gt;&lt;br&gt;
Before we dive into the steps, here’s a quick summary of the core AWS services and tools we are using for this project:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS Lambda&lt;/strong&gt; – For running backend logic to fetch cost data automatically&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon CloudWatch Logs&lt;/strong&gt; – To log the cost data and debug if needed&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS Cost Explorer API&lt;/strong&gt; – To fetch cost and usage details programmatically&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon SNS&lt;/strong&gt; – For optional cost alerts/notifications&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon S3&lt;/strong&gt; – For storing logs or future data exports&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon EventBridge&lt;/strong&gt; – To automate the Lambda trigger on a schedule&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon QuickSight&lt;/strong&gt; – For building a dashboard to visualize the cost data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Step 1: IAM Role Setup for Lambda
&lt;/h2&gt;

&lt;p&gt;To let our Lambda function talk to the Cost Explorer service, we need an IAM role with the appropriate permissions. We'll create the role and attach an inline policy to it.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Go to the AWS Console, type IAM in the search bar, and select the IAM service.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3zdx1rubsgmapxpi360.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr3zdx1rubsgmapxpi360.png" alt=" " width="800" height="456"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now click Create role.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F69rqw3lff6z45jz62rt6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F69rqw3lff6z45jz62rt6.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On the next page, under Use case, select Lambda from the dropdown.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffixi7kfq25wzmyr3z1n3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffixi7kfq25wzmyr3z1n3.png" alt=" " width="800" height="428"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now give the role a descriptive name, click Next, review the settings, and create the role.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80rz6px8nmpzbk1bp99c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80rz6px8nmpzbk1bp99c.png" alt=" " width="800" height="428"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The role now exists, but it has no policy attached yet. Go to the Permissions tab and click &lt;strong&gt;Add inline policy&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Switch to the JSON tab and paste the following policy&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzc7b6g13wp52zh0xvetu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzc7b6g13wp52zh0xvetu.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Here is the Policy. You can copy it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ce:GetCostAndUsage",
        "s3:PutObject",
        "s3:PutObjectAcl",
        "sns:Publish"
      ],
      "Resource": "*"
    }
  ]
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
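&lt;p&gt;If you prefer doing this step from code, the same policy document can be attached with boto3. A sketch under stated assumptions: the role and policy names below are hypothetical, and the put_role_policy call is commented out because it needs AWS credentials.&lt;/p&gt;

```python
import json

# The inline policy from above, expressed as a Python dict.
COST_TRACKER_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ce:GetCostAndUsage",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "sns:Publish",
            ],
            "Resource": "*",
        }
    ],
}

POLICY_JSON = json.dumps(COST_TRACKER_POLICY, indent=2)

# Attach it to the role created above (requires AWS credentials):
#   import boto3
#   iam = boto3.client("iam")
#   iam.put_role_policy(
#       RoleName="cost-tracker-lambda-role",      # hypothetical role name
#       PolicyName="cost-tracker-inline-policy",  # hypothetical policy name
#       PolicyDocument=POLICY_JSON,
#   )
```

&lt;p&gt;For a demo this works as-is; in production you would typically scope Resource down to your specific bucket and topic ARNs instead of "*".&lt;/p&gt;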



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0fow9rqmdwj495ichwoh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0fow9rqmdwj495ichwoh.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now, name the Policy and click Create Policy.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdf2a65hzrt584bah8d8p.png" alt=" " width="800" height="406"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Step 1 Complete – IAM Role Setup&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In Step 1, we created a dedicated IAM Role and attached a custom inline policy with all the necessary permissions. This role allows our Lambda function to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access AWS Cost Explorer to fetch cost data&lt;/li&gt;
&lt;li&gt;Write data to S3 for storing cost reports&lt;/li&gt;
&lt;li&gt;Publish messages to SNS Topics for alerts or notifications&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 2: Creating an S3 Bucket to Store Our Cost Data
&lt;/h2&gt;

&lt;p&gt;We'll set up an S3 bucket that our Lambda function uses to save the AWS cost data in CSV format. To create the bucket, navigate to the AWS Console and search for S3.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy4o558k06ndhylikkluo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy4o558k06ndhylikkluo.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Click Create bucket.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylb065v72xmh0xgmh8rt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylb065v72xmh0xgmh8rt.png" alt=" " width="800" height="383"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Name the bucket and leave all other settings at their defaults.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flf9494f7uy1nl0x05j1w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flf9494f7uy1nl0x05j1w.png" alt=" " width="800" height="385"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click Next, review the bucket settings, click Create bucket, and wait for it to be created.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fln7skwdyugsy4k3hc3np.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fln7skwdyugsy4k3hc3np.png" alt=" " width="800" height="392"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Step 2 Complete – S3 Bucket Creation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this step, we created a dedicated S3 bucket where our Lambda function can securely store the daily cost reports fetched from AWS.&lt;/p&gt;

&lt;p&gt;This bucket is the central storage for our cost logs and data.&lt;/p&gt;
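&lt;p&gt;A quick way to sanity-check the bucket from code is a head_bucket call. A minimal sketch: the helper works with any client object that exposes a head_bucket method, and the commented usage (which needs AWS credentials) uses the bucket name from later in this walkthrough.&lt;/p&gt;

```python
def bucket_exists(s3_client, bucket_name):
    """Return True if the bucket exists and is reachable with our
    credentials, False otherwise (head_bucket raises on failure)."""
    try:
        s3_client.head_bucket(Bucket=bucket_name)
        return True
    except Exception:
        return False

# Real usage (requires boto3 and AWS credentials):
#   import boto3
#   print(bucket_exists(boto3.client("s3"), "smart-cost-tracker-logs"))
```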

&lt;h2&gt;
  
  
  Step 3: Setting Up SNS for Email Alerts
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Now, we are setting up Amazon SNS (Simple Notification Service) to send cost alerts directly to our email inbox.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go to the AWS Console, search for SNS, and open it.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1wn9cqi33s6qgisosf8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1wn9cqi33s6qgisosf8.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now we need to create a topic. Give it a name and click Next step.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr39ezuw5q8i63jxax1ql.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr39ezuw5q8i63jxax1ql.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choose Standard as the type and create the topic.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffjmh2s40qum5fdjxrdmz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffjmh2s40qum5fdjxrdmz.png" alt=" " width="800" height="422"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After that, we need to create a Subscription.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foejpvoekldrqrk82zxgi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foejpvoekldrqrk82zxgi.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the SNS topic we created previously, choose Email as the protocol, enter your email address, and click Create subscription.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwt1yhqzavvsd40q27lb4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwt1yhqzavvsd40q27lb4.png" alt=" " width="800" height="380"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;One more thing: you will receive a confirmation mail in your inbox. Click the link to verify, and this step is complete.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwfk3h9n75ikl95ufrh5a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwfk3h9n75ikl95ufrh5a.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Step 3 Complete – SNS Topic + Subscription Setup&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We have created an SNS topic and added an email subscription — this is the system that will notify us as soon as the AWS costs exceed our Budget.&lt;/p&gt;
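&lt;p&gt;The alert that will eventually land in your inbox is just a plain-text SNS message. Here is a small helper that builds a message in roughly the format the Lambda in Step 4 uses; the publish call is commented out because it needs AWS credentials, and the topic ARN shown is the sample one from the Lambda code.&lt;/p&gt;

```python
def build_alert_message(date_str, total_cost, service_lines):
    """Format the daily cost alert body: a header, the total, then one
    line per service that crossed the per-service threshold."""
    lines = ["AWS Daily Cost Alert - " + date_str, ""]
    lines.append("Total: $" + format(total_cost, ".2f"))
    lines.append("")
    lines.extend(service_lines)
    return "\n".join(lines)

# Publishing it (requires AWS credentials; ARN is the sample from this post):
#   import boto3
#   sns = boto3.client("sns")
#   sns.publish(
#       TopicArn="arn:aws:sns:ap-south-1:123456789012:aws-cost-alerts",
#       Subject="AWS Daily Cost Alert",
#       Message=build_alert_message("2025-08-01", 6.42, ["AmazonEC2: $4.10"]),
#   )
```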

&lt;h2&gt;
  
  
  🚀 Step 4: Lambda Function Setup
&lt;/h2&gt;

&lt;p&gt;In Step 4, we’ll create the actual Lambda function with custom logic that fetches cost data from AWS Cost Explorer, stores the data in S3, and then notifies us via SNS. We'll also attach the IAM role we created earlier so Lambda has the right permissions to communicate with the other services.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Go to the AWS Console and Search for Lambda and click.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zb2muvhzydzoxg2i5jj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zb2muvhzydzoxg2i5jj.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Here, select Author from scratch and name your function. Select Python 3.13 as the runtime.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqflovczbimylvmequpy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqflovczbimylvmequpy.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Below, select the existing role we created in Step 1 and attach it to our Lambda function.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbntnlhjaq8payk6p6rum.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbntnlhjaq8payk6p6rum.png" alt=" " width="800" height="305"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now click Next, review the settings, click Create function, and wait for the function to be created.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiauegmt8idbgtxdwenmy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiauegmt8idbgtxdwenmy.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Let's write the function logic. Here's the code snippet:&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3
import json
import datetime
import csv

# Replace with your values
SNS_TOPIC_ARN = "arn:aws:sns:ap-south-1:123456789012:aws-cost-alerts"
S3_BUCKET = "smart-cost-tracker-logs"

def lambda_handler(event, context):
    today = datetime.date.today()
    start = (today - datetime.timedelta(days=2)).strftime('%Y-%m-%d')
    end = (today - datetime.timedelta(days=1)).strftime('%Y-%m-%d')

    client = boto3.client('ce')
    response = client.get_cost_and_usage(
        TimePeriod={'Start': start, 'End': end},
        Granularity='DAILY',
        Metrics=['UnblendedCost'],
        GroupBy=[{'Type': 'DIMENSION', 'Key': 'SERVICE'}]
    )

    services = response['ResultsByTime'][0]['Groups']

    # CSV content
    csv_lines = [["Service", "Cost (USD)"]]
    alert_lines = []
    total_cost = 0

    for service in services:
        name = service['Keys'][0]
        amount = float(service['Metrics']['UnblendedCost']['Amount'])
        total_cost += amount
        csv_lines.append([name, f"{amount:.4f}"])
        if amount &amp;gt; 1:  # Per-service alert threshold ($1, roughly ₹80)
            alert_lines.append(f"{name}: ${amount:.2f}")

    # Save to S3
    file_name = f"daily-cost-{start}.csv"
    local_file_path = f"/tmp/{file_name}"

    with open(local_file_path, 'w', newline='') as file:
        writer = csv.writer(file)
        writer.writerows(csv_lines)

    s3 = boto3.client('s3')
    s3.upload_file(local_file_path, S3_BUCKET, file_name)

    # Send Alert if above threshold
    if total_cost &amp;gt; 5:  # Total above $5 (roughly ₹400)
        sns = boto3.client('sns')
        msg = f"AWS Daily Cost Alert - {start}\n\nTotal: ${total_cost:.2f}\n\n" + "\n".join(alert_lines)
        sns.publish(TopicArn=SNS_TOPIC_ARN, Subject="AWS Daily Cost Alert 🚨", Message=msg)

    return {
        'statusCode': 200,
        'body': json.dumps('Cost fetched and logged successfully!')
    }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's how the code works:&lt;br&gt;
📅 It fetches daily AWS cost data from the Cost Explorer.&lt;br&gt;
📂 Converts it into a CSV file and stores it in S3.&lt;br&gt;
💸 Checks for services costing more than $1 and prepares an alert.&lt;br&gt;
📬 If the total cost exceeds $5, it sends a notification via SNS to your email.&lt;/p&gt;
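&lt;p&gt;You can exercise the aggregation logic locally without touching AWS by pulling the loop into a pure function and feeding it a hand-built response in the shape Cost Explorer returns. A sketch mirroring the loop in the handler above:&lt;/p&gt;

```python
def summarize_costs(groups, per_service_threshold=1.0):
    """Mirror of the handler's aggregation loop: takes the Groups list from
    a Cost Explorer response and returns (csv_rows, alert_lines, total)."""
    csv_rows = [["Service", "Cost (USD)"]]
    alert_lines = []
    total = 0.0
    for group in groups:
        name = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        total += amount
        csv_rows.append([name, format(amount, ".4f")])
        if amount > per_service_threshold:
            alert_lines.append(name + ": $" + format(amount, ".2f"))
    return csv_rows, alert_lines, total

# Example with a hand-built group, as found under ResultsByTime[0]["Groups"]:
sample = [{"Keys": ["AmazonEC2"], "Metrics": {"UnblendedCost": {"Amount": "2.5"}}}]
rows, alerts, total = summarize_costs(sample)
print(alerts)  # ['AmazonEC2: $2.50']
```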

&lt;p&gt;&lt;strong&gt;⚠️NOTE&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Replace the SNS topic ARN and S3 bucket name at the top of the code with your own.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1x0hifdbphhni7x38sec.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1x0hifdbphhni7x38sec.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Another quick tweak: to make the function run comfortably, we'll raise the memory limit and execution timeout of our Lambda function.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa7epn0cl8gdvoxcdunzj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa7epn0cl8gdvoxcdunzj.png" alt=" " width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Here is the Screenshot for your reference.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2n1ugyegk4eosp48pkje.png" alt=" " width="800" height="365"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;✅ &lt;strong&gt;Step 4 Complete – Lambda Function Ready!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Awesome 🎉 We have successfully created our Lambda function, attached the necessary IAM role, and plugged in the logic that:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Fetches AWS cost data&lt;/li&gt;
&lt;li&gt;Stores it safely in S3&lt;/li&gt;
&lt;li&gt;Sends alerts if the cost crosses the limit&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Your smart cost tracker is now fully functional.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Setting up a CloudWatch Events Rule&lt;/strong&gt; with a cron expression to automate the Lambda function, so it runs every day without manual intervention.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Go to the AWS Management Console and search for CloudWatch Events.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftrjljgnf84wcblztz9d7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftrjljgnf84wcblztz9d7.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From here, go to Events and create a new Rule.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F26975duuqt5avccqi4o7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F26975duuqt5avccqi4o7.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now name the rule, set the rule type to &lt;strong&gt;Schedule&lt;/strong&gt;, and click Continue to create rule.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxlhhomuhg6fh6ge75sv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faxlhhomuhg6fh6ge75sv.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fni26oi0mhqselg6kso9e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fni26oi0mhqselg6kso9e.png" alt=" " width="800" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Enter the schedule expression cron(0 3 * * ? *), which runs daily at 03:00 UTC (8:30 AM IST). Change it as needed and click Next.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select the target as Lambda function, and select our created Lambda function, then click Create.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3zzk2kp6om8xnxl8xm70.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3zzk2kp6om8xnxl8xm70.png" alt=" " width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We have successfully created a schedule that automatically triggers the Lambda function, which fetches the AWS cost data and stores it in the S3 bucket.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
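&lt;p&gt;One detail that trips people up: EventBridge cron expressions are always evaluated in UTC, so cron(0 3 * * ? *) fires at 03:00 UTC. A tiny helper to double-check what that is in IST (UTC+05:30):&lt;/p&gt;

```python
import datetime

def utc_cron_to_ist(hour_utc, minute_utc=0):
    """Convert the hour/minute fields of a UTC cron schedule to
    Indian Standard Time (UTC+05:30), returned as HH:MM."""
    utc = datetime.datetime(2000, 1, 1, hour_utc, minute_utc)
    ist = utc + datetime.timedelta(hours=5, minutes=30)
    return ist.strftime("%H:%M")

print(utc_cron_to_ist(3))  # prints 08:30 -- matching the schedule above
```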

&lt;p&gt;Finally, let's deploy our Lambda function and test it once manually. We can see a 200 status code and the message body "Cost fetched and logged successfully!", which confirms the project is working end to end.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo82yymya9iqe1pjp7gkb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo82yymya9iqe1pjp7gkb.png" alt=" " width="800" height="398"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Also, let's confirm that the logs of the cost reports are stored in S3.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fea09ejhwd45wqi0t6w86.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fea09ejhwd45wqi0t6w86.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;br&gt;
We can see that a cost log file has been created by the Lambda function and stored in S3.&lt;/p&gt;

&lt;p&gt;Congratulations! You’ve successfully built and tested the Serverless Cost Tracker. Here's a quick recap of what we achieved:&lt;/p&gt;

&lt;p&gt;✅ Created and configured an IAM role with inline policies for S3, Cost Explorer, and SNS&lt;br&gt;
✅ Set up a secure S3 bucket to store cost reports&lt;br&gt;
✅ Created an SNS topic with an email subscription for alerts&lt;br&gt;
✅ Built and deployed a fully working Lambda function&lt;br&gt;
✅ Manually tested the function, verifying a 200 OK response and confirming the log file was stored in S3&lt;br&gt;
✅ Set up a CloudWatch cron schedule to automate daily runs&lt;/p&gt;

&lt;h2&gt;
  📊 Bonus Visualization with QuickSight (Optional)
&lt;/h2&gt;

&lt;p&gt;You can also integrate the S3 bucket (where our cost data is stored) with Amazon QuickSight to generate clear, beautiful cost graphs and dashboards.&lt;/p&gt;
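
&lt;p&gt;QuickSight reads S3 data through a manifest file that points at the bucket. A sketch of such a manifest is shown below; the bucket name, prefix, and CSV format are assumptions based on the setup above.&lt;/p&gt;

```json
{
  "fileLocations": [
    { "URIPrefixes": ["s3://my-cost-reports-bucket/cost-logs/"] }
  ],
  "globalUploadSettings": {
    "format": "CSV",
    "containsHeader": "true"
  }
}
```

&lt;p&gt;Upload this manifest when creating a new S3 data source in QuickSight, and it will pick up every object under the prefix.&lt;/p&gt;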

&lt;p&gt;I'll attach a sample screenshot to give you an idea. This step is optional, but very useful if you want a quick visual summary of daily AWS costs and the associated service usage, and it's a great way to showcase the project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvxmgprdakh1te3ok5js0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvxmgprdakh1te3ok5js0.png" alt=" " width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>devops</category>
      <category>aws</category>
      <category>cloudcomputing</category>
      <category>cloud</category>
    </item>
    <item>
<title>Building a High Performance website using AWS Cloudfront + ALB + EC2</title>
      <dc:creator>Ashwin Venkatesan</dc:creator>
      <pubDate>Wed, 16 Jul 2025 17:55:50 +0000</pubDate>
      <link>https://forem.com/imash24/building-a-high-performace-website-using-aws-cloudfront-alb-ec2-2jhd</link>
      <guid>https://forem.com/imash24/building-a-high-performace-website-using-aws-cloudfront-alb-ec2-2jhd</guid>
<description>&lt;p&gt;&lt;strong&gt;Hosting a website on EC2 is simple&lt;/strong&gt; -- but it isn't very scalable. What happens when hundreds of requests flood your application? A headache, right? In this post, we will set up a highly available, high-performance website using an Application Load Balancer (ALB), EC2, and CloudFront for efficient delivery.&lt;/p&gt;

&lt;p&gt;We will go through a step-by-step process to achieve this, and I have also included real screenshots and steps so even beginners can follow along.&lt;br&gt;
&lt;strong&gt;Final result?&lt;/strong&gt; A public-facing website delivered worldwide through CloudFront (CDN). Interesting, right? Let's get started. 🚀&lt;/p&gt;

&lt;p&gt;📦Step 1: Launch an EC2 Instance&lt;/p&gt;

&lt;p&gt;(I already set up an EC2 instance with a basic website (HTML, CSS) and served it using Nginx.)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setup details&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Installed Nginx + uploaded site files&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F18vudby4ngdsorxb6lh5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F18vudby4ngdsorxb6lh5.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Also, make sure the security group allows only the necessary ports, for better security. In my case, I allowed ports 22 and 80 on my EC2 instance's security group.&lt;/p&gt;

&lt;p&gt;After the basic website setup, you can see that the site is accessible via the public IP of my EC2 instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdj8zkbzi07zidy3kv05h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdj8zkbzi07zidy3kv05h.png" alt=" " width="800" height="448"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🛠️&lt;strong&gt;Now Comes the Real Game:&lt;/strong&gt; Scaling It Up with ALB and CloudFront&lt;br&gt;
So far, we’ve deployed a simple website using an EC2 instance. That’s a great start — but it’s not scalable, not fault-tolerant, and not production-ready.&lt;/p&gt;

&lt;p&gt;In the next steps, I’ll take this basic setup and upgrade it to a high-performance architecture using:&lt;/p&gt;

&lt;p&gt;✅ Application Load Balancer (ALB) to distribute traffic&lt;br&gt;
✅ CloudFront to deliver content globally with low latency&lt;br&gt;
✅ Tighter Security Groups to expose only what’s needed&lt;/p&gt;

&lt;p&gt;⚖️Step 2: Configure the Application Load Balancer (ALB)&lt;/p&gt;

&lt;p&gt;Let's set up an ALB.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8yur7xdvq9exwt5xnqeo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8yur7xdvq9exwt5xnqeo.png" alt=" " width="800" height="346"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Name the load balancer and make it internet-facing, since it needs to serve traffic from the public. Click Next.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27wd6u4t2a09x1664vxm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27wd6u4t2a09x1664vxm.png" alt=" " width="800" height="402"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Here, select the VPC (the default VPC) along with the Availability Zones and subnets. I'm selecting two subnets for higher availability (us-east-1a and us-east-1b). Click Next.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8v2womsw0dyu0wbzz744.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8v2womsw0dyu0wbzz744.png" alt=" " width="800" height="365"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The next step is to create the security group for the load balancer. In my case, I'm adding inbound rules for ports 80 and 443.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After that, click Create target group.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F555sz5m48ayjhpy1lf1u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F555sz5m48ayjhpy1lf1u.png" alt=" " width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Name the target group; that's all we need on this page for now, so leave everything else at its defaults.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fimqwjwiqh12pk87soeji.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fimqwjwiqh12pk87soeji.png" alt=" " width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now we need to register the targets. Since we are using EC2 instances, select them and click "Include as pending below". That's it.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnraza43bx8yzlpfn6m8w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnraza43bx8yzlpfn6m8w.png" alt=" " width="800" height="376"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now select the newly created target group and press Next.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F070id97vp6r2lh05xjbb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F070id97vp6r2lh05xjbb.png" alt=" " width="800" height="345"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Wait a few minutes for the load balancer to be fully set up and ready to serve its target, our EC2 instance. Once it becomes active, copy the load balancer's DNS name to access our website.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5odo0w5n0lqh76uqoayg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5odo0w5n0lqh76uqoayg.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Voila! Now we are serving our website through the load balancer's DNS name. But wait, we are not done yet: we still need to set up a CloudFront distribution.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqls9dit2xizc1qrix4k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsqls9dit2xizc1qrix4k.png" alt=" " width="800" height="451"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
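
&lt;p&gt;For reference, the console steps above correspond to a handful of API calls. The sketch below only builds the request parameters; the names, subnets, security group, VPC, and instance IDs are placeholders, and the dicts are meant for boto3's elbv2 client (create_load_balancer, create_target_group, register_targets).&lt;/p&gt;

```python
# Request parameters mirroring the console walkthrough above.
# Pass the dicts to boto3's elbv2 client:
#   elbv2.create_load_balancer(**lb)
#   elbv2.create_target_group(**tg)
#   elbv2.register_targets(TargetGroupArn=..., Targets=targets)
# All names and IDs below are placeholders.

def alb_setup(name, subnets, security_group, vpc_id, instance_ids):
    """Return create_load_balancer / create_target_group kwargs plus targets."""
    lb = {
        "Name": name,
        "Subnets": subnets,               # two subnets in different AZs
        "SecurityGroups": [security_group],
        "Scheme": "internet-facing",      # public-facing, as in the article
        "Type": "application",
    }
    tg = {
        "Name": f"{name}-tg",
        "Protocol": "HTTP",
        "Port": 80,
        "VpcId": vpc_id,
        "TargetType": "instance",
    }
    targets = [{"Id": i} for i in instance_ids]
    return lb, tg, targets

lb, tg, targets = alb_setup(
    "my-web-alb",
    ["subnet-aaaa1111", "subnet-bbbb2222"],   # e.g. us-east-1a / us-east-1b
    "sg-0123456789abcdef0",
    "vpc-0123456789abcdef0",
    ["i-0123456789abcdef0"],
)
```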

&lt;p&gt;🌐 &lt;strong&gt;Step 3: Setup CloudFront&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Search for CloudFront in the AWS Console.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmzzqsik7pl2mttnoah8v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmzzqsik7pl2mttnoah8v.png" alt=" " width="800" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;We need to create a new CloudFront distribution: name your distribution and select "Single website or app".&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F57qdavoivzyequds41pl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F57qdavoivzyequds41pl.png" alt=" " width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now select Elastic Load Balancer as the origin type and choose your load balancer below.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvx29zvtltkthz7buwu7h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvx29zvtltkthz7buwu7h.png" alt=" " width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the Security section, you can opt for AWS WAF (Web Application Firewall) if you need its capabilities; here, I'm selecting "Do not enable WAF".&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjz7zjvlmwuls7fzmagzr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjz7zjvlmwuls7fzmagzr.png" alt=" " width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now review the changes we made and proceed to create the distribution.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbq73vgg3s02x3deinfkq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbq73vgg3s02x3deinfkq.png" alt=" " width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Wait a few minutes for the distribution to propagate fully and become active.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frrrqtcznw4mb4e8ttuux.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frrrqtcznw4mb4e8ttuux.png" alt=" " width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;This is an important step: since this walkthrough sets up an HTTP-only site, we need to set the origin protocol to HTTP only. (If you are using a custom domain with HTTPS and SSL certificates, make sure to switch it to HTTPS.)&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqmxrn8lfg7hozvtcc01g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqmxrn8lfg7hozvtcc01g.png" alt=" " width="800" height="423"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After a few minutes, you will see the Last modified column change to Deployed; our site is ready for production.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffgy1gk6yoppvap90ha2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffgy1gk6yoppvap90ha2d.png" alt=" " width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
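
&lt;p&gt;The distribution settings above map roughly onto the DistributionConfig sketched below; the ALB DNS name is a placeholder, and the dict is intended for boto3's cloudfront.create_distribution(DistributionConfig=...). Note the http-only origin protocol, matching the HTTP setup in this walkthrough.&lt;/p&gt;

```python
import time

def cf_distribution_config(alb_dns):
    """DistributionConfig sketch for boto3's cloudfront.create_distribution().

    The origin protocol is http-only to match the HTTP walkthrough above;
    switch it to https-only once you add an ACM certificate.
    """
    return {
        "CallerReference": str(time.time()),  # must be unique per request
        "Comment": "Website served via ALB origin",
        "Enabled": True,
        "Origins": {
            "Quantity": 1,
            "Items": [{
                "Id": "alb-origin",
                "DomainName": alb_dns,        # your ALB's DNS name
                "CustomOriginConfig": {
                    "HTTPPort": 80,
                    "HTTPSPort": 443,
                    "OriginProtocolPolicy": "http-only",
                },
            }],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "alb-origin",
            "ViewerProtocolPolicy": "allow-all",
            # ID of the AWS managed "CachingOptimized" cache policy
            "CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6",
        },
    }

cfg = cf_distribution_config("my-web-alb-1234567890.us-east-1.elb.amazonaws.com")
```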

&lt;p&gt;🎉 &lt;strong&gt;Final Step: Access Your Website via CloudFront&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now, take the CloudFront distribution domain name (something like d1234abc.cloudfront.net) and open it in your browser. You should see your deployed website loading fast, served securely through CloudFront, routed via your Application Load Balancer, and ultimately hitting your EC2 instance.&lt;/p&gt;

&lt;p&gt;✅ Boom! You've just built and deployed a highly available, scalable, and globally optimized website using AWS infrastructure.&lt;br&gt;
From a simple EC2 instance to a fully distributed setup — this is how real-world deployments scale.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3wpsgwbrr77lirkwbo85.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3wpsgwbrr77lirkwbo85.png" alt=" " width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🙌 What’s Next? You can now explore:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Adding SSL using ACM&lt;/li&gt;
&lt;li&gt;Custom domain via Route 53&lt;/li&gt;
&lt;li&gt;Auto scaling your EC2 instances&lt;/li&gt;
&lt;li&gt;Logging and monitoring with CloudWatch&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>programming</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
