<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Yeshwanth L M</title>
    <description>The latest articles on Forem by Yeshwanth L M (@yeshwanthlm).</description>
    <link>https://forem.com/yeshwanthlm</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1037588%2F92470647-636a-4b9d-ac66-72901e7fdd48.jpg</url>
      <title>Forem: Yeshwanth L M</title>
      <link>https://forem.com/yeshwanthlm</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/yeshwanthlm"/>
    <language>en</language>
    <item>
      <title>Building a Serverless DynamoDB MCP: Making Your AI Talk to Your Database</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Thu, 30 Apr 2026 10:16:53 +0000</pubDate>
      <link>https://forem.com/aws-builders/building-a-serverless-dynamodb-mcp-making-your-ai-talk-to-your-database-3jne</link>
      <guid>https://forem.com/aws-builders/building-a-serverless-dynamodb-mcp-making-your-ai-talk-to-your-database-3jne</guid>
      <description>&lt;h1&gt;
  
  
  Building a Serverless DynamoDB MCP: Making Your AI Talk to Your Database
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fim9m2eji32nl45ls38hg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fim9m2eji32nl45ls38hg.png" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Have you ever wished you could just &lt;em&gt;ask&lt;/em&gt; your AI assistant to query your database? Something like:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"Hey Kiro, show me all active users from my DynamoDB table"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;or&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"Add a new user named Alice with email &lt;a href="mailto:alice@example.com"&gt;alice@example.com&lt;/a&gt; to the Users table"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Well, that's exactly what we're building today! 🚀&lt;/p&gt;

&lt;h2&gt;
  
  
  The Big Picture: What Are We Building?
&lt;/h2&gt;

&lt;p&gt;We're creating a &lt;strong&gt;serverless MCP (Model Context Protocol) backend&lt;/strong&gt; on AWS that enables AI assistants like Kiro to interact with DynamoDB tables conversationally. Think of it as giving Kiro a direct, secure phone line to your DynamoDB database.&lt;/p&gt;

&lt;p&gt;Here's what makes this special:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;10 DynamoDB operations&lt;/strong&gt; exposed as natural language tools&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Completely serverless&lt;/strong&gt; - runs on AWS Lambda&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Secure by default&lt;/strong&gt; - AWS IAM authentication with SigV4 signing&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Zero local dependencies&lt;/strong&gt; - all the heavy lifting happens in the cloud&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Self-configuring&lt;/strong&gt; - tools are discovered dynamically&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Wait, What's MCP?
&lt;/h2&gt;

&lt;p&gt;Before we dive in, let's talk about MCP (Model Context Protocol). &lt;/p&gt;

&lt;p&gt;Think of MCP as a standardized way for AI assistants to use external tools. It's like giving your AI a toolbox where each tool does something specific - query a database, fetch weather data, send emails, etc.&lt;/p&gt;

&lt;p&gt;The protocol works like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;AI assistant connects to an MCP server&lt;/li&gt;
&lt;li&gt;Server tells AI what tools are available&lt;/li&gt;
&lt;li&gt;AI can call these tools when needed&lt;/li&gt;
&lt;li&gt;Server executes the tool and returns results&lt;/li&gt;
&lt;li&gt;AI uses the results to help the user&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;The beauty?&lt;/strong&gt; The AI doesn't need to know &lt;em&gt;how&lt;/em&gt; the tools work internally. It just needs to know &lt;em&gt;what&lt;/em&gt; they do and &lt;em&gt;how&lt;/em&gt; to call them.&lt;/p&gt;
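
&lt;p&gt;On the wire, that five-step loop is just JSON-RPC 2.0. Here's a minimal sketch of the messages (the tool name and arguments are hypothetical examples, not part of the MCP spec itself):&lt;/p&gt;

```python
# A minimal sketch of the MCP exchange above as JSON-RPC 2.0 messages.
# The tool name and arguments are hypothetical; this is not a full client.

# Step 2: the server advertises its tools in response to "tools/list"
tools_response = {
    "jsonrpc": "2.0", "id": 1,
    "result": {"tools": [{
        "name": "dynamodb_get_item",
        "description": "Retrieve a single item from DynamoDB by primary key",
    }]},
}

# Step 3: the AI calls a tool via "tools/call"
call_request = {
    "jsonrpc": "2.0", "id": 2, "method": "tools/call",
    "params": {"name": "dynamodb_get_item",
               "arguments": {"table_name": "Users", "key": {"userId": "user001"}}},
}

# Step 4: the server executes the tool and answers with text content
call_response = {
    "jsonrpc": "2.0", "id": 2,
    "result": {"content": [{"type": "text",
                            "text": "Item from table 'Users': ..."}]},
}
```

&lt;p&gt;Note what the AI sees: tool names, input schemas, and text. No AWS, no DynamoDB types, no HTTP.&lt;/p&gt;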

&lt;h2&gt;
  
  
  Why Build This Serverless?
&lt;/h2&gt;

&lt;p&gt;You might ask: "Why not just run a local server on my machine?"&lt;/p&gt;

&lt;p&gt;Great question! Here's why serverless wins:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. &lt;strong&gt;Centralized Management&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;One deployment serves all your team members. Update once, everyone benefits. No "it works on my machine" problems.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. &lt;strong&gt;Security at Scale&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;IAM-based authentication (no API keys to rotate)&lt;/li&gt;
&lt;li&gt;Each Lambda has scoped permissions&lt;/li&gt;
&lt;li&gt;Audit logs for every database operation&lt;/li&gt;
&lt;li&gt;Secrets managed by AWS Secrets Manager&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  3. &lt;strong&gt;Cost Efficiency&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Pay only when you use it. Lambda charges per request, not per hour. Most hobby projects? Practically free under the AWS free tier.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. &lt;strong&gt;Automatic Scaling&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Whether it's you at 2 AM or your whole team during peak hours, it just works.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. &lt;strong&gt;No Infrastructure Headaches&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;No servers to patch, no runtime versions to manage, no "why is Python 3.8 broken on my Mac?"&lt;/p&gt;

&lt;h2&gt;
  
  
  The Architecture: How It All Fits Together
&lt;/h2&gt;

&lt;p&gt;Let me paint you a picture of how this works:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;┌─────────────────────┐
│  You: "Show me all  │
│  users from Users   │
│  table"             │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│  Kiro               │ ← Your AI assistant
│  (MCP Client)       │
└──────────┬──────────┘
           │ stdio / JSON-RPC
           ▼
┌─────────────────────┐
│  Local Proxy        │ ← Signs requests with your AWS credentials
│  (proxy.sh)         │
└──────────┬──────────┘
           │ HTTPS + AWS IAM Auth
           ▼
┌─────────────────────┐
│  API Gateway        │ ← Entry point to AWS
│  (HTTP API)         │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│  Lambda Functions   │ ← 11 functions: one per operation, plus /tools
│  - get-item         │
│  - put-item         │
│  - query            │
│  - scan             │
│  - etc...           │
└──────────┬──────────┘
           │
           ▼
┌─────────────────────┐
│  DynamoDB Tables    │ ← Your actual data
└─────────────────────┘
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  The Flow, Step by Step:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;You ask Kiro&lt;/strong&gt; something about your database&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Kiro recognizes&lt;/strong&gt; it needs to use a DynamoDB tool&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Local proxy intercepts&lt;/strong&gt; the request and signs it with AWS SigV4&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API Gateway validates&lt;/strong&gt; the signature (IAM authentication)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lambda function executes&lt;/strong&gt; the DynamoDB operation&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Result comes back&lt;/strong&gt; as human-readable text&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Kiro uses the result&lt;/strong&gt; to answer your question&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The genius here? &lt;strong&gt;Kiro has no idea&lt;/strong&gt; it's talking to AWS. It thinks it's using a local tool. All the cloud complexity is hidden.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Key Design Decisions
&lt;/h2&gt;

&lt;p&gt;Let me walk you through the "why" behind each major decision:&lt;/p&gt;

&lt;h3&gt;
  
  
  Decision 1: Why Plain-Text Responses?
&lt;/h3&gt;

&lt;p&gt;DynamoDB returns data in this format:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Item"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"userId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"S"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"user001"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"S"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Alice Johnson"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"age"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"N"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"28"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Ugly, right? Those &lt;code&gt;{"S": ...}&lt;/code&gt; and &lt;code&gt;{"N": ...}&lt;/code&gt; wrappers are DynamoDB's type system.&lt;/p&gt;

&lt;p&gt;Our Lambda functions convert this to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="s"&gt;Item from table 'Users'&lt;/span&gt;&lt;span class="err"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;userId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;user001&lt;/span&gt;
  &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Alice Johnson&lt;/span&gt;
  &lt;span class="na"&gt;age&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;28&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Why?&lt;/strong&gt; Because Kiro can narrate this naturally to you. No JSON parsing needed. It's optimized for conversation, not computation.&lt;/p&gt;
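
&lt;p&gt;The unwrapping itself is mechanical. Here's a sketch of how a helper like &lt;code&gt;format_item&lt;/code&gt; could do it (illustrative, not the actual Lambda code; only the common type tags are handled):&lt;/p&gt;

```python
def unwrap(attr):
    """Convert one DynamoDB-typed attribute value to a plain Python value.
    Sketch only: handles the common tags (S, N, BOOL, L, M)."""
    ((tag, value),) = attr.items()
    if tag == "S":
        return value
    if tag == "N":
        # DynamoDB sends numbers as strings; keep integers as ints
        return int(value) if value.lstrip("-").isdigit() else float(value)
    if tag == "BOOL":
        return value
    if tag == "L":
        return [unwrap(v) for v in value]
    if tag == "M":
        return {k: unwrap(v) for k, v in value.items()}
    return value  # NULL, B, SS, etc. left as-is for brevity

def format_item(item):
    """Render a DynamoDB item as indented 'key: value' lines."""
    return "\n".join(f"  {k}: {unwrap(v)}" for k, v in item.items())

item = {"userId": {"S": "user001"},
        "name": {"S": "Alice Johnson"},
        "age": {"N": "28"}}
print(format_item(item))
```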

&lt;h3&gt;
  
  
  Decision 2: Why One Lambda Per Operation?
&lt;/h3&gt;

&lt;p&gt;We could've built one mega-Lambda that handles everything. But we didn't. Here's why:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Principle of Least Privilege&lt;/strong&gt;: Each Lambda gets &lt;em&gt;only&lt;/em&gt; the permissions it needs.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;get-item&lt;/code&gt; Lambda → &lt;code&gt;dynamodb:GetItem&lt;/code&gt; permission only&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;put-item&lt;/code&gt; Lambda → &lt;code&gt;dynamodb:PutItem&lt;/code&gt; permission only&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;delete-item&lt;/code&gt; Lambda → &lt;code&gt;dynamodb:DeleteItem&lt;/code&gt; permission only&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If one Lambda gets compromised? Damage is limited.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Clear Separation&lt;/strong&gt;: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Each Terraform file = One Lambda&lt;/li&gt;
&lt;li&gt;Easy to understand, easy to modify&lt;/li&gt;
&lt;li&gt;Want to remove scan operation? Delete one file.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Cost Optimization&lt;/strong&gt;:&lt;br&gt;
Lambda charges by execution time. Smaller functions = faster cold starts = lower costs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Decision 3: Why Self-Configuring Tools?
&lt;/h3&gt;

&lt;p&gt;The proxy script doesn't have any hardcoded tool definitions. On startup, it calls:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;GET /tools
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And receives:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"dynamodb_get_item"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Retrieve a single item from DynamoDB..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"inputSchema"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"route"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/dynamodb/get-item"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;The magic?&lt;/strong&gt; Add a new tool to &lt;code&gt;dynamodb_ops.py&lt;/code&gt;, deploy, and the proxy &lt;em&gt;automatically&lt;/em&gt; discovers it. No client-side updates needed.&lt;/p&gt;

&lt;p&gt;This follows the Unix philosophy: &lt;strong&gt;"mechanism, not policy."&lt;/strong&gt; The proxy provides the mechanism (SigV4 signing, JSON-RPC), but the backend defines the policy (what tools exist).&lt;/p&gt;
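
&lt;p&gt;The proxy-side mapping is a small transform. Here's a sketch (the function name is hypothetical; it assumes the &lt;code&gt;/tools&lt;/code&gt; payload shape shown above):&lt;/p&gt;

```python
def to_mcp_tools(discovered):
    """Map the backend's /tools payload to an MCP tools/list result,
    keeping each tool's route so the proxy knows where to POST later.
    Sketch only; the field names follow the /tools response above."""
    routes = {t["name"]: t["route"] for t in discovered}
    tools = [{"name": t["name"],
              "description": t["description"],
              "inputSchema": t.get("inputSchema", {"type": "object"})}
             for t in discovered]
    return tools, routes

discovered = [{"name": "dynamodb_get_item",
               "description": "Retrieve a single item from DynamoDB...",
               "inputSchema": {"type": "object"},
               "route": "/dynamodb/get-item"}]
tools, routes = to_mcp_tools(discovered)
```

&lt;p&gt;The route never reaches Kiro; it stays in the proxy's lookup table. That's the mechanism/policy split in action.&lt;/p&gt;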

&lt;h3&gt;
  
  
  Decision 4: Why AWS IAM Instead of API Keys?
&lt;/h3&gt;

&lt;p&gt;Traditional approach:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"super-secret-key-123"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our approach:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Uses your AWS credentials&lt;/span&gt;
&lt;span class="c"&gt;# Same ones you use for AWS CLI&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Benefits:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ No keys to rotate every 90 days&lt;/li&gt;
&lt;li&gt;✅ Integrates with your existing AWS setup&lt;/li&gt;
&lt;li&gt;✅ CloudTrail logs every request&lt;/li&gt;
&lt;li&gt;✅ Can revoke access instantly via IAM&lt;/li&gt;
&lt;li&gt;✅ Supports MFA, temporary credentials, SSO&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;The proxy signs every request&lt;/strong&gt; with AWS Signature Version 4. API Gateway validates the signature before Lambda even runs. It's the same security AWS Console uses.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Code: Let's Break It Down
&lt;/h2&gt;

&lt;h3&gt;
  
  
  The Lambda Handler (Simplified)
&lt;/h3&gt;

&lt;p&gt;Here's what a Lambda function looks like (simplified for clarity):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_item_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Retrieve a single item from DynamoDB by primary key.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="c1"&gt;# Parse the request
&lt;/span&gt;    &lt;span class="n"&gt;params&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;loads&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;{}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="n"&gt;table_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;table_name&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;key&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;key&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Convert simple format to DynamoDB format
&lt;/span&gt;    &lt;span class="n"&gt;dynamodb_key&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;v&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;key&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;items&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="nf"&gt;isinstance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="n"&gt;dynamodb_key&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;S&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="nf"&gt;isinstance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;float&lt;/span&gt;&lt;span class="p"&gt;)):&lt;/span&gt;
            &lt;span class="n"&gt;dynamodb_key&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;N&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="p"&gt;)}&lt;/span&gt;

    &lt;span class="c1"&gt;# Call DynamoDB
&lt;/span&gt;    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;dynamodb&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_item&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;TableName&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;table_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;Key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;dynamodb_key&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Format response as human-readable text
&lt;/span&gt;    &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Item&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{})&lt;/span&gt;
    &lt;span class="n"&gt;formatted&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;format_item&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;statusCode&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;headers&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Content-Type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;text/plain&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Item from table &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;table_name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;:&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;formatted&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Three key parts:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Parse input&lt;/strong&gt; - Extract table name and key&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Convert formats&lt;/strong&gt; - Simple JSON → DynamoDB types&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Return readable text&lt;/strong&gt; - Not raw JSON&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  The Proxy Script (The Secret Sauce)
&lt;/h3&gt;

&lt;p&gt;The proxy does three critical things:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Tool Discovery:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# On startup&lt;/span&gt;
curl &lt;span class="nt"&gt;-X&lt;/span&gt; GET https://api.execute-api.us-east-1.amazonaws.com/tools
&lt;span class="c"&gt;# Saves tool definitions locally&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2. SigV4 Signing:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# For each request&lt;/span&gt;
&lt;span class="nv"&gt;signature&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;calculate_aws_signature &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$request&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;
curl &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Authorization: AWS4-HMAC-SHA256 Credential=..."&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
     https://api.execute-api.us-east-1.amazonaws.com/dynamodb/get-item
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;3. JSON-RPC Translation:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Receives from Kiro:&lt;/span&gt;
&lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"jsonrpc"&lt;/span&gt;: &lt;span class="s2"&gt;"2.0"&lt;/span&gt;, &lt;span class="s2"&gt;"method"&lt;/span&gt;: &lt;span class="s2"&gt;"tools/call"&lt;/span&gt;, &lt;span class="s2"&gt;"params"&lt;/span&gt;: &lt;span class="o"&gt;{&lt;/span&gt;...&lt;span class="o"&gt;}}&lt;/span&gt;

&lt;span class="c"&gt;# Translates to HTTP:&lt;/span&gt;
POST /dynamodb/get-item
&lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"table_name"&lt;/span&gt;: &lt;span class="s2"&gt;"Users"&lt;/span&gt;, &lt;span class="s2"&gt;"key"&lt;/span&gt;: &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"userId"&lt;/span&gt;: &lt;span class="s2"&gt;"123"&lt;/span&gt;&lt;span class="o"&gt;}}&lt;/span&gt;

&lt;span class="c"&gt;# Returns to Kiro:&lt;/span&gt;
&lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"jsonrpc"&lt;/span&gt;: &lt;span class="s2"&gt;"2.0"&lt;/span&gt;, &lt;span class="s2"&gt;"result"&lt;/span&gt;: &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"content"&lt;/span&gt;: &lt;span class="o"&gt;[{&lt;/span&gt;&lt;span class="s2"&gt;"type"&lt;/span&gt;: &lt;span class="s2"&gt;"text"&lt;/span&gt;, &lt;span class="s2"&gt;"text"&lt;/span&gt;: &lt;span class="s2"&gt;"..."&lt;/span&gt;&lt;span class="o"&gt;}]}}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It's a &lt;strong&gt;protocol adapter&lt;/strong&gt; - speaks MCP to Kiro, speaks HTTP to AWS.&lt;/p&gt;
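
&lt;p&gt;A sketch of that translation in Python (simplified; the real proxy is a shell script, and &lt;code&gt;handle_rpc&lt;/code&gt; / &lt;code&gt;call_tool&lt;/code&gt; are illustrative names):&lt;/p&gt;

```python
import json

def handle_rpc(req, call_tool):
    """Translate one MCP JSON-RPC request into a backend call and wrap
    the plain-text result back into an MCP response. Sketch only."""
    if req.get("method") == "tools/call":
        name = req["params"]["name"]
        args = req["params"].get("arguments", {})
        text = call_tool(name, args)  # POSTs to the tool's route, SigV4-signed
        return {"jsonrpc": "2.0", "id": req["id"],
                "result": {"content": [{"type": "text", "text": text}]}}
    return {"jsonrpc": "2.0", "id": req.get("id"),
            "error": {"code": -32601, "message": "Method not found"}}

# Fake backend for illustration: reports what it would send where
resp = handle_rpc(
    {"jsonrpc": "2.0", "id": 7, "method": "tools/call",
     "params": {"name": "dynamodb_get_item",
                "arguments": {"table_name": "Users", "key": {"userId": "123"}}}},
    lambda name, args: f"(would POST {json.dumps(args)} for {name})",
)
```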

&lt;h3&gt;
  
  
  The Infrastructure (Terraform)
&lt;/h3&gt;

&lt;p&gt;Each Lambda gets its own Terraform file. Here's the pattern:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight hcl"&gt;&lt;code&gt;&lt;span class="c1"&gt;# IAM Role&lt;/span&gt;
&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_iam_role"&lt;/span&gt; &lt;span class="s2"&gt;"lambda_get_item_role"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb-get-item-role"&lt;/span&gt;
  &lt;span class="c1"&gt;# Trust policy allows Lambda service to assume this role&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;# Scoped Permission&lt;/span&gt;
&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_iam_role_policy"&lt;/span&gt; &lt;span class="s2"&gt;"lambda_get_item_dynamodb"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb-get-item-policy"&lt;/span&gt;
  &lt;span class="nx"&gt;role&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;aws_iam_role&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;lambda_get_item_role&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;

  &lt;span class="nx"&gt;policy&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;jsonencode&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="nx"&gt;Statement&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt;
      &lt;span class="nx"&gt;Effect&lt;/span&gt;   &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Allow"&lt;/span&gt;
      &lt;span class="nx"&gt;Action&lt;/span&gt;   &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"dynamodb:GetItem"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;  &lt;span class="c1"&gt;# Only this action!&lt;/span&gt;
      &lt;span class="nx"&gt;Resource&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="p"&gt;}]&lt;/span&gt;
  &lt;span class="p"&gt;})&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;# Lambda Function&lt;/span&gt;
&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_lambda_function"&lt;/span&gt; &lt;span class="s2"&gt;"lambda_get_item"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;function_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb-get-item"&lt;/span&gt;
  &lt;span class="nx"&gt;role&lt;/span&gt;          &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;aws_iam_role&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;lambda_get_item_role&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;arn&lt;/span&gt;
  &lt;span class="nx"&gt;runtime&lt;/span&gt;       &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"python3.13"&lt;/span&gt;
  &lt;span class="nx"&gt;handler&lt;/span&gt;       &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb_ops.get_item_handler"&lt;/span&gt;
  &lt;span class="c1"&gt;# ... more config&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Rinse and repeat&lt;/strong&gt; for each operation. Total: 11 Lambda functions (the 10 operations plus the &lt;code&gt;/tools&lt;/code&gt; discovery endpoint).&lt;/p&gt;

&lt;h2&gt;
  
  
  The 10 DynamoDB Operations
&lt;/h2&gt;

&lt;p&gt;Here's what you can do:&lt;/p&gt;

&lt;h3&gt;
  
  
  Read Operations
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;1. Get Item&lt;/strong&gt; - Fetch a single item by key&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Get user user001 from the Users table"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2. Query&lt;/strong&gt; - Find items matching a condition&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Show me all orders for user123 from the Orders table"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;3. Scan&lt;/strong&gt; - Read the entire table (with optional filters)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Scan the Products table and show me 10 items"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;4. Batch Get&lt;/strong&gt; - Fetch multiple items at once&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Get users user001, user002, and user003 from Users table"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;5. List Tables&lt;/strong&gt; - See all DynamoDB tables&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"What DynamoDB tables do I have?"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;6. Describe Table&lt;/strong&gt; - Get table metadata&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Describe the Users table structure"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;7. Count Items&lt;/strong&gt; - Get approximate table size&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"How many items are in the Users table?"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Write Operations
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;8. Put Item&lt;/strong&gt; - Add or replace an item&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Add a user with userId user011, name Kate Brown to Users table"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;9. Update Item&lt;/strong&gt; - Modify specific attributes&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Update the role to Senior Engineer for user001"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;10. Delete Item&lt;/strong&gt; - Remove an item&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Delete user user005 from the Users table"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Bonus: The Sample Table
&lt;/h2&gt;

&lt;p&gt;We include an optional &lt;code&gt;sample-table.tf&lt;/code&gt; that creates a "Users" table with 10 realistic user records:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight hcl"&gt;&lt;code&gt;&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_dynamodb_table"&lt;/span&gt; &lt;span class="s2"&gt;"users_sample"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt;         &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Users"&lt;/span&gt;
  &lt;span class="nx"&gt;billing_mode&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"PAY_PER_REQUEST"&lt;/span&gt;  &lt;span class="c1"&gt;# No fixed costs!&lt;/span&gt;
  &lt;span class="nx"&gt;hash_key&lt;/span&gt;     &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"userId"&lt;/span&gt;

  &lt;span class="c1"&gt;# ... schema definition&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_dynamodb_table_item"&lt;/span&gt; &lt;span class="s2"&gt;"user_1"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;table_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;aws_dynamodb_table&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;users_sample&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;

  &lt;span class="nx"&gt;item&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;jsonencode&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="nx"&gt;userId&lt;/span&gt;     &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;S&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"user001"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="nx"&gt;name&lt;/span&gt;       &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;S&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Alice Johnson"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="nx"&gt;email&lt;/span&gt;      &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;S&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"alice.johnson@example.com"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="nx"&gt;role&lt;/span&gt;       &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;S&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Software Engineer"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="nx"&gt;department&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;S&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Engineering"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="nx"&gt;active&lt;/span&gt;     &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;BOOL&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="c1"&gt;# ... more fields&lt;/span&gt;
  &lt;span class="p"&gt;})&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Perfect for testing!&lt;/strong&gt; Deploy once, start asking questions immediately.&lt;/p&gt;

&lt;p&gt;Don't need it? Just delete the file or rename it to &lt;code&gt;sample-table.tf.disabled&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Deploy This
&lt;/h2&gt;

&lt;p&gt;Ready to try it? Here's the journey:&lt;/p&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# You need these installed&lt;/span&gt;
aws &lt;span class="nt"&gt;--version&lt;/span&gt;          &lt;span class="c"&gt;# AWS CLI&lt;/span&gt;
terraform &lt;span class="nt"&gt;--version&lt;/span&gt;    &lt;span class="c"&gt;# Terraform&lt;/span&gt;
jq &lt;span class="nt"&gt;--version&lt;/span&gt;          &lt;span class="c"&gt;# JSON processor&lt;/span&gt;
bash &lt;span class="nt"&gt;--version&lt;/span&gt;        &lt;span class="c"&gt;# Bash 4+&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Make sure your AWS credentials are configured:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws sts get-caller-identity
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 1: Clone and Deploy
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Clone the repo (or create from the code)&lt;/span&gt;
&lt;span class="nb"&gt;cd &lt;/span&gt;AWSServerlessMCP

&lt;span class="c"&gt;# Run the magic script&lt;/span&gt;
./apply.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This script:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;✅ Validates your environment&lt;/li&gt;
&lt;li&gt;✅ Deploys all 11 Lambda functions via Terraform&lt;/li&gt;
&lt;li&gt;✅ Creates API Gateway routes&lt;/li&gt;
&lt;li&gt;✅ Generates IAM user for the proxy&lt;/li&gt;
&lt;li&gt;✅ Stores credentials in Secrets Manager&lt;/li&gt;
&lt;li&gt;✅ Generates Claude Desktop config&lt;/li&gt;
&lt;li&gt;✅ Runs validation tests&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Total deployment time:&lt;/strong&gt; ~2-3 minutes&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure Claude Desktop
&lt;/h3&gt;

&lt;p&gt;The script generates &lt;code&gt;02-proxy/claude_desktop_config_sh.json&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"dynamodb"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"bash"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"/path/to/proxy.sh"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"env"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"MCP_ACCESS_KEY_ID"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"AKIA..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"MCP_SECRET_ACCESS_KEY"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"MCP_API_ENDPOINT"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://....execute-api.us-east-1.amazonaws.com"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"MCP_REGION"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"us-east-1"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Copy this&lt;/strong&gt; to your Claude Desktop config:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;macOS&lt;/strong&gt;: &lt;code&gt;~/Library/Application Support/Claude/claude_desktop_config.json&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Linux&lt;/strong&gt;: &lt;code&gt;~/.config/Claude/claude_desktop_config.json&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step 3: Restart Claude Desktop
&lt;/h3&gt;

&lt;p&gt;Close and reopen Claude Desktop. You should see DynamoDB tools appear!&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Start Asking Questions!
&lt;/h3&gt;

&lt;p&gt;Try these:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"List all my DynamoDB tables"

"Describe the Users table"

"Show me all users from the Users table"

"Get user user001 from Users table"

"Add a new user with userId user011, name John Doe, 
 email john@example.com to the Users table"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Security Deep Dive
&lt;/h2&gt;

&lt;p&gt;Let's talk about how we keep this secure:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. IAM Authentication
&lt;/h3&gt;

&lt;p&gt;Every request goes through this flow:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Request → Proxy signs with AWS SigV4 → API Gateway validates signature → Lambda executes
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;No signature = No access.&lt;/strong&gt; Period.&lt;/p&gt;
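&lt;p&gt;There's no magic in SigV4: it's HMAC-SHA256 chained over the request date, region, and service. This sketch shows the standard signing-key derivation step (the proxy does the equivalent in bash):&lt;/p&gt;

```python
import hashlib
import hmac


def _hmac(key, msg):
    """HMAC-SHA256 of a string message under a byte key."""
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()


def sigv4_signing_key(secret_key, date_stamp, region, service):
    """Derive the SigV4 signing key: an HMAC chain over date, region, service."""
    k_date = _hmac(("AWS4" + secret_key).encode("utf-8"), date_stamp)
    k_region = _hmac(k_date, region)
    k_service = _hmac(k_region, service)
    return _hmac(k_service, "aws4_request")


# The Authorization header signs a canonical form of the request with this key;
# API Gateway recomputes the same signature server-side and compares them.
```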

&lt;h3&gt;
  
  
  2. Scoped Permissions
&lt;/h3&gt;

&lt;p&gt;The proxy IAM user has exactly ONE permission:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"execute-api:Invoke"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:execute-api:us-east-1:ACCOUNT:API_ID/*/*"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It can call the API. &lt;strong&gt;Nothing else.&lt;/strong&gt; Can't create EC2 instances, can't delete S3 buckets, can't read secrets.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Lambda Isolation
&lt;/h3&gt;

&lt;p&gt;Each Lambda has scoped DynamoDB permissions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;get-item Lambda    → Can only read
put-item Lambda    → Can only write
delete-item Lambda → Can only delete
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Even if you somehow bypass API Gateway (you can't), each Lambda is isolated.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Audit Trail
&lt;/h3&gt;

&lt;p&gt;Every action is logged:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_audit_log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;user&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;headers&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{}).&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;x-mcp-user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;unknown&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;AUDIT tool=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; user=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;user&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;CloudWatch Logs capture:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Who made the request (your username)&lt;/li&gt;
&lt;li&gt;What tool was called&lt;/li&gt;
&lt;li&gt;When it happened&lt;/li&gt;
&lt;li&gt;What the result was&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  5. No Secrets in Code
&lt;/h3&gt;

&lt;p&gt;Credentials live in AWS Secrets Manager:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws secretsmanager get-secret-value &lt;span class="nt"&gt;--secret-id&lt;/span&gt; dynamodb-mcp-proxy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Never in your codebase. Never in environment variables you might accidentally commit.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cost Analysis
&lt;/h2&gt;

&lt;p&gt;"How much does this cost to run?"&lt;/p&gt;

&lt;p&gt;Let's break it down:&lt;/p&gt;

&lt;h3&gt;
  
  
  AWS Free Tier (First 12 Months):
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Lambda&lt;/strong&gt;: 1M requests/month free + 400,000 GB-seconds compute&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API Gateway&lt;/strong&gt;: 1M API calls/month free&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DynamoDB&lt;/strong&gt;: 25 GB storage + 25 read and 25 write capacity units&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  After Free Tier:
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Lambda&lt;/strong&gt;: $0.20 per 1M requests + $0.0000166667 per GB-second&lt;/p&gt;

&lt;p&gt;Example calculation for 10,000 queries/month:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Requests: 10,000 × $0.20/1M = &lt;strong&gt;$0.002&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Compute (128MB, 200ms avg): 10,000 × 0.2s × 0.125GB × $0.0000166667 = &lt;strong&gt;$0.004&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Total Lambda: ~$0.006/month&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;API Gateway&lt;/strong&gt;: $1.00 per 1M requests&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;10,000 requests = &lt;strong&gt;$0.01/month&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;DynamoDB&lt;/strong&gt;: Pay-per-request pricing&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;$1.25 per 1M write requests&lt;/li&gt;
&lt;li&gt;$0.25 per 1M read requests&lt;/li&gt;
&lt;li&gt;10,000 reads = &lt;strong&gt;$0.003/month&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Secrets Manager&lt;/strong&gt;: $0.40/month per secret&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;$0.40/month&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Total for 10,000 queries/month: &lt;strong&gt;~$0.42&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;For a hobby project? Basically free. For production? Scales linearly with usage.&lt;/p&gt;
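&lt;p&gt;You can sanity-check that total with a few lines of arithmetic:&lt;/p&gt;

```python
# Back-of-envelope check of the monthly estimate above (post-free-tier prices).
queries = 10_000

lambda_requests = queries * 0.20 / 1_000_000           # $0.20 per 1M requests
lambda_compute = queries * 0.2 * 0.125 * 0.0000166667  # 200 ms at 128 MB
api_gateway = queries * 1.00 / 1_000_000               # $1.00 per 1M requests
dynamodb_reads = queries * 0.25 / 1_000_000            # $0.25 per 1M reads
secrets_manager = 0.40                                 # flat rate per secret

total = (lambda_requests + lambda_compute + api_gateway
         + dynamodb_reads + secrets_manager)
print(f"${total:.2f}/month")  # → $0.42/month
```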

&lt;h2&gt;
  
  
  Common Patterns and Best Practices
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Pattern 1: Query with Filters
&lt;/h3&gt;

&lt;p&gt;Instead of scanning, use query when possible:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Efficient - uses partition key
&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Query Orders table where userId equals user123&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="c1"&gt;# Less efficient - full table scan
&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Scan Orders table and filter by userId user123&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
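&lt;p&gt;The difference shows up in the underlying request: a query names the partition key, while a scan reads every item and filters afterwards. The request shapes (illustrative) make this concrete:&lt;/p&gt;

```python
# Query: DynamoDB jumps straight to the user123 partition.
query_request = {
    "TableName": "Orders",
    "KeyConditionExpression": "userId = :uid",
    "ExpressionAttributeValues": {":uid": {"S": "user123"}},
}

# Scan: DynamoDB reads every item, then discards the non-matches.
# Same results, but you pay read capacity for the whole table.
scan_request = {
    "TableName": "Orders",
    "FilterExpression": "userId = :uid",
    "ExpressionAttributeValues": {":uid": {"S": "user123"}},
}
```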



&lt;h3&gt;
  
  
  Pattern 2: Batch Operations
&lt;/h3&gt;

&lt;p&gt;Fetch multiple items in one call:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# One request for three items
&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Get users user001, user002, user003 using batch get&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="c1"&gt;# Better than three separate requests
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Pattern 3: Conditional Updates
&lt;/h3&gt;

&lt;p&gt;Use update expressions for atomic operations:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Update the counter by incrementing it by 1 for item user001&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This translates to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;UpdateExpression&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SET #counter = #counter + :inc&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Atomic, no race conditions.&lt;/p&gt;
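&lt;p&gt;The full &lt;code&gt;UpdateItem&lt;/code&gt; parameters for that increment would look roughly like this (table, key, and attribute names are illustrative):&lt;/p&gt;

```python
# Illustrative UpdateItem parameters for an atomic increment.
# ExpressionAttributeNames sidesteps reserved words; the addition happens
# server-side, so concurrent updates never clobber each other.
update_params = {
    "TableName": "Users",
    "Key": {"userId": {"S": "user001"}},
    "UpdateExpression": "SET #counter = #counter + :inc",
    "ExpressionAttributeNames": {"#counter": "counter"},
    "ExpressionAttributeValues": {":inc": {"N": "1"}},
    "ReturnValues": "UPDATED_NEW",
}
```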

&lt;h2&gt;
  
  
  Extending the System
&lt;/h2&gt;

&lt;p&gt;Want to add a new operation? Here's how:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Add Handler to Python
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# In dynamodb_ops.py
&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;batch_write_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Bulk write multiple items.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;params&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;_parse_json_body&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="c1"&gt;# ... implementation
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;_response&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Successfully wrote N items&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Add to TOOL_REGISTRY
&lt;/span&gt;&lt;span class="n"&gt;TOOL_REGISTRY&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;name&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;dynamodb_batch_write&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;description&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Write multiple items in one request&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;inputSchema&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{...},&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;route&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/dynamodb/batch-write&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Create Terraform File
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight hcl"&gt;&lt;code&gt;&lt;span class="c1"&gt;# lambda-batch-write.tf&lt;/span&gt;

&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_iam_role"&lt;/span&gt; &lt;span class="s2"&gt;"lambda_batch_write_role"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb-batch-write-role"&lt;/span&gt;
  &lt;span class="c1"&gt;# ... role definition&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_iam_role_policy"&lt;/span&gt; &lt;span class="s2"&gt;"lambda_batch_write_dynamodb"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;policy&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;jsonencode&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="nx"&gt;Statement&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt;
      &lt;span class="nx"&gt;Effect&lt;/span&gt;   &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"Allow"&lt;/span&gt;
      &lt;span class="nx"&gt;Action&lt;/span&gt;   &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"dynamodb:BatchWriteItem"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
      &lt;span class="nx"&gt;Resource&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="p"&gt;}]&lt;/span&gt;
  &lt;span class="p"&gt;})&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_lambda_function"&lt;/span&gt; &lt;span class="s2"&gt;"lambda_batch_write"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;function_name&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb-batch-write"&lt;/span&gt;
  &lt;span class="nx"&gt;handler&lt;/span&gt;       &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb_ops.batch_write_handler"&lt;/span&gt;
  &lt;span class="c1"&gt;# ... function config&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  3. Update API Gateway
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight hcl"&gt;&lt;code&gt;&lt;span class="c1"&gt;# In api.tf&lt;/span&gt;

&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_apigatewayv2_integration"&lt;/span&gt; &lt;span class="s2"&gt;"batch_write_integration"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;api_id&lt;/span&gt;          &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;aws_apigatewayv2_api&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;dynamodb_api&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;
  &lt;span class="nx"&gt;integration_uri&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;aws_lambda_function&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;lambda_batch_write&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;invoke_arn&lt;/span&gt;
  &lt;span class="c1"&gt;# ... integration config&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_apigatewayv2_route"&lt;/span&gt; &lt;span class="s2"&gt;"batch_write_route"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;api_id&lt;/span&gt;    &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;aws_apigatewayv2_api&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;dynamodb_api&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;
  &lt;span class="nx"&gt;route_key&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"POST /dynamodb/batch-write"&lt;/span&gt;
  &lt;span class="nx"&gt;target&lt;/span&gt;    &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"integrations/${aws_apigatewayv2_integration.batch_write_integration.id}"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. Deploy
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;./apply.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;That's it!&lt;/strong&gt; The proxy auto-discovers the new tool on next startup.&lt;/p&gt;

&lt;h2&gt;
  
  
  Real-World Use Cases
&lt;/h2&gt;

&lt;p&gt;Where does this shine?&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Data Exploration
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Show me all users who joined in 2023"
"How many active subscriptions do we have?"
"What's the average age of users in the Engineering department?"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Natural language beats writing DynamoDB queries.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Quick CRUD Operations
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Add a test user for QA testing"
"Update the status to active for order order123"
"Delete all test data with prefix test-"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;No need to open AWS Console.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Database Migrations
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Scan the Users table and show me all items missing the email field"
"Update all users in the Premium tier to add a credits field with value 100"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your AI assistant can help you identify and fix data inconsistencies.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Monitoring and Alerts
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"How many failed login attempts in the last hour?"
"Show me all orders with status pending older than 24 hours"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Quick operational queries without building dashboards.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Developer Productivity
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Create a sample order for testing the checkout flow"
"Copy user user001 to user001-backup"
"Show me the schema of the Products table"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Faster than clicking through the console.&lt;/p&gt;

&lt;h2&gt;
  
  
  Lessons Learned
&lt;/h2&gt;

&lt;p&gt;Building this taught me some valuable lessons:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Start with Security
&lt;/h3&gt;

&lt;p&gt;We didn't bolt on IAM later - it was there from day one. That made all subsequent decisions easier.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Simplicity Scales
&lt;/h3&gt;

&lt;p&gt;One Python file. Simple Terraform. No fancy frameworks. Yet it handles thousands of requests/day without breaking a sweat.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Developer Experience Matters
&lt;/h3&gt;

&lt;p&gt;The fact that you can ask questions in plain English? That's not a gimmick. It genuinely changes how you interact with your data.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Observability is Free (Almost)
&lt;/h3&gt;

&lt;p&gt;CloudWatch Logs, CloudTrail, X-Ray tracing - all built into Lambda. We didn't build a monitoring system; we just used what AWS gives us.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. The Proxy Pattern Works
&lt;/h3&gt;

&lt;p&gt;Keeping the proxy thin and stateless was the right call. All complexity lives in Lambda where we can update it independently.&lt;/p&gt;

&lt;h2&gt;
  
  
  Troubleshooting Tips
&lt;/h2&gt;

&lt;p&gt;Hit a snag? Here's how to debug:&lt;/p&gt;

&lt;h3&gt;
  
  
  Problem: Proxy won't connect
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Check AWS credentials&lt;/span&gt;
aws sts get-caller-identity

&lt;span class="c"&gt;# Test API Gateway directly&lt;/span&gt;
aws lambda invoke &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--function-name&lt;/span&gt; dynamodb-list-tables &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--payload&lt;/span&gt; &lt;span class="s1"&gt;'{}'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  /tmp/out.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Problem: Permission denied
&lt;/h3&gt;

&lt;p&gt;Check that the IAM user has the &lt;code&gt;execute-api:Invoke&lt;/code&gt; permission:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws iam get-user-policy &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--user-name&lt;/span&gt; dynamodb-mcp-proxy &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--policy-name&lt;/span&gt; dynamodb-mcp-proxy-invoke
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
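&lt;p&gt;For reference, the inline policy should look roughly like this; the account ID, API ID, and region below are placeholders:&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "execute-api:Invoke",
      "Resource": "arn:aws:execute-api:us-east-1:123456789012:abc123def/*"
    }
  ]
}
```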



&lt;h3&gt;
  
  
  Problem: Lambda timeout
&lt;/h3&gt;

&lt;p&gt;Increase timeout in Terraform:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight hcl"&gt;&lt;code&gt;&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_lambda_function"&lt;/span&gt; &lt;span class="s2"&gt;"lambda_scan"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;timeout&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;30&lt;/span&gt;  &lt;span class="c1"&gt;# Increase from 15 to 30 seconds&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Problem: Can't find table
&lt;/h3&gt;

&lt;p&gt;Verify table exists:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws dynamodb list-tables
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then check that the Lambda execution role has permission to access it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Future Enhancements
&lt;/h2&gt;

&lt;p&gt;Where could this go?&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Multi-Region Support
&lt;/h3&gt;

&lt;p&gt;Deploy to multiple regions, let Kiro route to the nearest one:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight hcl"&gt;&lt;code&gt;&lt;span class="nx"&gt;module&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb_mcp_us_east"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;source&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"./modules/dynamodb-mcp"&lt;/span&gt;
  &lt;span class="nx"&gt;region&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"us-east-1"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;module&lt;/span&gt; &lt;span class="s2"&gt;"dynamodb_mcp_eu_west"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;source&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"./modules/dynamodb-mcp"&lt;/span&gt;
  &lt;span class="nx"&gt;region&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"eu-west-1"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Advanced Query Support
&lt;/h3&gt;

&lt;p&gt;Add support for complex queries:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Find all users where age &amp;gt; 25 AND department = Engineering 
 AND active = true, sorted by joinDate&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  3. Transaction Support
&lt;/h3&gt;

&lt;p&gt;DynamoDB supports transactions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;transaction_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Execute multiple operations atomically.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;dynamodb&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;transact_write_items&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;TransactItems&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
            &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Put&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{...}},&lt;/span&gt;
            &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Update&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{...}},&lt;/span&gt;
            &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Delete&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{...}}&lt;/span&gt;
        &lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. Stream Processing
&lt;/h3&gt;

&lt;p&gt;React to DynamoDB changes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Alert me when a new order is created&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Update the analytics table whenever a user signs up&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Use DynamoDB Streams + Lambda triggers.&lt;/p&gt;
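&lt;p&gt;A minimal sketch of such a trigger, assuming an Orders table whose items carry an &lt;code&gt;orderId&lt;/code&gt; attribute (a real handler would publish to SNS rather than return the alerts):&lt;/p&gt;

```python
def stream_handler(event, context):
    """Sketch of a DynamoDB Streams trigger: collect newly inserted orders.

    The 'orderId' field name is an assumption for illustration.
    """
    alerts = []
    for record in event.get("Records", []):
        # Only react to brand-new items, not updates or deletes
        if record.get("eventName") == "INSERT":
            new_image = record["dynamodb"].get("NewImage", {})
            order_id = new_image.get("orderId", {}).get("S", "unknown")
            alerts.append(f"New order created: {order_id}")
    return alerts
```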

&lt;h3&gt;
  
  
  5. Cost Optimization
&lt;/h3&gt;

&lt;p&gt;Switch to provisioned capacity for predictable workloads (reserved capacity can then be purchased on top for further savings):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight hcl"&gt;&lt;code&gt;&lt;span class="nx"&gt;resource&lt;/span&gt; &lt;span class="s2"&gt;"aws_dynamodb_table"&lt;/span&gt; &lt;span class="s2"&gt;"users"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;billing_mode&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"PROVISIONED"&lt;/span&gt;
  &lt;span class="nx"&gt;read_capacity&lt;/span&gt;  &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
  &lt;span class="nx"&gt;write_capacity&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  6. Multi-Table Operations
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Join Users table with Orders table on userId 
 and show me total order value per user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Execute multiple queries and aggregate in Lambda.&lt;/p&gt;
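&lt;p&gt;The aggregation step can be sketched in plain Python; here the two tables' items are stand-ins for the results of the underlying scans or queries:&lt;/p&gt;

```python
from collections import defaultdict

def total_order_value_per_user(users, orders):
    """Client-side 'join': aggregate order totals by userId.

    Sketch only - in the real Lambda, 'users' and 'orders' would be
    the item lists returned by two separate DynamoDB operations.
    """
    totals = defaultdict(float)
    for order in orders:
        totals[order["userId"]] += order["amount"]
    # Report a total for every user, including those with no orders
    return {u["userId"]: totals.get(u["userId"], 0.0) for u in users}
```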

&lt;h2&gt;
  
  
  Comparison with Alternatives
&lt;/h2&gt;

&lt;p&gt;How does this stack up?&lt;/p&gt;

&lt;h3&gt;
  
  
  vs. Local MCP Server
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Local Server:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ Lower latency&lt;/li&gt;
&lt;li&gt;✅ No AWS costs&lt;/li&gt;
&lt;li&gt;❌ Runs only on your machine&lt;/li&gt;
&lt;li&gt;❌ Need to manage runtime dependencies&lt;/li&gt;
&lt;li&gt;❌ No centralized updates&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Serverless (Ours):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ Works for your whole team&lt;/li&gt;
&lt;li&gt;✅ No runtime to manage&lt;/li&gt;
&lt;li&gt;✅ Built-in scaling&lt;/li&gt;
&lt;li&gt;✅ AWS-level security&lt;/li&gt;
&lt;li&gt;❌ Small latency overhead (~100-200ms)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  vs. Direct DynamoDB Access
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Direct Access (boto3):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ Maximum control&lt;/li&gt;
&lt;li&gt;✅ Lowest latency&lt;/li&gt;
&lt;li&gt;❌ Requires coding for every query&lt;/li&gt;
&lt;li&gt;❌ No natural language interface&lt;/li&gt;
&lt;li&gt;❌ Harder to audit&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;MCP (Ours):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ Natural language queries&lt;/li&gt;
&lt;li&gt;✅ Audit trail built-in&lt;/li&gt;
&lt;li&gt;✅ Non-technical users can query&lt;/li&gt;
&lt;li&gt;❌ Limited to predefined operations&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  vs. AWS Data API
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;AWS Data API:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Only for Aurora Serverless&lt;/li&gt;
&lt;li&gt;HTTP-based queries&lt;/li&gt;
&lt;li&gt;SQL interface&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Ours:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ Works with DynamoDB&lt;/li&gt;
&lt;li&gt;✅ NoSQL operations&lt;/li&gt;
&lt;li&gt;✅ Natural language interface&lt;/li&gt;
&lt;li&gt;✅ MCP integration&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Key Takeaways
&lt;/h2&gt;

&lt;p&gt;If you remember nothing else, remember this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;MCP is powerful&lt;/strong&gt; - It's not hype. It genuinely changes how we interact with data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Serverless fits MCP perfectly&lt;/strong&gt; - Centralized, scalable, secure. All the things MCP needs.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Security first, always&lt;/strong&gt; - IAM, scoped permissions, audit logs. Build it in from day one.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Plain-text responses win&lt;/strong&gt; - Optimize for conversation, not computation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Keep it simple&lt;/strong&gt; - One Python file, clear Terraform, no magic. Simplicity scales.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The proxy pattern works&lt;/strong&gt; - Thin client, fat backend. Update independently.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Try It Yourself!
&lt;/h2&gt;

&lt;p&gt;Ready to build your own? Here's the complete source:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;GitHub&lt;/strong&gt;: [Link to your repo]&lt;/p&gt;

&lt;p&gt;Deploy in 3 commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone &lt;span class="o"&gt;[&lt;/span&gt;your-repo]
&lt;span class="nb"&gt;cd &lt;/span&gt;AWSServerlessMCP
./apply.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Questions? Hit me up in the comments! I'd love to hear:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What other AWS services would you want MCP tools for?&lt;/li&gt;
&lt;li&gt;What improvements would you make?&lt;/li&gt;
&lt;li&gt;What challenges did you face deploying it?&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;We started with a simple question: "Can I ask Kiro to query my database?"&lt;/p&gt;

&lt;p&gt;We ended with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ A production-ready serverless MCP backend&lt;/li&gt;
&lt;li&gt;✅ 10 DynamoDB operations as natural language tools&lt;/li&gt;
&lt;li&gt;✅ Secure, scalable, and cost-effective&lt;/li&gt;
&lt;li&gt;✅ Deployable in under 5 minutes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is just the beginning. MCP is going to change how we build AI-powered tools. The future isn't about building smarter AI - it's about giving AI better tools.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What will you build with MCP?&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Found this helpful? Give it a ❤️ and follow for more serverless + AI content!&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Have questions or improvements? Drop them in the comments - I read every one!&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Further Reading
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://spec.modelcontextprotocol.io/" rel="noopener noreferrer"&gt;Model Context Protocol Specification&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/lambda/latest/dg/best-practices.html" rel="noopener noreferrer"&gt;AWS Lambda Best Practices&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/dynamodb/" rel="noopener noreferrer"&gt;DynamoDB Developer Guide&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html" rel="noopener noreferrer"&gt;AWS IAM Best Practices&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Connect with me:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;GitHub: &lt;a href="https://github.com/yeshwanthlm" rel="noopener noreferrer"&gt;https://github.com/yeshwanthlm&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;LinkedIn: &lt;a href="https://www.linkedin.com/in/yeshwanth-l-m/" rel="noopener noreferrer"&gt;https://www.linkedin.com/in/yeshwanth-l-m/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;YouTube: &lt;a href="https://www.youtube.com/@TechWithYeshwanth" rel="noopener noreferrer"&gt;https://www.youtube.com/@TechWithYeshwanth&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;strong&gt;Tags&lt;/strong&gt;: #aws #serverless #lambda #dynamodb #ai #claude #mcp #terraform #python #devops&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>ai</category>
      <category>mcp</category>
    </item>
    <item>
      <title>re:Play 2025</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Sun, 07 Dec 2025 14:10:03 +0000</pubDate>
      <link>https://forem.com/yeshwanthlm/replay-2025-52oo</link>
      <guid>https://forem.com/yeshwanthlm/replay-2025-52oo</guid>
      <description>&lt;p&gt;re:Invent Week AWS News Round-Up Summary&lt;/p&gt;

&lt;p&gt;This week's AWS news was dominated by &lt;strong&gt;re:Invent&lt;/strong&gt; announcements, with a massive flow of new services and features across multiple domains.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Performance and Compute Power
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS Graviton5 CPU:&lt;/strong&gt; Unveiled for up to &lt;strong&gt;30% better performance&lt;/strong&gt; and &lt;strong&gt;40% better price performance&lt;/strong&gt; than Graviton4 for general-purpose workloads, launching with new &lt;strong&gt;Amazon EC2 M9g instances&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lambda Managed Instances:&lt;/strong&gt; A new way to run Lambda functions on &lt;strong&gt;EC2 compute&lt;/strong&gt; with serverless simplicity, granting access to specialized hardware and flexible EC2 pricing while AWS manages the infrastructure.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  2. Serverless and Data Modernization
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Lambda Durable Functions:&lt;/strong&gt; Allows for the orchestration of multi-step applications and AI workflows directly in Lambda, featuring &lt;strong&gt;automatic checkpointing&lt;/strong&gt; and year-long waits.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon S3 Object Size Increase:&lt;/strong&gt; The maximum object size limit has been significantly increased from 5 TB to &lt;strong&gt;50 TB&lt;/strong&gt;, supporting massive datasets like AI training corpora and high-resolution video.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  3. Cost Optimization
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Database Savings Plans:&lt;/strong&gt; New, flexible, commit-based discounts of up to &lt;strong&gt;35%&lt;/strong&gt; for AWS managed databases across engines and regions, simplifying long-term cost optimization.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon RDS for SQL Server:&lt;/strong&gt; Now supports &lt;strong&gt;Microsoft SQL Server 2022 Developer Edition&lt;/strong&gt; to help cut non-production licensing costs.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  4. AI, Agents, and Foundation Models
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;New Foundation Models in Amazon Bedrock:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Mistral AI&lt;/strong&gt; models (fast, cost-effective options).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Nova 2 Lite and Nova 2 Pro (Preview)&lt;/strong&gt; with advanced step-by-step reasoning and a large &lt;strong&gt;1M-token context window&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Agentic Services (AI-Powered Teammates - Previews):&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS Security Agent:&lt;/strong&gt; Frontier agentic approach for &lt;strong&gt;AppSec&lt;/strong&gt;, code analysis, and on-demand penetration testing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS DevOps Agent:&lt;/strong&gt; Autonomous on-call teammate to accelerate &lt;strong&gt;incident response&lt;/strong&gt;, correlate metrics/logs, and recommend resilience improvements.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Kiro’s New Autonomous Agent:&lt;/strong&gt; An AI dev teammate that runs multi-repo tasks and ships coordinated pull requests.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  5. Security and Observability
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Security Hub &amp;amp; GuardDuty Enhancements:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS Security Hub:&lt;/strong&gt; Now Generally Available with near real-time risk analytics and unified exposure views.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GuardDuty Extended Threat Detection:&lt;/strong&gt; Adds unified, AI-powered attack sequence findings for EC2 and ECS.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;CloudWatch Unification:&lt;/strong&gt; Now unifies log data management and analytics (operations, security, compliance) with new features like &lt;strong&gt;OCSF/OTel normalization&lt;/strong&gt; and &lt;strong&gt;AI-powered queries&lt;/strong&gt;.&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;X-Ray Transition to OpenTelemetry:&lt;/strong&gt; Encouraging customers to adopt the open, vendor-neutral tracing standard.&lt;/li&gt;

&lt;/ul&gt;




&lt;h3&gt;
  
  
Key Takeaways
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;The Big Theme:&lt;/strong&gt; A massive acceleration in &lt;strong&gt;AI and Agents&lt;/strong&gt; across every part of the AWS stack (DevOps, Security, and App Dev).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Performance Jump:&lt;/strong&gt; Graviton continues to lead with major performance and cost efficiency gains.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Simplicity and Scale:&lt;/strong&gt; The S3 limit increase and Lambda's move to durable functions are game-changers for large-scale data and complex serverless workflows.&lt;/li&gt;
&lt;/ul&gt;

</description>
    </item>
    <item>
      <title>🤖 AWS Outage (Oct 2025): Breakdown &amp; Lessons</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Wed, 22 Oct 2025 06:38:51 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-outage-oct-2025-breakdown-lessons-5f06</link>
      <guid>https://forem.com/aws-builders/aws-outage-oct-2025-breakdown-lessons-5f06</guid>
      <description>&lt;h2&gt;
  
  
  🌎 The Big Picture: What Happened?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;When:&lt;/strong&gt; Monday, October 20, 2025 (for about 15 hours).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;What:&lt;/strong&gt; A massive portion of the internet stopped working. Apps like Roblox, Snapchat, Duolingo, and even services like Alexa and Ring doorbells went offline.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Where:&lt;/strong&gt; The failure started in &lt;strong&gt;AWS US-EAST-1&lt;/strong&gt; (Northern Virginia). This is the oldest, largest, and most important AWS data center region in the world.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Why:&lt;/strong&gt; It was &lt;strong&gt;not&lt;/strong&gt; a hack. It was an internal technical failure that caused a massive chain reaction (a cascading failure).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  💥 The Story: A Cascade of Failures
&lt;/h2&gt;

&lt;p&gt;The failure was like a set of dominos falling.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;The First Domino (The Monitor):&lt;/strong&gt; A tiny, internal AWS system that monitors the health of its own &lt;strong&gt;Network Load Balancers (NLBs)&lt;/strong&gt; glitched.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Second Domino (The Traffic Cop):&lt;/strong&gt; Because the monitor failed, the NLBs (the "traffic cops" that direct data) also failed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Third Domino (The Phonebook):&lt;/strong&gt; &lt;strong&gt;DynamoDB&lt;/strong&gt; (a critical database used by thousands of apps) relied on those "traffic cops" for its &lt;strong&gt;DNS&lt;/strong&gt; (the internet's "phonebook"). When the cops failed, the phonebook entry for DynamoDB went blank.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Final Collapse:&lt;/strong&gt; Apps across the internet tried to "call" DynamoDB but couldn't find its "phone number." This caused them to fail. The failure then spread to other core services in the region, like &lt;strong&gt;EC2&lt;/strong&gt; (servers) and &lt;strong&gt;IAM&lt;/strong&gt; (logins), bringing down the entire region's "management layer."&lt;/li&gt;
&lt;/ol&gt;

&lt;blockquote&gt;
&lt;p&gt;Simple Analogy: A tiny fuse for the airport's control tower blew. This made the air traffic controllers go blind. Because they were blind, they couldn't tell planes which runway to land on. Soon, no planes could land (DynamoDB), and this caused the entire airport to shut down (the whole region).&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  🧑‍💻 3 Hard Lessons for Engineers
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;The &lt;code&gt;us-east-1&lt;/code&gt; Trap:&lt;/strong&gt; We all use US-EAST-1 as our default. This outage proved many &lt;em&gt;global&lt;/em&gt; services (like IAM logins) are still secretly controlled from this one region. A failure there can break your app &lt;em&gt;everywhere&lt;/em&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Cloud is Not Magic:&lt;/strong&gt; The cloud is just someone else's computer. We must design our apps to &lt;em&gt;survive&lt;/em&gt; cloud failures. We share responsibility for resilience.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Your App is Only as Strong as its Weakest Link:&lt;/strong&gt; Thousands of apps failed because their &lt;em&gt;entire&lt;/em&gt; system was in one region, or they "hardcoded" a dependency (like &lt;code&gt;dynamodb.us-east-1.amazonaws.com&lt;/code&gt;) into their app.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  🛠️ Your 5-Step Survival Guide (DevOps Action Plan)
&lt;/h2&gt;

&lt;p&gt;Here are the concrete actions to prevent this from happening to you.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;1. Stop Confusing Multi-AZ and Multi-Region&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Multi-AZ&lt;/strong&gt; (multiple data centers in one city) is the &lt;em&gt;minimum&lt;/em&gt;. It &lt;strong&gt;would not&lt;/strong&gt; have saved you from this outage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Action:&lt;/strong&gt; Use a &lt;strong&gt;Multi-Region&lt;/strong&gt; architecture (e.g., US-EAST-1 and US-WEST-2) for critical apps. This can be Active-Passive (warm standby) or Active-Active (running in both places at once).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;2. Use DNS Failover (Your Best Friend)&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;This is non-negotiable for a multi-region setup.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Action:&lt;/strong&gt; Use &lt;strong&gt;Amazon Route 53 DNS Failover&lt;/strong&gt;. It automatically detects a failing region and sends all your users to the healthy one, like a smart GPS rerouting traffic around a crash.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;3. Design for Graceful Degradation&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Your app shouldn't be "all or nothing."&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Action:&lt;/strong&gt; Ask: "If the 'upload' feature breaks, can I just disable the button and let the user keep browsing?" Decouple your services (e.g., with SQS queues) so a failure in one part doesn't crash the whole system.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;4. Banish Hardcoded Endpoints&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Never, ever write &lt;code&gt;us-east-1&lt;/code&gt; directly in your application's code.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Action:&lt;/strong&gt; Audit your code. Use environment variables or a parameter store (like AWS SSM) to manage endpoints. Your app shouldn't care &lt;em&gt;where&lt;/em&gt; it's running.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;5. Practice Failing (Chaos Engineering)&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;The companies that survived weren't lucky; they were prepared.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Action:&lt;/strong&gt; Run a &lt;strong&gt;GameDay&lt;/strong&gt; (a simulated disaster). Intentionally break things in your test environment to find weaknesses. Ask your team, "What happens if I shut down the primary database right now?" and &lt;em&gt;test it&lt;/em&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
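&lt;p&gt;Points 3 and 4 above can be sketched in a few lines of Python; the fallback payload and default region are illustrative only:&lt;/p&gt;

```python
import os

# Point 4: resolve the region from the environment, never hardcode it
REGION = os.environ.get("AWS_REGION", "us-west-2")

def get_recommendations(fetch):
    """Point 3: degrade gracefully when a non-critical dependency fails."""
    try:
        return {"source": "live", "items": fetch()}
    except Exception:
        # Serve an empty/cached fallback instead of failing the whole page
        return {"source": "fallback", "items": []}
```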

</description>
      <category>aws</category>
      <category>techwithyeshwanth</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>AWS Bedrock Powered VPC Flow Log Analyzer 🔍</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Wed, 15 Oct 2025 11:46:27 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-bedrock-vpc-flow-log-analyzer-3gi1</link>
      <guid>https://forem.com/aws-builders/aws-bedrock-vpc-flow-log-analyzer-3gi1</guid>
      <description>&lt;h1&gt;
  
  
  Supercharge Your VPC Flow Log Analysis with Amazon Bedrock
&lt;/h1&gt;

&lt;p&gt;In today's complex and dynamic cloud environments, understanding network traffic is crucial for security, troubleshooting, and performance optimization. AWS VPC Flow Logs provide a wealth of information about the IP traffic going to and from network interfaces in your VPC. However, manually analyzing these logs can be a daunting and time-consuming task.&lt;/p&gt;

&lt;p&gt;What if you could use the power of generative AI to analyze your VPC Flow Logs using natural language? This is where the &lt;strong&gt;Amazon Bedrock-Powered VPC Flowlogs Analyzer&lt;/strong&gt; comes in. This solution, available on GitHub, leverages the capabilities of Amazon Bedrock to provide a powerful and intuitive way to query and understand your network traffic.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Challenge with VPC Flow Logs
&lt;/h2&gt;

&lt;p&gt;VPC Flow Logs are a critical source of information for network monitoring and security analysis. They can help you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Diagnose overly restrictive or permissive security group and NACL rules.&lt;/li&gt;
&lt;li&gt;Monitor traffic that is reaching your instances.&lt;/li&gt;
&lt;li&gt;Understand traffic patterns and identify anomalies.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;However, the raw data from Flow Logs is verbose and can be difficult to parse. To get meaningful insights, you often need to use specialized tools or write complex queries, which can be a barrier for many users.&lt;/p&gt;
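&lt;p&gt;To see why, here is what parsing even a single default-format flow log record looks like by hand (the sample record is synthetic):&lt;/p&gt;

```python
# Field order of the default VPC Flow Log format
FIELDS = ["version", "account_id", "interface_id", "srcaddr", "dstaddr",
          "srcport", "dstport", "protocol", "packets", "bytes",
          "start", "end", "action", "log_status"]

def parse_flow_log(line):
    """Split a default-format VPC Flow Log record into named fields."""
    return dict(zip(FIELDS, line.split()))

record = parse_flow_log(
    "2 123456789012 eni-0abc 10.0.1.5 10.0.2.9 443 49152 6 10 8400 "
    "1697370000 1697370060 ACCEPT OK"
)
```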

&lt;h2&gt;
  
  
  The Solution: A Generative AI-Powered Approach
&lt;/h2&gt;

&lt;p&gt;The Amazon Bedrock-Powered VPC Flowlogs Analyzer provides a new paradigm for interacting with your network data. Instead of writing complex queries, you can simply ask questions in plain English. For example, you could ask:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;IP Address Analysis:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"What source IP addresses do you see?"&lt;/li&gt;
&lt;li&gt;"List all destination IP addresses"&lt;/li&gt;
&lt;li&gt;"Which IP has the most traffic?"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Port and Protocol Analysis:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"What destination ports are being accessed?"&lt;/li&gt;
&lt;li&gt;"Show me all TCP connections"&lt;/li&gt;
&lt;li&gt;"Which protocols are being used?"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Security Analysis:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"Which connections were rejected?"&lt;/li&gt;
&lt;li&gt;"Show me suspicious activities"&lt;/li&gt;
&lt;li&gt;"Are there any failed connection attempts?"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Traffic Analysis:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"What's the largest data transfer?"&lt;/li&gt;
&lt;li&gt;"Show me connections to external IPs"&lt;/li&gt;
&lt;li&gt;"Which interface has the most traffic?"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The solution uses Amazon Bedrock, a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies, to understand your natural language queries and generate the appropriate code and queries to retrieve the information from your VPC Flow Logs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Natural Language Queries:&lt;/strong&gt; Ask questions about your VPC Flow Logs in plain English.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Serverless and Scalable:&lt;/strong&gt; The solution is built on a serverless architecture that can scale to handle large volumes of data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Easy to Deploy:&lt;/strong&gt; The entire infrastructure can be deployed using a single Python script.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Extensible:&lt;/strong&gt; The solution can be extended to support additional data sources and analysis capabilities.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;To get started with the Amazon Bedrock-Powered VPC Flowlogs Analyzer, you will need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An AWS account with access to Amazon Bedrock.&lt;/li&gt;
&lt;li&gt;Python 3.10+ installed on your local machine.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The solution can be deployed using a simple script that sets up all the necessary AWS resources. Once deployed, you can start querying your VPC Flow Logs using natural language through the provided interface.&lt;/p&gt;

&lt;h2&gt;
  
  
  Example Usage
&lt;/h2&gt;

&lt;p&gt;Here are a few examples of how you can use the solution to analyze your VPC Flow Logs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Identify suspicious traffic:&lt;/strong&gt; "Show me all traffic from IP address 192.0.2.1"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Troubleshoot connectivity issues:&lt;/strong&gt; "Is there any traffic being blocked by a security group?"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monitor application traffic:&lt;/strong&gt; "What are the top 10 most active IP addresses?"&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The Amazon Bedrock-Powered VPC Flowlogs Analyzer is a powerful tool that can help you unlock the full potential of your VPC Flow Logs. By leveraging the power of generative AI, you can gain deeper insights into your network traffic, improve your security posture, and optimize your cloud environment.&lt;/p&gt;

&lt;p&gt;To learn more and get started, check out the &lt;a href="https://github.com/yeshwanthlm/Amazon-Bedrock-Powered-VPC-Flowlogs-Analyzer" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Demo of the project: &lt;a href="https://youtu.be/3pC720Wd-Rk" rel="noopener noreferrer"&gt;Hands-on Demo&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>vpc</category>
      <category>aws</category>
      <category>loganalysis</category>
    </item>
    <item>
      <title>4 Ways to Transfer Files/Code From Your Local Computer to a Remote Cloud Server</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Tue, 07 Oct 2025 09:35:13 +0000</pubDate>
      <link>https://forem.com/aws-builders/file-and-code-transfer-local-machine-cloud-server-513l</link>
      <guid>https://forem.com/aws-builders/file-and-code-transfer-local-machine-cloud-server-513l</guid>
      <description>&lt;p&gt;&lt;strong&gt;File and Code Transfer: Local Machine → Cloud Server&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This guide explains four popular methods to transfer files or code from your local computer to a remote cloud server, such as AWS EC2 running Ubuntu.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. SCP (Secure Copy Protocol)&lt;/strong&gt;&lt;br&gt;
Quickly copy files or folders from your local machine to a remote server over SSH.&lt;/p&gt;

&lt;p&gt;Copy a single file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;scp -i /path/to/private/key.pem /local/file/path user@SERVER_PUBLIC_IP:/PATH/INSIDE/SERVER
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy an entire folder (the &lt;code&gt;-r&lt;/code&gt; flag copies recursively):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;scp -i /path/to/private/key.pem -r /local/FOLDER/path user@SERVER_PUBLIC_IP:/PATH/INSIDE/SERVER

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;2. S3 Method (with IAM Role on EC2)&lt;/strong&gt;&lt;br&gt;
Upload files to an S3 bucket from your local machine. Then, grant your EC2 instance permission to access S3, and transfer files directly on the instance.&lt;/p&gt;

&lt;p&gt;Sample IAM Policy for EC2 IAM Role:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"Statement"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"s3:*"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:s3:::file-transfer-demo-server-bucket"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:s3:::file-transfer-demo-server-bucket/*"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Upload from local to S3:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 cp /path/to/file s3://file-transfer-demo-server-bucket/

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Download from S3 to EC2:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 cp s3://file-transfer-demo-server-bucket/IMG_8366.jpg /home/ubuntu/

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The EC2 instance must have the proper IAM role attached with this policy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. WinSCP or Other SFTP GUI Tools&lt;/strong&gt;&lt;br&gt;
For Windows or those who prefer GUIs, use WinSCP (or FileZilla, Cyberduck) for drag-and-drop file transfer:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Connect using SFTP, your server’s IP, username (e.g., ubuntu), and your .pem SSH private key.&lt;/li&gt;
&lt;li&gt;Drag files or folders from local to remote panel.&lt;/li&gt;
&lt;li&gt;Make sure SFTP/SSH (port 22) is open.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;4. GitHub: Clone Code from Repository&lt;/strong&gt;&lt;br&gt;
For code and structured projects, push your content to a GitHub (or GitLab) repository, then on the cloud server:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git clone https://github.com/yourusername/your-repo.git

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Requires git installed on the cloud server.&lt;/li&gt;
&lt;li&gt;Best for transferring source code, not large binary files.&lt;/li&gt;
&lt;li&gt;For all SSH-based methods (SCP, SFTP, WinSCP), ensure the correct username and private key.&lt;/li&gt;
&lt;li&gt;Always protect your credentials and configure least-privilege access for IAM roles and bucket policies.&lt;/li&gt;
&lt;li&gt;For large files or many files, S3 or SCP may be more efficient than GitHub.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>tutorial</category>
      <category>cloud</category>
      <category>linux</category>
      <category>aws</category>
    </item>
    <item>
      <title>Enhance Your Cloud Development Workflow with Amazon Q CLI and MCP Servers</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Tue, 07 Oct 2025 07:19:02 +0000</pubDate>
      <link>https://forem.com/aws-builders/enhance-your-cloud-development-workflow-with-amazon-q-cli-and-mcp-servers-1bbh</link>
      <guid>https://forem.com/aws-builders/enhance-your-cloud-development-workflow-with-amazon-q-cli-and-mcp-servers-1bbh</guid>
      <description>&lt;p&gt;In the rapidly evolving landscape of cloud development, tools that streamline workflows and enhance productivity are invaluable. Amazon Q CLI, a command-line interface tool, brings intelligent assistance directly to your terminal with features like IDE-style autocomplete and agentic capabilities. When paired with Model Context Protocol (MCP) servers, Amazon Q CLI transforms into a powerful ally, offering a rich "toolbox" of functionalities for diverse development tasks. This guide will walk you through the complete setup of Amazon Q CLI and MCP servers, empowering you to automate and accelerate your AWS projects.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding Amazon Q CLI and MCP Servers&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Amazon Q CLI&lt;/strong&gt; acts as a smart assistant within your terminal, understanding your context and providing intelligent suggestions or executing tasks based on your input. It aims to reduce manual effort and cognitive load during development.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Model Context Protocol (MCP)&lt;/strong&gt; Servers are specialized applications that extend the capabilities of Amazon Q CLI. They provide specific functionalities, acting as plugins or modules that Amazon Q CLI can leverage to perform more complex operations, such as generating infrastructure diagrams, writing code, or managing Kubernetes resources. Think of them as a collection of experts that Amazon Q CLI can consult for specialized tasks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Installing Amazon Q CLI&lt;/strong&gt;&lt;br&gt;
Before you begin, ensure you have an AWS Builder ID, as it's a prerequisite for using Amazon Q CLI.&lt;br&gt;
The installation process for Amazon Q CLI is straightforward and varies slightly depending on your operating system:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For macOS:&lt;/strong&gt;&lt;br&gt;
The simplest way to install Amazon Q CLI on macOS is by using Homebrew:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;brew install --cask amazon-q&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Alternatively, you can download the installer directly and follow the on-screen prompts.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For Windows (via WSL - Windows Subsystem for Linux):&lt;/strong&gt;&lt;br&gt;
If you're a Windows user, WSL offers a seamless Linux environment to run Amazon Q CLI.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Download the Amazon Q CLI zip file.&lt;/li&gt;
&lt;li&gt;Unzip the contents to your desired location.&lt;/li&gt;
&lt;li&gt;Open your WSL terminal and navigate to the unzipped directory.&lt;/li&gt;
&lt;li&gt;Run the installation program provided within the unzipped files.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;For Linux (e.g., Ubuntu):&lt;/strong&gt;&lt;br&gt;
For Linux distributions like Ubuntu, follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;First, ensure your system is up-to-date:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;code&gt;sudo apt update &amp;amp;&amp;amp; sudo apt upgrade -y&lt;/code&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Install libfuse2, which is often a dependency:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;code&gt;sudo apt install libfuse2 -y&lt;/code&gt;&lt;/p&gt;

&lt;ol start="3"&gt;
&lt;li&gt;Download the Amazon Q CLI .deb package.
Install the .deb file using dpkg:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;code&gt;sudo dpkg -i amazon-q-cli.deb&lt;/code&gt; (replace &lt;code&gt;amazon-q-cli.deb&lt;/code&gt; with the actual filename)&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;After successful installation, you can log in to Amazon Q CLI using your AWS Builder ID:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;code&gt;q login&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Once logged in, you can start interacting with Amazon Q CLI by simply typing &lt;code&gt;q&lt;/code&gt; in your terminal.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setting up MCP Servers Locally&lt;/strong&gt;&lt;br&gt;
MCP servers can be run locally using various tools like npx, uvx, or docker. This guide will demonstrate using uvx for its simplicity and efficiency.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install uvx:&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;uvx&lt;/code&gt; is bundled with &lt;code&gt;uv&lt;/code&gt;, a fast Python package manager. If you don't have it installed, use the official install script (or &lt;code&gt;pip install uv&lt;/code&gt;):&lt;/p&gt;

&lt;p&gt;&lt;code&gt;curl -LsSf https://astral.sh/uv/install.sh | sh&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure mcp.json:&lt;/strong&gt;&lt;br&gt;
Create or modify the mcp.json file located in your Amazon Q CLI configuration directory: ~/.aws/amazonq/mcp.json. This file defines the MCP servers that Amazon Q CLI will recognize and utilize.&lt;/p&gt;

&lt;p&gt;Here's an example of how your mcp.json might look, including configurations for an AWS CDK MCP server and an AWS Diagram MCP server:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "servers": [
    {
      "name": "awslabs.cdk-mcp-server",
      "command": "uvx",
      "args": ["awslabs.cdk-mcp-server"]
    },
    {
      "name": "awslabs.aws-diagram-mcp-server",
      "command": "uvx",
      "args": ["awslabs.aws-diagram-mcp-server"]
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this configuration:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"name": A unique identifier for the MCP server.&lt;/li&gt;
&lt;li&gt;"command": The executable command to run the MCP server (e.g., uvx).&lt;/li&gt;
&lt;li&gt;"args": Any arguments required to run the specific MCP server.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By defining these entries, you're essentially telling Amazon Q CLI where to find and how to launch these powerful extensions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Powerful Use Cases with Amazon Q CLI and MCP Servers&lt;/strong&gt;&lt;br&gt;
With Amazon Q CLI and MCP servers set up, you unlock a realm of possibilities for automating and enhancing your AWS development tasks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
Creating Architectural Diagrams:
The awslabs.aws-diagram-mcp-server is incredibly useful for visually representing your cloud infrastructure. You can describe your desired architecture in natural language, and Amazon Q CLI, leveraging the MCP server, will generate a professional diagram.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example Prompt:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;q chat "Create a fault-tolerant, highly available real-time GPS tracking system using AWS ECS Fargate, Redis (ElastiCache), Aurora Serverless, and API Gateway. Use ALB for socket routing, integrate CloudWatch for logging, and S3 for archival storage. CI/CD should be managed by CodePipeline."
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Amazon Q CLI will then process this prompt and output a detailed architectural diagram, saving you hours of manual drawing. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Generating Terraform Code:&lt;br&gt;
For Infrastructure as Code (IaC) enthusiasts, MCP servers can assist in generating Terraform configurations. Describe the AWS resources you need, and Amazon Q CLI can provide the corresponding Terraform code, significantly accelerating your IaC development.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Managing Kubernetes Resources:&lt;br&gt;
If you're working with Kubernetes on AWS (e.g., EKS), MCP servers can help you generate Kubernetes manifests (YAML files) based on your requirements, simplifying the deployment and management of containerised applications.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;br&gt;
Integrating Amazon Q CLI with MCP servers transforms your command-line experience into an intelligent and highly efficient development environment. From generating complex architectural diagrams to automating infrastructure provisioning and managing Kubernetes resources, these tools empower you to work smarter, not just harder. By following the steps outlined in this guide, you can unlock a new level of productivity in your AWS cloud development workflow.&lt;/p&gt;

</description>
      <category>cloudarchitecture</category>
      <category>aws</category>
      <category>amazonqcli</category>
      <category>mcp</category>
    </item>
    <item>
      <title>Automate Your AWS MSK Kafka Cluster with Terraform: A Complete Guide</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Tue, 07 Oct 2025 07:18:39 +0000</pubDate>
      <link>https://forem.com/aws-builders/automate-your-aws-msk-kafka-cluster-with-terraform-a-complete-guide-18li</link>
      <guid>https://forem.com/aws-builders/automate-your-aws-msk-kafka-cluster-with-terraform-a-complete-guide-18li</guid>
      <description>&lt;p&gt;In a previous post, we walked through setting up an AWS MSK cluster manually using the AWS Console. While that's great for learning, it's not repeatable, scalable, or easy to manage. Today, we're taking it to the next level with &lt;strong&gt;Infrastructure as Code (IaC)&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;We'll use &lt;strong&gt;Terraform&lt;/strong&gt; to define our entire MSK environment—VPC, subnets, security groups, IAM roles, the MSK cluster, and even a client EC2 instance—in a single set of configuration files. With a few simple commands, you can create, update, or destroy the whole setup.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What We'll Build:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A new VPC with public subnets across three Availability Zones.&lt;/li&gt;
&lt;li&gt;All necessary networking (Internet Gateway, Route Tables).&lt;/li&gt;
&lt;li&gt;A secure MSK Kafka cluster.&lt;/li&gt;
&lt;li&gt;An EC2 instance pre-configured with Kafka tools and the correct authentication settings.&lt;/li&gt;
&lt;li&gt;IAM roles and security groups that allow the EC2 instance to securely communicate with the MSK cluster.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's get started!&lt;/p&gt;




&lt;h2&gt;
  
  
  🛠️ Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before you begin, make sure you have the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;An AWS Account:&lt;/strong&gt; You'll need an AWS account with programmatic access. If you haven't already, configure your credentials locally using the AWS CLI:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Terraform Installed:&lt;/strong&gt; You'll need the Terraform CLI installed on your machine.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  How to Install Terraform
&lt;/h3&gt;

&lt;p&gt;Terraform is easy to install. Here are instructions for common operating systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;On macOS (using Homebrew):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;brew tap hashicorp/tap
brew &lt;span class="nb"&gt;install &lt;/span&gt;hashicorp/tap/terraform
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;On Windows (using Chocolatey):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;choco &lt;span class="nb"&gt;install &lt;/span&gt;terraform
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;On Linux (Debian/Ubuntu):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;wget &lt;span class="nt"&gt;-O-&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt;https://apt.releases.hashicorp.com/gpg]&lt;span class="o"&gt;(&lt;/span&gt;https://apt.releases.hashicorp.com/gpg&lt;span class="o"&gt;)&lt;/span&gt; | &lt;span class="nb"&gt;sudo &lt;/span&gt;gpg &lt;span class="nt"&gt;--dearmor&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; /usr/share/keyrings/hashicorp-archive-keyring.gpg
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] [https://apt.releases.hashicorp.com](https://apt.releases.hashicorp.com) &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;lsb_release &lt;span class="nt"&gt;-cs&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; main"&lt;/span&gt; | &lt;span class="nb"&gt;sudo tee&lt;/span&gt; /etc/apt/sources.list.d/hashicorp.list
&lt;span class="nb"&gt;sudo &lt;/span&gt;apt update &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;sudo &lt;/span&gt;apt &lt;span class="nb"&gt;install &lt;/span&gt;terraform
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;After installation, verify it's working by running:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  📜 Understanding the Terraform Code
&lt;/h2&gt;

&lt;p&gt;Save the complete Terraform configuration into a single file, such as &lt;code&gt;main.tf&lt;/code&gt;. Let's break down what each section of our Terraform configuration does.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Networking (VPC, Subnets, IGW)
&lt;/h3&gt;

&lt;p&gt;This section builds the foundational network for our resources. We create a new VPC and then provision three public subnets, one in each available AWS Availability Zone for high availability. The Internet Gateway and Route Tables ensure our EC2 instance can reach the internet.&lt;/p&gt;
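
&lt;p&gt;As a rough sketch of what this section looks like in HCL (resource names and CIDR ranges here are illustrative, not the exact values from the linked script):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Illustrative sketch only -- see the full script in the repo for the real values
resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
}

data "aws_availability_zones" "available" {}

# One public subnet per Availability Zone, for high availability
resource "aws_subnet" "public" {
  count                   = 3
  vpc_id                  = aws_vpc.main.id
  cidr_block              = cidrsubnet(aws_vpc.main.cidr_block, 8, count.index)
  availability_zone       = data.aws_availability_zones.available.names[count.index]
  map_public_ip_on_launch = true
}

# Internet access for the public subnets
resource "aws_internet_gateway" "igw" {
  vpc_id = aws_vpc.main.id
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;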

&lt;h3&gt;
  
  
  2. Security (Security Groups &amp;amp; SSH Key)
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SSH Key:&lt;/strong&gt; Terraform dynamically generates an RSA key pair. The public key is uploaded to AWS (&lt;code&gt;aws_key_pair&lt;/code&gt;), and the private key is saved locally as &lt;code&gt;msk-client-key.pem&lt;/code&gt; so you can SSH into the EC2 instance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Groups:&lt;/strong&gt; We create two security groups: one for the MSK cluster and one for the EC2 client. The rules are configured to allow the EC2 instance and the MSK cluster to communicate freely with each other on any port, while the EC2 instance only accepts incoming SSH traffic from the internet.&lt;/li&gt;
&lt;/ul&gt;
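
&lt;p&gt;The key-pair generation, for example, can be sketched like this (using the &lt;code&gt;tls&lt;/code&gt; and &lt;code&gt;local&lt;/code&gt; providers; resource names are illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Illustrative sketch -- generate an RSA key pair entirely in Terraform
resource "tls_private_key" "ssh" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

# Upload the public half to AWS for the EC2 instance
resource "aws_key_pair" "client" {
  key_name   = "msk-client-key"
  public_key = tls_private_key.ssh.public_key_openssh
}

# Save the private half locally so you can SSH in
resource "local_file" "pem" {
  content         = tls_private_key.ssh.private_key_pem
  filename        = "msk-client-key.pem"
  file_permission = "0400"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;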

&lt;h3&gt;
  
  
  3. IAM Role for EC2
&lt;/h3&gt;

&lt;p&gt;Instead of hardcoding AWS keys, we use an IAM Role. This block creates an &lt;code&gt;aws_iam_role&lt;/code&gt; that the EC2 instance can "assume." The attached &lt;code&gt;aws_iam_policy&lt;/code&gt; grants the instance specific permissions to connect, describe, read from, and write to the MSK cluster topics. This is the most secure way to grant AWS permissions to services.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. The MSK Cluster
&lt;/h3&gt;

&lt;p&gt;This is the core resource. We define an &lt;code&gt;aws_msk_cluster&lt;/code&gt; with three small broker nodes (&lt;code&gt;kafka.t3.small&lt;/code&gt;). Critically, the &lt;code&gt;client_authentication&lt;/code&gt; block is configured to use &lt;strong&gt;IAM (&lt;code&gt;iam = true&lt;/code&gt;)&lt;/strong&gt;, which allows our EC2 instance to authenticate using its assigned IAM role.&lt;/p&gt;
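
&lt;p&gt;A minimal sketch of that resource (the subnet and security-group references are assumed names, not the exact ones from the script):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Illustrative sketch of the MSK cluster resource
resource "aws_msk_cluster" "this" {
  cluster_name           = "demo-msk-cluster"
  kafka_version          = "3.6.0"
  number_of_broker_nodes = 3

  broker_node_group_info {
    instance_type   = "kafka.t3.small"
    client_subnets  = aws_subnet.public[*].id
    security_groups = [aws_security_group.msk.id]
  }

  # IAM auth lets the EC2 client authenticate with its instance role
  client_authentication {
    sasl {
      iam = true
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;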

&lt;h3&gt;
  
  
  5. The EC2 Client Instance
&lt;/h3&gt;

&lt;p&gt;This is where the magic happens! We launch a &lt;code&gt;t2.micro&lt;/code&gt; EC2 instance. The &lt;code&gt;user_data&lt;/code&gt; script is a powerful feature that runs automatically on the first boot. This script:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Installs Java.&lt;/li&gt;
&lt;li&gt;Downloads and extracts the correct version of Kafka.&lt;/li&gt;
&lt;li&gt;Downloads the AWS MSK IAM Auth library, which is required for IAM authentication.&lt;/li&gt;
&lt;li&gt;Creates the &lt;code&gt;client.properties&lt;/code&gt; file with the exact configuration needed for our Kafka tools to authenticate with MSK via IAM.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This means that as soon as the instance is ready, it's already fully configured to act as a Kafka client!&lt;/p&gt;
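
&lt;p&gt;Conceptually, the &lt;code&gt;user_data&lt;/code&gt; script boils down to something like this (a simplified sketch; the exact package names, versions, and download URLs in the real script may differ):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash
# Illustrative sketch of the first-boot setup
yum install -y java-11-amazon-corretto   # 1. Install Java
cd /home/ec2-user
wget https://archive.apache.org/dist/kafka/3.6.0/kafka_2.13-3.6.0.tgz
tar -xzf kafka_2.13-3.6.0.tgz            # 2. Download and extract Kafka
cd kafka_2.13-3.6.0/libs
# 3. Download the AWS MSK IAM Auth library (check the repo for the pinned version)
wget https://github.com/aws/aws-msk-iam-auth/releases/download/v1.1.9/aws-msk-iam-auth-1.1.9-all.jar
# 4. Write the client config that tells Kafka tools to authenticate via IAM
cat &amp;gt; /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties &amp;lt;&amp;lt;'EOF'
security.protocol=SASL_SSL
sasl.mechanism=AWS_MSK_IAM
sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
EOF
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;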




&lt;h2&gt;
  
  
  🚀 Deploying the Infrastructure
&lt;/h2&gt;

&lt;p&gt;Note: You can find the complete terraform script here: &lt;a href="https://github.com/yeshwanthlm/AWS-MSK-Crash-Course/blob/main/terraform/msk-cluster-with-vpc-ec2-client.tf" rel="noopener noreferrer"&gt;https://github.com/yeshwanthlm/AWS-MSK-Crash-Course/blob/main/terraform/msk-cluster-with-vpc-ec2-client.tf&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the &lt;code&gt;msk-cluster-with-vpc-ec2-client.tf&lt;/code&gt; file saved, running the deployment is as simple as three commands.&lt;/p&gt;

&lt;h4&gt;
  
  
  Initialize Terraform
&lt;/h4&gt;

&lt;p&gt;This command downloads the necessary AWS provider plugin.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Plan the Deployment
&lt;/h4&gt;

&lt;p&gt;This is a dry run. Terraform shows you exactly what resources it will create, change, or destroy.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Apply the Configuration
&lt;/h4&gt;

&lt;p&gt;This command executes the plan and builds everything in your AWS account. Type yes when prompted.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Grab a coffee! ☕ The MSK cluster creation is the longest step and will take around &lt;strong&gt;20-30 minutes&lt;/strong&gt;. Once finished, Terraform will display the outputs, including the public IP of your EC2 instance.&lt;/p&gt;

&lt;h2&gt;
  
  
  ✅ Connecting and Testing Your Cluster
&lt;/h2&gt;

&lt;p&gt;Once &lt;code&gt;terraform apply&lt;/code&gt; is complete, let's verify everything works.&lt;/p&gt;

&lt;h4&gt;
  
  
  Get the EC2 Public IP
&lt;/h4&gt;

&lt;p&gt;Find the public IP from the Terraform output. You can also run &lt;code&gt;terraform output ec2_public_ip&lt;/code&gt;.&lt;/p&gt;

&lt;h4&gt;
  
  
  SSH into the EC2 Instance
&lt;/h4&gt;

&lt;p&gt;The private key msk-client-key.pem was saved in your project directory.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Make sure to set correct permissions if needed&lt;/span&gt;
&lt;span class="nb"&gt;chmod &lt;/span&gt;400 msk-client-key.pem

ssh &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s2"&gt;"msk-client-key.pem"&lt;/span&gt; ec2-user@&amp;lt;YOUR_EC2_PUBLIC_IP&amp;gt; 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Get Your Bootstrap Brokers String
&lt;/h4&gt;

&lt;p&gt;In your local terminal (not the SSH session), get the connection string from the Terraform outputs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform output bootstrap_brokers
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy this long string. You'll need it for the next steps.&lt;/p&gt;

&lt;h4&gt;
  
  
  Create a Kafka Topic
&lt;/h4&gt;

&lt;p&gt;Inside your EC2 SSH session, navigate to the Kafka bin directory and create a topic.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bin/kafka-topics.sh &lt;span class="nt"&gt;--create&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--bootstrap-server&lt;/span&gt; &amp;lt;bootstrapServerString&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--command-config&lt;/span&gt; /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--replication-factor&lt;/span&gt; 3 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--partitions&lt;/span&gt; 1 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--topic&lt;/span&gt; my-first-topic
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Start a Producer
&lt;/h4&gt;

&lt;p&gt;In the same terminal, start the console producer. This will give you a &amp;gt; prompt.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bin/kafka-console-producer.sh &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--broker-list&lt;/span&gt; &amp;lt;bootstrapServerString&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--producer&lt;/span&gt;.config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--topic&lt;/span&gt; my-first-topic
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Type a message like &lt;code&gt;Hello from Terraform!&lt;/code&gt; and press Enter.&lt;/p&gt;

&lt;h4&gt;
  
  
  Start a Consumer (in a new terminal)
&lt;/h4&gt;

&lt;p&gt;Open a &lt;strong&gt;second terminal window&lt;/strong&gt; and SSH into your EC2 instance again. Navigate to the same bin directory and run the consumer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bin/kafka-console-consumer.sh &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--bootstrap-server&lt;/span&gt; &amp;lt;bootstrapServerString&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--consumer&lt;/span&gt;.config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--topic&lt;/span&gt; my-first-topic &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--from-beginning&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should instantly see Hello from Terraform! appear in your consumer window. Success! 🎉&lt;/p&gt;

&lt;h2&gt;
  
  
  🧹 Cleaning Up
&lt;/h2&gt;

&lt;p&gt;Don't forget to tear down your infrastructure to avoid ongoing AWS charges! The beauty of Terraform is that this is a single, simple command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;terraform destroy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Type yes when prompted, and Terraform will neatly remove all the resources it created.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;You've successfully automated a complete AWS MSK environment using Terraform. With Infrastructure as Code, you now have a repeatable, version-controlled, and reliable way to manage your Kafka clusters on AWS. This is the foundation for building powerful, event-driven applications at scale.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>terraform</category>
      <category>kafka</category>
      <category>iac</category>
    </item>
    <item>
      <title>A Beginner's Guide to AWS MSK: From Cluster Setup to Your First Message</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Mon, 06 Oct 2025 09:48:49 +0000</pubDate>
      <link>https://forem.com/aws-builders/a-beginners-guide-to-aws-msk-from-cluster-setup-to-your-first-message-3j05</link>
      <guid>https://forem.com/aws-builders/a-beginners-guide-to-aws-msk-from-cluster-setup-to-your-first-message-3j05</guid>
      <description>&lt;h1&gt;
  
  
  🚀 Getting Started with AWS MSK: Your First Kafka Cluster
&lt;/h1&gt;

&lt;p&gt;Ever wondered how massive, data-driven apps handle real-time event streams for things like live analytics, log aggregation, or IoT data? A key technology behind this is &lt;strong&gt;Apache Kafka&lt;/strong&gt;, a powerful open-source distributed event streaming platform.&lt;/p&gt;

&lt;p&gt;However, setting up and managing Kafka on your own can be a complex and time-consuming task. This is where &lt;strong&gt;AWS Managed Streaming for Apache Kafka (MSK)&lt;/strong&gt; comes in.&lt;/p&gt;

&lt;p&gt;In this guide, we'll cover:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What AWS MSK is and why it's useful.&lt;/li&gt;
&lt;li&gt;The different cluster types available.&lt;/li&gt;
&lt;li&gt;A full, hands-on tutorial to create your own MSK cluster and send your first messages from an EC2 instance.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's dive in!&lt;/p&gt;




&lt;h2&gt;
  
  
  🤔 What is AWS MSK?
&lt;/h2&gt;

&lt;p&gt;Think of Apache Kafka as a high-speed, central post office for your application's data. 📮 Applications can send messages (&lt;strong&gt;produce&lt;/strong&gt;) to different mailboxes (&lt;strong&gt;topics&lt;/strong&gt;), and other applications can pick them up (&lt;strong&gt;consume&lt;/strong&gt;) when they're ready.&lt;/p&gt;

&lt;p&gt;AWS MSK is a &lt;strong&gt;fully managed service&lt;/strong&gt; that runs this post office for you. It handles the heavy lifting so you don't have to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Provisioning Servers:&lt;/strong&gt; No need to pick, set up, or configure EC2 instances.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Kafka Software Management:&lt;/strong&gt; AWS handles the installation, patching, and upgrades of Kafka.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;High Availability:&lt;/strong&gt; MSK automatically distributes your cluster across multiple data centers (Availability Zones) to ensure it's resilient to failure.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In short, you get the full power of Apache Kafka without the operational overhead.&lt;/p&gt;




&lt;h2&gt;
  
  
  ✨ Why Do You Need AWS MSK?
&lt;/h2&gt;

&lt;p&gt;So, why choose MSK over managing Kafka yourself?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;✅ Simplified Operations:&lt;/strong&gt; Spend your time building applications, not managing infrastructure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;🌐 Highly Available &amp;amp; Scalable:&lt;/strong&gt; MSK is built for resilience. You can easily scale your cluster's compute and storage with a few clicks and no downtime.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;🔒 Secure by Default:&lt;/strong&gt; Integrates seamlessly with AWS services like IAM for authentication, VPC for network isolation, and KMS for encrypting your data at rest and in transit.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;💯 Fully Compatible:&lt;/strong&gt; It's 100% compatible with open-source Apache Kafka. You can migrate existing applications, tools, and plugins without changing your code.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Cluster Types: Provisioned vs. Serverless
&lt;/h2&gt;

&lt;p&gt;When creating a cluster, MSK gives you two options:&lt;/p&gt;

&lt;h3&gt;
  
  
  Provisioned Clusters
&lt;/h3&gt;

&lt;p&gt;This is the traditional model. Think of it like &lt;strong&gt;leasing a fleet of trucks&lt;/strong&gt;. 🚚 You choose the size and number of trucks (broker types and count), and you have full control over the configuration.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Best for:&lt;/strong&gt; Predictable, high-volume workloads where you want fine-grained control.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;You pay for:&lt;/strong&gt; The resources you provision, 24/7.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Serverless Clusters
&lt;/h3&gt;

&lt;p&gt;This is a newer, more flexible option. Think of it like a &lt;strong&gt;pay-per-package delivery service&lt;/strong&gt;. 📦 You don't manage any trucks; you just send your data, and the service automatically scales to handle the load.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Best for:&lt;/strong&gt; New apps, or workloads with variable or unpredictable traffic.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;You pay for:&lt;/strong&gt; The data you stream and retain (throughput and storage).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For this tutorial, we'll use a &lt;strong&gt;Provisioned&lt;/strong&gt; cluster to see all the underlying configurations.&lt;/p&gt;




&lt;h2&gt;
  
  
  🛠️ Hands-On Demo: Creating Your Cluster and Sending Messages
&lt;/h2&gt;

&lt;p&gt;Time to build! We'll create an MSK cluster and an EC2 instance to communicate with it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Part 1: Create the MSK Cluster
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt; In the AWS Console, navigate to &lt;strong&gt;MSK&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt; Click &lt;strong&gt;Create cluster&lt;/strong&gt; and choose the &lt;strong&gt;Custom create&lt;/strong&gt; method.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Cluster settings:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cluster name:&lt;/strong&gt; &lt;code&gt;my-demo-msk-cluster&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cluster type:&lt;/strong&gt; &lt;strong&gt;Provisioned&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Apache Kafka version:&lt;/strong&gt; Use the recommended default.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Networking:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Select your desired &lt;strong&gt;VPC&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Choose &lt;strong&gt;at least two Availability Zones&lt;/strong&gt; and select a subnet in each. For a simple demo, public subnets are fine, but use private subnets for production.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Security:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Under &lt;strong&gt;Access control methods&lt;/strong&gt;, check the box for &lt;strong&gt;IAM role-based authentication&lt;/strong&gt;. This is the most secure and straightforward way to connect from other AWS services.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Review and Create&lt;/strong&gt;. The cluster will take &lt;strong&gt;20-30 minutes&lt;/strong&gt; to become &lt;code&gt;Active&lt;/code&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Part 2: Set Up the EC2 Client &amp;amp; Security Groups
&lt;/h3&gt;

&lt;p&gt;While the cluster is creating, let's set up our client machine.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Launch an EC2 Instance:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to the EC2 service and click &lt;strong&gt;Launch instance&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Name:&lt;/strong&gt; &lt;code&gt;MSK-Client-EC2&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AMI:&lt;/strong&gt; &lt;strong&gt;Amazon Linux 2&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Instance Type:&lt;/strong&gt; &lt;code&gt;t2.micro&lt;/code&gt; (Free Tier eligible)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Network:&lt;/strong&gt; &lt;strong&gt;Crucially, select the same VPC and one of the subnets you used for your MSK cluster.&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;IAM Role:&lt;/strong&gt; Attach an IAM role to the instance with a policy that allows it to connect to MSK. A simple policy for this demo would be:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nl"&gt;"Version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2012-10-17"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nl"&gt;"Statement"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:Connect"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:AlterCluster"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:DescribeCluster"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:*Topic*"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:WriteData"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:ReadData"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Effect"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allow"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Action"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:AlterGroup"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"kafka-cluster:DescribeGroup"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Resource"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Configure Security Groups (The Important Part!):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;MSK Cluster Security Group:&lt;/strong&gt; Find the security group attached to your MSK cluster. Add an &lt;strong&gt;inbound rule&lt;/strong&gt; to allow &lt;strong&gt;All traffic&lt;/strong&gt; from the security group of your &lt;code&gt;MSK-Client-EC2&lt;/code&gt; instance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;EC2 Instance Security Group:&lt;/strong&gt; Find the security group for your EC2 instance. Add an &lt;strong&gt;inbound rule&lt;/strong&gt; to allow &lt;strong&gt;All traffic&lt;/strong&gt; from the MSK cluster's security group. Also, make sure you have a rule to allow SSH from your IP.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This two-way rule allows the EC2 instance and the MSK brokers to communicate freely within the VPC.&lt;/p&gt;
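&lt;p&gt;If you prefer to script the rule instead of clicking through the console, the "all traffic from the other security group" rule maps to a specific request shape in the EC2 API. A hedged boto3-style sketch (the security-group IDs are hypothetical placeholders):&lt;/p&gt;

```python
def all_traffic_from_sg(source_sg_id):
    """Build the IpPermissions payload for an 'all traffic from this
    security group' inbound rule. IpProtocol '-1' means all protocols,
    and referencing a source group instead of a CIDR keeps the rule
    scoped to traffic originating from that group's members."""
    return [{
        "IpProtocol": "-1",
        "UserIdGroupPairs": [{"GroupId": source_sg_id}],
    }]

# Hypothetical IDs. With boto3 you would apply this in both directions:
#   ec2.authorize_security_group_ingress(GroupId=msk_sg_id,
#                                        IpPermissions=all_traffic_from_sg(ec2_sg_id))
#   ec2.authorize_security_group_ingress(GroupId=ec2_sg_id,
#                                        IpPermissions=all_traffic_from_sg(msk_sg_id))
rule = all_traffic_from_sg("sg-0123456789abcdef0")
print(rule[0]["IpProtocol"])  # -1
```

&lt;p&gt;Group-to-group references like this are generally preferable to broad CIDR ranges because the rule keeps working even as broker and instance IPs change.&lt;/p&gt;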

&lt;h3&gt;
  
  
  Part 3: Connect and Send Messages
&lt;/h3&gt;

&lt;p&gt;Once your MSK cluster is &lt;strong&gt;Active&lt;/strong&gt; and your EC2 instance is running, SSH into the instance.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Install Tools:&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;    &lt;span class="c"&gt;# Update and install Java&lt;/span&gt;
    &lt;span class="nb"&gt;sudo &lt;/span&gt;yum update &lt;span class="nt"&gt;-y&lt;/span&gt;
    &lt;span class="nb"&gt;sudo &lt;/span&gt;yum &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="nb"&gt;install &lt;/span&gt;java-11
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Download and extract Apache Kafka tools&lt;/span&gt;
wget https://archive.apache.org/dist/kafka/3.6.0/kafka_2.13-3.6.0.tgz

&lt;span class="nb"&gt;tar&lt;/span&gt; &lt;span class="nt"&gt;-xzf&lt;/span&gt; kafka_2.13-3.6.0.tgz

&lt;span class="nb"&gt;cd &lt;/span&gt;kafka_2.13-3.6.0/libs

wget https://github.com/aws/aws-msk-iam-auth/releases/download/v1.1.1/aws-msk-iam-auth-1.1.1-all.jar
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Create Client Properties File:&lt;/strong&gt;&lt;br&gt;
We need to tell the Kafka tools to use IAM for authentication.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Move to the Kafka root so the later bin/ commands resolve&lt;/span&gt;
&lt;span class="nb"&gt;cd&lt;/span&gt; /home/ec2-user/kafka_2.13-3.6.0

&lt;span class="c"&gt;# Create the config file where the later commands expect it&lt;/span&gt;
&lt;span class="nb"&gt;cat&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="no"&gt;EOF&lt;/span&gt;&lt;span class="sh"&gt; &amp;gt; bin/client.properties
security.protocol=SASL_SSL
sasl.mechanism=AWS_MSK_IAM
sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
&lt;/span&gt;&lt;span class="no"&gt;EOF
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Get Bootstrap Servers:&lt;/strong&gt;&lt;br&gt;
In the MSK console, click your cluster, then &lt;strong&gt;View client information&lt;/strong&gt;. Copy the &lt;strong&gt;Bootstrap servers&lt;/strong&gt; endpoint for &lt;strong&gt;IAM&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Create a Topic:&lt;/strong&gt;&lt;br&gt;
Let's create a "mailbox" called &lt;code&gt;my-first-topic&lt;/code&gt;. Replace &lt;code&gt;&amp;lt;bootstrapServerString&amp;gt;&lt;/code&gt; in the commands below with the endpoint you just copied.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bin/kafka-topics.sh &lt;span class="nt"&gt;--create&lt;/span&gt; &lt;span class="nt"&gt;--bootstrap-server&lt;/span&gt; &amp;lt;bootstrapServerString&amp;gt; &lt;span class="nt"&gt;--command-config&lt;/span&gt; /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties &lt;span class="nt"&gt;--replication-factor&lt;/span&gt; 3 &lt;span class="nt"&gt;--partitions&lt;/span&gt; 1 &lt;span class="nt"&gt;--topic&lt;/span&gt; my-first-topic
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Start a Producer:&lt;/strong&gt;&lt;br&gt;
This command gives you a prompt where you can type messages to send.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bin/kafka-console-producer.sh &lt;span class="nt"&gt;--bootstrap-server&lt;/span&gt; &amp;lt;bootstrapServerString&amp;gt; &lt;span class="nt"&gt;--producer&lt;/span&gt;.config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties &lt;span class="nt"&gt;--topic&lt;/span&gt; my-first-topic

&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;Type &lt;code&gt;Hello MSK!&lt;/code&gt; and hit Enter. Type a few more messages.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Start a Consumer (in a new terminal):&lt;/strong&gt;&lt;br&gt;
Open a &lt;strong&gt;second SSH session&lt;/strong&gt; to your EC2 instance, navigate to the same &lt;code&gt;bin&lt;/code&gt; directory, and run:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bin/kafka-console-consumer.sh &lt;span class="nt"&gt;--bootstrap-server&lt;/span&gt; &amp;lt;bootstrapServerString&amp;gt; &lt;span class="nt"&gt;--consumer&lt;/span&gt;.config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties &lt;span class="nt"&gt;--topic&lt;/span&gt; my-first-topic &lt;span class="nt"&gt;--from-beginning&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You should see the messages you typed in the producer terminal appear instantly! 🎉&lt;/p&gt;
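&lt;p&gt;As an aside, the bootstrap-servers string you copied in step 3 is a comma-separated list of &lt;code&gt;host:port&lt;/code&gt; endpoints (MSK's IAM listener uses port 9098). If you ever need to feed it to a client library rather than the CLI tools, it splits cleanly; a quick sketch with hypothetical broker hostnames:&lt;/p&gt;

```python
def parse_bootstrap_servers(bootstrap):
    """Split a comma-separated bootstrap string into (host, port) pairs."""
    pairs = []
    for endpoint in bootstrap.split(","):
        # rpartition splits on the last ':' so hostnames with dots are safe
        host, _, port = endpoint.strip().rpartition(":")
        pairs.append((host, int(port)))
    return pairs

# Hypothetical broker names; the IAM listener is on port 9098.
servers = "b-1.demo.kafka.us-east-1.amazonaws.com:9098,b-2.demo.kafka.us-east-1.amazonaws.com:9098"
print(parse_bootstrap_servers(servers)[0][1])  # 9098
```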




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Congratulations! You've successfully deployed a highly available Apache Kafka cluster using AWS MSK, configured secure access from an EC2 instance, and sent your first real-time messages.&lt;/p&gt;

&lt;p&gt;By using MSK, you get to leverage the power of Kafka for your event-driven applications without the headache of managing the underlying infrastructure.&lt;/p&gt;

&lt;p&gt;Thanks for reading! Let me know in the comments if you have any questions.&lt;/p&gt;

&lt;p&gt;#aws #kafka #cloud #devops #tutorial&lt;/p&gt;

</description>
      <category>aws</category>
      <category>kafka</category>
      <category>msk</category>
      <category>devops</category>
    </item>
    <item>
      <title>Automate AWS RDS &amp; Aurora Recommendations with Lambda and EventBridge</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Mon, 06 Oct 2025 05:49:58 +0000</pubDate>
      <link>https://forem.com/aws-builders/automate-aws-rds-aurora-recommendations-with-lambda-and-eventbridge-ng3</link>
      <guid>https://forem.com/aws-builders/automate-aws-rds-aurora-recommendations-with-lambda-and-eventbridge-ng3</guid>
      <description>&lt;h1&gt;
  
  
  Never Miss an AWS Database Tune-Up Again: Your Automated Alert System
&lt;/h1&gt;

&lt;p&gt;As developers, we're constantly juggling a million things. The last thing we want to worry about is whether our &lt;strong&gt;Amazon RDS and Aurora databases&lt;/strong&gt; are running at their peak. AWS provides a ton of great recommendations to boost performance, security, and reliability, but who has the time to check for them manually? 😥&lt;/p&gt;

&lt;p&gt;What if you could get these crucial recommendations delivered straight to your inbox, automatically? In this post, we'll walk you through a simple yet powerful solution to automate this process using &lt;strong&gt;AWS Lambda, Amazon EventBridge, and Amazon Simple Email Service (SES)&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Let's dive in! 🚀&lt;/p&gt;




&lt;h2&gt;
  
  
  The Big Picture: How It Works
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0i2bq2j1hhqmzf3ych3y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0i2bq2j1hhqmzf3ych3y.png" alt=" " width="779" height="318"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We'll be creating a serverless workflow that does the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Scheduled Trigger:&lt;/strong&gt; An &lt;strong&gt;Amazon EventBridge&lt;/strong&gt; rule will kick things off on a schedule you define (e.g., once a day, once a week).&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Recommendation Fetching:&lt;/strong&gt; The EventBridge rule will trigger an &lt;strong&gt;AWS Lambda function&lt;/strong&gt;. This function's job is to go and fetch all the latest recommendations for your RDS and Aurora instances.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Email Notification:&lt;/strong&gt; Once the Lambda function has the recommendations, it will format them into a neat HTML email and send it to you and your team using &lt;strong&gt;Amazon SES&lt;/strong&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This "set it and forget it" system ensures you're always in the loop about potential optimizations for your databases.&lt;/p&gt;




&lt;h2&gt;
  
  
  Let's Get Building! The Step-by-Step Guide
&lt;/h2&gt;

&lt;p&gt;Here’s a high-level overview of the steps we’ll take to bring this solution to life:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Tag, You're It! Tag Your Database Instances
&lt;/h3&gt;

&lt;p&gt;First things first, you'll need a way to identify the database instances you want to monitor. The easiest way to do this is by applying a &lt;strong&gt;tag&lt;/strong&gt;. For example, you could create a tag with the key &lt;code&gt;send-recommendations&lt;/code&gt; and the value &lt;code&gt;true&lt;/code&gt; for all the databases you want to receive notifications for.&lt;/p&gt;
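&lt;p&gt;Inside the Lambda function, checking that tag is a small filter over the instance metadata. A minimal sketch, assuming the instance dicts follow the shape returned by RDS &lt;code&gt;describe_db_instances&lt;/code&gt; (each instance carries a &lt;code&gt;TagList&lt;/code&gt; of key/value pairs):&lt;/p&gt;

```python
def wants_recommendations(instance):
    """Return True if the instance is tagged send-recommendations=true.

    `instance` mirrors an entry from RDS describe_db_instances, whose
    'TagList' field is a list of {'Key': ..., 'Value': ...} dicts.
    """
    return any(
        t.get("Key") == "send-recommendations" and t.get("Value") == "true"
        for t in instance.get("TagList", [])
    )

# Sample data with hypothetical instance identifiers:
instances = [
    {"DBInstanceIdentifier": "prod-db",
     "TagList": [{"Key": "send-recommendations", "Value": "true"}]},
    {"DBInstanceIdentifier": "scratch-db", "TagList": []},
]
monitored = [i["DBInstanceIdentifier"] for i in instances if wants_recommendations(i)]
print(monitored)  # ['prod-db']
```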

&lt;h3&gt;
  
  
  2. Get Your Email Ready with Amazon SES
&lt;/h3&gt;

&lt;p&gt;To send emails, you'll need to set up &lt;strong&gt;Amazon SES&lt;/strong&gt;. This involves verifying your email address or domain to ensure you're authorized to send emails from it.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Permissions, Permissions, Permissions: The IAM Role
&lt;/h3&gt;

&lt;p&gt;Our Lambda function needs permission to access other AWS services (like RDS and SES). We'll create an &lt;strong&gt;IAM (Identity and Access Management) role&lt;/strong&gt; with a policy that grants the necessary permissions. This is a crucial security step to ensure our function only has access to what it needs.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. The Brains of the Operation: The Lambda Function
&lt;/h3&gt;

&lt;p&gt;This is where the magic happens! We'll write a Lambda function (you can use your favorite language, like Python or Node.js) that will:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fetch recommendations for your RDS and Aurora instances (using the tag we created earlier to filter them).&lt;/li&gt;
&lt;li&gt;Format the recommendations into a user-friendly HTML email.&lt;/li&gt;
&lt;li&gt;Send the email using Amazon SES.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  5. Set the Clock with EventBridge
&lt;/h3&gt;

&lt;p&gt;Finally, we'll create an &lt;strong&gt;Amazon EventBridge rule&lt;/strong&gt; that runs on a schedule. This rule will be configured to trigger our Lambda function, kicking off the entire process automatically.&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;Get the Code&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;You can find the complete source code, including the Lambda function and IAM policy, on GitHub. Feel free to clone it and get started right away!&lt;/p&gt;

&lt;p&gt;➡️ &lt;strong&gt;GitHub Repo:&lt;/strong&gt; &lt;a href="https://github.com/yeshwanthlm/RDS-Automation/tree/main" rel="noopener noreferrer"&gt;https://github.com/yeshwanthlm/RDS-Automation/tree/main&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Why You Should Do This Right Now
&lt;/h2&gt;

&lt;p&gt;By automating your RDS and Aurora recommendations, you'll:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Save Time and Effort:&lt;/strong&gt; No more manually checking for recommendations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Stay Proactive:&lt;/strong&gt; Address potential issues before they become major problems.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Boost Performance and Security:&lt;/strong&gt; Keep your databases running smoothly and securely.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Never Miss a Beat:&lt;/strong&gt; Ensure critical recommendations are never overlooked.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The best part? This solution is incredibly flexible. You can customize the filtering logic to match your organization's specific needs and priorities.&lt;/p&gt;

&lt;p&gt;So what are you waiting for? Take an hour to set this up today and thank yourself later. Happy coding! 🎉&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>python</category>
      <category>automation</category>
    </item>
    <item>
      <title>How to Convert AWS Clicks into CDK/CloudFormation (The EASY Way)</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Sun, 05 Oct 2025 13:28:51 +0000</pubDate>
      <link>https://forem.com/aws-builders/how-to-convert-aws-clicks-into-cdkcloudformation-the-easy-way-14lg</link>
      <guid>https://forem.com/aws-builders/how-to-convert-aws-clicks-into-cdkcloudformation-the-easy-way-14lg</guid>
      <description>&lt;h2&gt;
  
  
  The Manual Grind is Over
&lt;/h2&gt;

&lt;p&gt;Let's be real: setting up infrastructure in the AWS Management Console is a rite of passage. We've all been there—clicking through menus, configuring settings, and trying to remember every single step we took. It works for a one-off task, but what happens when you need to do it again? And again? It's slow, error-prone, and doesn't scale.&lt;/p&gt;

&lt;p&gt;For years, the answer has been Infrastructure as Code (IaC) using tools like AWS CloudFormation or the AWS Cloud Development Kit (CDK). But this comes with a steep learning curve. What if you could get the best of both worlds? What if you could perform your setup once in the console and have AI generate the code for you?&lt;/p&gt;

&lt;p&gt;That's exactly what &lt;strong&gt;Console-to-Code&lt;/strong&gt;, a powerful feature in Amazon Q Developer, does. It watches your actions in the console and magically transforms them into clean, ready-to-use IaC.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Console-to-Code is a Game-Changer 🚀
&lt;/h2&gt;

&lt;p&gt;This isn't just another tool; it's a fundamental shift in how we can approach cloud automation.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Bridge the Skills Gap:&lt;/strong&gt; New to IaC? No problem. Use the visual console you're comfortable with and get high-quality code as your output. It's an incredible way to learn by doing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Massive Time Saver:&lt;/strong&gt; Instead of spending hours writing YAML or TypeScript from scratch, you can generate a solid baseline in minutes. This dramatically speeds up prototyping and deployment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enforce Consistency and Best Practices:&lt;/strong&gt; By converting manual setups into code, you create a repeatable, reliable process. This ensures every environment you spin up is identical, eliminating the "it worked on my machine" problem for infrastructure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-Language Support:&lt;/strong&gt; Whether you're a fan of Python, Java, TypeScript with the AWS CDK, or prefer the simplicity of CloudFormation (JSON/YAML), Console-to-Code has you covered.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  From Clicks to Code: Your 3-Step Guide
&lt;/h2&gt;

&lt;p&gt;Ready to try it out? The process is incredibly straightforward. Here’s how to turn your console actions into reusable code.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Hit Record
&lt;/h3&gt;

&lt;p&gt;First, you need to tell Amazon Q to start watching.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Log in to your &lt;strong&gt;AWS Management Console&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt; Navigate to a supported service like &lt;strong&gt;VPC&lt;/strong&gt;, &lt;strong&gt;RDS&lt;/strong&gt;, or &lt;strong&gt;EC2&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt; On the far right edge of your browser window, you'll see the &lt;strong&gt;Console-to-Code icon&lt;/strong&gt;. Click it.&lt;/li&gt;
&lt;li&gt; Simply click &lt;strong&gt;"Start recording."&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That's it! The tool is now actively recording every infrastructure-related action you take.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Build Your Infrastructure
&lt;/h3&gt;

&lt;p&gt;Now for the fun part. Go through the console and perform the tasks you want to automate.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Launch an EC2 instance.&lt;/li&gt;
&lt;li&gt;Create a new S3 bucket and configure its policies.&lt;/li&gt;
&lt;li&gt;Set up a VPC with subnets and route tables.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can even move between different services during a single recording session. The Console-to-Code panel will keep track of everything you do.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: Generate Your Code
&lt;/h3&gt;

&lt;p&gt;Once you've completed your setup, it's time to get your code.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; In the Console-to-Code panel, you'll see a list of all the actions you performed.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Review and select&lt;/strong&gt; the specific actions you want to include in your IaC template.&lt;/li&gt;
&lt;li&gt; At the bottom of the panel, choose your desired output format from the dropdown menu (e.g., &lt;strong&gt;AWS CDK - Python&lt;/strong&gt; or &lt;strong&gt;CloudFormation - YAML&lt;/strong&gt;).&lt;/li&gt;
&lt;li&gt; Click the &lt;strong&gt;"Generate chosen language"&lt;/strong&gt; button.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Instantly, the panel will display the generated code. You'll not only get the IaC template but also the equivalent CLI commands for your reference.&lt;/p&gt;




&lt;h2&gt;
  
  
  Final Thoughts &amp;amp; Best Practices
&lt;/h2&gt;

&lt;p&gt;Console-to-Code is an incredibly powerful feature for accelerating your cloud journey. To get the most out of it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Plan Your Steps:&lt;/strong&gt; Have a clear idea of what you want to build &lt;em&gt;before&lt;/em&gt; you start recording to keep your generated code clean and focused.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Review the Output:&lt;/strong&gt; The generated code is a fantastic starting point, but always review it to understand what's happening and see if you can make any custom tweaks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use it as a Learning Tool:&lt;/strong&gt; If you're new to IaC, generate code for simple tasks first. Study the output to understand how resources are defined and linked. It’s one of the most practical ways to learn.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The era of choosing between the easy-to-use console and powerful automation is over. With Console-to-Code, you can finally have both.&lt;/p&gt;

&lt;p&gt;Happy building!&lt;/p&gt;




&lt;h2&gt;
  
  
  Connect with the Author
&lt;/h2&gt;

&lt;p&gt;Thanks for reading! I'm &lt;strong&gt;Yeshwanth L M&lt;/strong&gt;, an AWS Community Builder passionate about making cloud and DevOps accessible to everyone. If you found this article helpful, let's connect!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;YouTube:&lt;/strong&gt; &lt;a href="https://www.youtube.com/@TechWithYeshwanth/videos" rel="noopener noreferrer"&gt;Subscribe to TechWithYeshwanth&lt;/a&gt; for more tutorials.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Community:&lt;/strong&gt; &lt;a href="https://www.youtube.com/channel/UCwhERUcuzUCwr8x8mQ8zrcw/join" rel="noopener noreferrer"&gt;Join the Channel Membership&lt;/a&gt; for exclusive perks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GitHub:&lt;/strong&gt; &lt;a href="https://github.com/yeshwanthlm" rel="noopener noreferrer"&gt;Follow my projects&lt;/a&gt; and contribute.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Blog:&lt;/strong&gt; &lt;a href="https://dev.to/yeshwanthlm/"&gt;Read more articles on dev.to&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Instagram:&lt;/strong&gt; &lt;a href="https://www.instagram.com/techwithyeshwanth/" rel="noopener noreferrer"&gt;Follow @techwithyeshwanth&lt;/a&gt; for daily tech content.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;LinkedIn:&lt;/strong&gt; &lt;a href="https://www.linkedin.com/company/techwithyeshwanth/" rel="noopener noreferrer"&gt;Connect on LinkedIn&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Book a 1:1 Call:&lt;/strong&gt; &lt;a href="https://topmate.io/techwithyeshwanth" rel="noopener noreferrer"&gt;Schedule a mentoring session with me on TopMate&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>cloud</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Conquering the AWS Certified Solutions Architect Associate Exam: Your Essential Study Guide and Tips from A Monk in Cloud</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Fri, 06 Jun 2025 16:42:05 +0000</pubDate>
      <link>https://forem.com/aws-builders/conquering-the-aws-certified-solutions-architect-associate-exam-your-essential-study-guide-and-3b9d</link>
      <guid>https://forem.com/aws-builders/conquering-the-aws-certified-solutions-architect-associate-exam-your-essential-study-guide-and-3b9d</guid>
      <description>&lt;p&gt;Hey Cloud Enthusiasts! Yeshwanth here, ready to guide you through the journey of preparing for one of the most coveted certifications in the cloud computing landscape: the AWS Certified Solutions Architect Associate (SAA-C03) exam. This certification is a powerful testament to your ability to design and implement well-architected infrastructures within the AWS Cloud.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding the Role of an AWS Solutions Architect:&lt;/strong&gt;&lt;br&gt;
As a Solutions Architect, your core responsibility is to bridge customer requirements with effective, scalable, and secure AWS solutions. This means not only knowing which services to use but also understanding how to combine them to create infrastructure that adheres to key principles: efficiency, security, reliability, fault tolerance, and cost-effectiveness. These are the pillars upon which your SAA-C03 exam will be built.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Your Primary Study Resources:&lt;/strong&gt;&lt;br&gt;
When it comes to official study materials, consider the following your go-to sources:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;AWS Whitepapers: These provide in-depth technical details and best practices.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AWS FAQs: Often overlooked, but they contain crucial Q&amp;amp;A on service functionalities and limitations.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;AWS Documentation: The authoritative source for all AWS services.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Best &lt;a href="https://learn.cantrill.io/p/all-the-things-plus?affcode=212820_1wcobgaz" rel="noopener noreferrer"&gt;video tutorials&lt;/a&gt; by Adrian Cantrill.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Beyond reading, hands-on experience in building systems on AWS is incredibly beneficial. The exam includes many scenario-based questions, and practical knowledge will greatly enhance your understanding. For the most precise and up-to-date information on the exam structure and content, always consult the official SAA-C03 Exam Guide directly on the AWS Certification website. Give it a quick read to set your expectations!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Starting Your SAA-C03 Journey: A Recommended First Step&lt;/strong&gt;&lt;br&gt;
If you're relatively new to AWS, I highly recommend kicking off your studies with the FREE AWS Certified Cloud Practitioner Essentials digital course. This interactive course covers fundamental AWS Cloud concepts, services, security, architecture, pricing, and support plans. It's an excellent foundation before you delve into the more complex topics of the SAA-C03.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Navigating Study Materials: Staying Up-to-Date&lt;/strong&gt;&lt;br&gt;
The internet is awash with resources claiming to be the "best" for the SAA-C03 exam. However, it's critical to be discerning. Some resources might be outdated and won't cover the latest services or features introduced in the SAA-C03 exam version.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;My golden rule: Always check the official AWS Certification website for the most current information.&lt;/strong&gt; The official AWS Certified Solutions Architect Associate SAA-C03 exam page is your single source of truth. Here, you'll find the official exam guide, sample questions, and the link to schedule your exam.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Services Specific to the SAA-C03 Exam Version&lt;/strong&gt;&lt;br&gt;
For this exam version, be aware of services like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS Global Accelerator&lt;/li&gt;
&lt;li&gt;Elastic Fabric Adapter (EFA)&lt;/li&gt;
&lt;li&gt;Elastic Network Adapter (ENA)&lt;/li&gt;
&lt;li&gt;AWS ParallelCluster&lt;/li&gt;
&lt;li&gt;Amazon FSx (for Windows File Server and Lustre)&lt;/li&gt;
&lt;li&gt;AWS DataSync&lt;/li&gt;
&lt;li&gt;AWS Directory Service&lt;/li&gt;
&lt;li&gt;High Performance Computing concepts&lt;/li&gt;
&lt;li&gt;Aurora Serverless&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Core AWS Services to Master for the SAA-C03 Exam&lt;/strong&gt;&lt;br&gt;
While the list above highlights specific updates, a deep understanding of the following core AWS services is non-negotiable for the SAA-C03:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;EC2 (Elastic Compute Cloud): This is foundational. Understand instance types, AMIs, storage options (EBS), and networking.&lt;/li&gt;
&lt;li&gt;Lambda: The heart of serverless computing. Know its integrations with other AWS services for building complete serverless applications.&lt;/li&gt;
&lt;li&gt;Elastic Load Balancer (ELB): Crucial for high availability. Study the different types (Application, Network, Gateway) and their features.&lt;/li&gt;
&lt;li&gt;Auto Scaling: Understand what services can be auto-scaled, the triggers for scaling, and how it manages instance counts.&lt;/li&gt;
&lt;li&gt;Elastic Block Store (EBS): Primary storage for EC2. Familiarize yourself with volume types, security, backup, and restore procedures.&lt;/li&gt;
&lt;li&gt;S3 / Glacier: Explore the various S3 storage classes, their use cases, and capabilities like static website hosting, access policies, and lifecycle management. S3 is a heavily tested service!&lt;/li&gt;
&lt;li&gt;Storage Gateway: Understand its purpose and when to use it versus direct S3 or EBS. Differentiate between DataSync and Storage Gateway.&lt;/li&gt;
&lt;li&gt;EFS (Elastic File System): Often compared with other storage solutions. Know when EFS is the right choice, considering cost and efficiency trade-offs.&lt;/li&gt;
&lt;li&gt;RDS / Aurora: Understand the differences between various RDS databases and what makes Aurora unique. Learn about parameter groups, option groups, and subnet groups.&lt;/li&gt;
&lt;li&gt;DynamoDB: A frequently tested NoSQL database. Compare it with RDS, ElastiCache, and Redshift. It's often paired with Lambda for serverless applications.&lt;/li&gt;
&lt;li&gt;ElastiCache: Focus on Redis and its functions. Identify scenarios where caching can improve performance, such as managing ELB session state or optimizing RDS.&lt;/li&gt;
&lt;li&gt;VPC / NACL / Security Groups: Master the components of a Virtual Private Cloud (subnets, route tables, internet gateways, NAT gateways, VPN gateways). Crucially, understand the distinct roles of Network Access Control Lists (NACLs) and Security Groups.&lt;/li&gt;
&lt;li&gt;Route 53: Study the different record types and routing policies. Be familiar with hosted zones and domains.&lt;/li&gt;
&lt;li&gt;IAM (Identity and Access Management): IAM Users, Groups, Policies, and Roles are fundamental. Understand how IAM integrates with other services for secure applications and be aware of best practices.&lt;/li&gt;
&lt;li&gt;CloudWatch: Learn about monitoring in AWS, metrics, CloudWatch Logs, CloudWatch Alarms, and custom metrics using the CloudWatch Agent.&lt;/li&gt;
&lt;li&gt;CloudTrail: Understand how CloudTrail works and the types of logs it stores, differentiating them from CloudWatch Logs.&lt;/li&gt;
&lt;li&gt;Kinesis: Have a high-level understanding of Kinesis Data Streams, including sharding, and how the different Kinesis services operate.&lt;/li&gt;
&lt;li&gt;CloudFront: Learn how CloudFront speeds up content delivery, its content sources, and supported SSL certificates.&lt;/li&gt;
&lt;li&gt;SQS (Simple Queue Service): Understand how SQS decouples systems, message management (standard, FIFO, dead-letter queues), and the differences between SQS, SNS, SES, and Amazon MQ.&lt;/li&gt;
&lt;li&gt;SNS (Simple Notification Service): Study its function, integrations, and supported notification recipients.&lt;/li&gt;
&lt;li&gt;SWF (Simple Workflow Service) / CloudFormation / OpsWorks: Understand the functions, capabilities, and typical use cases for each of these orchestration and automation services.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Crucial Comparison Scenarios&lt;/strong&gt;&lt;br&gt;
Based on my exam experience, pay special attention to the nuances and appropriate use cases when comparing these services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS DataSync vs. Storage Gateway&lt;/li&gt;
&lt;li&gt;FSx (considerations for cold and hot storage)&lt;/li&gt;
&lt;li&gt;Cross-Region Read Replicas vs. Multi-AZ RDS (focus on high-availability aspects)&lt;/li&gt;
&lt;li&gt;Amazon S3 Object Key vs. Object Metadata&lt;/li&gt;
&lt;li&gt;Direct Connect vs. Site-to-Site VPN&lt;/li&gt;
&lt;li&gt;AWS Config vs. AWS CloudTrail&lt;/li&gt;
&lt;li&gt;Security Group vs. NACL&lt;/li&gt;
&lt;li&gt;NAT Gateway vs. NAT Instance&lt;/li&gt;
&lt;li&gt;Geolocation routing policy vs. Geoproximity routing policy on Route 53&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Your Path to Success: Practice and Hands-On Experience&lt;/strong&gt;&lt;br&gt;
Beyond the official documentation, consider leveraging reputable study aids like practice exams. Aim for consistent high scores to ensure you're truly prepared for the exam's format and difficulty.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Most importantly, get hands-on!&lt;/strong&gt; Sign up for an AWS Free Tier account and perform lab exercises. Experiencing these services directly – spinning up EC2 instances, creating S3 buckets, configuring VPCs – will deeply embed the concepts and help you remember what each service is capable of. It's an invaluable part of the learning process.&lt;/p&gt;

&lt;p&gt;I wish you all the best on your AWS Certified Solutions Architect Associate exam journey! If you have any questions or want to share your own tips, drop them in the comments below.&lt;/p&gt;

&lt;p&gt;Happy Architecting!&lt;/p&gt;

&lt;p&gt;&lt;a href="//yeshwanthlm.in"&gt;Yeshwanth&lt;/a&gt; AKA &lt;a href="https://www.youtube.com/@amonkincloud/videos" rel="noopener noreferrer"&gt;A Monk in Cloud&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>architecture</category>
      <category>solutions</category>
      <category>exam</category>
    </item>
    <item>
      <title>Simplifying Multi-Region EC2 Management with AWS EC2 Instance Manager</title>
      <dc:creator>Yeshwanth L M</dc:creator>
      <pubDate>Thu, 22 May 2025 13:59:11 +0000</pubDate>
      <link>https://forem.com/aws-builders/simplifying-multi-region-ec2-management-with-aws-ec2-instance-manager-3h22</link>
      <guid>https://forem.com/aws-builders/simplifying-multi-region-ec2-management-with-aws-ec2-instance-manager-3h22</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Managing EC2 instances across multiple AWS regions can be a challenging task. As your cloud infrastructure grows, switching between regions in the AWS Console becomes time-consuming and inefficient. Today, I'm excited to introduce the AWS EC2 Instance Manager: a lightweight, browser-based tool I've developed to solve this exact problem.&lt;/p&gt;

&lt;p&gt;🔗 Try the app here:&lt;br&gt;
&lt;a href="https://gray-plant-037bead10.6.azurestaticapps.net/" rel="noopener noreferrer"&gt;https://gray-plant-037bead10.6.azurestaticapps.net/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📂 GitHub Repo:&lt;br&gt;
&lt;a href="https://github.com/yeshwanthlm/AWS-EC2-Instance-Manager" rel="noopener noreferrer"&gt;https://github.com/yeshwanthlm/AWS-EC2-Instance-Manager&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Challenge of Multi-Region Management
&lt;/h2&gt;

&lt;p&gt;If you're managing AWS infrastructure, you've likely encountered these pain points:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Constantly switching between AWS regions to check instance status&lt;/li&gt;
&lt;li&gt;Difficulty getting a consolidated view of all running instances&lt;/li&gt;
&lt;li&gt;Time wasted navigating through the AWS Console for simple operations&lt;/li&gt;
&lt;li&gt;Need for a quick way to stop or terminate instances across regions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These challenges inspired me to create a simple yet powerful solution that runs entirely in your browser.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introducing AWS EC2 Instance Manager
&lt;/h2&gt;

&lt;p&gt;The AWS EC2 Instance Manager is a static web application that provides a unified interface for managing EC2 instances across all AWS regions. What makes this tool special is its simplicity: there's no backend server, no complex setup, and no additional infrastructure required.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Cross-Region Visibility
&lt;/h4&gt;

&lt;p&gt;The application automatically discovers and queries all AWS regions, providing a single dashboard view of your EC2 instances. This eliminates the need to manually switch between regions in the AWS Console.&lt;/p&gt;

&lt;h4&gt;
  
  
  Focus on Active Resources
&lt;/h4&gt;

&lt;p&gt;The tool shows only running instances by default, helping you focus on active resources that might be incurring costs. This filtering makes it easier to identify instances that could potentially be stopped to optimize your AWS spending.&lt;/p&gt;

&lt;h4&gt;
  
  
  Streamlined Instance Management
&lt;/h4&gt;

&lt;p&gt;With just a single click, you can stop or terminate instances directly from the interface. This streamlined approach saves valuable time compared to navigating through the AWS Console for each action.&lt;/p&gt;

&lt;h4&gt;
  
  
  Security-First Design
&lt;/h4&gt;

&lt;p&gt;Security is a top priority. The application handles AWS credentials with care: they're stored only in memory and never persisted or sent to any server other than AWS directly. This client-side-only architecture minimizes security risks.&lt;/p&gt;

&lt;h4&gt;
  
  
  Responsive Interface
&lt;/h4&gt;

&lt;p&gt;Whether you're at your desk or on the go, the responsive design ensures the tool works seamlessly across desktop and mobile browsers.&lt;/p&gt;

&lt;h2&gt;
  
  
  Technical Implementation
&lt;/h2&gt;

&lt;p&gt;The AWS EC2 Instance Manager is built with simplicity in mind:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Pure Web Technologies&lt;/strong&gt;: The application uses vanilla HTML, CSS, and JavaScript without any frameworks, keeping it lightweight and fast.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AWS SDK for JavaScript&lt;/strong&gt;: Leverages the AWS SDK v2 to interact with AWS services directly from the browser.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Client-Side Architecture&lt;/strong&gt;: All processing happens in your browser, with no server-side components required.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This approach makes the tool incredibly portable - you can run it from any computer with a modern web browser.&lt;/p&gt;
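&lt;p&gt;To make the flow concrete, here is a minimal sketch of the cross-region discovery loop. The function and variable names here are illustrative, not the app's actual code (for that, see the GitHub repo linked above); the sketch assumes the AWS SDK v2 browser bundle is loaded and credentials are already set in memory.&lt;/p&gt;

```javascript
// Pure helper: flatten a DescribeInstances response into dashboard rows,
// keeping only instances in the "running" state.
function flattenRunningInstances(regionName, data) {
  const rows = [];
  (data.Reservations || []).forEach((reservation) => {
    (reservation.Instances || []).forEach((inst) => {
      if (inst.State) {
        if (inst.State.Name === 'running') {
          rows.push({
            region: regionName,
            id: inst.InstanceId,
            type: inst.InstanceType,
          });
        }
      }
    });
  });
  return rows;
}

// Browser-side discovery loop (commented out: it needs real credentials).
// A global EC2 client lists the regions, then a per-region client lists
// the instances in each one:
//
// const ec2 = new AWS.EC2({ region: 'us-east-1' });
// const { Regions } = await ec2.describeRegions().promise();
// for (const { RegionName } of Regions) {
//   const regional = new AWS.EC2({ region: RegionName });
//   const data = await regional.describeInstances().promise();
//   rows.push(...flattenRunningInstances(RegionName, data));
// }
```

&lt;p&gt;Keeping the filtering in a pure helper separates it from the SDK calls, so the table-building step can be exercised without any AWS credentials.&lt;/p&gt;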

&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;Using the EC2 Instance Manager is straightforward:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Clone or download the repository from GitHub&lt;/li&gt;
&lt;li&gt;Open the index.html file in your web browser&lt;/li&gt;
&lt;li&gt;Enter your AWS credentials (Account ID, Access Key ID, and Secret Access Key)&lt;/li&gt;
&lt;li&gt;Click "Connect to AWS" to fetch your running instances&lt;/li&gt;
&lt;li&gt;Use the provided buttons to stop or terminate instances as needed&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Required Permissions
&lt;/h3&gt;

&lt;p&gt;To use the tool effectively, your AWS credentials need these specific permissions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ec2:DescribeRegions - To discover all available AWS regions&lt;/li&gt;
&lt;li&gt;ec2:DescribeInstances - To list EC2 instances in each region&lt;/li&gt;
&lt;li&gt;ec2:StopInstances - To stop running instances&lt;/li&gt;
&lt;li&gt;ec2:TerminateInstances - To terminate instances when needed&lt;/li&gt;
&lt;/ul&gt;
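&lt;p&gt;As a sketch, an IAM identity policy granting exactly these four actions could look like the following. Note that the EC2 Describe actions do not support resource-level scoping, so the resource is a wildcard here; where your setup allows, tighten the stop/terminate permissions with ARNs or tag conditions.&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "EC2InstanceManagerMinimal",
      "Effect": "Allow",
      "Action": [
        "ec2:DescribeRegions",
        "ec2:DescribeInstances",
        "ec2:StopInstances",
        "ec2:TerminateInstances"
      ],
      "Resource": "*"
    }
  ]
}
```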

&lt;h2&gt;
  
  
  Security Considerations
&lt;/h2&gt;

&lt;p&gt;While the tool is designed with security in mind, storing AWS credentials in a browser application is not recommended for production environments. For production use, consider these more secure alternatives:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Implement authentication using Amazon Cognito&lt;/li&gt;
&lt;li&gt;Create a backend using API Gateway and Lambda to handle AWS operations&lt;/li&gt;
&lt;li&gt;Use proper IAM roles with least privilege principles&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Future Roadmap
&lt;/h2&gt;

&lt;p&gt;This is just the beginning for the EC2 Instance Manager. Future enhancements may include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Adding the ability to start stopped instances&lt;/li&gt;
&lt;li&gt;Including more detailed instance information and advanced filtering options&lt;/li&gt;
&lt;li&gt;Expanding support to other AWS resources like RDS databases and Lambda functions&lt;/li&gt;
&lt;li&gt;Implementing secure credential storage with Amazon Cognito&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Troubleshooting Tips
&lt;/h2&gt;

&lt;p&gt;If you encounter issues while using the tool:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Verify your AWS credentials are correct and have the necessary permissions&lt;/li&gt;
&lt;li&gt;Ensure you have running EC2 instances in your account&lt;/li&gt;
&lt;li&gt;Check your browser's console for any error messages&lt;/li&gt;
&lt;li&gt;Make sure your browser isn't blocking scripts from loading&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The AWS EC2 Instance Manager demonstrates how a simple tool can significantly improve the efficiency of cloud resource management. By providing a consolidated view of EC2 instances across all regions and enabling quick actions, it saves valuable time for AWS users.&lt;/p&gt;

&lt;p&gt;I built this tool to address a common pain point in my own AWS management workflow, and I hope it proves useful for others facing similar challenges. The project is open-source and available for anyone to use, modify, and improve.&lt;/p&gt;

&lt;p&gt;Whether you're managing a handful of instances or a large fleet across multiple regions, this tool can help streamline your EC2 management tasks and provide better visibility into your AWS resources.&lt;/p&gt;

&lt;hr&gt;

&lt;p&gt;This project was developed by Yeshwanth L M at &lt;a href="https://yeshwanthlm.in" rel="noopener noreferrer"&gt;A Monk in Cloud&lt;/a&gt;. The AWS EC2 Instance Manager is licensed under the MIT License, making it freely available for personal and commercial use.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
