<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Pratik Ponde</title>
    <description>The latest articles on Forem by Pratik Ponde (@pratik_26).</description>
    <link>https://forem.com/pratik_26</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3411932%2F0344e360-d9cb-4ef3-99e1-a7dd6e2c1760.jpg</url>
      <title>Forem: Pratik Ponde</title>
      <link>https://forem.com/pratik_26</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/pratik_26"/>
    <language>en</language>
    <item>
      <title>Amazon Bedrock Beginner Guide: A Complete Deep Dive into AWS Generative AI</title>
      <dc:creator>Pratik Ponde</dc:creator>
      <pubDate>Mon, 30 Mar 2026 07:06:55 +0000</pubDate>
      <link>https://forem.com/pratik_26/amazon-bedrock-beginner-guide-a-complete-deep-dive-into-aws-generative-ai-c5e</link>
      <guid>https://forem.com/pratik_26/amazon-bedrock-beginner-guide-a-complete-deep-dive-into-aws-generative-ai-c5e</guid>
      <description>&lt;p&gt;👋 Hey there! This is Pratik, a Senior DevOps Consultant with a strong background in automating and optimizing cloud infrastructure, particularly on AWS. Over the years, I have designed and implemented scalable solutions for enterprises, focusing on infrastructure as code, CI/CD pipelines, cloud security, and resilience. My expertise lies in translating complex cloud requirements into efficient, reliable, and cost-effective architectures.&lt;/p&gt;

&lt;p&gt;In this guide, we’ll take a beginner-friendly deep dive into Amazon Bedrock, explaining how it works, its features, costs, and use cases. &lt;/p&gt;




&lt;p&gt;🔍 &lt;em&gt;How a Simple Problem Introduced Me to &lt;a href="https://aws.amazon.com/bedrock/" rel="noopener noreferrer"&gt;&lt;strong&gt;Amazon Bedrock&lt;/strong&gt;&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;A while ago, I was trying to find a solution to a simple problem: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;&lt;strong&gt;How to make systems understand files and data intelligently, not just match keywords.&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;When I started exploring AI solutions, things quickly became complicated. &lt;br&gt;
Hosting models, managing infrastructure, and scaling felt like too much work for what I wanted to create. &lt;/p&gt;

&lt;p&gt;That’s when I discovered &lt;strong&gt;Amazon Bedrock&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;What stood out wasn’t just the ability to generate text; it was the flexibility. &lt;br&gt;
It could handle chat, text, images, and more, all through simple APIs. I didn’t have to worry about infrastructure, thanks to &lt;a href="https://aws.amazon.com/" rel="noopener noreferrer"&gt;Amazon Web Services&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;That shift from managing systems to building solutions is what this article is all about.&lt;/p&gt;


&lt;h3&gt;
  
  
  &lt;strong&gt;🧠 What is Amazon Bedrock?&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Bedrock is a fully managed AWS service that allows developers to build and scale generative AI applications using foundation models (FMs).&lt;/p&gt;
&lt;h4&gt;
  
  
  What are Foundation Models?
&lt;/h4&gt;

&lt;p&gt;Foundation Models are large AI models trained on extensive datasets that can:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Understand text&lt;/li&gt;
&lt;li&gt; Analyze images&lt;/li&gt;
&lt;li&gt; Generate content&lt;/li&gt;
&lt;li&gt; Answer questions&lt;/li&gt;
&lt;/ol&gt;
&lt;h4&gt;
  
  
  Why Use Bedrock?
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;No need to manage infrastructure&lt;/li&gt;
&lt;li&gt;Built for security and scalability&lt;/li&gt;
&lt;li&gt;Cost-effective with a pay-as-you-go model&lt;/li&gt;
&lt;li&gt;Seamlessly integrates with the AWS ecosystem&lt;/li&gt;
&lt;/ol&gt;
&lt;h4&gt;
  
  
  Popular Models Available
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;Anthropic Claude &lt;/li&gt;
&lt;li&gt;Meta Llama&lt;/li&gt;
&lt;li&gt;Amazon Titan&lt;/li&gt;
&lt;/ol&gt;


&lt;h3&gt;
  
  
  &lt;strong&gt;🧠 What Can You Do with Amazon Bedrock?&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The range of AI capabilities that Amazon Bedrock provides through a single API layer is one of its main advantages.&lt;/p&gt;

&lt;p&gt;Let's break it down into the main categories of capabilities:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;💬 1. Chat (Conversational AI)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Chat is the most common way to interact with Amazon Bedrock.&lt;br&gt;
You simply type a question or instruction, and the model responds just like a conversation.&lt;br&gt;
It’s perfect for building chatbots, assistants, or automating responses.&lt;/p&gt;

&lt;p&gt;Bedrock supports building intelligent chat systems using models like:&lt;/p&gt;

&lt;p&gt;• Claude 3.5 Haiku &lt;br&gt;
• Claude 3 Sonnet &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- What you can build:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;• Customer support bots &lt;br&gt;
• Internal enterprise assistants &lt;br&gt;
• DevOps copilots &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Example Prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"messages"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"role"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"user"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"content"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Explain AWS Bedrock in simple terms"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;👉 Best practice:&lt;br&gt;
• Use a structured conversation format (messages) instead of a plain prompt &lt;br&gt;
• Maintain chat history for context &lt;/p&gt;
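&lt;p&gt;A minimal sketch of that practice (the helper name is mine, not from an SDK; the payload shape follows the Anthropic Messages format used by Claude models on Bedrock): since the models are stateless, “maintaining chat history” just means resending every prior turn with each request.&lt;/p&gt;

```python
import json

# Hypothetical helper: accumulate the conversation so each request
# carries the full context, then serialize it as the request body.
def build_chat_request(history, user_text, max_tokens=500):
    """Add the new user turn and return the JSON request body."""
    history.append({"role": "user", "content": user_text})
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": history,
    })

history = []
body = build_chat_request(history, "Explain AWS Bedrock in simple terms")
# After the model answers, append its reply so the next turn has context:
history.append({"role": "assistant", "content": "Bedrock is a managed service ..."})
```

&lt;p&gt;On the next call, &lt;code&gt;build_chat_request&lt;/code&gt; automatically includes both earlier turns.&lt;/p&gt;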

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm0pbjs87do0hzk44q55x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm0pbjs87do0hzk44q55x.png" alt=" " width="800" height="233"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjcll16wnsgv0gsjm084d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjcll16wnsgv0gsjm084d.png" alt=" " width="800" height="234"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;✍️ 2. Text Generation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Text generation is used when you want the AI to create content.&lt;br&gt;
This can include summaries, articles, code, or structured outputs like JSON.&lt;br&gt;
It’s ideal for automation tasks, documentation, and content creation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Supported Models:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;• Amazon Titan Text &lt;br&gt;
• Claude 4 Sonnet &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- What you can build:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;• Blog/article generation &lt;br&gt;
• Code generation &lt;br&gt;
• Email drafting &lt;br&gt;
• Summarization &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Example Use Case:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"prompt"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Write a professional email for leave request"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"max_tokens"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;150&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"temperature"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
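&lt;p&gt;One caveat worth flagging: each model family expects its own request schema, so the generic fields above map onto different names per provider. For Amazon Titan Text, for instance, a request looks roughly like this (field names per Titan's text format; verify against the current AWS docs):&lt;/p&gt;

```python
import json

# Rough Titan Text equivalent of the generic example above. Titan uses
# "inputText" plus a "textGenerationConfig" block instead of top-level
# prompt/max_tokens/temperature fields.
titan_body = json.dumps({
    "inputText": "Write a professional email for a leave request",
    "textGenerationConfig": {
        "maxTokenCount": 150,  # response length cap
        "temperature": 0.7,    # creativity vs. determinism
        "topP": 0.9,           # nucleus sampling
    },
})
```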



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb2dblvn8ghqw7im2fbi2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb2dblvn8ghqw7im2fbi2.png" alt=" " width="800" height="190"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu3mnbvbldrs7mxhrlff3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu3mnbvbldrs7mxhrlff3.png" alt=" " width="800" height="194"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🖼️ 3. Image Generation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With image input, you can send pictures to the model and ask questions about them.&lt;br&gt;
The AI can analyze visuals, detect objects, or extract information from images.&lt;br&gt;
This is useful for tasks like document verification, image matching, and visual inspection.&lt;/p&gt;

&lt;p&gt;Bedrock also supports text-to-image generation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Supported Model:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;• Amazon Titan Image Generator &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- What you can build:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;• AI-generated designs &lt;br&gt;
• Marketing creatives &lt;br&gt;
• Thumbnails and banners &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;- Example Prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"textToImageParams"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"text"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"A futuristic city with flying cars at sunset"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
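&lt;p&gt;In practice a Titan Image Generator request carries a bit more than the prompt. A sketch (the &lt;code&gt;imageGenerationConfig&lt;/code&gt; values here are illustrative defaults, worth checking against the current Titan docs):&lt;/p&gt;

```python
import json

# Fuller Titan Image Generator request: "taskType" selects the
# operation, and imageGenerationConfig controls output count and size.
image_body = json.dumps({
    "taskType": "TEXT_IMAGE",
    "textToImageParams": {
        "text": "A futuristic city with flying cars at sunset",
    },
    "imageGenerationConfig": {
        "numberOfImages": 1,
        "width": 1024,
        "height": 1024,
    },
})
# The response carries base64-encoded images, decoded with e.g.:
# image_bytes = base64.b64decode(result["images"][0])
```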



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzqvvfvvrosjwkzgvgnfn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzqvvfvvrosjwkzgvgnfn.png" alt=" " width="800" height="323"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;⚙️ Understanding Bedrock API Parameters&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;When you work with Amazon Bedrock, you basically send a request to the AI saying:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;“Here’s what I want and here’s how I want you to respond.”&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A typical request looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"anthropic_version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"bedrock-2023-05-31"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"max_tokens"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"temperature"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"top_p"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"messages"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
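&lt;p&gt;To make this concrete, here is how such a request can be sent with the AWS SDK for Python (boto3). The model ID is only an example; use any model you have enabled in your account and Region.&lt;/p&gt;

```python
import json

# Build the request body exactly as shown above.
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 1000,
    "temperature": 0,
    "top_p": 1,
    "messages": [
        {"role": "user", "content": "Explain AWS Bedrock in simple terms"}
    ],
})

def ask_bedrock(request_body, model_id="anthropic.claude-3-5-haiku-20241022-v1:0"):
    """Invoke the model via Bedrock Runtime and return the parsed reply."""
    import boto3  # imported here so the payload part runs anywhere
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(modelId=model_id, body=request_body)
    return json.loads(response["body"].read())

# Live call (needs AWS credentials and model access in your Region):
# print(ask_bedrock(body))
```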



&lt;p&gt;Let’s break this down 👇&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1&lt;/strong&gt;.  &lt;strong&gt;anthropic_version&lt;/strong&gt;  -  [ Which rulebook to follow ]&lt;/p&gt;

&lt;p&gt;• Defines the API schema version&lt;br&gt;
• It’s required for Anthropic models (like Claude)&lt;br&gt;
• Always keep this fixed unless AWS updates it&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2&lt;/strong&gt;.  &lt;strong&gt;max_tokens&lt;/strong&gt;  -  [ How long should the answer be? ]&lt;/p&gt;

&lt;p&gt;This controls the length of the response.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Higher value → longer answer&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Lower value → short and precise&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Quick tips:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Use 200–500 → for JSON / short answers&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use 800+ → when you want detailed explanations&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;3.&lt;/strong&gt;  &lt;strong&gt;temperature&lt;/strong&gt;  -  [ How creative should the AI be? ]&lt;/p&gt;

&lt;p&gt;This controls how predictable or creative the response is.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;0 → very strict, same answer every time&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;0.5 → balanced&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;1 → more creative and random&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;4.&lt;/strong&gt; &lt;strong&gt;top_p&lt;/strong&gt;  -  [ How many options should the AI consider? ]&lt;/p&gt;

&lt;p&gt;This one sounds complex, but it’s actually simple.&lt;/p&gt;

&lt;p&gt;• Controls token probability (nucleus) sampling&lt;br&gt;
• Limits how “wide” the model thinks&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;1 → consider all possibilities&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&amp;lt;1 → more focused output&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Best practice:&lt;br&gt;
• Use top_p = 1 with temperature = 0&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5.&lt;/strong&gt; &lt;strong&gt;messages&lt;/strong&gt;  -  [ What do you actually want? ]&lt;/p&gt;

&lt;p&gt;This is the most important part.&lt;/p&gt;

&lt;p&gt;This is where you:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Ask your question&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Give instructions&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Provide input (text, image, etc.)&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="nl"&gt;"messages"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"role"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"user"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"content"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;📌 Types of input you can send&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Inside messages, you can include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;text → normal instructions&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;image → for image understanding&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;document → structured files (limited use)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
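&lt;p&gt;As a sketch of mixing those input types, here is a single user message combining text and an image in the Anthropic content-block format (the PNG bytes below are a placeholder; a real call would read an actual image from disk or S3):&lt;/p&gt;

```python
import base64
import json

# One user message with two content blocks: a text question plus a
# base64-encoded image for the model to analyze.
fake_png = b"\x89PNG\r\n\x1a\n"  # placeholder bytes, not a full image
message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What does this diagram show?"},
        {
            "type": "image",
            "source": {
                "type": "base64",
                "media_type": "image/png",
                "data": base64.b64encode(fake_png).decode("utf-8"),
            },
        },
    ],
}
body = json.dumps({"anthropic_version": "bedrock-2023-05-31",
                   "max_tokens": 300, "messages": [message]})
```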




&lt;h3&gt;
  
  
  &lt;strong&gt;💰 Amazon Bedrock Pricing&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;When you first look at Bedrock pricing, it can feel confusing, but the idea is actually very simple:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;You only pay for what you use (pay-as-you-go)&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;No upfront cost, no subscription required.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1.&lt;/strong&gt; &lt;strong&gt;The Most Important Concept: Tokens&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Bedrock pricing is mainly based on tokens.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What is a token?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A token is a piece of text (a word or part of a word).&lt;br&gt;
Example: “Hello world” ≈ 2–3 tokens&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You are charged for:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Input tokens (what you send)&lt;br&gt;
Output tokens (what AI generates)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2.&lt;/strong&gt; &lt;strong&gt;Pricing Depends on the Model&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Different models = different pricing&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Anthropic Claude&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;~$6 per 1M input tokens&lt;br&gt;
~$30 per 1M output tokens&lt;/p&gt;
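&lt;p&gt;Those numbers make back-of-the-envelope estimates easy. A quick sketch using the approximate rates above (rates change per model and Region; always confirm on the pricing page):&lt;/p&gt;

```python
# Approximate Claude rates quoted above, in USD per 1M tokens.
INPUT_PER_M = 6.00
OUTPUT_PER_M = 30.00

def estimate_cost(input_tokens, output_tokens):
    """Estimated USD cost of a single request."""
    return (input_tokens / 1_000_000) * INPUT_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PER_M

# A 2,000-token prompt with a 1,000-token reply:
cost = estimate_cost(2_000, 1_000)  # 0.012 + 0.030 = 0.042 USD
```

&lt;p&gt;At roughly $0.04 per request, a million such requests would cost on the order of $42,000 on this model, which is why model selection matters so much at scale.&lt;/p&gt;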

&lt;ul&gt;
&lt;li&gt;Smaller / cheaper models&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Can be 10x cheaper&lt;br&gt;
Example: some models start around $0.07 per 1M tokens&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3.&lt;/strong&gt; &lt;strong&gt;Image Pricing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For image-based tasks:&lt;/p&gt;

&lt;p&gt;You are charged per image processed&lt;br&gt;
Example: ~$0.03 to $0.60 per image depending on operation&lt;/p&gt;

&lt;p&gt;For detailed pricing information, click here 👉 &lt;a href="https://aws.amazon.com/bedrock/pricing/?refid=5eabf6f5-7510-4f30-9f4b-03d1339cf4e0" rel="noopener noreferrer"&gt;AWS Bedrock Pricing&lt;/a&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;✨Final Thoughts&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Amazon Bedrock makes it extremely simple to get started with generative AI without worrying about infrastructure or complexity.&lt;br&gt;
With multiple models available, flexible pricing options, and support for text, chat, and image-based applications, it lets you build highly scalable applications.&lt;br&gt;
The key is to tune parameters effectively, select the most appropriate model, and keep costs under control for your application.&lt;/p&gt;

&lt;p&gt;Start simple, experiment, and scale as needed; that’s where Bedrock shines.&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;🔜 What’s Next&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;In our next article, we’ll work on a hands-on example of a real-world use case involving Amazon Bedrock and AWS Lambda.&lt;br&gt;
The example will be a solution to verify if a reference image exists within a target file, such as a document or PDF.&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;💬 Let’s Keep the Conversation Going&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Have thoughts, questions, or any experience with Generative AI to share? I would love to hear from you! Feel free to leave a comment or connect with me on &lt;em&gt;&lt;a href="https://www.linkedin.com/in/pratikponde"&gt;LinkedIn&lt;/a&gt;&lt;/em&gt;. Let's learn and grow together as a community of builders.&lt;br&gt;
Keep exploring, keep automating, and see you in the next one!&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>How to Get Real-Time Notifications When AWS Glue Schema Registry Changes</title>
      <dc:creator>Pratik Ponde</dc:creator>
      <pubDate>Tue, 17 Feb 2026 11:45:58 +0000</pubDate>
      <link>https://forem.com/pratik_26/how-to-get-real-time-notifications-when-aws-glue-schema-registry-changes-4nbd</link>
      <guid>https://forem.com/pratik_26/how-to-get-real-time-notifications-when-aws-glue-schema-registry-changes-4nbd</guid>
      <description>&lt;h4&gt;
  
  
  &lt;em&gt;A real-world DevOps journey with AWS Glue, EventBridge, and Lambda&lt;/em&gt;
&lt;/h4&gt;


&lt;p&gt;👋 Hey there! This is Pratik, a Senior DevOps Consultant with a strong background in automating and optimizing cloud infrastructure, particularly on AWS. Over the years, I have designed and implemented scalable solutions for enterprises, focusing on infrastructure as code, CI/CD pipelines, cloud security, and resilience. My expertise lies in translating complex cloud requirements into efficient, reliable, and cost-effective architectures.&lt;/p&gt;

&lt;p&gt;In this article, I walk through a real-world approach to detecting AWS Glue Schema Registry updates in real time and why making schema changes observable matters in production systems.&lt;/p&gt;




&lt;h2&gt;
  
  
  🚨&lt;strong&gt;The Problem That Started It All&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;While working as a DevOps engineer, I supported a Kafka-based event-driven system where &lt;strong&gt;schemas mattered a lot&lt;/strong&gt;. Producers and consumers depended heavily on schema versions. Any change, whether intentional or accidental, could quietly disrupt downstream services.&lt;/p&gt;

&lt;p&gt;We were already using &lt;strong&gt;AWS Glue Schema Registry&lt;/strong&gt; to manage schemas for Amazon MSK. It provided us with versioning and compatibility checks.&lt;/p&gt;

&lt;p&gt;But one question kept coming up during reviews:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt;“How do we know when someone updates a schema?”&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There was no alert, no trigger, no automation; just a silent update sitting in Glue.&lt;/p&gt;

&lt;p&gt;That’s when I decided to build an event-driven notification system for schema changes.&lt;/p&gt;




&lt;h2&gt;
  
  
  💡&lt;strong&gt;The Idea: Make Schema Changes Event Driven&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Instead of polling Glue or depending on manual communication, the idea was straightforward:&lt;br&gt;
&lt;strong&gt;Whenever a schema changes, trigger an action automatically.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;That action calls a &lt;strong&gt;POST API&lt;/strong&gt;, which could:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Notify teams&lt;/li&gt;
&lt;li&gt;Trigger validations&lt;/li&gt;
&lt;li&gt;Update dashboards&lt;/li&gt;
&lt;li&gt;Or kick off CI/CD pipelines&lt;/li&gt;
&lt;/ul&gt;
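&lt;p&gt;The Lambda piece of that action can stay tiny. A sketch using only the standard library (the endpoint URL is a placeholder, and the field paths should be checked against a sample CloudTrail event from your own account):&lt;/p&gt;

```python
import json
import urllib.request

API_URL = "https://example.com/schema-events"  # hypothetical endpoint

def extract_schema_change(event):
    """Pull the interesting bits out of a Glue schema CloudTrail event."""
    detail = event.get("detail", {})
    return {
        "eventName": detail.get("eventName"),
        "schemaName": detail.get("requestParameters", {})
                            .get("schemaId", {})
                            .get("schemaName"),
        "eventTime": detail.get("eventTime"),
    }

def lambda_handler(event, context):
    # POST the extracted schema details to the downstream API.
    payload = json.dumps(extract_schema_change(event)).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=payload,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status}
```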


&lt;h2&gt;
  
  
  ⚙️&lt;strong&gt;Comprehensive Architecture Flow&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu46eg2v5ljcfvlb1hn22.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu46eg2v5ljcfvlb1hn22.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here’s how the flow looks in real life:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A schema is created or updated in Glue Schema Registry.&lt;/li&gt;
&lt;li&gt;CloudTrail records the API activity.&lt;/li&gt;
&lt;li&gt;EventBridge listens for specific Glue schema events.&lt;/li&gt;
&lt;li&gt;Lambda is triggered automatically.&lt;/li&gt;
&lt;li&gt;Lambda calls a POST API with schema details.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;No polling. No cron jobs. Fully event-driven.&lt;/em&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  &lt;strong&gt;💻 Source Code and GitHub Repository&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The complete implementation for this architecture is available on GitHub.&lt;/p&gt;

&lt;p&gt;🔗 GitHub Repository:&lt;br&gt;&lt;br&gt;


&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/pratiksponde" rel="noopener noreferrer"&gt;
        pratiksponde
      &lt;/a&gt; / &lt;a href="https://github.com/pratiksponde/AWS-Glue-Schema-EventBridge-Lambda" rel="noopener noreferrer"&gt;
        AWS-Glue-Schema-EventBridge-Lambda
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      How to Get Real-Time Notifications When AWS Glue Schema Registry Changes
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;How to Get Real-Time Notifications When AWS Glue Schema Registry Changes&lt;/h1&gt;
&lt;/div&gt;
&lt;blockquote&gt;
&lt;p&gt;Real-time notifications for AWS Glue Schema Registry changes using EventBridge and Lambda.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;This repository provides a fully event-driven solution to detect and respond to schema changes in AWS Glue Schema Registry in real time. It uses AWS CloudTrail, Amazon EventBridge, and AWS Lambda to automatically capture schema-related API events and trigger downstream notifications or integrations.&lt;/p&gt;
&lt;p&gt;📖 &lt;strong&gt;Full article explaining the architecture and implementation:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://dev.to/pratik_26/how-to-get-real-time-notifications-when-aws-glue-schema-registry-changes-4nbd" rel="nofollow"&gt;https://dev.to/pratik_26/how-to-get-real-time-notifications-when-aws-glue-schema-registry-changes-4nbd&lt;/a&gt;&lt;/p&gt;

&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;🚀 Architecture Overview&lt;/h1&gt;
&lt;/div&gt;
&lt;p&gt;This solution follows a serverless, event-driven architecture:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;AWS Glue Schema Registry&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Stores and manages schemas for streaming and data applications.&lt;/li&gt;
&lt;li&gt;Events occur when schemas are created, updated, or new versions are registered.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;AWS CloudTrail&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Records all Glue Schema Registry API calls such as
&lt;ul&gt;
&lt;li&gt;CreateSchema&lt;/li&gt;
&lt;li&gt;RegisterSchemaVersion&lt;/li&gt;
&lt;li&gt;UpdateSchema&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;These events are automatically available to EventBridge.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Amazon EventBridge&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Captures relevant Glue events from CloudTrail.&lt;/li&gt;
&lt;li&gt;Filters only schema-related changes.&lt;/li&gt;
&lt;li&gt;Triggers the Lambda function.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;AWS Lambda&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Receives…&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/pratiksponde/AWS-Glue-Schema-EventBridge-Lambda" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;




&lt;p&gt;Feel free to clone the repository and try it in your own AWS environment.&lt;/p&gt;

&lt;p&gt;Now, let’s dive deep into the hands-on implementation step by step.&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;strong&gt;Step 1: Capturing Schema Changes with CloudTrail&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Steps to Create a Default Trail (Console):&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Open CloudTrail:&lt;/strong&gt; Go to the AWS CloudTrail console.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Create Trail:&lt;/strong&gt; Click Create trail.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8984el5a70wz26dq725.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8984el5a70wz26dq725.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3. &lt;strong&gt;Configure Name:&lt;/strong&gt; Enter a trail name (e.g., DefaultTrail).&lt;br&gt;
4. &lt;strong&gt;Storage Location:&lt;/strong&gt; Select Create new S3 bucket to let CloudTrail handle permissions automatically.&lt;br&gt;
5. &lt;strong&gt;Log File Validation:&lt;/strong&gt; Leave enabled (default) to ensure log integrity.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fka67q7vqhyaz4mxrpqfn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fka67q7vqhyaz4mxrpqfn.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;6. &lt;strong&gt;KMS Encryption:&lt;/strong&gt; Leave enabled (default) for security.&lt;br&gt;
7. &lt;strong&gt;Finish:&lt;/strong&gt; Click Next, then Create trail. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx601to5jd0l5odjm7jwc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx601to5jd0l5odjm7jwc.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;AWS Glue API calls such as &lt;code&gt;CreateSchema&lt;/code&gt;, &lt;code&gt;RegisterSchemaVersion&lt;/code&gt;, and &lt;code&gt;UpdateSchema&lt;/code&gt; are automatically logged in CloudTrail.&lt;/p&gt;

&lt;p&gt;This was a big win: no custom instrumentation was required.&lt;/p&gt;

&lt;p&gt;Every schema change already produced a reliable audit event.&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;Step 2: Filtering the Right Events Using EventBridge&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Next, I created an EventBridge rule that listens only for Glue schema-related CloudTrail events.&lt;/p&gt;

&lt;p&gt;Instead of triggering Lambda for every Glue operation, the rule filters on:&lt;/p&gt;

&lt;p&gt;• Event source: glue.amazonaws.com&lt;br&gt;
• Event names related to schema updates&lt;/p&gt;

&lt;p&gt;This kept the system clean, efficient, and cost-effective.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps to Create an EventBridge Rule (Console):&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Open EventBridge:&lt;/strong&gt; Navigate to the Amazon EventBridge console.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Create Rule:&lt;/strong&gt; In the navigation pane, choose Rules, and then choose Create rule.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv86mnwc2ogzqkc4rssot.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv86mnwc2ogzqkc4rssot.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3. &lt;strong&gt;Define Rule:&lt;/strong&gt; Enter a Name and an optional Description for the rule, and select the default event bus.&lt;br&gt;
4. &lt;strong&gt;Configure the Event Source:&lt;/strong&gt; Use the visual builder or JSON editor to define your event pattern.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn9kwrv9t2l2k7yflrv5s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn9kwrv9t2l2k7yflrv5s.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use the following event pattern for the AWS Glue Schema Registry.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  {
    "source": ["aws.glue"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
      "eventSource": ["glue.amazonaws.com"],
      "eventName": ["RegisterSchemaVersion", "UpdateSchema", "CreateSchema"]
    }
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
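&lt;p&gt;If you want to reason about this filter before deploying it, a rough local approximation can be sketched in Python. This mimics only the exact-match semantics used by the pattern above, not the full EventBridge pattern language:&lt;/p&gt;

```python
# Local approximation of the EventBridge rule's matching behavior.
# Only events passing this check would reach the Lambda target.
SCHEMA_EVENT_NAMES = {"RegisterSchemaVersion", "UpdateSchema", "CreateSchema"}

def matches_schema_rule(event: dict) -> bool:
    detail = event.get("detail", {})
    return (
        event.get("source") == "aws.glue"
        and event.get("detail-type") == "AWS API Call via CloudTrail"
        and detail.get("eventSource") == "glue.amazonaws.com"
        and detail.get("eventName") in SCHEMA_EVENT_NAMES
    )
```

&lt;p&gt;A quick way to confirm that noisy Glue operations (for example read-only calls) are filtered out before they ever invoke Lambda.&lt;/p&gt;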



&lt;p&gt;5. &lt;strong&gt;Select Targets:&lt;/strong&gt; From the Select a target list, choose the AWS Lambda service to invoke when the event is matched, and configure the required details for the selected target.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fydlclj0psfnz0hstr98m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fydlclj0psfnz0hstr98m.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;6. &lt;strong&gt;Finish:&lt;/strong&gt; Choose Create rule to activate your rule. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn1xx8cgn38j2o7cy0fqu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn1xx8cgn38j2o7cy0fqu.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 3: Lambda as the Brain of the System&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Lambda became the decision-maker, and its responsibilities were clear:&lt;/p&gt;

&lt;p&gt;• Parse the CloudTrail event&lt;br&gt;
• Identify what changed&lt;br&gt;
• Extract schema details (registry, name, version ARN)&lt;br&gt;
• Prepare a meaningful payload&lt;br&gt;
• Call the external POST API&lt;/p&gt;

&lt;p&gt;This approach kept Lambda lightweight and focused.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps to Create a Lambda Function (Console):&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Open Lambda Function:&lt;/strong&gt; Navigate to AWS Lambda function console.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Create Function:&lt;/strong&gt; Select Author from scratch, enter a function name, choose a runtime (I selected Python 3.11), select Create a new role with basic Lambda permissions, and click Create.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Attach Permission:&lt;/strong&gt; Select the created Lambda function, go to Configuration → Permissions, and click the role attached to the function.
Add the &lt;code&gt;AWSGlueSchemaRegistryFullAccess&lt;/code&gt; managed policy to this role.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Note: Modify the policy as needed to align with your security requirements and best practices.&lt;br&gt;
4. &lt;strong&gt;Set Environment Variable:&lt;/strong&gt; Go to Configuration, select Environment variables, and add the following variable. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key:&lt;/strong&gt; API_URL&lt;br&gt;
&lt;strong&gt;Value:&lt;/strong&gt; &lt;a href="http://ip.of.instance:port/contextpath/of/your/api" rel="noopener noreferrer"&gt;http://ip.of.instance:port/contextpath/of/your/api&lt;/a&gt;&lt;br&gt;
(This is sample syntax only; replace it with your actual API URL.)&lt;/p&gt;

&lt;p&gt;5. &lt;strong&gt;Upload Code:&lt;/strong&gt; Upload the code below, which calls the external API.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import os
import urllib3

http = urllib3.PoolManager()

API_URL = os.environ.get("API_URL")

def extract_schema_info(detail):
    request_params = detail.get("requestParameters", {})
    response_elements = detail.get("responseElements", {})

    schema_arn = response_elements.get("schemaArn")
    schema_version_arn = response_elements.get("schemaVersionArn")

    registry_name = request_params.get("registryId", {}).get("registryName")
    schema_name = request_params.get("schemaName")

    return {
        "registry_name": registry_name,
        "schema_name": schema_name,
        "schema_arn": schema_arn,
        "schema_version_arn": schema_version_arn
    }

def lambda_handler(event, context):
    try:
        detail = event.get("detail", {})
        event_name = detail.get("eventName")


        if event_name not in [
            "CreateSchema",
            "RegisterSchemaVersion",
            "UpdateSchema"
        ]:
            print(f"Ignored event: {event_name}")
            return {"status": "ignored"}

        schema_info = extract_schema_info(detail)

        payload = {
            "eventType": event_name,
            "schemaUpdated": True,
            "schemaDetails": schema_info,
            "timestamp": detail.get("eventTime"),
            "awsAccount": detail.get("recipientAccountId"),
            "region": detail.get("awsRegion")
        }

        encoded_body = json.dumps(payload).encode("utf-8")

        response = http.request(
            "POST",
            API_URL,
            body=encoded_body,
            headers={
                "Content-Type": "application/json"
            },
            timeout=urllib3.Timeout(connect=5.0, read=10.0)
        )

        print("API response status:", response.status)
        print("API response body:", response.data.decode())

        return {
            "status": "success",
            "httpStatus": response.status
        }

    except Exception as e:
        print("Lambda execution failed:", str(e))
        raise
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
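&lt;p&gt;You can sanity-check the parsing logic locally before deploying, using a synthetic CloudTrail detail. The registry, schema, and ARN values below are illustrative, not real:&lt;/p&gt;

```python
import json

# Same field extraction as the Lambda above, exercised with a fake event.
def extract_schema_info(detail):
    request_params = detail.get("requestParameters", {})
    response_elements = detail.get("responseElements", {})
    return {
        "registry_name": request_params.get("registryId", {}).get("registryName"),
        "schema_name": request_params.get("schemaName"),
        "schema_arn": response_elements.get("schemaArn"),
        "schema_version_arn": response_elements.get("schemaVersionArn"),
    }

# Illustrative CloudTrail detail for a RegisterSchemaVersion call.
sample_detail = {
    "eventName": "RegisterSchemaVersion",
    "requestParameters": {
        "registryId": {"registryName": "demo-registry"},
        "schemaName": "orders-value",
    },
    "responseElements": {
        "schemaArn": "arn:aws:glue:us-east-1:111122223333:schema/demo-registry/orders-value",
        "schemaVersionArn": "arn:aws:glue:us-east-1:111122223333:schemaVersion/demo-registry/orders-value/abc123",
    },
}

print(json.dumps(extract_schema_info(sample_detail), indent=2))
```

&lt;p&gt;Testing this function in isolation also makes it easy to add cases for events where some fields are missing, which real CloudTrail events sometimes are.&lt;/p&gt;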



&lt;h3&gt;
  
  
  &lt;strong&gt;Step 4: Calling the API&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The final step was integration: the Lambda function sends a POST request to an API with details like:&lt;br&gt;
  •     Schema name&lt;br&gt;
  • Registry name&lt;br&gt;
  • Schema version ARN&lt;br&gt;
  • Event type (create/update)&lt;br&gt;
  • Timestamp and region&lt;br&gt;
From here, the API can:&lt;br&gt;
  • Notify consumers&lt;br&gt;
  • Trigger validations&lt;br&gt;
  • Store audit records&lt;br&gt;
  • Or block deployments if needed.&lt;/p&gt;
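&lt;p&gt;The receiving side is outside the scope of this setup, but a minimal sketch of such an endpoint, using only the Python standard library, might look like this (the response shape and logging are assumptions, not the actual API from this project):&lt;/p&gt;

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class SchemaEventHandler(BaseHTTPRequestHandler):
    """Handles the POST payload sent by the Lambda function."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length))
        # This is where you might notify consumers, run compatibility
        # validations, or append an audit record.
        print("Schema event:", event.get("eventType"),
              event.get("schemaDetails", {}).get("schema_name"))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"status":"received"}')

def make_server(port: int = 8080) -> HTTPServer:
    """Build (but do not start) the server; call .serve_forever() to run it."""
    return HTTPServer(("0.0.0.0", port), SchemaEventHandler)
```

&lt;p&gt;In practice you would put this behind proper authentication and TLS; the sketch only shows the payload handoff.&lt;/p&gt;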




&lt;h3&gt;
  
  
  🚀&lt;strong&gt;Why This Approach Worked So Well&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;What I liked most about this solution:&lt;br&gt;
• ✅ Fully event-driven&lt;br&gt;
• ✅ No polling or scheduled jobs&lt;br&gt;
• ✅ Zero impact on producers or consumers&lt;br&gt;
• ✅ Scales automatically&lt;br&gt;
• ✅ Clear audit trail for schema changes&lt;br&gt;
It also aligned perfectly with serverless best practices.&lt;/p&gt;




&lt;h3&gt;
  
  
  📚&lt;strong&gt;Real Lessons Learned&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;While implementing this, a few things stood out:&lt;br&gt;
• CloudTrail events are rich but noisy → filtering is critical&lt;br&gt;
• Lambda should not assume all fields exist in every event&lt;br&gt;
• Timeouts and retries matter when calling external APIs&lt;br&gt;
• Logging schema ARNs saved a lot of debugging time later&lt;br&gt;
These small details made the difference between a demo and a production-ready solution.&lt;/p&gt;
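&lt;p&gt;On the timeout and retry point: the Lambda above uses fixed timeouts, and a small retry budget can be layered on with urllib3's built-in &lt;code&gt;Retry&lt;/code&gt; helper. A minimal sketch (the limits chosen here are illustrative; tune them for your API):&lt;/p&gt;

```python
import urllib3
from urllib3.util.retry import Retry

# Bounded timeouts plus a small retry budget for transient failures
# (429/5xx). Retrying POSTs assumes the downstream API is idempotent.
retry = Retry(
    total=3,
    backoff_factor=0.5,
    status_forcelist=[429, 500, 502, 503, 504],
    allowed_methods=["POST"],
)

http = urllib3.PoolManager(
    retries=retry,
    timeout=urllib3.Timeout(connect=5.0, read=10.0),
)
```

&lt;p&gt;With this pool manager, the same &lt;code&gt;http.request("POST", ...)&lt;/code&gt; call picks up the retry policy automatically.&lt;/p&gt;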




&lt;h3&gt;
  
  
  ✨&lt;strong&gt;Final Thoughts&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Schema changes are not just metadata updates; they are potential breaking changes. By turning schema updates into events, this setup changed Glue Schema Registry from a passive store into an active part of the system architecture. If you’re running Kafka on AWS and care about schema governance, it’s a good idea to adopt this pattern.&lt;/p&gt;




&lt;h3&gt;
  
  
  💬 &lt;strong&gt;Let’s Keep the Conversation Going&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Have thoughts, questions, or experience with event-driven architectures to share? I would love to hear from you! Feel free to leave a comment or connect with me on &lt;em&gt;&lt;a href="//www.linkedin.com/in/pratikponde"&gt;LinkedIn&lt;/a&gt;&lt;/em&gt;. Let's learn and grow together as a community of builders.&lt;br&gt;
Keep exploring, keep automating and see you in the next one!&lt;/p&gt;




</description>
      <category>aws</category>
      <category>devops</category>
      <category>ai</category>
      <category>beginners</category>
    </item>
    <item>
      <title>AWS MSK Using Terraform: Multi-Environment Deployment Guide</title>
      <dc:creator>Pratik Ponde</dc:creator>
      <pubDate>Fri, 30 Jan 2026 10:11:07 +0000</pubDate>
      <link>https://forem.com/pratik_26/deploying-amazon-msk-serverless-across-multiple-environments-with-terraform-2i8c</link>
      <guid>https://forem.com/pratik_26/deploying-amazon-msk-serverless-across-multiple-environments-with-terraform-2i8c</guid>
      <description>&lt;p&gt;👋 Hey there! This is Pratik, a Senior DevOps Consultant with a strong background in automating and optimizing cloud infrastructure, particularly on AWS. Over the years, I have designed and implemented scalable solutions for enterprises, focusing on infrastructure as code, CI/CD pipelines, cloud security, and resilience. My expertise lies in translating complex cloud requirements into efficient, reliable, and cost-effective architectures.&lt;br&gt;
Through this article, I aim to share practical insights into building and managing AWS MSK Serverless using Terraform, helping fellow engineers and teams design scalable, secure, and resilient streaming architectures on AWS.&lt;/p&gt;


&lt;h2&gt;
  
  
  ⚡&lt;strong&gt;Amazon MSK: Overview and Types&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;AWS MSK is a fully managed service that makes it easy to run Apache Kafka on AWS without managing the infrastructure yourself. Kafka is a distributed streaming platform used for building real-time data pipelines and streaming apps.&lt;/p&gt;

&lt;p&gt;AWS MSK offers two deployment types:&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;1. Amazon MSK Provisioned&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;You manage the cluster capacity by choosing instance types and the number of brokers. It offers more control over performance, scaling, and configuration, making it suitable for predictable workloads and production environments that need fine-tuning.&lt;/p&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;2. Amazon MSK Serverless&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;AWS automatically manages capacity, scaling, and broker infrastructure. You don’t need to choose instance types or manage brokers. It is ideal for variable or unpredictable workloads and for teams that want minimal operational overhead.&lt;/p&gt;


&lt;h2&gt;
  
  
  ⚙️&lt;strong&gt;Core Components and Their Functionality&lt;/strong&gt;
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Broker Nodes:&lt;/strong&gt; When you create an Amazon MSK cluster, you define the number of broker nodes per Availability Zone (minimum one per AZ). In MSK Provisioned, you can choose between Standard and Express broker types. In MSK Serverless, broker management is handled automatically, and you only configure cluster-level capacity.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;ZooKeeper Nodes:&lt;/strong&gt; Amazon MSK automatically provisions Apache ZooKeeper nodes to support reliable cluster coordination.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;KRaft Controllers:&lt;/strong&gt; KRaft is Kafka’s modern metadata management mode that replaces ZooKeeper. Metadata is managed internally by Kafka controllers, with no additional setup or cost required.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Producers, Consumers, and Topics:&lt;/strong&gt; You can use standard Kafka operations to create topics and publish or consume data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Cluster Operations:&lt;/strong&gt; You manage clusters using the AWS Console, AWS CLI, or SDKs to perform actions such as creating, updating, viewing, or deleting clusters.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;
  
  
  🎯&lt;strong&gt;Learning Objectives and Hands-On Walkthrough&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;This article demonstrates how to design and implement reusable Terraform modules for core infrastructure components such as the VPC and Amazon MSK, while supporting multiple environments (for example, Dev and UAT) through dedicated environment configurations using terraform.tfvars and variable definition files. It also covers configuring a remote Terraform backend using Amazon S3 to securely store and manage the Terraform state.&lt;br&gt;
The solution provisions a complete Amazon MSK infrastructure, including the VPC, subnets, security groups, IAM roles, the MSK cluster, and a client EC2 instance. With a small set of Terraform commands, you can reliably create, update, and decommission the entire environment in a consistent and repeatable manner across environments.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Resources Covered in This Guide:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A VPC with public and private subnets across three Availability Zones.&lt;/li&gt;
&lt;li&gt;Full networking setup, including Internet Gateway and route tables.&lt;/li&gt;
&lt;li&gt;A secure Amazon MSK Serverless cluster.&lt;/li&gt;
&lt;li&gt;An EC2 instance configured with Kafka tools and authentication.&lt;/li&gt;
&lt;li&gt;IAM roles and security groups to enable secure communication between the EC2 instance and the MSK cluster.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚀&lt;strong&gt;Let’s begin!&lt;/strong&gt;&lt;/p&gt;


&lt;h3&gt;
  
  
  📋&lt;strong&gt;1. Prerequisites&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS Account:&lt;/strong&gt; Ensure you have an AWS account with programmatic access and that your AWS credentials are configured locally using the AWS CLI.
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;S3 Bucket:&lt;/strong&gt; S3 Bucket for remote backend. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Terraform Setup:&lt;/strong&gt; Make sure Terraform is installed on your local environment.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;
  
  
  &lt;strong&gt;Installing Terraform&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Terraform is straightforward to set up. The following sections provide installation instructions for the most common operating systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;On macOS (using Homebrew):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;brew tap hashicorp/tap
brew install hashicorp/tap/terraform
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;On Windows (using Chocolatey):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;choco install terraform
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;&lt;strong&gt;On Linux (Debian/Ubuntu):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;wget -O- [https://apt.releases.hashicorp.com/gpg](https://apt.releases.hashicorp.com/gpg) | sudo gpg --dearmor -o /usr/share/keyrings/hashicorp-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] [https://apt.releases.hashicorp.com](https://apt.releases.hashicorp.com) $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list
sudo apt update &amp;amp;&amp;amp; sudo apt install terraform
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;After installation, verify it's working by running:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  💻&lt;strong&gt;2. Deep Dive into the Terraform Code&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The following directory structure shows the Terraform modules for provisioning the VPC and Amazon MSK Serverless.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwqqdbvxfxt9fxts0dolp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwqqdbvxfxt9fxts0dolp.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Core Networking Resources (VPC, Subnets, Gateways &amp;amp; Route Tables)&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This section establishes the core network infrastructure for our environment. We begin by creating a new VPC, then provision three public and three private subnets, one in each AWS Availability Zone, to ensure high availability. The Internet Gateway, NAT Gateway, and route tables are configured to provide secure internet connectivity for the EC2 instance.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Security Controls (Security Groups and SSH Access)&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SSH Key:&lt;/strong&gt; Terraform automatically generates an RSA key pair. The public key is uploaded to AWS using &lt;code&gt;aws_key_pair&lt;/code&gt;, while the private key is stored locally as &lt;code&gt;msk-client-key.pem&lt;/code&gt;, allowing secure SSH access to the EC2 instance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Groups:&lt;/strong&gt; Two security groups are created: one for the MSK cluster and one for the EC2 client. The rules allow unrestricted communication between the EC2 instance and the MSK cluster, while limiting inbound internet access to the EC2 instance to SSH traffic only.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;



&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;IAM Permissions for EC2&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Instead of embedding AWS access keys, we use an IAM role to grant permissions securely. This configuration creates an &lt;code&gt;aws_iam_role&lt;/code&gt; that the EC2 instance can assume. An attached &lt;code&gt;aws_iam_policy&lt;/code&gt; provides the required permissions to connect to the MSK cluster, describe resources, and read from or write to Kafka topics. This approach follows AWS security best practices and is the recommended way to manage service access.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Amazon MSK Serverless Cluster&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This resource creates an Amazon MSK Serverless cluster with a configurable name. It deploys the cluster within the specified subnets and associates it with the MSK security group for secure network access. IAM-based SASL authentication is enabled to allow secure client access using IAM roles. Resource tags are also applied for easier management and identification.&lt;/p&gt;
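&lt;p&gt;As a reference, the cluster resource inside the MSK module may look roughly like the following sketch. Variable names such as &lt;code&gt;var.cluster_name&lt;/code&gt; and &lt;code&gt;var.subnet_ids&lt;/code&gt; are assumptions; see the repository for the actual module code:&lt;/p&gt;

```hcl
# Sketch of an MSK Serverless cluster with IAM SASL auth enabled.
resource "aws_msk_serverless_cluster" "this" {
  cluster_name = var.cluster_name

  vpc_config {
    subnet_ids         = var.subnet_ids
    security_group_ids = [aws_security_group.msk.id]
  }

  client_authentication {
    sasl {
      iam {
        enabled = true
      }
    }
  }

  tags = var.tags
}
```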

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Kafka Client EC2 Instance&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This instance acts as the Kafka client for our setup. We provision a t2.micro EC2 instance and use a &lt;code&gt;user_data&lt;/code&gt; script that runs automatically during the first boot to configure the environment.&lt;br&gt;
The script performs the following tasks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Installs Java&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Downloads and extracts the required Kafka version&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Installs the AWS MSK IAM Authentication library for secure access&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Creates the &lt;code&gt;client.properties&lt;/code&gt; file with the necessary configuration for IAM-based authentication.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This ensures the EC2 instance is fully prepared to connect to and interact with the MSK cluster.&lt;/p&gt;
&lt;h3&gt;
  
  
  ☁️&lt;strong&gt;3. Provisioning the Infrastructure&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Note: The full Terraform scripts are available here⬇️: &lt;br&gt;


&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/pratiksponde" rel="noopener noreferrer"&gt;
        pratiksponde
      &lt;/a&gt; / &lt;a href="https://github.com/pratiksponde/AWS-MSK-Terraform-code" rel="noopener noreferrer"&gt;
        AWS-MSK-Terraform-code
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      This repo contains terraform module code to provision AWS VPC and MSK Serverless cluster 
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;🚀 AWS MSK Using Terraform: Multi-Environment Deployment Guide&lt;/h1&gt;
&lt;/div&gt;

&lt;p&gt;This repository contains Terraform infrastructure code to deploy &lt;strong&gt;Amazon MSK Serverless&lt;/strong&gt; across multiple environments such as &lt;strong&gt;Dev&lt;/strong&gt; and &lt;strong&gt;UAT&lt;/strong&gt; using a modular and scalable architecture.&lt;/p&gt;

&lt;p&gt;It provisions networking, security, IAM roles, MSK Serverless cluster, and a Kafka client EC2 instance to enable secure Kafka communication.&lt;/p&gt;

&lt;p&gt;📖 &lt;strong&gt;Full article:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://dev.to/pratik_26/deploying-amazon-msk-serverless-across-multiple-environments-with-terraform-2i8c" rel="nofollow"&gt;https://dev.to/pratik_26/deploying-amazon-msk-serverless-across-multiple-environments-with-terraform-2i8c&lt;/a&gt;&lt;/p&gt;

&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;🏗️ Architecture Overview&lt;/h1&gt;
&lt;/div&gt;
&lt;p&gt;The infrastructure created by this project includes:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;VPC with public and private subnets&lt;/li&gt;
&lt;li&gt;Internet Gateway and routing&lt;/li&gt;
&lt;li&gt;Amazon MSK Serverless cluster&lt;/li&gt;
&lt;li&gt;EC2 instance as Kafka client&lt;/li&gt;
&lt;li&gt;IAM roles and policies&lt;/li&gt;
&lt;li&gt;Security groups&lt;/li&gt;
&lt;li&gt;Remote Terraform state stored in S3&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;📂 Repository Structure&lt;/h1&gt;
&lt;/div&gt;
&lt;div class="snippet-clipboard-content notranslate position-relative overflow-auto"&gt;
&lt;pre class="notranslate"&gt;&lt;code&gt;AWS-MSK-Terraform-code/
├── modules/
│   ├── vpc/
│   │   ├── main.tf
│   │   ├── variables.tf
│   │   └── outputs.tf
│   │
│   └── msk/
│       ├── main.tf
│       ├── variables.tf
│       └── outputs.tf
│
├── environments/
│   ├── dev/
│   │   ├── backend.tf
│   │   ├── main.tf
│   │   ├──&lt;/code&gt;&lt;/pre&gt;…&lt;/div&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/pratiksponde/AWS-MSK-Terraform-code" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;




&lt;h4&gt;
  
  
  Step 1: Configure the Remote Backend (Dev Environment)
&lt;/h4&gt;

&lt;p&gt;The first step is to configure the remote backend for the Dev environment. This is done by updating the backend configuration under the path:&lt;br&gt;
&lt;strong&gt;Environment → Dev → backend.tf&lt;/strong&gt;&lt;br&gt;
The following configuration uses Amazon S3 to store the Terraform state file securely and enables state locking to prevent concurrent modifications:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  backend "s3" {
    bucket        = "your-bucket-name"
    key           = "msk/dev/terraform.tfstate"
    region        = "region-of-bucket"
    use_lockfile  = true
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This setup ensures that the Terraform state is centrally managed, secure, and consistent when working across environments or teams.&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 2: Run Terraform from the Correct Environment Directory
&lt;/h4&gt;

&lt;p&gt;Before executing any Terraform commands, ensure that you are in the correct environment directory. For the Dev environment, navigate to:&lt;br&gt;
&lt;strong&gt;Environment → Dev&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Initialize Terraform&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Run the following command to initialize the working directory. This step downloads the required AWS provider plugins and configures the backend.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform init
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Plan the Deployment&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This command performs a dry run and shows a detailed preview of the resources Terraform will create, modify, or delete without making any changes.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Apply the Configuration&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This command executes the planned changes and provisions the resources in your AWS account. Confirm the operation by typing yes when prompted.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform apply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  ✅&lt;strong&gt;4. Testing and Verifying the Cluster Setup&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;After terraform apply completes successfully, follow the steps below to validate that the setup is working as expected.&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 1: Get the EC2 Public IP
&lt;/h4&gt;

&lt;p&gt;Retrieve the public IP address of the EC2 instance from the Terraform outputs. &lt;/p&gt;

&lt;h4&gt;
  
  
  Step 2: SSH into the EC2 Instance
&lt;/h4&gt;

&lt;p&gt;The private key file &lt;code&gt;msk-client-key.pem&lt;/code&gt; is saved in your project directory. Ensure it has the correct permissions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;chmod 400 msk-client-key.pem
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then connect to the instance using SSH:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ssh -i "msk-client-key.pem" ec2-user@&amp;lt;YOUR_EC2_PUBLIC_IP&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Step 3: Get the Bootstrap Brokers String
&lt;/h4&gt;

&lt;p&gt;In your &lt;strong&gt;local terminal&lt;/strong&gt; (not inside the SSH session), retrieve the MSK bootstrap brokers string:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform output bootstrap_brokers
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy this value. You will use it in the Kafka commands.&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 4: Create a Kafka Topic
&lt;/h4&gt;

&lt;p&gt;Inside the EC2 SSH session, navigate to the Kafka bin directory and create a topic:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bin/kafka-topics.sh --create \
  --bootstrap-server &amp;lt;bootstrapServerString&amp;gt; \
  --command-config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties \
  --replication-factor 3 \
  --partitions 1 \
  --topic my-first-topic

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Step 5: Start a Producer
&lt;/h4&gt;

&lt;p&gt;In the same terminal, start the Kafka console producer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bin/kafka-console-producer.sh \
  --bootstrap-server &amp;lt;bootstrapServerString&amp;gt; \
  --producer.config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties \
  --topic my-first-topic

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You will see a &amp;gt; prompt. Type a message such as:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Hello from Terraform!&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;and press Enter.&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 6: Start a Consumer (in a New Terminal)
&lt;/h4&gt;

&lt;p&gt;Open another terminal window and SSH into the EC2 instance again. Then run the consumer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;bin/kafka-console-consumer.sh \
  --bootstrap-server &amp;lt;bootstrapServerString&amp;gt; \
  --consumer.config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties \
  --topic my-first-topic \
  --from-beginning

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see the message &lt;em&gt;&lt;strong&gt;"Hello from Terraform!"&lt;/strong&gt;&lt;/em&gt; appear immediately.&lt;br&gt;
This confirms that your MSK cluster, authentication, and connectivity are working correctly.&lt;/p&gt;


&lt;h3&gt;
  
  
  🗑️&lt;strong&gt;5. Cleaning Up Resources&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To avoid unnecessary AWS charges, remember to delete the infrastructure once you are done. One of the advantages of using Terraform is that cleanup can be performed with a single command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform destroy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When prompted, type yes to confirm. Terraform will then safely and systematically remove all resources that were created.&lt;/p&gt;
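&lt;p&gt;To double-check that nothing was left behind, you can list what Terraform still tracks; after a successful destroy this should print no output:&lt;/p&gt;

```shell
# The state should be empty once `terraform destroy` has completed.
terraform state list
```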




&lt;h2&gt;
  
  
  💸&lt;strong&gt;Cost Optimization Tips&lt;/strong&gt;
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Use MSK Serverless for variable or unpredictable workloads.&lt;/li&gt;
&lt;li&gt;Delete unused topics regularly.&lt;/li&gt;
&lt;li&gt;Compress messages to minimize data transfer and storage.&lt;/li&gt;
&lt;li&gt;Avoid large message payloads where possible.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;For detailed Amazon MSK pricing information, refer to &lt;a href="https://aws.amazon.com/msk/pricing/" rel="noopener noreferrer"&gt;https://aws.amazon.com/msk/pricing/&lt;/a&gt;.&lt;/p&gt;
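&lt;p&gt;As an example of the compression tip above, the Kafka console producer can compress messages with a producer property. This is a sketch against the same topic as before; valid values for the property include gzip, snappy, lz4, and zstd:&lt;/p&gt;

```shell
# Sketch: enable gzip compression on the console producer to reduce
# data transfer and storage on the broker side.
bin/kafka-console-producer.sh \
  --bootstrap-server <bootstrapServerString> \
  --producer.config /home/ec2-user/kafka_2.13-3.6.0/bin/client.properties \
  --producer-property compression.type=gzip \
  --topic my-first-topic
```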




&lt;h2&gt;
  
  
  💡&lt;strong&gt;Final Thoughts&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;In this article, we &lt;strong&gt;designed&lt;/strong&gt; and &lt;strong&gt;implemented&lt;/strong&gt; a complete &lt;strong&gt;Amazon MSK Serverless environment&lt;/strong&gt; using a &lt;strong&gt;modular&lt;/strong&gt; and &lt;strong&gt;reusable Terraform approach&lt;/strong&gt;. By separating infrastructure into &lt;strong&gt;well-structured modules&lt;/strong&gt; for &lt;strong&gt;VPC&lt;/strong&gt; and &lt;strong&gt;MSK&lt;/strong&gt;, and managing multiple &lt;strong&gt;environments&lt;/strong&gt; such as &lt;strong&gt;Dev&lt;/strong&gt; and &lt;strong&gt;UAT&lt;/strong&gt; with &lt;strong&gt;environment-specific configurations&lt;/strong&gt; and &lt;strong&gt;remote state stored in Amazon S3&lt;/strong&gt;, we achieved a solution that is &lt;strong&gt;scalable&lt;/strong&gt;, &lt;strong&gt;maintainable&lt;/strong&gt;, and aligned with &lt;strong&gt;infrastructure-as-code best practices&lt;/strong&gt;.  &lt;/p&gt;

&lt;p&gt;This approach not only &lt;strong&gt;simplifies provisioning and management&lt;/strong&gt; but also improves &lt;strong&gt;consistency across environments&lt;/strong&gt; and reduces &lt;strong&gt;operational risk&lt;/strong&gt;. With &lt;strong&gt;automated deployment&lt;/strong&gt;, &lt;strong&gt;secure authentication using IAM&lt;/strong&gt;, and an &lt;strong&gt;EC2-based client&lt;/strong&gt; for &lt;strong&gt;validation&lt;/strong&gt;, you now have a &lt;strong&gt;solid foundation&lt;/strong&gt; for building and operating &lt;strong&gt;real-time streaming solutions on AWS&lt;/strong&gt;.  &lt;/p&gt;

&lt;p&gt;You can further extend this setup by &lt;strong&gt;integrating monitoring&lt;/strong&gt;, &lt;strong&gt;enhancing security controls&lt;/strong&gt;, and adding &lt;strong&gt;CI/CD pipelines&lt;/strong&gt; to &lt;strong&gt;automate infrastructure changes&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  ✅ &lt;strong&gt;Wrapping Up&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Thanks for taking the time to explore AWS MSK Serverless with Terraform! I hope this article has helped you understand how to build a scalable, secure, and maintainable streaming data infrastructure. Whether you are a seasoned engineer or just starting with AWS, applying these concepts can make a real difference in managing real-time workloads efficiently.&lt;/p&gt;




&lt;h2&gt;
  
  
  💬 &lt;strong&gt;Let’s Keep the Conversation Going&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Have thoughts, questions, or experience with MSK to share? I would love to hear from you! Feel free to leave a comment or connect with me on &lt;em&gt;&lt;a href="//www.linkedin.com/in/pratikponde"&gt;LinkedIn&lt;/a&gt;&lt;/em&gt;. Let's learn and grow together as a community of builders.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;Keep exploring, keep automating and see you in the next one!&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>terraform</category>
      <category>ai</category>
    </item>
    <item>
      <title>AWS Backup Explained Simply : Use Cases, Setup &amp; Recovery</title>
      <dc:creator>Pratik Ponde</dc:creator>
      <pubDate>Mon, 04 Aug 2025 14:42:06 +0000</pubDate>
      <link>https://forem.com/pratik_26/aws-backup-explained-simply-use-cases-setup-recovery-43ao</link>
      <guid>https://forem.com/pratik_26/aws-backup-explained-simply-use-cases-setup-recovery-43ao</guid>
      <description>&lt;p&gt;👋 Hey there! This is Pratik, a Senior DevOps Consultant with a strong background in automating and optimizing cloud infrastructure, particularly on AWS. Over the years, I have designed and implemented scalable solutions for enterprises, focusing on infrastructure as code, CI/CD pipelines, cloud security, and resilience. My expertise lies in translating complex cloud requirements into efficient, reliable, and cost-effective architectures.&lt;/p&gt;

&lt;p&gt;Through this article, I aim to share practical insights into AWS Backup, helping fellow engineers and teams strengthen their cloud resilience.&lt;/p&gt;




&lt;h2&gt;
  
  
  What is AWS Backup?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;AWS Backup is a fully managed backup service that allows you to automate and centrally manage backups across AWS services like EC2, EBS, RDS, S3, DynamoDB, Aurora, and more.&lt;/li&gt;
&lt;li&gt;In this guide, we will walk through real use cases and step-by-step instructions to implement and restore backups effectively.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8294ogsmd263aosylon6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8294ogsmd263aosylon6.png" alt=" " width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  AWS Backup Features Overview
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Supports Key AWS Services&lt;/strong&gt;: EC2, EBS, S3, EFS, RDS, DynamoDB, Aurora, Storage Gateway, and more.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Backup Vault&lt;/strong&gt;: Central encrypted vault for storing backup copies.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Incremental Backups&lt;/strong&gt;: Initial full backup followed by incremental changes to reduce storage costs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cross-Region Backups&lt;/strong&gt;: Protect against regional outages.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automated Backup Plans&lt;/strong&gt;: Schedule backups on a daily, weekly, or monthly basis.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost Estimation&lt;/strong&gt;: AWS Pricing Calculator helps forecast backup costs.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Use Case 1: Backing Up EC2 Instances
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Pre-requisites&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An EC2 Linux instance with a web server (e.g., httpd) installed and a sample index.html file.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Steps&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Create a Backup Plan&lt;/strong&gt;: Define frequency and retention rules.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7j34cvcqcjo9ywiooo2d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7j34cvcqcjo9ywiooo2d.png" alt=" " width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Create or Use a Backup Vault&lt;/strong&gt;: Encrypts data at rest and in transit.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F40v3bpwlflz86bnsyu4x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F40v3bpwlflz86bnsyu4x.png" alt=" " width="800" height="389"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Assign Resources&lt;/strong&gt;: Choose EC2 instances via tags or specific IDs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faadxlwvfm4zleq76rukt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faadxlwvfm4zleq76rukt.png" alt=" " width="751" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Create IAM Role&lt;/strong&gt;: Grant necessary permissions (EC2 and AWS Backup full access).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7whav2pyvylcb4xqrr5c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7whav2pyvylcb4xqrr5c.png" alt=" " width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Test Restore&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Terminate the EC2 instance.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbi8m3mtcn2tf9u6gt9o1.png" alt=" " width="800" height="307"&gt;
&lt;/li&gt;
&lt;li&gt;Restore using AWS Backup.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft2clwnotnqiwwhcybxa6.png" alt=" " width="800" height="228"&gt;
&lt;/li&gt;
&lt;li&gt;Confirm the web server (index.html) is running after restore.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6vyrjymg5i283bf124h.png" alt=" " width="800" height="209"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
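&lt;p&gt;The console steps above can also be expressed with the AWS CLI. The sketch below writes a daily backup plan definition to a local JSON file; the vault, plan, and rule names are illustrative, and the two &lt;code&gt;aws backup&lt;/code&gt; commands need valid credentials, so they are shown commented for reference:&lt;/p&gt;

```shell
# Write an illustrative daily backup plan (retain recovery points for 35 days).
cat > backup-plan.json <<'EOF'
{
  "BackupPlanName": "daily-ec2-plan",
  "Rules": [
    {
      "RuleName": "daily",
      "TargetBackupVaultName": "my-backup-vault",
      "ScheduleExpression": "cron(0 5 * * ? *)",
      "Lifecycle": { "DeleteAfterDays": 35 }
    }
  ]
}
EOF

# These require valid AWS credentials, so they are shown commented:
# aws backup create-backup-vault --backup-vault-name my-backup-vault
# aws backup create-backup-plan --backup-plan file://backup-plan.json
```

&lt;p&gt;Resource assignment (the tag- or ID-based selection described above) would then follow with &lt;code&gt;aws backup create-backup-selection&lt;/code&gt;.&lt;/p&gt;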




&lt;h2&gt;
  
  
  Use Case 2: Backing Up S3 Buckets
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Pre-requisites&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;S3 bucket with versioning enabled.&lt;/li&gt;
&lt;li&gt;IAM role with S3 and AWS Backup permissions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Steps&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create or configure a versioned S3 bucket.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnw3za4lx4jylke2tn8ux.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnw3za4lx4jylke2tn8ux.png" alt=" " width="" height=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set up an on-demand or scheduled backup plan.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7g070eyui6uer6rsx0i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7g070eyui6uer6rsx0i.png" alt=" " width="800" height="245"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Assign S3 resources in the backup plan.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Test Restore:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Delete objects in the bucket.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zq55q8kciiudk7tcgag.png" alt=" " width="773" height="414"&gt;
&lt;/li&gt;
&lt;li&gt;Use the restore job to recover data.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6n2ddi0zb4epfj1eju20.png" alt=" " width="800" height="377"&gt;
&lt;/li&gt;
&lt;li&gt;Monitor the restore job status and verify object restoration.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy7jqyd8bx9l56pjkn3dl.png" alt=" " width="800" height="240"&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: AWS Backup supports all S3 storage classes except Glacier Deep Archive and some Glacier Flexible Retrieval scenarios.&lt;/p&gt;
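&lt;p&gt;For reference, an on-demand S3 backup can also be started from the AWS CLI. The bucket and role ARNs below are placeholders, and the command requires valid credentials:&lt;/p&gt;

```shell
# Sketch: start an on-demand backup of a versioned bucket into an existing vault.
aws backup start-backup-job \
  --backup-vault-name my-backup-vault \
  --resource-arn arn:aws:s3:::my-versioned-bucket \
  --iam-role-arn arn:aws:iam::123456789012:role/my-backup-role
```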




&lt;h2&gt;
  
  
  Use Case 3: Backing Up Databases (RDS/Aurora)
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Steps&lt;/strong&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a new RDS database.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Assign it to a backup plan using tags or direct selection.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create or attach an IAM role with RDS and AWS Backup permissions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Perform restore testing by simulating failure or deletion.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
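&lt;p&gt;When testing a restore, the first step is locating a recovery point. One way to do this from the CLI (the vault name is illustrative; the command requires valid credentials):&lt;/p&gt;

```shell
# List recovery points stored in the vault; note the RecoveryPointArn
# of the latest one before starting a restore job.
aws backup list-recovery-points-by-backup-vault \
  --backup-vault-name my-backup-vault
```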




&lt;h2&gt;
  
  
  Backup Strategy Behind AWS Backup
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Full + Incremental&lt;/strong&gt;: The first backup is full, subsequent backups store only changes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Efficient Storage&lt;/strong&gt;: Saves space and time, though full restores may take longer.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Cost Optimization Tips
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Use Cost Allocation Tags to track backup-related expenses.&lt;/li&gt;
&lt;li&gt;Monitor usage and trends using AWS Cost Explorer.&lt;/li&gt;
&lt;li&gt;Choose Intelligent Tiering for S3 to balance cost and retrieval needs.&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Refer to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://aws.amazon.com/backup/pricing/" rel="noopener noreferrer"&gt;AWS Backup Pricing&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;




&lt;h2&gt;
  
  
  Common FAQs
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Q1. Can I back up multiple S3 buckets at once?&lt;/strong&gt;&lt;br&gt;
Yes, using an automated backup plan. Manual backups are still one bucket at a time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q2. How is S3 integrated with AWS Backup?&lt;/strong&gt;&lt;br&gt;
Backup policies can be applied to S3 with versioning enabled. You can restore to a point in time and define long-term retention policies.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q3. Is S3 replication a backup method?&lt;/strong&gt;&lt;br&gt;
No. Replication is for real-time data copying, not version-controlled historical backups.&lt;/p&gt;




&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;AWS Backup offers a streamlined, secure way to protect your cloud workloads. With automated plans, incremental backups, and centralized management, it simplifies data protection while ensuring compliance and cost control.&lt;/p&gt;




&lt;h2&gt;
  
  
  ✅ Wrapping Up
&lt;/h2&gt;

&lt;p&gt;Thanks for taking the time to explore AWS Backup with me! I hope this article has helped you better understand how to build a reliable and efficient backup strategy for your cloud workloads. Whether you are a seasoned engineer or just getting started with AWS, applying these concepts can make a real difference in your infrastructure's resilience.&lt;/p&gt;




&lt;h2&gt;
  
  
  🔍 Looking Ahead
&lt;/h2&gt;

&lt;p&gt;This is just one piece of the larger DevOps and cloud automation puzzle. In upcoming posts, I will be diving into topics like infrastructure as code, CI/CD pipelines, and advanced AWS architecture patterns. Be sure to follow for more hands-on guides and real-world insights.&lt;/p&gt;




&lt;h2&gt;
  
  
  💬 Let’s Keep the Conversation Going
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Have thoughts, questions, or your own backup story to share? I would love to hear from you! Feel free to leave a comment or connect with me on &lt;a href="//www.linkedin.com/in/pratikponde"&gt;LinkedIn&lt;/a&gt;. Let's learn and grow together as a community of builders.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Keep exploring, keep automating and see you in the next one!&lt;/em&gt;&lt;/p&gt;




</description>
      <category>aws</category>
      <category>backup</category>
      <category>beginners</category>
      <category>development</category>
    </item>
  </channel>
</rss>
