<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Lilupa Karu</title>
    <description>The latest articles on Forem by Lilupa Karu (@lilupa).</description>
    <link>https://forem.com/lilupa</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1349006%2Fd8607000-dd04-449b-9361-f566c6cef2ff.jpeg</url>
      <title>Forem: Lilupa Karu</title>
      <link>https://forem.com/lilupa</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/lilupa"/>
    <language>en</language>
    <item>
      <title>AWS AI-DLC: Rethinking the SDLC in the Age of AI</title>
      <dc:creator>Lilupa Karu</dc:creator>
      <pubDate>Wed, 04 Feb 2026 09:54:19 +0000</pubDate>
      <link>https://forem.com/lilupa/aws-ai-dlc-rethinking-the-sdlc-in-the-age-of-ai-4hln</link>
      <guid>https://forem.com/lilupa/aws-ai-dlc-rethinking-the-sdlc-in-the-age-of-ai-4hln</guid>
      <description>&lt;p&gt;Lately, my day-to-day world in software engineering has shifted more toward architecture and leadership than hands-on delivery. But over the Christmas break, I had the chance to get closer to the ground again—offering some friendly guidance to help build a SaaS product from scratch. No legacy systems, no existing processes—just a blank canvas and a big idea.&lt;/p&gt;

&lt;p&gt;A former colleague of mine, Edwin, who was deep in the build, asked a familiar question: “Where should I start? Do we do this the same way we always have—Agile, sprints, ceremonies?” I paused, thought it through, and said, “Honestly, no mate. You should try AI-DLC.” He laughed and replied, “I know it has something to do with AI… but what exactly is it?”&lt;/p&gt;

&lt;p&gt;That question led to several long conversations, mapping AI-DLC onto his project and rethinking what modern software development could look like when AI isn’t just a supporting tool, but a first-class participant in how software is designed, built, and operated. This article is my attempt to share that thinking—for anyone ready to step outside the old playbook and explore what’s next.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Is AWS AI-DLC?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;AWS AI-DLC (AI-Driven Development Lifecycle)&lt;/strong&gt; is an AI-native approach to software development that fundamentally rethinks how systems are designed, built, and operated. Instead of treating AI as a supporting tool used at isolated stages, AI-DLC embeds intelligence across the entire lifecycle—from ideation and requirements to implementation, testing, deployment, and operations. AI becomes an active participant in delivery, continuously assisting with decisions, execution, and optimisation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsc9ag0g3v8lcbyacnica.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsc9ag0g3v8lcbyacnica.png" alt="SDLC vs AI-DLC" width="800" height="536"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Crucially, AI-DLC preserves a strong human-in-the-loop model. Engineers and architects define intent, apply domain context, validate outcomes, and retain ownership of production systems, while AI accelerates execution, reduces manual effort, and enforces consistency at scale. The result is a development lifecycle where humans focus on strategy and judgment, and AI handles orchestration—enabling teams to deliver faster, with higher quality and confidence.&lt;/p&gt;

&lt;p&gt;AI-DLC deliberately keeps humans in the driver’s seat, with explicit approval gates for critical decisions, while AI accelerates the work that happens in between.&lt;br&gt;
That’s why &lt;strong&gt;AI-DLC isn’t vibe coding. It delivers speed without sacrificing control&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  How Do We Use AI-DLC?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fidfkplu3v5vyzoldx1oq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fidfkplu3v5vyzoldx1oq.png" alt="How do we use AI-DLC" width="800" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just as you interact with ChatGPT by asking a question, you can initiate an AI-DLC workflow by simply stating your intent. AI-DLC follows a simple yet powerful pattern—one that consistently repeats across every phase of the development lifecycle.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;I want to start an AI-DLC project for an event management application.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;From there, the AI takes over the orchestration. It automatically identifies whether the project is greenfield (new) or brownfield (existing), guides you through the appropriate lifecycle phases, asks clarifying questions to refine requirements, seeks explicit approval before advancing, and continuously tracks progress throughout the journey.&lt;/p&gt;
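&lt;p&gt;To make the pattern concrete, here is a minimal sketch (my own illustration, not official AI-DLC tooling) of a phase-gated workflow in which the AI can only advance with explicit human approval:&lt;/p&gt;

```python
# Sketch of the AI-DLC loop: the AI drives phase progression, but a
# human approval gate sits between phases. Phase names follow the
# Inception / Construction / Operation framework.
PHASES = ["Inception", "Construction", "Operation"]

class AIDLCWorkflow:
    def __init__(self, intent):
        self.intent = intent        # the high-level goal anchoring everything
        self.phase_index = 0
        self.log = []

    @property
    def current_phase(self):
        return PHASES[self.phase_index]

    def advance(self, human_approved):
        """Move to the next phase only with explicit human approval."""
        if not human_approved:
            self.log.append(self.current_phase + ": approval withheld, iterating")
            return self.current_phase
        if self.current_phase != PHASES[-1]:
            self.phase_index += 1
            self.log.append("Advanced to " + self.current_phase)
        return self.current_phase

wf = AIDLCWorkflow("Event management application")
wf.advance(human_approved=False)   # clarifications continue; still in Inception
wf.advance(human_approved=True)    # gate passed; now in Construction
```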




&lt;h2&gt;
  
  
  AI-DLC Core Framework
&lt;/h2&gt;

&lt;p&gt;AWS has formally introduced the AI-DLC methodology and its core philosophy &lt;a href="https://prod.d13rzhkk8cj2z0.amplifyapp.com/" rel="noopener noreferrer"&gt;here&lt;/a&gt;. Let’s explore the key components that define it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpemvlads18e8y2rwgrb6.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpemvlads18e8y2rwgrb6.webp" alt="AI-DLC Core Framework" width="720" height="621"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Inception Phase
&lt;/h3&gt;

&lt;p&gt;This is the planning and definition stage where the project kicks off.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Roles&lt;/strong&gt;: Product Owner, Developers, and AI.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Key Ritual: Mob Elaboration&lt;/strong&gt; (collaboratively defining requirements).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Process&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Starts with a high-level Intent (Goal).&lt;/li&gt;
&lt;li&gt;Breaks down the intent into smaller working pieces (labeled Unit 1, Unit 2... Unit n).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Artefacts (Outputs)&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;PRFAQs (Press Release / Frequently Asked Questions)&lt;/li&gt;
&lt;li&gt;User Stories&lt;/li&gt;
&lt;li&gt;NFRs (Non-Functional Requirements)&lt;/li&gt;
&lt;li&gt;Risks&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Construction Phase
&lt;/h3&gt;

&lt;p&gt;This is the building and testing stage where the "Units" are turned into functional software components.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Roles:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For Construction: Developers and AI.&lt;/li&gt;
&lt;li&gt;For Testing: Product Owner, Developers, and AI.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;&lt;strong&gt;Key Rituals: Mob Construction and Mob Testing&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Process:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Involves iterative design and coding cycles.&lt;/li&gt;
&lt;li&gt;Transforms "Units" into "Bolts" (Bolt 1, Bolt 2... Bolt n).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Artefacts (Outputs):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Domain Design, Logical Design, and Code.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment Units&lt;/strong&gt; that are: Secured, Instrumented, De-Risked, Tested, and Packaged.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Operation Phase
&lt;/h3&gt;

&lt;p&gt;This is the deployment stage where the software goes live.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Roles:&lt;/strong&gt; Product Owner, Developers, and AI.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Process:&lt;/strong&gt; The finalized "Bolts" are moved into the production environment.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Artefacts (Outputs):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Deployment Units active in the Production Environment.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Five Critical Artefacts of AI-DLC
&lt;/h2&gt;

&lt;p&gt;Imagine you’re building a modern event management platform—one that allows organisers to create events, sell tickets, manage capacity in real time, and control entry seamlessly on event day. This is how AI-DLC’s five core artefacts come together in practice.&lt;/p&gt;

&lt;h3&gt;
  
  
  Intent: The North Star
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;What it is&lt;/strong&gt;&lt;br&gt;
Intent captures the why—the business goal or outcome that anchors everything that follows. It is the single source of truth that AI uses to guide decomposition, prioritisation, and decision-making.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example Intent&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“Build a scalable event management application that enables organisers to create and manage events, sell tickets securely, control venue capacity, and handle fast, fraud-free entry on event day.”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This Intent becomes the guiding compass for AI throughout the lifecycle, ensuring every feature, design choice, and deployment decision traces back to this core objective.&lt;/p&gt;

&lt;h3&gt;
  
  
  Unit: The Self-Contained Value Block
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;What it is&lt;/strong&gt;&lt;br&gt;
A Unit is a cohesive, independently valuable slice of functionality. AI decomposes the Intent into Units that deliver measurable outcomes and can be built, tested, and deployed in isolation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example Units&lt;/strong&gt;&lt;br&gt;
From the event management Intent, AI may derive Units such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Event Creation &amp;amp; Configuration&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Ticket Sales &amp;amp; Payments&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Capacity &amp;amp; Availability Management&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Event Day Entry &amp;amp; Check-In&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Reporting &amp;amp; Analytics&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each Unit is loosely coupled, allowing teams to move fast without waiting on other components—perfect for parallel development and incremental releases.&lt;/p&gt;

&lt;h3&gt;
  
  
  Bolt: The High-Velocity Execution Cycle
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;What it is&lt;/strong&gt;&lt;br&gt;
Bolts represent the smallest execution loop in AI-DLC. They focus on rapid, tangible progress—measured in hours or days rather than traditional multi-week sprints.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example Bolts&lt;/strong&gt; (for the “Event Day Entry &amp;amp; Check-In” Unit)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Day 1: Implement QR code–based ticket validation&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Day 2: Add offline check-in support for poor connectivity&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Day 3: Introduce real-time entry count and capacity alerts&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AI plans these Bolts to maximise delivery speed while keeping scope sharply focused, with humans validating quality and user experience.&lt;/p&gt;

&lt;h3&gt;
  
  
  Domain Design: The Business Logic Blueprint
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;What it is&lt;/strong&gt;&lt;br&gt;
Domain Design captures the heart of your business logic using domain-driven design principles—independent of infrastructure or deployment concerns.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example Domain Model&lt;/strong&gt; (Event Management)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Entities: Event, Ticket, Attendee, Venue&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Value Objects: TicketType, CapacityLimit, EntryStatus&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Aggregates: EventSession (enforcing capacity rules)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Domain Events: TicketPurchased, CapacityReached, AttendeeCheckedIn&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Repositories: EventRepository, TicketRepository&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
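&lt;p&gt;As a rough sketch (my own, with invented attribute names), the EventSession aggregate enforcing the CapacityLimit rule and raising the domain events above might look like this:&lt;/p&gt;

```python
from dataclasses import dataclass, field

# Hypothetical domain-model sketch; names follow the model above,
# everything else is illustrative.

@dataclass(frozen=True)
class CapacityLimit:          # value object
    maximum: int

@dataclass
class Ticket:                 # entity
    ticket_id: str
    ticket_type: str

@dataclass
class EventSession:           # aggregate root enforcing capacity rules
    session_id: str
    capacity: CapacityLimit
    tickets: list = field(default_factory=list)
    events: list = field(default_factory=list)   # domain events raised

    def purchase(self, ticket):
        if len(self.tickets) >= self.capacity.maximum:
            self.events.append(("CapacityReached", self.session_id))
            raise ValueError("capacity reached")
        self.tickets.append(ticket)
        self.events.append(("TicketPurchased", ticket.ticket_id))
```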

&lt;p&gt;AI then extends this into Logical Design, recommending architectural patterns (for example, event-driven processing for entry scans) and documenting decisions in Architecture Decision Records (ADRs) for human review and approval.&lt;/p&gt;

&lt;h3&gt;
  
  
  Deployment Units: Ready-to-Ship Deliverables
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;What they are&lt;/strong&gt;&lt;br&gt;
Deployment Units are fully operational, production-ready packages—combining application code, configuration, infrastructure definitions, and automated tests.&lt;/p&gt;

&lt;p&gt;For the event platform, this could include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Ticketing and payment services&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Entry scanning APIs&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Real-time capacity monitoring components&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CI/CD pipelines and infrastructure templates&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each Deployment Unit is validated for scalability, reliability, and security—ensuring the system holds up even during peak event-day traffic.&lt;/p&gt;




&lt;h2&gt;
  
  
  Closing Thoughts
&lt;/h2&gt;

&lt;p&gt;Edwin and I caught up again on a warm midsummer evening, beers in hand, somewhere in town. By then, his project was already past the halfway mark. What stood out wasn’t just the progress, but how smoothly it was moving. No long pauses, no painful rework—just steady momentum. Choosing AI-DLC over the usual Waterfall or even Agile approach paid off. Features were landing fast, working software was in users’ hands, and milestone after milestone quietly ticked itself off.&lt;/p&gt;

&lt;p&gt;If I were starting a new software product today, I wouldn’t think twice—I’d use AI-DLC. Not because it’s trendy, but because it matches where the industry is clearly heading. Humans still set the vision, apply judgment, and take ownership of outcomes. AI simply accelerates the journey from idea to reality. To me, AI-DLC feels like the natural evolution of SDLC—one that leans into the future instead of trying to hold it back.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ai</category>
      <category>softwareengineering</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Enterprise Bedrock Agent for Retail Customer Success</title>
      <dc:creator>Lilupa Karu</dc:creator>
      <pubDate>Fri, 18 Jul 2025 06:01:58 +0000</pubDate>
      <link>https://forem.com/lilupa/enterprise-bedrock-agent-for-retail-customer-success-p56</link>
      <guid>https://forem.com/lilupa/enterprise-bedrock-agent-for-retail-customer-success-p56</guid>
      <description>&lt;p&gt;Amazon Bedrock has quickly become one of the most talked-about platforms in the software world—a buzzword echoing across boardrooms, tech meetups, and cloud architecture discussions alike. Given its growing popularity, there’s little need to cover the basics here. Instead, let me take you into a real-world conversation that brought this technology into a sharp, business-focused context.&lt;/p&gt;

&lt;p&gt;Just a few months ago, I caught up with a friend of mine—a successful retail business owner—who casually brought up his curiosity about integrating generative AI into his operations.&lt;/p&gt;

&lt;p&gt;“What an incredible time to embrace it,” I responded, sensing the potential spark.&lt;/p&gt;

&lt;p&gt;He leaned in and shared his vision:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“I want a smart solution, something like a conversational agent that can understand a customer’s buying needs, streamline the sales process, and—most importantly—if the exact product isn’t available, it should intelligently recommend alternatives based on intent, price, and market trends.”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That’s when it hit me: this wasn’t just a chatbot—this called for an agentic AI system. A solution powered by GenAI that could reason, adapt, and take action with business context in mind.&lt;/p&gt;

&lt;p&gt;We ended up spending the afternoon diving deep into possibilities over two strong black coffees. By the time the mugs were empty, we had mapped out a high-level solution tailored to his business needs—bridging customer intent, product intelligence, and market awareness in one streamlined flow.&lt;/p&gt;

&lt;p&gt;Later that evening, I realized the conversation was worth documenting—not just from a solution standpoint, but through the lens of business value and architectural thinking. This article is the result—a walk through that thought process, focusing on how agentic GenAI systems with AWS Bedrock can unlock real, transformative outcomes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Business Problem
&lt;/h2&gt;

&lt;p&gt;A premium bags and shoes retailer, operating across both physical stores and online channels, faces a recurring challenge: the inability to meet customer expectations when a specific product is unavailable. Whether it's an item that's out of stock or a product that doesn't quite match the customer's style or preferences, the result is the same—missed sales opportunities and, I'd guess, disappointed shoppers.&lt;/p&gt;

&lt;p&gt;This issue arises frequently at the store level and in the digital storefront. Customers often walk away or abandon their carts simply because no suitable alternatives are presented in real time. The lack of intelligent, personalized recommendations at the moment of decision-making is leading to lost revenue and a decline in customer satisfaction.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Agentic AI Solves This Problem
&lt;/h2&gt;

&lt;p&gt;Enter Generative AI and Agentic AI—transformative technologies that can revolutionize how retailers respond to product availability challenges.&lt;/p&gt;

&lt;p&gt;By leveraging Amazon Bedrock, businesses can deploy GenAI-powered agents capable of deeply understanding customer intent, context, and preferences in real time. When a specific product is unavailable, the agent doesn’t simply show generic alternatives—it intelligently curates suggestions that align with the customer’s style, color preferences, price range, and even previous purchase history.&lt;/p&gt;

&lt;p&gt;These agents go beyond basic chatbot interactions. They think and act like digital sales associates, capable of reasoning across a product catalog, identifying patterns, and making personalized recommendations that feel human and thoughtful. They can explain why an alternative product is a good match, handle follow-up questions naturally, and even suggest complementary items—just like an expert salesperson would.&lt;/p&gt;

&lt;p&gt;For example, if a customer is searching for a specific designer handbag that’s out of stock, the agent can dynamically recommend visually similar models, limited-edition pieces, or items trending with similar customer profiles—all within the same conversation.&lt;/p&gt;

&lt;p&gt;This approach not only salvages potentially lost sales but also elevates the customer experience, making shoppers feel understood and supported. The result: increased conversion rates, stronger brand loyalty, and a modern retail experience that meets the expectations of today’s digitally-savvy consumers.&lt;/p&gt;

&lt;h2&gt;
  
  
  High-level Flow
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4cm5hlv0a8y8j0xqwz5i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4cm5hlv0a8y8j0xqwz5i.png" alt=" " width="800" height="216"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To bring the GenAI-powered agent to life, while thinking through the architecture I decided to use a combination of Amazon Bedrock, Amazon OpenSearch Service, Amazon API Gateway, AWS Lambda, and a few other key AWS services to enable a scalable, secure, and intelligent experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnk0f4gfy093fvmxm9le2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnk0f4gfy093fvmxm9le2.png" alt=" " width="800" height="468"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture Components
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Customer Interaction Layer
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4k4oqo960xpag979ejgi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4k4oqo960xpag979ejgi.png" alt=" " width="800" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Customer Chat Interface
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: This is the front-end application (web, mobile, smart kiosk) where customers send their queries. I haven't covered these front ends in detail here, as the primary focus is the Agentic AI solution.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Provides the user experience for interaction with the AI agent.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Amazon API Gateway
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Acts as the single entry point for all customer queries. It handles request routing, authorization, rate limiting, and acts as a proxy to the back-end Lambda functions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Security: Integrates with Amazon Cognito for user authentication and authorization.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Query Handler Lambda Function
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A serverless compute service that executes code in response to events.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Receives the raw customer query from API Gateway. It can perform initial validation or pre-processing of the query before passing it to the next stage.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scalability: Automatically scales with the number of incoming requests.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
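&lt;p&gt;A hedged sketch of what this function might look like (the payload shape, queue URL, and injected SQS client are my assumptions, not published code). Validation stays pure so it can be tested without AWS, and the handler forwards accepted queries to the ingestion queue:&lt;/p&gt;

```python
import json

# Illustrative query-handler Lambda: validate, then hand off to SQS.

def validate_query(body):
    """Pure validation so it can be unit-tested without AWS."""
    query = (body.get("query") or "").strip()
    if not query:
        raise ValueError("empty query")
    if len(query) > 2000:
        raise ValueError("query too long")
    return query

def handler(event, context, sqs_client=None, queue_url="QUEUE_URL"):
    try:
        query = validate_query(json.loads(event.get("body") or "{}"))
    except (ValueError, json.JSONDecodeError) as exc:
        return {"statusCode": 400, "body": json.dumps({"error": str(exc)})}
    # In production, sqs_client is boto3.client("sqs"); injected here for testability.
    if sqs_client is not None:
        sqs_client.send_message(QueueUrl=queue_url,
                                MessageBody=json.dumps({"query": query}))
    return {"statusCode": 202, "body": json.dumps({"status": "queued"})}
```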

&lt;h3&gt;
  
  
  Core AI &amp;amp; Data Processing Layer
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzs0ghtomc115dsc9aag.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frzs0ghtomc115dsc9aag.png" alt=" " width="800" height="611"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Query Ingestion SQS Queue
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A fully managed message queuing service that enables us to decouple and scale microservices, distributed systems, and serverless applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Provides asynchronous processing of customer queries. This prevents the API Gateway from timing out for longer processing queries and decouples the ingestion from the processing, improving resilience.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Product Availability Check Lambda Function
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Role: Consumes messages from the SQS queue. It queries the &lt;code&gt;DynamoDB: Product Catalog/Inventory&lt;/code&gt; to check if the requested product is available. If unavailable, it extracts relevant keywords or product identifiers to initiate the search for alternatives.&lt;/li&gt;
&lt;/ul&gt;
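&lt;p&gt;A simplified sketch of that logic (a plain dict stands in for the DynamoDB table, and the catalog attribute names are invented):&lt;/p&gt;

```python
# Illustrative availability check plus keyword extraction for the
# alternative-product search.

def check_availability(inventory, product_id):
    """True when the product exists and has stock on hand."""
    item = inventory.get(product_id)
    return bool(item) and item.get("stock", 0) > 0

def extract_keywords(item):
    """Pull searchable attributes from a catalog record."""
    keys = ("brand", "category", "colour", "style")
    return [str(item[k]) for k in keys if item.get(k)]

inventory = {
    "BAG-101": {"stock": 0, "brand": "Aria", "category": "handbag",
                "colour": "tan", "style": "tote"},
}
if not check_availability(inventory, "BAG-101"):
    terms = extract_keywords(inventory["BAG-101"])   # seeds the alternatives search
```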

&lt;h4&gt;
  
  
  Product Catalog/Inventory DynamoDB
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A fast and flexible NoSQL database service for single-digit millisecond performance at any scale.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Stores the real-time product catalog and inventory information. This allows for quick lookup of product availability.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  OpenSearch Query Handler Lambda
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Role: If a product is unavailable, this function constructs a query for Amazon OpenSearch Service. It uses the extracted keywords and potentially context from the initial query to perform both keyword and vector (semantic similarity) searches.&lt;/li&gt;
&lt;/ul&gt;
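&lt;p&gt;For illustration, the query body such a function might build could combine a keyword match with a k-NN vector clause (the index field names &lt;code&gt;description&lt;/code&gt; and &lt;code&gt;embedding&lt;/code&gt; are assumptions about the mapping, not a fixed schema):&lt;/p&gt;

```python
# Illustrative hybrid query: lexical match on the description field plus
# a k-NN clause against a vector field, combined in a bool "should".

def build_hybrid_query(keywords, embedding, k=5):
    return {
        "size": k,
        "query": {
            "bool": {
                "should": [
                    {"match": {"description": " ".join(keywords)}},
                    {"knn": {"embedding": {"vector": embedding, "k": k}}},
                ]
            }
        },
    }

q = build_hybrid_query(["tan", "tote", "handbag"], [0.1, 0.2, 0.3])
```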

&lt;h4&gt;
  
  
  Amazon OpenSearch Service
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A managed service that makes it easy to deploy, operate, and scale OpenSearch clusters. It supports both keyword-based (lexical) search and vector search (for semantic similarity).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Stores vectorized product descriptions, attributes, and potentially customer reviews. It's crucial for identifying relevant product alternatives based on semantic similarity to the unavailable product or the customer's desired attributes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scalability: Easily scales to handle large product catalogs and high query volumes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Security: Deployed within a VPC for network isolation, and access controlled via IAM policies.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Amazon Bedrock Knowledge Base
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A fully managed capability in Amazon Bedrock that enables you to connect foundation models (FMs) to your company data sources for Retrieval Augmented Generation (RAG).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: This acts as the external knowledge source for the Bedrock Agent. The prepared contextual product data is ingested here. When the Bedrock Agent receives a request, it can query this Knowledge Base to retrieve relevant product information to augment its response. This is crucial for grounding the GenAI model's responses in factual, up-to-date product data, reducing hallucinations.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Amazon Bedrock Agent
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A capability within Amazon Bedrock that allows you to create generative AI applications that can automate multi-step tasks by seamlessly connecting with company systems, APIs, and data sources.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Role: This is the core orchestrator of the GenAI solution.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It receives the user query (potentially enhanced by initial processing).&lt;/li&gt;
&lt;li&gt;It understands the intent ("product unavailability, find alternatives").&lt;/li&gt;
&lt;li&gt;It orchestrates calls to the Bedrock Knowledge Base (via RAG) to get relevant product alternatives.&lt;/li&gt;
&lt;li&gt;It uses a Foundation Model (FM) (e.g., Anthropic Claude, Amazon Titan) to craft a natural, personalised, and persuasive response incorporating the retrieved product recommendations.&lt;/li&gt;
&lt;li&gt;Memory Retention: Bedrock Agents can maintain conversational context across interactions, leading to more human-like and continuous dialogues.&lt;/li&gt;
&lt;li&gt;Code Interpretation: Can be used for complex analytical queries or dynamic content generation if needed.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
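&lt;p&gt;Calling the agent from code might look like the following sketch using boto3's &lt;code&gt;bedrock-agent-runtime&lt;/code&gt; client (the agent and alias IDs are placeholders). &lt;code&gt;invoke_agent&lt;/code&gt; returns a streamed response, so the chunks are collected into the final text:&lt;/p&gt;

```python
# Illustrative Bedrock Agent invocation. collect_text is a pure helper
# that concatenates the text chunks from the response's event stream.

def collect_text(completion):
    """Gather the text chunks from an invoke_agent completion stream."""
    parts = []
    for event in completion:
        chunk = event.get("chunk")
        if chunk and "bytes" in chunk:
            parts.append(chunk["bytes"].decode("utf-8"))
    return "".join(parts)

def ask_agent(query, session_id):
    # Requires AWS credentials and a deployed agent; IDs are placeholders.
    import boto3
    client = boto3.client("bedrock-agent-runtime")
    response = client.invoke_agent(
        agentId="AGENT_ID",
        agentAliasId="ALIAS_ID",
        sessionId=session_id,   # same sessionId keeps conversational context
        inputText=query,
    )
    return collect_text(response["completion"])
```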

&lt;h4&gt;
  
  
  Response Formatter Lambda Function
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Role: Receives the generated response from the Bedrock Agent. It can perform post-processing such as formatting the response for the chat interface, adding specific links, or integrating with other downstream systems. It then sends the final response back to the customer via API Gateway.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Data Storage &amp;amp; Pre-processing
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7otp190dq3pvrgfg9uyg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7otp190dq3pvrgfg9uyg.png" alt=" " width="800" height="180"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Product Data Source Bucket
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: Object storage built to retrieve any amount of data from anywhere.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Serves as a data lake for storing raw product data, images, descriptions, and other unstructured or semi-structured information that will be used to generate embeddings for OpenSearch. It can also store chat logs for analytics.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Security, Monitoring &amp;amp; Management
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fec08mal7znlulnozgfew.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fec08mal7znlulnozgfew.png" alt=" " width="800" height="299"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Amazon Cognito
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A service that provides authentication, authorization, and user management for web and mobile apps.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Manages user identities and provides secure authentication for the chat interface, integrating with API Gateway for authorization.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  AWS IAM
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: Enables us to securely control access to AWS services and resources.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Defines granular permissions for all AWS services and resources. Each Lambda function, Bedrock Agent, and other services will operate under IAM roles with the principle of least privilege, ensuring they only have access to the resources they need.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
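&lt;p&gt;As a sketch of what least privilege looks like in practice, here is a hypothetical policy document for a Lambda function that only needs to read one DynamoDB table. The table name and ARN are placeholders, and in a real deployment the document would be created and attached via boto3 or IaC rather than built inline:&lt;/p&gt;

```python
import json

def least_privilege_policy(table_arn: str) -> dict:
    """Build an IAM policy document granting read-only access to a
    single DynamoDB table -- nothing more. (Illustrative only; the
    table ARN passed in below is a placeholder.)"""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["dynamodb:GetItem", "dynamodb:Query"],
                "Resource": table_arn,
            }
        ],
    }

policy = least_privilege_policy(
    "arn:aws:dynamodb:us-east-1:123456789012:table/ChatHistory")
print(json.dumps(policy, indent=2))
# In a real deployment this document would be passed to
# iam.create_policy(PolicyName=..., PolicyDocument=json.dumps(policy))
# and the resulting policy attached to the Lambda's execution role.
```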

&lt;h4&gt;
  
  
  AWS KMS
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: Makes it easy to create and manage cryptographic keys and control their use across a wide range of AWS services and in our applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Used for encryption at rest for sensitive data stored in DynamoDB, S3, and OpenSearch Service, ensuring data confidentiality.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Amazon VPC
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: Enables us to launch AWS resources into a virtual network.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Provides network isolation for services like OpenSearch Service and potentially Lambda functions connecting to internal resources, enhancing security.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Amazon CloudWatch
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A monitoring and observability service.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Role: Collects logs, metrics, and events from all AWS services in the architecture (Lambda, API Gateway, Bedrock, OpenSearch, DynamoDB). This is critical for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Traceability: End-to-end logging of requests and responses helps trace the flow of information and debug issues.&lt;/li&gt;
&lt;li&gt;Monitoring: Real-time metrics for performance, errors, and usage.&lt;/li&gt;
&lt;li&gt;Alerting: Setting up alarms for anomalies (e.g. increased error rates, high latency) via Amazon SNS.&lt;/li&gt;
&lt;li&gt;CloudWatch Logs Insights: For interactive analysis of log data.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
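&lt;p&gt;The alerting point above can be made concrete with a small sketch: the parameter set for &lt;code&gt;cloudwatch.put_metric_alarm&lt;/code&gt; that raises an SNS notification when a Lambda function errors repeatedly. The function name, topic ARN, and five-error threshold are illustrative choices, not values from the architecture:&lt;/p&gt;

```python
def error_alarm_params(function_name: str, sns_topic_arn: str) -> dict:
    """Parameters for cloudwatch.put_metric_alarm: notify an SNS topic
    when a Lambda function records more than five errors in five
    minutes. (Sketch only -- the names and threshold are placeholders.)"""
    return {
        "AlarmName": f"{function_name}-errors",
        "Namespace": "AWS/Lambda",
        "MetricName": "Errors",
        "Dimensions": [{"Name": "FunctionName", "Value": function_name}],
        "Statistic": "Sum",
        "Period": 300,               # one five-minute window
        "EvaluationPeriods": 1,
        "Threshold": 5,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [sns_topic_arn],
    }

params = error_alarm_params(
    "chat-orchestrator",
    "arn:aws:sns:us-east-1:123456789012:ops-alerts")
# boto3.client("cloudwatch").put_metric_alarm(**params) would create it.
print(params["AlarmName"])
```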

&lt;h4&gt;
  
  
  Amazon Bedrock Guardrails
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A feature within Amazon Bedrock that allows us to implement safety policies and filters for generative AI applications.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Ensures that the GenAI model's responses are safe, on-topic, and do not contain harmful, offensive, or off-brand content. It's crucial for maintaining brand reputation and compliance.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
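&lt;p&gt;To give a feel for the kind of policy Guardrails enforce (denied topics, word filters), here is a deliberately toy filter in plain Python. This is &lt;em&gt;not&lt;/em&gt; the Guardrails API — in production these checks run inside Bedrock against policies you configure — and the topic and word lists are invented for illustration:&lt;/p&gt;

```python
# Toy pre-response filter illustrating what Bedrock Guardrails enforce
# (topic denial, word filters). NOT the Guardrails API: in production
# the checks run inside Bedrock against configured policies.
DENIED_TOPICS = {"investment advice", "medical advice"}  # illustrative
BLOCKED_WORDS = {"offensive-term"}                       # illustrative

def screen_response(text: str) -> str:
    lowered = text.lower()
    for topic in DENIED_TOPICS:
        if topic in lowered:
            return "Sorry, I can't help with that topic."
    for word in BLOCKED_WORDS:
        if word in lowered:
            return "Sorry, I can't repeat that."
    return text

print(screen_response("Our espresso machine ships in two days."))
print(screen_response("Here is some investment advice: buy now!"))
```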

&lt;h3&gt;
  
  
  DevOps &amp;amp; MLOps
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc4sqq3x2mbsbzbvzapbi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc4sqq3x2mbsbzbvzapbi.png" alt=" " width="800" height="195"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  GitHub and GitHub Actions
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: Services for Continuous Integration and Continuous Delivery (CI/CD).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Automate the build, test, and deployment of the Lambda functions, Bedrock Agent configurations, and other infrastructure changes. This ensures rapid, consistent, and reliable deployments.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Amazon SNS
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: A fully managed messaging service for application-to-application (A2A) and application-to-person (A2P) communication.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Sends alerts based on CloudWatch alarms, notifying relevant teams of operational issues or security incidents.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Amazon CloudWatch Dashboards/Logs Insights
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Description: Visualisation and query tools within CloudWatch.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Role: Provides centralised dashboards for monitoring the health and performance of the entire solution, and allows for deep dives into logs for troubleshooting and analysis.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Enterprise Grade Considerations
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Scalability: All chosen AWS services (Lambda, API Gateway, SQS, DynamoDB, OpenSearch, Bedrock) are inherently scalable and managed by AWS, eliminating the need for manual server provisioning or scaling.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Security&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Authentication &amp;amp; Authorization: Amazon Cognito and IAM roles with least privilege.&lt;/li&gt;
&lt;li&gt;Data Encryption: AWS KMS for encryption at rest and SSL/TLS for encryption in transit.&lt;/li&gt;
&lt;li&gt;Network Isolation: VPC for internal services like OpenSearch.&lt;/li&gt;
&lt;li&gt;Content Moderation: Amazon Bedrock Guardrails to prevent harmful or off-brand content.&lt;/li&gt;
&lt;li&gt;PII Handling: Implementation of PII redaction or masking if sensitive customer data is processed.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Traceability: Comprehensive logging with CloudWatch Logs and metrics, allowing for detailed auditing and debugging. Chat history persistence in DynamoDB also aids in traceability and understanding user interactions.&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Reliability &amp;amp; High Availability: Services are designed for high availability by default (e.g. multi-AZ deployments for DynamoDB, OpenSearch, and Bedrock). SQS provides decoupling for resilience.&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Cost Optimisation: Pay-as-you-go model for most services, and serverless compute (Lambda) means you only pay when code runs.&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Observability: Integrated monitoring through CloudWatch and its various features.&lt;/p&gt;&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;There are countless enhancements that can be made to this solution, unlocking endless business possibilities. On a recent call, my friend mentioned that his development team had just delivered the first version of what they called an "intelligent agent" for their e-commerce website. I was thrilled to hear that and immediately said, "Mate, don’t just call it an intelligent agent—give it a name! I truly believe these AI applications will soon become part of human society, and just like us, they deserve names!"&lt;/p&gt;

</description>
      <category>ai</category>
      <category>bedrock</category>
      <category>aws</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Effortless API Scaling: Unlock the Power of AWS AppSync</title>
      <dc:creator>Lilupa Karu</dc:creator>
      <pubDate>Thu, 26 Dec 2024 10:25:21 +0000</pubDate>
      <link>https://forem.com/lilupa/effortless-api-scaling-unlock-the-power-of-aws-appsync-3mhl</link>
      <guid>https://forem.com/lilupa/effortless-api-scaling-unlock-the-power-of-aws-appsync-3mhl</guid>
      <description>&lt;h2&gt;
  
  
  AWS AppSync Journey
&lt;/h2&gt;

&lt;p&gt;During the challenging times of the COVID-19 pandemic in 2020, I embarked on architecting my first AWS AppSync enterprise solution with GraphQL APIs. The project involved integrating several data sources, including a legacy system. After completing the architecture design and undergoing a thorough peer review, our engineering teams began building the solution, with AWS AppSync as the core component. The project was a success, delivering a robust solution that brought smiles to everyone involved. At the time, I had the intention of writing an article about the experience, but, much like the pandemic itself, that plan faded into the background.&lt;/p&gt;

&lt;p&gt;Fast forward to a few months ago, I had a coffee catch up with Brett, an ex-colleague who had been part of that AppSync project. As we reminisced about the incredible time we spent on the project, I was inspired to revisit AWS AppSync and finally write this article. Reflecting on the challenges we overcame and the happy outcomes we achieved, I owe it to Brett's recollection of those moments to share this journey.&lt;/p&gt;

&lt;p&gt;Since AWS AppSync is a serverless implementation of GraphQL, a brief overview of GraphQL is necessary before proceeding.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is GraphQL
&lt;/h2&gt;

&lt;p&gt;The fundamental concept of a GraphQL API is that all API functionality is accessed through a unified query language (GQL) via a single endpoint. Instead of making multiple requests to different endpoints to retrieve various data needed for building a software application, a single request can be issued to a GraphQL API, returning all the necessary data at once. This approach reduces the complexity of modern applications and enhances user experience by enabling faster load times.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwuyj961cdgho4lhw6066.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwuyj961cdgho4lhw6066.png" alt="Image description" width="771" height="561"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The system providing data to a GraphQL API is rarely just a single application. In my experience, I’ve worked with nearly a dozen applications where GraphQL played a key role, but only one of those had a single data source. Typically, the backend consists of multiple microservices or data sources, with the GraphQL API layer responsible for aggregating data from various applications and delivering a unified response to the API requester. This GraphQL layer must efficiently parse the incoming requests, identify the appropriate data sources for each part, and seamlessly combine the results to generate the final response.&lt;/p&gt;

&lt;h3&gt;
  
  
  Components in GraphQL
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;GraphQL Types&lt;/strong&gt;&lt;br&gt;
GraphQL Types define the structure of entities by specifying their fields and attributes. Below is a list of the various types available.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Scalar Type&lt;/li&gt;
&lt;li&gt;Object Type
[In the &lt;code&gt;Object&lt;/code&gt; type, there are three special root operation types: &lt;code&gt;Query&lt;/code&gt;, &lt;code&gt;Mutation&lt;/code&gt;, and &lt;code&gt;Subscription&lt;/code&gt;]&lt;/li&gt;
&lt;li&gt;Input Types&lt;/li&gt;
&lt;li&gt;Enumeration Type&lt;/li&gt;
&lt;li&gt;Union and Interface Type&lt;/li&gt;
&lt;li&gt;Lists and Non-Null&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Schema&lt;/strong&gt;&lt;br&gt;
The schema is treated as a contract between the server and the client. It consists of a collection of types and fields, along with a defined set of operations available for interacting with the data.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;type Project {
    id: ID!
    name: String!
    client: String!
    tasks: [Task!]
  }

type Task {
    title: String!
    description: String
    duration: Int
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Query&lt;/strong&gt;&lt;br&gt;
Queries serve as entry points on a GraphQL server, providing read access to data sources.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  query {
    project (id: 1) {
      name
      client
    }
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Mutations&lt;/strong&gt;&lt;br&gt;
GraphQL Mutations function as entry points on a GraphQL server, allowing write access to data sources. Whenever data needs to be modified—whether creating, updating, or deleting—it is done through GraphQL mutations.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;type Mutation {
  addProject(name: String, client: String): Project
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Resolvers&lt;/strong&gt;&lt;br&gt;
Resolvers are built-in functions in GraphQL that connect types or fields defined in the schema to the corresponding data in the data sources.&lt;/p&gt;
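&lt;p&gt;A minimal sketch of the idea: each field in the schema maps to a function that fetches its data. The in-memory &lt;code&gt;PROJECTS&lt;/code&gt; dict here is an illustrative stand-in for a real data source, and the dispatch table is a simplification of what a GraphQL server does per field:&lt;/p&gt;

```python
# Minimal sketch of resolvers: schema fields mapped to functions that
# fetch data. The "database" is an in-memory dict for illustration.
PROJECTS = {
    1: {"id": 1, "name": "Website Rebuild", "client": "Acme"},
}

def resolve_project(args: dict) -> dict:
    """Resolver for the project(id:) query field."""
    return PROJECTS.get(args["id"])

# The server routes each requested field to its resolver:
RESOLVERS = {"project": resolve_project}

def execute(field: str, args: dict) -> dict:
    return RESOLVERS[field](args)

print(execute("project", {"id": 1})["name"])  # prints "Website Rebuild"
```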

&lt;h2&gt;
  
  
  AppSync Deep Dive
&lt;/h2&gt;

&lt;p&gt;With a basic understanding of GraphQL, attention can now be turned to AWS’s serverless offering of GraphQL. AWS AppSync, a fully managed serverless service, facilitates real-time data queries, synchronisation, and communication. Through AppSync, AWS provides a GraphQL-as-a-Service solution, enabling the seamless creation of scalable and resilient GraphQL APIs in the cloud. Request parsing and resolution, as well as integration with other AWS services such as AWS Lambda, NoSQL and SQL data stores, and HTTP APIs, are all handled to retrieve backend data for the API.&lt;/p&gt;

&lt;h2&gt;
  
  
  Overview and Architecture
&lt;/h2&gt;

&lt;p&gt;Developers are granted access to their data via this managed GraphQL service, which offers several advantages over traditional gateways. GraphQL promotes declarative coding and integrates effectively with modern tools and frameworks such as React, React Native, iOS, and Android. Subscriptions can be utilised to implement live updates, push notifications, and other features.&lt;/p&gt;

&lt;p&gt;When GraphQL subscription operations are invoked, AWS AppSync automatically establishes and maintains a secure WebSocket connection. Data can then be distributed in real-time to subscribers from a data source, while connection and scaling requirements are managed.&lt;/p&gt;

&lt;p&gt;With AWS AppSync, serverless GraphQL and Pub/Sub APIs allow secure querying, modification, or publication of data using a single endpoint.&lt;/p&gt;

&lt;h3&gt;
  
  
  GraphQL APIs
&lt;/h3&gt;

&lt;p&gt;AWS AppSync-built GraphQL APIs empower frontend developers to seamlessly query multiple databases, microservices, and APIs through a single GraphQL endpoint. This simplifies data interaction and accelerates application development.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ezuk9mt681culpa5m5a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ezuk9mt681culpa5m5a.png" alt="Image description" width="800" height="318"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Pub/Sub APIs
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi0uzwc3a8g7hzrbwaaql.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi0uzwc3a8g7hzrbwaaql.png" alt="Image description" width="800" height="292"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  AppSync Use Cases
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Gateway for microservices&lt;/strong&gt;&lt;br&gt;
AppSync serves as a powerful gateway for microservices, allowing you to access and aggregate data from various sources with ease.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Real Time Collaboration&lt;/strong&gt;&lt;br&gt;
AppSync enables scalable, real-time collaboration by broadcasting data from the backend to all connected clients (one-to-many) or facilitating communication between clients (many-to-many). This is ideal for chat applications, shared documents, or live dashboards.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Polyglot data access&lt;/strong&gt;&lt;br&gt;
With AppSync, you can interact with diverse data sources—including SQL databases in Amazon Aurora Serverless, NoSQL tables in Amazon DynamoDB, search indexes in Amazon OpenSearch Service, REST endpoints via Amazon API Gateway, and serverless backends using AWS Lambda—using a single GraphQL call.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Offline Delta Sync&lt;/strong&gt;&lt;br&gt;
AppSync integrates with Amplify DataStore, an on-device persistent storage engine, to enable offline-first capabilities. Data is automatically synchronised between mobile/web apps and the cloud using GraphQL, ensuring a seamless experience even in limited connectivity scenarios.&lt;/p&gt;

&lt;h2&gt;
  
  
  Let’s Build Something Together
&lt;/h2&gt;

&lt;p&gt;Why just read when you can build? Let’s dive into a practical example to explore how AWS AppSync works in action.&lt;/p&gt;

&lt;p&gt;For this demo, we’ll build a simple coffee ordering app—a favourite theme of mine because, let’s face it, life without coffee isn’t life at all.&lt;/p&gt;

&lt;h3&gt;
  
  
  Building the Coffee Order App
&lt;/h3&gt;

&lt;p&gt;Here’s what the app does:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;GraphQL Schema for Orders&lt;br&gt;
Customers can place coffee orders by providing their name, mobile number, and preferred coffee type. Orders are submitted through the &lt;code&gt;createOrder&lt;/code&gt; mutation and stored in a DynamoDB table via a Lambda function.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7fh2vutbwxrcs66gmffw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7fh2vutbwxrcs66gmffw.png" alt="Image description" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9ghwro8qqch6js727aa9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9ghwro8qqch6js727aa9.png" alt="Image description" width="800" height="300"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;&lt;p&gt;Querying Orders&lt;br&gt;
Coffee shop owners can retrieve new orders from the database using the &lt;code&gt;listOrders&lt;/code&gt; GraphQL query. This query also utilises the same Lambda function for data retrieval.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Infrastructure and Frontend&lt;br&gt;
The backend is built entirely with AWS CDK, ensuring it adheres to Infrastructure as Code (IaC) best practices. On the frontend, I’ve created a simple React application that fetches new coffee orders via AppSync’s GraphQL API.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
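&lt;p&gt;Under the hood, the React app calls the API with a single HTTP POST whose body carries the operation and its variables. A sketch of that request body for &lt;code&gt;createOrder&lt;/code&gt; — the argument names (&lt;code&gt;mobile&lt;/code&gt;, &lt;code&gt;coffeeType&lt;/code&gt;) are illustrative; check the schema in the GitHub repo for the exact ones:&lt;/p&gt;

```python
import json

def graphql_request(query: str, variables: dict) -> dict:
    """Body of a GraphQL-over-HTTP POST to the AppSync endpoint."""
    return {"query": query, "variables": variables}

# Illustrative operation -- field/argument names are assumptions, not
# necessarily those of the Vitinya demo schema.
CREATE_ORDER = """
mutation CreateOrder($name: String!, $mobile: String!, $coffee: String!) {
  createOrder(name: $name, mobile: $mobile, coffeeType: $coffee) {
    id
  }
}
"""

body = graphql_request(
    CREATE_ORDER,
    {"name": "Brett", "mobile": "021-555-0199", "coffee": "Flat White"})
# This body is POSTed to the AppSync GraphQL endpoint with an
# x-api-key (or Cognito token) header.
print(json.dumps(body)[:40])
```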

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7rhiwlwkjb07ibycgcw4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7rhiwlwkjb07ibycgcw4.png" alt="Image description" width="800" height="258"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The complete application is available on GitHub for you to explore and adapt.&lt;/p&gt;

&lt;p&gt;AWS CDK-based AppSync APIs: &lt;a href="https://github.com/karu-lk/vitinya-cdk" rel="noopener noreferrer"&gt;Vitinya CDK&lt;/a&gt;&lt;br&gt;
React Front-end web app: &lt;a href="https://github.com/karu-lk/vitinya-web" rel="noopener noreferrer"&gt;Vitinya Web&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As for the app’s name, Vitinya, it’s inspired by the Vitinya Pass in Bulgaria’s Balkan Mountains—a place I’ve yet to visit but remains high on my bucket list. By naming the app after this scenic destination, I’ve given myself a constant reminder to plan my Eastern European holiday soon!&lt;/p&gt;

&lt;h2&gt;
  
  
  Until Next Time
&lt;/h2&gt;

&lt;p&gt;That’s all for now. Stay tuned for more articles and hands-on examples in the beautiful AWS ecosystem. Until then, happy AWSing!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>appsync</category>
      <category>graphql</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Event-Driven Magic: Exploring AWS EventBridge</title>
      <dc:creator>Lilupa Karu</dc:creator>
      <pubDate>Sat, 20 Jul 2024 11:36:46 +0000</pubDate>
      <link>https://forem.com/aws-builders/event-driven-magic-exploring-aws-eventbridge-1hoa</link>
      <guid>https://forem.com/aws-builders/event-driven-magic-exploring-aws-eventbridge-1hoa</guid>
      <description>&lt;p&gt;I kept thinking about what Werner Vogels said in his speech at AWS re:Invent in December 2022,&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;the world is event-driven&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Even though I’ve been involved in a few event-driven architectures before, his words made me reconsider how we use them.&lt;/p&gt;

&lt;p&gt;The result? I’ve developed some impressive reference architectures utilising modern cloud-based tools and services. Let’s begin by reviewing the fundamentals.&lt;/p&gt;

&lt;p&gt;Event-driven architecture is all about events. Simply put, an event is a change in state. Picture it like this: swiping a credit card at a payment machine is an event. Our world is full of these actions and events, and they play a big role in how everything functions. Think of events as signals that indicate when something changes in a system, like saving information to a database, uploading a file into storage, or a sensor being triggered on an IoT device.&lt;/p&gt;

&lt;p&gt;A key point to remember about events is their immutability; once they occur, they cannot be altered. This concept mirrors everyday life — once a moment passes, it is unchangeable. For instance, I wished for the All Blacks to win RWC 2023. Regardless of how much I wanted it, there was no way to go back and change the outcome once the event had taken place and was recorded in history.&lt;/p&gt;

&lt;h2&gt;
  
  
  Event Driven Architecture Components
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6r58upvng59bjqpbzhuo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6r58upvng59bjqpbzhuo.png" alt="Event Driven Architecture Components"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Event Producer:&lt;/strong&gt; These entities generate events based on specific actions or triggers within the system and are solely aware of the event router, not the event consumers.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Event Broker:&lt;/strong&gt; The event broker is responsible for receiving and filtering events, ensuring that each event is directed to the correct consumer. It serves as a mediator, enabling the smooth transmission of events throughout the architecture.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Event Consumer:&lt;/strong&gt; Event consumers either subscribe to particular events or monitor event streams, reacting to events based on their individual needs or initiating further actions within the system.&lt;/p&gt;

&lt;h2&gt;
  
  
  Event-Driven Architecture Patterns
&lt;/h2&gt;

&lt;p&gt;There’s much discussion surrounding event-driven architecture patterns across various categories. Below are some of the most popular patterns:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Communication Patterns:&lt;/strong&gt; publish/subscribe, point-to-point, event streaming, request/reply&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Consumption Patterns:&lt;/strong&gt; guaranteed delivery, hierarchical topics, event filtering, backpressure, and push&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Event Generation Patterns:&lt;/strong&gt; ECST, event sourcing, CQRS, CDC&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Governance Patterns:&lt;/strong&gt; event catalog, event APIs, intermediated, disintermediated, access control, and authorization&lt;/p&gt;

&lt;h2&gt;
  
  
  Let’s talk about EventBridge
&lt;/h2&gt;

&lt;p&gt;Let's keep our attention on the main topic: what EventBridge is, and why its popularity keeps growing.&lt;/p&gt;

&lt;p&gt;EventBridge, introduced in July 2019, is a serverless service that leverages events to seamlessly connect components of software applications. This facilitates the creation of modern, scalable, and loosely coupled event-driven applications.&lt;/p&gt;

&lt;p&gt;Its robust positioning within this architectural domain stems from its seamless integration capabilities with numerous other AWS services, including Step Functions, SQS, SNS, and Lambda functions, all of which inherently complement event-driven architectures.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0pnm1q0w3w8c83i98jrt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0pnm1q0w3w8c83i98jrt.png" alt="How EventBridge works with its components"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the middle of the illustration, we have AWS EventBridge, a serverless event bus. EventBridge is designed with a pub-sub architecture pattern in mind, allowing external sources to ingest events as ‘publishers.’ These sources can range from bespoke applications and various SaaS applications to a wide array of AWS services.&lt;/p&gt;

&lt;h2&gt;
  
  
  Events
&lt;/h2&gt;

&lt;p&gt;An event is a real-time change in a system, data, or environment. This change can occur in your application, an AWS service, or a SaaS partner service. While events can have similar structures, they must all include the top-level fields: detail, source, and detail-type. Custom events must, at a minimum, contain these fields.&lt;/p&gt;

&lt;h2&gt;
  
  
  Custom Event Sample
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

&lt;p&gt;{&lt;br&gt;
    "version": "0",&lt;br&gt;
    "id": "665f15b9-bbbb-ffff-0000-ee92777062fb",&lt;br&gt;
    "detail-type": "product.update",&lt;br&gt;
    "source": "your source app",&lt;br&gt;
    "account": "123456789012",&lt;br&gt;
    "time": "2024-07-16T05:24:42Z",&lt;br&gt;
    "region": "us-east-1",&lt;br&gt;
    "resources": [],&lt;br&gt;
    "detail": {&lt;br&gt;
        "action": "product update"&lt;br&gt;
    }&lt;br&gt;
}&lt;/p&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
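&lt;p&gt;To publish a custom event like this from application code, you hand EventBridge just the three required fields — source, detail-type, and detail — and the service fills in the rest (version, id, account, time, region). A sketch of the entry shape for &lt;code&gt;put_events&lt;/code&gt;, using the default bus:&lt;/p&gt;

```python
import json

def put_events_entry(source: str, detail_type: str, detail: dict) -> dict:
    """One entry for events.put_events -- EventBridge adds version, id,
    account, time, and region itself."""
    return {
        "Source": source,
        "DetailType": detail_type,
        "Detail": json.dumps(detail),   # Detail must be a JSON string
        "EventBusName": "default",
    }

entry = put_events_entry(
    "your source app", "product.update", {"action": "product update"})
# boto3.client("events").put_events(Entries=[entry]) would publish it.
print(entry["DetailType"])
```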
&lt;h2&gt;
  
  
  Rules
&lt;/h2&gt;

&lt;p&gt;Once the events are ingested into EventBridge, rules redirect the events to appropriate targets based on their content. EventBridge has a soft limit of 300 rules, which can be increased via a support ticket. These rules can be connected to multiple targets, such as AWS services like Lambda functions, Step Functions, or SQS queues.&lt;/p&gt;
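&lt;p&gt;The heart of a rule is its event pattern: every field named in the pattern must appear in the event, with a value drawn from the pattern's list. A tiny matcher capturing that core semantics (a simplification — real EventBridge patterns also support prefix, numeric, anything-but, and other operators not shown here):&lt;/p&gt;

```python
def matches(pattern: dict, event: dict) -> bool:
    """Core EventBridge pattern semantics: each field in the pattern
    must be present in the event with one of the listed values.
    (Simplified -- no prefix/numeric/anything-but operators.)"""
    for field, allowed in pattern.items():
        if isinstance(allowed, dict):          # nested object, e.g. "detail"
            if not isinstance(event.get(field), dict):
                return False
            if not matches(allowed, event[field]):
                return False
        elif event.get(field) not in allowed:  # flat field: value list
            return False
    return True

rule = {"source": ["your source app"], "detail-type": ["product.update"]}
event = {"source": "your source app", "detail-type": "product.update",
         "detail": {"action": "product update"}}
print(matches(rule, event))  # prints True
```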

&lt;h2&gt;
  
  
  Event Schema
&lt;/h2&gt;

&lt;p&gt;EventBridge is capable of handling event schemas, which represent the structure of events. A schema typically includes details such as the title and type of each data element. For example, on an online survey website, a schema might include fields for the survey name, description, reviewer’s name, user ID, and reviewer’s comments and ratings. These attributes are defined with specific types, such as the name being a string and the user ID being a GUID.&lt;/p&gt;

&lt;h2&gt;
  
  
  Schema Registry
&lt;/h2&gt;

&lt;p&gt;A schema registry serves as a centralised repository for managing a collection of schemas. It allows you to search, locate, and track various schemas utilised and generated by your applications, ensuring consistency and compatibility across your data infrastructure.&lt;/p&gt;

&lt;h2&gt;
  
  
  Schema Discovery
&lt;/h2&gt;

&lt;p&gt;Schema discovery automates the identification and registration of schemas, streamlining their integration into your registry. When enabled for an EventBridge event bus, schema discovery automatically records the schema of each event sent to the bus in the registry. If an event’s schema changes, schema discovery promptly creates a new version in the registry. After a schema is registered, you can generate a code binding for it, facilitating seamless interaction with the schema.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pipes
&lt;/h2&gt;

&lt;p&gt;EventBridge buses facilitate the routing and distribution of events to services using the publish/subscribe pattern, while EventBridge Pipes manage direct integrations between services through the point-to-point pattern. Both patterns are essential for constructing event-driven architectures. EventBridge Pipes allow for direct integrations without needing specialised knowledge. These pipes consist of a Source and a Target, and an enrichment step can be employed to enhance the payload data before it is sent to the target.&lt;/p&gt;
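&lt;p&gt;As a sketch of the source → enrichment → target shape, here is the kind of parameter set &lt;code&gt;pipes.create_pipe&lt;/code&gt; takes for an SQS source, a Lambda enrichment step, and a Step Functions target. All ARNs are placeholders, and the pipe's IAM role must allow all three integrations:&lt;/p&gt;

```python
def pipe_params(name: str, source_arn: str, enrich_arn: str,
                target_arn: str, role_arn: str) -> dict:
    """Parameters for pipes.create_pipe: SQS source, optional Lambda
    enrichment, Step Functions target. (Sketch only -- ARNs are
    placeholders.)"""
    return {
        "Name": name,
        "RoleArn": role_arn,
        "Source": source_arn,
        "Enrichment": enrich_arn,   # enhances payloads before delivery
        "Target": target_arn,
    }

params = pipe_params(
    "orders-pipe",
    "arn:aws:sqs:us-east-1:123456789012:orders",
    "arn:aws:lambda:us-east-1:123456789012:function:enrich-order",
    "arn:aws:states:us-east-1:123456789012:stateMachine:fulfil-order",
    "arn:aws:iam::123456789012:role/orders-pipe-role")
# boto3.client("pipes").create_pipe(**params) would create the pipe.
print(params["Name"])
```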

&lt;h2&gt;
  
  
  EventBridge Use Cases
&lt;/h2&gt;

&lt;p&gt;There are numerous use cases for event-driven architecture where we can effectively apply AWS EventBridge. Let’s explore some of these use cases.&lt;/p&gt;

&lt;h3&gt;
  
  
  SaaS Product Integration
&lt;/h3&gt;

&lt;p&gt;Imagine your organisation has a CRM system that is a SaaS partner with AWS EventBridge. The architecture below can be used to integrate this CRM with other business activities. The CRM system can send events to the EventBridge whenever there are changes in the system, data, or environment. Based on the rules defined in the EventBridge, messages are distributed to the appropriate tasks. For example, when a new customer registers in the CRM, the customer banking integration Step Function subscribes to the event and performs the relevant tasks.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fltvlgwyonhmk7l9445gj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fltvlgwyonhmk7l9445gj.png" alt="SaaS Product Integration using EventBridge"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Bespoke System Integration
&lt;/h3&gt;

&lt;p&gt;When developing a bespoke software application that requires integration with external entities in an event-driven manner, the following architecture would be ideal.&lt;/p&gt;

&lt;p&gt;Consider a scenario where a customer requests custom-made furniture. The request is received by the company’s sales system, which then communicates the specifications to the factory. Throughout the various phases, the customer needs to be kept informed, and once the furniture is ready, the designated courier company must be notified for pickup and delivery. This integration can be handled more reliably and efficiently using an event-driven architecture with AWS EventBridge, rather than a polling architecture.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffo26w8ri6mck1ys6oqmz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffo26w8ri6mck1ys6oqmz.png" alt="Bespoke System Integration with EventBridge"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After diving into a multitude of theories, it’s great to get some hands-on experience. Part two of this article, featuring a near real-world EventBridge integration, will be released soon. Stay tuned!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>eventbridge</category>
      <category>eventdriven</category>
      <category>architecture</category>
    </item>
  </channel>
</rss>
