<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Sedat SALMAN</title>
    <description>The latest articles on Forem by Sedat SALMAN (@sdtslmn).</description>
    <link>https://forem.com/sdtslmn</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F869117%2Fa462c21b-629f-4b45-95c7-3a51891b5c70.jpeg</url>
      <title>Forem: Sedat SALMAN</title>
      <link>https://forem.com/sdtslmn</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/sdtslmn"/>
    <language>en</language>
    <item>
      <title>Azure in 60 Seconds: What is a Landing Zone?</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Sun, 03 May 2026 15:13:25 +0000</pubDate>
      <link>https://forem.com/sdtslmn/azure-in-60-seconds-what-is-a-landing-zone-2l6f</link>
      <guid>https://forem.com/sdtslmn/azure-in-60-seconds-what-is-a-landing-zone-2l6f</guid>
      <description>&lt;h3&gt;
  
  
  1. Foundation of Your Cloud
&lt;/h3&gt;

&lt;p&gt;A Landing Zone is the starting architecture for your Azure environment.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;It defines how everything will be deployed, secured, and managed.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  2. Not Just a Subscription
&lt;/h3&gt;

&lt;p&gt;It’s more than creating a subscription.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;It includes identity, networking, governance, and a security baseline.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  3. Identity and Access Design
&lt;/h3&gt;

&lt;p&gt;Controls who can do what across the environment.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use Entra ID, RBAC, and managed identities.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  4. Network Architecture
&lt;/h3&gt;

&lt;p&gt;Defines how resources communicate securely.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Hub-spoke model, private endpoints, DNS, and segmentation.&lt;/p&gt;
&lt;/blockquote&gt;
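The hub-spoke model above can be sketched as a tiny routing function. This is an illustration only (the VNet names are made up, not an Azure API): spokes peer only with the hub, so spoke-to-spoke traffic transits the hub, which is where firewalls and inspection typically live.

```python
def route(src, dst, hub="hub-vnet"):
    """Return the path a packet takes in a hub-spoke topology (sketch)."""
    if hub in (src, dst):
        return [src, dst]
    # Spokes are not directly peered, so traffic is forced through the hub.
    return [src, hub, dst]

print(route("spoke-app", "spoke-data"))  # ['spoke-app', 'hub-vnet', 'spoke-data']
```

This is also why segmentation works: removing the direct spoke-to-spoke path gives you a single choke point for policy enforcement.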

&lt;h3&gt;
  
  
  5. Governance and Policies
&lt;/h3&gt;

&lt;p&gt;Ensures consistency and compliance.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Naming standards, tagging, policies, and role separation.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  6. Security Baseline
&lt;/h3&gt;

&lt;p&gt;Built-in controls from day one.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Logging, monitoring, threat protection, and secure configurations.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  7. Management and Monitoring
&lt;/h3&gt;

&lt;p&gt;Central visibility and control.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Log Analytics, alerts, and operational dashboards.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  8. Designed for Scale
&lt;/h3&gt;

&lt;p&gt;A Landing Zone is built for future growth.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Supports multiple subscriptions, environments, and regions.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  9. Workload Ready
&lt;/h3&gt;

&lt;p&gt;Applications are deployed on top of the Landing Zone.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;It prepares the environment before any workload goes live.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  10. Based on Best Practices
&lt;/h3&gt;

&lt;p&gt;Usually aligned with the Microsoft Cloud Adoption Framework (CAF).&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Standardized, repeatable, and enterprise-ready.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Simple View:&lt;/strong&gt;&lt;br&gt;
Landing Zone = Your Azure blueprint for secure, scalable, and governed deployments from day one.&lt;/p&gt;

</description>
      <category>azure</category>
    </item>
    <item>
      <title>Azure in 60 Seconds: What Clients Always Get Wrong</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Sun, 03 May 2026 14:46:42 +0000</pubDate>
      <link>https://forem.com/sdtslmn/azure-in-60-seconds-what-clients-always-get-wrong-5fbp</link>
      <guid>https://forem.com/sdtslmn/azure-in-60-seconds-what-clients-always-get-wrong-5fbp</guid>
      <description>&lt;h3&gt;
  
  
  1. Cloud Will Automatically Save Money
&lt;/h3&gt;

&lt;p&gt;Costs increase quickly without design and control.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Optimize architecture, not just pricing.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  2. Treating Azure Like On-Prem
&lt;/h3&gt;

&lt;p&gt;Lifting and shifting VMs without redesign = poor performance + high cost.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use PaaS and cloud-native patterns.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  3. Ignoring Identity Design
&lt;/h3&gt;

&lt;p&gt;Users, apps, and services need structured access control.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Design RBAC, groups, and managed identities from day one.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  4. No Network Planning
&lt;/h3&gt;

&lt;p&gt;IP overlap, DNS issues, and broken connectivity happen later.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Define hub-spoke, IP ranges, and DNS early.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  5. Skipping Governance
&lt;/h3&gt;

&lt;p&gt;No naming, no tagging, no policy = no control.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Implement landing zones, policies, and standards.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  6. Thinking Security is “Handled by Azure”
&lt;/h3&gt;

&lt;p&gt;The cloud operates on a shared responsibility model.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You are still responsible for data, access, and configuration.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  7. No Cost Monitoring
&lt;/h3&gt;

&lt;p&gt;Bills become a surprise at the end of the month.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use Azure Cost Management with budgets and alerts.&lt;/p&gt;
&lt;/blockquote&gt;
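The budget-and-alert idea can be sketched in a few lines. The 50/80/100 percent thresholds below are illustrative choices, not Azure defaults:

```python
def budget_alerts(spend, budget, thresholds=(0.5, 0.8, 1.0)):
    """Return which alert thresholds (fractions of the budget) spend has crossed."""
    ratio = spend / budget
    return [t for t in thresholds if ratio >= t]

# 850 spent against a 1000 budget crosses the 50% and 80% alerts but not 100%.
print(budget_alerts(850, 1000))  # [0.5, 0.8]
```

Azure Cost Management lets you configure exactly this declaratively: a budget amount plus alert rules at chosen percentages.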

&lt;h3&gt;
  
  
  8. Overengineering Early
&lt;/h3&gt;

&lt;p&gt;Too many services, too complex design.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Start simple, scale when needed.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  9. No Backup / DR Strategy
&lt;/h3&gt;

&lt;p&gt;Assuming cloud = safe is risky.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Define backup, retention, and recovery processes.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  10. Ignoring Operations
&lt;/h3&gt;

&lt;p&gt;Deployment is easy; operations are not.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Plan monitoring, patching, and incident response.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Reality:&lt;/strong&gt;&lt;br&gt;
Most Azure problems are not technical — they come from wrong assumptions and missing fundamentals.&lt;/p&gt;

</description>
      <category>azure</category>
    </item>
    <item>
      <title>Azure in 60 Seconds: Mistakes Killing Your Budget</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Sun, 03 May 2026 13:50:07 +0000</pubDate>
      <link>https://forem.com/sdtslmn/azure-in-60-seconds-mistakes-killing-your-budget-2fac</link>
      <guid>https://forem.com/sdtslmn/azure-in-60-seconds-mistakes-killing-your-budget-2fac</guid>
      <description>&lt;h3&gt;
  
  
  1. Leaving Resources Running
&lt;/h3&gt;

&lt;p&gt;Unused VMs, disks, and test environments running 24/7 = silent cost leak.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Stop, deallocate, or schedule shutdowns.&lt;/p&gt;
&lt;/blockquote&gt;
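The savings from scheduled shutdowns are easy to quantify. A quick sketch with a hypothetical hourly rate (the $0.20/hour figure and the 12x5 schedule are example values, not Azure pricing):

```python
HOURS_PER_MONTH = 730  # Azure's usual billing approximation for a month

def monthly_cost(hourly_rate, hours_on=HOURS_PER_MONTH):
    """Compute monthly compute cost for a VM left on for hours_on hours."""
    return hourly_rate * hours_on

always_on = monthly_cost(0.20)                 # running 24/7
scheduled = monthly_cost(0.20, hours_on=12 * 22)  # 12 hours a day, ~22 working days
print(round(always_on, 2), round(scheduled, 2))   # 146.0 52.8
```

For dev and test workloads that sit idle outside working hours, that difference is pure waste.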

&lt;h3&gt;
  
  
  2. Overprovisioning
&lt;/h3&gt;

&lt;p&gt;Using large VM sizes “just in case” wastes money.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Start small, monitor usage, then scale.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  3. No Cost Visibility
&lt;/h3&gt;

&lt;p&gt;If you don’t track it, you can’t control it.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use Azure Cost Management for budgets and alerts.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  4. Ignoring Storage Costs
&lt;/h3&gt;

&lt;p&gt;Snapshots, backups, and logs grow silently over time.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Apply lifecycle policies and clean up regularly.&lt;/p&gt;
&lt;/blockquote&gt;
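A lifecycle policy is declarative in Azure Blob Storage, but the tiering logic it encodes looks roughly like this (the age cutoffs are example values you would tune):

```python
def target_tier(age_days, cool_after=30, archive_after=180, delete_after=365):
    """Pick a storage tier from blob age -- mirrors what an Azure Blob
    lifecycle management policy does declaratively (cutoffs are examples)."""
    if age_days >= delete_after:
        return "delete"
    if age_days >= archive_after:
        return "archive"
    if age_days >= cool_after:
        return "cool"
    return "hot"

print([target_tier(d) for d in (5, 45, 200, 400)])
# ['hot', 'cool', 'archive', 'delete']
```

The point is that the rules run automatically; without them, snapshots and logs stay in the hot tier forever.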

&lt;h3&gt;
  
  
  5. No Auto-Scaling
&lt;/h3&gt;

&lt;p&gt;Fixed resources can’t adapt to real demand.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Enable auto-scale to match usage dynamically.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  6. Using Pay-As-You-Go for Everything
&lt;/h3&gt;

&lt;p&gt;Long-running workloads on PAYG = expensive.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use Reserved Instances or Savings Plans.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  7. Unused Public IPs &amp;amp; Load Balancers
&lt;/h3&gt;

&lt;p&gt;Even idle networking components can cost money.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Remove or consolidate unused resources.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  8. Poor Tagging Strategy
&lt;/h3&gt;

&lt;p&gt;No tags = no accountability or visibility.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Use tags for cost allocation and tracking.&lt;/p&gt;
&lt;/blockquote&gt;
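Tag-based cost allocation is essentially a group-by over the bill. A small sketch with made-up resources, which also shows why untagged spend is the real problem:

```python
from collections import defaultdict

def cost_by_tag(resources, tag_key="cost-center"):
    """Aggregate spend per tag value; untagged spend is surfaced explicitly."""
    totals = defaultdict(float)
    for r in resources:
        totals[r.get("tags", {}).get(tag_key, "UNTAGGED")] += r["cost"]
    return dict(totals)

sample = [
    {"name": "vm-web", "cost": 120.0, "tags": {"cost-center": "marketing"}},
    {"name": "sql-01", "cost": 300.0, "tags": {"cost-center": "finance"}},
    {"name": "vm-test", "cost": 80.0},  # no tags: nobody owns this spend
]
print(cost_by_tag(sample))
# {'marketing': 120.0, 'finance': 300.0, 'UNTAGGED': 80.0}
```

An Azure Policy that denies or remediates untagged resources keeps the UNTAGGED bucket from growing.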

&lt;h3&gt;
  
  
  9. Wrong Service Choice
&lt;/h3&gt;

&lt;p&gt;Using IaaS where PaaS is enough increases cost and effort.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Choose the right service model for the workload.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  10. Cross-Region Data Transfer
&lt;/h3&gt;

&lt;p&gt;Moving data between regions adds hidden charges.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Keep workloads and data in the same region when possible.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Quick Win:&lt;/strong&gt;&lt;br&gt;
Tag resources + enable budgets + auto-shutdown = immediate cost control.&lt;/p&gt;

</description>
      <category>azure</category>
    </item>
    <item>
      <title>AWS Data &amp; AI Stories #04: Multimodal RAG on AWS</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Wed, 22 Apr 2026 19:07:00 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-data-ai-stories-04-multimodal-rag-on-aws-2ppp</link>
      <guid>https://forem.com/aws-builders/aws-data-ai-stories-04-multimodal-rag-on-aws-2ppp</guid>
      <description>&lt;p&gt;In the first article, I talked about multimodal AI at a high level.&lt;/p&gt;

&lt;p&gt;In the second article, I focused on Amazon Bedrock Data Automation as the processing layer.&lt;/p&gt;

&lt;p&gt;In the third article, I explained multimodal knowledge bases as the retrieval layer.&lt;/p&gt;

&lt;p&gt;Now it is time to connect these pieces together.&lt;/p&gt;

&lt;p&gt;This is where multimodal RAG becomes important. Amazon Bedrock Knowledge Bases now supports multimodal content including images, audio, and video, and AWS positions it as a managed way to build end-to-end RAG workflows over enterprise data.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is multimodal RAG?
&lt;/h2&gt;

&lt;p&gt;RAG means Retrieval Augmented Generation.&lt;/p&gt;

&lt;p&gt;The idea is simple:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;retrieve relevant content from your own data&lt;/li&gt;
&lt;li&gt;send that context to the model&lt;/li&gt;
&lt;li&gt;generate a grounded answer&lt;/li&gt;
&lt;/ul&gt;
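In Amazon Bedrock Knowledge Bases, these three steps collapse into one managed call. A minimal boto3 sketch, assuming you supply your own knowledge base ID and model ARN:

```python
def build_rag_request(question, kb_id, model_arn):
    """Request body for retrieve_and_generate: retrieval plus generation in one call."""
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    }

def ask(question, kb_id, model_arn):
    import boto3  # requires boto3 installed and AWS credentials configured
    client = boto3.client("bedrock-agent-runtime")
    resp = client.retrieve_and_generate(**build_rag_request(question, kb_id, model_arn))
    return resp["output"]["text"]  # citations are also returned for source grounding
```

The service handles retrieval and prompt augmentation internally, which is exactly the "managed RAG" positioning discussed below.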

&lt;p&gt;A multimodal RAG system follows the same logic, but the retrieved context is not limited to text. It can also include images, audio, video, or processed outputs derived from those inputs. AWS documentation for multimodal knowledge bases describes support for multimedia ingestion and querying, including image queries and time-based retrieval metadata for audio and video.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why is multimodal RAG different from normal RAG?
&lt;/h2&gt;

&lt;p&gt;Traditional RAG is usually text-focused.&lt;/p&gt;

&lt;p&gt;That works well for manuals, policies, reports, and similar documents.&lt;/p&gt;

&lt;p&gt;But in many real environments, important knowledge is spread across:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;diagrams&lt;/li&gt;
&lt;li&gt;screenshots&lt;/li&gt;
&lt;li&gt;scanned pages&lt;/li&gt;
&lt;li&gt;recorded calls&lt;/li&gt;
&lt;li&gt;videos&lt;/li&gt;
&lt;li&gt;field images&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So the challenge is no longer only “Which paragraph should I retrieve?”&lt;/p&gt;

&lt;p&gt;The new challenge becomes:&lt;br&gt;
Which content is relevant, regardless of format?&lt;/p&gt;

&lt;p&gt;That is the real value of multimodal RAG. AWS’s newer multimodal retrieval guidance is built around this exact shift from text-only retrieval to retrieval across media types.&lt;/p&gt;

&lt;h2&gt;
  
  
  How I see the architecture
&lt;/h2&gt;

&lt;p&gt;A simple multimodal RAG architecture on AWS looks like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Data is collected in a source such as Amazon S3&lt;/li&gt;
&lt;li&gt;Raw files are processed if needed&lt;/li&gt;
&lt;li&gt;A knowledge base indexes the usable content&lt;/li&gt;
&lt;li&gt;A query retrieves relevant multimodal context&lt;/li&gt;
&lt;li&gt;A foundation model generates the answer&lt;/li&gt;
&lt;li&gt;The application returns the answer, often with source grounding&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS describes Knowledge Bases as a fully managed RAG capability that handles ingestion, retrieval, and prompt augmentation, which is why it fits this workflow so well. AWS also shows multimodal examples where Bedrock Data Automation is used before Knowledge Bases to improve downstream retrieval.&lt;/p&gt;

&lt;h2&gt;
  
  
  Two main multimodal RAG patterns
&lt;/h2&gt;

&lt;p&gt;This is the most important design point for this article.&lt;/p&gt;

&lt;p&gt;Not every multimodal RAG system should be built the same way.&lt;/p&gt;

&lt;p&gt;AWS currently describes two main approaches for multimodal processing in Knowledge Bases:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Retrieval-first approach
&lt;/h3&gt;

&lt;p&gt;This is the better option when the main goal is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;visual similarity&lt;/li&gt;
&lt;li&gt;image search&lt;/li&gt;
&lt;li&gt;cross-modal retrieval&lt;/li&gt;
&lt;li&gt;media-aware search&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this pattern, Amazon Nova Multimodal Embeddings is the main enabler. AWS describes this approach as the right fit for visual similarity searches and multimodal semantic retrieval.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Processing-first approach
&lt;/h3&gt;

&lt;p&gt;This is the better option when the main goal is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;extracting structured meaning from raw media&lt;/li&gt;
&lt;li&gt;turning audio, video, or documents into usable searchable content&lt;/li&gt;
&lt;li&gt;supporting downstream question answering with processed output&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this pattern, Amazon Bedrock Data Automation becomes the first major step before retrieval. AWS documentation describes BDA as the text-based processing path for multimedia content in multimodal knowledge bases, and AWS has also published solution examples combining BDA with Knowledge Bases for multimodal RAG applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to decide between the two
&lt;/h3&gt;

&lt;p&gt;For me, the design question is simple.&lt;/p&gt;

&lt;p&gt;If I want to ask:&lt;br&gt;
“Find content that looks or feels similar.”&lt;br&gt;
then I would think retrieval-first.&lt;/p&gt;

&lt;p&gt;If I want to ask:&lt;br&gt;
“Extract useful content from media and use that in RAG.”&lt;br&gt;
then I would think processing-first.&lt;/p&gt;

&lt;p&gt;AWS’s own “choose your multimodal processing approach” guidance makes this distinction very clearly, and I think that is the right way to avoid overdesigning the solution.&lt;/p&gt;
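That decision can be captured as a simple lookup. The goal labels below are this sketch's own vocabulary, not AWS API values:

```python
def choose_processing_approach(goal):
    """Map a retrieval goal to one of the two documented approaches (sketch)."""
    retrieval_first = {"visual-similarity", "image-search", "cross-modal-retrieval"}
    processing_first = {"extract-structure", "media-to-text", "downstream-qa"}
    if goal in retrieval_first:
        return "retrieval-first (Nova Multimodal Embeddings)"
    if goal in processing_first:
        return "processing-first (Bedrock Data Automation)"
    return "unclear -- restate the retrieval question first"

print(choose_processing_approach("image-search"))
# retrieval-first (Nova Multimodal Embeddings)
```

The fallback branch matters: if you cannot name the retrieval goal, you are not ready to pick an architecture.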

&lt;h2&gt;
  
  
  A practical workflow example
&lt;/h2&gt;

&lt;p&gt;Imagine a support or operations use case.&lt;/p&gt;

&lt;p&gt;Your data may include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;PDF maintenance procedures&lt;/li&gt;
&lt;li&gt;field images&lt;/li&gt;
&lt;li&gt;audio notes from engineers&lt;/li&gt;
&lt;li&gt;short troubleshooting videos&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A user asks:&lt;br&gt;
“What is the likely issue and what should I check first?”&lt;/p&gt;

&lt;p&gt;A text-only RAG system may retrieve a manual section.&lt;/p&gt;

&lt;p&gt;A multimodal RAG system can do more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;retrieve a relevant text section&lt;/li&gt;
&lt;li&gt;identify matching visual evidence&lt;/li&gt;
&lt;li&gt;point to the correct moment in a video&lt;/li&gt;
&lt;li&gt;use processed audio or image context to improve the answer&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS documentation for querying multimodal knowledge bases shows response metadata such as source modality, MIME type, and start and end timestamps for audio and video segments, which makes this type of experience much more practical.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Bedrock Knowledge Bases matters here
&lt;/h2&gt;

&lt;p&gt;You can always build your own RAG system.&lt;/p&gt;

&lt;p&gt;But one reason Bedrock Knowledge Bases matters is that it reduces the amount of custom plumbing.&lt;/p&gt;

&lt;p&gt;AWS positions it as a managed RAG capability that simplifies setup, handles parts of preprocessing and retrieval, and helps ground model responses in proprietary data. For many teams, this is a better starting point than building a fully custom retrieval pipeline from scratch.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where BDA still matters in multimodal RAG
&lt;/h2&gt;

&lt;p&gt;Even though this article is about RAG, BDA still plays an important role.&lt;/p&gt;

&lt;p&gt;Multimodal RAG does not always mean retrieving directly from raw multimedia.&lt;/p&gt;

&lt;p&gt;In many cases, the better pattern is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;process the content first&lt;/li&gt;
&lt;li&gt;extract structured insights&lt;/li&gt;
&lt;li&gt;store or index those outputs&lt;/li&gt;
&lt;li&gt;use them in RAG&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS has shown this pattern in solution examples where Amazon Bedrock Data Automation processes multimodal content, the extracted information is stored in a knowledge base, and then a RAG interface is used for question answering.&lt;/p&gt;

&lt;h2&gt;
  
  
  One point people often miss
&lt;/h2&gt;

&lt;p&gt;A common mistake is to assume multimodal RAG is only about attaching files to a chatbot.&lt;/p&gt;

&lt;p&gt;That is too simple.&lt;/p&gt;

&lt;p&gt;A real multimodal RAG system usually includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ingestion&lt;/li&gt;
&lt;li&gt;processing&lt;/li&gt;
&lt;li&gt;indexing&lt;/li&gt;
&lt;li&gt;retrieval&lt;/li&gt;
&lt;li&gt;prompt augmentation&lt;/li&gt;
&lt;li&gt;response generation&lt;/li&gt;
&lt;li&gt;source grounding&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That is why I see multimodal RAG as an architecture pattern, not just a model feature. AWS Prescriptive Guidance describes Knowledge Bases as covering the RAG workflow from ingestion to retrieval and prompt augmentation, which supports this architecture view.&lt;/p&gt;

&lt;h2&gt;
  
  
  Constraints to remember
&lt;/h2&gt;

&lt;p&gt;There are also a few practical points to remember.&lt;/p&gt;

&lt;p&gt;First, AWS states that multimodal support in Bedrock Knowledge Bases is available with unstructured data sources. Structured data sources do not support multimodal content processing. Second, the available query types and features depend on the processing approach you choose.&lt;/p&gt;

&lt;p&gt;So it is important to design the knowledge layer with the right data source model from the start.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where this is useful
&lt;/h2&gt;

&lt;p&gt;I think multimodal RAG is especially useful in cases like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;technical support&lt;/li&gt;
&lt;li&gt;operations knowledge assistants&lt;/li&gt;
&lt;li&gt;document and image search&lt;/li&gt;
&lt;li&gt;inspection workflows&lt;/li&gt;
&lt;li&gt;compliance evidence review&lt;/li&gt;
&lt;li&gt;media-rich enterprise search&lt;/li&gt;
&lt;li&gt;predictive maintenance assistants&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS has published examples including multimodal root-cause diagnosis and agentic multimodal assistants, which shows that this pattern is already moving into real business use cases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final thoughts
&lt;/h2&gt;

&lt;p&gt;For me, multimodal RAG is where the previous three topics come together.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Multimodal AI gives the overall direction&lt;/li&gt;
&lt;li&gt;Bedrock Data Automation helps process raw content&lt;/li&gt;
&lt;li&gt;Multimodal Knowledge Bases provide the retrieval layer&lt;/li&gt;
&lt;li&gt;Multimodal RAG turns all of that into useful answers&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS now provides a much clearer path for building these solutions than before, especially with managed multimodal retrieval in Knowledge Bases and guidance on choosing between BDA and Nova Multimodal Embeddings depending on the use case.&lt;/p&gt;

&lt;p&gt;For me, the key lesson is simple:&lt;/p&gt;

&lt;p&gt;Do not start with the model.&lt;/p&gt;

&lt;p&gt;Start with the question:&lt;br&gt;
What kind of content do I need to retrieve, and why?&lt;/p&gt;

&lt;p&gt;If that answer is clear, the multimodal RAG design becomes much easier.&lt;/p&gt;

&lt;p&gt;In the next article, I will move to the next logical topic:&lt;/p&gt;

&lt;p&gt;Amazon Nova Multimodal Embeddings.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>datascience</category>
      <category>aws</category>
      <category>awsbigdata</category>
    </item>
    <item>
      <title>AWS Data &amp; AI Stories #03: Multimodal Knowledge Bases</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Mon, 20 Apr 2026 19:01:00 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-data-ai-stories-03-multimodal-knowledge-bases-34af</link>
      <guid>https://forem.com/aws-builders/aws-data-ai-stories-03-multimodal-knowledge-bases-34af</guid>
      <description>&lt;p&gt;In the first article, I talked about multimodal AI at a high level.&lt;/p&gt;

&lt;p&gt;In the second one, I focused on Amazon Bedrock Data Automation as the processing layer.&lt;/p&gt;

&lt;p&gt;Now the next question is simple:&lt;/p&gt;

&lt;p&gt;After we process the content, how do we make it searchable and useful for AI applications?&lt;/p&gt;

&lt;p&gt;This is where multimodal knowledge bases come in.&lt;/p&gt;

&lt;p&gt;Amazon Bedrock Knowledge Bases now supports multimodal content, including images, audio, and video, in addition to traditional unstructured text sources. It also supports multimodal querying, including image-based search and retrieval across media types.&lt;/p&gt;

&lt;p&gt;For me, this is the layer that turns processed content into usable context.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is a multimodal knowledge base?
&lt;/h2&gt;

&lt;p&gt;A knowledge base is a managed retrieval layer for your own content.&lt;/p&gt;

&lt;p&gt;Instead of asking a model to rely only on general training knowledge, a knowledge base helps the system retrieve information from your own files and data sources before generating a response. That is the main idea behind Retrieval Augmented Generation, or RAG. Amazon Bedrock Knowledge Bases is designed for exactly this purpose: it retrieves relevant information from your data sources and uses it to improve response relevance and accuracy.&lt;/p&gt;

&lt;p&gt;A multimodal knowledge base extends that idea beyond text.&lt;/p&gt;

&lt;p&gt;So instead of only working with documents, the system can also work with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;images&lt;/li&gt;
&lt;li&gt;audio&lt;/li&gt;
&lt;li&gt;video&lt;/li&gt;
&lt;li&gt;mixed-content files&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This matters because enterprise knowledge is rarely text only.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why does this matter?
&lt;/h2&gt;

&lt;p&gt;Because many real-world systems do not store knowledge in perfect written documents.&lt;/p&gt;

&lt;p&gt;A lot of value exists in:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;diagrams&lt;/li&gt;
&lt;li&gt;scanned files&lt;/li&gt;
&lt;li&gt;screenshots&lt;/li&gt;
&lt;li&gt;inspection photos&lt;/li&gt;
&lt;li&gt;recorded calls&lt;/li&gt;
&lt;li&gt;training videos&lt;/li&gt;
&lt;li&gt;equipment images&lt;/li&gt;
&lt;li&gt;operational media&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If our knowledge layer only understands text, a large part of business context stays outside the system.&lt;/p&gt;

&lt;p&gt;With multimodal retrieval in Bedrock Knowledge Bases, AWS now supports ingesting, indexing, and retrieving information from text, images, video, and audio in a more unified workflow. AWS also notes that applications can search using an image query to find visually similar content or relevant scenes in multimedia sources.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where it fits in the architecture
&lt;/h2&gt;

&lt;p&gt;I see the flow like this:&lt;/p&gt;

&lt;p&gt;Raw content → processing layer → knowledge base → retrieval → answer or action&lt;/p&gt;

&lt;p&gt;So:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Part 1 was the general multimodal AI view&lt;/li&gt;
&lt;li&gt;Part 2 was the processing layer with Bedrock Data Automation&lt;/li&gt;
&lt;li&gt;Part 3 is the retrieval layer&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That means the knowledge base is not the first step.&lt;/p&gt;

&lt;p&gt;It comes after the content is already available in a usable form, whether directly from unstructured sources or after preprocessing.&lt;/p&gt;

&lt;p&gt;AWS documentation also makes this separation clearer now by distinguishing multimodal processing approaches depending on the goal: Nova Multimodal Embeddings for visual similarity and cross-modal retrieval, or Bedrock Data Automation for text-oriented processing of multimedia content.&lt;/p&gt;

&lt;h2&gt;
  
  
  Two ways to think about multimodal retrieval
&lt;/h2&gt;

&lt;p&gt;This is the most important design point.&lt;/p&gt;

&lt;p&gt;Not every multimodal use case is the same.&lt;/p&gt;

&lt;p&gt;AWS currently describes two main multimodal processing approaches for knowledge bases:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Nova Multimodal Embeddings approach
&lt;/h3&gt;

&lt;p&gt;This is better when the focus is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;visual similarity&lt;/li&gt;
&lt;li&gt;image search&lt;/li&gt;
&lt;li&gt;cross-modal retrieval&lt;/li&gt;
&lt;li&gt;searching media with text or image input&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS documentation says this approach is suited for visual similarity searches and multimodal semantic retrieval.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Bedrock Data Automation approach
&lt;/h3&gt;

&lt;p&gt;This is better when the focus is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;extracting structured meaning from multimedia&lt;/li&gt;
&lt;li&gt;turning media into searchable text-oriented outputs&lt;/li&gt;
&lt;li&gt;using processed content in downstream RAG&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS documentation describes this option as the text-based processing path for multimedia content.&lt;/p&gt;

&lt;p&gt;For me, the decision is simple:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If I want to find similar content across modalities, I think retrieval-first.&lt;/li&gt;
&lt;li&gt;If I want to extract useful content from media and then search it, I think processing-first.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What can you query?
&lt;/h2&gt;

&lt;p&gt;This is one of the nice parts of the newer multimodal support.&lt;/p&gt;

&lt;p&gt;After ingesting multimodal content, Bedrock Knowledge Bases supports different query patterns depending on the selected approach. AWS documentation for testing and querying multimodal knowledge bases shows support for metadata such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;source modality&lt;/li&gt;
&lt;li&gt;MIME type&lt;/li&gt;
&lt;li&gt;chunk start time for audio/video&lt;/li&gt;
&lt;li&gt;chunk end time for audio/video&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It also mentions playback controls with automatic segment positioning for multimedia results in the console.&lt;/p&gt;

&lt;p&gt;That means this is not just “retrieve a paragraph.”&lt;/p&gt;

&lt;p&gt;It can also become:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;retrieve a scene from a video&lt;/li&gt;
&lt;li&gt;return the relevant moment in an audio file&lt;/li&gt;
&lt;li&gt;find a matching image&lt;/li&gt;
&lt;li&gt;connect retrieved media segments to an answer&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That is a big step forward compared with traditional text-only RAG.&lt;/p&gt;
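A sketch of how an application might render such a result. The dictionary keys and the file name here are illustrative stand-ins for the modality, MIME type, and timestamp fields the documentation describes, not the actual response field names:

```python
def describe_hit(hit):
    """Render one retrieval result; media hits get a playable time range."""
    modality = hit["modality"]
    if modality in ("AUDIO", "VIDEO"):
        start, end = hit["start_seconds"], hit["end_seconds"]
        return f"{hit['source']} [{modality.lower()} segment {start}s-{end}s]"
    return f"{hit['source']} [{modality.lower()}]"

hit = {"source": "s3://bucket/fix-pump.mp4", "modality": "VIDEO",
       "start_seconds": 95, "end_seconds": 120}
print(describe_hit(hit))  # s3://bucket/fix-pump.mp4 [video segment 95s-120s]
```

The timestamps are what make the "jump to the right moment in a video" experience possible without re-watching the whole file.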

&lt;h2&gt;
  
  
  How I would explain it simply
&lt;/h2&gt;

&lt;p&gt;A traditional knowledge base answers:&lt;/p&gt;

&lt;p&gt;“Which text chunk is relevant?”&lt;/p&gt;

&lt;p&gt;A multimodal knowledge base can answer:&lt;/p&gt;

&lt;p&gt;“Which content is relevant, regardless of whether it is text, image, audio, or video?”&lt;/p&gt;

&lt;p&gt;That is the real difference.&lt;/p&gt;

&lt;h2&gt;
  
  
  Data source point to remember
&lt;/h2&gt;

&lt;p&gt;There is one important limitation to keep in mind.&lt;/p&gt;

&lt;p&gt;AWS documentation states that multimodal support in Bedrock Knowledge Bases is available when creating a knowledge base with unstructured data sources. Structured data sources do not support multimodal content processing.&lt;/p&gt;

&lt;p&gt;That is important for design.&lt;/p&gt;

&lt;p&gt;If your use case depends heavily on images, audio, or video, you should think in terms of unstructured content pipelines, not only structured tables.&lt;/p&gt;

&lt;h2&gt;
  
  
  A practical example
&lt;/h2&gt;

&lt;p&gt;Imagine a support or operations platform.&lt;/p&gt;

&lt;p&gt;Your users may store:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;PDF manuals&lt;/li&gt;
&lt;li&gt;field photos&lt;/li&gt;
&lt;li&gt;recorded troubleshooting calls&lt;/li&gt;
&lt;li&gt;short maintenance videos&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A user asks:&lt;br&gt;
“Show me the relevant maintenance guidance for this equipment issue.”&lt;/p&gt;

&lt;p&gt;A traditional text-only system may retrieve only written manuals.&lt;/p&gt;

&lt;p&gt;A multimodal knowledge base can potentially retrieve:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a relevant text section&lt;/li&gt;
&lt;li&gt;a matching image&lt;/li&gt;
&lt;li&gt;a useful audio segment&lt;/li&gt;
&lt;li&gt;a video moment with the right scene&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And then that context can be passed to the model for answer generation.&lt;/p&gt;

&lt;p&gt;That is why this is more than just a storage feature.&lt;/p&gt;

&lt;p&gt;It is a better retrieval model for real-world knowledge.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why I like this layer
&lt;/h2&gt;

&lt;p&gt;I like multimodal knowledge bases because they make AI architecture more realistic.&lt;/p&gt;

&lt;p&gt;In many enterprise environments, the problem is not lack of data.&lt;/p&gt;

&lt;p&gt;The problem is that the useful data is trapped inside different formats and scattered across different files.&lt;/p&gt;

&lt;p&gt;A multimodal knowledge base helps solve that by creating a retrieval layer that can work across those formats. AWS positions Knowledge Bases as an out-of-the-box RAG capability that reduces the effort of building pipelines and helps applications answer queries using proprietary content, with source-grounded responses and citations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Common mistake
&lt;/h2&gt;

&lt;p&gt;A common mistake is to assume that all multimodal use cases need the same architecture.&lt;/p&gt;

&lt;p&gt;They do not.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;image similarity search is not the same as document extraction&lt;/li&gt;
&lt;li&gt;video segment retrieval is not the same as audio transcription&lt;/li&gt;
&lt;li&gt;cross-modal search is not the same as text-based RAG over processed media&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS’s own multimodal guidance now separates these choices clearly, and I think that is the right way to approach the design.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I would decide early
&lt;/h2&gt;

&lt;p&gt;Before building the knowledge base, I would answer these questions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Do I need visual similarity or text-oriented retrieval?&lt;/li&gt;
&lt;li&gt;Am I retrieving directly from raw multimodal content, or from processed output?&lt;/li&gt;
&lt;li&gt;Do I need image queries?&lt;/li&gt;
&lt;li&gt;Do I need timestamped retrieval from audio or video?&lt;/li&gt;
&lt;li&gt;Do I want the knowledge base mainly for search, RAG, or both?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These questions make the architecture much clearer.&lt;/p&gt;
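&lt;p&gt;As a rough sketch (the function and its labels are my own shorthand, not an AWS API), the same checklist can be written down in code:&lt;/p&gt;

```python
# Illustrative helper that turns the design questions above into a
# suggested retrieval shape. Labels are descriptive shorthand only,
# not AWS API values.
def suggest_retrieval_design(visual_similarity, raw_multimodal,
                             image_queries, timestamped_media):
    return {
        "embedding_strategy": "multimodal" if (visual_similarity or image_queries) else "text",
        "ingest": "raw media" if raw_multimodal else "processed output",
        "needs_segment_metadata": timestamped_media,
    }

# Example: image queries over processed output, no timestamps needed.
print(suggest_retrieval_design(False, False, True, False))
```

&lt;p&gt;Answering the questions once, explicitly, keeps the later service choices consistent.&lt;/p&gt;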

&lt;h2&gt;
  
  
  Final thoughts
&lt;/h2&gt;

&lt;p&gt;For me, multimodal knowledge bases are the point where multimodal AI becomes operational.&lt;/p&gt;

&lt;p&gt;They connect processed or stored media-rich content to retrieval, and they make it possible to build AI systems that are grounded in more than just text. With Amazon Bedrock Knowledge Bases, AWS now supports multimodal ingestion and retrieval across images, audio, video, and text, along with query-time metadata that can point to the right file type and even the right media segment.&lt;/p&gt;

&lt;p&gt;That makes this layer very important.&lt;/p&gt;

&lt;p&gt;Because once retrieval improves, the answers improve.&lt;/p&gt;

&lt;p&gt;And once the answers improve, the AI system becomes much more useful.&lt;/p&gt;

&lt;p&gt;In the next article, I would move to the next logical topic:&lt;/p&gt;

&lt;p&gt;How to use multimodal retrieval in a real RAG workflow on AWS.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>datascience</category>
      <category>awsbigdata</category>
    </item>
    <item>
      <title>AWS Data &amp; AI Stories #02: Amazon Bedrock Data Automation</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Sat, 18 Apr 2026 05:01:56 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-data-ai-stories-02-amazon-bedrock-data-automation-1gg7</link>
      <guid>https://forem.com/aws-builders/aws-data-ai-stories-02-amazon-bedrock-data-automation-1gg7</guid>
      <description>&lt;p&gt;In the first article, I talked about multimodal AI at a high level.&lt;/p&gt;

&lt;p&gt;Now it is time to go one step deeper.&lt;/p&gt;

&lt;p&gt;When we say multimodal AI, one of the first real challenges is not the model itself. The first challenge is the data. In most environments, the input is messy, unstructured, and spread across different formats such as documents, images, audio, and video. Amazon Bedrock Data Automation, or BDA, is designed for exactly this problem: extracting useful insights from unstructured multimodal content and turning it into structured output that applications can use.&lt;/p&gt;

&lt;p&gt;For me, BDA is not the “chat” layer. It is the processing layer.&lt;/p&gt;

&lt;p&gt;That is what makes it important.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Amazon Bedrock Data Automation?
&lt;/h2&gt;

&lt;p&gt;Amazon Bedrock Data Automation is a managed AWS capability that automates insight generation from unstructured content such as documents, images, audio, and video. Instead of building separate extraction pipelines for each format, you can use BDA to generate structured outputs from multimodal input in a more consistent way.&lt;/p&gt;

&lt;p&gt;This is useful because many AI projects fail at the beginning.&lt;/p&gt;

&lt;p&gt;Not because the model is weak, but because the source data is not ready.&lt;/p&gt;

&lt;p&gt;Think about a few common examples:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;scanned PDFs&lt;/li&gt;
&lt;li&gt;invoices&lt;/li&gt;
&lt;li&gt;screenshots&lt;/li&gt;
&lt;li&gt;call recordings&lt;/li&gt;
&lt;li&gt;inspection videos&lt;/li&gt;
&lt;li&gt;photos from the field&lt;/li&gt;
&lt;li&gt;reports with mixed text and visuals&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These are all valuable, but none of them are naturally clean inputs for downstream AI.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why does BDA matter?
&lt;/h2&gt;

&lt;p&gt;Because AI systems need structure.&lt;/p&gt;

&lt;p&gt;Before you build search, RAG, analytics, or assistants, you usually need to answer a simpler question:&lt;/p&gt;

&lt;p&gt;How do I turn raw content into usable information?&lt;/p&gt;

&lt;p&gt;That is where BDA fits.&lt;/p&gt;

&lt;p&gt;AWS describes BDA as a service that can produce both standard output and custom output depending on the use case. Standard output gives predefined insights for a data type, while custom output lets you define tailored extraction logic. This makes BDA useful not only for generic processing, but also for business-specific workflows.&lt;/p&gt;

&lt;p&gt;So in practical terms, BDA can help when you want to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;extract content from complex documents&lt;/li&gt;
&lt;li&gt;summarize audio or video&lt;/li&gt;
&lt;li&gt;generate structured metadata&lt;/li&gt;
&lt;li&gt;prepare content for retrieval&lt;/li&gt;
&lt;li&gt;feed another AI workflow&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How I would position BDA in an AWS architecture
&lt;/h2&gt;

&lt;p&gt;I would place BDA near the beginning of the workflow.&lt;/p&gt;

&lt;p&gt;A simple view looks like this:&lt;/p&gt;

&lt;p&gt;Input data → BDA processing → structured output → storage/indexing → retrieval/generation&lt;/p&gt;

&lt;p&gt;This is also how AWS examples position it. In AWS guidance and solution examples, BDA is commonly used after content lands in S3, and before services such as Knowledge Bases, vector stores, or agentic applications use the extracted results.&lt;/p&gt;
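&lt;p&gt;To make the flow concrete, here is a minimal sketch of the request for the BDA step. I am hedging here: the field names follow the general shape of the asynchronous BDA API, but they should be verified against the current boto3 reference before use.&lt;/p&gt;

```python
import json

# Sketch of the "content lands in S3, BDA processes it" step.
# Field names are indicative, not guaranteed to match the live API.
def build_bda_request(input_s3_uri, output_s3_uri, project_arn):
    return {
        "inputConfiguration": {"s3Uri": input_s3_uri},
        "outputConfiguration": {"s3Uri": output_s3_uri},
        "dataAutomationConfiguration": {"dataAutomationProjectArn": project_arn},
    }

request = build_bda_request(
    "s3://example-bucket/incoming/report.pdf",   # raw multimodal input
    "s3://example-bucket/bda-output/",           # structured results land here
    "arn:aws:bedrock:us-east-1:111122223333:data-automation-project/example",
)
# A real call would then hand this to the async BDA runtime client, e.g.
# boto3.client("bedrock-data-automation-runtime") and its async invoke API.
print(json.dumps(request, indent=2))
```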

&lt;p&gt;So if Part 1 was about multimodal AI, Part 2 is about making multimodal content usable.&lt;/p&gt;

&lt;h2&gt;
  
  
  Main concepts to understand
&lt;/h2&gt;

&lt;p&gt;There are two core ideas in BDA that matter most: projects and blueprints.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Projects
&lt;/h3&gt;

&lt;p&gt;A project is the main configuration container in BDA. AWS documentation describes it as the grouping that holds standard and optional custom output settings for processing. When you call the async API with a project ARN, BDA uses that project’s configuration to process the file and produce the defined outputs.&lt;/p&gt;

&lt;p&gt;In simple terms, a project is where you define how BDA should behave for your use case.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Blueprints
&lt;/h3&gt;

&lt;p&gt;Blueprints are what make custom extraction possible. AWS documentation explains that blueprints define the extraction logic and output format for custom outputs, allowing you to tailor BDA to your own business fields and data structures.&lt;/p&gt;

&lt;p&gt;This is one of the most valuable parts of BDA.&lt;/p&gt;

&lt;p&gt;Because in real projects, we usually do not want only generic output. We want specific fields, specific structure, and specific business meaning.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;invoice number&lt;/li&gt;
&lt;li&gt;customer name&lt;/li&gt;
&lt;li&gt;incident category&lt;/li&gt;
&lt;li&gt;equipment ID&lt;/li&gt;
&lt;li&gt;inspection result&lt;/li&gt;
&lt;li&gt;priority level&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That is where blueprints become important.&lt;/p&gt;
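&lt;p&gt;As an illustration only (the real blueprint format is defined in the BDA documentation and is richer than this), the core idea of a blueprint is a named set of fields with extraction instructions:&lt;/p&gt;

```python
import json

# Hypothetical, blueprint-style field definition for an invoice use case.
# It shows the concept of declaring fields plus instructions, not the
# exact BDA blueprint schema.
invoice_blueprint = {
    "name": "invoice-extraction",
    "fields": {
        "invoice_number": {"type": "string",
                           "instruction": "The invoice ID printed on the document"},
        "customer_name": {"type": "string",
                          "instruction": "Legal name of the billed customer"},
        "priority_level": {"type": "string",
                           "instruction": "One of: low, medium, high"},
    },
}
print(json.dumps(invoice_blueprint, indent=2))
```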

&lt;h2&gt;
  
  
  Standard output vs custom output
&lt;/h2&gt;

&lt;p&gt;This is one of the most important design choices.&lt;/p&gt;

&lt;h3&gt;
  
  
  Standard output
&lt;/h3&gt;

&lt;p&gt;Standard output is faster to start with. AWS says it provides predefined insights based on the data type being processed, such as document semantics, audio transcripts, or video summaries and chapter summaries.&lt;/p&gt;

&lt;p&gt;This is a good option when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;you want speed&lt;/li&gt;
&lt;li&gt;you are validating a use case&lt;/li&gt;
&lt;li&gt;you do not need very specific extraction fields yet&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Custom output
&lt;/h3&gt;

&lt;p&gt;Custom output is for more targeted use cases. With blueprints, you define the extraction logic and expected structure so the output matches your business need more closely. AWS has also added features such as blueprint instruction optimization to improve custom extraction accuracy using example assets and ground-truth labels.&lt;/p&gt;

&lt;p&gt;This is a better option when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;you need specific fields&lt;/li&gt;
&lt;li&gt;you need consistency&lt;/li&gt;
&lt;li&gt;you are building a production workflow&lt;/li&gt;
&lt;li&gt;your documents or media are domain-specific&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For me, the normal journey is:&lt;br&gt;
start with standard output, then move to custom output when the use case becomes clearer.&lt;/p&gt;

&lt;h2&gt;
  
  
  What types of content can BDA process?
&lt;/h2&gt;

&lt;p&gt;BDA is built for multimodal content. AWS documentation and product pages describe support for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;documents&lt;/li&gt;
&lt;li&gt;images&lt;/li&gt;
&lt;li&gt;audio&lt;/li&gt;
&lt;li&gt;video&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That matters because many organizations have all four.&lt;/p&gt;

&lt;p&gt;And the value is not only in “understanding” each file type independently. The real value is creating a single processing layer for all of them.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where BDA fits best
&lt;/h2&gt;

&lt;p&gt;I think BDA is strongest in these scenarios:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Intelligent document processing
&lt;/h3&gt;

&lt;p&gt;AWS has positioned BDA strongly for document-heavy workflows, and AWS blog content around intelligent document processing shows it being used to accelerate extraction and automation for business documents.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Preparing content for RAG
&lt;/h3&gt;

&lt;p&gt;AWS examples show BDA being used before Knowledge Bases and vector indexing so that multimodal content can be turned into cleaner, more useful retrieval input.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Audio and video understanding
&lt;/h3&gt;

&lt;p&gt;BDA can generate outputs such as transcripts and summaries from audio and video, and AWS recently expanded it with custom vocabulary support through Data Automation Library to improve speech recognition accuracy for domain-specific terms.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Compliance and content review
&lt;/h3&gt;

&lt;p&gt;AWS has also shown BDA in workflows such as extracting attachment content for later PII detection and redaction with Guardrails, which makes it relevant beyond simple summarization.&lt;/p&gt;

&lt;h2&gt;
  
  
  A practical workflow example
&lt;/h2&gt;

&lt;p&gt;Let’s take a simple example.&lt;/p&gt;

&lt;p&gt;Imagine you are building a support or operations workflow.&lt;/p&gt;

&lt;p&gt;The input may include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a maintenance PDF&lt;/li&gt;
&lt;li&gt;a photo from the field&lt;/li&gt;
&lt;li&gt;a voice note from an engineer&lt;/li&gt;
&lt;li&gt;a short inspection video&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without a processing layer, each file stays isolated.&lt;/p&gt;

&lt;p&gt;With BDA, the system can extract usable outputs from these files, and the rest of the architecture can work with those results more easily. Those extracted outputs can then be stored, indexed, sent to a knowledge base, or used by an agentic workflow. This is consistent with AWS’s documented BDA flow and solution examples that combine S3, BDA, Knowledge Bases, OpenSearch, and AgentCore.&lt;/p&gt;

&lt;p&gt;That is why I see BDA as the bridge between raw content and usable AI workflows.&lt;/p&gt;
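&lt;p&gt;Before any of that, the incoming batch usually needs to be grouped by modality. A small, self-contained sketch of that pre-processing step for the files above (the grouping logic is mine, not part of BDA):&lt;/p&gt;

```python
import mimetypes
from collections import defaultdict

# Illustrative pre-processing: bucket incoming files by modality before
# handing them to a processing layer.
def modality_of(filename):
    mime, _ = mimetypes.guess_type(filename)
    if mime == "application/pdf":
        return "document"
    top = (mime or "").split("/")[0]
    return top if top in ("image", "audio", "video") else "other"

def group_by_modality(filenames):
    groups = defaultdict(list)
    for name in filenames:
        groups[modality_of(name)].append(name)
    return dict(groups)

batch = group_by_modality(
    ["maintenance.pdf", "field-photo.jpg", "voice-note.mp3", "inspection.mp4"]
)
print(batch)
```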

&lt;h2&gt;
  
  
  Newer capabilities worth watching
&lt;/h2&gt;

&lt;p&gt;Two additions make BDA more interesting for real projects.&lt;/p&gt;

&lt;p&gt;First, AWS added blueprint instruction optimization, which helps improve custom field extraction accuracy using example documents and labels. This is useful because custom extraction often needs tuning before it becomes reliable.&lt;/p&gt;

&lt;p&gt;Second, AWS added custom vocabulary through Data Automation Library for audio and video processing. This helps when your environment uses domain-specific terms, product names, or technical language that general transcription may miss.&lt;/p&gt;

&lt;p&gt;These are good signs that BDA is moving from “interesting feature” toward “serious processing layer.”&lt;/p&gt;

&lt;h2&gt;
  
  
  Things to keep in mind
&lt;/h2&gt;

&lt;p&gt;BDA is powerful, but I would still keep a few points in mind.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Start simple
&lt;/h3&gt;

&lt;p&gt;AWS recommends starting with standard output if you are new to the service. That makes sense because it helps validate the value quickly before you invest in custom extraction logic.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Design for the business output
&lt;/h3&gt;

&lt;p&gt;Do not begin with “what model do I want?”&lt;br&gt;
Begin with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;what fields do I need?&lt;/li&gt;
&lt;li&gt;what decision will use this output?&lt;/li&gt;
&lt;li&gt;what system will consume it?&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  3. Watch input requirements
&lt;/h3&gt;

&lt;p&gt;AWS documents prerequisites and file requirements for BDA, including file-specific constraints that differ by content type and processing mode.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Treat prompts and blueprints carefully
&lt;/h3&gt;

&lt;p&gt;AWS explicitly notes that blueprint prompt input should come from trusted sources, which is an important reminder for secure enterprise design.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final thoughts
&lt;/h2&gt;

&lt;p&gt;For me, Amazon Bedrock Data Automation is one of the most practical pieces in the current AWS multimodal stack.&lt;/p&gt;

&lt;p&gt;It does not replace retrieval, RAG, or agents.&lt;/p&gt;

&lt;p&gt;It enables them.&lt;/p&gt;

&lt;p&gt;If multimodal AI is the bigger vision, BDA is one of the first services that helps turn that vision into a usable workflow. It helps convert raw documents, images, audio, and video into outputs that the rest of your architecture can actually work with.&lt;/p&gt;

&lt;p&gt;That is why I would not treat BDA as a side feature.&lt;/p&gt;

&lt;p&gt;I would treat it as a foundational building block.&lt;/p&gt;

&lt;p&gt;In the next article, I will move one step further and focus on multimodal knowledge bases and how retrieval fits after the processing layer.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>datascience</category>
      <category>awsbigdata</category>
    </item>
    <item>
      <title>AWS Data &amp; AI Stories #01: Multimodal AI</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Thu, 16 Apr 2026 18:49:58 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-data-ai-stories-01-multimodal-ai-2k4k</link>
      <guid>https://forem.com/aws-builders/aws-data-ai-stories-01-multimodal-ai-2k4k</guid>
      <description>&lt;p&gt;In traditional AI systems, text was usually the main input.&lt;/p&gt;

&lt;p&gt;But to solve real-life problems, text alone is not enough.&lt;/p&gt;

&lt;p&gt;Today, many workloads include documents, images, audio, and video at the same time. A user may upload a PDF report, attach a photo, send a voice note, or provide a short video clip. If our solution only understands text, we miss a big part of the context.&lt;/p&gt;

&lt;p&gt;This is where multimodal AI becomes important.&lt;/p&gt;

&lt;p&gt;On AWS, multimodal AI is now becoming more practical. Amazon Bedrock Knowledge Bases supports multimodal content such as images, audio, and video, and AWS now provides different processing approaches depending on whether the goal is retrieval or structured extraction.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Multimodal AI?
&lt;/h2&gt;

&lt;p&gt;Multimodal AI means an AI system can work with more than one type of data.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;text&lt;/li&gt;
&lt;li&gt;images&lt;/li&gt;
&lt;li&gt;scanned documents&lt;/li&gt;
&lt;li&gt;audio&lt;/li&gt;
&lt;li&gt;video&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Instead of focusing on only one format, the system can process and combine multiple data types to produce better results.&lt;/p&gt;

&lt;p&gt;This is useful because enterprise data is rarely pure text. A lot of business value sits inside screenshots, scanned forms, call recordings, diagrams, inspection videos, and media-rich documents. AWS’s current multimodal stack is built around exactly that problem.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why does it matter?
&lt;/h2&gt;

&lt;p&gt;Because real systems are multimodal by nature.&lt;/p&gt;

&lt;p&gt;Think about a few examples:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A support team receives images and voice notes from the field&lt;/li&gt;
&lt;li&gt;A finance team works with reports, charts, and scanned documents&lt;/li&gt;
&lt;li&gt;A healthcare team uses forms, reports, and medical images&lt;/li&gt;
&lt;li&gt;An industrial operation stores inspection photos, maintenance PDFs, and recorded operator observations&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In all of these cases, text-only AI is limited.&lt;/p&gt;

&lt;p&gt;A multimodal approach helps us move from isolated files to connected understanding.&lt;/p&gt;

&lt;h2&gt;
  
  
  Multimodal AI on AWS
&lt;/h2&gt;

&lt;p&gt;When I look at AWS from a practical point of view, I see multimodal AI as a workflow, not just a model.&lt;/p&gt;

&lt;p&gt;A simple logical flow looks like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Collect multimodal data&lt;/li&gt;
&lt;li&gt;Process and extract useful information&lt;/li&gt;
&lt;li&gt;Store or index the result&lt;/li&gt;
&lt;li&gt;Retrieve relevant context&lt;/li&gt;
&lt;li&gt;Generate answers, summaries, or actions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AWS already has building blocks for this. Amazon Bedrock Data Automation is designed to extract structured insights from documents, images, audio, and video. Amazon Bedrock Knowledge Bases supports multimodal retrieval. AWS also supports Nova Multimodal Embeddings for visual and cross-modal retrieval scenarios.&lt;/p&gt;

&lt;h2&gt;
  
  
  Main AWS services to know
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Amazon Bedrock Data Automation
&lt;/h3&gt;

&lt;p&gt;This is useful when the main challenge is understanding raw input.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;extracting information from documents&lt;/li&gt;
&lt;li&gt;analyzing images&lt;/li&gt;
&lt;li&gt;processing audio&lt;/li&gt;
&lt;li&gt;turning video into structured output&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So if the input is messy and unstructured, this is a strong starting point. AWS positions Bedrock Data Automation specifically for automating insight generation from unstructured multimodal content.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Amazon Bedrock Knowledge Bases
&lt;/h3&gt;

&lt;p&gt;This is useful when the goal is retrieval.&lt;/p&gt;

&lt;p&gt;If you want your AI application to search your content and answer questions based on it, Knowledge Bases becomes important. AWS documentation now states that Bedrock Knowledge Bases supports images, audio, and video in addition to traditional text-based content.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Amazon Nova Multimodal Embeddings
&lt;/h3&gt;

&lt;p&gt;This is useful when the goal is similarity and cross-modal search.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;finding images similar to another image&lt;/li&gt;
&lt;li&gt;searching media with text&lt;/li&gt;
&lt;li&gt;creating a shared semantic space across different content types&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Amazon Nova Multimodal Embeddings supports text, documents, images, video, and audio in a single embedding space, which makes cross-modal retrieval possible.&lt;/p&gt;
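&lt;p&gt;The practical payoff of one shared embedding space is that cross-modal retrieval reduces to vector comparison. A minimal sketch, with mock vectors standing in for real Nova embeddings:&lt;/p&gt;

```python
import math

# Mock 3-dimensional vectors standing in for embeddings of a text query
# and two media assets; real embeddings would come from the model.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

text_query = [0.9, 0.1, 0.0]
assets = {
    "photo": [0.8, 0.2, 0.1],    # semantically close to the query
    "diagram": [0.0, 0.2, 0.9],  # unrelated content
}

# Cross-modal retrieval: rank media by similarity to a text query.
ranked = sorted(assets, key=lambda name: cosine(text_query, assets[name]),
                reverse=True)
print(ranked)  # the photo ranks above the diagram
```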

&lt;h2&gt;
  
  
  A simple architecture view
&lt;/h2&gt;

&lt;p&gt;At a high level, a multimodal AI architecture on AWS can look like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Data comes from users, applications, devices, or storage&lt;/li&gt;
&lt;li&gt;Files are stored in Amazon S3&lt;/li&gt;
&lt;li&gt;Bedrock Data Automation extracts useful content and structure&lt;/li&gt;
&lt;li&gt;Bedrock Knowledge Bases indexes or connects relevant knowledge&lt;/li&gt;
&lt;li&gt;Nova Multimodal Embeddings can support semantic and visual retrieval&lt;/li&gt;
&lt;li&gt;A Bedrock-based application or assistant generates the final output&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This approach is also reflected in AWS guidance and recent AWS machine learning posts around multimodal retrieval and multimodal assistants.&lt;/p&gt;

&lt;h2&gt;
  
  
  Retrieval or extraction?
&lt;/h2&gt;

&lt;p&gt;This is one of the first design questions I would ask.&lt;/p&gt;

&lt;p&gt;Do I want to extract information from the content?&lt;/p&gt;

&lt;p&gt;Or do I want to retrieve relevant content across multiple modalities?&lt;/p&gt;

&lt;p&gt;These are not exactly the same problem.&lt;/p&gt;

&lt;p&gt;If the main need is converting raw media into structured output, Bedrock Data Automation is usually the right starting point.&lt;/p&gt;

&lt;p&gt;If the main need is visual similarity or cross-modal search, Nova Multimodal Embeddings is often the better fit.&lt;/p&gt;

&lt;p&gt;AWS explicitly separates these two approaches in its multimodal guidance, which is useful because many teams try to solve all multimodal problems in the same way.&lt;/p&gt;
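&lt;p&gt;As a simplification of that guidance (my own mapping, not an official decision tree), the routing can be written as a tiny helper:&lt;/p&gt;

```python
# Illustrative routing from a multimodal need to a starting service,
# following the guidance above; a simplification, not an AWS rule.
def starting_point(goal):
    if goal == "structured_extraction":
        return "Bedrock Data Automation"
    if goal in ("visual_similarity", "cross_modal_search"):
        return "Nova Multimodal Embeddings"
    if goal == "grounded_qa":
        return "Bedrock Knowledge Bases"
    return "clarify the requirement first"

print(starting_point("structured_extraction"))  # Bedrock Data Automation
```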

&lt;h2&gt;
  
  
  Where can this be used?
&lt;/h2&gt;

&lt;p&gt;There are many practical scenarios.&lt;/p&gt;

&lt;p&gt;A few examples:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;intelligent document processing&lt;/li&gt;
&lt;li&gt;visual search&lt;/li&gt;
&lt;li&gt;media search&lt;/li&gt;
&lt;li&gt;support case analysis&lt;/li&gt;
&lt;li&gt;industrial inspection workflows&lt;/li&gt;
&lt;li&gt;knowledge assistants&lt;/li&gt;
&lt;li&gt;report summarization from mixed content&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For me, the important point is this: multimodal AI is not only about chat. It is also about turning different content types into usable business knowledge.&lt;/p&gt;

&lt;h2&gt;
  
  
  Common mistake
&lt;/h2&gt;

&lt;p&gt;A common mistake is to think multimodal AI means only “upload file and ask question.”&lt;/p&gt;

&lt;p&gt;That is too simple.&lt;/p&gt;

&lt;p&gt;A real solution usually needs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ingestion&lt;/li&gt;
&lt;li&gt;extraction&lt;/li&gt;
&lt;li&gt;indexing&lt;/li&gt;
&lt;li&gt;retrieval&lt;/li&gt;
&lt;li&gt;generation&lt;/li&gt;
&lt;li&gt;governance&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The model is only one part of the story.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final thoughts
&lt;/h2&gt;

&lt;p&gt;Multimodal AI is becoming a practical architecture topic on AWS.&lt;/p&gt;

&lt;p&gt;Instead of treating text, images, audio, and video as separate worlds, we can now build workflows that connect them. AWS already provides managed building blocks for processing, retrieval, and generation across these content types, which makes multimodal design much more realistic than before.&lt;/p&gt;

&lt;p&gt;For me, the first step is not choosing the model.&lt;/p&gt;

&lt;p&gt;The first step is asking:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What type of data do I have?&lt;/li&gt;
&lt;li&gt;What insight do I need?&lt;/li&gt;
&lt;li&gt;Do I need extraction, retrieval, or both?&lt;/li&gt;
&lt;li&gt;Do I want an assistant, a search engine, or a workflow?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If these answers are clear, the architecture becomes much easier.&lt;/p&gt;

&lt;p&gt;In the next article, I will focus on Amazon Bedrock Data Automation and how it fits into a real multimodal workflow on AWS.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ai</category>
      <category>datascience</category>
      <category>awsbigdata</category>
    </item>
    <item>
      <title>Achieving IT/OT Convergence with Azure Cloud</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Thu, 12 Sep 2024 15:45:38 +0000</pubDate>
      <link>https://forem.com/sdtslmn/achieving-itot-convergence-with-azure-cloud-b3j</link>
      <guid>https://forem.com/sdtslmn/achieving-itot-convergence-with-azure-cloud-b3j</guid>
      <description>&lt;p&gt;The distinction between Information Technology (IT) and Operational Technology (OT) is becoming increasingly blurred in today's rapidly evolving industrial landscape. IT/OT convergence is now crucial for organizations aiming to enhance operational efficiency, improve decision-making, and bolster security. Integrating IT and OT systems allows organizations to unlock new levels of innovation and productivity. Azure Cloud leads this transformation, providing a comprehensive suite of tools and services designed to seamlessly integrate and optimize IT and OT systems, driving the future of industrial operations.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Synergy of IT and OT: Benefits of Convergence
&lt;/h3&gt;

&lt;p&gt;The importance of IT lies in its ability to streamline business processes, enhance data management, and improve decision-making, while OT is crucial for ensuring the efficiency, safety, and reliability of industrial operations. Integrating IT and OT systems is essential for leveraging the strengths of both domains, leading to improved operational efficiency, enhanced security, and comprehensive data-driven insights.&lt;/p&gt;

&lt;p&gt;The integration of Information Technology (IT) and Operational Technology (OT) systems yields substantial benefits, fundamentally transforming industrial operations. Utilizing frameworks like IEC 62443 and the Purdue Model ensures a secure, segmented network architecture that enhances both operational efficiency and cybersecurity.&lt;/p&gt;

&lt;p&gt;Here’s a detailed look at the technical benefits of IT/OT convergence:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Enhanced Decision-Making:&lt;/strong&gt; Integrating IT and OT systems facilitates comprehensive data aggregation and analysis from both business operations and industrial processes. This fusion enables advanced data analytics and business intelligence applications, enhancing situational awareness and supporting data-driven decision-making processes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Increased Operational Efficiency:&lt;/strong&gt; The convergence of IT and OT automates and streamlines workflows, significantly reducing manual intervention. This automation enhances system reliability and operational efficiency, optimizing resource allocation and improving response times. Advanced solutions, such as ERP and Manufacturing Execution Systems (MES), integrated with OT systems, drive operational excellence.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Robust Security:&lt;/strong&gt; Adhering to the IEC 62443 standard provides a robust framework for securing industrial automation and control systems (IACS). This standard ensures comprehensive security controls across IT and OT environments, mitigating cyber threats and vulnerabilities. The Purdue Model further reinforces security by segmenting networks into distinct layers, preventing lateral movement of threats and maintaining system integrity.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Predictive Maintenance:&lt;/strong&gt; Leveraging machine learning algorithms and advanced analytics, IT/OT convergence facilitates predictive maintenance strategies. These strategies utilize historical and real-time data to predict equipment failures, enabling proactive maintenance scheduling. This approach reduces unscheduled downtimes, minimizes maintenance costs, and extends the lifecycle of critical assets.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Real-Time Analytics:&lt;/strong&gt; Real-time data processing and analytics are crucial benefits of IT/OT integration. Continuous monitoring and instantaneous data analysis enable rapid detection and response to anomalies, ensuring uninterrupted operations. High-frequency data collection from OT devices, combined with IT’s analytical capabilities, supports real-time optimization and process control.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Addressing OT Design Limitations:&lt;/strong&gt; Traditional OT systems often suffer from legacy infrastructure constraints and limited integration capabilities. IT/OT convergence overcomes these limitations by modernizing OT environments with advanced IT solutions. This modernization enhances system scalability, flexibility, and overall performance, ensuring that OT systems can support contemporary industrial demands.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7. Structured Network Segmentation:&lt;/strong&gt; The Purdue Model’s hierarchical approach to network architecture enhances security by creating distinct functional layers. This segmentation reduces the risk of widespread cyber threats, ensuring that security breaches in one layer do not compromise the entire network. By maintaining strict network boundaries, organizations can effectively protect critical infrastructure and maintain operational continuity.&lt;/p&gt;

&lt;p&gt;By converging IT and OT systems, organizations can unlock the full potential of their data, streamline operational processes, and fortify their cybersecurity posture. Azure Cloud’s comprehensive suite of tools and services facilitates this integration, empowering businesses to achieve enhanced efficiency, innovation, and resilience in their industrial operations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenges in IT/OT Convergence
&lt;/h3&gt;

&lt;p&gt;Achieving the convergence of IT and Operational Technology (OT) presents several significant challenges. These hurdles stem from the fundamental differences in the priorities, systems, and operational protocols of IT and OT environments. Successfully integrating these domains requires overcoming technical, cultural, and regulatory barriers while ensuring that the integrity and reliability of both IT and OT systems are maintained.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Cultural Differences:&lt;/strong&gt; IT and OT teams have traditionally operated in separate silos, each with distinct priorities and workflows. IT focuses on data management, cybersecurity, and compliance, while OT prioritizes system reliability, uptime, and safety. Bridging these cultural gaps necessitates significant organizational change and collaboration to create a unified approach.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Legacy Systems:&lt;/strong&gt; Many OT environments rely on outdated infrastructure that may not be compatible with modern IT systems. These legacy systems often lack the necessary interfaces for integration, making seamless data exchange and interoperability challenging. Upgrading or replacing these systems can be both costly and complex.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Security Risks:&lt;/strong&gt; Integrating OT systems with IT networks increases the attack surface, exposing critical industrial systems to potential cyber threats. OT systems, traditionally designed for isolated operation, may not have robust security measures in place. Ensuring comprehensive cybersecurity across both IT and OT domains is a significant challenge.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Complexity of Integration:&lt;/strong&gt; Achieving IT/OT convergence involves integrating diverse systems and protocols, each with its own set of requirements and constraints. This complexity can lead to integration challenges, such as data incompatibility, communication issues, and synchronization problems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Regulatory Compliance:&lt;/strong&gt; OT environments often operate under stringent regulatory requirements to ensure safety and reliability. Integrating these systems with IT must not compromise compliance with industry standards and regulations. Navigating this regulatory landscape while implementing convergence can be challenging.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Operational Disruptions:&lt;/strong&gt; The process of integrating IT and OT systems can cause operational disruptions, especially if not carefully managed. Downtime during the integration process can lead to significant production losses and impact business continuity.&lt;/p&gt;

&lt;h3&gt;
  
  
  How Azure Supports IT/OT Convergence
&lt;/h3&gt;

&lt;p&gt;The convergence of IT and Operational Technology (OT) is essential for modern industrial operations, driving efficiency, security, and innovation. Azure Cloud offers a comprehensive suite of tools and services designed to facilitate this integration, aligning with industry standards such as IEC 62443 and leveraging advanced technologies to bridge the gap between IT and OT systems. Here’s how Azure supports IT/OT convergence:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Azure IoT Hub and Digital Twins:&lt;/strong&gt; Azure IoT Hub serves as a central platform to connect, monitor, and manage IoT devices, ensuring seamless data flow between IT and OT environments. This integration enables real-time data processing and advanced analytics, essential for informed decision-making. Azure Digital Twins takes this further by creating digital representations of physical environments, allowing for detailed modeling, simulation, and analysis. These tools facilitate the integration of OT data into IT systems, enhancing operational insights and optimizing industrial processes.&lt;/p&gt;
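&lt;p&gt;To make the idea concrete, here is a minimal Python sketch of a digital twin (not actual Azure Digital Twins SDK code; the device and property names are invented) showing how a twin mirrors the last reported state of a physical asset so IT systems can query it without touching the OT network:&lt;/p&gt;

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Minimal digital-twin sketch: mirrors the last reported state
    of a physical pump so IT systems can query it without touching OT."""
    device_id: str
    temperature_c: float = 0.0
    rpm: int = 0
    history: list = field(default_factory=list)

    def apply_telemetry(self, reading: dict) -> None:
        # In a real deployment, readings would arrive via Azure IoT Hub;
        # here we apply them directly to keep the example self-contained.
        self.temperature_c = reading["temperature_c"]
        self.rpm = reading["rpm"]
        self.history.append(reading)

    def is_overheating(self, limit_c: float = 80.0) -> bool:
        return self.temperature_c > limit_c

twin = PumpTwin(device_id="pump-01")
twin.apply_telemetry({"temperature_c": 85.2, "rpm": 1450})
print(twin.is_overheating())  # True
```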

&lt;p&gt;&lt;strong&gt;2. Azure Security Center and Defender for IoT:&lt;/strong&gt; Security is paramount in IT/OT convergence, particularly due to the unique vulnerabilities in OT systems. Azure Security Center provides comprehensive security management and advanced threat protection across hybrid environments, ensuring continuous assessment and actionable insights. Azure Defender for IoT extends these capabilities specifically to OT environments, offering asset discovery, vulnerability management, and continuous threat monitoring. It integrates seamlessly with existing OT infrastructure, ensuring no disruption to operational processes while adhering to IEC 62443 standards, which provide guidelines for securing industrial automation and control systems (IACS).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Azure Monitor and Sentinel:&lt;/strong&gt; Azure Monitor provides full-stack monitoring, collecting and analyzing data from both IT and OT systems to deliver actionable insights. This is crucial for maintaining operational efficiency and quickly addressing anomalies. Azure Sentinel, a cloud-native SIEM (Security Information and Event Management) solution, enhances security with intelligent threat detection and automated response capabilities. By integrating with various data sources, including OT systems, Azure Sentinel offers a unified security posture, ensuring compliance with standards like IEC 62443 and NIST SP 800-82, which are critical for industrial cybersecurity.&lt;/p&gt;
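&lt;p&gt;Sentinel analytics rules are normally written in KQL; the hedged Python sketch below (the hosts and threshold are invented) captures the same idea locally: flag any source producing an unusual number of failed logins within a batch of collected events:&lt;/p&gt;

```python
from collections import Counter

def detect_bruteforce(events: list, threshold: int = 5) -> list:
    """Flag source hosts with at least `threshold` failed logins.
    A stand-in for the kind of analytics rule Sentinel runs in KQL."""
    failures = Counter(e["host"] for e in events if e["outcome"] == "failure")
    return [host for host, n in failures.items() if n >= threshold]

events = (
    [{"host": "10.0.0.5", "outcome": "failure"}] * 6
    + [{"host": "10.0.0.9", "outcome": "failure"}] * 2
    + [{"host": "10.0.0.9", "outcome": "success"}]
)
print(detect_bruteforce(events))  # ['10.0.0.5']
```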

&lt;p&gt;&lt;strong&gt;4. Azure Kubernetes Service (AKS) and Azure Stack Edge:&lt;/strong&gt; Azure Kubernetes Service (AKS) supports the deployment and management of containerized applications across IT and OT environments, ensuring scalability and reliability essential for industrial applications. Azure Stack Edge brings cloud computing capabilities to the edge, enabling real-time processing and analytics close to the data source. This particularly benefits latency-sensitive OT applications, facilitating quick decision-making and operational efficiency. These services support the Purdue Model’s hierarchical network architecture, enhancing security and operational integrity across the integrated IT/OT landscape.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Compliance with Industry Standards:&lt;/strong&gt; Azure adheres to key industry standards such as IEC 62443, which provides a comprehensive framework for securing industrial automation and control systems. This ensures that both IT and OT environments are protected against cyber threats. Azure also aligns with ISA-95, which focuses on the integration of enterprise and control systems, supporting the structured network segmentation outlined in the Purdue Model. This compliance facilitates secure and efficient IT/OT convergence, maintaining operational continuity and integrity.&lt;/p&gt;

&lt;p&gt;By leveraging Azure’s extensive capabilities, organizations can overcome the challenges of IT/OT convergence, achieving enhanced operational efficiency, robust security, and innovative data-driven insights. Azure Cloud empowers businesses to seamlessly integrate their IT and OT systems, driving the future of industrial operations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;IT/OT convergence is vital for modern industries, driving operational efficiency, enhanced decision-making, and robust security. Integrating IT systems, which manage data and business processes, with OT systems, which control physical devices and industrial processes, unlocks significant benefits. Azure Cloud plays a crucial role in this integration by offering tools like Azure IoT Hub and Digital Twins for seamless data integration and real-time analytics, and Azure Security Center and Defender for IoT for comprehensive security. These tools adhere to industry standards such as IEC 62443, ensuring protection for both IT and OT environments.&lt;/p&gt;

&lt;p&gt;Moreover, Azure Monitor and Sentinel provide extensive monitoring and intelligent threat detection, maintaining operational integrity, while Azure Kubernetes Service (AKS) and Azure Stack Edge support scalable application deployment and edge computing, essential for latency-sensitive OT applications. By following industry standards and structured network segmentation outlined in the Purdue Model, Azure ensures a secure and efficient integration of IT and OT systems. This empowers organizations to overcome convergence challenges, enhancing operational efficiency, security, and innovation across their industrial operations.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Boosting Incident Response Capabilities with Azure: A Practical Guide</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Sat, 07 Sep 2024 07:12:22 +0000</pubDate>
      <link>https://forem.com/sdtslmn/boosting-incident-response-capabilities-with-azure-a-practical-guide-4gp6</link>
      <guid>https://forem.com/sdtslmn/boosting-incident-response-capabilities-with-azure-a-practical-guide-4gp6</guid>
      <description>&lt;p&gt;In today’s digital world, cybersecurity threats are a constant concern. Whether it’s ransomware, data breaches, or other cyberattacks, having an effective incident response plan is critical for every organization. Microsoft Azure offers a suite of tools that not only improves your ability to detect, respond to, and recover from security incidents but also helps ensure compliance with global regulations like ISO 27001, GDPR, NIS2, and IEC 62443.&lt;/p&gt;

&lt;p&gt;This guide will explore how Azure services can significantly boost your incident response capabilities while meeting regulatory requirements. We’ll also dive into a detailed incident response workflow that shows how Azure services can be leveraged at each step of the process.&lt;/p&gt;

&lt;h3&gt;
  
  
  Azure Sentinel: Real-Time Detection and Automated Response
&lt;/h3&gt;

&lt;p&gt;Azure Sentinel is a game-changer in threat detection and response. It’s a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) platform that enables organizations to detect threats in real time. By analyzing security data across your entire IT environment, whether it’s in Azure, on-premises, or with third-party systems, Sentinel helps spot potential threats before they escalate into larger incidents.&lt;/p&gt;

&lt;p&gt;One of the key strengths of Sentinel is its automation capabilities. Using playbooks, it automates routine response tasks like isolating compromised systems, sending alerts, or logging incidents. Automating these processes saves valuable time and reduces human error, which is critical for meeting compliance standards like NIS2, which requires prompt incident detection and response in critical infrastructure sectors.&lt;/p&gt;
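&lt;p&gt;Conceptually, a playbook is just a mapping from alert type to an ordered list of response steps. The Python sketch below illustrates the dispatch pattern; the action names are invented, and real Sentinel playbooks run as Logic Apps workflows rather than local functions:&lt;/p&gt;

```python
def isolate_host(alert):  # placeholder for a network-isolation action
    return f"isolated {alert['host']}"

def notify_team(alert):   # placeholder for an alerting/ticketing action
    return f"notified SOC about {alert['type']}"

# Each alert type maps to an ordered list of response steps.
PLAYBOOKS = {
    "ransomware": [isolate_host, notify_team],
    "phishing":   [notify_team],
}

def run_playbook(alert: dict) -> list:
    """Dispatch an alert to its playbook steps, mimicking how a
    Sentinel playbook automates containment and notification.
    Unknown alert types fall back to notifying the team."""
    return [step(alert) for step in PLAYBOOKS.get(alert["type"], [notify_team])]

print(run_playbook({"type": "ransomware", "host": "vm-web-01"}))
# ['isolated vm-web-01', 'notified SOC about ransomware']
```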

&lt;h3&gt;
  
  
  Azure Security Center: Keeping Your Environment Safe and Sound
&lt;/h3&gt;

&lt;p&gt;Azure Security Center (ASC) acts as your security control tower. It continuously monitors your Azure and hybrid environments for vulnerabilities, misconfigurations, and potential security threats. By leveraging Microsoft’s global threat intelligence, ASC helps you stay ahead of new and evolving threats.&lt;/p&gt;

&lt;p&gt;With ASC, you’ll not only detect threats but also receive actionable recommendations to fix issues before they can be exploited. This proactive approach is crucial for meeting security standards like ISO 27001 and IEC 62443, particularly for industries like energy and manufacturing, where operational technology (OT) environments need to be secured.&lt;/p&gt;

&lt;h3&gt;
  
  
  Azure Defender: Expanding Protection to Every Corner of Your Environment
&lt;/h3&gt;

&lt;p&gt;Azure Defender extends the protective capabilities of Azure Security Center, offering real-time protection for workloads such as virtual machines, containers, IoT devices, and databases. It helps detect vulnerabilities and suspicious activity, ensuring that your environment stays secure from potential threats.&lt;/p&gt;

&lt;p&gt;When integrated with Azure Sentinel, Azure Defender allows for a unified view of security incidents, making it easier to prioritize and act on threats. Its comprehensive protection is critical for compliance with standards like GDPR (which mandates the safeguarding of personal data) and IEC 62443, a standard focused on securing industrial control systems.&lt;/p&gt;

&lt;h3&gt;
  
  
  Centralized Monitoring with Azure Monitor and Log Analytics
&lt;/h3&gt;

&lt;p&gt;Centralized visibility is key during a security incident. Azure Monitor aggregates logs, metrics, and events from across your environment, providing real-time insights that help detect issues early. It offers a single pane of glass to monitor the health and performance of your infrastructure, enabling quicker incident detection and response.&lt;/p&gt;

&lt;p&gt;Azure Log Analytics, a component of Azure Monitor, enhances your ability to search through logs and identify patterns or threats. This is particularly useful for detailed investigations, helping to uncover the root causes of incidents. This centralized monitoring approach is vital for complying with regulations like NIS2, which requires continuous monitoring and timely reporting of incidents in critical infrastructure sectors.&lt;/p&gt;

&lt;h3&gt;
  
  
  Automating Incident Response with Azure Logic Apps
&lt;/h3&gt;

&lt;p&gt;Manual response steps cost precious time during an incident. Azure Logic Apps lets you codify those steps as automated workflows that trigger on security alerts, performing actions such as notifying the response team, opening a ticket, or disabling a compromised account without waiting for human intervention.&lt;/p&gt;

&lt;p&gt;Because Logic Apps underpins Azure Sentinel playbooks and connects to hundreds of services, response procedures become repeatable and auditable. That consistency supports standards like ISO 27001 and NIS2, which call for defined and timely incident handling processes.&lt;/p&gt;

&lt;h3&gt;
  
  
  Role-Based Access Control (RBAC) and Just-in-Time (JIT) Access: Controlling Access During Incidents
&lt;/h3&gt;

&lt;p&gt;During a security incident, limiting who has access to sensitive systems is crucial. Azure’s Role-Based Access Control (RBAC) ensures that only authorized personnel can access critical resources, aligning with ISO 27001 and IEC 62443 principles of least privilege.&lt;/p&gt;

&lt;p&gt;Azure’s Just-in-Time (JIT) access further improves security by allowing temporary access to key systems only when needed, reducing the attack surface during incident investigations. This minimizes potential exposure while helping organizations comply with regulations like NIS2, which emphasizes strong access controls to protect critical infrastructure.&lt;/p&gt;
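&lt;p&gt;The essence of JIT access is that every grant carries an expiry, so standing privileges never accumulate. A small, hedged Python sketch (not the Azure implementation; the user names are invented) makes the mechanic explicit:&lt;/p&gt;

```python
import time

class JITAccess:
    """Sketch of just-in-time access: each grant expires automatically,
    so no one holds standing privileges beyond the approved window."""
    def __init__(self):
        self._grants = {}  # user -> expiry timestamp

    def grant(self, user: str, ttl_seconds: float) -> None:
        self._grants[user] = time.monotonic() + ttl_seconds

    def has_access(self, user: str) -> bool:
        expiry = self._grants.get(user)
        return expiry is not None and time.monotonic() < expiry

jit = JITAccess()
jit.grant("responder@example.com", ttl_seconds=0.05)
print(jit.has_access("responder@example.com"))  # True
time.sleep(0.1)  # the window elapses...
print(jit.has_access("responder@example.com"))  # False
```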

&lt;h3&gt;
  
  
  Enforcing Security with Azure Policy and Compliance Management
&lt;/h3&gt;

&lt;p&gt;Misconfigurations and lack of policy enforcement can increase the risk of security incidents. Azure Policy helps by automating the enforcement of security policies across your resources, ensuring consistency and compliance. With pre-built compliance frameworks for standards like GDPR, NIS2, and IEC 62443, Azure Policy simplifies the task of staying aligned with regulatory requirements.&lt;/p&gt;

&lt;p&gt;Using Azure Blueprints, you can deploy pre-configured environments that are already compliant with regulatory frameworks like ISO 27001 or HIPAA, allowing you to quickly set up secure environments that meet audit requirements.&lt;/p&gt;
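&lt;p&gt;As an illustration of what a tag-enforcement policy checks, here is a local Python sketch (not Azure Policy's actual JSON rule language; the resource names and required tags are invented):&lt;/p&gt;

```python
def evaluate(resources, required_tags=("owner", "environment")):
    """Flag resources missing mandatory tags, mirroring the kind of
    'audit' finding an Azure Policy tag rule reports at scale."""
    findings = []
    for r in resources:
        missing = [t for t in required_tags if t not in r.get("tags", {})]
        if missing:
            findings.append((r["name"], missing))
    return findings

resources = [
    {"name": "vm-app-01", "tags": {"owner": "ops", "environment": "prod"}},
    {"name": "st-logs",   "tags": {"owner": "ops"}},
]
print(evaluate(resources))  # [('st-logs', ['environment'])]
```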

&lt;h3&gt;
  
  
  Protecting Identities with Azure Active Directory (AAD)
&lt;/h3&gt;

&lt;p&gt;In an incident response scenario, securing user identities is essential. Azure Active Directory (AAD) offers advanced identity protection features such as conditional access and Multi-Factor Authentication (MFA) to ensure that only the right people have access to sensitive resources. This helps organizations meet the strict identity management requirements of standards like ISO 27001, GDPR, and NIS2.&lt;/p&gt;

&lt;p&gt;Additionally, AAD’s Identity Protection feature alerts administrators to suspicious sign-in behaviors, while Privileged Identity Management (PIM) allows you to grant temporary elevated permissions during incident investigations, keeping access tightly controlled.&lt;/p&gt;

&lt;h3&gt;
  
  
  Azure Site Recovery: Ensuring Business Continuity
&lt;/h3&gt;

&lt;p&gt;When a serious incident or disaster strikes, having a robust recovery plan in place is critical to minimizing downtime. Azure Site Recovery provides a disaster recovery solution that replicates workloads to another region, ensuring that operations can quickly resume even in the event of a major breach or system failure.&lt;/p&gt;

&lt;p&gt;This ability to recover quickly ensures that you meet the requirements of standards like ISO 22301, GDPR, and NIS2, all of which mandate a strong disaster recovery plan to maintain business continuity during a crisis.&lt;/p&gt;

&lt;h3&gt;
  
  
  Incident Response Workflow Using Azure Services
&lt;/h3&gt;

&lt;p&gt;To put everything into action, here’s a simplified incident response workflow based on the NIST SP 800-61 framework, showing how Azure services fit into each phase:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Preparation: Set up policies, procedures, and playbooks using Azure Policy, Blueprints, and Security Center.&lt;/li&gt;
&lt;li&gt;Detection &amp;amp; Analysis: Detect potential incidents using Azure Sentinel, Azure Monitor, Azure Defender, and Log Analytics.&lt;/li&gt;
&lt;li&gt;Containment, Eradication &amp;amp; Recovery: Respond with automated workflows using Azure Logic Apps, backup data with Azure Backup, and recover with Azure Site Recovery.&lt;/li&gt;
&lt;li&gt;Post-Incident Activity: Review the incident, update policies, and strengthen defenses with insights from Azure Sentinel and Security Center.&lt;/li&gt;
&lt;/ol&gt;
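&lt;p&gt;The four phases above can be sketched as a simple state machine, with post-incident lessons feeding back into the next cycle's preparation (illustrative Python, not tied to any Azure API):&lt;/p&gt;

```python
# Phases of NIST SP 800-61, in order; a response moves forward through
# them, and post-incident review loops back into preparation.
PHASES = ["preparation", "detection_analysis",
          "containment_eradication_recovery", "post_incident"]

class Incident:
    def __init__(self):
        self.phase = PHASES[0]

    def advance(self) -> str:
        i = PHASES.index(self.phase)
        # wrap around: lessons learned feed the next cycle's preparation
        self.phase = PHASES[(i + 1) % len(PHASES)]
        return self.phase

inc = Incident()
print(inc.advance())  # 'detection_analysis'
print(inc.advance())  # 'containment_eradication_recovery'
```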

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;tr&gt;
&lt;th&gt;Incident Response Mechanism&lt;/th&gt;
&lt;th&gt;Azure Service&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Preparation&lt;/td&gt;
&lt;td&gt;Azure Policy, Blueprints, Security Center&lt;/td&gt;
&lt;td&gt;Enforce policies and monitor for vulnerabilities before an incident happens.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Detection &amp;amp; Analysis&lt;/td&gt;
&lt;td&gt;Azure Sentinel, Monitor, Defender, Log Analytics&lt;/td&gt;
&lt;td&gt;Centralized monitoring and advanced threat detection.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Containment, Eradication, and Recovery&lt;/td&gt;
&lt;td&gt;Azure Logic Apps, Backup, Site Recovery&lt;/td&gt;
&lt;td&gt;Automate responses, restore data, and ensure business continuity during an incident.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Post-Incident Activity&lt;/td&gt;
&lt;td&gt;Azure Sentinel, Security Center, Policy&lt;/td&gt;
&lt;td&gt;Conduct post-incident reviews, generate reports, and update security policies.&lt;/td&gt;
&lt;/tr&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h3&gt;
  
  
  Wrapping It Up
&lt;/h3&gt;

&lt;p&gt;Whether it’s through automation with Azure Logic Apps, real-time monitoring with Azure Sentinel, or enforcing security policies with Azure Policy, Microsoft Azure gives you the tools you need to stay ahead of cyber threats and ensure you’re ready to respond to incidents when they happen.&lt;/p&gt;

&lt;p&gt;Azure offers a comprehensive set of tools designed to enhance incident response capabilities, streamline processes, and ensure compliance with global standards like GDPR, NIS2, ISO 27001, and IEC 62443. By integrating these services into your incident response plan, your organization can detect and respond to threats more effectively, recover faster, and continually strengthen its security posture.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloud</category>
      <category>compliance</category>
      <category>security</category>
    </item>
    <item>
      <title>Navigating Hybrid Cloud: Integrating VMware with AWS Services</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Fri, 01 Dec 2023 10:52:40 +0000</pubDate>
      <link>https://forem.com/aws-builders/navigating-hybrid-cloud-integrating-vmware-with-aws-services-da4</link>
      <guid>https://forem.com/aws-builders/navigating-hybrid-cloud-integrating-vmware-with-aws-services-da4</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In the ever-evolving landscape of IT, the hybrid cloud has emerged as a pivotal architecture, blending the on-premises reliability of traditional data centers with the scalability and innovation of cloud computing. At the heart of this transformation are two titans of the tech world: VMware, a leader in virtualization solutions, and AWS, the world's most comprehensive and broadly adopted cloud platform. Their integration represents a paradigm shift, offering unprecedented flexibility and efficiency in managing IT resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  VMware Cloud on AWS: A Technical Overview
&lt;/h2&gt;

&lt;p&gt;VMware Cloud on AWS is not merely a bridge between two platforms; it is a sophisticated fusion that extends VMware's Software-Defined Data Center (SDDC) capabilities directly into AWS's cloud infrastructure. This integration allows businesses to migrate and operate their VMware workloads on AWS seamlessly. The synergy offers a reduction in operational overhead, heightened workload agility, and a more streamlined Total Cost of Ownership (TCO) - all while leveraging AWS's robust, scalable infrastructure.&lt;/p&gt;

&lt;h2&gt;
  
  
  Networking and Connectivity Strategies
&lt;/h2&gt;

&lt;p&gt;A critical component of this integration is robust network connectivity. Establishing high-performance, secure links between SDDC VMs and AWS services is crucial. VMware Transit Connect and AWS Transit Gateway play pivotal roles here. Transit Connect uses a VMware-managed AWS Transit Gateway for high-throughput connectivity in multi-VPC environments, ensuring efficient interconnection of SDDCs and attachment of VPCs. Additionally, IPsec VPNs with BGP-based routing offer a reliable method to connect to VPCs through an existing AWS Transit Gateway, utilizing multiple IPsec tunnels for effective traffic load-balancing.&lt;/p&gt;

&lt;h2&gt;
  
  
  Storage Integration and Optimization
&lt;/h2&gt;

&lt;p&gt;The integration with AWS Cloud Storage - Amazon S3, Amazon EFS, and Amazon FSx - is another cornerstone of this alliance. These services provide optimal solutions for VMs requiring file or object storage, significantly reducing TCO by optimizing SDDC sizing. This approach not only streamlines storage architecture but also simplifies the complexities traditionally associated with managing file services on VM disks.&lt;/p&gt;

&lt;h2&gt;
  
  
  Advanced Networking and Data Management
&lt;/h2&gt;

&lt;p&gt;Incorporating AWS's networking and content delivery services like Elastic Load Balancing, Amazon CloudFront, and Amazon Route 53 with VMware Cloud on AWS workloads results in robust traffic management and enhanced security. Furthermore, the integration of AWS's database and analytics services, such as Amazon RDS and Amazon Redshift, facilitates efficient data management and insightful analytics for data-heavy VMware workloads.&lt;/p&gt;

&lt;h2&gt;
  
  
  Addressing Migration and Integration Challenges
&lt;/h2&gt;

&lt;p&gt;Migration to VMware Cloud on AWS necessitates careful cost management and optimization. Pre-migration workload optimization is essential to prevent future cost escalations. Additionally, early network configuration, using tools like VMware HCX for WAN optimization and network stretching, is vital to address networking challenges. Deciding between forklift migration and rebuilding in the cloud also requires a careful assessment of cost, complexity, and operational impact.&lt;/p&gt;

&lt;h2&gt;
  
  
  Best Practices for Successful Integration
&lt;/h2&gt;

&lt;p&gt;Adhering to best practices is crucial for a successful integration. This includes ensuring infrastructure flexibility through SDDC configuration and maintenance, implementing a robust data protection strategy using VMware Site Recovery, and configuring stretched clusters for enhanced resiliency and data replication across AWS Availability Zones.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Integrating VMware with AWS services is more than a technical endeavor; it's a strategic move towards a more agile, efficient, and scalable IT infrastructure. As these technologies continue to evolve, businesses adopting this integration stand to gain significantly in terms of operational flexibility, cost efficiency, and technological advancement.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>vmware</category>
    </item>
    <item>
      <title>Busting Cloud Myths: True Nature of Cloud Design</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Thu, 02 Nov 2023 05:59:35 +0000</pubDate>
      <link>https://forem.com/aws-builders/busting-cloud-myths-true-nature-of-cloud-design-55ca</link>
      <guid>https://forem.com/aws-builders/busting-cloud-myths-true-nature-of-cloud-design-55ca</guid>
      <description>&lt;p&gt;In the rapidly evolving digital landscape, cloud technologies have emerged as a beacon of innovation, promising unparalleled agility, scalability, and cost-efficiency. As businesses across sectors rush to harness the power of the cloud, a swirl of myths and misconceptions has risen, often blurring the line between fact and fiction. This article aims to clear the mist, debunking some of the most prevalent myths surrounding cloud design and shedding light on its true nature.&lt;/p&gt;

&lt;p&gt;While the cloud has become almost synonymous with modern tech solutions, many are still navigating its vast expanse with a mix of awe and misinformation. As with any transformative technology, understanding its core principles and design intricacies is crucial to harnessing its full potential. Let's embark on a journey to dispel these myths and grasp the essence of cloud design.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Evolution of Cloud Design
&lt;/h3&gt;

&lt;p&gt;Tracing back to the late 1990s and early 2000s, the concept of cloud computing was born from the need to access computing resources over the internet. Back then, it was a revolutionary idea to rely on remote servers rather than local machines. Fast forward to today, and the cloud has evolved into a multifaceted ecosystem, offering a myriad of services tailored to diverse needs.&lt;/p&gt;

&lt;p&gt;Central to this evolution is the introduction of various service models. Infrastructure as a Service (IaaS) offers raw computing resources, allowing businesses to rent virtualized hardware over the internet. Platform as a Service (PaaS) takes it a step further, providing an environment where developers can build, deploy, and manage applications without worrying about underlying infrastructure. Finally, Software as a Service (SaaS) delivers software applications over the web, eliminating the need for installations or manual updates.&lt;/p&gt;

&lt;p&gt;These service models, though distinct, share a common design principle: to abstract complexities and offer users a more streamlined, efficient experience. The journey from renting mere computing power to accessing sophisticated platforms and software encapsulates the essence of cloud design's evolution—a testament to technological advancement and human ingenuity.&lt;/p&gt;

&lt;h2&gt;
  
  
  Myth #1: The Universal Blueprint
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; A common misconception is that there exists a universal blueprint for cloud solutions—a one-size-fits-all approach that caters to every business, whether it's a fledgling startup or a multinational conglomerate. The idea often revolves around the notion that if a solution works for one, it should work for all.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq626gzu9rv121lpwbu72.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq626gzu9rv121lpwbu72.png" alt="one-size-fits-all" width="800" height="520"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; The cloud is more like an artist's palette than a pre-drawn sketch. Each business paints its unique cloud journey using tools and colors that align with its vision and requirements. Cloud services offer a plethora of options, from configurations to scalability features. Treating cloud solutions as a uniform entity can lead to misaligned resources and unmet business goals. It's crucial to remember that the cloud is adaptable, not absolute.&lt;/p&gt;

&lt;h2&gt;
  
  
  Myth #2: The Instantaneous Wallet Relief
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; Cloud computing is frequently seen as the golden ticket to immediate cost savings. The narrative suggests that the moment a business migrates to the cloud, financial burdens tied to IT expenses are magically alleviated.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; Transitioning to the cloud is more akin to an investment. The potential for long-term cost reductions is undeniable, but immediate savings are not a guarantee. Proper cloud integration requires planning, optimization, and sometimes upfront costs. Over time, as businesses fine-tune their cloud strategies, the cost benefits become more evident. However, patience and strategic planning are prerequisites to truly unlocking the cloud's financial advantages.&lt;/p&gt;
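&lt;p&gt;A quick back-of-the-envelope model shows why the savings take time to materialize. All figures below are hypothetical:&lt;/p&gt;

```python
def breakeven_month(onprem_monthly, cloud_monthly, migration_cost):
    """First month at which cumulative savings from the cloud cover the
    one-off migration cost. Illustrative only; real TCO has many more terms."""
    saving_per_month = onprem_monthly - cloud_monthly
    if saving_per_month <= 0:
        return None  # cloud never pays back at these rates
    month = 1
    while migration_cost > saving_per_month * month:
        month += 1
    return month

# Hypothetical figures: $10k/mo on-prem, $7k/mo in the cloud,
# and a $24k one-off migration project.
print(breakeven_month(10_000, 7_000, 24_000))  # 8
```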

&lt;h2&gt;
  
  
  Myth #3: The Impeccable Uptime Mirage
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; Another widely held belief is that the cloud, with its advanced technologies and infrastructure, promises impeccable uptime, effectively eliminating downtimes or service interruptions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; While cloud providers invest heavily in infrastructure resilience and high availability, no system — cloud-based or otherwise — can guarantee 100% uptime. Outages, though rare and often brief, can and do occur. It's essential for businesses to understand this and have contingency plans in place. What the cloud offers isn't perfection but a significantly improved and reliable service uptime compared to traditional setups.&lt;/p&gt;
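&lt;p&gt;The arithmetic behind this is worth seeing: when services depend on each other in series, their availabilities multiply, so the composite figure is always lower than any individual SLA. A short Python illustration (the SLA percentages are examples, not any provider's actual guarantees):&lt;/p&gt;

```python
def composite_availability(*slas: float) -> float:
    """Availability of services chained in series: the product of their
    individual availabilities, always below the weakest single SLA."""
    result = 1.0
    for s in slas:
        result *= s
    return result

# Example: a VM (99.9%), a database (99.99%), and a load balancer (99.99%):
combined = composite_availability(0.999, 0.9999, 0.9999)
print(round(combined * 100, 3))  # 99.88
```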

&lt;h2&gt;
  
  
  Myth #4: The Magic Migration Wand
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; Migration to the cloud is often perceived as a simple, instantaneous switch. Many believe that it's as easy as pressing a button or waving a magic wand, and suddenly, all data, applications, and processes are flawlessly operating in the cloud.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; Cloud migration is a strategic journey, not a sprint. It involves meticulous planning, assessment of existing systems, and often a phased approach to move different components. Ensuring data integrity, application functionality, and security during this process requires expertise and time. While cloud providers offer tools to aid in migration, understanding the nuances of these tools and the specific needs of the business is pivotal to a successful transition.&lt;/p&gt;

&lt;h2&gt;
  
  
  Myth #5: Absolute Autonomy Equals Absolute Security
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; With cloud solutions offering greater autonomy over data and processes, there's a myth that this self-governance translates directly to increased security. The belief is that because businesses have more control, they're inherently more secure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; While cloud providers equip businesses with robust security tools and protocols, the responsibility of safeguarding data often lies in shared accountability. Autonomy does provide businesses with the tools to secure their data, but it also requires them to be proactive in their approach to security. Regular audits, updates, and employee training on security best practices are essential components of a holistic cloud security strategy.&lt;/p&gt;

&lt;h2&gt;
  
  
  Myth #6: The Static Cloud Environment
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; Once set up, many believe that a cloud environment remains static, requiring minimal updates or changes, much like a 'set it and forget it' appliance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; The cloud is dynamic and ever-evolving. As businesses grow, their cloud needs can change, requiring adjustments in configurations, scalability options, and services. Additionally, cloud providers continually roll out new features, updates, and optimizations. Engaging with the cloud is an ongoing relationship, demanding regular attention to ensure optimal performance and to leverage the latest offerings.&lt;/p&gt;

&lt;h2&gt;
  
  
  Myth #7: Independence from IT Teams
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; With the rise of user-friendly cloud interfaces and simplified platforms, there’s a circulating belief that once a company transitions to the cloud, there's no longer a need for in-house IT teams. The cloud will handle everything, right?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; While cloud platforms do streamline many processes, the role of the IT team shifts rather than diminishes. The expertise of IT professionals becomes paramount in managing cloud configurations, ensuring security protocols are upheld, and integrating new technologies. Moreover, as businesses become more reliant on cloud services, the role of IT evolves to focus on strategic implementation and optimization. Far from making IT teams obsolete, the cloud reinforces their importance in a different capacity.&lt;/p&gt;

&lt;h2&gt;
  
  
  Myth #8: Cloud Is Just Virtualized Servers
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The Myth:&lt;/strong&gt; Many perceive the cloud as merely a collection of virtualized servers, seeing it as just another iteration of traditional hosting but with a fancier name.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Reality:&lt;/strong&gt; The cloud goes beyond simple virtualization. It encompasses a range of services, from machine learning and artificial intelligence to IoT integration and beyond. While virtualization is a component of cloud computing, the cloud's essence is its ability to provide on-demand, scalable resources, broad service offerings, and flexible pricing models. Reducing the cloud to just virtualized servers misses the vast potential and myriad of solutions it brings to the modern digital landscape.&lt;/p&gt;

</description>
      <category>awscloud</category>
      <category>aws</category>
      <category>cloud</category>
      <category>design</category>
    </item>
    <item>
      <title>A Beginner's Guide to Building Web Applications with AWS Amplify</title>
      <dc:creator>Sedat SALMAN</dc:creator>
      <pubDate>Sat, 01 Jul 2023 07:16:34 +0000</pubDate>
      <link>https://forem.com/aws-builders/a-beginners-guide-to-building-web-applications-with-aws-amplify-5h3i</link>
      <guid>https://forem.com/aws-builders/a-beginners-guide-to-building-web-applications-with-aws-amplify-5h3i</guid>
      <description>&lt;p&gt;AWS Amplify is a set of tools and services that allows developers to build scalable, secure, and flexible web applications. It provides an easy-to-use interface, along with a variety of AWS services like authentication, APIs, storage, and hosting, enabling you to create robust applications quickly. This guide will walk you through the basics of building a web application using AWS Amplify.&lt;/p&gt;

&lt;h3&gt;
  
  
  What is AWS Amplify?
&lt;/h3&gt;

&lt;p&gt;AWS Amplify is a development platform from Amazon Web Services (AWS) that allows developers to build and deploy scalable and secure web applications. It provides a framework to use popular AWS services like AWS Cognito for user authentication, AWS AppSync for APIs, and AWS S3 for storage.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why Use AWS Amplify?
&lt;/h3&gt;

&lt;p&gt;The primary reason for using AWS Amplify is its seamless integration with AWS services. Additionally, Amplify provides a unified workflow for mobile and front-end web developers, reducing the complexity of managing multiple services. Here are a few benefits of using AWS Amplify:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Easy to use: AWS Amplify offers a simple, declarative interface for developers to utilize AWS services without needing to be an expert in cloud infrastructure.&lt;/li&gt;
&lt;li&gt;Scalable: With AWS Amplify, your application can scale easily to accommodate a growing user base.&lt;/li&gt;
&lt;li&gt;Secure: AWS Amplify provides built-in security features to protect your application, including user authentication and authorization.&lt;/li&gt;
&lt;li&gt;Fast: Amplify allows for quick prototyping and deployment of applications.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Getting Started with AWS Amplify
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Setting Up&lt;/strong&gt;&lt;br&gt;
First, you need to set up your AWS account. If you do not have one, you can create it at &lt;a href="https://aws.amazon.com/" rel="noopener noreferrer"&gt;https://aws.amazon.com/&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;After setting up the account, install the AWS Amplify CLI (Command Line Interface) on your local system. You can do this by running the following command in your terminal:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

npm install -g @aws-amplify/cli


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then, configure the AWS Amplify CLI with your AWS account using:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

amplify configure


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This command will guide you through the process of setting up your AWS account with the Amplify CLI.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Creating a New Project&lt;/strong&gt;&lt;br&gt;
After setting up the Amplify CLI, you can create a new Amplify project using:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

amplify init


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This command will prompt you to answer several configuration questions about your new project such as the name, environment, and your preferred text editor. It will also ask for the AWS profile to use, which you set up in the previous step.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Adding Features&lt;/strong&gt;&lt;br&gt;
With AWS Amplify, you can add features like authentication, APIs, and storage to your app using simple commands.&lt;/p&gt;

&lt;p&gt;To add authentication, you can use:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

amplify add auth


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;To add an API, you can use:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

amplify add api


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;And to add storage, you can use:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

amplify add storage


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Each of these commands will guide you through the process of setting up these features.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Deploying Your Application&lt;/strong&gt;&lt;br&gt;
After adding all the desired features, you can deploy your application to the cloud using:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

amplify push


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This command will create all the resources in the cloud that were configured during the previous steps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Updating Your Application&lt;/strong&gt;&lt;br&gt;
If you want to update your application in the future, you can do so by running the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

amplify update


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You will be prompted to select the service you want to update. After choosing the service, follow the prompts to update its configuration, then run amplify push again to apply the changes to your cloud resources.&lt;/p&gt;

&lt;h3&gt;
  
  
  Advanced AWS Amplify Concepts
&lt;/h3&gt;

&lt;p&gt;While the previous sections covered the basics of getting started with AWS Amplify, there are several advanced concepts that can prove useful as you progress in your web development journey.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Customizing Authentication&lt;/strong&gt;&lt;br&gt;
While AWS Amplify makes it simple to add authentication to your app with a single command, you also have the flexibility to customize the authentication flow. This can be particularly useful when you need to add additional security measures, or if you want to provide a unique user experience. You can customize the authentication UI, add multi-factor authentication (MFA), and even integrate with third-party authentication providers.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Managing Data with GraphQL&lt;/strong&gt;&lt;br&gt;
AWS Amplify makes it easy to manage your app's data using GraphQL, a query language for APIs. With the Amplify CLI, you can automatically generate a GraphQL API, complete with schema and resolvers, by running amplify add api and selecting GraphQL. Amplify's GraphQL Transform library provides directives that you can use in your schema to quickly set up common patterns like search and pagination.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hosting with AWS Amplify Console&lt;/strong&gt;&lt;br&gt;
Once your web application is ready, you can use the AWS Amplify Console to host it. The Amplify Console provides a git-based workflow for hosting fullstack serverless web applications with continuous deployment. Simply connect your application's repository, configure build settings, and Amplify Console will deploy updates to your app on every code commit.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Real-Time Updates with Subscriptions&lt;/strong&gt;&lt;br&gt;
AWS Amplify supports GraphQL subscriptions, which allows you to easily set up real-time updates in your app. This can be useful for features like live chat, real-time notifications, and more. To set up subscriptions, you simply define them in your GraphQL schema, and then use the Amplify libraries in your app to subscribe to the events.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Multi-Environment Workflow&lt;/strong&gt;&lt;br&gt;
When working on a larger application, it's often useful to have multiple environments (like development, staging, and production). AWS Amplify supports this with a simple multi-environment workflow. You can create and switch between different environments using the Amplify CLI, and Amplify will manage the backend resources separately for each environment.&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;AWS Amplify is a powerful tool for building web applications, providing an array of services and features that simplify the development process. With its seamless integration with AWS services, easy-to-use CLI, and a host of advanced features, AWS Amplify can speed up your web development process while ensuring your applications are secure, scalable, and feature-rich. Whether you're a beginner just starting out or an experienced developer, AWS Amplify is a tool worth exploring for your next web development project.&lt;/p&gt;

&lt;h3&gt;
  
  
  Bonus: Amplify Studio
&lt;/h3&gt;

&lt;p&gt;Amplify Studio provides a visual designer to create UI components that you can connect to your backend data, further simplifying the web development process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/amplify/studio/" rel="noopener noreferrer"&gt;https://aws.amazon.com/amplify/studio/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4sf89idecmk8axyqou06.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4sf89idecmk8axyqou06.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwt28epydnykse6ukqvj9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwt28epydnykse6ukqvj9.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>amplify</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
