<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Sergio D. Rodríguez Inclán</title>
    <description>The latest articles on Forem by Sergio D. Rodríguez Inclán (@w4ls3n).</description>
    <link>https://forem.com/w4ls3n</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3190411%2Fbfde5f1a-49c7-406f-9518-d29a47bbe933.jpeg</url>
      <title>Forem: Sergio D. Rodríguez Inclán</title>
      <link>https://forem.com/w4ls3n</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/w4ls3n"/>
    <language>en</language>
    <item>
      <title>DevSecOps Fundamentals Project</title>
      <dc:creator>Sergio D. Rodríguez Inclán</dc:creator>
      <pubDate>Fri, 20 Feb 2026 14:42:43 +0000</pubDate>
      <link>https://forem.com/w4ls3n/devsecops-fundamentals-project-19jd</link>
      <guid>https://forem.com/w4ls3n/devsecops-fundamentals-project-19jd</guid>
      <description>&lt;h2&gt;Introduction&lt;/h2&gt;

&lt;p&gt;Every engineer has that one project — the one that starts as a simple idea and evolves into a full-blown learning laboratory. For me, that project is this &lt;strong&gt;Publisher Web App&lt;/strong&gt;: a system designed to publish announcements across multiple social media channels simultaneously. The primary use case? The &lt;strong&gt;AWS Certification Announcer&lt;/strong&gt;, a community tool where members submit their AWS certification achievements and the platform automatically publishes them to Facebook, Instagram, WhatsApp, LinkedIn, Email, etc.&lt;/p&gt;

&lt;p&gt;What began as a practical need for the AWS User Group quickly became an exercise in applying every DevSecOps principle I had long wanted to put into practice. This article walks through the fundamentals of the project — the architecture, the security posture, the deployment strategy, and the lessons learned along the way.&lt;/p&gt;

&lt;p&gt;The entire project is open source and available at &lt;a href="https://github.com/Walsen/devsecops-poc" rel="noopener noreferrer"&gt;github.com/Walsen/devsecops-poc&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;The Vision: More Than Just Posting&lt;/h2&gt;

&lt;p&gt;The goal was never to build "yet another social media scheduler." The vision was to create a platform that could serve as a reference implementation for modern cloud-native development — one that demonstrates how to build software that is secure by design, portable across deployment models, and maintainable over time.&lt;/p&gt;

&lt;p&gt;The platform needed to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Publish to several channels&lt;/strong&gt;: Facebook, Instagram, WhatsApp, LinkedIn, Email, and more&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Support social authentication&lt;/strong&gt;: Google, GitHub, LinkedIn, or email/password via Amazon Cognito&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Schedule messages&lt;/strong&gt;: Queue announcements for future delivery&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enforce role-based access&lt;/strong&gt;: Admin and Community Manager roles with strict boundaries&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Be secure from day one&lt;/strong&gt;: Zero Trust networking, encrypted everything, WAF protection, and a hardened supply chain&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Run in two deployment modes&lt;/strong&gt;: Containers (ECS Fargate) or Serverless (Lambda + DynamoDB), switchable at deploy time&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Hexagonal Architecture: The Foundation&lt;/h2&gt;

&lt;p&gt;If there is one architectural decision that made everything else possible, it is the adoption of &lt;strong&gt;hexagonal architecture&lt;/strong&gt; (also known as Ports &amp;amp; Adapters).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ncal7xicjz5bvnpw59v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8ncal7xicjz5bvnpw59v.png" alt="Hexagonal Architecture" width="800" height="583"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This pattern separates the system into three concentric zones:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Zone&lt;/th&gt;
&lt;th&gt;Responsibility&lt;/th&gt;
&lt;th&gt;Dependencies&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Driving Adapters&lt;/td&gt;
&lt;td&gt;Receive external input (HTTP, Kinesis events, cron)&lt;/td&gt;
&lt;td&gt;Depend on Inbound Ports&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Application Core&lt;/td&gt;
&lt;td&gt;Business logic, use cases, domain model&lt;/td&gt;
&lt;td&gt;Zero external dependencies&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Driven Adapters&lt;/td&gt;
&lt;td&gt;Implement outbound integrations (DB, queues, APIs)&lt;/td&gt;
&lt;td&gt;Implement Outbound Ports&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The application core — domain entities, use cases, and port interfaces — has zero knowledge of the outside world. It doesn't know whether it's running on ECS or Lambda, or whether the database is PostgreSQL or DynamoDB. It only knows about abstractions.&lt;/p&gt;

&lt;h3&gt;The Domain Model&lt;/h3&gt;

&lt;p&gt;The domain is built with pure Python dataclasses:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Entity / Value Object&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;code&gt;Message&lt;/code&gt; (Aggregate Root)&lt;/td&gt;
&lt;td&gt;Tracks content, channels, schedule, status, and per-channel deliveries&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;code&gt;Certification&lt;/code&gt; (Entity)&lt;/td&gt;
&lt;td&gt;AWS certification achievement with member info and type&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;code&gt;ChannelType&lt;/code&gt; (Value Object)&lt;/td&gt;
&lt;td&gt;Enum: &lt;code&gt;facebook&lt;/code&gt;, &lt;code&gt;instagram&lt;/code&gt;, &lt;code&gt;linkedin&lt;/code&gt;, &lt;code&gt;whatsapp&lt;/code&gt;, &lt;code&gt;email&lt;/code&gt;, &lt;code&gt;sms&lt;/code&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;code&gt;MessageStatus&lt;/code&gt; (Value Object)&lt;/td&gt;
&lt;td&gt;Lifecycle: &lt;code&gt;DRAFT&lt;/code&gt; → &lt;code&gt;SCHEDULED&lt;/code&gt; → &lt;code&gt;PROCESSING&lt;/code&gt; → &lt;code&gt;DELIVERED&lt;/code&gt; / &lt;code&gt;FAILED&lt;/code&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;code&gt;DeliveryResult&lt;/code&gt; (Value Object)&lt;/td&gt;
&lt;td&gt;Per-channel outcome with external ID or error&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;Ports and Adapters in Practice&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;MessageRepository&lt;/code&gt; port is the same interface regardless of deployment mode. Only the adapter changes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Port (shared across both modes)
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;MessageRepository&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ABC&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nd"&gt;@abstractmethod&lt;/span&gt;
    &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_by_id&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;UUID&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;Message&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="bp"&gt;...&lt;/span&gt;

    &lt;span class="nd"&gt;@abstractmethod&lt;/span&gt;
    &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;save&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="bp"&gt;...&lt;/span&gt;

&lt;span class="c1"&gt;# Container adapter
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;PostgresMessageRepository&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;MessageRepository&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;session&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;AsyncSession&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_session&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;session&lt;/span&gt;

&lt;span class="c1"&gt;# Serverless adapter
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;DynamoMessageRepository&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;MessageRepository&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;table_name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_table&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;resource&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;dynamodb&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nc"&gt;Table&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;table_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is the key enabler for the dual-mode deployment strategy. Business logic is completely decoupled from infrastructure.&lt;/p&gt;

&lt;h2&gt;Three Services, One Stream&lt;/h2&gt;

&lt;p&gt;The platform is composed of three microservices connected by Amazon Kinesis Data Streams:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBBUElbIkFQSSBTZXJ2aWNlIl0KICAgICAgICBGQVNUQVBJWyJGYXN0QVBJPGJyLz5SRVNUICsgQXV0aCJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBRdWV1ZVsiRXZlbnQgU3RyZWFtIl0KICAgICAgICBLSU5FU0lTWyJLaW5lc2lzPGJyLz5EYXRhIFN0cmVhbXMiXQogICAgZW5kCgogICAgc3ViZ3JhcGggV29ya2VyWyJXb3JrZXIgU2VydmljZSJdCiAgICAgICAgQ09OU1VNRVJbIktpbmVzaXMgQ29uc3VtZXI8YnIvPkNoYW5uZWwgRGVsaXZlcnkiXQogICAgZW5kCgogICAgc3ViZ3JhcGggU2NoZWR1bGVyWyJTY2hlZHVsZXIgU2VydmljZSJdCiAgICAgICAgQ1JPTlsiQVBTY2hlZHVsZXI8YnIvPkR1ZSBNZXNzYWdlIFNjYW5uZXIiXQogICAgZW5kCgogICAgc3ViZ3JhcGggREJbIkRhdGEgTGF5ZXIiXQogICAgICAgIFJEU1soIlBvc3RncmVTUUw8YnIvPm9yIER5bmFtb0RCIildCiAgICBlbmQKCiAgICBGQVNUQVBJIC0tPnwiUHVibGlzaCBFdmVudCJ8IEtJTkVTSVMKICAgIEtJTkVTSVMgLS0%2BfCJDb25zdW1lInwgQ09OU1VNRVIKICAgIENST04gLS0%2BfCJQb2xsIGR1ZSBtZXNzYWdlcyJ8IFJEUwogICAgQ1JPTiAtLT58IlB1Ymxpc2ggRXZlbnQifCBLSU5FU0lTCiAgICBGQVNUQVBJIC0tPiBSRFMKCiAgICBzdHlsZSBRdWV1ZSBmaWxsOiNmZmYzZTAKICAgIHN0eWxlIERCIGZpbGw6I2ZjZTRlYw%3D%3D" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBBUElbIkFQSSBTZXJ2aWNlIl0KICAgICAgICBGQVNUQVBJWyJGYXN0QVBJPGJyLz5SRVNUICsgQXV0aCJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBRdWV1ZVsiRXZlbnQgU3RyZWFtIl0KICAgICAgICBLSU5FU0lTWyJLaW5lc2lzPGJyLz5EYXRhIFN0cmVhbXMiXQogICAgZW5kCgogICAgc3ViZ3JhcGggV29ya2VyWyJXb3JrZXIgU2VydmljZSJdCiAgICAgICAgQ09OU1VNRVJbIktpbmVzaXMgQ29uc3VtZXI8YnIvPkNoYW5uZWwgRGVsaXZlcnkiXQogICAgZW5kCgogICAgc3ViZ3JhcGggU2NoZWR1bGVyWyJTY2hlZHVsZXIgU2VydmljZSJdCiAgICAgICAgQ1JPTlsiQVBTY2hlZHVsZXI8YnIvPkR1ZSBNZXNzYWdlIFNjYW5uZXIiXQogICAgZW5kCgogICAgc3ViZ3JhcGggREJbIkRhdGEgTGF5ZXIiXQogICAgICAgIFJEU1soIlBvc3RncmVTUUw8YnIvPm9yIER5bmFtb0RCIildCiAgICBlbmQKCiAgICBGQVNUQVBJIC0tPnwiUHVibGlzaCBFdmVudCJ8IEtJTkVTSVMKICAgIEtJTkVTSVMgLS0%2BfCJDb25zdW1lInwgQ09OU1VNRVIKICAgIENST04gLS0%2BfCJQb2xsIGR1ZSBtZXNzYWdlcyJ8IFJEUwogICAgQ1JPTiAtLT58IlB1Ymxpc2ggRXZlbnQifCBLSU5FU0lTCiAgICBGQVNUQVBJIC0tPiBSRFMKCiAgICBzdHlsZSBRdWV1ZSBmaWxsOiNmZmYzZTAKICAgIHN0eWxlIERCIGZpbGw6I2ZjZTRlYw%3D%3D" alt="Mermaid Diagram" width="1040" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;API Service&lt;/strong&gt;: Handles HTTP requests, authentication, and message scheduling. Implements the full hexagonal stack with FastAPI routes, middleware, and Cognito JWT validation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Worker Service&lt;/strong&gt;: Consumes messages from Kinesis and delivers them to the appropriate channels. Supports two publishing strategies — direct API calls or AI-powered content adaptation via Amazon Bedrock.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scheduler Service&lt;/strong&gt;: Polls the database for scheduled messages and publishes them to Kinesis when their delivery time arrives.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each service has its own &lt;code&gt;main.py&lt;/code&gt; that serves as the composition root — the only place where concrete implementations are imported and wired together. No other module imports from &lt;code&gt;infrastructure/&lt;/code&gt; directly.&lt;/p&gt;
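&lt;p&gt;A minimal sketch of what such a composition root might look like. The class names and the in-memory repository below are hypothetical stand-ins, not the project's actual code:&lt;/p&gt;

```python
# Hypothetical composition root in the spirit of each service's main.py:
# the only place where a concrete adapter is chosen and wired to a use case.
import os


class InMemoryMessageRepository:
    """Stand-in adapter so the wiring is runnable without a database."""
    def __init__(self):
        self._store = {}

    def save(self, message_id, message):
        self._store[message_id] = message

    def get_by_id(self, message_id):
        return self._store.get(message_id)


class ScheduleMessageUseCase:
    """Application-core use case: depends only on the repository port."""
    def __init__(self, repository):
        self._repository = repository

    def execute(self, message_id, message):
        self._repository.save(message_id, message)
        return message_id


def build_use_case(mode=None):
    # In the real services this would select PostgresMessageRepository or
    # DynamoMessageRepository based on the deployment mode.
    mode = mode or os.getenv("INFRA_TYPE", "containers")
    repository = InMemoryMessageRepository()
    return ScheduleMessageUseCase(repository)
```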

&lt;h2&gt;Dual-Mode Deployment: Containers or Serverless&lt;/h2&gt;

&lt;p&gt;This is where hexagonal architecture pays off in a very tangible way. The platform supports two fully independent deployment modes, switchable at deploy time via a single CI/CD parameter:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;Containers (&lt;code&gt;infra/&lt;/code&gt;)&lt;/th&gt;
&lt;th&gt;Serverless (&lt;code&gt;infra-fs/&lt;/code&gt;)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Compute&lt;/td&gt;
&lt;td&gt;ECS Fargate&lt;/td&gt;
&lt;td&gt;Lambda&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Database&lt;/td&gt;
&lt;td&gt;PostgreSQL (RDS)&lt;/td&gt;
&lt;td&gt;DynamoDB (Single-Table)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;API Gateway&lt;/td&gt;
&lt;td&gt;ALB + CloudFront&lt;/td&gt;
&lt;td&gt;API Gateway + CloudFront&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Scheduler&lt;/td&gt;
&lt;td&gt;ECS Service (APScheduler)&lt;/td&gt;
&lt;td&gt;EventBridge + Lambda&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cost (low traffic)&lt;/td&gt;
&lt;td&gt;~$180-200/mo&lt;/td&gt;
&lt;td&gt;~$5-15/mo&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Both modes share the same domain and application layers. Only the infrastructure adapters change. The deploy workflow selects the mode via an &lt;code&gt;infra_type&lt;/code&gt; input (&lt;code&gt;containers&lt;/code&gt; or &lt;code&gt;serverless&lt;/code&gt;), routing to the corresponding CDK project. Stack names are fully independent, so both can coexist in the same AWS account during migration.&lt;/p&gt;
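&lt;p&gt;The routing step can be sketched as a simple lookup. This is an illustration of the idea, not the project's workflow code; only the two directory names come from the article:&lt;/p&gt;

```python
# Hypothetical routing from the infra_type input to a CDK project directory.
CDK_PROJECTS = {
    "containers": "infra/",     # ECS Fargate + RDS PostgreSQL
    "serverless": "infra-fs/",  # Lambda + DynamoDB
}


def select_cdk_project(infra_type: str) -> str:
    # Fail loudly on anything other than the two supported modes.
    if infra_type not in CDK_PROJECTS:
        raise ValueError(f"unknown infra_type: {infra_type!r}")
    return CDK_PROJECTS[infra_type]
```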

&lt;h3&gt;Why This Matters&lt;/h3&gt;

&lt;p&gt;Different stages of a project have different needs:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Stage&lt;/th&gt;
&lt;th&gt;Recommended Mode&lt;/th&gt;
&lt;th&gt;Reason&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Development / Prototyping&lt;/td&gt;
&lt;td&gt;Serverless&lt;/td&gt;
&lt;td&gt;Near-zero cost, instant deploys&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Staging / QA&lt;/td&gt;
&lt;td&gt;Either&lt;/td&gt;
&lt;td&gt;Match production or save costs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Production (steady traffic)&lt;/td&gt;
&lt;td&gt;Containers&lt;/td&gt;
&lt;td&gt;Predictable latency, no cold starts&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Production (variable traffic)&lt;/td&gt;
&lt;td&gt;Serverless&lt;/td&gt;
&lt;td&gt;Pay-per-use, auto-scaling&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;You can start cheap with serverless and migrate to containers when traffic justifies the fixed cost — or vice versa — without rewriting a single line of business logic.&lt;/p&gt;

&lt;h3&gt;Running Both Simultaneously&lt;/h3&gt;

&lt;p&gt;Since stack names are independent, you can run both modes in parallel during a migration window. This allows smoke testing against the new mode before cutting over, gradual traffic shifting using weighted DNS routing (Route 53), and instant rollback by switching DNS back.&lt;/p&gt;

&lt;h2&gt;Zero Trust Security: Trust Nothing, Verify Everything&lt;/h2&gt;

&lt;p&gt;Security is not a feature you bolt on at the end. It is a design principle that permeates every layer of the system. The platform implements a Zero Trust architecture based on three pillars:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBaVFsiWmVybyBUcnVzdCBQcmluY2lwbGVzIl0KICAgICAgICBWRVJJRllbIlZlcmlmeSBFeHBsaWNpdGx5PGJyLz5BdXRob3JpemUgYmFzZWQgb24gY29udGV4dDxici8%2BTXVsdGktZmFjdG9yIGF1dGhlbnRpY2F0aW9uIl0KICAgICAgICBMRUFTVFsiTGVhc3QgUHJpdmlsZWdlPGJyLz5KdXN0LWluLXRpbWUgYWNjZXNzPGJyLz5Sb2xlLWJhc2VkIHBlcm1pc3Npb25zIl0KICAgICAgICBCUkVBQ0hbIkFzc3VtZSBCcmVhY2g8YnIvPk1pbmltaXplIGJsYXN0IHJhZGl1czxici8%2BU2VnbWVudCBhY2Nlc3M8YnIvPkVuZC10by1lbmQgZW5jcnlwdGlvbiJdCiAgICBlbmQ%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBaVFsiWmVybyBUcnVzdCBQcmluY2lwbGVzIl0KICAgICAgICBWRVJJRllbIlZlcmlmeSBFeHBsaWNpdGx5PGJyLz5BdXRob3JpemUgYmFzZWQgb24gY29udGV4dDxici8%2BTXVsdGktZmFjdG9yIGF1dGhlbnRpY2F0aW9uIl0KICAgICAgICBMRUFTVFsiTGVhc3QgUHJpdmlsZWdlPGJyLz5KdXN0LWluLXRpbWUgYWNjZXNzPGJyLz5Sb2xlLWJhc2VkIHBlcm1pc3Npb25zIl0KICAgICAgICBCUkVBQ0hbIkFzc3VtZSBCcmVhY2g8YnIvPk1pbmltaXplIGJsYXN0IHJhZGl1czxici8%2BU2VnbWVudCBhY2Nlc3M8YnIvPkVuZC10by1lbmQgZW5jcnlwdGlvbiJdCiAgICBlbmQ%3D" alt="Mermaid Diagram" width="345" height="516"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;Network Security&lt;/h3&gt;

&lt;p&gt;All traffic flows through multiple security layers:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBVU0VSWygi8J%2BRpCBVc2VyIildIC0tPnwiVExTIDEuMyJ8IENGWyJDbG91ZEZyb250PGJyLz4rIFNoaWVsZCJdCiAgICBDRiAtLT4gV0FGWyJXQUY8YnIvPk9XQVNQIFJ1bGVzIl0KICAgIFdBRiAtLT58IlRMUyAxLjMifCBBTEJbIkFMQiJdCiAgICBBTEIgLS0%2BfCJUTFMgMS4zInwgRUNTWyJFQ1MgRmFyZ2F0ZSJdCiAgICBFQ1MgLS0%2BfCJUTFMifCBSRFNbKCJSRFM8YnIvPlBvc3RncmVTUUwiKV0KCiAgICBzdHlsZSBDRiBmaWxsOiNlOGY1ZTkKICAgIHN0eWxlIFdBRiBmaWxsOiNlOGY1ZTk%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBVU0VSWygi8J%2BRpCBVc2VyIildIC0tPnwiVExTIDEuMyJ8IENGWyJDbG91ZEZyb250PGJyLz4rIFNoaWVsZCJdCiAgICBDRiAtLT4gV0FGWyJXQUY8YnIvPk9XQVNQIFJ1bGVzIl0KICAgIFdBRiAtLT58IlRMUyAxLjMifCBBTEJbIkFMQiJdCiAgICBBTEIgLS0%2BfCJUTFMgMS4zInwgRUNTWyJFQ1MgRmFyZ2F0ZSJdCiAgICBFQ1MgLS0%2BfCJUTFMifCBSRFNbKCJSRFM8YnIvPlBvc3RncmVTUUwiKV0KCiAgICBzdHlsZSBDRiBmaWxsOiNlOGY1ZTkKICAgIHN0eWxlIFdBRiBmaWxsOiNlOGY1ZTk%3D" alt="Mermaid Diagram" width="1185" height="113"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The VPC is segmented into public, private, and isolated subnets. Security groups enforce micro-segmentation — the ALB can only talk to the API, the API can only talk to the Worker and the database, and the database accepts connections only from authorized services.&lt;/p&gt;

&lt;h3&gt;Authentication Flow&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBQcm92aWRlcnNbIklkZW50aXR5IFByb3ZpZGVycyJdCiAgICAgICAgQ09HTklUT1siQ29nbml0bzxici8%2BRW1haWwvUGFzc3dvcmQiXQogICAgICAgIEdPT0dMRVsiR29vZ2xlPGJyLz5PQXV0aCAyLjAiXQogICAgICAgIEdJVEhVQlsiR2l0SHViPGJyLz5PSURDIl0KICAgICAgICBMSU5LRURJTlsiTGlua2VkSW48YnIvPk9JREMiXQogICAgZW5kCgogICAgc3ViZ3JhcGggUG9vbFsiQ29nbml0byBVc2VyIFBvb2wiXQogICAgICAgIEZFREVSQVRFWyJGZWRlcmF0aW9uIExheWVyIl0KICAgICAgICBVU0VSU1siVXNlciBEaXJlY3RvcnkiXQogICAgICAgIEdST1VQU1siR3JvdXBzPGJyLz5hZG1pbiwgY29tbXVuaXR5LW1hbmFnZXIiXQogICAgZW5kCgogICAgc3ViZ3JhcGggVG9rZW5zWyJUb2tlbiBJc3N1YW5jZSJdCiAgICAgICAgSldUWyJKV1QgVG9rZW5zPGJyLz7igKIgQWNjZXNzIFRva2VuICgxaCk8YnIvPuKAoiBJRCBUb2tlbiAoMWgpPGJyLz7igKIgUmVmcmVzaCBUb2tlbiAoMzBkKSJdCiAgICBlbmQKCiAgICBDT0dOSVRPIC0tPiBGRURFUkFURQogICAgR09PR0xFIC0tPiBGRURFUkFURQogICAgR0lUSFVCIC0tPiBGRURFUkFURQogICAgTElOS0VESU4gLS0%2BIEZFREVSQVRFCiAgICBGRURFUkFURSAtLT4gVVNFUlMKICAgIFVTRVJTIC0tPiBHUk9VUFMKICAgIEdST1VQUyAtLT4gSldU" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBQcm92aWRlcnNbIklkZW50aXR5IFByb3ZpZGVycyJdCiAgICAgICAgQ09HTklUT1siQ29nbml0bzxici8%2BRW1haWwvUGFzc3dvcmQiXQogICAgICAgIEdPT0dMRVsiR29vZ2xlPGJyLz5PQXV0aCAyLjAiXQogICAgICAgIEdJVEhVQlsiR2l0SHViPGJyLz5PSURDIl0KICAgICAgICBMSU5LRURJTlsiTGlua2VkSW48YnIvPk9JREMiXQogICAgZW5kCgogICAgc3ViZ3JhcGggUG9vbFsiQ29nbml0byBVc2VyIFBvb2wiXQogICAgICAgIEZFREVSQVRFWyJGZWRlcmF0aW9uIExheWVyIl0KICAgICAgICBVU0VSU1siVXNlciBEaXJlY3RvcnkiXQogICAgICAgIEdST1VQU1siR3JvdXBzPGJyLz5hZG1pbiwgY29tbXVuaXR5LW1hbmFnZXIiXQogICAgZW5kCgogICAgc3ViZ3JhcGggVG9rZW5zWyJUb2tlbiBJc3N1YW5jZSJdCiAgICAgICAgSldUWyJKV1QgVG9rZW5zPGJyLz7igKIgQWNjZXNzIFRva2VuICgxaCk8YnIvPuKAoiBJRCBUb2tlbiAoMWgpPGJyLz7igKIgUmVmcmVzaCBUb2tlbiAoMzBkKSJdCiAgICBlbmQKCiAgICBDT0dOSVRPIC0tPiBGRURFUkFURQogICAgR09PR0xFIC0tPiBGRURFUkFURQogICAgR0lUSFVCIC0tPiBGRURFUkFURQogICAgTElOS0VESU4gLS0%2BIEZFREVSQVRFCiAgICBGRURFUkFURSAtLT4gVVNFUlMKICAgIFVTRVJTIC0tPiBHUk9VUFMKICAgIEdST1VQUyAtLT4gSldU" alt="Mermaid Diagram" width="773" height="756"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Every request is validated against Cognito JWTs with strict algorithm restriction (RS256 only), audience and issuer validation, and JWKS caching with TTL refresh. OAuth credentials for social providers are stored in AWS Secrets Manager.&lt;/p&gt;
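&lt;p&gt;The algorithm restriction can be illustrated with a stdlib-only check on the unverified JWT header. This is a sketch; the service itself would rely on a JWT library plus JWKS caching and audience/issuer validation:&lt;/p&gt;

```python
# Sketch of the "RS256 only" gate: inspect the unverified JWT header and
# reject any other algorithm before signature verification is attempted.
import base64
import json


def assert_rs256(token: str) -> dict:
    header_b64 = token.split(".")[0]
    # JWT segments are base64url without padding; restore it before decoding.
    padded = header_b64 + "=" * (-len(header_b64) % 4)
    header = json.loads(base64.urlsafe_b64decode(padded))
    if header.get("alg") != "RS256":
        raise ValueError(f"rejected algorithm: {header.get('alg')}")
    return header
```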

&lt;h3&gt;API Security Middleware&lt;/h3&gt;

&lt;p&gt;The API service implements a comprehensive security middleware stack:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Concern&lt;/th&gt;
&lt;th&gt;Implementation&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Authentication&lt;/td&gt;
&lt;td&gt;Cognito JWT validation with algorithm restriction&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;CSRF Protection&lt;/td&gt;
&lt;td&gt;Double Submit Cookie with HMAC-signed tokens&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rate Limiting&lt;/td&gt;
&lt;td&gt;Per-user sliding window (60 req/min)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Request Validation&lt;/td&gt;
&lt;td&gt;1MB size limit, input sanitization, HTML escaping&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Security Headers&lt;/td&gt;
&lt;td&gt;CSP, HSTS, X-Frame-Options, Permissions-Policy&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Content Filtering&lt;/td&gt;
&lt;td&gt;Prompt injection detection + PII scanning&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Idempotency&lt;/td&gt;
&lt;td&gt;Hash-based dedup on message_id + channels&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;Data Protection&lt;/h3&gt;

&lt;p&gt;Encryption is applied at every layer:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;In Transit&lt;/strong&gt;: TLS 1.3 everywhere — CloudFront to ALB, ALB to ECS, ECS to RDS, ECS to Kinesis&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;At Rest&lt;/strong&gt;: KMS encryption for S3, RDS, and Kinesis&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;In Use&lt;/strong&gt;: No PII in logs, field-level encryption for sensitive data&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Secure Supply Chain&lt;/h2&gt;

&lt;p&gt;The build pipeline is hardened against supply chain attacks with multiple security gates:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBTb3VyY2VbIlNvdXJjZSJdCiAgICAgICAgR0lUWyJHaXQ8YnIvPlNpZ25lZCBDb21taXRzIl0KICAgICAgICBQUlsiUFIgUmV2aWV3PGJyLz5SZXF1aXJlZCJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBTY2FuWyJTZWN1cml0eSBTY2FubmluZyJdCiAgICAgICAgREVQWyJEZXBlbmRlbmN5PGJyLz5BdWRpdCJdCiAgICAgICAgU0FTVFsiU3RhdGljPGJyLz5BbmFseXNpcyJdCiAgICAgICAgU0VDUkVUWyJTZWNyZXQ8YnIvPlNjYW5uaW5nIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIEJ1aWxkWyJDb250YWluZXIgQnVpbGQiXQogICAgICAgIEJBU0VbIkRpc3Ryb2xlc3M8YnIvPkJhc2UgSW1hZ2UiXQogICAgICAgIE5PTlJPT1RbIk5vbi1yb290PGJyLz5Vc2VyIl0KICAgICAgICBSRUFET05MWVsiUmVhZC1vbmx5PGJyLz5GaWxlc3lzdGVtIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIFNpZ25bIlNpZ25pbmciXQogICAgICAgIFNJR05bIkltYWdlPGJyLz5TaWduaW5nIl0KICAgICAgICBTQk9NWyJTQk9NPGJyLz5HZW5lcmF0aW9uIl0KICAgIGVuZAoKICAgIFNvdXJjZSAtLT4gU2NhbiAtLT4gQnVpbGQgLS0%2BIFNpZ24%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBTb3VyY2VbIlNvdXJjZSJdCiAgICAgICAgR0lUWyJHaXQ8YnIvPlNpZ25lZCBDb21taXRzIl0KICAgICAgICBQUlsiUFIgUmV2aWV3PGJyLz5SZXF1aXJlZCJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBTY2FuWyJTZWN1cml0eSBTY2FubmluZyJdCiAgICAgICAgREVQWyJEZXBlbmRlbmN5PGJyLz5BdWRpdCJdCiAgICAgICAgU0FTVFsiU3RhdGljPGJyLz5BbmFseXNpcyJdCiAgICAgICAgU0VDUkVUWyJTZWNyZXQ8YnIvPlNjYW5uaW5nIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIEJ1aWxkWyJDb250YWluZXIgQnVpbGQiXQogICAgICAgIEJBU0VbIkRpc3Ryb2xlc3M8YnIvPkJhc2UgSW1hZ2UiXQogICAgICAgIE5PTlJPT1RbIk5vbi1yb290PGJyLz5Vc2VyIl0KICAgICAgICBSRUFET05MWVsiUmVhZC1vbmx5PGJyLz5GaWxlc3lzdGVtIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIFNpZ25bIlNpZ25pbmciXQogICAgICAgIFNJR05bIkltYWdlPGJyLz5TaWduaW5nIl0KICAgICAgICBTQk9NWyJTQk9NPGJyLz5HZW5lcmF0aW9uIl0KICAgIGVuZAoKICAgIFNvdXJjZSAtLT4gU2NhbiAtLT4gQnVpbGQgLS0%2BIFNpZ24%3D" alt="Mermaid Diagram" width="1904" 
height="152"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;Defense in Depth: Multiple Scanners&lt;/h3&gt;

&lt;p&gt;No single scanner catches everything. The project uses six complementary tools:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Scanner&lt;/th&gt;
&lt;th&gt;Type&lt;/th&gt;
&lt;th&gt;Purpose&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Semgrep&lt;/td&gt;
&lt;td&gt;SAST&lt;/td&gt;
&lt;td&gt;OWASP Top 10 patterns, custom rules&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Bandit&lt;/td&gt;
&lt;td&gt;SAST&lt;/td&gt;
&lt;td&gt;Python-specific security issues&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;pip-audit&lt;/td&gt;
&lt;td&gt;SCA&lt;/td&gt;
&lt;td&gt;Python CVE database (PyPI Advisory)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Trivy&lt;/td&gt;
&lt;td&gt;SCA&lt;/td&gt;
&lt;td&gt;SBOM vulnerability scanning&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Gitleaks&lt;/td&gt;
&lt;td&gt;Secrets&lt;/td&gt;
&lt;td&gt;Hardcoded secrets detection in git history&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Checkov&lt;/td&gt;
&lt;td&gt;IaC&lt;/td&gt;
&lt;td&gt;AWS security misconfigurations in CDK/CloudFormation&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Each tool covers blind spots the others miss. Semgrep catches SQL injection and XSS patterns. Bandit finds Python-specific issues like unsafe YAML loading. pip-audit and Trivy handle known CVEs. Gitleaks prevents credential leaks. Checkov ensures the infrastructure itself is secure — no public S3 buckets, no missing encryption, no overly permissive IAM.&lt;/p&gt;

&lt;h3&gt;Container Security&lt;/h3&gt;

&lt;p&gt;Production containers are built with security as a first-class concern:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Distroless base images&lt;/strong&gt;: No shell, minimal attack surface&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Non-root user&lt;/strong&gt;: Containers run as &lt;code&gt;nonroot&lt;/code&gt; by default&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Read-only filesystem&lt;/strong&gt;: No writes to the root filesystem&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pinned dependencies&lt;/strong&gt;: Lock files with hashes for reproducibility&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Resource limits&lt;/strong&gt;: CPU and memory constraints to prevent abuse&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;CI/CD with GitHub Actions OIDC&lt;/h3&gt;

&lt;p&gt;The project uses GitHub Actions OIDC to assume AWS IAM roles without storing long-lived credentials. A CDK bootstrap stack (&lt;code&gt;GitHubOIDCStack&lt;/code&gt;) creates the OIDC provider and two IAM roles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Deploy Role&lt;/strong&gt; (&lt;code&gt;github-actions-deploy&lt;/code&gt;): For CDK deployments and ECR pushes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Scan Role&lt;/strong&gt; (&lt;code&gt;github-actions-security-scan&lt;/code&gt;): For Prowler audits with read-only access&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;No AWS access keys are stored in GitHub Secrets. Every deployment uses short-lived credentials (1 hour) obtained via OIDC federation.&lt;/p&gt;

&lt;h2&gt;The Golden Thread: Distributed Security Tracing&lt;/h2&gt;

&lt;p&gt;One of the most interesting aspects of the project is the &lt;strong&gt;Golden Thread&lt;/strong&gt; — a distributed tracing exercise that demonstrates end-to-end request correlation across all infrastructure layers using a single correlation ID (&lt;code&gt;X-Request-ID&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;The correlation ID flows through:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Edge Layer&lt;/strong&gt;: WAF logs (CloudFront + ALB)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Application Layer&lt;/strong&gt;: API structured logs (structlog JSON)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Async Layer&lt;/strong&gt;: Kinesis event → Worker logs&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Database Layer&lt;/strong&gt;: PostgreSQL audit logs (pgaudit) with SQL comments
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/* correlation_id=golden-thread-test-1739... */ INSERT INTO messages (id, content_text, ...) VALUES (...)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
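
&lt;p&gt;The propagation itself is simple. A minimal sketch of the two key operations — reusing an inbound &lt;code&gt;X-Request-ID&lt;/code&gt; (or minting one) and prefixing queries with the comment pgaudit captures — might look like this (function names are hypothetical):&lt;/p&gt;

```python
import uuid

HEADER = "X-Request-ID"

def correlation_id(headers: dict) -> str:
    """Return the inbound X-Request-ID, or mint one so every hop has an ID to log."""
    return headers.get(HEADER) or str(uuid.uuid4())

def sql_with_correlation(cid: str, sql: str) -> str:
    """Prefix a query with the comment that pgaudit will preserve in the audit log."""
    return f"/* correlation_id={cid} */ {sql}"

cid = correlation_id({"X-Request-ID": "golden-thread-test-1739"})
print(sql_with_correlation(cid, "INSERT INTO messages (id) VALUES (1)"))
# → /* correlation_id=golden-thread-test-1739 */ INSERT INTO messages (id) VALUES (1)
```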



&lt;p&gt;Every layer is queryable from CloudWatch Logs Insights with a single filter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="n"&gt;fields&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="nb"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;service&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;correlation_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;status_code&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;filter&lt;/span&gt; &lt;span class="n"&gt;correlation_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nv"&gt;"YOUR_TRACE_ID"&lt;/span&gt;
&lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="n"&gt;sort&lt;/span&gt; &lt;span class="o"&gt;@&lt;/span&gt;&lt;span class="nb"&gt;timestamp&lt;/span&gt; &lt;span class="k"&gt;asc&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Attack Simulation Exercises
&lt;/h3&gt;

&lt;p&gt;The project includes hands-on exercises that simulate real attacks and trace them across layers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;SQL Injection&lt;/strong&gt;: Send a &lt;code&gt;' OR 1=1--&lt;/code&gt; payload, verify WAF blocks it, confirm the request never reaches the application&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;XSS Attempts&lt;/strong&gt;: Send &lt;code&gt;&amp;lt;script&amp;gt;alert(1)&amp;lt;/script&amp;gt;&lt;/code&gt;, trace the WAF block in CloudWatch&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Brute Force Detection&lt;/strong&gt;: Send 60 rapid requests, observe rate limiting kick in, check CloudWatch alarms&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Full Pipeline Trace&lt;/strong&gt;: Schedule a message with a known correlation ID, trace it from CloudFront WAF → ALB WAF → API → Kinesis → Worker → PostgreSQL&lt;/li&gt;
&lt;/ul&gt;
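
&lt;p&gt;The first exercise can be scripted in a few lines. This sketch sends the SQLi payload with a known correlation ID so the block can be traced in the WAF logs — the &lt;code&gt;/api/messages&lt;/code&gt; path and the query parameter are assumptions, not the project's actual endpoint:&lt;/p&gt;

```python
import urllib.error
import urllib.parse
import urllib.request

def probe_sqli(base_url: str) -> int:
    """Send a classic SQLi payload; a WAF in front should answer 403 before the app sees it."""
    payload = urllib.parse.quote("' OR 1=1--")
    req = urllib.request.Request(
        f"{base_url}/api/messages?q={payload}",
        headers={"X-Request-ID": "sqli-exercise-001"},  # correlate the block in WAF logs
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status  # 200 here means the payload got through — a finding
    except urllib.error.HTTPError as exc:
        return exc.code  # 403 means the WAF blocked it at the edge
```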

&lt;p&gt;The documentation also includes exercises using professional penetration testing tools — nmap, nikto, sqlmap, ffuf, nuclei, and hydra — each with corresponding CloudWatch queries to trace the attack signatures across the infrastructure.&lt;/p&gt;

&lt;h2&gt;
  
  
  AI-Powered Content Adaptation
&lt;/h2&gt;

&lt;p&gt;The Worker service supports two publishing strategies, selectable via configuration:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBQdWJsaXNoZXJbIlNvY2lhbE1lZGlhUHVibGlzaGVyIFBvcnQiXQogICAgICAgIERJUkVDVFsiRGlyZWN0UHVibGlzaGVyPGJyLz5TYW1lIGNvbnRlbnQsIGFsbCBjaGFubmVscyJdCiAgICAgICAgQUdFTlRbIkFnZW50UHVibGlzaGVyPGJyLz5BSS1hZGFwdGVkIHBlciBwbGF0Zm9ybSJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBUb29sc1siQHRvb2wgRnVuY3Rpb25zIl0KICAgICAgICBGQlsicG9zdF90b19mYWNlYm9vaygpIl0KICAgICAgICBJR1sicG9zdF90b19pbnN0YWdyYW0oKSJdCiAgICAgICAgTElbInBvc3RfdG9fbGlua2VkaW4oKSJdCiAgICAgICAgV0FbInNlbmRfd2hhdHNhcHAoKSJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBHYXRld2F5c1siQ2hhbm5lbCBHYXRld2F5cyJdCiAgICAgICAgRkJfR1dbIkZhY2Vib29rIEdhdGV3YXkiXQogICAgICAgIElHX0dXWyJJbnN0YWdyYW0gR2F0ZXdheSJdCiAgICAgICAgTElfR1dbIkxpbmtlZEluIEdhdGV3YXkiXQogICAgICAgIFdBX0dXWyJXaGF0c0FwcCBHYXRld2F5Il0KICAgIGVuZAoKICAgIERJUkVDVCAtLT4gR2F0ZXdheXMKICAgIEFHRU5UIC0tPiBUb29scyAtLT4gR2F0ZXdheXM%3D" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBQdWJsaXNoZXJbIlNvY2lhbE1lZGlhUHVibGlzaGVyIFBvcnQiXQogICAgICAgIERJUkVDVFsiRGlyZWN0UHVibGlzaGVyPGJyLz5TYW1lIGNvbnRlbnQsIGFsbCBjaGFubmVscyJdCiAgICAgICAgQUdFTlRbIkFnZW50UHVibGlzaGVyPGJyLz5BSS1hZGFwdGVkIHBlciBwbGF0Zm9ybSJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBUb29sc1siQHRvb2wgRnVuY3Rpb25zIl0KICAgICAgICBGQlsicG9zdF90b19mYWNlYm9vaygpIl0KICAgICAgICBJR1sicG9zdF90b19pbnN0YWdyYW0oKSJdCiAgICAgICAgTElbInBvc3RfdG9fbGlua2VkaW4oKSJdCiAgICAgICAgV0FbInNlbmRfd2hhdHNhcHAoKSJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBHYXRld2F5c1siQ2hhbm5lbCBHYXRld2F5cyJdCiAgICAgICAgRkJfR1dbIkZhY2Vib29rIEdhdGV3YXkiXQogICAgICAgIElHX0dXWyJJbnN0YWdyYW0gR2F0ZXdheSJdCiAgICAgICAgTElfR1dbIkxpbmtlZEluIEdhdGV3YXkiXQogICAgICAgIFdBX0dXWyJXaGF0c0FwcCBHYXRld2F5Il0KICAgIGVuZAoKICAgIERJUkVDVCAtLT4gR2F0ZXdheXMKICAgIEFHRU5UIC0tPiBUb29scyAtLT4gR2F0ZXdheXM%3D" alt="Mermaid Diagram" width="616" height="1116"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;AgentPublisher&lt;/strong&gt; uses the &lt;a href="https://strandsagents.com" rel="noopener noreferrer"&gt;Strands Agents SDK&lt;/a&gt; with Amazon Bedrock (Claude) to intelligently adapt content for each platform:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Platform&lt;/th&gt;
&lt;th&gt;Adaptation&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Facebook&lt;/td&gt;
&lt;td&gt;Longer posts, emojis, hashtags at end&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Instagram&lt;/td&gt;
&lt;td&gt;Visual focus, heavy emojis, hashtags&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;LinkedIn&lt;/td&gt;
&lt;td&gt;Professional tone, formal language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;WhatsApp&lt;/td&gt;
&lt;td&gt;Short, celebratory, personal&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The agent reasons about which tools to call, executes them, observes results, and continues until the task is complete. Content filtering middleware protects against prompt injection and PII leakage in AI-generated output.&lt;/p&gt;
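
&lt;p&gt;The adaptation rules in the table are easy to express. A simplified sketch — in the real Worker these rules sit behind the &lt;code&gt;@tool&lt;/code&gt; functions the Strands agent chooses between, and the helper name here is hypothetical:&lt;/p&gt;

```python
# Per-platform style rules taken from the table above
STYLE = {
    "facebook": "Longer post, emojis, hashtags at the end",
    "instagram": "Visual focus, heavy emojis, hashtags",
    "linkedin": "Professional tone, formal language",
    "whatsapp": "Short, celebratory, personal",
}

def adaptation_prompt(platform: str, announcement: str) -> str:
    """Build the instruction the agent hands to the model for one channel."""
    return (
        f"Rewrite this certification announcement for {platform}. "
        f"Style: {STYLE[platform]}.\n\n{announcement}"
    )

print(adaptation_prompt("linkedin", "Maria earned AWS Solutions Architect Associate!"))
```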

&lt;h2&gt;
  
  
  AI-Powered Security Testing Agent
&lt;/h2&gt;

&lt;p&gt;Perhaps the most novel addition to the project is the &lt;strong&gt;Security Testing Agent&lt;/strong&gt; — an AI-powered interactive penetration testing assistant built with the Strands Agents SDK and Amazon Bedrock. Instead of manually running security tests and interpreting results, you have a conversation with an agent that can run tests, diagnose failures, query AWS resources, and even auto-fix test code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBVc2VyCiAgICAgICAgQ0xJWyJUZXJtaW5hbCAvIEludGVyYWN0aXZlIFByb21wdCJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBBZ2VudFsiU3RyYW5kcyBBZ2VudCAoQ2xhdWRlIEhhaWt1IDQuNSkiXQogICAgICAgIExPT1BbIkFnZW50IExvb3A8YnIvPuKAoiBSZWFzb25pbmc8YnIvPuKAoiBUb29sIFNlbGVjdGlvbjxici8%2B4oCiIEV4ZWN1dGlvbiJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBUb29sc1siQHRvb2wgRnVuY3Rpb25zIl0KICAgICAgICBURVNUWyJUZXN0IFJ1bm5lcnM8YnIvPnJ1bl9weXRlc3RfdGVzdDxici8%2BcnVuX2FsbF90ZXN0c19wYXJhbGxlbDxici8%2BcnVuX2Zhc3RfdGVzdHMiXQogICAgICAgIEFXU1siQVdTIFJlYWQtT25seTxici8%2BQ2xvdWRGb3JtYXRpb24gb3V0cHV0czxici8%2BQ2xvdWRXYXRjaCBMb2dzPGJyLz5XQUYgV2ViIEFDTHMiXQogICAgICAgIENPREVbIkNvZGUgVG9vbHM8YnIvPnJlYWRfdGVzdF9maWxlPGJyLz5maXhfdGVzdF9jb2RlIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIEluZnJhWyJBV1MgUmVzb3VyY2VzIChSZWFkLU9ubHkpIl0KICAgICAgICBDRlsiQ2xvdWRGb3JtYXRpb24iXQogICAgICAgIENXWyJDbG91ZFdhdGNoIExvZ3MiXQogICAgICAgIFdBRlsiV0FGIHYyIl0KICAgIGVuZAoKICAgIENMSSAtLT4gTE9PUAogICAgTE9PUCAtLT4gVEVTVAogICAgTE9PUCAtLT4gQVdTCiAgICBMT09QIC0tPiBDT0RFCiAgICBBV1MgLS0%2BIENGCiAgICBBV1MgLS0%2BIENXCiAgICBBV1MgLS0%2BIFdBRgoKICAgIHN0eWxlIEFnZW50IGZpbGw6I2UzZjJmZAogICAgc3R5bGUgVG9vbHMgZmlsbDojZmZmM2UwCiAgICBzdHlsZSBJbmZyYSBmaWxsOiNlOGY1ZTk%3D" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBVc2VyCiAgICAgICAgQ0xJWyJUZXJtaW5hbCAvIEludGVyYWN0aXZlIFByb21wdCJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBBZ2VudFsiU3RyYW5kcyBBZ2VudCAoQ2xhdWRlIEhhaWt1IDQuNSkiXQogICAgICAgIExPT1BbIkFnZW50IExvb3A8YnIvPuKAoiBSZWFzb25pbmc8YnIvPuKAoiBUb29sIFNlbGVjdGlvbjxici8%2B4oCiIEV4ZWN1dGlvbiJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBUb29sc1siQHRvb2wgRnVuY3Rpb25zIl0KICAgICAgICBURVNUWyJUZXN0IFJ1bm5lcnM8YnIvPnJ1bl9weXRlc3RfdGVzdDxici8%2BcnVuX2FsbF90ZXN0c19wYXJhbGxlbDxici8%2BcnVuX2Zhc3RfdGVzdHMiXQogICAgICAgIEFXU1siQVdTIFJlYWQtT25seTxici8%2BQ2xvdWRGb3JtYXRpb24gb3V0cHV0czxici8%2BQ2xvdWRXYXRjaCBMb2dzPGJyLz5XQUYgV2ViIEFDTHMiXQogICAgICAgIENPREVbIkNvZGUgVG9vbHM8YnIvPnJlYWRfdGVzdF9maWxlPGJyLz5maXhfdGVzdF9jb2RlIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIEluZnJhWyJBV1MgUmVzb3VyY2VzIChSZWFkLU9ubHkpIl0KICAgICAgICBDRlsiQ2xvdWRGb3JtYXRpb24iXQogICAgICAgIENXWyJDbG91ZFdhdGNoIExvZ3MiXQogICAgICAgIFdBRlsiV0FGIHYyIl0KICAgIGVuZAoKICAgIENMSSAtLT4gTE9PUAogICAgTE9PUCAtLT4gVEVTVAogICAgTE9PUCAtLT4gQVdTCiAgICBMT09QIC0tPiBDT0RFCiAgICBBV1MgLS0%2BIENGCiAgICBBV1MgLS0%2BIENXCiAgICBBV1MgLS0%2BIFdBRgoKICAgIHN0eWxlIEFnZW50IGZpbGw6I2UzZjJmZAogICAgc3R5bGUgVG9vbHMgZmlsbDojZmZmM2UwCiAgICBzdHlsZSBJbmZyYSBmaWxsOiNlOGY1ZTk%3D" alt="Mermaid Diagram" width="1212" height="540"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The interaction looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;🔒 Penetration Testing Agent Ready!

You: Run all fast tests
Agent: I'll run the fast test suite in parallel, skipping slow scans...
       ✅ All tests passed in 8 seconds!

You: The TLS test is failing, can you debug it?
Agent: Let me run the TLS test and check the logs...
       [runs test, queries CloudFormation for endpoints, checks CloudWatch]
       The certificate for api.ugcbba.click expired. Here's what I found...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Strict Guardrails
&lt;/h3&gt;

&lt;p&gt;The agent enforces strict security boundaries:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Can only run tests from a hardcoded allowlist — no arbitrary command execution&lt;/li&gt;
&lt;li&gt;AWS access is read-only — cannot create, modify, or delete any resources&lt;/li&gt;
&lt;li&gt;File access is sandboxed to the &lt;code&gt;testing/&lt;/code&gt; directory — path traversal is blocked&lt;/li&gt;
&lt;li&gt;Refuses tasks unrelated to penetration testing&lt;/li&gt;
&lt;/ul&gt;
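
&lt;p&gt;The first and third guardrails are the interesting ones to implement. A minimal sketch of both checks (the test names are hypothetical — the real allowlist mirrors the pytest suite):&lt;/p&gt;

```python
from pathlib import Path

ALLOWED_TESTS = {"test_health", "test_tls", "test_sqli", "test_xss"}  # hypothetical names
SANDBOX = Path("testing").resolve()

def check_test(name: str) -> str:
    """Reject anything outside the hardcoded allowlist — no arbitrary command execution."""
    if name not in ALLOWED_TESTS:
        raise PermissionError(f"test {name!r} is not in the allowlist")
    return name

def check_path(user_path: str) -> Path:
    """Resolve the path and refuse anything that escapes the testing/ sandbox."""
    resolved = (SANDBOX / user_path).resolve()
    if not resolved.is_relative_to(SANDBOX):  # blocks ../ traversal after resolution
        raise PermissionError(f"{user_path!r} escapes the sandbox")
    return resolved
```

&lt;p&gt;Resolving the path &lt;em&gt;before&lt;/em&gt; comparing prefixes is the important detail — a naive string check is trivially defeated by &lt;code&gt;../&lt;/code&gt; segments.&lt;/p&gt;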

&lt;h3&gt;
  
  
  The Test Suite
&lt;/h3&gt;

&lt;p&gt;The agent wraps a comprehensive pytest-based security test suite running inside a Kali Linux container:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Category&lt;/th&gt;
&lt;th&gt;Tests&lt;/th&gt;
&lt;th&gt;Duration&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Fast (&amp;lt; 10s each)&lt;/td&gt;
&lt;td&gt;Health, security headers, TLS, CORS, cookies, error disclosure, HTTP methods, origin access, SQLi, XSS&lt;/td&gt;
&lt;td&gt;~10s total&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Medium (15-30s)&lt;/td&gt;
&lt;td&gt;Rate limiting, CSRF end-to-end, IDOR protection&lt;/td&gt;
&lt;td&gt;~1 min&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Slow (30s-10min)&lt;/td&gt;
&lt;td&gt;nmap port scan, Nikto web scan, sqlmap automated injection&lt;/td&gt;
&lt;td&gt;~15 min&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The agent is smart about which tests to run — it prioritizes recently failed tests, skips slow scans unless explicitly requested, and caches successful results for 5 minutes to avoid redundant runs. Parallel execution via pytest-xdist makes the full fast suite complete in about 8 seconds.&lt;/p&gt;
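
&lt;p&gt;The five-minute result cache is worth sketching, because it is what keeps a conversational loop from re-running the same slow scan twice (the helper shape is an assumption, not the agent's actual code):&lt;/p&gt;

```python
import time

CACHE_TTL = 300  # successful results are reused for 5 minutes, per the text above

_cache: dict[str, tuple[float, bool]] = {}

def run_cached(test_name: str, runner) -> bool:
    """Skip a re-run when the same test passed within the TTL; failures are never cached."""
    hit = _cache.get(test_name)
    if hit and hit[1] and time.monotonic() - hit[0] < CACHE_TTL:
        return True  # cached pass
    passed = runner(test_name)
    _cache[test_name] = (time.monotonic(), passed)
    return passed
```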

&lt;h2&gt;
  
  
  Automated Penetration Testing Framework
&lt;/h2&gt;

&lt;p&gt;Beyond the AI agent, the project includes a standalone automated penetration testing framework using Dockerized Kali Linux. This is the foundation that the agent builds on, but it can also be used independently in CI/CD pipelines.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Build the Kali container&lt;/span&gt;
docker build &lt;span class="nt"&gt;-f&lt;/span&gt; testing/Dockerfile.kali &lt;span class="nt"&gt;-t&lt;/span&gt; pentest:latest testing/

&lt;span class="c"&gt;# Quick smoke test&lt;/span&gt;
docker run &lt;span class="nt"&gt;--rm&lt;/span&gt; &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;TARGET_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://your-alb.amazonaws.com pentest smoke

&lt;span class="c"&gt;# Full test suite&lt;/span&gt;
docker run &lt;span class="nt"&gt;--rm&lt;/span&gt; &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;TARGET_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://your-alb.amazonaws.com pentest all

&lt;span class="c"&gt;# Generate report&lt;/span&gt;
docker run &lt;span class="nt"&gt;--rm&lt;/span&gt; &lt;span class="nt"&gt;-v&lt;/span&gt; &lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;pwd&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;:/tests &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;TARGET_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$TARGET_URL&lt;/span&gt; pentest report
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The framework integrates with GitHub Actions for scheduled weekly scans and on-demand testing, with results uploaded as workflow artifacts.&lt;/p&gt;

&lt;h2&gt;
  
  
  Threat Detection and Incident Response
&lt;/h2&gt;

&lt;p&gt;The platform implements automated threat detection and response:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBEZXRlY3Rpb25bIlRocmVhdCBEZXRlY3Rpb24iXQogICAgICAgIEdEWyJHdWFyZER1dHk8YnIvPk1hbGljaW91cyBJUHMsIHVudXN1YWwgQVBJIGNhbGxzIl0KICAgICAgICBTSFsiU2VjdXJpdHkgSHViPGJyLz5BZ2dyZWdhdGVkIGZpbmRpbmdzLCBjb21wbGlhbmNlIl0KICAgICAgICBDVFsiQ2xvdWRUcmFpbDxici8%2BQVBJIGF1ZGl0IGxvZyJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBSZXNwb25zZVsiQXV0b21hdGVkIFJlc3BvbnNlIl0KICAgICAgICBFQlsiRXZlbnRCcmlkZ2UgUnVsZSJdCiAgICAgICAgTEFNQkRBWyJSZXNwb25zZSBMYW1iZGEiXQogICAgICAgIFdBRlsiV0FGIElQIFNldCBVcGRhdGUiXQogICAgZW5kCgogICAgc3ViZ3JhcGggQWxlcnRbIkFsZXJ0aW5nIl0KICAgICAgICBTTEFDS1siU2xhY2siXQogICAgZW5kCgogICAgR0QgLS0%2BfCJGaW5kaW5nInwgRUIKICAgIEVCIC0tPnwiU2V2ZXJpdHkgPj0gNCJ8IExBTUJEQQogICAgTEFNQkRBIC0tPnwiQmxvY2sgSVAifCBXQUYKICAgIExBTUJEQSAtLT58Ik5vdGlmeSJ8IEFsZXJ0CiAgICBTSCAtLi0%2BIEdECiAgICBDVCAtLi0%2BIFNI" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRCCiAgICBzdWJncmFwaCBEZXRlY3Rpb25bIlRocmVhdCBEZXRlY3Rpb24iXQogICAgICAgIEdEWyJHdWFyZER1dHk8YnIvPk1hbGljaW91cyBJUHMsIHVudXN1YWwgQVBJIGNhbGxzIl0KICAgICAgICBTSFsiU2VjdXJpdHkgSHViPGJyLz5BZ2dyZWdhdGVkIGZpbmRpbmdzLCBjb21wbGlhbmNlIl0KICAgICAgICBDVFsiQ2xvdWRUcmFpbDxici8%2BQVBJIGF1ZGl0IGxvZyJdCiAgICBlbmQKCiAgICBzdWJncmFwaCBSZXNwb25zZVsiQXV0b21hdGVkIFJlc3BvbnNlIl0KICAgICAgICBFQlsiRXZlbnRCcmlkZ2UgUnVsZSJdCiAgICAgICAgTEFNQkRBWyJSZXNwb25zZSBMYW1iZGEiXQogICAgICAgIFdBRlsiV0FGIElQIFNldCBVcGRhdGUiXQogICAgZW5kCgogICAgc3ViZ3JhcGggQWxlcnRbIkFsZXJ0aW5nIl0KICAgICAgICBTTEFDS1siU2xhY2siXQogICAgZW5kCgogICAgR0QgLS0%2BfCJGaW5kaW5nInwgRUIKICAgIEVCIC0tPnwiU2V2ZXJpdHkgPj0gNCJ8IExBTUJEQQogICAgTEFNQkRBIC0tPnwiQmxvY2sgSVAifCBXQUYKICAgIExBTUJEQSAtLT58Ik5vdGlmeSJ8IEFsZXJ0CiAgICBTSCAtLi0%2BIEdECiAgICBDVCAtLi0%2BIFNI" alt="Mermaid Diagram" width="528" 
height="952"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;GuardDuty monitors for threats. When a finding with severity &amp;gt;= 4 is detected, an EventBridge rule triggers a Lambda function that automatically blocks the offending IP in the WAF IP set and sends a detailed alert to Slack. Security Hub aggregates findings from GuardDuty, CloudTrail, and AWS Config rules for a unified compliance view.&lt;/p&gt;
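
&lt;p&gt;The response Lambda is small. A sketch of its core — the IP set name and ID are placeholders, and the finding shape shown is the one GuardDuty uses for network-connection findings (other finding types nest the IP differently):&lt;/p&gt;

```python
def block_ip(waf, ip_set_name: str, ip_set_id: str, ip: str, scope: str = "REGIONAL") -> None:
    """Add the offending address to the WAF IP set (wafv2 requires the LockToken dance)."""
    current = waf.get_ip_set(Name=ip_set_name, Scope=scope, Id=ip_set_id)
    addresses = set(current["IPSet"]["Addresses"])
    addresses.add(f"{ip}/32")
    waf.update_ip_set(
        Name=ip_set_name,
        Scope=scope,
        Id=ip_set_id,
        Addresses=sorted(addresses),
        LockToken=current["LockToken"],  # optimistic lock from get_ip_set
    )

def handler(event, context):
    """EventBridge-invoked entry point: block on GuardDuty findings of severity >= 4."""
    import boto3  # available in the Lambda runtime
    finding = event["detail"]
    if finding.get("severity", 0) < 4:
        return
    ip = finding["service"]["action"]["networkConnectionAction"]["remoteIpDetails"]["ipAddressV4"]
    block_ip(boto3.client("wafv2"), "blocked-ips", "YOUR_IP_SET_ID", ip)  # placeholders
```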

&lt;h3&gt;
  
  
  Security Observability
&lt;/h3&gt;

&lt;p&gt;CloudWatch dashboards and alarms provide real-time visibility:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Metric&lt;/th&gt;
&lt;th&gt;Threshold&lt;/th&gt;
&lt;th&gt;Action&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Auth failures (5min)&lt;/td&gt;
&lt;td&gt;&amp;gt;= 20&lt;/td&gt;
&lt;td&gt;Alarm + SNS&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;CSRF failures (5min)&lt;/td&gt;
&lt;td&gt;&amp;gt;= 20&lt;/td&gt;
&lt;td&gt;Alarm + SNS&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Access denied (5min)&lt;/td&gt;
&lt;td&gt;&amp;gt;= 30&lt;/td&gt;
&lt;td&gt;Alarm + SNS&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rate limit hits (5min)&lt;/td&gt;
&lt;td&gt;&amp;gt;= 100&lt;/td&gt;
&lt;td&gt;Alarm + SNS&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Error rate (5min)&lt;/td&gt;
&lt;td&gt;&amp;gt;= 10&lt;/td&gt;
&lt;td&gt;Alarm + SNS&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Critical errors (1min)&lt;/td&gt;
&lt;td&gt;&amp;gt;= 1&lt;/td&gt;
&lt;td&gt;Alarm + SNS&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;API latency p95&lt;/td&gt;
&lt;td&gt;&amp;gt; 1000ms&lt;/td&gt;
&lt;td&gt;Alarm + SNS&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
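
&lt;p&gt;Each row of the table maps directly onto one &lt;code&gt;PutMetricAlarm&lt;/code&gt; call. A sketch for the first row — the alarm name and metric namespace are illustrative, not the project's actual values:&lt;/p&gt;

```python
def auth_failure_alarm(cloudwatch, sns_topic_arn: str) -> None:
    """First table row: >= 20 auth failures in a 5-minute window -> Alarm + SNS."""
    cloudwatch.put_metric_alarm(
        AlarmName="api-auth-failures",       # illustrative name
        Namespace="PublisherApp/Security",   # assumed custom namespace
        MetricName="AuthFailures",
        Statistic="Sum",
        Period=300,                          # 5-minute window
        EvaluationPeriods=1,
        Threshold=20,
        ComparisonOperator="GreaterThanOrEqualToThreshold",
        AlarmActions=[sns_topic_arn],        # "+ SNS" column
    )
```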

&lt;h2&gt;
  
  
  The Developer Experience
&lt;/h2&gt;

&lt;p&gt;A good DevSecOps project is not just about security and architecture — it is also about making the developer's life easier. The project uses &lt;a href="https://www.jetpack.io/devbox/" rel="noopener noreferrer"&gt;Devbox&lt;/a&gt; for isolated development environments and &lt;a href="https://github.com/casey/just" rel="noopener noreferrer"&gt;Just&lt;/a&gt; as a task runner.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;justfile&lt;/code&gt; provides a comprehensive set of commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Development&lt;/span&gt;
just dev          &lt;span class="c"&gt;# Start local services and tail logs&lt;/span&gt;
just up           &lt;span class="c"&gt;# Start PostgreSQL + LocalStack&lt;/span&gt;
just &lt;span class="nb"&gt;test&lt;/span&gt;         &lt;span class="c"&gt;# Run all tests&lt;/span&gt;
just lint-local   &lt;span class="c"&gt;# Lint all services locally&lt;/span&gt;

&lt;span class="c"&gt;# Security&lt;/span&gt;
just security-scan    &lt;span class="c"&gt;# SAST + SCA + Secrets scan&lt;/span&gt;
just trivy-scan       &lt;span class="c"&gt;# Container vulnerability scan&lt;/span&gt;
just sbom-scan        &lt;span class="c"&gt;# Generate and scan SBOMs&lt;/span&gt;
just iac-scan         &lt;span class="c"&gt;# Infrastructure scan with Checkov&lt;/span&gt;
just pentest-full     &lt;span class="c"&gt;# Run the full penetration test suite&lt;/span&gt;

&lt;span class="c"&gt;# AWS Resource Management (Cost Saving)&lt;/span&gt;
just aws-up       &lt;span class="c"&gt;# Scale up ECS services + start RDS&lt;/span&gt;
just aws-down     &lt;span class="c"&gt;# Scale down to zero (save money)&lt;/span&gt;
just aws-status   &lt;span class="c"&gt;# Check resource status&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;aws-up&lt;/code&gt; and &lt;code&gt;aws-down&lt;/code&gt; commands are particularly useful — they allow scaling the entire infrastructure to zero when not in use, saving significant costs during development.&lt;/p&gt;
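
&lt;p&gt;Under the hood those recipes amount to two API calls per resource. A sketch of the idea — cluster, service, and database names are illustrative:&lt;/p&gt;

```python
def aws_down(ecs, rds, cluster: str, services: list[str], db_instance: str) -> None:
    """Scale every ECS service to zero and stop the database — the idle-cost killer."""
    for service in services:
        ecs.update_service(cluster=cluster, service=service, desiredCount=0)
    rds.stop_db_instance(DBInstanceIdentifier=db_instance)

def aws_up(ecs, rds, cluster: str, services: list[str], db_instance: str, count: int = 1) -> None:
    """Bring everything back for a development session."""
    rds.start_db_instance(DBInstanceIdentifier=db_instance)
    for service in services:
        ecs.update_service(cluster=cluster, service=service, desiredCount=count)
```

&lt;p&gt;Note that RDS automatically restarts a stopped instance after seven days, so this pattern suits active development, not long-term mothballing.&lt;/p&gt;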

&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;

&lt;p&gt;The codebase is organized for clarity and separation of concerns:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── api/                    # API service (FastAPI + hexagonal)
│   ├── src/
│   │   ├── domain/         # Business entities, value objects
│   │   ├── application/    # Use cases, ports, DTOs
│   │   ├── infrastructure/ # Adapters (DB, Kinesis, Secrets)
│   │   └── presentation/   # HTTP routes, middleware
│   └── tests/
│
├── worker/                 # Worker service (Kinesis consumer)
│   ├── src/
│   │   ├── domain/         # Ports for channels, publishers
│   │   ├── application/    # Delivery service
│   │   ├── infrastructure/ # Publisher adapters (Direct, AI Agent)
│   │   └── channels/       # Channel gateways (FB, IG, LI, Email, SMS)
│   └── tests/
│
├── scheduler/              # Scheduler service (cron + ECS)
├── api-lambda/             # API Lambda handler (serverless)
├── worker-lambda/          # Worker Lambda handler (serverless)
├── scheduler-lambda/       # Scheduler Lambda handler (serverless)
├── web/                    # Frontend (React + Vite + TypeScript)
│
├── testing/                # Penetration testing framework
│   ├── pentest_agent.py    # AI security testing agent (Strands SDK)
│   ├── test_pentest.py     # Security test cases (pytest)
│   ├── Dockerfile.kali     # Kali Linux container with tools
│   └── justfile            # Pentest task runner
│
├── infra/                  # CDK infrastructure (containers)
│   └── stacks/             # Network, Security, Auth, Data, Compute, Edge, Monitoring
│
├── infra-fs/               # CDK infrastructure (serverless)
│   └── stacks/             # Data, Auth, API, Worker, Scheduler, Security, Frontend
│
├── docs/                   # Comprehensive documentation
├── .github/workflows/      # CI/CD (deploy + security scan)
├── devbox.json             # Development environment
├── docker-compose.yml      # Local services
└── justfile                # Task runner
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Lessons Learned
&lt;/h2&gt;

&lt;p&gt;Building this project reinforced several convictions:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hexagonal architecture is not academic overhead.&lt;/strong&gt; It is the single decision that enabled dual-mode deployment, clean testing, and a codebase that remains navigable as it grows. The upfront investment in defining ports and adapters pays for itself many times over.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Security scanning must be layered.&lt;/strong&gt; No single tool catches everything. The combination of SAST, SCA, secrets scanning, IaC scanning, and DAST provides genuine defense in depth. The key is making these scans fast and automatic — if they slow down the developer, they will be bypassed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Correlation IDs are non-negotiable.&lt;/strong&gt; The ability to trace a single request from CloudFront through the API, into Kinesis, through the Worker, and into the database is invaluable for debugging, incident response, and security forensics. Implement them from day one.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cost management is a feature.&lt;/strong&gt; The &lt;code&gt;aws-up&lt;/code&gt; / &lt;code&gt;aws-down&lt;/code&gt; pattern for scaling resources to zero when not in use is simple but effective. For a project like this, it is the difference between a $200/month bill and a $5/month bill during development.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AI agents need guardrails.&lt;/strong&gt; The Strands Agents integration is powerful, but without content filtering, prompt injection detection, and output validation, it is a liability. The agent can only execute pre-approved operations through well-defined tool interfaces — never raw API calls. The same principle applies to the Security Testing Agent: it operates within a strict allowlist of tests and read-only AWS access, proving that AI-powered automation and security boundaries can coexist.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This project is a living laboratory. It is not finished, and it probably never will be — because the landscape of threats, tools, and best practices is always evolving. But it serves its purpose: a concrete, open-source reference for how to build modern cloud-native applications with security woven into every layer.&lt;/p&gt;

&lt;p&gt;If you are starting a new project and wondering where to begin with DevSecOps, my advice is simple: start with the architecture. Get hexagonal architecture right, define your ports and adapters, and the rest — security, testing, deployment flexibility — becomes dramatically easier.&lt;/p&gt;

&lt;p&gt;The code is open source. Fork it, break it, improve it. That is what it is there for.&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Repository&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc" rel="noopener noreferrer"&gt;https://github.com/Walsen/devsecops-poc&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hexagonal Architecture Reference&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/hexagonal-architecture.md" rel="noopener noreferrer"&gt;docs/hexagonal-architecture.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Documentation&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/security.md" rel="noopener noreferrer"&gt;docs/security.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Golden Thread Tracing&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/golden-thread-tracing.md" rel="noopener noreferrer"&gt;docs/golden-thread-tracing.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dual-Mode Deployment Guide&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/dual-mode-deployment.md" rel="noopener noreferrer"&gt;docs/dual-mode-deployment.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Penetration Testing Guide&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/penetration-testing.md" rel="noopener noreferrer"&gt;docs/penetration-testing.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Testing Agent&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/security-testing-agent.md" rel="noopener noreferrer"&gt;docs/security-testing-agent.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automated Penetration Testing&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/automated-penetration-testing.md" rel="noopener noreferrer"&gt;docs/automated-penetration-testing.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Testing Guide&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/testing.md" rel="noopener noreferrer"&gt;docs/testing.md&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI Agents Documentation&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/devsecops-poc/blob/main/docs/ai-agents.md" rel="noopener noreferrer"&gt;docs/ai-agents.md&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>automation</category>
      <category>aws</category>
      <category>devops</category>
      <category>security</category>
    </item>
    <item>
      <title>An Incredible Operations Platform - Rundeck</title>
      <dc:creator>Sergio D. Rodríguez Inclán</dc:creator>
      <pubDate>Sun, 01 Feb 2026 19:56:51 +0000</pubDate>
      <link>https://forem.com/w4ls3n/an-incredible-operations-platform-rundeck-1kan</link>
      <guid>https://forem.com/w4ls3n/an-incredible-operations-platform-rundeck-1kan</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;There's a moment every operations engineer knows well: it's 2 AM, something's broken, and you're frantically SSH-ing into servers trying to remember the exact sequence of commands to fix it. You've done this before, but was it &lt;code&gt;systemctl restart&lt;/code&gt; first or the config update? And which servers exactly?&lt;/p&gt;

&lt;p&gt;This is the problem Rundeck solves. It's an open-source runbook automation platform that lets you define, schedule, and execute operational procedures across your entire infrastructure—with proper access control, audit trails, and the peace of mind that comes from knowing the procedure will run exactly the same way every time.&lt;/p&gt;

&lt;p&gt;I've been using Rundeck for years, and recently I decided to contribute back to the ecosystem by creating three tools that solve specific pain points I encountered: a production-ready Docker image, a plugin for copying files between nodes, and a GitHub Action for seamless CI/CD integration. Let me walk you through all of them.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Makes Rundeck Special
&lt;/h2&gt;

&lt;p&gt;Rundeck occupies a unique space in the DevOps toolchain. It's not a configuration management tool like Ansible or Puppet—though it integrates beautifully with them. It's not a CI/CD platform like Jenkins—though it can trigger and be triggered by pipelines. Rundeck is specifically designed for &lt;strong&gt;operational workflows&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Core Capabilities
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Job Definitions&lt;/strong&gt;: Create multi-step workflows with conditionals, error handling, and node targeting&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Node Management&lt;/strong&gt;: Maintain a centralized inventory of your infrastructure with custom attributes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Access Control&lt;/strong&gt;: Fine-grained RBAC so developers can restart their services without full SSH access&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Key Storage&lt;/strong&gt;: Secure credential management for SSH keys, passwords, and API tokens&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Execution History&lt;/strong&gt;: Complete audit trail of who ran what, when, and with what results&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scheduling&lt;/strong&gt;: Cron-like scheduling for recurring maintenance tasks&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;REST API&lt;/strong&gt;: Full API for integration with CI/CD pipelines, monitoring systems, and AI agents&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Where It Shines
&lt;/h3&gt;

&lt;p&gt;In my experience, Rundeck excels at:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Incident Response&lt;/strong&gt;: Pre-built runbooks that on-call engineers can execute confidently&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Self-Service Operations&lt;/strong&gt;: Let developers restart services or clear caches without ops tickets&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Coordinated Procedures&lt;/strong&gt;: Multi-node operations that need to happen in a specific order&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Compliance Tasks&lt;/strong&gt;: Scheduled security scans, backup verifications, audit reports&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment Orchestration&lt;/strong&gt;: Coordinate deployments across environments with approval gates&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  REST API: Automation Beyond the UI
&lt;/h2&gt;

&lt;p&gt;One of Rundeck's most powerful features is its comprehensive REST API. Every action you can perform in the web interface is available programmatically, making Rundeck a perfect backend for automated operations.&lt;/p&gt;

&lt;h3&gt;
  
  
  API Capabilities
&lt;/h3&gt;

&lt;p&gt;The API supports:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Job execution&lt;/strong&gt;: Trigger jobs with custom parameters&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Execution monitoring&lt;/strong&gt;: Check status, stream logs, abort running jobs&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Job management&lt;/strong&gt;: Create, update, delete, import/export job definitions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Node inventory&lt;/strong&gt;: Query and manage node sources&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Key storage&lt;/strong&gt;: Programmatic credential management&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;System info&lt;/strong&gt;: Health checks, metrics, cluster status&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Authentication
&lt;/h3&gt;

&lt;p&gt;Rundeck supports multiple authentication methods for API access:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Using API Token (recommended)&lt;/span&gt;
curl &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"X-Rundeck-Auth-Token: YOUR_TOKEN"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  https://rundeck.example.com/api/41/projects

&lt;span class="c"&gt;# Using session cookie&lt;/span&gt;
curl &lt;span class="nt"&gt;-c&lt;/span&gt; cookies.txt &lt;span class="nt"&gt;-b&lt;/span&gt; cookies.txt &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s2"&gt;"j_username=admin&amp;amp;j_password=admin"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  https://rundeck.example.com/j_security_check
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Triggering Jobs Programmatically
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Run a job by ID&lt;/span&gt;
curl &lt;span class="nt"&gt;-X&lt;/span&gt; POST &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"X-Rundeck-Auth-Token: YOUR_TOKEN"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'{"options": {"environment": "production", "version": "1.2.3"}}'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  https://rundeck.example.com/api/41/job/JOB_ID/run

&lt;span class="c"&gt;# Response includes execution ID for monitoring&lt;/span&gt;
&lt;span class="o"&gt;{&lt;/span&gt;
  &lt;span class="s2"&gt;"id"&lt;/span&gt;: 12345,
  &lt;span class="s2"&gt;"href"&lt;/span&gt;: &lt;span class="s2"&gt;"https://rundeck.example.com/api/41/execution/12345"&lt;/span&gt;,
  &lt;span class="s2"&gt;"status"&lt;/span&gt;: &lt;span class="s2"&gt;"running"&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
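&lt;p&gt;Since the response includes the execution ID, a caller can poll the execution endpoint until the job finishes. Here is a minimal polling sketch in Python; the &lt;code&gt;fetch&lt;/code&gt; callable is injected so the HTTP client (and auth header) stays out of the loop logic, and the endpoint path matches the &lt;code&gt;/api/41/execution/ID&lt;/code&gt; URL shown above:&lt;br&gt;
&lt;/p&gt;

```python
import time

def wait_for_execution(execution_id, fetch, interval=5, timeout=300):
    """Poll GET /api/41/execution/{id} until the job leaves 'running'.

    fetch: a callable that takes a URL path and returns the parsed JSON
    dict, so the HTTP client and credentials can be swapped or stubbed.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch(f"/api/41/execution/{execution_id}")["status"]
        if status != "running":
            return status  # e.g. "succeeded", "failed", "aborted"
        time.sleep(interval)
    raise TimeoutError(f"execution {execution_id} still running after {timeout}s")
```

&lt;p&gt;In practice &lt;code&gt;fetch&lt;/code&gt; would wrap &lt;code&gt;requests.get&lt;/code&gt; with the &lt;code&gt;X-Rundeck-Auth-Token&lt;/code&gt; header, exactly like the curl examples above.&lt;/p&gt;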



&lt;h3&gt;
  
  
  Integration Scenarios
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;CI/CD Pipelines&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Trigger deployment jobs from Jenkins, GitHub Actions, or GitLab CI. I created a dedicated GitHub Action to make this integration even easier: &lt;a href="https://github.com/Walsen/rundeck-github-action" rel="noopener noreferrer"&gt;rundeck-github-action&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Using the Rundeck GitHub Action&lt;/span&gt;
&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Deploy via Rundeck&lt;/span&gt;
  &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Walsen/rundeck-github-action@v1&lt;/span&gt;
  &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;rundeck_url&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;https://rundeck.example.com&lt;/span&gt;
    &lt;span class="na"&gt;rundeck_token&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.RUNDECK_TOKEN }}&lt;/span&gt;
    &lt;span class="na"&gt;action&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;run_job&lt;/span&gt;
    &lt;span class="na"&gt;job_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ vars.DEPLOY_JOB_ID }}&lt;/span&gt;
    &lt;span class="na"&gt;job_options&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;{"version":&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"${{&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;github.sha&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;}}",&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"environment":&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;"production"}'&lt;/span&gt;
    &lt;span class="na"&gt;wait_for_completion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
    &lt;span class="na"&gt;timeout&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;600&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The action supports multiple operations:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Action&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;run_job&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Execute a Rundeck job with options&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;get_job_info&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Get job details&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;list_jobs&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;List jobs in a project&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;get_execution&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Get execution details&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;list_executions&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;List executions for a job or project&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;abort_execution&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Abort a running execution&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;You can also wait for job completion and get the execution status:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Run deployment and wait&lt;/span&gt;
  &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;deploy&lt;/span&gt;
  &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Walsen/rundeck-github-action@v1&lt;/span&gt;
  &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;rundeck_url&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;https://rundeck.example.com&lt;/span&gt;
    &lt;span class="na"&gt;rundeck_token&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.RUNDECK_TOKEN }}&lt;/span&gt;
    &lt;span class="na"&gt;action&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;run_job&lt;/span&gt;
    &lt;span class="na"&gt;job_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ vars.DEPLOY_JOB_ID }}&lt;/span&gt;
    &lt;span class="na"&gt;wait_for_completion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;

&lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Check result&lt;/span&gt;
  &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
    &lt;span class="s"&gt;echo "Status: ${{ steps.deploy.outputs.execution_status }}"&lt;/span&gt;
    &lt;span class="s"&gt;echo "URL: ${{ steps.deploy.outputs.execution_url }}"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Monitoring Integration&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Have your monitoring system trigger remediation jobs automatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# PagerDuty webhook handler example
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;handle_alert&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;alert&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;alert&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;type&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;high_memory&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;RUNDECK_URL&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/api/41/job/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;CLEAR_CACHE_JOB&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/run&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;X-Rundeck-Auth-Token&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;RUNDECK_TOKEN&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;options&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;server&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;alert&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;hostname&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]}}&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;AI Agents and LLM Integration&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is where things get interesting. Rundeck's API makes it an ideal execution backend for AI-powered operations. Using the &lt;a href="https://github.com/strands-agents/sdk-python" rel="noopener noreferrer"&gt;Strands Agents SDK&lt;/a&gt; from AWS, you can build intelligent agents that leverage Rundeck as their operational backbone.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICB1c2VyWyJVc2VyL0FsZXJ0Il0gLS0%2BIGFnZW50WyJTdHJhbmRzIEFnZW50Il0KICAgIGFnZW50IC0tPiBhbmFseXplWyJBbmFseXplIFNpdHVhdGlvbiJdCiAgICBhbmFseXplIC0tPiBkZWNpZGVbIlNlbGVjdCBSdW5ib29rIl0KICAgIGRlY2lkZSAtLT4gdG9vbFsiUnVuZGVjayBUb29sIl0KICAgIHRvb2wgLS0%2BIGFwaVsiUnVuZGVjayBBUEkiXQogICAgYXBpIC0tPiBqb2JbIkV4ZWN1dGUgSm9iIl0KICAgIGpvYiAtLT4gcmVzdWx0WyJSZXR1cm4gUmVzdWx0cyJdCiAgICByZXN1bHQgLS0%2BIGFnZW50" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICB1c2VyWyJVc2VyL0FsZXJ0Il0gLS0%2BIGFnZW50WyJTdHJhbmRzIEFnZW50Il0KICAgIGFnZW50IC0tPiBhbmFseXplWyJBbmFseXplIFNpdHVhdGlvbiJdCiAgICBhbmFseXplIC0tPiBkZWNpZGVbIlNlbGVjdCBSdW5ib29rIl0KICAgIGRlY2lkZSAtLT4gdG9vbFsiUnVuZGVjayBUb29sIl0KICAgIHRvb2wgLS0%2BIGFwaVsiUnVuZGVjayBBUEkiXQogICAgYXBpIC0tPiBqb2JbIkV4ZWN1dGUgSm9iIl0KICAgIGpvYiAtLT4gcmVzdWx0WyJSZXR1cm4gUmVzdWx0cyJdCiAgICByZXN1bHQgLS0%2BIGFnZW50" alt="Mermaid Diagram" width="1637" height="105"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, install the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;strands-agents strands-agents-tools requests
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a custom Rundeck tool for your agent:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;strands&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;

&lt;span class="n"&gt;RUNDECK_URL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://rundeck.example.com&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="n"&gt;RUNDECK_TOKEN&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;your-api-token&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="nd"&gt;@tool&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;list_rundeck_jobs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;project&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;List available runbooks in a Rundeck project.

    Args:
        project: The Rundeck project name

    Returns:
        List of available jobs with their IDs and descriptions
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;RUNDECK_URL&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/api/41/project/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;project&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/jobs&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;X-Rundeck-Auth-Token&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;RUNDECK_TOKEN&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="nd"&gt;@tool&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;run_rundeck_job&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;job_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;options&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Execute a Rundeck job/runbook.

    Args:
        job_id: The UUID of the job to execute
        options: Optional parameters for the job

    Returns:
        Execution details including status and execution ID
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;RUNDECK_URL&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/api/41/job/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;job_id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/run&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;X-Rundeck-Auth-Token&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;RUNDECK_TOKEN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Content-Type&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;application/json&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;options&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;options&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="p"&gt;{}}&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="nd"&gt;@tool&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_execution_status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;execution_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;dict&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Check the status of a Rundeck job execution.

    Args:
        execution_id: The execution ID to check

    Returns:
        Execution status and details
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;RUNDECK_URL&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;/api/41/execution/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;execution_id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;X-Rundeck-Auth-Token&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;RUNDECK_TOKEN&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="c1"&gt;# Create the operations agent
&lt;/span&gt;&lt;span class="n"&gt;ops_agent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;system_prompt&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;You are an operations assistant with access to Rundeck runbooks.
    When asked to perform operational tasks:
    1. List available jobs to find the appropriate runbook
    2. Execute the job with the correct parameters
    3. Monitor the execution and report the results
    Always confirm before executing destructive operations.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;list_rundeck_jobs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;run_rundeck_job&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;get_execution_status&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Example interaction
&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;ops_agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;The app servers are running low on disk space. &lt;/span&gt;&lt;span class="se"&gt;\
&lt;/span&gt;&lt;span class="s"&gt;    Can you clear the log files on the production cluster?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This pattern keeps humans in control—the AI can only execute pre-approved runbooks with proper access controls and audit trails. It's autonomous operations with guardrails. The agent can reason about which runbook to use, execute it, and report back the results, all while respecting Rundeck's RBAC policies.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Low-Code Integration with n8n&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you prefer a visual approach to automation, &lt;a href="https://n8n.io" rel="noopener noreferrer"&gt;n8n&lt;/a&gt; offers native Rundeck integration. You can build workflows that connect GitHub events to Rundeck job executions without writing code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBnaXRodWJbIkdpdEh1YiBXZWJob29rIl0gLS0%2BIG44blsibjhuIFdvcmtmbG93Il0KICAgIG44biAtLT4gcnVuZGVja1siUnVuZGVjayBOb2RlIl0KICAgIHJ1bmRlY2sgLS0%2BIGpvYlsiRXhlY3V0ZSBKb2IiXQogICAgam9iIC0tPiBub3RpZnlbIlNsYWNrL0VtYWlsIl0%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBnaXRodWJbIkdpdEh1YiBXZWJob29rIl0gLS0%2BIG44blsibjhuIFdvcmtmbG93Il0KICAgIG44biAtLT4gcnVuZGVja1siUnVuZGVjayBOb2RlIl0KICAgIHJ1bmRlY2sgLS0%2BIGpvYlsiRXhlY3V0ZSBKb2IiXQogICAgam9iIC0tPiBub3RpZnlbIlNsYWNrL0VtYWlsIl0%3D" alt="Mermaid Diagram" width="1012" height="70"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://n8n.io/integrations/github/and/rundeck/" rel="noopener noreferrer"&gt;n8n Rundeck node&lt;/a&gt; supports:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Execute&lt;/strong&gt;: Trigger any Rundeck job with parameters&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Get Metadata&lt;/strong&gt;: Retrieve job definitions and status&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Combined with n8n's 400+ integrations, you can create powerful automation chains—for example, triggering a deployment job when a GitHub release is published, then notifying your team on Slack when it completes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Licensing: Open Source vs Enterprise
&lt;/h2&gt;

&lt;p&gt;Rundeck follows an open-core model. The community edition is fully open source under the Apache 2.0 license, while PagerDuty (which acquired Rundeck in 2020) offers commercial versions with additional features.&lt;/p&gt;

&lt;h3&gt;
  
  
  Rundeck Community (Open Source)
&lt;/h3&gt;

&lt;p&gt;Free forever, includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Workflow execution and job definitions&lt;/li&gt;
&lt;li&gt;Node management and key storage&lt;/li&gt;
&lt;li&gt;Access control (ACL-based)&lt;/li&gt;
&lt;li&gt;Scheduling and job activity logs&lt;/li&gt;
&lt;li&gt;Community plugins&lt;/li&gt;
&lt;li&gt;REST API&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is what my Docker image uses—perfect for small to medium teams and learning environments.&lt;/p&gt;

&lt;h3&gt;
  
  
  PagerDuty Runbook Automation (Commercial)
&lt;/h3&gt;

&lt;p&gt;The enterprise offerings add:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;High Availability&lt;/td&gt;
&lt;td&gt;Clustered deployments with auto-takeover&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;SSO Integration&lt;/td&gt;
&lt;td&gt;SAML, LDAP, OAuth support&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Enterprise Plugins&lt;/td&gt;
&lt;td&gt;ServiceNow, PagerDuty, Datadog, VMware integrations&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Advanced Scheduling&lt;/td&gt;
&lt;td&gt;Blackout calendars, schedule forecasting&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Failed Job Resume&lt;/td&gt;
&lt;td&gt;Resume from the failed step instead of restarting&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Enterprise Support&lt;/td&gt;
&lt;td&gt;SLA-backed support and account management&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;PagerDuty offers two commercial options:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Runbook Automation Self-Hosted&lt;/strong&gt;: You manage the infrastructure, they provide the enterprise features&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Runbook Automation (Cloud)&lt;/strong&gt;: Fully managed SaaS with 99.9% SLA&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Which Should You Choose?
&lt;/h3&gt;

&lt;p&gt;For most use cases, start with the open source version. It's production-ready and covers the core functionality. Consider upgrading when you need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;High availability for mission-critical operations&lt;/li&gt;
&lt;li&gt;SSO integration with your identity provider&lt;/li&gt;
&lt;li&gt;Enterprise integrations (ServiceNow tickets, PagerDuty incidents)&lt;/li&gt;
&lt;li&gt;Professional support with SLAs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The open source version isn't a "lite" version—it's a complete operations platform that many organizations run successfully in production.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Gap: Node-to-Node File Transfers
&lt;/h2&gt;

&lt;p&gt;While building a configuration distribution workflow, I hit a limitation. Rundeck's built-in file copier moves files &lt;strong&gt;from the Rundeck server to target nodes&lt;/strong&gt;. But I needed to copy files &lt;strong&gt;from one node to multiple other nodes&lt;/strong&gt;—specifically, distributing generated configs from a central server to application nodes.&lt;/p&gt;

&lt;p&gt;The workaround was clunky: download to Rundeck, then upload to each destination. For large files or many destinations, this becomes a bottleneck.&lt;/p&gt;
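&lt;p&gt;To make the bottleneck concrete, here is the two-hop pattern sketched with local directories standing in for nodes (in a real workflow each &lt;code&gt;cp&lt;/code&gt; would be an &lt;code&gt;scp&lt;/code&gt; to or from a node): the file crosses the wire once for the initial download, then once more per destination.&lt;br&gt;
&lt;/p&gt;

```shell
set -eu
# Simulated topology: one source node, the Rundeck server, three destinations
mkdir -p /tmp/n2n-demo/source /tmp/n2n-demo/rundeck /tmp/n2n-demo/dest-1 /tmp/n2n-demo/dest-2 /tmp/n2n-demo/dest-3
echo "generated config" > /tmp/n2n-demo/source/app.conf

# Hop 1: pull the file from the source node to the Rundeck server
# (in reality: scp deploy@config-server:/etc/app/app.conf ./staging/)
cp /tmp/n2n-demo/source/app.conf /tmp/n2n-demo/rundeck/

# Hop 2: push it from Rundeck to every destination node
# (in reality: scp ./staging/app.conf deploy@app-server-NN:/etc/app/)
for d in /tmp/n2n-demo/dest-1 /tmp/n2n-demo/dest-2 /tmp/n2n-demo/dest-3; do
  cp /tmp/n2n-demo/rundeck/app.conf "$d/"
done

ls /tmp/n2n-demo/dest-*/app.conf
```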

&lt;p&gt;So I built a plugin.&lt;/p&gt;

&lt;h2&gt;
  
  
  Rundeck Node-to-Node Plugin
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://github.com/Walsen/rundeck-node-to-node" rel="noopener noreferrer"&gt;rundeck-node-to-node&lt;/a&gt; plugin adds a workflow step that copies files and directories between nodes using SSH/SFTP, fully integrated with Rundeck's node definitions and key storage.&lt;/p&gt;

&lt;h3&gt;
  
  
  Features
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Two Transfer Modes&lt;/strong&gt;: Route through Rundeck (reliable) or direct node-to-node (fast)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Parallel Transfers&lt;/strong&gt;: Copy to multiple destinations simultaneously&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Directory Support&lt;/strong&gt;: Recursive copy with preserved permissions and timestamps&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Key Storage Integration&lt;/strong&gt;: Uses Rundeck's secure credential management&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Error Handling&lt;/strong&gt;: Option to continue on partial failures&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Transfer Modes Explained
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Via-Rundeck (Default)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Files are downloaded to the Rundeck server once, then uploaded to each destination. This works in any network topology, since only Rundeck needs SSH access to the nodes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzb3VyY2VbIlNvdXJjZSBOb2RlIl0gLS0%2BIHJ1bmRlY2tbIlJ1bmRlY2sgU2VydmVyIl0KICAgIHJ1bmRlY2sgLS0%2BIGRlc3QxWyJEZXN0aW5hdGlvbiAxIl0KICAgIHJ1bmRlY2sgLS0%2BIGRlc3QyWyJEZXN0aW5hdGlvbiAyIl0KICAgIHJ1bmRlY2sgLS0%2BIGRlc3QzWyJEZXN0aW5hdGlvbiAzIl0%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzb3VyY2VbIlNvdXJjZSBOb2RlIl0gLS0%2BIHJ1bmRlY2tbIlJ1bmRlY2sgU2VydmVyIl0KICAgIHJ1bmRlY2sgLS0%2BIGRlc3QxWyJEZXN0aW5hdGlvbiAxIl0KICAgIHJ1bmRlY2sgLS0%2BIGRlc3QyWyJEZXN0aW5hdGlvbiAyIl0KICAgIHJ1bmRlY2sgLS0%2BIGRlc3QzWyJEZXN0aW5hdGlvbiAzIl0%3D" alt="Mermaid Diagram" width="598" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Direct Mode&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Source pushes directly to destinations via SCP. Faster, but requires source-to-destination SSH access.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzb3VyY2VbIlNvdXJjZSBOb2RlIl0gLS0%2BIGRlc3QxWyJEZXN0aW5hdGlvbiAxIl0KICAgIHNvdXJjZSAtLT4gZGVzdDJbIkRlc3RpbmF0aW9uIDIiXQogICAgc291cmNlIC0tPiBkZXN0M1siRGVzdGluYXRpb24gMyJd" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzb3VyY2VbIlNvdXJjZSBOb2RlIl0gLS0%2BIGRlc3QxWyJEZXN0aW5hdGlvbiAxIl0KICAgIHNvdXJjZSAtLT4gZGVzdDJbIkRlc3RpbmF0aW9uIDIiXQogICAgc291cmNlIC0tPiBkZXN0M1siRGVzdGluYXRpb24gMyJd" alt="Mermaid Diagram" width="373" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Node Configuration
&lt;/h3&gt;

&lt;p&gt;The plugin uses Rundeck's standard node attributes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;config-server&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;hostname&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;10.0.1.10&lt;/span&gt;
  &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;deploy&lt;/span&gt;
  &lt;span class="na"&gt;ssh-key-storage-path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;keys/project/deploy-key&lt;/span&gt;
  &lt;span class="na"&gt;tags&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;config&lt;/span&gt;

&lt;span class="na"&gt;app-server-01&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;hostname&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;10.0.1.20&lt;/span&gt;
  &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;deploy&lt;/span&gt;
  &lt;span class="na"&gt;ssh-key-storage-path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;keys/project/deploy-key&lt;/span&gt;
  &lt;span class="na"&gt;tags&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;app,production&lt;/span&gt;

&lt;span class="na"&gt;app-server-02&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;hostname&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;10.0.1.21&lt;/span&gt;
  &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;deploy&lt;/span&gt;
  &lt;span class="na"&gt;ssh-key-storage-path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;keys/project/deploy-key&lt;/span&gt;
  &lt;span class="na"&gt;tags&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;app,production&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Plugin Options
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Option&lt;/th&gt;
&lt;th&gt;Required&lt;/th&gt;
&lt;th&gt;Default&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Source Node&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;Node name where files originate&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Source Path&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;File or directory path on source&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Destination Nodes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;Comma-separated destination node names&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Destination Path&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;Target path on destinations&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Recursive Copy&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;true&lt;/td&gt;
&lt;td&gt;Copy directories recursively&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Preserve Attributes&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;true&lt;/td&gt;
&lt;td&gt;Keep timestamps and permissions&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Transfer Mode&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;via-rundeck&lt;/td&gt;
&lt;td&gt;
&lt;code&gt;via-rundeck&lt;/code&gt; or &lt;code&gt;direct&lt;/code&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Parallel Transfers&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;true&lt;/td&gt;
&lt;td&gt;Transfer to multiple nodes in parallel&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Continue on Error&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;false&lt;/td&gt;
&lt;td&gt;Continue the step even if individual destinations fail&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
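&lt;p&gt;Put together, a job step using these options might look like the sketch below. The configuration keys here are my shorthand for the table above, not necessarily the plugin's literal property names; export a job that uses the step to see the exact identifiers.&lt;/p&gt;

```yaml
# Illustrative workflow step; key names are assumptions based on the options table.
- description: Copy config from the config server to the app tier
  type: node-to-node-file-copy        # plugin step type (name assumed)
  nodeStep: false
  configuration:
    sourceNode: config-server
    sourcePath: /etc/myapp/config.yml
    destinationNodes: app-server-01,app-server-02
    destinationPath: /etc/myapp/config.yml
    recursive: 'true'
    preserveAttributes: 'true'
    transferMode: via-rundeck
    parallel: 'true'
    continueOnError: 'false'
```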

&lt;h3&gt;
  
  
  Installation
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Download the JAR from &lt;a href="https://github.com/Walsen/rundeck-node-to-node/releases" rel="noopener noreferrer"&gt;Releases&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Copy to Rundeck's &lt;code&gt;libext&lt;/code&gt; directory&lt;/li&gt;
&lt;li&gt;Restart Rundeck (or wait for auto-reload)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The plugin appears as a new workflow step type: "Node to Node File Copy".&lt;/p&gt;

&lt;h2&gt;
  
  
  Production-Ready Docker Image
&lt;/h2&gt;

&lt;p&gt;Installing Rundeck traditionally means dealing with Java, databases, reverse proxies, and process management. To streamline this, I created a Docker image that bundles everything for production use.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://github.com/Walsen/rundeck-image" rel="noopener noreferrer"&gt;rundeck-image&lt;/a&gt; provides:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Rundeck 5.18.0&lt;/strong&gt; (configurable version)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Nginx&lt;/strong&gt; reverse proxy for proper HTTP handling&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Supervisor&lt;/strong&gt; for process management&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;PostgreSQL&lt;/strong&gt; support for production databases&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Node-to-Node plugin&lt;/strong&gt; pre-installed&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Architecture
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBjb250YWluZXJbIkRvY2tlciBDb250YWluZXIiXQogICAgICAgIHN1YmdyYXBoIHN1cGVydmlzb3JbIlN1cGVydmlzb3IiXQogICAgICAgICAgICBuZ2lueFsiTmdpbng8YnIvPjo4MDgwIl0KICAgICAgICAgICAgcnVuZGVja1siUnVuZGVjazxici8%2BOjQ0NDAiXQogICAgICAgIGVuZAogICAgZW5kCiAgICAKICAgIGNsaWVudFsiQ2xpZW50Il0gLS0%2BIG5naW54CiAgICBuZ2lueCAtLT4gcnVuZGVjaw%3D%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBzdWJncmFwaCBjb250YWluZXJbIkRvY2tlciBDb250YWluZXIiXQogICAgICAgIHN1YmdyYXBoIHN1cGVydmlzb3JbIlN1cGVydmlzb3IiXQogICAgICAgICAgICBuZ2lueFsiTmdpbng8YnIvPjo4MDgwIl0KICAgICAgICAgICAgcnVuZGVja1siUnVuZGVjazxici8%2BOjQ0NDAiXQogICAgICAgIGVuZAogICAgZW5kCiAgICAKICAgIGNsaWVudFsiQ2xpZW50Il0gLS0%2BIG5naW54CiAgICBuZ2lueCAtLT4gcnVuZGVjaw%3D%3D" alt="Mermaid Diagram" width="541" height="204"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Quick Start
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-p&lt;/span&gt; 8080:8080 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;RUNDECK_GRAILS_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:8080 &lt;span class="se"&gt;\&lt;/span&gt;
  ghcr.io/walsen/rundeck-image:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Access Rundeck at &lt;code&gt;http://localhost:8080&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Production Setup with Docker Compose
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.8'&lt;/span&gt;

&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;rundeck&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ghcr.io/walsen/rundeck-image:latest&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;8080:8080"&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;RUNDECK_GRAILS_URL&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;https://rundeck.example.com&lt;/span&gt;
      &lt;span class="na"&gt;RUNDECK_DATABASE_URL&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;jdbc:postgresql://db:5432/rundeck&lt;/span&gt;
      &lt;span class="na"&gt;RUNDECK_DATABASE_USERNAME&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;rundeck&lt;/span&gt;
      &lt;span class="na"&gt;RUNDECK_DATABASE_PASSWORD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${DB_PASSWORD}&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;./config/realm.properties:/etc/rundeck/realm.properties&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;rundeck-data:/var/rundeck&lt;/span&gt;
    &lt;span class="na"&gt;depends_on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;db&lt;/span&gt;

  &lt;span class="na"&gt;db&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;postgres:15-alpine&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;POSTGRES_DB&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;rundeck&lt;/span&gt;
      &lt;span class="na"&gt;POSTGRES_USER&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;rundeck&lt;/span&gt;
      &lt;span class="na"&gt;POSTGRES_PASSWORD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${DB_PASSWORD}&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;postgres-data:/var/lib/postgresql/data&lt;/span&gt;

&lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;rundeck-data&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;postgres-data&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Environment Variables
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Variable&lt;/th&gt;
&lt;th&gt;Default&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;RUNDECK_GRAILS_URL&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;http://localhost:8080&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;External URL (must match your setup)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;RUNDECK_DATABASE_URL&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;PostgreSQL JDBC connection string&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;RUNDECK_DATABASE_USERNAME&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;Database user&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;RUNDECK_DATABASE_PASSWORD&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;-&lt;/td&gt;
&lt;td&gt;Database password&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  User Authentication
&lt;/h3&gt;

&lt;p&gt;Mount a &lt;code&gt;realm.properties&lt;/code&gt; file for basic authentication:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight properties"&gt;&lt;code&gt;&lt;span class="c"&gt;# username:password,role1,role2
&lt;/span&gt;&lt;span class="py"&gt;admin&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s"&gt;admin,user,admin&lt;/span&gt;
&lt;span class="py"&gt;operator&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s"&gt;operator123,user&lt;/span&gt;
&lt;span class="py"&gt;readonly&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="s"&gt;viewer456,user&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For production, integrate with LDAP or SSO—Rundeck supports both.&lt;/p&gt;
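&lt;p&gt;As a sketch of what the LDAP route looks like: Rundeck's JAAS-based login can point at a directory via a &lt;code&gt;jaas-ldap.conf&lt;/code&gt; module definition, roughly like the one below. The module name comes from Rundeck's documentation; every host, DN, and attribute value here is a placeholder you would replace with your own.&lt;/p&gt;

```conf
# Illustrative jaas-ldap.conf; all values are placeholders.
ldap {
    com.dtolabs.rundeck.jetty.jaas.JettyCachingLdapLoginModule required
      contextFactory="com.sun.jndi.ldap.LdapCtxFactory"
      providerUrl="ldap://ldap.example.com:389"
      bindDn="cn=Manager,dc=example,dc=com"
      bindPassword="secret"
      authenticationMethod="simple"
      userBaseDn="ou=users,dc=example,dc=com"
      userRdnAttribute="uid"
      userIdAttribute="uid"
      userObjectClass="account"
      roleBaseDn="ou=roles,dc=example,dc=com"
      roleNameAttribute="cn"
      roleMemberAttribute="memberUid"
      roleObjectClass="posixGroup";
};
```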

&lt;h3&gt;
  
  
  Custom Port Mapping
&lt;/h3&gt;

&lt;p&gt;The image works behind any external port mapping; just set &lt;code&gt;RUNDECK_GRAILS_URL&lt;/code&gt; to match the URL users will actually reach:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Running on port 4440&lt;/span&gt;
docker run &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-p&lt;/span&gt; 4440:8080 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;RUNDECK_GRAILS_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:4440 &lt;span class="se"&gt;\&lt;/span&gt;
  ghcr.io/walsen/rundeck-image:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Practical Example: Config Distribution Workflow
&lt;/h2&gt;

&lt;p&gt;Let me show how these tools work together in a real scenario.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Setup:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;1 config server generates environment-specific configuration files&lt;/li&gt;
&lt;li&gt;5 application servers need these configs&lt;/li&gt;
&lt;li&gt;Updates should happen without manual intervention&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;The Workflow:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRECiAgICBzdWJncmFwaCBzdGVwMVsiU3RlcCAxOiBHZW5lcmF0ZSBDb25maWciXQogICAgICAgIGNvbmZpZ1siY29uZmlnLXNlcnZlciJdCiAgICAgICAgc2NyaXB0WyJnZW5lcmF0ZS1jb25maWcuc2giXQogICAgICAgIGNvbmZpZyAtLT4gc2NyaXB0CiAgICBlbmQKICAgIAogICAgc3ViZ3JhcGggc3RlcDJbIlN0ZXAgMjogRGlzdHJpYnV0ZSAoTm9kZS10by1Ob2RlIFBsdWdpbikiXQogICAgICAgIHNvdXJjZVsiY29uZmlnLXNlcnZlcjxici8%2BL2V0Yy9teWFwcC9jb25maWcueW1sIl0KICAgICAgICBhcHAxWyJhcHAtMDEiXQogICAgICAgIGFwcDJbImFwcC0wMiJdCiAgICAgICAgYXBwM1siYXBwLTAzIl0KICAgICAgICBhcHA0WyJhcHAtMDQiXQogICAgICAgIGFwcDVbImFwcC0wNSJdCiAgICAgICAgc291cmNlIC0tPiBhcHAxCiAgICAgICAgc291cmNlIC0tPiBhcHAyCiAgICAgICAgc291cmNlIC0tPiBhcHAzCiAgICAgICAgc291cmNlIC0tPiBhcHA0CiAgICAgICAgc291cmNlIC0tPiBhcHA1CiAgICBlbmQKICAgIAogICAgc3ViZ3JhcGggc3RlcDNbIlN0ZXAgMzogUm9sbGluZyBSZWxvYWQiXQogICAgICAgIHIxWyJyZWxvYWQgYXBwLTAxIl0gLS0%2BIHIyWyJyZWxvYWQgYXBwLTAyIl0KICAgICAgICByMiAtLT4gcjNbInJlbG9hZCBhcHAtMDMiXQogICAgICAgIHIzIC0tPiByNFsicmVsb2FkIGFwcC0wNCJdCiAgICAgICAgcjQgLS0%2BIHI1WyJyZWxvYWQgYXBwLTA1Il0KICAgIGVuZAogICAgCiAgICBzdGVwMSAtLT4gc3RlcDIKICAgIHN0ZXAyIC0tPiBzdGVwMw%3D%3D" class="article-body-image-wrapper"&gt;&lt;img 
src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRECiAgICBzdWJncmFwaCBzdGVwMVsiU3RlcCAxOiBHZW5lcmF0ZSBDb25maWciXQogICAgICAgIGNvbmZpZ1siY29uZmlnLXNlcnZlciJdCiAgICAgICAgc2NyaXB0WyJnZW5lcmF0ZS1jb25maWcuc2giXQogICAgICAgIGNvbmZpZyAtLT4gc2NyaXB0CiAgICBlbmQKICAgIAogICAgc3ViZ3JhcGggc3RlcDJbIlN0ZXAgMjogRGlzdHJpYnV0ZSAoTm9kZS10by1Ob2RlIFBsdWdpbikiXQogICAgICAgIHNvdXJjZVsiY29uZmlnLXNlcnZlcjxici8%2BL2V0Yy9teWFwcC9jb25maWcueW1sIl0KICAgICAgICBhcHAxWyJhcHAtMDEiXQogICAgICAgIGFwcDJbImFwcC0wMiJdCiAgICAgICAgYXBwM1siYXBwLTAzIl0KICAgICAgICBhcHA0WyJhcHAtMDQiXQogICAgICAgIGFwcDVbImFwcC0wNSJdCiAgICAgICAgc291cmNlIC0tPiBhcHAxCiAgICAgICAgc291cmNlIC0tPiBhcHAyCiAgICAgICAgc291cmNlIC0tPiBhcHAzCiAgICAgICAgc291cmNlIC0tPiBhcHA0CiAgICAgICAgc291cmNlIC0tPiBhcHA1CiAgICBlbmQKICAgIAogICAgc3ViZ3JhcGggc3RlcDNbIlN0ZXAgMzogUm9sbGluZyBSZWxvYWQiXQogICAgICAgIHIxWyJyZWxvYWQgYXBwLTAxIl0gLS0%2BIHIyWyJyZWxvYWQgYXBwLTAyIl0KICAgICAgICByMiAtLT4gcjNbInJlbG9hZCBhcHAtMDMiXQogICAgICAgIHIzIC0tPiByNFsicmVsb2FkIGFwcC0wNCJdCiAgICAgICAgcjQgLS0%2BIHI1WyJyZWxvYWQgYXBwLTA1Il0KICAgIGVuZAogICAgCiAgICBzdGVwMSAtLT4gc3RlcDIKICAgIHN0ZXAyIC0tPiBzdGVwMw%3D%3D" alt="Mermaid Diagram" width="1185" height="904"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Job Configuration:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Step&lt;/th&gt;
&lt;th&gt;Node(s)&lt;/th&gt;
&lt;th&gt;Action&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;config-server&lt;/td&gt;
&lt;td&gt;&lt;code&gt;/opt/scripts/generate-config.sh&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;config-server → app-01..05&lt;/td&gt;
&lt;td&gt;Node-to-Node copy &lt;code&gt;/etc/myapp/config.yml&lt;/code&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;app-01..05 (sequential)&lt;/td&gt;
&lt;td&gt;&lt;code&gt;systemctl reload myapp&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;The Result:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;One-click config updates&lt;/li&gt;
&lt;li&gt;Parallel distribution to all servers&lt;/li&gt;
&lt;li&gt;Rolling reload to avoid downtime&lt;/li&gt;
&lt;li&gt;Complete audit trail&lt;/li&gt;
&lt;li&gt;Anyone with permissions can run it&lt;/li&gt;
&lt;/ul&gt;
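&lt;p&gt;Expressed as a Rundeck job definition, the three steps sketch out roughly as follows. This is a simplified illustration; real exported job YAML carries more fields, and the plugin step's property names are assumptions:&lt;/p&gt;

```yaml
# Illustrative job definition; fields beyond the standard exec steps are assumptions.
- name: distribute-app-config
  description: Generate, distribute, and reload application configuration
  sequence:
    keepgoing: false
    commands:
      - description: "Step 1: generate config on config-server"
        exec: /opt/scripts/generate-config.sh
      - description: "Step 2: distribute via the Node-to-Node plugin"
        type: node-to-node-file-copy   # assumed step type name
        nodeStep: false
        configuration:
          sourceNode: config-server
          sourcePath: /etc/myapp/config.yml
          destinationNodes: app-01,app-02,app-03,app-04,app-05
          destinationPath: /etc/myapp/config.yml
      - description: "Step 3: rolling reload"
        exec: sudo systemctl reload myapp
```

&lt;p&gt;In practice you would scope steps 1 and 3 to the right nodes with node filters or job references; that detail is omitted here for brevity.&lt;/p&gt;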

&lt;h2&gt;
  
  
  Testing the Plugin
&lt;/h2&gt;

&lt;p&gt;The plugin repository includes a Docker-based test environment:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd test&lt;/span&gt;/
docker-compose up &lt;span class="nt"&gt;-d&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This spins up:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A Rundeck instance with the plugin&lt;/li&gt;
&lt;li&gt;Multiple test nodes for source/destination testing&lt;/li&gt;
&lt;li&gt;Pre-configured SSH keys&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;See &lt;code&gt;test/README.md&lt;/code&gt; for detailed testing instructions.&lt;/p&gt;

&lt;h2&gt;
  
  
  CI/CD Integration
&lt;/h2&gt;

&lt;p&gt;Both repositories include GitHub Actions workflows:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;rundeck-image:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Builds on every push to main&lt;/li&gt;
&lt;li&gt;Publishes to GitHub Container Registry&lt;/li&gt;
&lt;li&gt;Tags releases with version numbers&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;rundeck-node-to-node:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Builds and tests the plugin&lt;/li&gt;
&lt;li&gt;Creates release artifacts&lt;/li&gt;
&lt;li&gt;Publishes JAR files to GitHub Releases&lt;/li&gt;
&lt;/ul&gt;
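&lt;p&gt;As a sketch of what such an image-publishing workflow typically contains (the file name, triggers, and tags below are my assumptions, not the repository's actual workflow):&lt;/p&gt;

```yaml
# .github/workflows/build.yml (illustrative sketch)
name: build-and-publish
on:
  push:
    branches: [main]
    tags: ['v*']
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write    # required to push to GitHub Container Registry
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v6
        with:
          push: true
          tags: ghcr.io/walsen/rundeck-image:latest
```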

&lt;h2&gt;
  
  
  Deploying Rundeck on AWS
&lt;/h2&gt;

&lt;p&gt;If you're running on AWS, you have several options for deploying Rundeck. Here's my recommendation based on cost and operational effort:&lt;/p&gt;

&lt;h3&gt;
  
  
  Best Option: Amazon ECS on Fargate
&lt;/h3&gt;

&lt;p&gt;For most teams, ECS Fargate hits the sweet spot between cost and operational simplicity:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBhbGJbIkFwcGxpY2F0aW9uPGJyLz5Mb2FkIEJhbGFuY2VyIl0gLS0%2BIGVjc1siRUNTIEZhcmdhdGU8YnIvPihSdW5kZWNrKSJdCiAgICBlY3MgLS0%2BIHJkc1siUkRTPGJyLz5Qb3N0Z3JlU1FMIl0KICAgIGVjcyAtLT4gZWZzWyJFRlM8YnIvPi92YXIvcnVuZGVjayJdCiAgICBlY3MgLS0%2BIHNtWyJTZWNyZXRzPGJyLz5NYW5hZ2VyIl0%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBhbGJbIkFwcGxpY2F0aW9uPGJyLz5Mb2FkIEJhbGFuY2VyIl0gLS0%2BIGVjc1siRUNTIEZhcmdhdGU8YnIvPihSdW5kZWNrKSJdCiAgICBlY3MgLS0%2BIHJkc1siUkRTPGJyLz5Qb3N0Z3JlU1FMIl0KICAgIGVjcyAtLT4gZWZzWyJFRlM8YnIvPi92YXIvcnVuZGVjayJdCiAgICBlY3MgLS0%2BIHNtWyJTZWNyZXRzPGJyLz5NYW5hZ2VyIl0%3D" alt="Mermaid Diagram" width="580" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Factor&lt;/th&gt;
&lt;th&gt;ECS Fargate&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Operational Effort&lt;/td&gt;
&lt;td&gt;Low - no EC2 instances to manage&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cost&lt;/td&gt;
&lt;td&gt;~$30-50/month for small workloads&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Scaling&lt;/td&gt;
&lt;td&gt;Easy horizontal scaling&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Integration&lt;/td&gt;
&lt;td&gt;Native ALB, Secrets Manager, CloudWatch&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Persistence&lt;/td&gt;
&lt;td&gt;EFS for Rundeck data, RDS for database&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Budget Option: ECS with PostgreSQL Sidecar
&lt;/h3&gt;

&lt;p&gt;For cost-conscious deployments, you can run PostgreSQL as a sidecar container instead of using RDS. This cuts costs significantly while maintaining a containerized approach:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBhbGJbIkFwcGxpY2F0aW9uPGJyLz5Mb2FkIEJhbGFuY2VyIl0gLS0%2BIGVjc1siRUNTIEZhcmdhdGUiXQogICAgc3ViZ3JhcGggZWNzWyJFQ1MgVGFzayJdCiAgICAgICAgcnVuZGVja1siUnVuZGVjazxici8%2BQ29udGFpbmVyIl0KICAgICAgICBwb3N0Z3Jlc1siUG9zdGdyZVNRTDxici8%2BQ29udGFpbmVyIl0KICAgICAgICBydW5kZWNrIC0tPiBwb3N0Z3JlcwogICAgZW5kCiAgICBlY3MgLS0%2BIGVmc1siRUZTPGJyLz4oZGF0YSArIHBnZGF0YSkiXQ%3D%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IExSCiAgICBhbGJbIkFwcGxpY2F0aW9uPGJyLz5Mb2FkIEJhbGFuY2VyIl0gLS0%2BIGVjc1siRUNTIEZhcmdhdGUiXQogICAgc3ViZ3JhcGggZWNzWyJFQ1MgVGFzayJdCiAgICAgICAgcnVuZGVja1siUnVuZGVjazxici8%2BQ29udGFpbmVyIl0KICAgICAgICBwb3N0Z3Jlc1siUG9zdGdyZVNRTDxici8%2BQ29udGFpbmVyIl0KICAgICAgICBydW5kZWNrIC0tPiBwb3N0Z3JlcwogICAgZW5kCiAgICBlY3MgLS0%2BIGVmc1siRUZTPGJyLz4oZGF0YSArIHBnZGF0YSkiXQ%3D%3D" alt="Mermaid Diagram" width="665" height="322"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Factor&lt;/th&gt;
&lt;th&gt;With RDS&lt;/th&gt;
&lt;th&gt;With PostgreSQL Sidecar&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Monthly Cost&lt;/td&gt;
&lt;td&gt;$50-80&lt;/td&gt;
&lt;td&gt;$20-35&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Database Backups&lt;/td&gt;
&lt;td&gt;Automatic&lt;/td&gt;
&lt;td&gt;Manual (EFS snapshots)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;High Availability&lt;/td&gt;
&lt;td&gt;RDS Multi-AZ option&lt;/td&gt;
&lt;td&gt;Single container&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Complexity&lt;/td&gt;
&lt;td&gt;Two services&lt;/td&gt;
&lt;td&gt;Single task&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;For most Rundeck deployments, the sidecar approach works great—Rundeck isn't a high-transaction workload, and EFS provides durability for your data.&lt;/p&gt;
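&lt;p&gt;The sidecar variant only changes the task definition: both containers live in one task and, under &lt;code&gt;awsvpc&lt;/code&gt; networking, share &lt;code&gt;localhost&lt;/code&gt;, so Rundeck reaches PostgreSQL without any service discovery. A trimmed &lt;code&gt;containerDefinitions&lt;/code&gt; sketch (volume and image names are assumptions):&lt;/p&gt;

```json
{
  "containerDefinitions": [
    {
      "name": "rundeck",
      "image": "ghcr.io/walsen/rundeck-image:latest",
      "essential": true,
      "dependsOn": [{"containerName": "postgres", "condition": "START"}],
      "environment": [
        {"name": "RUNDECK_DATABASE_URL", "value": "jdbc:postgresql://localhost:5432/rundeck"}
      ]
    },
    {
      "name": "postgres",
      "image": "postgres:15-alpine",
      "essential": true,
      "mountPoints": [
        {"sourceVolume": "pgdata", "containerPath": "/var/lib/postgresql/data"}
      ]
    }
  ]
}
```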

&lt;h3&gt;
  
  
  Comparison of AWS Options
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Option&lt;/th&gt;
&lt;th&gt;Monthly Cost&lt;/th&gt;
&lt;th&gt;Ops Effort&lt;/th&gt;
&lt;th&gt;Best For&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;EC2 (t3.small)&lt;/td&gt;
&lt;td&gt;$15-20&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;td&gt;Budget-conscious, 24/7 workloads&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;EC2 Spot&lt;/td&gt;
&lt;td&gt;$5-10&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;td&gt;Dev/test, can tolerate interruptions&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Lightsail&lt;/td&gt;
&lt;td&gt;$5-20&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;td&gt;Learning, simple setups&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ECS Fargate&lt;/td&gt;
&lt;td&gt;$30-50&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;td&gt;Production, low maintenance&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;EKS&lt;/td&gt;
&lt;td&gt;$100+&lt;/td&gt;
&lt;td&gt;High&lt;/td&gt;
&lt;td&gt;Already running Kubernetes&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Why Not the Others?
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;EC2&lt;/strong&gt;: More operational overhead (patching, monitoring), but cheaper for 24/7 workloads with Reserved Instances&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;EKS&lt;/strong&gt;: Overkill unless you already have a Kubernetes cluster—adds complexity and ~$70/month just for the control plane&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;App Runner&lt;/strong&gt;: Simpler but limited networking control, harder to reach private infrastructure&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lightsail&lt;/strong&gt;: Cheapest option but limited VPC integration for node access&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  My Recommendations
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Small team/learning&lt;/strong&gt;: EC2 t3.small or Lightsail (~$10-20/month)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Production with low ops effort&lt;/strong&gt;: ECS Fargate + RDS (~$50-80/month)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enterprise/HA&lt;/strong&gt;: ECS Fargate multi-task + RDS Multi-AZ (~$150+/month)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  ECS Task Definition Example
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"family"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"rundeck"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"networkMode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"awsvpc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"requiresCompatibilities"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"FARGATE"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"cpu"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1024"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"memory"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2048"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"containerDefinitions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"rundeck"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"image"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ghcr.io/walsen/rundeck-image:latest"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"portMappings"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"containerPort"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;8080&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"protocol"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"tcp"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"environment"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"RUNDECK_GRAILS_URL"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"value"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://rundeck.example.com"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"secrets"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"RUNDECK_DATABASE_URL"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"valueFrom"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:secretsmanager:..."&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"RUNDECK_DATABASE_PASSWORD"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"valueFrom"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:secretsmanager:..."&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"mountPoints"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"sourceVolume"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"rundeck-data"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"containerPath"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/var/rundeck"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"logConfiguration"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"logDriver"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"awslogs"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"options"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"awslogs-group"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/ecs/rundeck"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"awslogs-region"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"us-east-1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="nl"&gt;"awslogs-stream-prefix"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ecs"&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"volumes"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"rundeck-data"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"efsVolumeConfiguration"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"fileSystemId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"fs-xxxxxxxx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"transitEncryption"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ENABLED"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This setup gives you a production-ready Rundeck deployment with persistent storage, secrets management, and centralized logging—all with minimal operational overhead.&lt;/p&gt;
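
&lt;p&gt;Before registering a definition like the one above, a quick local JSON validation catches typos early. This is a hedged sketch: the file path, cluster, and service names are placeholders, and the commented &lt;code&gt;aws&lt;/code&gt; calls assume configured AWS credentials.&lt;/p&gt;

```shell
# Validate a task-definition JSON file before registering it with ECS.
# The inline JSON stands in for the full definition shown above;
# the commented aws calls are the actual deployment step.
printf '%s' '{"family": "rundeck", "volumes": [{"name": "rundeck-data"}]}' > /tmp/rundeck-task.json
python3 -m json.tool /tmp/rundeck-task.json > /dev/null
echo "valid JSON"
# aws ecs register-task-definition --cli-input-json file:///tmp/rundeck-task.json
# aws ecs update-service --cluster rundeck --service rundeck --force-new-deployment
```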

&lt;h2&gt;
  
  
  Conclusions
&lt;/h2&gt;

&lt;p&gt;Rundeck transforms operational chaos into repeatable, auditable procedures. It's the tool I reach for when I need to bridge the gap between "we should automate this" and "we have time to build proper automation."&lt;/p&gt;

&lt;p&gt;The Node-to-Node plugin fills a specific gap—efficient file distribution between nodes without routing everything through Rundeck. And the Docker image removes the friction of getting a production-ready Rundeck instance running.&lt;/p&gt;

&lt;p&gt;If you're drowning in operational toil, give Rundeck a try. And if you need these specific capabilities, the tools are ready for you.&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Rundeck GitHub Action&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/rundeck-github-action" rel="noopener noreferrer"&gt;https://github.com/Walsen/rundeck-github-action&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rundeck Docker Image&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/rundeck-image" rel="noopener noreferrer"&gt;https://github.com/Walsen/rundeck-image&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Node-to-Node Plugin&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/rundeck-node-to-node" rel="noopener noreferrer"&gt;https://github.com/Walsen/rundeck-node-to-node&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rundeck Official Docs&lt;/strong&gt;: &lt;a href="https://docs.rundeck.com" rel="noopener noreferrer"&gt;https://docs.rundeck.com&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;My Blog&lt;/strong&gt;: &lt;a href="https://blog.walsen.website" rel="noopener noreferrer"&gt;https://blog.walsen.website&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>automation</category>
      <category>devops</category>
      <category>opensource</category>
      <category>tooling</category>
    </item>
    <item>
      <title>GitHub Action to Publish Hugo Posts to Dev.to</title>
      <dc:creator>Sergio D. Rodríguez Inclán</dc:creator>
      <pubDate>Mon, 26 Jan 2026 04:01:50 +0000</pubDate>
      <link>https://forem.com/w4ls3n/github-action-to-publish-hugo-posts-to-devto-2nmp</link>
      <guid>https://forem.com/w4ls3n/github-action-to-publish-hugo-posts-to-devto-2nmp</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;As developers who maintain technical blogs, we often face a common dilemma: should we publish exclusively on our own site, or should we cross-post to platforms like Dev.to, Medium, or Hashnode to reach a wider audience?&lt;/p&gt;

&lt;p&gt;The answer is usually "both," but that creates a new problem: &lt;strong&gt;manual cross-posting is tedious, error-prone, and time-consuming&lt;/strong&gt;. You write a post in Hugo, publish it to your site, then manually copy-paste the content to Dev.to, adjust the formatting, add tags, set the canonical URL, and hope you didn't miss anything.&lt;/p&gt;

&lt;p&gt;I experienced this friction firsthand with my Hugo-powered blog at &lt;a href="https://blog.walsen.website" rel="noopener noreferrer"&gt;blog.walsen.website&lt;/a&gt;. After publishing several posts and manually cross-posting them to Dev.to, I realized this workflow was unsustainable. There had to be a better way.&lt;/p&gt;

&lt;p&gt;That's when I decided to build &lt;strong&gt;&lt;a href="https://github.com/Walsen/hugo2devto" rel="noopener noreferrer"&gt;hugo2devto&lt;/a&gt;&lt;/strong&gt;: a GitHub Action that automatically publishes Hugo blog posts to Dev.to with full frontmatter support, canonical URLs, and zero manual intervention.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem: Cross-Posting is Painful
&lt;/h2&gt;

&lt;p&gt;Let me paint a picture of the traditional workflow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Write your post in Hugo with YAML frontmatter&lt;/li&gt;
&lt;li&gt;Build and deploy your Hugo site&lt;/li&gt;
&lt;li&gt;Open Dev.to in your browser&lt;/li&gt;
&lt;li&gt;Copy-paste your markdown content&lt;/li&gt;
&lt;li&gt;Manually configure:

&lt;ul&gt;
&lt;li&gt;Title&lt;/li&gt;
&lt;li&gt;Tags (max 4)&lt;/li&gt;
&lt;li&gt;Cover image&lt;/li&gt;
&lt;li&gt;Canonical URL&lt;/li&gt;
&lt;li&gt;Published/draft status&lt;/li&gt;
&lt;li&gt;Series information&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Preview and publish&lt;/li&gt;
&lt;li&gt;Repeat for every post update&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This process has several problems:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Time-consuming&lt;/strong&gt;: 10-15 minutes per post&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Error-prone&lt;/strong&gt;: Easy to forget canonical URLs or tags&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Not scalable&lt;/strong&gt;: Discourages cross-posting&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;No version control&lt;/strong&gt;: Changes aren't tracked&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Manual synchronization&lt;/strong&gt;: Updates require repeating the entire process&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Solution: Automation Through GitHub Actions
&lt;/h2&gt;

&lt;p&gt;The ideal solution would:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Detect when a Hugo post is created or updated&lt;/li&gt;
&lt;li&gt;Automatically extract frontmatter metadata&lt;/li&gt;
&lt;li&gt;Publish or update the post on Dev.to&lt;/li&gt;
&lt;li&gt;Set the correct canonical URL&lt;/li&gt;
&lt;li&gt;Handle tags, cover images, and series&lt;/li&gt;
&lt;li&gt;Work seamlessly with existing Hugo workflows&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is exactly what &lt;code&gt;hugo2devto&lt;/code&gt; does.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building the Action: Technical Deep Dive
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Architecture Overview
&lt;/h3&gt;

&lt;p&gt;The action is built with TypeScript and runs on Node.js 20. Here's the high-level architecture:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRECiAgICBzdWJncmFwaCB0cmlnZ2VyWyJHaXRIdWIgV29ya2Zsb3cgVHJpZ2dlciJdCiAgICAgICAgVDFbInB1c2ggdG8gbWFpbiJdCiAgICAgICAgVDJbIndvcmtmbG93X2Rpc3BhdGNoIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIGFjdGlvblsiaHVnbzJkZXZ0byBBY3Rpb24gdjEiXQogICAgICAgIEExWyIxLiBSZWFkIE1hcmtkb3duIEZpbGUiXQogICAgICAgIEEyWyIyLiBQYXJzZSBZQU1MIEZyb250bWF0dGVyIl0KICAgICAgICBBM1siMy4gVHJhbnNmb3JtIEh1Z28g4oaSIERldi50byJdCiAgICAgICAgQTRbIjQuIENhbGwgRGV2LnRvIEFQSSJdCiAgICAgICAgQTVbIjUuIFJldHVybiBPdXRwdXRzIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIG91dHB1dFsiUmVzdWx0Il0KICAgICAgICBPMVsiUHVibGlzaGVkIG9uIERldi50byJdCiAgICBlbmQKCiAgICB0cmlnZ2VyIC0tPiBBMQogICAgQTEgLS0%2BIEEyCiAgICBBMiAtLT4gQTMKICAgIEEzIC0tPiBBNAogICAgQTQgLS0%2BIEE1CiAgICBBNSAtLT4gTzE%3D" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmermaid.ink%2Fimg%2FZmxvd2NoYXJ0IFRECiAgICBzdWJncmFwaCB0cmlnZ2VyWyJHaXRIdWIgV29ya2Zsb3cgVHJpZ2dlciJdCiAgICAgICAgVDFbInB1c2ggdG8gbWFpbiJdCiAgICAgICAgVDJbIndvcmtmbG93X2Rpc3BhdGNoIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIGFjdGlvblsiaHVnbzJkZXZ0byBBY3Rpb24gdjEiXQogICAgICAgIEExWyIxLiBSZWFkIE1hcmtkb3duIEZpbGUiXQogICAgICAgIEEyWyIyLiBQYXJzZSBZQU1MIEZyb250bWF0dGVyIl0KICAgICAgICBBM1siMy4gVHJhbnNmb3JtIEh1Z28g4oaSIERldi50byJdCiAgICAgICAgQTRbIjQuIENhbGwgRGV2LnRvIEFQSSJdCiAgICAgICAgQTVbIjUuIFJldHVybiBPdXRwdXRzIl0KICAgIGVuZAoKICAgIHN1YmdyYXBoIG91dHB1dFsiUmVzdWx0Il0KICAgICAgICBPMVsiUHVibGlzaGVkIG9uIERldi50byJdCiAgICBlbmQKCiAgICB0cmlnZ2VyIC0tPiBBMQogICAgQTEgLS0%2BIEEyCiAgICBBMiAtLT4gQTMKICAgIEEzIC0tPiBBNAogICAgQTQgLS0%2BIEE1CiAgICBBNSAtLT4gTzE%3D" alt="Mermaid Diagram" width="346" height="992"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features
&lt;/h3&gt;

&lt;h4&gt;
  
  
  1. Full Hugo Frontmatter Support
&lt;/h4&gt;

&lt;p&gt;The action understands Hugo's frontmatter format natively:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="nn"&gt;---&lt;/span&gt;
&lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;My&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Awesome&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Post"&lt;/span&gt;
&lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;A&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;deep&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;dive&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;into&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;something&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;cool"&lt;/span&gt;
&lt;span class="na"&gt;publishdate&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;2026-01-25T22:23:37-04:00&lt;/span&gt;
&lt;span class="na"&gt;draft&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
&lt;span class="na"&gt;tags&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hugo"&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;devto"&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;automation"&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
&lt;span class="na"&gt;series&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Hugo&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Automation"&lt;/span&gt;
&lt;span class="na"&gt;eyecatch&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://example.com/cover.png"&lt;/span&gt;
&lt;span class="nn"&gt;---&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It automatically maps these fields to Dev.to's API format:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;title&lt;/code&gt; → &lt;code&gt;title&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;description&lt;/code&gt; → &lt;code&gt;description&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;tags&lt;/code&gt; → &lt;code&gt;tags&lt;/code&gt; (limited to 4)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;series&lt;/code&gt; → &lt;code&gt;series&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;eyecatch&lt;/code&gt; / &lt;code&gt;cover_image&lt;/code&gt; → &lt;code&gt;main_image&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;draft&lt;/code&gt; → &lt;code&gt;published&lt;/code&gt; (inverted)&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  2. Automatic Canonical URL Generation
&lt;/h4&gt;

&lt;p&gt;One of the most important SEO considerations when cross-posting is setting the canonical URL to point back to your original post. The action automatically generates this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;canonicalUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;baseUrl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;language&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/posts/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/`&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For example, a post at &lt;code&gt;content/en/posts/my-post.md&lt;/code&gt; becomes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://blog.walsen.website/en/posts/my-post/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  3. Multi-Language Support
&lt;/h4&gt;

&lt;p&gt;The action detects the language from the file path:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;content/en/posts/my-post.md  → English
content/es/posts/mi-post.md  → Spanish
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is crucial for Hugo sites with i18n support.&lt;/p&gt;

&lt;h4&gt;
  
  
  4. Mermaid Diagram Support (v1.1.0)
&lt;/h4&gt;

&lt;p&gt;Hugo uses shortcodes for Mermaid diagrams, but Dev.to doesn't render Mermaid natively. The action automatically converts Hugo Mermaid shortcodes into rendered PNG images via the mermaid.ink service:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="c"&gt;&amp;lt;!-- Hugo format (in your source) --&amp;gt;&lt;/span&gt;
{{&lt;span class="nt"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="err"&gt;*&lt;/span&gt; &lt;span class="nt"&gt;mermaid&lt;/span&gt; &lt;span class="err"&gt;*/&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;}}
flowchart TD
    A --&amp;gt; B
{{&lt;span class="nt"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="err"&gt;*&lt;/span&gt; &lt;span class="err"&gt;/&lt;/span&gt;&lt;span class="nt"&gt;mermaid&lt;/span&gt; &lt;span class="err"&gt;*/&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;}}

&lt;span class="c"&gt;&amp;lt;!-- Converted to (on Dev.to) --&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;![&lt;/span&gt;&lt;span class="nv"&gt;Mermaid Diagram&lt;/span&gt;&lt;span class="p"&gt;](&lt;/span&gt;&lt;span class="sx"&gt;https://mermaid.ink/img/base64encodeddiagram&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This means your diagrams render beautifully on both platforms without any manual intervention.&lt;/p&gt;
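
&lt;p&gt;The conversion itself can be sketched in a few lines. One assumption here: mermaid.ink accepts the Base64-encoded diagram source directly in the URL path, which is what the diagram URLs in this post suggest; &lt;code&gt;mermaidToImageUrl&lt;/code&gt; is an illustrative name, not the action's actual function.&lt;/p&gt;

```typescript
// Sketch: turn Mermaid source into a mermaid.ink image URL by
// Base64-encoding the diagram text (an assumption about the service,
// inferred from the URLs it produces).
function mermaidToImageUrl(diagram: string): string {
  const encoded = Buffer.from(diagram.trim(), "utf-8").toString("base64");
  return `https://mermaid.ink/img/${encoded}`;
}

console.log(mermaidToImageUrl("flowchart TD\n    A --> B"));
```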

&lt;h4&gt;
  
  
  5. Idempotent Updates
&lt;/h4&gt;

&lt;p&gt;The action checks if an article already exists on Dev.to (by canonical URL) and updates it instead of creating a duplicate. This means you can run the action multiple times safely.&lt;/p&gt;
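
&lt;p&gt;The matching step reduces to a lookup over the author's existing articles. A minimal sketch, assuming the Dev.to article objects expose &lt;code&gt;id&lt;/code&gt; and &lt;code&gt;canonical_url&lt;/code&gt; fields; the names here are illustrative, not the action's source.&lt;/p&gt;

```typescript
// Minimal sketch of the idempotency check: match an existing article
// by canonical URL. A hit means "update by id"; a miss means "create".
interface DevtoArticle {
  id: number;
  canonical_url: string;
}

function matchByCanonicalUrl(
  articles: DevtoArticle[],
  canonicalUrl: string
): DevtoArticle | undefined {
  return articles.find(a => a.canonical_url === canonicalUrl);
}
```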

&lt;h3&gt;
  
  
  Implementation Highlights
&lt;/h3&gt;

&lt;p&gt;Here's a simplified version of the core logic:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Read and parse the markdown file&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fileContent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;readFileSync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;utf-8&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;content&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;matter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;fileContent&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Extract metadata&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;description&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;description&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;tags&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;tags&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="p"&gt;[]).&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;// Dev.to limit&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;published&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;draft&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// Generate canonical URL&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;slug&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;basename&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;.md&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;toLowerCase&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sr"&gt;/&lt;/span&gt;&lt;span class="se"&gt;\s&lt;/span&gt;&lt;span class="sr"&gt;+/g&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;-&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;language&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/en/&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;en&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;es&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;canonicalUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;baseUrl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;language&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/posts/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// Prepare Dev.to article&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;article&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;body_markdown&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;published&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;tags&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;canonical_url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;canonicalUrl&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;main_image&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;eyecatch&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;cover_image&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;series&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;series&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;description&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="c1"&gt;// Check if article exists&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;existingArticle&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;findArticleByCanonicalUrl&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;canonicalUrl&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;existingArticle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// Update existing article&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;updateArticle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;existingArticle&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;article&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// Create new article&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;createArticle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;article&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Using the Action: Practical Examples
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Basic Setup
&lt;/h3&gt;

&lt;p&gt;First, get your Dev.to API key from &lt;a href="https://dev.to/settings/extensions"&gt;https://dev.to/settings/extensions&lt;/a&gt; and add it to your repository secrets as &lt;code&gt;DEVTO_API_KEY&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Then create &lt;code&gt;.github/workflows/publish-devto.yml&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Publish to Dev.to&lt;/span&gt;

&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;main&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
    &lt;span class="na"&gt;paths&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content/*/posts/*.md'&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;publish&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Publish to Dev.to&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Walsen/hugo2devto@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;api-key&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DEVTO_API_KEY }}&lt;/span&gt;
          &lt;span class="na"&gt;file-path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content/en/posts/my-post.md'&lt;/span&gt;
          &lt;span class="na"&gt;base-url&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://blog.walsen.website'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Advanced: Automatic Detection of Changed Posts
&lt;/h3&gt;

&lt;p&gt;For my blog, I wanted the action to automatically detect which posts changed and publish only those:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Publish to Dev.to&lt;/span&gt;

&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;push&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;branches&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;main&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
    &lt;span class="na"&gt;paths&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content/*/posts/*.md'&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;detect-changes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;outputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;posts&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.changed-files.outputs.posts }}&lt;/span&gt;
      &lt;span class="na"&gt;has-changes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ steps.changed-files.outputs.has-changes }}&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;fetch-depth&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;2&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Get changed files&lt;/span&gt;
        &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;changed-files&lt;/span&gt;
        &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;CHANGED_FILES=$(git diff --name-only HEAD^ HEAD | grep -E 'content/en/posts/.*\.md' || echo "")&lt;/span&gt;

          &lt;span class="s"&gt;if [ -n "$CHANGED_FILES" ]; then&lt;/span&gt;
            &lt;span class="s"&gt;POSTS_JSON=$(echo "$CHANGED_FILES" | jq -R -s -c 'split("\n") | map(select(length &amp;gt; 0))')&lt;/span&gt;
            &lt;span class="s"&gt;echo "posts=$POSTS_JSON" &amp;gt;&amp;gt; $GITHUB_OUTPUT&lt;/span&gt;
            &lt;span class="s"&gt;echo "has-changes=true" &amp;gt;&amp;gt; $GITHUB_OUTPUT&lt;/span&gt;
          &lt;span class="s"&gt;else&lt;/span&gt;
            &lt;span class="s"&gt;echo "posts=[]" &amp;gt;&amp;gt; $GITHUB_OUTPUT&lt;/span&gt;
            &lt;span class="s"&gt;echo "has-changes=false" &amp;gt;&amp;gt; $GITHUB_OUTPUT&lt;/span&gt;
          &lt;span class="s"&gt;fi&lt;/span&gt;

  &lt;span class="na"&gt;publish-changed&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;if&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;needs.detect-changes.outputs.has-changes == 'true'&lt;/span&gt;
    &lt;span class="na"&gt;needs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;detect-changes&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;strategy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;matrix&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;post&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ fromJson(needs.detect-changes.outputs.posts) }}&lt;/span&gt;
      &lt;span class="na"&gt;fail-fast&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Publish to Dev.to&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Walsen/hugo2devto@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;api-key&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DEVTO_API_KEY }}&lt;/span&gt;
          &lt;span class="na"&gt;file-path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ matrix.post }}&lt;/span&gt;
          &lt;span class="na"&gt;base-url&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://blog.walsen.website'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This workflow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Detects which markdown files changed in the last commit&lt;/li&gt;
&lt;li&gt;Converts them to a JSON array&lt;/li&gt;
&lt;li&gt;Uses a matrix strategy to publish multiple posts in parallel&lt;/li&gt;
&lt;li&gt;Sets &lt;code&gt;fail-fast: false&lt;/code&gt; so one failure doesn't stop others&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Manual Trigger
&lt;/h3&gt;

&lt;p&gt;You can also trigger publishing manually:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;workflow_dispatch&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;inputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;post_path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Path&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;to&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;the&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;post'&lt;/span&gt;
        &lt;span class="na"&gt;required&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
        &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;string&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;publish&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v4&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Publish to Dev.to&lt;/span&gt;
        &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Walsen/hugo2devto@v1&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;api-key&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.DEVTO_API_KEY }}&lt;/span&gt;
          &lt;span class="na"&gt;file-path&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ github.event.inputs.post_path }}&lt;/span&gt;
          &lt;span class="na"&gt;base-url&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://blog.walsen.website'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Real-World Results
&lt;/h2&gt;

&lt;p&gt;Since implementing this action on my blog, the results have been transformative:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Before:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;⏱️ 15 minutes per post to cross-post manually&lt;/li&gt;
&lt;li&gt;🐛 Frequent mistakes (forgotten canonical URLs, wrong tags)&lt;/li&gt;
&lt;li&gt;😓 Discouraged from updating posts on Dev.to&lt;/li&gt;
&lt;li&gt;📉 Inconsistent presence on Dev.to&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;After:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;⚡ Automatic publishing in ~30 seconds&lt;/li&gt;
&lt;li&gt;✅ Zero manual intervention required&lt;/li&gt;
&lt;li&gt;🔄 Updates synchronized automatically&lt;/li&gt;
&lt;li&gt;📈 Consistent cross-posting to Dev.to&lt;/li&gt;
&lt;li&gt;🎯 More time for writing, less for publishing&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Lessons Learned
&lt;/h2&gt;

&lt;p&gt;Building this action taught me several valuable lessons:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Start with a Script, Then Package It
&lt;/h3&gt;

&lt;p&gt;I initially built a TypeScript script (&lt;code&gt;publish-to-devto.ts&lt;/code&gt;) that worked locally. Once it was stable, I packaged it as a GitHub Action. This iterative approach made debugging much easier.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Frontmatter Mapping is Tricky
&lt;/h3&gt;

&lt;p&gt;Hugo and Dev.to use different field names and formats. Creating a robust mapping layer required careful testing with various post formats.&lt;/p&gt;
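&lt;p&gt;As a rough illustration, a mapping layer of this kind might look like the sketch below. The Hugo field names and the Dev.to payload shape follow the public Dev.to API conventions, but the exact mapping inside &lt;code&gt;hugo2devto&lt;/code&gt; may differ:&lt;/p&gt;

```python
# Hypothetical sketch of mapping Hugo frontmatter to a Dev.to article payload.
# Field names on the Hugo side ("title", "tags", "draft", "summary") are common
# Hugo conventions; the real action's mapping may handle more cases.

def map_frontmatter(hugo, base_url, slug):
    """Convert a Hugo frontmatter dict into a Dev.to article payload."""
    return {
        "title": hugo.get("title", "Untitled"),
        # Dev.to accepts at most 4 tags; Hugo posts often have more,
        # and Dev.to tags are lowercase with no spaces.
        "tags": [t.lower().replace(" ", "") for t in hugo.get("tags", [])][:4],
        "published": not hugo.get("draft", False),
        "description": hugo.get("summary", ""),
        # The canonical URL points back to the original Hugo post.
        "canonical_url": f"{base_url}/posts/{slug}/",
    }

payload = map_frontmatter(
    {"title": "My Post", "tags": ["AWS", "Dev Ops", "CI", "CD", "Extra"], "draft": False},
    "https://blog.example.com",
    "my-post",
)
print(payload["tags"])  # normalized and truncated to four tags
```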

&lt;h3&gt;
  
  
  3. Idempotency Matters
&lt;/h3&gt;

&lt;p&gt;The action needed to handle both new posts and updates gracefully. Checking for existing articles by canonical URL was crucial.&lt;/p&gt;
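&lt;p&gt;A minimal sketch of that check, with the article lookup replaced by an in-memory list so the decision logic stands alone (the real action would first fetch the user's articles from the Dev.to API):&lt;/p&gt;

```python
# Sketch of the idempotency decision: if an article with the same canonical
# URL already exists, update it; otherwise create a new one.

def plan_action(existing_articles, payload):
    """Return ("update", article_id) when a post with the same canonical URL
    exists, otherwise ("create", None)."""
    for article in existing_articles:
        if article["canonical_url"] == payload["canonical_url"]:
            return ("update", article["id"])
    return ("create", None)

existing = [{"id": 101, "canonical_url": "https://blog.example.com/posts/old/"}]
print(plan_action(existing, {"canonical_url": "https://blog.example.com/posts/old/"}))
print(plan_action(existing, {"canonical_url": "https://blog.example.com/posts/new/"}))
```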

&lt;h3&gt;
  
  
  4. Documentation is Key
&lt;/h3&gt;

&lt;p&gt;I created multiple documentation files:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;README.md&lt;/code&gt; - Overview and quick start&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;GETTING_STARTED.md&lt;/code&gt; - 5-minute setup guide&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;SETUP.md&lt;/code&gt; - Comprehensive instructions&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;HUGO_COMPATIBILITY.md&lt;/code&gt; - Hugo-specific details&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;API_KEY_SETUP.md&lt;/code&gt; - Security best practices&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This made the action accessible to users with different needs.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Community Feedback is Invaluable
&lt;/h3&gt;

&lt;p&gt;Publishing the action to the GitHub Marketplace exposed it to real-world use cases I hadn't considered. User feedback helped improve error handling and edge cases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Future Enhancements
&lt;/h2&gt;

&lt;p&gt;While the action works well today, there's always room for improvement:&lt;/p&gt;


&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Batch Publishing&lt;/strong&gt;: Support publishing multiple posts in a single action invocation&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dry Run Mode&lt;/strong&gt;: Preview what would be published without actually doing it&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Custom Field Mapping&lt;/strong&gt;: Allow users to configure their own frontmatter mappings&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Image Upload&lt;/strong&gt;: Automatically upload local images to Dev.to&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Analytics Integration&lt;/strong&gt;: Track publishing metrics&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-Platform Support&lt;/strong&gt;: Extend to Medium, Hashnode, etc.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Additional Shortcode Support&lt;/strong&gt;: Transform other Hugo shortcodes (YouTube, Twitter embeds, etc.)&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Building &lt;code&gt;hugo2devto&lt;/code&gt; solved a real problem I faced as a technical blogger: the friction of cross-posting content. By automating the process through GitHub Actions, I eliminated manual work, reduced errors, and made it effortless to maintain a presence on Dev.to.&lt;/p&gt;

&lt;p&gt;The action is open source and available for anyone to use. Whether you're running a Hugo blog, a Jekyll site, or any markdown-based platform, the core concepts apply: &lt;strong&gt;automate the boring stuff so you can focus on writing great content&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If you're interested in trying it out or contributing, check out the repository:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Repository&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/hugo2devto" rel="noopener noreferrer"&gt;https://github.com/Walsen/hugo2devto&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;GitHub Marketplace&lt;/strong&gt;: &lt;a href="https://github.com/marketplace/actions/hugo-to-dev-to-publisher" rel="noopener noreferrer"&gt;https://github.com/marketplace/actions/hugo-to-dev-to-publisher&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The future of technical blogging is automated, and I'm excited to see where this journey leads. Happy blogging!&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Action Repository&lt;/strong&gt;: &lt;a href="https://github.com/Walsen/hugo2devto" rel="noopener noreferrer"&gt;https://github.com/Walsen/hugo2devto&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;My Blog&lt;/strong&gt;: &lt;a href="https://blog.walsen.website" rel="noopener noreferrer"&gt;https://blog.walsen.website&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dev.to Profile&lt;/strong&gt;: &lt;a href="https://dev.to/w4ls3n"&gt;https://dev.to/w4ls3n&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dev.to API Documentation&lt;/strong&gt;: &lt;a href="https://developers.forem.com/api" rel="noopener noreferrer"&gt;https://developers.forem.com/api&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>automation</category>
      <category>devto</category>
      <category>github</category>
      <category>productivity</category>
    </item>
    <item>
      <title>A Wild Ride Into Vibe Coding</title>
      <dc:creator>Sergio D. Rodríguez Inclán</dc:creator>
      <pubDate>Fri, 23 Jan 2026 11:32:21 +0000</pubDate>
      <link>https://forem.com/w4ls3n/a-wild-ride-into-vibe-coding-5n</link>
      <guid>https://forem.com/w4ls3n/a-wild-ride-into-vibe-coding-5n</guid>
      <description>&lt;p&gt;&lt;em&gt;¡Versión en español &lt;strong&gt;&lt;a href="https://blog.walsen.website/es/posts/un-viaje-desenfrenado-hacia-el-vibe-coding/" rel="noopener noreferrer"&gt;aquí!&lt;/a&gt;&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Since I was a child, computers have held a magnetic fascination for me. I vividly remember the first time I saw one in my mother’s office at the age of eleven; in that exact moment, I knew precisely what I wanted to do with my life.&lt;/p&gt;

&lt;p&gt;Although destiny led me down the path of infrastructure, I never truly abandoned the dream of developing applications. While I have a solid foundation in programming and can hold my own in the terminal, a lack of daily practice and the time required for a deep dive had—until now—prevented me from producing professional-grade software.&lt;/p&gt;

&lt;p&gt;Today, however, the landscape has shifted. With the rise of Artificial Intelligence, LLMs, and Generative AI, a new horizon has opened for profiles like mine: the ability to bring complex applications to life without deep expertise in every language’s syntax, by leveraging AI-powered editors that amplify our creative potential.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Challenge: AWS Community Day Bolivia 2025
&lt;/h3&gt;

&lt;p&gt;This adventure began with the organization of &lt;strong&gt;AWS Community Day Bolivia 2025&lt;/strong&gt;. This year, the honor of leading the event fell to the &lt;strong&gt;AWS User Group Cochabamba&lt;/strong&gt;, a team where I serve as one of the leaders. Organizing an event of this magnitude requires impeccable coordination with volunteers; while tools like Google Forms or Sheets are useful, I was looking for something more: a comprehensive, custom-built solution.&lt;/p&gt;

&lt;p&gt;My vision was to build a platform that would allow us to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Manage Projects:&lt;/strong&gt; Create and administer specific initiatives for the event.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dynamic Registration:&lt;/strong&gt; Allow volunteers to sign up directly for projects.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multichannel Communication:&lt;/strong&gt; Send announcements and updates from a centralized hub.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Admin Panel:&lt;/strong&gt; A secure portal for operations exclusive to authorized leaders.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Technological Innovation:&lt;/strong&gt; Test &lt;strong&gt;Kiro&lt;/strong&gt;, Amazon’s agentic IDE, and its powerful generative AI capabilities.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Community Legacy:&lt;/strong&gt; Create a project "by the community, for the community" that could serve as a real-world case study in our talks.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The Journey is the Reward
&lt;/h3&gt;

&lt;p&gt;Although the development wasn't finalized in time for the Community Day, the experience was a revelation. This process deeply enriched my understanding of Generative AI, application architecture, and the modern software development lifecycle. Above all, it allowed me to discover the "tips &amp;amp; tricks" of working with AI agents and the delicate human-machine synergy required to achieve the desired result.&lt;/p&gt;

&lt;p&gt;As of this writing, &lt;strong&gt;Kiro&lt;/strong&gt; has advanced significantly. Many of the initial limitations I encountered have been addressed—particularly regarding session management and memory—solidifying its position as a cutting-edge tool that leaves previous versions behind.&lt;/p&gt;

&lt;h2&gt;
  
  
  "Vibe Coding" is No Joke
&lt;/h2&gt;

&lt;p&gt;I must confess, I was quite naive at first. I trusted Kiro’s output almost blindly because, initially at least, it generated the site’s starting structure correctly and surprisingly fast. The project kicked off with a specifications proposal (a &lt;em&gt;Spec&lt;/em&gt;) designed by Kiro itself; in my &lt;em&gt;prompt&lt;/em&gt;, I asked it to generate a three-tier serverless project: &lt;strong&gt;AstroJS&lt;/strong&gt; for the frontend, &lt;strong&gt;FastAPI&lt;/strong&gt; for the backend, and &lt;strong&gt;DynamoDB&lt;/strong&gt; for data persistence.&lt;/p&gt;

&lt;p&gt;During the early stages, I worked with three editors open and deployed directly from my local machine to the User Group’s AWS account. However, the time came to do things right and set up deployment via a &lt;strong&gt;CI/CD&lt;/strong&gt; workflow (because, as the saying goes: "the shoemaker's son always goes barefoot"). For this, I decided to use &lt;strong&gt;AWS CDK&lt;/strong&gt;. In my experience, managing state files is an additional headache I wanted to avoid, so I ruled out Terraform and Pulumi from the start.&lt;/p&gt;

&lt;h3&gt;
  
  
  Unifying All Projects Under a Single Session
&lt;/h3&gt;

&lt;p&gt;At the start of every task, I would enter a prompt to polish details that didn’t quite look right or weren't functioning correctly. This is where I hit my first major roadblock: I had four different Kiro sessions that shared no context with one another.&lt;/p&gt;

&lt;p&gt;If Kiro detected an API bottleneck while developing the frontend, the tool was essentially "blind" to the other components. This forced me into a constant cycle of copying and pasting to transfer data from one environment to another. To solve this, I decided to centralize everything into a &lt;strong&gt;single session&lt;/strong&gt;. I grouped all the repositories into a root directory so that Kiro could navigate between them with full awareness of how the components interacted.&lt;/p&gt;

&lt;p&gt;Below is the local structure I defined to achieve that synchrony:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;❯ tree &lt;span class="nt"&gt;-L&lt;/span&gt; 1 &lt;span class="nt"&gt;-a&lt;/span&gt;
&lt;span class="nb"&gt;.&lt;/span&gt;
├── .amazonq
├── .devbox
├── .envrc
├── .gitignore
├── .kiro
├── .python-version
├── .venv
├── README.md
├── devbox.json
├── devbox.lock
├── generated-diagrams
├── pyproject.toml
├── registry-api
├── registry-documentation
├── registry-frontend
├── registry-infrastructure
└── uv.lock

9 directories, 8 files

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The reader will notice that I have Devbox configured in the root directory. I made this choice because Kiro frequently needed to run Python scripts to sanitize repositories, perform searches, or handle troubleshooting tasks. Therefore, I saw the need to provide it with an isolated dependency chain, thus avoiding the installation of unnecessary software on my primary operating system.&lt;/p&gt;

&lt;h2&gt;
  
  
  Clear Rules: The Art of Coexistence
&lt;/h2&gt;

&lt;p&gt;This was the most complex stage and the one that demanded the most time. It was a period of alignment between the AI and myself; the moment where we discovered our character, our limits, and just how much we could tolerate one another. Readers might be skeptical, but after working through dozens of sessions, I can confirm that each one develops a distinct "personality." There is a subtle but real difference: the speed at which they grasp previous context, the tone of the conversation, and the level of initiative varies from session to session.&lt;/p&gt;

&lt;p&gt;For those who haven't yet experimented with &lt;strong&gt;Kiro&lt;/strong&gt; (or Amazon Q), there are fundamental aspects that must be taken very seriously:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Session Volatility:&lt;/strong&gt; Sessions are temporary. Upon reaching a context limit, the session resets, transferring only a very brief summary of what happened. Critical details can be lost in this process, especially if the previous activity was intense.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Trust Management:&lt;/strong&gt; When running tools, Kiro asks if commands are safe. If you decline, it won't execute anything; if you accept, it will run everything without further confirmation. It’s an "all or nothing" situation that requires vigilance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Git Structure Invisibility:&lt;/strong&gt; Kiro does not natively understand the concept of a "repository." It ignores the existence of commits, branches, or staged files. Because of this, you must be surgical: you have to tell it exactly which file to modify and which change to make.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Project Ecosystem Disconnect:&lt;/strong&gt; It doesn’t automatically assume the existence of config files, dependencies, or CI/CD workflows. To Kiro, every file is an isolated entity unless you provide the full map.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Ignoring these rules comes at a high price: Kiro will generate code that doesn't work or doesn't align with your goals. The most dangerous part is that the tool will always confidently assure you that everything is fine, leaving you with a false sense of success.&lt;/p&gt;

&lt;p&gt;The breaking point came during an iteration where I requested an architectural adjustment. Kiro misinterpreted the instruction and altered the &lt;strong&gt;entire&lt;/strong&gt; project: it transformed a Serverless architecture into one based on ECS and Aurora. It was a frustrating experience, but a necessary one. At that moment, I decided to establish rules of engagement: I created an &lt;strong&gt;"Architecture Primitives"&lt;/strong&gt; document. In it, I defined strict guidelines on project structure, repository management, and expected behavior when publishing changes.&lt;/p&gt;

&lt;p&gt;From that "contract" onward, the project stabilized. I moved forward with a fluidity that previously seemed impossible, and thanks to that, the beta version materialized much sooner than expected and is already available online.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Blueprint: Anatomy of an Efficient Architecture
&lt;/h2&gt;

&lt;p&gt;The project's architecture is illustrated in the following diagram:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flurtcfhef6ydag79hdb3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flurtcfhef6ydag79hdb3.png" alt="Architecture" width="800" height="940"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To achieve the goals of cost-efficiency and scalability, I designed a modular structure divided into logical layers. Here is how the components interact:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Access and Delivery Layer (Edge &amp;amp; Auth)
&lt;/h3&gt;

&lt;p&gt;This is the first line of contact with the user, prioritizing security and speed.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Amazon CloudFront:&lt;/strong&gt; Acts as a CDN to distribute content with low global latency.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Cognito:&lt;/strong&gt; Serves as the Identity Provider (IdP), managing authentication and the lifecycle of both users and administrators.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  2. Interface Layer (API Gateway)
&lt;/h3&gt;

&lt;p&gt;We separated the control planes to ensure the security of sensitive operations.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;App API:&lt;/strong&gt; The public entry point for application functionalities.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Admin API:&lt;/strong&gt; A dedicated, isolated Gateway for privileged administrative operations.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  3. The Brain: Orchestration and Serverless Logic
&lt;/h3&gt;

&lt;p&gt;This is where the system's intelligence resides, utilizing an &lt;strong&gt;event-driven microservices&lt;/strong&gt; approach.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;AWS Step Functions:&lt;/strong&gt; Orchestrates complex workflows, ensuring each step executes in the correct order.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lambda Functions (Multi-purpose):&lt;/strong&gt; Specialized functions that execute business logic, ranging from the &lt;em&gt;Route Manager&lt;/em&gt; to access control (&lt;em&gt;HBAC&lt;/em&gt;).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon ECR:&lt;/strong&gt; Stores the container images that power our Lambdas, enabling consistent execution environments.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AWS CDK:&lt;/strong&gt; The "glue" of the entire project, allowing this infrastructure to be 100% reproducible through code.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  4. Service Layer and Dynamic Registry
&lt;/h3&gt;

&lt;p&gt;The heart of the system uses the &lt;strong&gt;Service Registry&lt;/strong&gt; pattern to avoid tight coupling.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Service Registry Pattern:&lt;/strong&gt; Implemented via Lambdas acting as a central directory, enabling dynamic service discovery at runtime.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Specialized Services:&lt;/strong&gt; Dedicated modules for searches (&lt;em&gt;Search Service&lt;/em&gt;), scheduled tasks (&lt;em&gt;Cron Service&lt;/em&gt;), and personnel management (&lt;em&gt;People Service&lt;/em&gt;).&lt;/li&gt;
&lt;/ul&gt;
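
&lt;p&gt;The pattern can be sketched in a few lines; the service names and endpoints below are illustrative, not the project's actual registry entries:&lt;/p&gt;

```python
# Minimal sketch of the Service Registry pattern: services register themselves
# under a name, and callers resolve them at runtime instead of hard-coding
# endpoints, which keeps components decoupled.

class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, endpoint):
        self._services[name] = endpoint

    def resolve(self, name):
        try:
            return self._services[name]
        except KeyError:
            raise LookupError(f"no service registered under '{name}'")

registry = ServiceRegistry()
registry.register("search-service", "https://internal/search")
registry.register("people-service", "https://internal/people")
print(registry.resolve("search-service"))
```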

&lt;h3&gt;
  
  
  5. Persistence and Communications (Data Layer)
&lt;/h3&gt;

&lt;p&gt;A polyglot combination to handle different data types efficiently.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Amazon DynamoDB:&lt;/strong&gt; Our NoSQL database for high-speed caching and case management.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon S3:&lt;/strong&gt; The central repository for image and object storage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Amazon SES:&lt;/strong&gt; The notification engine for direct communication with volunteers.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  6. Cross-Cutting Layers: Security and Observability
&lt;/h3&gt;

&lt;p&gt;Components that span the entire architecture to ensure health and protection.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Security and Configuration:&lt;/strong&gt; Intensive use of &lt;strong&gt;AWS Secrets Manager&lt;/strong&gt; and &lt;strong&gt;Parameter Store&lt;/strong&gt; for secret management and zero-trust policies.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;360° Observability:&lt;/strong&gt; Implementation of &lt;strong&gt;Amazon CloudWatch&lt;/strong&gt; for tracing, performance metrics, and log centralization.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Project Manifesto: Patterns and Guidelines
&lt;/h2&gt;

&lt;p&gt;To prevent our collaboration with AI from descending into the architectural chaos I mentioned earlier, I had to formalize our knowledge into two fundamental pillars. These documents didn't just guide Kiro; they established the foundation of what I consider modern assisted development.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Enterprise Architecture Patterns (EAP)
&lt;/h3&gt;

&lt;p&gt;It isn’t just about writing code; it’s about following principles that guarantee the system’s evolution. Based on the documentation from the &lt;a href="https://github.com/awscbba/registry-documentation" rel="noopener noreferrer"&gt;Registry Project&lt;/a&gt;, we implemented three golden rules:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Isolation and Decoupling:&lt;/strong&gt; Every service operates autonomously. The use of the &lt;strong&gt;Service Registry&lt;/strong&gt; is not optional; it is the mechanism that allows our infrastructure to scale without creating rigid dependencies (&lt;em&gt;spaghetti code&lt;/em&gt;).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Design for Resilience:&lt;/strong&gt; We apply the &lt;strong&gt;Saga Pattern&lt;/strong&gt; to manage distributed transactions in serverless environments, ensuring that if one step fails, the system can automatically recover or compensate for the error.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security by Design (Zero-Trust):&lt;/strong&gt; No component trusts another by default. Every interaction requires identity validation through &lt;strong&gt;Cognito&lt;/strong&gt; and access control based on strict policies.&lt;/li&gt;
&lt;/ul&gt;
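
&lt;p&gt;The Saga idea in miniature: every step carries a compensating action, and a failure triggers the compensations of the already-completed steps in reverse order. The step names here are illustrative; in the real system this sequencing lives in AWS Step Functions rather than in application code:&lt;/p&gt;

```python
# Toy Saga pattern: run steps in order; on failure, run the compensations of
# the completed steps in reverse order.

def run_saga(steps):
    """steps is a list of (action, compensation) callables.
    Returns True on full success, False if a step failed and was rolled back."""
    completed = []
    for action, compensation in steps:
        try:
            action()
            completed.append(compensation)
        except Exception:
            # Compensate in reverse order of completion.
            for comp in reversed(completed):
                comp()
            return False
    return True

log = []

def reserve():
    log.append("reserve")

def cancel_reserve():
    log.append("cancel_reserve")

def charge():
    raise RuntimeError("payment failed")

def refund():
    log.append("refund")

ok = run_saga([(reserve, cancel_reserve), (charge, refund)])
print(ok, log)  # the failed charge triggers cancel_reserve
```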

&lt;h3&gt;
  
  
  2. AI Coexistence Manual: The "AI Assistant Guidelines"
&lt;/h3&gt;

&lt;p&gt;Learning to talk to an agent like Kiro requires more than simple instructions; it requires a framework. These are the key takeaways from our &lt;a href="https://github.com/awscbba/registry-documentation/blob/main/workflows/ai-assistant-guidelines.md" rel="noopener noreferrer"&gt;Assistance Guidelines&lt;/a&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Incremental Context:&lt;/strong&gt; Instead of firing off massive prompts, we feed the AI specific fragments of context. If Kiro knows the database structure but not the authentication flow, we explicitly remind it of the latter before asking for a change in that area.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mandatory Human Validation:&lt;/strong&gt; The AI proposes; the human disposes. We established that no deployment occurs without a manual review of the generated &lt;em&gt;diffs&lt;/em&gt;. This prevented the project from accidentally transforming back into costly and unnecessary infrastructure.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Living Documentation:&lt;/strong&gt; Every time the AI generated an innovative solution, that logic was immediately documented. In this way, the "memory" of the project didn't rely solely on Kiro's current session, but on our central knowledge repository.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The Reasoning Behind a "Multi-Cloud-Ready" Architecture in an AWS Environment
&lt;/h3&gt;

&lt;p&gt;Although the project lives and breathes in the &lt;strong&gt;Amazon Web Services&lt;/strong&gt; ecosystem, I made a strategic decision from day one: the architecture had to be &lt;strong&gt;Multi-Cloud-Ready&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Many might ask: &lt;em&gt;Why complicate the design if we already have the AWS toolset?&lt;/em&gt; The answer lies in technical sovereignty and cost control. By implementing patterns like the &lt;strong&gt;Service Registry&lt;/strong&gt; and using &lt;strong&gt;Devbox&lt;/strong&gt; to isolate environments, we avoid the dreaded &lt;em&gt;vendor lock-in&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Designing this way forces us to separate &lt;strong&gt;business logic&lt;/strong&gt; from &lt;strong&gt;infrastructure&lt;/strong&gt;. This means that if the community decided tomorrow to migrate part of the workload to another platform or integrate external services, the core of our application would remain intact. It is an architecture designed for freedom: AWS is our choice of excellence, but not our only option.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This project allowed me to experience a new way of working and a new way of thinking about software development. It isn’t about replacing developers; it’s about empowering them—freeing their minds so they can focus on what truly matters: creativity, innovation, and solving complex problems.&lt;/p&gt;

&lt;p&gt;Generative AI is a powerful tool, but it requires a human to guide it, correct it, and refine it. That is what I love most about this experience: the fact that there is a human behind every line of generated code, and that this human has the capacity to learn, evolve, and improve.&lt;/p&gt;

&lt;p&gt;It is not about letting the AI do everything; it is about learning to work alongside it, leveraging its power to do more, be more, and achieve things that previously seemed impossible.&lt;/p&gt;

&lt;p&gt;I hope this journey serves as an inspiration for you to explore new ways of working, thinking, and creating. May it also remind you that the future isn't something that just happens to us—it is something we build together, with tools, with technology, with AI, and with humanity.&lt;/p&gt;

&lt;p&gt;Finally, to all those who have always wanted to develop applications but haven't been able to: now is the time. From this point forward, nothing is impossible. However, do not go into this process blindly; you must do it with a solid foundation. So, it's time to hit the books!&lt;/p&gt;

&lt;p&gt;Thank you for reading this far, and as always, see you next time!&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Site:&lt;/strong&gt; &lt;a href="https://registry.cloud.org.bo" rel="noopener noreferrer"&gt;https://registry.cloud.org.bo&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Repositories:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;API:&lt;/strong&gt; &lt;a href="https://github.com/awscbba/registry-api" rel="noopener noreferrer"&gt;https://github.com/awscbba/registry-api&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;FrontEnd:&lt;/strong&gt; &lt;a href="https://github.com/awscbba/registry-frontend" rel="noopener noreferrer"&gt;https://github.com/awscbba/registry-frontend&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Infrastructure:&lt;/strong&gt; &lt;a href="https://github.com/awscbba/registry-infrastructure" rel="noopener noreferrer"&gt;https://github.com/awscbba/registry-infrastructure&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Documentation:&lt;/strong&gt; &lt;a href="https://github.com/awscbba/registry-documentation" rel="noopener noreferrer"&gt;https://github.com/awscbba/registry-documentation&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>webdev</category>
      <category>programming</category>
      <category>ai</category>
      <category>aws</category>
    </item>
    <item>
      <title>El Futuro del Trabajo</title>
      <dc:creator>Sergio D. Rodríguez Inclán</dc:creator>
      <pubDate>Mon, 10 Nov 2025 20:37:38 +0000</pubDate>
      <link>https://forem.com/w4ls3n/el-futuro-del-trabajo-5ccg</link>
      <guid>https://forem.com/w4ls3n/el-futuro-del-trabajo-5ccg</guid>
      <description>&lt;p&gt;&lt;em&gt;English version &lt;strong&gt;&lt;a href="https://blog.walsen.website/posts/the-future-of-work/" rel="noopener noreferrer"&gt;here!&lt;/a&gt;&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The current situation
&lt;/h2&gt;

&lt;p&gt;As I write this article, technology companies are in a period of transition. With the arrival of artificial intelligence, the way software is produced will change, and so will the kind of professionals required to build it (if that has not already begun to happen).&lt;/p&gt;

&lt;p&gt;In computing, the most common professional roles include: architects and developers, quality assurance (QA) specialists, infrastructure engineers, database analysts, and all the variants related to these specialties.&lt;/p&gt;

&lt;p&gt;Architects and developers design and implement all the code related to the product.&lt;/p&gt;

&lt;p&gt;Quality assurance (QA) specialists ensure that the product meets its quality requirements.&lt;/p&gt;

&lt;p&gt;Infrastructure engineers take care of the company's infrastructure, covering elements such as servers, databases, cloud automation, and much more.&lt;/p&gt;

&lt;p&gt;Database analysts look after the databases, including schema design and data migration, among other tasks.&lt;/p&gt;

&lt;p&gt;There are, of course, other roles of a more administrative nature, but we will not focus on them in this analysis.&lt;/p&gt;

&lt;p&gt;The fundamental point is that AI is redefining how the industry builds its products, and the first and most important sector affected is, without a doubt, software. Software engineers, who until recently were in high demand for their ability to produce code and take it to production, did their work in a predominantly artisanal way. Although good practices and processes have always been followed, most existing software has been designed and programmed by human brains and hands.&lt;/p&gt;

&lt;p&gt;Today, large language models (LLMs) have surpassed earlier AI models (such as BERT) and have established themselves as the most powerful tool for the software industry.&lt;/p&gt;

&lt;p&gt;Moreover, generative AI techniques such as &lt;em&gt;vibe coding&lt;/em&gt;, agents, and MCP (Model Context Protocol) servers, combined, could potentially replace a considerable share of the traditional development work that prevails in the software industry today.&lt;/p&gt;

&lt;p&gt;As a result, technology companies, especially the largest ones, have begun to replace traditional development engineers with engineers who specialize in AI.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is the difference between traditional development and AI-driven development?
&lt;/h2&gt;

&lt;p&gt;In short, it will no longer be necessary for a developer to build an API, or for a DevOps specialist to deploy a solution to the cloud; an agent with those specific capabilities will be able to do it.&lt;/p&gt;

&lt;h3&gt;
  
  
  How do AI agents work?
&lt;/h3&gt;

&lt;p&gt;AI agents are systems that combine language models (LLMs) with tools and execution capabilities. Unlike a simple chatbot, an agent can:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Reason&lt;/strong&gt; about which actions to take&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Execute&lt;/strong&gt; tools and commands&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Observe&lt;/strong&gt; the results&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Iterate&lt;/strong&gt; until the task is complete&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Here is a small example of a deployment flow using an AI agent.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F75gkieo483o99j3hl1cz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F75gkieo483o99j3hl1cz.png" alt="Despliegue" width="800" height="719"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Key components of the system
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;LLM (language model)&lt;/strong&gt;: The "brain" that reasons and makes decisions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Agent&lt;/strong&gt;: The orchestrator that coordinates between the LLM and the tools&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tools&lt;/strong&gt;: APIs, CLI commands, file access, databases, etc.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Environment&lt;/strong&gt;: The real system where the actions are executed&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Workflow
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;User input&lt;/strong&gt;: Defines the high-level task&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reasoning&lt;/strong&gt;: The LLM analyzes which tools to use and in what order&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Execution&lt;/strong&gt;: The agent invokes the necessary tools&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Observation&lt;/strong&gt;: It receives the results of each action&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Iteration&lt;/strong&gt;: The LLM decides the next step based on the results&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Completion&lt;/strong&gt;: When the task is finished, it reports back to the user&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This &lt;strong&gt;Reason → Act → Observe&lt;/strong&gt; cycle repeats until the agent completes the task or determines that it cannot continue.&lt;/p&gt;
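&lt;p&gt;The loop above can be sketched in a few lines of Python. This is a minimal, self-contained illustration, not a real agent: the "LLM" is a hypothetical rule-based stand-in and the tools are stubs, so every name here is made up for the example.&lt;/p&gt;

```python
# Minimal sketch of the Reason -> Act -> Observe loop.
# The "LLM" is a hypothetical rule-based stand-in; a real agent
# would call a language model here instead.

def run_tests() -> str:
    """Tool stub: pretend to run the test suite."""
    return "tests passed"

def deploy() -> str:
    """Tool stub: pretend to deploy the application."""
    return "deployed"

TOOLS = {"run_tests": run_tests, "deploy": deploy}

def fake_llm(task: str, history: list[str]) -> str:
    """Stand-in for the LLM: choose the next action from past observations."""
    if not history:
        return "run_tests"      # Reason: verify before shipping
    if history[-1] == "tests passed":
        return "deploy"         # Reason: tests are green, safe to ship
    return "done"               # Task complete (or nothing left to do)

def agent(task: str) -> list[str]:
    """Reason -> Act -> Observe until the fake LLM reports completion."""
    history: list[str] = []
    while True:
        action = fake_llm(task, history)   # Reason
        if action == "done":
            return history                 # Completion: report back
        observation = TOOLS[action]()      # Act
        history.append(observation)        # Observe, then iterate

print(agent("deploy the app"))  # ['tests passed', 'deployed']
```

The key property is that control flow lives in the loop, not in the model: the "LLM" only ever picks the next action from what it has observed so far.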

&lt;p&gt;The image below illustrates what I call the "tower of abstraction". As in any process, there is a natural evolution in which the components become progressively specialized and their interaction becomes more concrete and optimized.&lt;/p&gt;

&lt;p&gt;Consider an example: the human brain does not directly "know" how to make the hand squeeze an object. It simply decides that a situation calls for squeezing and sends the signal to the specialized organ, in this case the muscles of the hand. In the image, we can draw the following analogy: the LLM represents the brain, the transport represents the nerves, and each agent is a specific muscle of the hand.&lt;/p&gt;

&lt;p&gt;Similarly, in the software world, the process evolved from implementing a single monolithic module to executing specific tasks in microservices orchestrated by a controller. This controller decides which microservice should be invoked in response to an event or a condition. Applying this analogy to the image, the LLM corresponds to a &lt;em&gt;Control Plane&lt;/em&gt;, the transport to the &lt;em&gt;networking&lt;/em&gt; and routes, and the agents to the &lt;em&gt;Pods/containers&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;In this evolution, the AI agent determines which tools to use and in what sequence, while the LLM decides the next step based on the results obtained.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faugr3gmso2gbex11t36g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faugr3gmso2gbex11t36g.png" alt="Torre de la abstracción" width="800" height="929"&gt;&lt;/a&gt;&lt;/p&gt;
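&lt;p&gt;The controller analogy can also be sketched: a control plane (the LLM's role in the tower) routes each event to the one specialized agent that handles it, the way a Kubernetes control plane routes work to pods. All names below are illustrative, not a real API.&lt;/p&gt;

```python
# Hypothetical sketch of the "tower of abstraction": a control plane
# dispatches events to specialized agents, each of which knows how to
# do exactly one thing. Every name here is illustrative.

from typing import Callable

# Specialized agents -- the "muscles" in the brain/nerves/muscles analogy.
AGENTS: dict[str, Callable[[str], str]] = {
    "build":  lambda payload: f"built {payload}",
    "test":   lambda payload: f"tested {payload}",
    "deploy": lambda payload: f"deployed {payload}",
}

def control_plane(event: str, payload: str) -> str:
    """Decide which specialized agent handles the event and dispatch it."""
    agent = AGENTS.get(event)
    if agent is None:
        return f"no agent registered for '{event}'"
    return agent(payload)  # the call itself plays the "transport" role

for event in ("build", "test", "deploy"):
    print(control_plane(event, "registry-api"))
```

The point of the sketch is the separation of concerns: the control plane only routes; each agent only executes its one capability.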

&lt;p&gt;The next step in traditional development turned out to be, perhaps, more unexpected than logical. This evolution of the software industry (and of many others) means delegating development, implementation, testing, and deployment work to intelligent agents. They can do this work without interruption and with greater efficiency than humans, consuming fewer resources, and thus leaving the artisanal side of the process behind.&lt;/p&gt;

&lt;h2&gt;
  
  
  So, what will happen to...?
&lt;/h2&gt;

&lt;p&gt;Yes, the question is crucial: what fate awaits the traditional engineers who, day after day, deliver task after task?&lt;/p&gt;

&lt;p&gt;Two clear alternatives present themselves: either we let events run us over and the "tsunami" bury us under tons of water, or we reinvent and adapt ourselves (once again), learning to navigate the vast sea of changes to come.&lt;/p&gt;

&lt;p&gt;From now on, the industry will no longer require developers who write code from start to finish, engineers who deploy entire solutions, or analysts who exhaustively review code or data structures. Instead, the industry will demand engineers capable of orchestrating the agents offered by other providers, as well as building their own agents for the specific cases of their business.&lt;/p&gt;

&lt;p&gt;Work that used to be done by teams of dozens of people will now be carried out by a few, who will nevertheless produce as much as, or even more than, those "small armies".&lt;/p&gt;

&lt;p&gt;Since building products will be much simpler, many &lt;em&gt;startups&lt;/em&gt; are likely to emerge that, despite having few people, will achieve great impact and position themselves in the market. In addition, many new positions will be created for those who learn to interact with agents and improve them.&lt;/p&gt;

&lt;p&gt;Education will undoubtedly have to transform as well, especially higher education, which still trains engineers with a "classic" profile who will graduate into a market that no longer needs them. Institutions of higher education must reform their curricula, urgently!&lt;/p&gt;

&lt;p&gt;Likewise, software companies that intend to retain their engineers will have to push them to upskill and make this transition in the least traumatic way possible. As engineers, we pride ourselves on our abilities, but at times we are stubborn and reluctant, or even incredulous, in the face of change, especially when it does not seem to favor us.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where to start?
&lt;/h2&gt;

&lt;p&gt;As the leader of an AWS User Group, one of my duties is to guide the community toward this future.&lt;/p&gt;

&lt;p&gt;There is no tool more powerful than education. My first piece of advice, therefore, would be to pursue the certifications related to this topic, created and designed by the very companies leading this transition. Personally, I believe there is no better incentive to study than a certification, since the knowledge it demands is not only theoretical but also covers practical, day-to-day mastery; consequently, anyone who lacks that experience will be driven to acquire it on their own.&lt;/p&gt;

&lt;p&gt;The first and most important certification to start this path would be the "&lt;a href="https://aws.amazon.com/es/certification/certified-ai-practitioner/" rel="noopener noreferrer"&gt;AWS Certified AI Practitioner&lt;/a&gt;", which covers the fundamentals: what an LLM is, what a vector database is, what an agent is, how to craft effective &lt;em&gt;prompts&lt;/em&gt; to interact with an AI agent, and more. Since it is an AWS certification, it requires at least basic knowledge of many of its services and their minimal configuration. Reviewing the material for the "&lt;a href="https://aws.amazon.com/es/certification/certified-cloud-practitioner/" rel="noopener noreferrer"&gt;AWS Certified Cloud Practitioner&lt;/a&gt;" certification is therefore also very beneficial; and, of course, if you feel encouraged to earn that one too, all the better.&lt;/p&gt;

&lt;p&gt;Once you have earned the first certification, you can start building AI agents and thus move forward in the transition step by step.&lt;/p&gt;

&lt;p&gt;The first option is a fully managed service called &lt;a href="https://aws.amazon.com/es/bedrock/" rel="noopener noreferrer"&gt;Amazon Bedrock&lt;/a&gt;, which integrates numerous LLMs and offers tools for creating agents, integrating models, and more. An extensive list of articles related to Amazon Bedrock can be found &lt;a href="https://aws.amazon.com/es/blogs/machine-learning/category/artificial-intelligence/amazon-machine-learning/amazon-bedrock/" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;AWS also offers an open-source &lt;em&gt;framework&lt;/em&gt; called &lt;a href="https://strandsagents.com/latest/" rel="noopener noreferrer"&gt;Strands Agents&lt;/a&gt;, which is code-oriented (currently in Python) and easy to use. Its &lt;a href="https://github.com/strands-agents" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; repository contains excellent examples to get started, most notably the &lt;a href="https://github.com/strands-agents/samples/tree/main/02-samples/05-personal-assistant" rel="noopener noreferrer"&gt;personal assistant&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0v7gbvaptwrkhj5btcaf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0v7gbvaptwrkhj5btcaf.png" alt="Agentic Personal Assistant" width="800" height="545"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This example uses several agents integrated with one another, such as agents for searching information, images, and videos, among others, as well as a main agent in charge of answering the user's questions. It is a very instructive example for understanding what Strands can do when building agents.&lt;/p&gt;

&lt;p&gt;Once you have gained the necessary experience, you could go for the "&lt;a href="https://aws.amazon.com/es/certification/certified-generative-ai-developer-professional/" rel="noopener noreferrer"&gt;AWS Certified Generative AI Developer - Professional&lt;/a&gt;" certification, which, at the time of writing, is in &lt;em&gt;beta&lt;/em&gt;. With this professional-level certification, you can demonstrate proficiency in developing applications with, and oriented toward, AI.&lt;/p&gt;

&lt;p&gt;Although this certification, despite being professional-level, has no prerequisites, the community consensus suggests that the following certifications are ideal preparation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://aws.amazon.com/es/certification/certified-ai-practitioner/" rel="noopener noreferrer"&gt;AWS Certified AI Practitioner&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://aws.amazon.com/es/certification/certified-solutions-architect-associate/" rel="noopener noreferrer"&gt;AWS Certified Solutions Architect - Associate&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://aws.amazon.com/es/certification/certified-machine-learning-engineer-associate/" rel="noopener noreferrer"&gt;AWS Certified Machine Learning Engineer - Associate&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://aws.amazon.com/es/certification/certified-developer-associate/" rel="noopener noreferrer"&gt;AWS Certified Developer - Associate&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Time will tell.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;If anything is certain in this life, it is that everything changes. We must all, therefore, be prepared for the changes that will inevitably come.&lt;/p&gt;

&lt;p&gt;Personally, I have begun my own transition journey, and I must confess that I am thoroughly enjoying it. I was always a "Developer Wannabe" who had never found the time or the courage to make those aspirations concrete. Now, however, I can do it without having to invest so much time going deep into programming, and I can build working applications in significantly less time. I know my dear developer colleagues criticize &lt;em&gt;vibe coding&lt;/em&gt; vehemently, but it is imperative to shed our prejudices, relearn, and open our minds, because this change, which has already begun, will not stop, whether we like it or not.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>career</category>
      <category>cloud</category>
      <category>aws</category>
    </item>
  </channel>
</rss>
