<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Joe Stech</title>
    <description>The latest articles on Forem by Joe Stech (@joestech).</description>
    <link>https://forem.com/joestech</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F977471%2F673cc6f5-71b0-4c32-a907-69b36c69e0d2.png</url>
      <title>Forem: Joe Stech</title>
      <link>https://forem.com/joestech</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/joestech"/>
    <language>en</language>
    <item>
      <title>Automated Cloud Migrations with Kiro and the Arm MCP Server</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Sat, 20 Dec 2025 00:11:52 +0000</pubDate>
      <link>https://forem.com/aws-builders/automated-cloud-migrations-with-kiro-and-the-arm-mcp-server-m72</link>
      <guid>https://forem.com/aws-builders/automated-cloud-migrations-with-kiro-and-the-arm-mcp-server-m72</guid>
      <description>&lt;p&gt;AWS Graviton offers the best price-performance of any EC2 instance family on AWS, and with the recent announcement of Graviton5, the value proposition keeps getting better. Migrating to Graviton makes financial sense and can deliver a significant performance boost to your applications. Most of the time your applications will just move over seamlessly, but what if you've got x86-specific optimizations in your code?&lt;/p&gt;

&lt;p&gt;Great news: you don't have to manually migrate anymore.&lt;/p&gt;

&lt;p&gt;In this post, I'll show you how to use &lt;a href="https://kiro.dev/" rel="noopener noreferrer"&gt;Kiro&lt;/a&gt; (AWS's agentic IDE) combined with the Arm MCP Server to automate the entire migration process. We're talking Docker images, SIMD intrinsics, compiler flags -- the whole thing.&lt;/p&gt;

&lt;h2&gt;What's the Arm MCP Server?&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://github.com/arm/mcp" rel="noopener noreferrer"&gt;Arm MCP Server&lt;/a&gt; implements the Model Context Protocol (MCP), an open standard that lets AI coding assistants call specialized external tools. When you connect it to Kiro, you suddenly have an agent that can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Check Docker images&lt;/strong&gt; for arm64 support without you having to dig through manifests&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scan your codebase&lt;/strong&gt; for x86-specific code (intrinsics, build flags, etc.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Search Arm's knowledge base&lt;/strong&gt; for migration guidance and intrinsic equivalents&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Analyze assembly&lt;/strong&gt; for performance characteristics&lt;/li&gt;
&lt;/ul&gt;
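
&lt;p&gt;Hooking it up is a one-time config change: you register the server in Kiro's MCP settings file. The snippet below is a sketch -- the file location (&lt;code&gt;.kiro/settings/mcp.json&lt;/code&gt;), launcher command, and package name are assumptions here, so follow the Arm MCP Server README for the current install instructions:&lt;/p&gt;

```json
{
  "mcpServers": {
    "arm": {
      "command": "uvx",
      "args": ["arm-mcp-server"],
      "disabled": false
    }
  }
}
```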

&lt;h2&gt;The Problem: A Legacy x86 Application&lt;/h2&gt;

&lt;p&gt;Let's walk through an example of how you might use the Arm MCP Server with Kiro. Pretend you've inherited a legacy benchmarking application that's deeply tied to x86. Here's the Dockerfile:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight docker"&gt;&lt;code&gt;&lt;span class="k"&gt;FROM&lt;/span&gt;&lt;span class="s"&gt; centos:6&lt;/span&gt;

&lt;span class="c"&gt;# CentOS 6 reached EOL, need to use vault mirrors&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;&lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^mirrorlist=|#mirrorlist=|g'&lt;/span&gt; /etc/yum.repos.d/CentOS-Base.repo &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^#baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g'&lt;/span&gt; /etc/yum.repos.d/CentOS-Base.repo

&lt;span class="c"&gt;# Install EPEL repository (required for some development tools)&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;yum &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; epel-release &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^mirrorlist=|#mirrorlist=|g'&lt;/span&gt; /etc/yum.repos.d/epel.repo &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^#baseurl=http://download.fedoraproject.org/pub/epel|baseurl=http://archives.fedoraproject.org/pub/archive/epel|g'&lt;/span&gt; /etc/yum.repos.d/epel.repo

&lt;span class="c"&gt;# Install Developer Toolset 2 for better C++11 support (GCC 4.8)&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;yum &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; centos-release-scl &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^mirrorlist=|#mirrorlist=|g'&lt;/span&gt; /etc/yum.repos.d/CentOS-SCLo-scl.repo &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^mirrorlist=|#mirrorlist=|g'&lt;/span&gt; /etc/yum.repos.d/CentOS-SCLo-scl-rh.repo &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^# baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g'&lt;/span&gt; /etc/yum.repos.d/CentOS-SCLo-scl.repo &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="nt"&gt;-i&lt;/span&gt; &lt;span class="s1"&gt;'s|^# baseurl=http://mirror.centos.org|baseurl=http://vault.centos.org|g'&lt;/span&gt; /etc/yum.repos.d/CentOS-SCLo-scl-rh.repo

&lt;span class="c"&gt;# Install build tools&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;yum &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="se"&gt;\
&lt;/span&gt;    devtoolset-2-gcc &lt;span class="se"&gt;\
&lt;/span&gt;    devtoolset-2-gcc-c++ &lt;span class="se"&gt;\
&lt;/span&gt;    devtoolset-2-binutils &lt;span class="se"&gt;\
&lt;/span&gt;    make &lt;span class="se"&gt;\
&lt;/span&gt;    &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; yum clean all

&lt;span class="k"&gt;WORKDIR&lt;/span&gt;&lt;span class="s"&gt; /app&lt;/span&gt;
&lt;span class="k"&gt;COPY&lt;/span&gt;&lt;span class="s"&gt; *.h *.cpp ./&lt;/span&gt;

&lt;span class="c"&gt;# AVX2 intrinsics are used in the code&lt;/span&gt;
&lt;span class="k"&gt;RUN &lt;/span&gt;scl &lt;span class="nb"&gt;enable &lt;/span&gt;devtoolset-2 &lt;span class="s2"&gt;"g++ -O2 -mavx2 -o benchmark &lt;/span&gt;&lt;span class="se"&gt;\
&lt;/span&gt;&lt;span class="s2"&gt;    main.cpp &lt;/span&gt;&lt;span class="se"&gt;\
&lt;/span&gt;&lt;span class="s2"&gt;    matrix_operations.cpp &lt;/span&gt;&lt;span class="se"&gt;\
&lt;/span&gt;&lt;span class="s2"&gt;    -std=c++11"&lt;/span&gt;

&lt;span class="k"&gt;CMD&lt;/span&gt;&lt;span class="s"&gt; ["./benchmark"]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This Dockerfile has three blockers for a Graviton migration:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;centos:6&lt;/code&gt; doesn't support arm64&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-mavx2&lt;/code&gt; is an x86-only compiler flag&lt;/li&gt;
&lt;li&gt;The code uses AVX2 intrinsics (spoiler: won't compile on Arm)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you didn't spot these issues yourself, that's fine! Kiro with the Arm MCP Server will.&lt;/p&gt;

&lt;p&gt;Here's what that matrix multiplication code looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;"matrix_operations.h"&lt;/span&gt;&lt;span class="cp"&gt;
#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;iostream&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;random&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;chrono&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;stdexcept&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;immintrin.h&amp;gt;&lt;/span&gt;&lt;span class="c1"&gt;  // AVX2 intrinsics&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;resize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;vector&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;double&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;randomize&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;random_device&lt;/span&gt; &lt;span class="n"&gt;rd&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;mt19937&lt;/span&gt; &lt;span class="n"&gt;gen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rd&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;uniform_real_distribution&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;dis&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;10.0&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;dis&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;gen&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="n"&gt;Matrix&lt;/span&gt; &lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;multiply&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cols&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;runtime_error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Invalid matrix dimensions for multiplication"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="n"&gt;Matrix&lt;/span&gt; &lt;span class="nf"&gt;result&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="c1"&gt;// x86-64 optimized using AVX2 for double-precision&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;__m256d&lt;/span&gt; &lt;span class="n"&gt;sum_vec&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_mm256_setzero_pd&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
            &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

            &lt;span class="c1"&gt;// Process 4 elements at a time with AVX2&lt;/span&gt;
            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(;&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="n"&gt;__m256d&lt;/span&gt; &lt;span class="n"&gt;a_vec&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_mm256_loadu_pd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
                &lt;span class="n"&gt;__m256d&lt;/span&gt; &lt;span class="n"&gt;b_vec&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_mm256_set_pd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                    &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
                    &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
                    &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
                    &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
                &lt;span class="p"&gt;);&lt;/span&gt;
                &lt;span class="n"&gt;sum_vec&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_mm256_add_pd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sum_vec&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;_mm256_mul_pd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;a_vec&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;b_vec&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;

            &lt;span class="c1"&gt;// Horizontal add using AVX&lt;/span&gt;
            &lt;span class="n"&gt;__m128d&lt;/span&gt; &lt;span class="n"&gt;sum_high&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_mm256_extractf128_pd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sum_vec&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="n"&gt;__m128d&lt;/span&gt; &lt;span class="n"&gt;sum_low&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_mm256_castpd256_pd128&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sum_vec&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="n"&gt;__m128d&lt;/span&gt; &lt;span class="n"&gt;sum_128&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;_mm_add_pd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sum_low&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sum_high&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

            &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;sum_arr&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
            &lt;span class="n"&gt;_mm_storeu_pd&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sum_arr&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;sum_128&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;sum&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sum_arr&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;sum_arr&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;

            &lt;span class="c1"&gt;// Handle remaining elements&lt;/span&gt;
            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(;&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="n"&gt;sum&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;

            &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;total&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mf"&gt;0.0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;total&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;total&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;benchmark_matrix_ops&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;cout&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;=== Matrix Multiplication Benchmark ==="&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;endl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;Matrix&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;Matrix&lt;/span&gt; &lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;randomize&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;randomize&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="k"&gt;auto&lt;/span&gt; &lt;span class="n"&gt;start&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;chrono&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;high_resolution_clock&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;now&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="n"&gt;Matrix&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;multiply&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;auto&lt;/span&gt; &lt;span class="n"&gt;end&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;chrono&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;high_resolution_clock&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;now&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="k"&gt;auto&lt;/span&gt; &lt;span class="n"&gt;duration&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;chrono&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;duration_cast&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;chrono&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;milliseconds&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;end&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;start&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;cout&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"Matrix size: "&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"x"&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;endl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;cout&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"Time: "&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;duration&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;count&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;" ms"&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;endl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;cout&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"Result sum: "&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;endl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's the header file &lt;code&gt;matrix_operations.h&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#ifndef MATRIX_OPERATIONS_H
#define MATRIX_OPERATIONS_H
&lt;/span&gt;
&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;vector&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;cstddef&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;Matrix&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="nl"&gt;private:&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;vector&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;vector&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;double&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nl"&gt;public:&lt;/span&gt;
    &lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="n"&gt;randomize&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="n"&gt;Matrix&lt;/span&gt; &lt;span class="n"&gt;multiply&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;Matrix&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;other&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;const&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;sum&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;const&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;getRows&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;rows&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="kt"&gt;size_t&lt;/span&gt; &lt;span class="n"&gt;getCols&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;cols&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;benchmark_matrix_ops&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="cp"&gt;#endif // MATRIX_OPERATIONS_H
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And &lt;code&gt;main.cpp&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;"matrix_operations.h"&lt;/span&gt;&lt;span class="cp"&gt;
#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;iostream&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;cout&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"x86-64 AVX2 Matrix Operations Benchmark"&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;endl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;cout&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"========================================"&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;endl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="cp"&gt;#if defined(__x86_64__) || defined(_M_X64)
&lt;/span&gt;    &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;cout&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="s"&gt;"Running on x86-64 architecture with AVX2 optimizations"&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;endl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="cp"&gt;#else
&lt;/span&gt;    &lt;span class="cp"&gt;#error "This code requires x86-64 architecture with AVX2 support"
#endif
&lt;/span&gt;
    &lt;span class="n"&gt;benchmark_matrix_ops&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's a lot of x86-specific intrinsic code! But you don't have to convert it manually -- Kiro + Arm can do it for you.&lt;/p&gt;
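
&lt;p&gt;To make that concrete, here's the shape of the conversion Kiro performs. This is an illustrative sketch written by hand, not Kiro's literal output: an AVX2 dot-product kernel (four doubles per 256-bit register) next to its NEON equivalent (two doubles per 128-bit register), with a shared scalar tail:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;#include &amp;lt;cstddef&amp;gt;

#if defined(__AVX2__)
  #include &amp;lt;immintrin.h&amp;gt;
#elif defined(__ARM_NEON)
  #include &amp;lt;arm_neon.h&amp;gt;
#endif

// Sum of element-wise products -- the inner loop of the matrix multiply.
double dot(const double* a, const double* b, size_t n) {
    size_t i = 0;
    double sum = 0.0;
#if defined(__AVX2__)
    // x86: 256-bit registers hold 4 doubles each.
    __m256d acc = _mm256_setzero_pd();
    for (; i + 4 &amp;lt;= n; i += 4)
        acc = _mm256_add_pd(acc, _mm256_mul_pd(_mm256_loadu_pd(a + i),
                                               _mm256_loadu_pd(b + i)));
    double tmp[4];
    _mm256_storeu_pd(tmp, acc);
    sum = tmp[0] + tmp[1] + tmp[2] + tmp[3];
#elif defined(__ARM_NEON)
    // Arm: 128-bit registers hold 2 doubles each; vfmaq_f64 fuses multiply-add.
    float64x2_t acc = vdupq_n_f64(0.0);
    for (; i + 2 &amp;lt;= n; i += 2)
        acc = vfmaq_f64(acc, vld1q_f64(a + i), vld1q_f64(b + i));
    sum = vaddvq_f64(acc);  // horizontal add across both lanes
#endif
    for (; i &amp;lt; n; ++i) sum += a[i] * b[i];  // scalar tail for leftover elements
    return sum;
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;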

&lt;h2&gt;
  
  
  Setting Up Kiro with the Arm MCP Server
&lt;/h2&gt;

&lt;p&gt;First, you need to configure Kiro to connect to the Arm MCP Server. Kiro uses JSON configuration files for MCP servers.&lt;/p&gt;

&lt;p&gt;Create &lt;code&gt;.kiro/settings/mcp.json&lt;/code&gt; in your project root:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"arm-mcp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"docker"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"run"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"-i"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"--rm"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"-v"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/path/to/your/code:/workspace"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"armlimited/arm-mcp:1.0.1"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A few things to note:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;code&gt;command&lt;/code&gt; runs the Arm MCP Server via Docker&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-v "/path/to/your/code:/workspace"&lt;/code&gt; mounts your current project directory so the scanner can access your code. Replace /path/to/your/code with your actual path.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Save the file and Kiro will automatically pick up the new server. You can verify it's connected by typing &lt;code&gt;/mcp&lt;/code&gt; in the chat—you should see &lt;code&gt;arm-mcp&lt;/code&gt; listed with its tools.&lt;/p&gt;

&lt;p&gt;Once that's done, you can do quick checks right in the chat. Try something like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Check the base image in the Dockerfile for Arm compatibility
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Kiro will use the Arm MCP tools and tell you that &lt;code&gt;centos:6&lt;/code&gt; only supports amd64. Cool, but we want to automate the whole thing.&lt;/p&gt;

&lt;h2&gt;
  
  
  Full Automation With Kiro Steering Documents
&lt;/h2&gt;

&lt;p&gt;To fully automate migrations, you can use Kiro "steering documents" -- markdown files that give the AI persistent context and instructions. Instead of explaining what you want every time, you write it once and reference it.&lt;/p&gt;

&lt;p&gt;Create a file at &lt;code&gt;.kiro/steering/arm-migration.md&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="nn"&gt;---&lt;/span&gt;
&lt;span class="na"&gt;inclusion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;manual&lt;/span&gt;
&lt;span class="nn"&gt;---&lt;/span&gt;

Your goal is to migrate a codebase from x86 to Arm. Use the MCP server tools to help you with this. Check for x86-specific dependencies (build flags, intrinsics, libraries, etc.) and change them to Arm equivalents. Also look at Dockerfiles, version files, and other dependency manifests, ensuring compatibility and optimizing performance.

Steps to follow:
&lt;span class="p"&gt;*&lt;/span&gt; Look in all Dockerfiles and use the check_image and/or skopeo tools to verify ARM compatibility, changing the base image if necessary.
&lt;span class="p"&gt;*&lt;/span&gt; Look at the packages installed by the Dockerfile and send each package to the knowledge_base_search tool to check each package for ARM compatibility. If a package is not compatible, change it to a compatible version. When invoking the tool, explicitly ask "Is [package] compatible with ARM architecture?" where [package] is the name of the package.
&lt;span class="p"&gt;*&lt;/span&gt; Look at the contents of any requirements.txt files line-by-line and send each line to the knowledge_base_search tool to check each package for ARM compatibility. If a package is not compatible, change it to a compatible version.
&lt;span class="p"&gt;*&lt;/span&gt; Look at the codebase that you have access to, and determine what the language used is.
&lt;span class="p"&gt;*&lt;/span&gt; Run the migrate_ease_scan tool on the codebase, using the appropriate language scanner based on what language the codebase uses, and apply the suggested changes.
&lt;span class="p"&gt;*&lt;/span&gt; OPTIONAL: If you have access to build tools, rebuild the project for Arm, if you are running on an Arm-based runner. Fix any compilation errors.
&lt;span class="p"&gt;*&lt;/span&gt; OPTIONAL: If you have access to any benchmarks or integration tests for the codebase, run these and report the timing improvements to the user.

Pitfalls to avoid:
&lt;span class="p"&gt;
*&lt;/span&gt; Make sure that you don't confuse a software version with a language wrapper package version -- i.e. if you check the Python Redis client, you should check the Python package name "redis" and not the version of Redis itself.
&lt;span class="p"&gt;*&lt;/span&gt; NEON lane indices must be compile-time constants, not variables.

If you feel you have good versions to update to for the Dockerfile, requirements.txt, etc., immediately change the files -- no need to ask for confirmation.

Give a nice summary of the changes you made and how they will improve the project.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;inclusion: manual&lt;/code&gt; bit means this steering doc only kicks in when you reference it. You can also use &lt;code&gt;inclusion: always&lt;/code&gt; if you want it active all the time.&lt;/p&gt;
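
&lt;p&gt;That second pitfall is worth seeing in code. Here's a hand-written illustration (not Kiro output) of why NEON lane indices have to be compile-time constants:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;#include &amp;lt;arm_neon.h&amp;gt;

double sum_lanes(float64x2_t v) {
    // OK: literal lane indices, known at compile time.
    return vgetq_lane_f64(v, 0) + vgetq_lane_f64(v, 1);

    // NOT OK -- this fails to compile, because i is a runtime variable:
    // double s = 0.0;
    // for (int i = 0; i &amp;lt; 2; ++i) s += vgetq_lane_f64(v, i);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;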

&lt;h2&gt;
  
  
  Running the Migration
&lt;/h2&gt;

&lt;p&gt;Now for the fun part. In Kiro's chat, just type:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#arm-migration
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. Kiro will:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Find your Dockerfile and check &lt;code&gt;centos:6&lt;/code&gt; for arm64 support (it'll fail)&lt;/li&gt;
&lt;li&gt;Suggest base image replacements&lt;/li&gt;
&lt;li&gt;Scan your C++ code with &lt;code&gt;migrate_ease_scan&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Find all those AVX2 intrinsics and convert them to NEON&lt;/li&gt;
&lt;li&gt;Update the compiler flags&lt;/li&gt;
&lt;li&gt;Give you a summary of everything it changed&lt;/li&gt;
&lt;/ol&gt;
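
&lt;p&gt;As part of the intrinsics conversion, the architecture guard in &lt;code&gt;main.cpp&lt;/code&gt; gets flipped to its Arm counterpart -- something like this (a sketch of the expected shape, not Kiro's literal output):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;#if defined(__aarch64__) || defined(_M_ARM64)
    std::cout &amp;lt;&amp;lt; "Running on ARM64 architecture with NEON optimizations" &amp;lt;&amp;lt; std::endl;
#else
    #error "This code requires ARM64 architecture with NEON support"
#endif
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;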

&lt;h2&gt;
  
  
  Verify It Works
&lt;/h2&gt;

&lt;p&gt;After accepting the changes, build and test on an Arm system:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;g++ &lt;span class="nt"&gt;-O2&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; benchmark matrix_operations.cpp main.cpp &lt;span class="nt"&gt;-std&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;c++11
./benchmark
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see something like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ARM-Optimized Matrix Operations Benchmark
==========================================
Running on ARM64 architecture with NEON optimizations

=== Matrix Multiplication Benchmark ===
Matrix size: 200x200
Time: 12 ms
Result sum: 2.01203e+08
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If something breaks, just paste the error back into Kiro and it'll fix it. That's the beauty of agentic workflows -- you're not debugging alone.&lt;/p&gt;

&lt;p&gt;Happy migrating! If you run into problems or have questions, you can always email &lt;a href="mailto:mcpserver@arm.com"&gt;mcpserver@arm.com&lt;/a&gt; for help.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ec2</category>
      <category>kiro</category>
    </item>
    <item>
      <title>Dyson Swarm: How I built a hard science fiction game with AWS services</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Thu, 23 Jan 2025 05:16:33 +0000</pubDate>
      <link>https://forem.com/aws-builders/dyson-swarm-how-i-built-a-hard-science-fiction-game-with-aws-services-2de4</link>
      <guid>https://forem.com/aws-builders/dyson-swarm-how-i-built-a-hard-science-fiction-game-with-aws-services-2de4</guid>
      <description>&lt;h2&gt;
  
  
  The game concept
&lt;/h2&gt;

&lt;p&gt;I'm a huge science fiction fan -- so much so that I started and ran a science fiction magazine for five years, and I'm still very engaged in the science fiction community.&lt;/p&gt;

&lt;p&gt;Back in October I had the idea to start making a series of short games to explain hard science fiction concepts in a fun way. Right after I started the game, I saw that AWS was having a Game Builder Challenge, so I decided to write my first hard science fiction game as a part of the hackathon!&lt;/p&gt;

&lt;p&gt;Dyson Swarm is an incremental (clicker) game about dismantling the solar system in order to completely envelop the sun in space habitats. You start as a simple game developer, and progress through stages, building up resources as you go along.&lt;/p&gt;

&lt;p&gt;I originally thought that development would only take 10-20 hours, but the scope of the game ballooned, and ultimately I ended up spending closer to 70 hours developing the game (the entirety of which was spent after my wife and kids went to sleep at night, which is my excuse if you encounter a bug). Effort estimates are one of the most difficult parts of software engineering!&lt;/p&gt;

&lt;p&gt;I've hosted the game here if you want to try it out: &lt;a href="https://compellingsciencefiction.com/games/dysonswarm.html" rel="noopener noreferrer"&gt;Dyson Swarm&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The AWS Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgu2pttfdkb2r5osu8frz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgu2pttfdkb2r5osu8frz.png" alt="Image description" width="399" height="813"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The game is built using client-side JavaScript, and runs entirely in the browser. Since all the game code is served as static HTML/JavaScript assets, the cheapest and easiest way to host such a game is with a static-site S3 bucket fronted by the CloudFront CDN -- this allows fast serving of the files all around the world, and CloudFront can do the TLS termination so that you don't need to deal with custom certificates or servers.&lt;/p&gt;

&lt;p&gt;The game also collects some high-level metrics about how many people are playing the game (anonymized), and how far they're getting (what stage of the game they've completed). These metrics are stored in an RDS Postgres database, and get there via a serverless API built using API Gateway and Lambda. I'm using an RDS instance, since I already had one available, but you could just as easily use serverless RDS for this.&lt;/p&gt;

&lt;p&gt;I deployed the S3 bucket and CloudFront distribution through the AWS console, and created a Python-based AWS CDK stack for the serverless metrics lambda.&lt;/p&gt;

&lt;h2&gt;
  
  
  The CDK stack
&lt;/h2&gt;

&lt;p&gt;This is the CDK stack for the serverless metrics lambda:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;aws_cdk&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
&lt;span class="n"&gt;aws_lambda&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;lambda_&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="n"&gt;aws_apigateway&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;apigw&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="n"&gt;aws_ecr&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;ecr&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="n"&gt;aws_certificatemanager&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;acm&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="n"&gt;aws_route53&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;route53&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="n"&gt;Stack&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;constructs&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Construct&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;

&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;DysonSwarmStack&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Stack&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;scope&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Construct&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;construct_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;scope&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;construct_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;repo&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ecr&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Repository&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_repository_name&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;dysonSwarmRepo&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gamesapi&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;dyson_swarm_lambda&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;lambda_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;DockerImageFunction&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;dysonSwarmLambda&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;code&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;lambda_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DockerImageCode&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_ecr&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="n"&gt;repository&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;repo&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="n"&gt;tag&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;CDK_DOCKER_TAG&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
            &lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="n"&gt;memory_size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;256&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;timeout&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;seconds&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="n"&gt;architecture&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;lambda_&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Architecture&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ARM_64&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# do auth inside lambda
&lt;/span&gt;        &lt;span class="n"&gt;api&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;apigw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;LambdaRestApi&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;dysonSwarm-endpoint&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;handler&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;dyson_swarm_lambda&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;default_cors_preflight_options&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;apigw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;CorsOptions&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;allow_origins&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;custom_domain&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;apigw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;DomainName&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;custom-domain&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;domain_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gameapi.compellingsciencefiction.com&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;certificate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;acm&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Certificate&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_certificate_arn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;cert&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;[cert ARN here]&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="n"&gt;endpoint_type&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;apigw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;EndpointType&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;EDGE&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;apigw&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;BasePathMapping&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;base-path-mapping&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;domain_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;custom_domain&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;rest_api&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;api&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;hosted_zone&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;route53&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;HostedZone&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_hosted_zone_attributes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;hosted-zone&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;hosted_zone_id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;[zone id here]&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;zone_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;compellingsciencefiction.com&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;route53&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;CnameRecord&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;cname&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;zone&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;hosted_zone&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;record_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gameapi&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;domain_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;custom_domain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;domain_name_alias_domain_name&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It's not too complex at all! You can see that it references an existing ECR repo (with a container image for the lambda) along with an existing Route 53 hosted zone (for the custom domain). Other than that, it's creating an API Gateway backed by a Lambda that is built using the specified container in ECR.&lt;/p&gt;

&lt;p&gt;This file is also located in the open-source game repo here: &lt;a href="https://github.com/JoeStech/dyson-swarm/blob/main/metrics_api/cdk/dyson_swarm_stack.py" rel="noopener noreferrer"&gt;dyson_swarm_stack.py&lt;/a&gt; if you'd like to see it in context.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Code
&lt;/h2&gt;

&lt;p&gt;The entire source code for the game can be found here: &lt;a href="https://github.com/JoeStech/dyson-swarm" rel="noopener noreferrer"&gt;Dyson Swarm GitHub Repo&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The main game loop is a hundred millisecond interval specified at the end of the &lt;a href="https://github.com/JoeStech/dyson-swarm/blob/main/dysonswarm.html" rel="noopener noreferrer"&gt;dysonswarm.html&lt;/a&gt; file, so if you want to dig into the code in detail I'd start there.&lt;/p&gt;

&lt;p&gt;The game uses local browser storage to maintain state, so if you close the browser and come back later your game is automatically saved. This is very easy to implement in JavaScript: all you have to do is take an existing JSON-serializable object and store it like this: &lt;code&gt;localStorage.setItem('dysonSwarmGameState', JSON.stringify(gameState));&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Each time the game loop runs, it calculates resource gains and losses based on the current &lt;code&gt;gameState&lt;/code&gt;. All the other action happens when buttons are clicked (these change the &lt;code&gt;gameState&lt;/code&gt;). Since there are currently 56 of these buttons that get unlocked at various stages of the game, they have their own file, &lt;a href="https://github.com/JoeStech/dyson-swarm/blob/main/buttonFunctions.js" rel="noopener noreferrer"&gt;buttonFunctions.js&lt;/a&gt;, which contains metadata about what each button does along with the functions that get invoked when the buttons are clicked. When the game is instantiated, these functions get attached to the actual button DOM elements by looping through all the button elements and running &lt;code&gt;handleButton()&lt;/code&gt; on each. As buttons are clicked, new stages of the game are unlocked, until you've built a solar-system-spanning civilization.&lt;/p&gt;

&lt;p&gt;The most complicated part of building the game was creating and importing the animations, particularly the &lt;a href="https://github.com/JoeStech/dyson-swarm/blob/main/satellites.js" rel="noopener noreferrer"&gt;satellites.js&lt;/a&gt; animation. I initially made them with SVG, but for the orbital animation things started to break down at around two thousand satellite elements. I ported this to Canvas, since it can more efficiently render a greater number of elements, being raster-based instead of vector-based.&lt;/p&gt;

&lt;p&gt;Another tricky thing was covering all the corner cases for the game logic. Some elements (like "sell shipyard") were only implemented after friends played through the game and got stuck at various points. There's still some balancing I could do to make the game flow better, and I have a list of small improvements in the repo that I'd like to make. Managing game state properly is difficult!&lt;/p&gt;

&lt;h2&gt;
  
  
  Using Q Developer
&lt;/h2&gt;

&lt;p&gt;One of the elements of the AWS Game Builder Challenge was to use AWS Q Developer to help build the game. I was excited to try it out, since I've used other coding assistants to great effect in the past. I downloaded the VSCode Q Developer plugin and used it that way.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The pros:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The built-in chat interface is good: you can ask it questions as you go along and it will provide useful answers.&lt;/li&gt;
&lt;li&gt;The /dev feature will actually create a diff and also apply it to your code if you approve it! This prevents tedious copy/paste operations and saves time.&lt;/li&gt;
&lt;li&gt;Q Developer is very good at replicating repeating patterns in code. For instance, I told it things like "add 2 percent inflation to all repeatable buttons (other than the marketer button), in the same pattern as the ad purchase button inflation," and it saved a ton of time!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;The cons:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The /dev feature takes a long time (on the order of minutes), because it does multiple LLM invocations on the back-end.&lt;/li&gt;
&lt;li&gt;The /dev feature's generated diffs are good for some things, but it often decides to create new files instead of adding code in-line. You have to be very careful in your prompt management to keep it from going off the rails.&lt;/li&gt;
&lt;li&gt;I think Q Developer has room for improvement, but AWS is rapidly iterating on it and I foresee the tool being used more often in the future.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The game is open-source!
&lt;/h2&gt;

&lt;p&gt;I've released the entire source code on GitHub under an MIT license, so please feel free to use any of the elements as a base for your own game.&lt;/p&gt;

&lt;p&gt;If you end up building a game of your own, I'd love to hear about it!&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>aws</category>
    </item>
    <item>
      <title>Vendor lock-in when using AWS Graviton processors is no longer a real thing</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Sun, 29 Sep 2024 23:15:23 +0000</pubDate>
      <link>https://forem.com/aws-builders/vendor-lock-in-when-using-aws-graviton-processors-is-no-longer-a-real-thing-1i12</link>
      <guid>https://forem.com/aws-builders/vendor-lock-in-when-using-aws-graviton-processors-is-no-longer-a-real-thing-1i12</guid>
      <description>&lt;p&gt;We're entering a golden age of Arm-based processors. It used to be that when you decided to save 20+% on your cloud compute bill with Graviton processors, purists could come out of the woodwork and tell you "you're locking yourself in to AWS by compiling your software for Graviton! You're ensuring that you won't be able to move your most critical workloads &lt;em&gt;ever&lt;/em&gt;." I actually heard from one of these people today, which provoked me to write this rant.&lt;/p&gt;

&lt;p&gt;In 2024, it is absolutely no longer the case that compiling for Graviton locks you in to anything. Leaving aside the fact that Arm-based Ampere chips have been available on many major clouds for a while (including Azure and Google Cloud), with the announcements of Azure's Cobalt chip and Google's Axion chip, the number of options available to run Arm-based workloads in the cloud has exploded.&lt;/p&gt;

&lt;p&gt;These new chips are all built using flavors of Arm Neoverse, making it almost certain that your Arm-based workloads will port with no issues, unless you're using &lt;em&gt;very&lt;/em&gt; specific processor features (in which case, you knew exactly what you were doing when you used those features in the first place).&lt;/p&gt;

&lt;p&gt;At this point it's a no-brainer to use Graviton for your AWS compute workloads, unless you're using some arcane x86 features. You're going to get significant savings across the board, which will likely only get better as competitive pressures mount. You'll also reduce energy use significantly, which seems poised to become even more important in this era of energy-guzzling GPU clusters.&lt;/p&gt;

&lt;p&gt;At this point using Graviton for your compute workloads is all upside, and anyone telling you different is living in the past.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>graviton</category>
      <category>cpu</category>
      <category>architecture</category>
    </item>
    <item>
      <title>A serverless Python app to send public domain gutenberg.org ebooks to your Kindle with one click!</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Sun, 19 Feb 2023 00:52:04 +0000</pubDate>
      <link>https://forem.com/aws-builders/a-serverless-app-to-send-public-domain-gutenbergorg-ebooks-to-your-kindle-with-one-click-2cdp</link>
      <guid>https://forem.com/aws-builders/a-serverless-app-to-send-public-domain-gutenbergorg-ebooks-to-your-kindle-with-one-click-2cdp</guid>
      <description>&lt;p&gt;I wrote a little serverless web app that does the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Downloads a page from gutenberg.org (Project Gutenberg is a non-profit project to provide free public domain ebooks to the public)&lt;/li&gt;
&lt;li&gt;If the URL is a book download page, adds a column to the "download this ebook" table which contains a link and an input for an email address&lt;/li&gt;
&lt;li&gt;After the user puts their kindle email in the input field and clicks the "send to kindle" link, the ebook is sent to the user's kindle!&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This was a super fun project that I'm going to walk you through from beginning to end. It uses AWS API Gateway, AWS Lambda, and AWS SES. &lt;del&gt;Before we start digging into the details of how I built it, you can try it out by adding "&lt;a href="mailto:joe@compellingsciencefiction.com"&gt;joe@compellingsciencefiction.com&lt;/a&gt;" to your Kindle safe sender list and then adding any gutenberg.org path to &lt;a href="https://sendtokindle.compellingsciencefiction.com" rel="noopener noreferrer"&gt;https://sendtokindle.compellingsciencefiction.com&lt;/a&gt; like this:&lt;/del&gt;&lt;/p&gt;

&lt;p&gt;&lt;del&gt;&lt;a href="https://sendtokindle.compellingsciencefiction.com/ebooks/67368" rel="noopener noreferrer"&gt;https://sendtokindle.compellingsciencefiction.com/ebooks/67368&lt;/a&gt;&lt;/del&gt;&lt;/p&gt;

&lt;p&gt;&lt;del&gt;My little app (which is hosted at a subdomain of my science fiction site) will add the extra "send to kindle" column into the Gutenberg table, you just have to put your kindle email address in the little box and click the send link.&lt;/del&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2025 EDIT: AI scraping bots have been hitting this endpoint so much that I decided to shut it down to save resources. If you'd like to deploy your own, I have the full code below.&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Serverless architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fszqjxssrxwpexisgrxbo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fszqjxssrxwpexisgrxbo.png" alt="Architecture of the send to kindle application" width="800" height="811"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The architecture of the app is simple, just an API Gateway that invokes a Lambda. The Lambda saves some metadata about what book was downloaded and when in S3, downloads the requested ebook from Gutenberg, and sends it to the specified email address. Here's the CDK code that generates the AWS resources:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from aws_cdk import (
aws_lambda as lambda_,
aws_apigateway as apigw,
aws_iam as iam,
aws_ecr as ecr,
aws_certificatemanager as acm,
aws_route53 as route53,
Duration,
Stack)
from constructs import Construct
import os

class SendToKindleStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -&amp;gt; None:
        super().__init__(scope, construct_id, **kwargs)

        # execution role
        lambda_role = iam.Role(self, id="sendtokindle-lambda",
            role_name='SendtokindleManagementRole',
            assumed_by=iam.ServicePrincipal("lambda.amazonaws.com"),
            managed_policies= [
                        iam.ManagedPolicy.from_aws_managed_policy_name("service-role/AWSLambdaVPCAccessExecutionRole"),
                        iam.ManagedPolicy.from_aws_managed_policy_name("service-role/AWSLambdaBasicExecutionRole"),
                        iam.ManagedPolicy.from_aws_managed_policy_name("AmazonS3FullAccess"),
                        iam.ManagedPolicy.from_aws_managed_policy_name("AmazonSESFullAccess"),
                    ]
        )

        repo = ecr.Repository.from_repository_name(self, "SendToKindleRepo", "sendtokindlerepo")

        sendtokindle_management_lambda = lambda_.DockerImageFunction(self,
            "CSFsendtokindleManagementLambda",
            code=lambda_.DockerImageCode.from_ecr(
                repository=repo,
                tag=os.environ["CDK_DOCKER_TAG"]
                ),
            role=lambda_role,
            timeout=Duration.seconds(30)
        )

        api = apigw.LambdaRestApi(self,
            "csf-sendtokindle-management-endpoint",
            handler=sendtokindle_management_lambda,
            default_cors_preflight_options=apigw.CorsOptions(allow_origins=["*"])
        )

        custom_domain = apigw.DomainName(
            self,
            "custom-domain",
            domain_name="sendtokindle.compellingsciencefiction.com",
            certificate=acm.Certificate.from_certificate_arn(self,'cert',&amp;lt;cert arn str&amp;gt;),
            endpoint_type=apigw.EndpointType.EDGE
        )

        apigw.BasePathMapping(
            self,
            "base-path-mapping",
            domain_name=custom_domain,
            rest_api=api
        )

        hosted_zone = route53.HostedZone.from_hosted_zone_attributes(
            self,
            "hosted-zone",
            hosted_zone_id=&amp;lt;zone id str&amp;gt;,
            zone_name="compellingsciencefiction.com"
        )

        route53.CnameRecord(
            self,
            "cname",
            zone=hosted_zone,
            record_name="sendtokindle",
            domain_name=custom_domain.domain_name_alias_domain_name
        )
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I've put placeholders in a couple of these fields for privacy, but this is verbatim the CDK code I deployed.&lt;/p&gt;

&lt;p&gt;As you can see, the longest part of this IaC is the DNS code! If you don't care about a custom domain and just want to use the API Gateway courtesy domain, you don't even need anything past the Gateway specification.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Lambda Python code
&lt;/h2&gt;

&lt;p&gt;This Lambda pulls double-duty -- it returns the gutenberg.org HTML (with a modified table), and it also emails the ebook to the Kindle address if the request is for an epub download. There are many improvements that could be made, but this quick-and-dirty Lambda works perfectly for what I need.&lt;/p&gt;

&lt;p&gt;I'd like to point out a few fun things about this code:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It downloads the ebook to an in-memory BytesIO object instead of a file, and then builds a MIME attachment with the in-memory file.&lt;/li&gt;
&lt;li&gt;The code isn't very robust about how it differentiates between calls -- there are a lot of ways to break this Lambda, in which case it'll just return a generic 502 server error.&lt;/li&gt;
&lt;li&gt;The code uses AWS SES to send emails via boto3. Be careful when using SES in this way: it's definitely possible for someone to try spending a ton of money on my AWS account by spamming SES sends. You can set up limits on SES (and API Gateway) to mitigate attacks like this.&lt;/li&gt;
&lt;li&gt;I use BeautifulSoup to navigate the gutenberg.org HTML to add my column to the download table. I had to introspect the gutenberg.org HTML to find the correct place in the code to insert my column.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from bs4 import BeautifulSoup
import requests
import uuid
import json
import boto3
import time
from email import encoders
from email.mime.base import MIMEBase
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from io import BytesIO
from urllib.request import urlopen
import urllib

BUCKET = "S3 bucket name str"

def response(code, body):
    return {
            'statusCode': code,
            'headers': {
                'Access-Control-Allow-Headers': 'Content-Type',
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Methods': 'OPTIONS,POST,GET',
                'Content-Type': 'application/json',
            },
            'body': body
        }


def send_ebook(url, filename, email):
    epubfile = BytesIO()
    print(url)
    try:
        with urlopen(url, timeout=10) as connection:
            epubfile.write(connection.read())
    except urllib.error.HTTPError as e:
        print(e.read().decode())
    from_email = "joe@compellingsciencefiction.com"
    to_email = email
    msg = MIMEMultipart()
    msg["Subject"] = "gutenberg ebook!"
    msg["From"] = from_email
    msg["To"] = to_email

    # Set message body
    body = MIMEText("your book!", "plain")
    msg.attach(body)

    epubfile.seek(0)
    part = MIMEApplication(epubfile.read())
    part.add_header("Content-Disposition",
                    "attachment",
                    filename=filename)
    msg.attach(part)

    # Convert message to string and send
    ses_client = boto3.client("ses", region_name="us-west-2")
    response = ses_client.send_raw_email(
        Source=from_email,
        Destinations=[to_email],
        RawMessage={"Data": msg.as_string()}
    )
    print(response)


def handler(event, context):
    print(event)
    try:
        print(event['path'])
    except:
        pass

    if "epub3.image" in event['path']:
        # this is a request to send an ebook
        path = event['path']
        email = event['queryStringParameters']['email']
        filename = path.replace("/ebooks/","").replace("3.images","")
        send_ebook(f"https://www.gutenberg.org/ebooks{path}", filename, email)
        client = boto3.client('s3')
        upload_id = uuid.uuid4().hex
        payload = {
            "timestamp": time.time(),
            "book_url": path,
            "user_email": email
        }
        client.put_object(Bucket=BUCKET,Key=f'{upload_id}.json', Body=json.dumps(payload).encode('utf-8'))
        return response(200, '{"status":"Sent '+path+' to '+email+'!"}')
    else:
        # return the gutenberg html with added column
        r = requests.get(f"https://www.gutenberg.org/{event['path']}")
        print(r.status_code)
        soup = BeautifulSoup(r.text.replace('"/','"https://www.gutenberg.org/').replace("https://www.gutenberg.org/ebooks","https://sendtokindle.compellingsciencefiction.com/ebooks"),features="html.parser")

        trs = soup.find_all("tr")
        for tr in trs:
            about = tr.get('about')
            if about and 'epub3' in about:
                print(tr)
                epubpath = f'{about.split("ebooks")[1]}'
                soup.append(BeautifulSoup(f"""
                &amp;lt;script&amp;gt;
                function buildlink() {{
                  let text = document.getElementById("kindleemail").value;
                  document.getElementById("csflink").href = "{epubpath}?email=" + text;
                }}
                &amp;lt;/script&amp;gt;
                """))
                tr.append(BeautifulSoup(f"&amp;lt;td&amp;gt;&amp;lt;a id='csflink' href='/{epubpath}'&amp;gt;Send&amp;lt;br&amp;gt;to&amp;lt;br&amp;gt;kindle&amp;lt;br&amp;gt;email:&amp;lt;/a&amp;gt;&amp;lt;br&amp;gt;&amp;lt;input type='text' id='kindleemail' oninput='buildlink()'&amp;gt;&amp;lt;/td&amp;gt;", "html.parser"))
            else:
                tr.append(BeautifulSoup("&amp;lt;td class='noprint'&amp;gt;&amp;lt;/td&amp;gt;", "html.parser"))
        return {
            'statusCode': 200,
            'headers': {
                'Access-Control-Allow-Headers': 'Content-Type',
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Methods': 'OPTIONS,POST,GET',
                'Content-Type': 'text/html;charset=utf-8',
            },
            'body': soup.prettify()
        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
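The in-memory attachment trick (point 1 above) can be distilled to a short, stdlib-only sketch. The function name and sample bytes here are illustrative, not from the deployed code:

```python
# Distilled version of the in-memory attachment pattern: write bytes into a
# BytesIO "file", then attach it to a MIME message without touching the
# filesystem. Stdlib only; names are illustrative.
from io import BytesIO
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def build_message(ebook_bytes: bytes, filename: str) -> MIMEMultipart:
    epubfile = BytesIO()
    epubfile.write(ebook_bytes)   # in the real Lambda, this is the HTTP download
    msg = MIMEMultipart()
    msg["Subject"] = "gutenberg ebook!"
    msg.attach(MIMEText("your book!", "plain"))
    epubfile.seek(0)              # rewind before reading the "file" back out
    part = MIMEApplication(epubfile.read())
    part.add_header("Content-Disposition", "attachment", filename=filename)
    msg.attach(part)
    return msg

msg = build_message(b"fake epub bytes", "67368.epub")
```

The resulting message can be handed straight to `ses_client.send_raw_email` as `msg.as_string()`, exactly as the Lambda above does.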



&lt;p&gt;If any of this inspires you to create your own little utility web app, please tell me about it (you have my compellingsciencefiction.com email address now :). I love hearing about projects like this.&lt;/p&gt;

</description>
      <category>documentation</category>
      <category>discuss</category>
      <category>help</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Commit to your team's development conventions or get out</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Tue, 14 Feb 2023 17:05:13 +0000</pubDate>
      <link>https://forem.com/aws-builders/commit-to-your-teams-development-conventions-or-get-out-190m</link>
      <guid>https://forem.com/aws-builders/commit-to-your-teams-development-conventions-or-get-out-190m</guid>
      <description>&lt;p&gt;I'm not saying this for the good of your team, even though it will be good for your team. I'm telling you this for your own sake: don't martyr yourself for your software development opinions!&lt;/p&gt;

&lt;p&gt;Software engineering attracts a certain type of mind. Our thinking is necessarily precise -- the computer does exactly what we tell it to do, and if we tell it to do something incorrectly, that's on us. In general, this leads to strong views on how development should be done. Every programmer I've met who has two or more years of development experience has already developed strong beliefs about sustainable, maintainable development conventions, and does not hesitate to inform you about them.&lt;/p&gt;

&lt;p&gt;This is fine in isolation -- you have your principles, you stick to them, and your code is cleaner and more maintainable as a result. You always use spaces, you use the gitflow workflow when branching in source control, you always promote through dev/stg/prd environments, you use the pytest framework for unit tests, etc etc. You are a paragon of forward-thinking development discipline.&lt;/p&gt;

&lt;p&gt;Problems arise, though, when you must work on a project with an engineer who doesn't share your ideology. Yes, I use that word on purpose: "best practices" are ideology, not universal laws. Good software can be written in myriad ways, and so two skilled engineers can have equally valid, yet diametrically opposed, opinions about development practices.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/l0IylSajlbPRFxH8Y/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/l0IylSajlbPRFxH8Y/giphy.gif" width="480" height="268"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;And so we come to the hard truth of my rant&lt;/strong&gt;: it is better to conform to the development practices accepted by the majority of your team than to try and force them to accept your convictions. Nothing tanks productivity on a team faster than arguing about development practices and tooling. It might be hard for you to accept, but your pet convention is not so superior that it is worth wasting engineer-weeks (or even engineer-days!) arguing about.&lt;/p&gt;

&lt;p&gt;If you just can't accept the idiotic processes your team has in place, and you feel so strongly about your development conventions that you find yourself trying to convince your coworkers again and again to change their ways, it's probably best for everyone if you try and find a role on a new team that shares your convictions. You will be happier, your former team will be happier, your new team will welcome you with open arms, you will crush all your performance reviews, and society will be improved by all the code you will be shipping.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/Hw8vYF4DNRCKY/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/Hw8vYF4DNRCKY/giphy.gif" width="245" height="170"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cryptocurrency</category>
      <category>web3</category>
      <category>blockchain</category>
      <category>crypto</category>
    </item>
    <item>
      <title>Go fast and reduce risk: using CDK to deploy your serverless applications on AWS</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Wed, 28 Dec 2022 19:19:22 +0000</pubDate>
      <link>https://forem.com/aws-builders/go-fast-and-reduce-risk-using-cdk-to-deploy-your-serverless-applications-on-aws-2i3k</link>
      <guid>https://forem.com/aws-builders/go-fast-and-reduce-risk-using-cdk-to-deploy-your-serverless-applications-on-aws-2i3k</guid>
      <description>&lt;p&gt;There are a truly absurd number of ways to deploy applications on AWS. Corey Quinn wrote two &lt;a href="https://www.lastweekinaws.com/blog/the-17-ways-to-run-containers-on-aws/" rel="noopener noreferrer"&gt;separate&lt;/a&gt; &lt;a href="https://www.lastweekinaws.com/blog/17-more-ways-to-run-containers-on-aws/" rel="noopener noreferrer"&gt;articles&lt;/a&gt; wherein he described 17 ways to deploy containers on AWS -- that's &lt;strong&gt;34 different ways just to deploy containers&lt;/strong&gt; (granted, some of the listed ways to deploy are farcical).&lt;/p&gt;

&lt;p&gt;All of these options inevitably spawn fights about the "right" way to deploy on AWS. I'm now going to explain my preferred way to deploy a serverless API, and then you can fight me.&lt;/p&gt;

&lt;h2&gt;
  
  
  The best way to deploy your serverless API on AWS:
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Write your API business logic code in a way that can be run on AWS Lambda.&lt;/li&gt;
&lt;li&gt;Write a CDK template in the &lt;strong&gt;same language&lt;/strong&gt; as your Lambda function code &lt;strong&gt;&lt;em&gt;and store it along with your Lambda function code in source control&lt;/em&gt;&lt;/strong&gt; (I'm going to assume you're using git here, but use whatever source control you want).&lt;/li&gt;
&lt;li&gt;Deploy your serverless API from your git commit as a self-contained unit.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;There's a lot to unpack here, but the main thing I want to convey is that &lt;em&gt;it is awesome to write your executable code and your IaC templates in the same language&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Your CDK template&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;So what is CDK? It's the AWS 'Cloud Development Kit', and it basically lets you transpile your language of choice (as long as that language is either JavaScript, TypeScript, Python, Java, C#, or Go) into CloudFormation. This is awesome, because you don't have to deal with a weird domain-specific language for your IaC or manage state files (looking at you, Terraform).&lt;/p&gt;

&lt;p&gt;In the case of a serverless API, these are the main things you'll want in your CDK code:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;IAM policies that limit your lambda permissions using the principle of least privilege.&lt;/li&gt;
&lt;li&gt;A Lambda resource that you'll deploy your code to.&lt;/li&gt;
&lt;li&gt;An API Gateway resource (RestApi) that you'll point at your Lambda.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;There are a bunch of other services you could include here (RDS databases, DynamoDB tables, etc.) if you need them, but that's the beauty of this system -- you're not limited by a specific serverless framework: you can just add arbitrary services to your CDK code as you need them.&lt;/p&gt;
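As a sketch of what that looks like, here's a hypothetical snippet that adds a DynamoDB table to a stack like the one below and grants a Lambda access to it. This assumes aws_cdk v2; all resource and variable names are illustrative:

```python
# Hypothetical addition inside a Stack's __init__ (aws_cdk v2),
# alongside an existing Lambda function (here called somestack_lambda).
from aws_cdk import aws_dynamodb as dynamodb, RemovalPolicy

table = dynamodb.Table(
    self,
    "SomestackTable",
    partition_key=dynamodb.Attribute(
        name="id", type=dynamodb.AttributeType.STRING
    ),
    billing_mode=dynamodb.BillingMode.PAY_PER_REQUEST,  # serverless, pay-per-use
    removal_policy=RemovalPolicy.DESTROY,  # delete the table with the stack
)
# Grant the Lambda least-privilege read/write access to just this table
table.grant_read_write_data(somestack_lambda)
```

Note that the `grant_*` helpers generate the scoped IAM policy for you, which is usually preferable to attaching a broad managed policy.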

&lt;p&gt;Here's a bare-bones version of a serverless CDK stack file in Python (you can also write these files in JavaScript, TypeScript, Java, C#, or Go), so you can see what I mean (everything has the generic name "somestack", which you can replace with a more meaningful name):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from aws_cdk import (
aws_lambda as lambda_,
aws_apigateway as apigw,
aws_iam as iam,
aws_ecr as ecr,
Stack)
from constructs import Construct
import os

class SomeStackName(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -&amp;gt; None:
        super().__init__(scope, construct_id, **kwargs)

        # Create the Lambda execution role
        lambda_role = iam.Role(self, id="somestack-lambda",
            role_name='SomestackManagementRole',
            assumed_by=iam.ServicePrincipal("lambda.amazonaws.com"),
            managed_policies= [
                        iam.ManagedPolicy.from_aws_managed_policy_name("service-role/AWSLambdaVPCAccessExecutionRole"),
                        iam.ManagedPolicy.from_aws_managed_policy_name("service-role/AWSLambdaBasicExecutionRole")
                    ]
        )
        # Get an existing ECR repo that has a docker image you want to deploy to Lambda
        repo = ecr.Repository.from_repository_name(self, "SomestackRepo", "somestack_ecr_repo")
        # Create the Lambda function
        somestack_lambda = lambda_.DockerImageFunction(self,
            "SomestackLambda",
            code=lambda_.DockerImageCode.from_ecr(
                repository=repo,
                tag="some-image-tag"  # replace with your image tag
                ),
            role=lambda_role
        )

        # Create the API Gateway
        api = apigw.LambdaRestApi(self,
            "somestack-endpoint",
            handler=somestack_lambda,
            default_cors_preflight_options=apigw.CorsOptions(allow_origins=["*"])
        )
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's all the Infrastructure as Code that you need to create a serverless API! You can learn more from the &lt;a href="https://docs.aws.amazon.com/cdk/v2/guide/getting_started.html" rel="noopener noreferrer"&gt;official CDK developer guide&lt;/a&gt;.&lt;/p&gt;
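One detail the stack file doesn't show is how it gets instantiated: CDK runs a small app entry point (conventionally &lt;code&gt;app.py&lt;/code&gt; in Python projects). A minimal sketch, assuming aws_cdk v2 and an illustrative module path:

```python
#!/usr/bin/env python3
# Hypothetical app.py entry point that instantiates the stack above.
# The module path is illustrative; adjust it to your project layout.
import aws_cdk as cdk

from somestack.somestack_stack import SomeStackName

app = cdk.App()
SomeStackName(app, "SomeStackName")
app.synth()  # emits the CloudFormation template to cdk.out/
```

From the directory containing this file, `cdk synth` renders the CloudFormation template and `cdk deploy` pushes it to your AWS account.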

&lt;h2&gt;
  
  
  The actual development workflow
&lt;/h2&gt;

&lt;p&gt;So now you have your CDK IaC written and you're ready to start incorporating it into a real devops pipeline. What does the day-to-day development process look like?&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Make a branch.&lt;/li&gt;
&lt;li&gt;Make some Lambda code changes, some CDK changes, whatever, and push your branch.&lt;/li&gt;
&lt;li&gt;A git server webhook should trigger a CDK build to test your Infrastructure as Code (IaC). This webhook also runs some tests on your temporarily built environment to make sure you have no regressions.&lt;/li&gt;
&lt;li&gt;Merge your branch into main (master, whatever you call it).&lt;/li&gt;
&lt;li&gt;Another git server webhook that only triggers on main branch commits deploys your code to your dev environment (which can then be promoted to stage/prod as necessary).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now you're ready to build serverless, production-grade APIs with incredible speed and very little risk! You don't have to have a team of engineers doing server maintenance, and you can spin up as many independent microservices as you need to without worrying about autoscaling policies or cluster management. It's an incredibly powerful pattern for certain types of software.&lt;/p&gt;

</description>
      <category>watercooler</category>
    </item>
    <item>
      <title>'Twas the Night Before Christmas: Cloud Edition</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Tue, 20 Dec 2022 15:33:37 +0000</pubDate>
      <link>https://forem.com/aws-builders/twas-the-night-before-christmas-cloud-edition-54dh</link>
      <guid>https://forem.com/aws-builders/twas-the-night-before-christmas-cloud-edition-54dh</guid>
      <description>&lt;p&gt;'Twas the night before Christmas, and all through the cloud&lt;br&gt;
Every server was stirring, it was really quite loud.&lt;br&gt;
The lambdas were running, provisioned with care,&lt;br&gt;
In hopes that complex workloads soon would be there.&lt;/p&gt;

&lt;p&gt;The devs were done, nestled all snug in their beds,&lt;br&gt;
While visions of SageMaker danced in their heads.&lt;br&gt;
And mamma in her t-shirt, and I in my briefs,&lt;br&gt;
Had just settled down for a long winter's sleep.&lt;/p&gt;

&lt;p&gt;When out in the cloud there arose such a clatter,&lt;br&gt;
I sprang from my bed to see what was the matter.&lt;br&gt;
Away to the keyboard I flew like a flash,&lt;br&gt;
Tore open the console and threw up the dash.&lt;/p&gt;

&lt;p&gt;The screen on the desk (what a stupendous glow)&lt;br&gt;
Gave the luster of mid-day to keyboard below.&lt;br&gt;
When, what to my wondering eyes should appear,&lt;br&gt;
But a miniature vid, streaming beautifully clear.&lt;/p&gt;

&lt;p&gt;With a middle-aged speaker, but such a fast learner,&lt;br&gt;
I knew in a moment it must be Dr. Werner.&lt;br&gt;
More rapid than Redis his new features came,&lt;br&gt;
And he whistled, and shouted, and called them by name:&lt;/p&gt;

&lt;p&gt;"Now, Lambda! now, CloudWatch! now, S3 and Redshift!&lt;br&gt;
On, CloudFront! on, Kendra! on, Neptune and GameLift!&lt;br&gt;
To the top of the bill! Let it never be small!&lt;br&gt;
Now more dashboards! more dashboards! dashboards for all!"&lt;/p&gt;

&lt;p&gt;He clicked the last slide, to his team gave acclaim,&lt;br&gt;
I'm not kidding! He even called out their full names.&lt;br&gt;
Then I heard him call out, ere he dove out of sight,&lt;br&gt;
“Happy holidays all, and to all a good-night.”&lt;/p&gt;

</description>
      <category>gratitude</category>
    </item>
    <item>
      <title>Adding an existing custom Lambda Authorizer to API Gateway with CDK and Python</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Thu, 15 Dec 2022 16:33:21 +0000</pubDate>
      <link>https://forem.com/aws-builders/adding-an-existing-custom-lambda-authorizer-to-api-gateway-with-cdk-and-python-38c3</link>
      <guid>https://forem.com/aws-builders/adding-an-existing-custom-lambda-authorizer-to-api-gateway-with-cdk-and-python-38c3</guid>
      <description>&lt;p&gt;OK, this is a pretty niche one, but I had to write it up because figuring it out annoyed me so much. I'm using the Python CDK library, &lt;code&gt;aws_cdk&lt;/code&gt;, not the TypeScript interface (but the concepts are the same).&lt;/p&gt;

&lt;p&gt;If you have an existing custom Lambda Authorizer, and you want to add it to a new API Gateway LambdaRestApi, you can't just grab the Authorizer with a normal &lt;code&gt;from_XXXX&lt;/code&gt; command in the Authorizer CDK class. Such a &lt;code&gt;from_&lt;/code&gt; function doesn't exist for Authorizers, as of CDK 2.20.0.&lt;/p&gt;

&lt;p&gt;What you have to do is specify the Lambda function used by the authorizer, using the &lt;code&gt;from_function_name&lt;/code&gt; function. You can then create a TokenAuthorizer based on your existing authorizer Lambda, and pass that to your LambdaRestApi, like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from aws_cdk import (
aws_lambda,
aws_apigateway)

# ...
# create your stack class, IAM roles, etc

# define some EXECUTION (not auth) Lambda for your API, here we'll call it api_lambda

# grab your existing Lambda auth function from the lambda name
auth_function = aws_lambda.Function.from_function_name(self, "myAuthLambda", "existing_lambda_name")

# New Authorizer based on your existing Lambda
lambda_authorizer = aws_apigateway.TokenAuthorizer(self, "myAuthorizer", handler=auth_function)

# Your new Lambda-backed and authorized API
api = aws_apigateway.LambdaRestApi(self,
        "myEndpoint",
        handler=api_lambda,
        default_method_options={"authorizer": lambda_authorizer, "authorization_type": aws_apigateway.AuthorizationType.CUSTOM})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And that's it! Now you've got an API Gateway that uses a Lambda Authorizer and also a Lambda execution backend.&lt;/p&gt;

&lt;p&gt;If you're still running into issues with your specific setup, it's pretty easy to do development testing of your Lambda Authorizers. In the API Gateway console, first go to the endpoint that calls your authorizer, and then click "Authorizers" in the left nav bar. Each Authorizer will have two links at the bottom of its pane: "Edit" and "Test". Clicking "Test" is going to be your best bet at getting actionable logs.&lt;/p&gt;

</description>
      <category>cybersecurity</category>
      <category>networking</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Solving problems with API Gateway and the Python CDK</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Fri, 09 Dec 2022 16:53:00 +0000</pubDate>
      <link>https://forem.com/aws-builders/solving-problems-with-api-gateway-and-the-python-cdk-1m6a</link>
      <guid>https://forem.com/aws-builders/solving-problems-with-api-gateway-and-the-python-cdk-1m6a</guid>
      <description>&lt;p&gt;So you decided to use the Python flavor of CDK to build your API Gateway infrastructure. Great! Python CDK is my favorite tool to manage Infrastructure as Code in AWS. Anyone who knows Python and has created resources in the AWS console can get up to speed almost instantly with Python CDK. But now you've hit an incredibly hard-to-debug issue: you've specified an existing API Gateway in a different CDK Stack, and are now trying to use &lt;code&gt;aws_apigateway.LambdaRestApi.from_rest_api_attributes&lt;/code&gt; to get the existing API Gateway and add new paths to it using &lt;code&gt;add_resource&lt;/code&gt; and &lt;code&gt;add_method&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Everything is going great, your new stack deploys, and then you try to invoke your new API Gateway-&amp;gt;Lambda endpoint. You get&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Authorization header requires 'Credential' parameter. Authorization header requires 'Signature' parameter. Authorization header requires 'SignedHeaders' parameter. Authorization header requires existence of either a 'X-Amz-Date' or a Date' header.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Oh no. That generic error could mean anything. So you Google around for a while, trying to figure out what could be going wrong. You look at the API Gateway in the AWS console. You realize that even though the resource paths are being added to your Gateway, they're not being deployed to a &lt;code&gt;Stage&lt;/code&gt;!&lt;/p&gt;

&lt;p&gt;The LambdaRestApi class should do that automatically, right? Yes, it will, but unfortunately not if you're using the &lt;code&gt;from_rest_api_attributes&lt;/code&gt; function to get an existing Gateway. This is a known issue: &lt;a href="https://github.com/aws/aws-cdk/issues/12417" rel="noopener noreferrer"&gt;https://github.com/aws/aws-cdk/issues/12417&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I'd encourage you to upvote the github issue above so that the CDK team prioritizes it, but here's the workaround with Python CDK:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import uuid
from aws_cdk import aws_apigateway

api = aws_apigateway.LambdaRestApi.from_rest_api_attributes(self, [gateway id], rest_api_id=[api id], root_resource_id=[gateway root resource id])
r = api.root.add_resource("hello-world")
r.add_method("ANY", aws_apigateway.LambdaIntegration([your previously specified Lambda CDK object]))
deployment = aws_apigateway.Deployment(self, f"deployment-{uuid.uuid4().hex}", api=api)
stage = aws_apigateway.Stage(self, [stage id (some string)], deployment=deployment, stage_name = [some string])
api.deployment_stage = stage
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And now your new endpoint deploys properly. As you can see, there are many parameters that you need to create yourself (I've put those in square brackets), but the main thing to note is that I've manually added a Deployment and Stage to the API Gateway, and added a hash to the deployment ID that changes every time the stack is deployed. This forces the deployment to be pushed out, and the solution won't work without it.&lt;/p&gt;

&lt;p&gt;If you need to add multiple resources/methods in different stacks (my own current use case!), the best way I've found is just to put the deployment into a separate stack and build it after you add/update any of your resource stacks. It's a little gross, but prevents you from having to add multiple stage names.&lt;/p&gt;
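&lt;p&gt;To make that pattern concrete, here's a minimal sketch (all names hypothetical) of what a deployment-only stack could look like, assuming you pass in the shared Gateway's IDs and build this stack after all of your resource stacks:&lt;/p&gt;

```python
# Hypothetical sketch: a stack whose only job is to deploy the shared
# API Gateway after the resource stacks have added their routes.
import uuid
from aws_cdk import Stack, aws_apigateway

class ApiDeploymentStack(Stack):
    def __init__(self, scope, construct_id, rest_api_id, root_resource_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        api = aws_apigateway.RestApi.from_rest_api_attributes(
            self, "ImportedApi",
            rest_api_id=rest_api_id,
            root_resource_id=root_resource_id)
        # a fresh logical ID on every synth forces the deployment out
        deployment = aws_apigateway.Deployment(
            self, f"deployment-{uuid.uuid4().hex}", api=api)
        aws_apigateway.Stage(self, "deployment-stage",
            deployment=deployment, stage_name="prod")
```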

&lt;p&gt;If this one had you beating your head against a wall for a while, you're not alone. I hope the CDK team addresses this issue soon, but in the meantime you now have a fix.&lt;/p&gt;

</description>
      <category>nvidia</category>
      <category>machinelearning</category>
      <category>ai</category>
    </item>
    <item>
      <title>How I built my ideal daily Python newsletter with AWS and Python</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Thu, 08 Dec 2022 17:00:40 +0000</pubDate>
      <link>https://forem.com/aws-builders/how-i-built-my-ideal-daily-python-newsletter-with-aws-and-python-1646</link>
      <guid>https://forem.com/aws-builders/how-i-built-my-ideal-daily-python-newsletter-with-aws-and-python-1646</guid>
      <description>&lt;p&gt;I'm a heavy Python user. I've basically used it to do my job every day for the last decade, so it's valuable for me to keep tabs on new developments in the Python ecosystem. Amazing programmers talk about new Python developments on Hacker News, Reddit, and Twitter all the time, so I created a daily newsletter to collect the top Python stories from all three platforms. If you want to take a look at the finished product, it can be found here: &lt;a href="https://compellingpython.com/" rel="noopener noreferrer"&gt;compellingpython.com&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Secret weapon number one: AWS Lambda with Docker
&lt;/h2&gt;

&lt;p&gt;Using Docker containers on Lambda has been a little bit of a game-changer for me, mainly because it's so easy to add extra supporting stuff (Jinja HTML template files, packages, etc) to the container without having to have an elaborate custom build system.&lt;/p&gt;

&lt;p&gt;Sending a daily email is a perfect application for AWS Lambda, because all it takes is a single invocation that runs for a max of 15 minutes a day (using 128 MB of RAM). This comes out to about 6 cents per month (yes, you read that correctly, 6 cents &lt;em&gt;&lt;strong&gt;per month&lt;/strong&gt;&lt;/em&gt;), which is essentially free.&lt;/p&gt;

&lt;p&gt;With Docker Lambdas, all I have to do is write a Dockerfile that looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM public.ecr.aws/lambda/python:3.9

COPY requirements.txt .
RUN pip3 install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

COPY hn.py reddit.py twitter.py article_processor.py app_handler.py ${LAMBDA_TASK_ROOT}/
COPY templates/ ${LAMBDA_TASK_ROOT}/templates/

CMD [ "app_handler.handler" ]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will install the Python packages I specify in a requirements.txt file, then copy over my Python source files for reading data from HN, Reddit, and Twitter along with the main Lambda app_handler.py file, and then tell the Lambda container where the entry point function is (a function called &lt;code&gt;handler&lt;/code&gt; in the app_handler.py file).&lt;/p&gt;

&lt;p&gt;After I build and push the docker container to ECR (instructions for this can be found right in your ECR repo!), then it's time for me to deploy the Docker image to Lambda with AWS CDK.&lt;/p&gt;

&lt;h2&gt;
  
  
  Secret weapon number two: AWS CDK
&lt;/h2&gt;

&lt;p&gt;Infrastructure as Code – so valuable, yet so tedious. In the past I would always groan inwardly when it was time to write Terraform or CloudFormation to template out my AWS resources, but in 2019 Amazon released CDK. CDK supports Python, so I can now write all my IaC in Python, which is a joy! Python CDK code transpiles to CloudFormation (via an intermediate TypeScript step), so all the state information lives in AWS and I can use drift detection on my stacks. I know this is a relatively new tool, but I'm surprised it's not used more widely.&lt;/p&gt;

&lt;p&gt;So the first step in building the newsletter is deploying the Docker Lambda I discussed above using CDK. This Lambda will obtain data from Hacker News, Reddit, and Twitter, filter it, and then send me a nicely formatted email.&lt;/p&gt;

&lt;p&gt;To provision the Lambda along with an IAM role, this is all the CDK code I need:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;lambda_role = iam.Role(self, id="python-newsletter-lambda",
    role_name='PythonNewsletterRole',
    assumed_by=iam.ServicePrincipal("lambda.amazonaws.com"),
    managed_policies= [
                iam.ManagedPolicy.from_aws_managed_policy_name("service-role/AWSLambdaVPCAccessExecutionRole"),
                iam.ManagedPolicy.from_aws_managed_policy_name("service-role/AWSLambdaBasicExecutionRole"),
                iam.ManagedPolicy.from_aws_managed_policy_name("AmazonS3FullAccess"),
                iam.ManagedPolicy.from_aws_managed_policy_name("AmazonSESFullAccess"),
            ]
)

repo = ecr.Repository.from_repository_name(self, "NewsletterRepo", "python_newsletter")

newsletter_lambda = lambda_.DockerImageFunction(self,
    "PythonNewsletterLambda",
    code=lambda_.DockerImageCode.from_ecr(
        repository=repo,
        tag=os.environ["CDK_DOCKER_TAG"]
        ),
    role=lambda_role,
    timeout=Duration.minutes(15)
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There are three pieces here:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Generating the IAM role. I'm not being super locked-down in this example, since I'm giving the Lambda full S3 access and full SES (Simple Email Service, more on that in the next section) access. You can generate custom locked-down policies here instead of using the AWS-managed policies.&lt;/li&gt;
&lt;li&gt;Creating a reference to an ECR repo. ECR is the Docker image repo service that Amazon provides. This is the repo that I pushed my Docker image to in the first section of this article. If you're new to CDK, I want to point out that if you're referencing existing resources you should always use the from_* function from the CDK documentation about the resource you want to use. Check out what I'm talking about in the CDK ECR documentation here.&lt;/li&gt;
&lt;li&gt;Deploying the Docker Lambda. You can see that all I had to do was specify which image tag in the ECR repo I want to deploy (via the CDK_DOCKER_TAG environment variable), specify the role I created, and specify that the timeout is 15 minutes.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;That's it! Now when I run &lt;code&gt;cdk deploy&lt;/code&gt; my Lambda will be deployed with all the proper configuration, and if I ever need to change anything I just have to rerun &lt;code&gt;cdk deploy&lt;/code&gt;. The final step is just to schedule the Lambda daily using a cron job in EventBridge.&lt;/p&gt;
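&lt;p&gt;The scheduling step is only a few more lines of CDK. A sketch, assuming the &lt;code&gt;newsletter_lambda&lt;/code&gt; object from the stack above (the 13:00 UTC hour is an arbitrary choice):&lt;/p&gt;

```python
from aws_cdk import aws_events, aws_events_targets

# EventBridge rule that fires the newsletter Lambda once a day at 13:00 UTC
daily_rule = aws_events.Rule(self, "DailyNewsletterRule",
    schedule=aws_events.Schedule.cron(minute="0", hour="13"))
daily_rule.add_target(aws_events_targets.LambdaFunction(newsletter_lambda))
```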

&lt;h2&gt;
  
  
  Secret weapon number three: Python and AWS SES
&lt;/h2&gt;

&lt;p&gt;OK, I acknowledge that Python is about the least secret weapon there is. But now that I've described the infrastructure, you might be wondering what's inside this magical little email-sending Lambda function. I'll leave the tedious API calls to get the data from HN/Reddit/Twitter as an exercise for the reader, but I do want to talk briefly about the design of the code package – particularly how the emails are sent.&lt;/p&gt;

&lt;p&gt;Here's the basic process:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;First, I grab all the data from the community APIs for the previous day and rank by upvotes.&lt;/li&gt;
&lt;li&gt;Next I grab the top 3 posts about Python from each community, and grab the full HTML of the web page.&lt;/li&gt;
&lt;li&gt;After parsing out the main content of the page (doesn't always work), I send the main body text to my summarizer algorithm (post about that later!) and get a summary of the article.&lt;/li&gt;
&lt;li&gt;I take all the metadata about the top three articles for each community and use it to populate a Jinja2 HTML template.&lt;/li&gt;
&lt;li&gt;Finally, I send out the HTML emails to myself (and my discerning friends and colleagues who have signed up at compellingpython.com) using AWS SES!&lt;/li&gt;
&lt;/ul&gt;
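&lt;p&gt;The rank-and-select step can be sketched in plain Python. This is illustrative only -- the field names ("title", "score") and the keyword filter are stand-ins for the real API payloads and filtering logic:&lt;/p&gt;

```python
# Keep the Python-related posts from one community's API results,
# then take the top 3 by upvotes. Field names are hypothetical.
def top_python_posts(posts, n=3):
    python_posts = [p for p in posts if "python" in p["title"].lower()]
    return sorted(python_posts, key=lambda p: p["score"], reverse=True)[:n]

posts = [
    {"title": "Python 3.11 is fast", "score": 900},
    {"title": "Rust vs Go", "score": 800},
    {"title": "New Python typing PEP", "score": 450},
    {"title": "Async Python patterns", "score": 300},
    {"title": "CPython internals", "score": 120},
]
top = top_python_posts(posts)  # the three highest-scoring Python posts
```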

&lt;p&gt;&lt;strong&gt;SES&lt;/strong&gt; (Amazon's Simple Email Service) is fantastic for an application like this, because it's just about the cheapest way to send bulk emails programmatically that there is. You can send 62,000 emails a month for free, and after that it's 10 cents per thousand. Ten dollars to send a hundred thousand emails without maintaining all your own SMTP infrastructure is kind of a ridiculous deal.&lt;/p&gt;

&lt;p&gt;There are some drawbacks, like having to carefully manage your own deliverability metrics and email lists (most other hosted email services provide these things) but as a simple email sending utility I have no complaints. If you want to pay more you can get dedicated IPs, but for fun projects like this one I just use the shared IPs that SES provides for free.&lt;/p&gt;
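&lt;p&gt;For the sending step itself, the message shape matters: SES's &lt;code&gt;send_raw_email&lt;/code&gt; takes a single raw MIME string, and a multipart/alternative message gives older clients a plain-text fallback. A sketch using only the standard library (addresses, subject, and body text are placeholders):&lt;/p&gt;

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def build_newsletter_email(sender, recipient, subject, text_body, html_body):
    # multipart/alternative: clients render the last part they support,
    # so the HTML part goes last
    msg = MIMEMultipart("alternative")
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = recipient
    msg.attach(MIMEText(text_body, "plain"))
    msg.attach(MIMEText(html_body, "html"))
    return msg

msg = build_newsletter_email(
    "newsletter@example.com", "reader@example.com",
    "Compelling Python: today's top stories",
    "Plain-text version of the newsletter.",
    "Rendered Jinja2 HTML goes here.")
# Actually sending it (relies on the SES permissions in the Lambda's role):
# boto3.client("ses").send_raw_email(RawMessage={"Data": msg.as_string()})
```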

&lt;p&gt;That's it! That's the service. There is one other aspect to this whole newsletter that I haven't discussed, which is the collection and management of email addresses themselves – I have a separate Lambda function for this that I might write up at a later date. For now though, take care and have a great week.&lt;/p&gt;

</description>
      <category>watercooler</category>
    </item>
    <item>
      <title>Notes from AWS re:Invent 2022: the Giant Magellan Telescope</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Mon, 05 Dec 2022 16:15:13 +0000</pubDate>
      <link>https://forem.com/aws-builders/notes-from-aws-reinvent-2022-the-giant-magellan-telescope-4168</link>
      <guid>https://forem.com/aws-builders/notes-from-aws-reinvent-2022-the-giant-magellan-telescope-4168</guid>
      <description>&lt;p&gt;Most of the sessions I attended at re:Invent this year were highly technical, but I also attended some broader science talks, and this talk by Giant Magellan President Dr. Robert Shelton was one of them.&lt;/p&gt;

&lt;p&gt;Apparently in the Atacama Desert, the driest place on earth, a new telescope is being built: the Giant Magellan Telescope. It's going to be 22 stories tall, the entire building will rotate, and it will be up to 200 times more powerful than existing research telescopes. &lt;a href="https://www.giantmagellan.org"&gt;https://www.giantmagellan.org&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The telescope's president was presenting at AWS because AWS provides the compute required to simulate the telescope's building envelope. The telescope will require precise environmental controls, and the Atacama Desert is a place with wild temperature swings. With AWS-based simulation, the Giant Magellan project can ensure that their building designs will support the environmental controls required by the telescope and the extraordinarily sophisticated instrumentation that can be attached to it!&lt;/p&gt;

&lt;p&gt;Dr. Shelton spoke about how the telescope will have the capability to resolve the atmospheric composition of exoplanets at the level required to detect extraterrestrial life, finally answering the age-old question: are we alone in the universe?&lt;/p&gt;

</description>
      <category>astronomy</category>
      <category>aws</category>
    </item>
    <item>
      <title>Notes from re:Invent 2022: a serverless/devops recap</title>
      <dc:creator>Joe Stech</dc:creator>
      <pubDate>Sat, 03 Dec 2022 00:28:01 +0000</pubDate>
      <link>https://forem.com/aws-builders/notes-from-reinvent-2022-a-serverlessdevops-recap-2db6</link>
      <guid>https://forem.com/aws-builders/notes-from-reinvent-2022-a-serverlessdevops-recap-2db6</guid>
      <description>&lt;p&gt;This week was my first time attending re:Invent, and it was as intense as I had imagined! With 60k people flying into Vegas for the conference and dozens of events happening every single hour of the day, picking and choosing what to spend time on was a challenge. I got a great overview of the new (and some existing) serverless and devops tools, though, and met some awesome new people, which was my primary goal. I'll write more detailed articles about select sessions next week, but here were my favorites, with links to YouTube so you can watch them too!&lt;/p&gt;

&lt;h2&gt;
  
  
  Building containers for AWS (CON325)
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=S7JwFFZ-7_Q"&gt;https://www.youtube.com/watch?v=S7JwFFZ-7_Q&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you build Docker containers locally on macOS, this was a talk for you. Phil Estes and Jesse Butler did an entertaining job of presenting a new open-source alternative to Docker Desktop on macOS called &lt;a href="https://github.com/runfinch"&gt;Finch&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Check out the talk for details, but my main takeaways were:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Finch is a macOS native client for container development.&lt;/li&gt;
&lt;li&gt;Finch uses the same interface you're all familiar with for building containers (i.e. &lt;code&gt;finch build -t [name:tag] .&lt;/code&gt; instead of &lt;code&gt;docker build -t [name:tag] .&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Use the &lt;code&gt;--secret&lt;/code&gt; parameter to pass in your secrets to the container on &lt;code&gt;build&lt;/code&gt;, and then use &lt;code&gt;--mount-type=secret&lt;/code&gt; when you &lt;code&gt;run&lt;/code&gt;. Don't put secrets in layers and delete them later! They're still there, embedded in the previous layer.&lt;/li&gt;
&lt;li&gt;Use multistage builds to make your containers MUCH smaller.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Deep dive into Amazon Neptune Serverless (DAT322)
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=WwuAfTpL7ww"&gt;https://www.youtube.com/watch?v=WwuAfTpL7ww&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Not a lot of developers use graph databases, but since basically everything in the real world can be represented as a graph, they're incredibly useful! And now you can run a Neptune graph database without allocating servers (kind of). Takeaways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;You can't scale down to zero; the minimum your cluster can scale down to is 2.5 NCUs. That's about $290/month at $0.1608/hr per NCU, which is pretty pricey for an absolute minimum serverless setup. They're working on lowering minimums.&lt;/li&gt;
&lt;li&gt;It can only scale up to about the size of an r6g.8xlarge&lt;/li&gt;
&lt;li&gt;It's not yet supported by CloudFormation (I am constantly surprised by how poorly AWS supports CloudFormation).&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Practical experience with a serverless-first strategy at Capital One (SVS311)
&lt;/h2&gt;

&lt;p&gt;This one hasn't been uploaded to YouTube yet, but it was one of my favorites of the conference. They set it up as a back and forth between a very senior engineer at Capital One and an AWS architect, both explaining their own recommended best practices for Lambdas at scale (Capital One uses a LOT of Lambdas). Takeaways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Use Lambda init (cold start) to cache secrets for subsequent calls to the same warm Lambda&lt;/li&gt;
&lt;li&gt;Lambda init gives you a whole CPU (no fractional throttling) no matter what your allocated Lambda size is. This means you can do "Lambda power tuning" to optimize for cost/performance&lt;/li&gt;
&lt;li&gt;If you get errors in your init phase, this will result in stuck Provisioned Concurrency. AWS won't retry, you'll need to release a new version&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  How AWS Application Composer helps you and your team build great apps (SVS211)
&lt;/h2&gt;

&lt;p&gt;Again, no upload yet for this one. AWS Application Composer is basically a visual yaml editor for CloudFormation templates.&lt;/p&gt;

&lt;p&gt;I'm looking forward to using this to visualize existing architectures already built in CloudFormation, but right now the tool is pretty limited (it's still in Preview). Takeaways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The service is very slick for quickly and visually compiling new resources into a yaml&lt;/li&gt;
&lt;li&gt;It doesn't hook directly into CloudFormation state, you have to upload the actual yaml file if you want to see an existing architecture&lt;/li&gt;
&lt;li&gt;There are a bunch of services that it doesn't support out of the box (it doesn't error when loading such a yaml, it just grays out the unsupported services).&lt;/li&gt;
&lt;li&gt;Basically it's just a yaml visualizer/editor for now. The service is promising though!&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Governance Guardrails across your multi-account environment (COM312)
&lt;/h2&gt;

&lt;p&gt;A great dev talk by Paushali Kundu about how Thomson Reuters manages compliance enforcement and monitoring across their 300+ AWS accounts.&lt;/p&gt;

&lt;p&gt;I learned about CloudFormation Hooks for the first time, which can enforce rules on CloudFormation stacks.&lt;/p&gt;

&lt;h2&gt;
  
  
  Accelerating workloads using parallelism with AWS Step Functions and Lambda (API205)
&lt;/h2&gt;

&lt;p&gt;Yet another one with no uploaded video. I was initially skeptical about map/reduce with Step Functions and Lambda, only because there are already so many great tools to do massively parallel data processing. I was pleasantly surprised though!&lt;/p&gt;

&lt;p&gt;If you have millions of CSVs (or whatever) on S3 that you want to process in parallel, they've made it SO EASY. Basically all you have to do is use the visual editor in the AWS console to map your S3 objects to a mapper Lambda function, plug in a final reducer Lambda, and you're done. They did a live demo where they processed 550k CSV files on S3 in about 2 minutes; it was awesome. Takeaways:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;If you don't want to set up your own cluster to do your map/reduce calculations, absolutely use Step Functions&lt;/li&gt;
&lt;li&gt;Concurrency limits can be tuned so you don't overwhelm externally-called services that have API limits&lt;/li&gt;
&lt;li&gt;You can easily specify how many files to batch in a single Lambda by the number of items OR a number of kilobytes&lt;/li&gt;
&lt;li&gt;You can set a number of items that can fail before the whole thing fails, OR you can set a percentage of map invocations that can fail&lt;/li&gt;
&lt;li&gt;You can send the compiled results from the mappers directly to S3&lt;/li&gt;
&lt;/ol&gt;
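&lt;p&gt;For reference, takeaways 2 through 5 map onto fields of the distributed map state itself. A hedged sketch of the Amazon States Language shape (bucket names and the Lambda ARN are placeholders, and I've trimmed it to the relevant fields):&lt;/p&gt;

```json
{
  "Type": "Map",
  "MaxConcurrency": 100,
  "ToleratedFailurePercentage": 5,
  "ItemReader": {
    "Resource": "arn:aws:states:::s3:listObjectsV2",
    "Parameters": { "Bucket": "my-csv-bucket", "Prefix": "daily/" }
  },
  "ItemBatcher": { "MaxItemsPerBatch": 100 },
  "ItemProcessor": {
    "ProcessorConfig": { "Mode": "DISTRIBUTED", "ExecutionType": "STANDARD" },
    "StartAt": "ProcessBatch",
    "States": {
      "ProcessBatch": {
        "Type": "Task",
        "Resource": "arn:aws:lambda:us-east-1:123456789012:function:csv-mapper",
        "End": true
      }
    }
  },
  "ResultWriter": {
    "Resource": "arn:aws:states:::s3:putObject",
    "Parameters": { "Bucket": "my-results-bucket", "Prefix": "map-output/" }
  },
  "End": true
}
```

&lt;p&gt;Swap &lt;code&gt;ToleratedFailurePercentage&lt;/code&gt; for &lt;code&gt;ToleratedFailureCount&lt;/code&gt; to cap failures by item count, and &lt;code&gt;MaxItemsPerBatch&lt;/code&gt; for &lt;code&gt;MaxInputBytesPerBatch&lt;/code&gt; to batch by size.&lt;/p&gt;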

&lt;p&gt;That's it for now, I have to catch my flight! I'll be writing more about re:Invent next week.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
