<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Rafael Milewski</title>
    <description>The latest articles on Forem by Rafael Milewski (@milewski).</description>
    <link>https://forem.com/milewski</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F90599%2Fa54eaa3b-91c5-4684-8b21-7937ba8c871f.png</url>
      <title>Forem: Rafael Milewski</title>
      <link>https://forem.com/milewski</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/milewski"/>
    <language>en</language>
    <item>
      <title>EchoSense: Capture Every Word</title>
      <dc:creator>Rafael Milewski</dc:creator>
      <pubDate>Sun, 07 Dec 2025 14:41:17 +0000</pubDate>
      <link>https://forem.com/milewski/echosense-pitch-1l5o</link>
      <guid>https://forem.com/milewski/echosense-pitch-1l5o</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/mux"&gt;DEV's Worldwide Show and Tell Challenge Presented by Mux&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;What I Built&lt;/h2&gt;

&lt;p&gt;I'm reusing a previous project of mine:&lt;/p&gt;
&lt;div class="ltag__link--embedded"&gt;
  &lt;div class="crayons-story "&gt;
  &lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71" class="crayons-story__hidden-navigation-link"&gt;EchoSense: Your Pocket-Sized Companion for Smarter Meetings&lt;/a&gt;


  &lt;div class="crayons-story__body crayons-story__body-full_post"&gt;
      &lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71" class="crayons-article__context-note crayons-article__context-note__feed"&gt;&lt;p&gt;AssemblyAI Voice Agents Challenge: Real-Time&lt;/p&gt;

&lt;/a&gt;
    &lt;div class="crayons-story__top"&gt;
      &lt;div class="crayons-story__meta"&gt;
        &lt;div class="crayons-story__author-pic"&gt;

          &lt;a href="/milewski" class="crayons-avatar  crayons-avatar--l  "&gt;
            &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F90599%2Fa54eaa3b-91c5-4684-8b21-7937ba8c871f.png" alt="milewski profile" class="crayons-avatar__image"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
        &lt;div&gt;
          &lt;div&gt;
            &lt;a href="/milewski" class="crayons-story__secondary fw-medium m:hidden"&gt;
              Rafael Milewski
            &lt;/a&gt;
            &lt;div class="profile-preview-card relative mb-4 s:mb-0 fw-medium hidden m:inline-block"&gt;
              
                Rafael Milewski
                &lt;a href="/++"&gt;&lt;img alt="Subscriber" class="subscription-icon" src="https://assets.dev.to/assets/subscription-icon-805dfa7ac7dd660f07ed8d654877270825b07a92a03841aa99a1093bd00431b2.png"&gt;&lt;/a&gt;
              
              &lt;div id="story-author-preview-content-2112888" class="profile-preview-card__content crayons-dropdown branded-7 p-4 pt-0"&gt;
                &lt;div class="gap-4 grid"&gt;
                  &lt;div class="-mt-4"&gt;
                    &lt;a href="/milewski" class="flex"&gt;
                      &lt;span class="crayons-avatar crayons-avatar--xl mr-2 shrink-0"&gt;
                        &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F90599%2Fa54eaa3b-91c5-4684-8b21-7937ba8c871f.png" class="crayons-avatar__image" alt=""&gt;
                      &lt;/span&gt;
                      &lt;span class="crayons-link crayons-subtitle-2 mt-5"&gt;Rafael Milewski&lt;/span&gt;
                    &lt;/a&gt;
                  &lt;/div&gt;
                  &lt;div class="print-hidden"&gt;
                    
                      Follow
                    
                  &lt;/div&gt;
                  &lt;div class="author-preview-metadata-container"&gt;&lt;/div&gt;
                &lt;/div&gt;
              &lt;/div&gt;
            &lt;/div&gt;

          &lt;/div&gt;
          &lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71" class="crayons-story__tertiary fs-xs"&gt;&lt;time&gt;Nov 23 '24&lt;/time&gt;&lt;span class="time-ago-indicator-initial-placeholder"&gt;&lt;/span&gt;&lt;/a&gt;
        &lt;/div&gt;
      &lt;/div&gt;

    &lt;/div&gt;

    &lt;div class="crayons-story__indention"&gt;
      &lt;h2 class="crayons-story__title crayons-story__title-full_post"&gt;
        &lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71" id="article-link-2112888"&gt;
          EchoSense: Your Pocket-Sized Companion for Smarter Meetings
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;div class="crayons-story__tags"&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/devchallenge"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;devchallenge&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/assemblyaichallenge"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;assemblyaichallenge&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/ai"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;ai&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/api"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;api&lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="crayons-story__bottom"&gt;
        &lt;div class="crayons-story__details"&gt;
          &lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left"&gt;
            &lt;div class="multiple_reactions_aggregate"&gt;
              &lt;span class="multiple_reactions_icons_container"&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/exploding-head-daceb38d627e6ae9b730f36a1e390fca556a4289d5a41abb2c35068ad3e2c4b5.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/multi-unicorn-b44d6f8c23cdd00964192bedc38af3e82463978aa611b4365bd33a0f1f4f3e97.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/sparkle-heart-5f9bee3767e18deb1bb725290cb151c25234768a0e9a2bd39370c382d02920cf.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
              &lt;/span&gt;
              &lt;span class="aggregate_reactions_counter"&gt;25&lt;span class="hidden s:inline"&gt; reactions&lt;/span&gt;&lt;/span&gt;
            &lt;/div&gt;
          &lt;/a&gt;
            &lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71#comments" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left flex items-center"&gt;
              Comments


              18&lt;span class="hidden s:inline"&gt; comments&lt;/span&gt;
            &lt;/a&gt;
        &lt;/div&gt;
        &lt;div class="crayons-story__save"&gt;
          &lt;small class="crayons-story__tertiary fs-xs mr-2"&gt;
            3 min read
          &lt;/small&gt;
            
              &lt;span class="bm-initial"&gt;
                

              &lt;/span&gt;
              &lt;span class="bm-success"&gt;
                

              &lt;/span&gt;
            
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;

&lt;/div&gt;




&lt;p&gt;I built a portable device that captures audio from its surroundings and enriches it with real-time insights and knowledge capabilities. Users can place the device in a meeting, for example, to get live transcription, ask questions of the connected AI, and receive automatic summaries. The system can package all of the content and send it by email for later review, analysis, or record keeping.&lt;/p&gt;

&lt;h2&gt;My Pitch Video&lt;/h2&gt;

&lt;p&gt;&lt;iframe src="https://player.mux.com/lncJdULJVF4W75WEC1GQYbYp4MM6wr9nnMqvXt01HOvE" width="710" height="399"&gt;&lt;/iframe&gt;&lt;/p&gt;

&lt;h2&gt;Demo&lt;/h2&gt;

&lt;p&gt;Since this project relies on dedicated hardware to function as intended, it's not possible to provide a full end-to-end demo without physically shipping a device to the judges.&lt;/p&gt;

&lt;p&gt;However, I've created a simulation environment that allows you to preview the frontend experience and explore the core interactions:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://echosense-simulation.vercel.app?a=1" rel="noopener noreferrer"&gt;echosense-simulation.vercel.app&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Source Code:&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/milewski" rel="noopener noreferrer"&gt;
        milewski
      &lt;/a&gt; / &lt;a href="https://github.com/milewski/echosense-challenge" rel="noopener noreferrer"&gt;
        echosense-challenge
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Making sense of echoes and delivering insights
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;EchoSense&lt;/h1&gt;

&lt;/div&gt;
&lt;p&gt;
Portable device for real-time audio transcription and interactive summaries.
&lt;/p&gt;

&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://private-user-images.githubusercontent.com/2874967/389211676-ddc9d8fc-b195-46de-875c-10c86c9c9d3b.png?jwt=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NzQ2NDE3MTMsIm5iZiI6MTc3NDY0MTQxMywicGF0aCI6Ii8yODc0OTY3LzM4OTIxMTY3Ni1kZGM5ZDhmYy1iMTk1LTQ2ZGUtODc1Yy0xMGM4NmM5YzlkM2IucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI2MDMyNyUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNjAzMjdUMTk1NjUzWiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9YWJiYWRhMWI4NTkyMjhlNzgzMGIzZWY5YTYwYTRjNDY3NDI1YmI5YjQxZmU1NTQ5OGZiM2YyNzJjMTBiODBmOSZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.4GE1GGufQ6Z_Q9EytIK9QwHH4Ddzh0n7oqpqkNm8w7Y"&gt;&lt;img width="1000" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-user-images.githubusercontent.com%2F2874967%2F389211676-ddc9d8fc-b195-46de-875c-10c86c9c9d3b.png%3Fjwt%3DeyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NzQ2NDE3MTMsIm5iZiI6MTc3NDY0MTQxMywicGF0aCI6Ii8yODc0OTY3LzM4OTIxMTY3Ni1kZGM5ZDhmYy1iMTk1LTQ2ZGUtODc1Yy0xMGM4NmM5YzlkM2IucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI2MDMyNyUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNjAzMjdUMTk1NjUzWiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9YWJiYWRhMWI4NTkyMjhlNzgzMGIzZWY5YTYwYTRjNDY3NDI1YmI5YjQxZmU1NTQ5OGZiM2YyNzJjMTBiODBmOSZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.4GE1GGufQ6Z_Q9EytIK9QwHH4Ddzh0n7oqpqkNm8w7Y"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is the main repository for my submission to &lt;a href="https://dev.to/challenges/assemblyai" rel="nofollow"&gt;AssemblyAI Challenge&lt;/a&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/milewski/echosense-challenge/./esp32" rel="noopener noreferrer"&gt;Esp32&lt;/a&gt;&lt;/strong&gt;: The firmware source code for the ESP32-S3-Zero device.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/milewski/echosense-challenge/./frontend" rel="noopener noreferrer"&gt;Frontend&lt;/a&gt;&lt;/strong&gt;: The UI that communicates with the device via websocket.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each subfolder includes instructions for running the project locally.&lt;/p&gt;




&lt;p&gt;For a more detailed overview, including screenshots, you can read the submission sent to the challenge here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71" rel="nofollow"&gt;https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71&lt;/a&gt;&lt;/p&gt;

&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/milewski/echosense-challenge" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;





&lt;h2&gt;The Story Behind It&lt;/h2&gt;

&lt;p&gt;This project was originally created for a &lt;a href="https://dev.to/challenges/assemblyai-2024-11-13"&gt;dev.to hackathon&lt;/a&gt;, but the idea was inspired by a real-world observation. I often attend meetings where most participants are non-native English speakers. I noticed some coworkers relying on automatic captioning software installed on their computers just to keep up, either because their English isn't strong enough or because the mix of accents made things difficult to understand.&lt;/p&gt;

&lt;p&gt;This was the root inspiration for EchoSense. I wanted to build something that not only provided real-time captioning, but went beyond what tools like Zoom or Teams offered at the time by adding features like live transcription, AI-powered insights, summaries, and more. I believe a device like this can be useful for a wide variety of users, and since it's simple to assemble and build, it could even be a fun weekend project.&lt;/p&gt;

&lt;p&gt;I also drew inspiration from a few open-source hardware projects, such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/pollen-robotics/reachy_mini" rel="noopener noreferrer"&gt;https://github.com/pollen-robotics/reachy_mini&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/modem-works/terra" rel="noopener noreferrer"&gt;https://github.com/modem-works/terra&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Both share all the files and tutorials needed to build a "smart" device yourself. I love this approach: it's a great way to learn, experiment, and deepen your understanding of how hardware and software work together.&lt;/p&gt;

&lt;h2&gt;Technical Highlights&lt;/h2&gt;

&lt;p&gt;The stack isn't complicated. The device is built on an ESP32, but instead of using traditional C-based code, it's written in Rust. I chose Rust because I believe it's better suited for embedded development: it provides a modern developer experience, strong safety guarantees, and excellent performance. In practice, this made development significantly easier and less error-prone than if I had written it in C.&lt;/p&gt;

&lt;p&gt;Due to hardware limitations, the device itself isn't capable of running transcription or large language model tasks locally. Instead, it captures audio and streams data to a third-party service for real-time transcription and AI processing, keeping the hardware lightweight while still enabling powerful functionality.&lt;/p&gt;

&lt;p&gt;One interesting challenge I faced during this project was that the ESP32 variant I chose has an extremely small stack. It would sometimes crash simply from receiving one extra item in a JSON array from the server... in other words, a JSON response of just a few kilobytes would instantly crash the device during parsing because there wasn't enough memory to hold the data. This is almost unthinkable for the new generation of developers, who are used to effectively unlimited resources and architect systems as if client devices have infinite RAM. In most cases today that mindset doesn't cause issues, because memory is abundant, but on a constrained device like this, every byte matters and small decisions can have huge consequences.&lt;/p&gt;

&lt;p&gt;Because of this limitation, the software had to be written with extreme efficiency to avoid stack overflows while still handling tasks in real time and serving multiple clients.&lt;/p&gt;
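The working pattern, reduced to its essence, is a fixed memory budget: copy incoming bytes into a preallocated buffer and fail cleanly when a payload would exceed it, instead of letting a parse attempt overflow the stack. A minimal sketch of that idea in plain Rust (the capacity and error type here are illustrative, not the actual firmware code):

```rust
/// Fixed-capacity buffer for incoming payloads. On a memory-starved
/// microcontroller, rejecting oversized input up front beats crashing
/// mid-parse. CAPACITY is an illustrative value, not the firmware's.
const CAPACITY: usize = 1024;

pub struct BoundedBuffer {
    data: [u8; CAPACITY],
    len: usize,
}

#[derive(Debug, PartialEq)]
pub enum BufferError {
    /// The payload would exceed the fixed memory budget.
    Overflow,
}

impl BoundedBuffer {
    pub fn new() -> Self {
        BoundedBuffer { data: [0; CAPACITY], len: 0 }
    }

    /// Append a chunk, failing cleanly if it would exceed capacity.
    pub fn push_chunk(&mut self, chunk: &[u8]) -> Result<(), BufferError> {
        if self.len + chunk.len() > CAPACITY {
            return Err(BufferError::Overflow);
        }
        self.data[self.len..self.len + chunk.len()].copy_from_slice(chunk);
        self.len += chunk.len();
        Ok(())
    }

    pub fn as_slice(&self) -> &[u8] {
        &self.data[..self.len]
    }
}

fn main() {
    let mut buffer = BoundedBuffer::new();
    // A small JSON payload fits within the budget.
    assert!(buffer.push_chunk(b"{\"items\":[1,2,3]}").is_ok());
    // A chunk larger than the remaining budget is rejected, not a crash.
    let oversized = [0u8; CAPACITY];
    assert_eq!(buffer.push_chunk(&oversized), Err(BufferError::Overflow));
    println!("buffered {} bytes", buffer.as_slice().len());
}
```

On a real device the rejected payload would be surfaced as a protocol error to the server rather than silently dropped, but the principle is the same: the memory ceiling is checked before the data is touched.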

&lt;p&gt;This experience gave me a glimpse of what it must have been like to build computer software or video games when machines were orders of magnitude slower than this microcontroller. It's fascinating how far computing has come, and how disconnected most modern developers are from the constraints that used to define everyday programming.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>muxchallenge</category>
      <category>showandtell</category>
      <category>video</category>
    </item>
    <item>
      <title>Conduit: A UI-less node-based system</title>
      <dc:creator>Rafael Milewski</dc:creator>
      <pubDate>Sat, 03 May 2025 11:52:25 +0000</pubDate>
      <link>https://forem.com/milewski/conduit-a-ui-less-node-based-system-3nkh</link>
      <guid>https://forem.com/milewski/conduit-a-ui-less-node-based-system-3nkh</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/aws-amazon-q-v2025-04-30"&gt;Amazon Q Developer "Quack The Code" Challenge&lt;/a&gt;: Exploring the Possibilities&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;What I Built&lt;/h2&gt;

&lt;p&gt;I have built &lt;strong&gt;Conduit&lt;/strong&gt;, a domain-specific language (DSL) for creating node-based workflows. Conduit enables developers to create reusable building blocks with an intuitive syntax that can be mixed and matched to build complex data processing pipelines. It's similar to node-based UI tools like ComfyUI but without the graphical interface, offering a code-first approach that's more flexible and embeddable.&lt;/p&gt;

&lt;p&gt;One of Conduit's most powerful features is its true cross-language compatibility. Unlike many workflow tools that are tied to specific languages or platforms, Conduit workflows can be compiled to native libraries (.so/.dll) and seamlessly integrated with virtually any programming language through FFI (Foreign Function Interface). This means you can define your workflows once in Conduit's intuitive DSL and then use them from JavaScript, Python, PHP, Java, Go, or any other language that supports FFI - all while maintaining native performance. This language-agnostic approach eliminates the need for complex interoperability layers or performance-degrading bridges between different technology stacks.&lt;/p&gt;

&lt;p&gt;What makes this project special is how Amazon Q Developer transformed my development process. With its assistance, I was able to build in &lt;strong&gt;TWO&lt;/strong&gt; days what would have taken a month otherwise, including creating a custom DSL parser, implementing an ECS architecture, and developing procedural macros — all challenging aspects of Rust development that typically require significant time investment and expertise.&lt;/p&gt;

&lt;h2&gt;Demo&lt;/h2&gt;

&lt;p&gt;The project includes working examples that demonstrate Conduit's capabilities:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A Rust example that reads an image file, resizes it, and saves the result:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;pipeline&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;r#"
  resizer {
    source &amp;lt;- read_file {
      input &amp;lt;- "./examples/conduit-example/cover.png"
    }
    width &amp;lt;- 512
    height &amp;lt;- 215
    output -&amp;gt; write_file {
      destination &amp;lt;- "./examples/conduit-example/cover.smaller.png"
    }
  }
"#&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;engine&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Engine&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="n"&gt;engine&lt;/span&gt;&lt;span class="nf"&gt;.run_pipeline&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pipeline&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;If you're a visual learner, the above example can be visualized like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fql47n7ch9t8hpdnwdq4i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fql47n7ch9t8hpdnwdq4i.png" alt="Conduit Workflow Visualization"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Demo code can be found &lt;a href="https://github.com/milewski/conduit-challenge/tree/main/examples/conduit-example" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;A Node.js integration example showing how Conduit can be used from JavaScript:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;ConduitEngine&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./Engine&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;engine&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;ConduitEngine&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;pipeline&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`
  resizer {
    source &amp;lt;- read_file {
      input &amp;lt;- "../conduit-example/cover.png"
    }
    width &amp;lt;- 512
    height &amp;lt;- 215
    output -&amp;gt; write_file {
      destination &amp;lt;- "../conduit-example/cover.smaller.png"
    }
  }
`&lt;/span&gt;

&lt;span class="nx"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;runPipeline&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;pipeline&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nx"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;destroy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;blockquote&gt;
&lt;p&gt;Demo code can be found &lt;a href="https://github.com/milewski/conduit-challenge/tree/main/examples/node-js-example" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2&gt;Code Repository&lt;/h2&gt;




&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://assets.dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/milewski" rel="noopener noreferrer"&gt;
        milewski
      &lt;/a&gt; / &lt;a href="https://github.com/milewski/conduit-challenge" rel="noopener noreferrer"&gt;
        conduit-challenge
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      This is the main repository for my submission to the Amazon Q Developer "Quack The Code" challenge.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://github.com/art/logo.svg"&gt;&lt;img width="250" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Fart%2Flogo.svg"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;
Conduit is a domain-specific language (DSL) for creating node-based workflows in Rust. It enables you to build complex data processing pipelines with a simple, declarative syntax
&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Table of Contents&lt;/h2&gt;
&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#quick-start" rel="noopener noreferrer"&gt;Quick Start&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#basic-syntax" rel="noopener noreferrer"&gt;Basic Syntax&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#node-components" rel="noopener noreferrer"&gt;Node Components&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#arrow-notation" rel="noopener noreferrer"&gt;Arrow Notation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#module-implementation" rel="noopener noreferrer"&gt;Module Implementation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#anonymous-nodes" rel="noopener noreferrer"&gt;Anonymous Nodes&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#node-sharing-and-chaining" rel="noopener noreferrer"&gt;Node Sharing and Chaining&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#examples" rel="noopener noreferrer"&gt;Examples&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#contributing" rel="noopener noreferrer"&gt;Contributing&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/milewski/conduit-challenge#license" rel="noopener noreferrer"&gt;License&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Installation&lt;/h2&gt;
&lt;/div&gt;

&lt;p&gt;Add Conduit to your Cargo.toml:&lt;/p&gt;

&lt;div class="highlight highlight-source-toml notranslate position-relative overflow-auto js-code-highlight"&gt;
&lt;pre&gt;[&lt;span class="pl-en"&gt;dependencies&lt;/span&gt;]
&lt;span class="pl-smi"&gt;conduit&lt;/span&gt; = { &lt;span class="pl-smi"&gt;git&lt;/span&gt; = &lt;span class="pl-s"&gt;&lt;span class="pl-pds"&gt;"&lt;/span&gt;https://github.com/milewski/conduit-challange.git&lt;span class="pl-pds"&gt;"&lt;/span&gt;&lt;/span&gt;, &lt;span class="pl-smi"&gt;version&lt;/span&gt; = &lt;span class="pl-s"&gt;&lt;span class="pl-pds"&gt;"&lt;/span&gt;0.1.0&lt;span class="pl-pds"&gt;"&lt;/span&gt;&lt;/span&gt; }
&lt;span class="pl-smi"&gt;conduit-derive&lt;/span&gt; = { &lt;span class="pl-smi"&gt;git&lt;/span&gt; = &lt;span class="pl-s"&gt;&lt;span class="pl-pds"&gt;"&lt;/span&gt;https://github.com/milewski/conduit-challange.git&lt;span class="pl-pds"&gt;"&lt;/span&gt;&lt;/span&gt;, &lt;span class="pl-smi"&gt;version&lt;/span&gt; = &lt;span class="pl-s"&gt;&lt;span class="pl-pds"&gt;"&lt;/span&gt;0.1.0&lt;span class="pl-pds"&gt;"&lt;/span&gt;&lt;/span&gt; }&lt;/pre&gt;

&lt;/div&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Quick Start&lt;/h2&gt;

&lt;/div&gt;

&lt;div class="highlight highlight-source-rust notranslate position-relative overflow-auto js-code-highlight"&gt;
&lt;pre&gt;&lt;span class="pl-k"&gt;use&lt;/span&gt; conduit&lt;span class="pl-kos"&gt;::&lt;/span&gt;&lt;span class="pl-v"&gt;Engine&lt;/span&gt;&lt;span class="pl-kos"&gt;;&lt;/span&gt;
&lt;span class="pl-k"&gt;use&lt;/span&gt; conduit_derive&lt;span class="pl-kos"&gt;::&lt;/span&gt;&lt;span class="pl-v"&gt;Node&lt;/span&gt;&lt;span class="pl-kos"&gt;;&lt;/span&gt;
&lt;span class="pl-c1"&gt;#&lt;span class="pl-kos"&gt;[&lt;/span&gt;derive&lt;span class="pl-kos"&gt;(&lt;/span&gt;&lt;span class="pl-v"&gt;Node&lt;/span&gt;&lt;span class="pl-kos"&gt;)&lt;/span&gt;&lt;span class="pl-kos"&gt;]&lt;/span&gt;&lt;/span&gt;
&lt;span class="pl-k"&gt;pub&lt;/span&gt; &lt;span class="pl-k"&gt;struct&lt;/span&gt; &lt;span class="pl-smi"&gt;Logger&lt;/span&gt; &lt;span class="pl-kos"&gt;{&lt;/span&gt;
    &lt;span class="pl-k"&gt;pub&lt;/span&gt; &lt;span class="pl-c1"&gt;left&lt;/span&gt;&lt;span class="pl-kos"&gt;:&lt;/span&gt; &lt;span class="pl-smi"&gt;Input&lt;/span&gt;&lt;span class="pl-kos"&gt;&amp;lt;&lt;/span&gt;&lt;span class="pl-smi"&gt;String&lt;/span&gt;&lt;span class="pl-kos"&gt;&amp;gt;&lt;/span&gt;&lt;span class="pl-kos"&gt;,&lt;/span&gt;
    &lt;span class="pl-k"&gt;pub&lt;/span&gt; &lt;span class="pl-c1"&gt;right&lt;/span&gt;&lt;span class="pl-kos"&gt;:&lt;/span&gt; &lt;span class="pl-smi"&gt;Input&lt;/span&gt;&lt;span class="pl-kos"&gt;&amp;lt;&lt;/span&gt;&lt;span class="pl-smi"&gt;String&lt;/span&gt;&lt;span class="pl-kos"&gt;&amp;gt;&lt;/span&gt;&lt;span class="pl-kos"&gt;,&lt;/span&gt;
&lt;span class="pl-kos"&gt;}&lt;/span&gt;

&lt;span class="pl-k"&gt;impl&lt;/span&gt; &lt;span class="pl-smi"&gt;ExecutableNode&lt;/span&gt; &lt;span class="pl-k"&gt;for&lt;/span&gt; &lt;span class="pl-smi"&gt;Logger&lt;/span&gt; &lt;span class="pl-kos"&gt;{&lt;/span&gt;
    &lt;span class="pl-k"&gt;fn&lt;/span&gt; &lt;span class="pl-en"&gt;run&lt;/span&gt;&lt;span class="pl-kos"&gt;(&lt;/span&gt;&lt;span class="pl-c1"&gt;&amp;amp;&lt;/span&gt;&lt;span class="pl-smi"&gt;self&lt;/span&gt;&lt;span class="pl-kos"&gt;)&lt;/span&gt; &lt;span class="pl-kos"&gt;{&lt;/span&gt;
        &lt;span class="pl-en"&gt;println&lt;/span&gt;&lt;span class="pl-en"&gt;!&lt;/span&gt;&lt;span class="pl-kos"&gt;(&lt;/span&gt;&lt;span class="pl-s"&gt;"{} {}"&lt;/span&gt;&lt;span class="pl-kos"&gt;,&lt;/span&gt; &lt;span class="pl-smi"&gt;self&lt;/span&gt;&lt;span class="pl-kos"&gt;.&lt;/span&gt;left&lt;span class="pl-kos"&gt;.&lt;/span&gt;read&lt;/pre&gt;…
&lt;/div&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/milewski/conduit-challenge" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;I intend to grow this into an open-source project because, deep down, this is how I would ideally like ComfyUI to work. There's still a long journey ahead to build all the custom nodes, which is especially challenging given that the majority of code for AI workflows is written in Python. However, with my hands-on experience with the &lt;a href="https://github.com/huggingface/candle" rel="noopener noreferrer"&gt;Candle&lt;/a&gt; and &lt;a href="https://burn.dev" rel="noopener noreferrer"&gt;Burn&lt;/a&gt; libraries, I may be able to get pretty close!&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;The Problem and Inspiration&lt;/h2&gt;

&lt;p&gt;I'm a huge fan of ComfyUI, a powerful node-based system I’ve used extensively in hackathons and personal projects. But despite its strengths, I kept running into two consistent limitations:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;When I needed functionality beyond what the existing nodes or community contributions provided, I had to build it myself — typically in Python, through a manual and often clunky process.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Systems like ComfyUI are not easy to embed. Sometimes, I just want a lightweight, embeddable engine that works directly within the language I'm using — without setting up servers, UIs, or dealing with platform-specific quirks.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This led me to a question:&lt;/p&gt;

&lt;p&gt;What if there were a tool like ComfyUI, but entirely code-driven — no GUI, no friction, just flexible building blocks?&lt;/p&gt;

&lt;p&gt;What if I could create node-based workflows not just for ML models, but for anything — like making API calls, evaluating expressions, chaining CLI commands, validating form data, or even just adding numbers?&lt;/p&gt;

&lt;p&gt;That’s the idea behind Conduit: a node-based system that’s completely UI-less, embeddable, and designed for developers who want full control without leaving their editor.&lt;/p&gt;




&lt;h2&gt;
  
  
  Technical Implementation
&lt;/h2&gt;

&lt;p&gt;Conduit features a simple, intuitive syntax heavily inspired by hardware description languages like Verilog. Creating custom nodes is straightforward—just define a Rust struct, derive it with the Node macro, and you're ready to build. No need to deal with Python or JavaScript; everything is native Rust with the ability to be called from other languages via FFI.&lt;/p&gt;
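&lt;p&gt;As a rough, illustrative sketch (the node and property names here are hypothetical, not taken from the actual API), wiring two nodes together in the DSL could look something like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  add { left &amp;lt;- 2, right &amp;lt;- 3 }
  logger { left &amp;lt;- add::result }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Each block names a node, and the &lt;code&gt;&amp;lt;-&lt;/code&gt; arrows feed either literal values or another node's output (via the &lt;code&gt;::&lt;/code&gt; path) into its properties.&lt;/p&gt;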

&lt;blockquote&gt;
&lt;p&gt;I wrote a basic tutorial explaining the syntax here, which covers everything you need to know about the DSL: &lt;a href="https://github.com/milewski/conduit-challenge?tab=readme-ov-file#basic-syntax" rel="noopener noreferrer"&gt;https://github.com/milewski/conduit-challenge?tab=readme-ov-file#basic-syntax&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;One of the key advantages of Conduit is its cross-language compatibility. You can build your workflow, compile it to a &lt;code&gt;.so&lt;/code&gt;/&lt;code&gt;.dll&lt;/code&gt; dynamic library, and use it in &lt;strong&gt;ANY&lt;/strong&gt; programming language via FFI. I've provided an example of how to use it in JavaScript, but there are no limitations—you could integrate it with Dart, PHP, Java, or any other language. Because the workflows run natively, you get native performance without the bottlenecks specific to each language's runtime environment.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tech Stack
&lt;/h3&gt;

&lt;p&gt;The project was built entirely in Rust, with a custom DSL written using &lt;a href="https://pest.rs" rel="noopener noreferrer"&gt;PEST&lt;/a&gt;, a powerful parsing expression grammar library. The language grammar (PEG) can be found in &lt;a href="https://github.com/milewski/conduit-challenge/blob/main/packages/conduit/schema.pest" rel="noopener noreferrer"&gt;schema.pest&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Performance is a core advantage of Conduit's architecture. The DSL only describes how nodes are wired together; the node implementations themselves are precompiled, optimized Rust. Unlike traditional scripting-based workflow systems that interpret node logic at runtime or rely on language-specific VMs, Conduit confines interpretation to a one-time parse of the workflow graph, eliminating per-node interpreter overhead and enabling automatic parallelization of independent nodes. In practical terms, this means Conduit can process data-intensive workflows (like image manipulation or large dataset transformations) significantly faster than equivalent implementations in interpreted languages, while maintaining predictable memory usage and avoiding garbage collection pauses.&lt;/p&gt;

&lt;p&gt;Conduit adopts an Entity Component System (ECS) design pattern using &lt;a href="https://crates.io/crates/bevy_ecs" rel="noopener noreferrer"&gt;bevy-ecs&lt;/a&gt;, which makes it simple to parallelize operations where possible, resulting in extremely efficient workflow execution. It also utilizes an &lt;a href="https://crates.io/crates/inventory" rel="noopener noreferrer"&gt;inventory&lt;/a&gt; system to automatically register nodes, further enhancing the ergonomics of the project.&lt;/p&gt;

&lt;p&gt;The project is divided into two crates:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://github.com/milewski/conduit-challenge/tree/main/packages/conduit" rel="noopener noreferrer"&gt;Conduit&lt;/a&gt;: The main crate responsible for parsing the DSL and executing the node workflows&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/milewski/conduit-challenge/tree/main/packages/conduit-derive" rel="noopener noreferrer"&gt;Conduit-Derive&lt;/a&gt;: A procedural macro crate that eliminates boilerplate code when creating new nodes&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  How I Used Amazon Q Developer
&lt;/h2&gt;

&lt;p&gt;Before this hackathon, I wasn't aware that Amazon Q CLI existed. Initially, I assumed it would be similar to other IDE tools like GitHub Copilot or Gemini Assistant—primarily code auto-completion tools. However, I was pleasantly surprised to discover it was a full-featured CLI assistant, my first experience using AI directly from the command line.&lt;/p&gt;

&lt;h3&gt;
  
  
  Development Journey with Amazon Q
&lt;/h3&gt;

&lt;p&gt;My journey began by asking Amazon Q for hackathon project ideas. I provided context about my interest in node-based systems like ComfyUI, and it suggested building something similar to what Conduit has become. When I asked for a prototype, the quality of the output immediately gave me confidence that I was working with a powerful tool.&lt;/p&gt;

&lt;p&gt;The code Amazon Q produced was remarkably similar to what I would have written myself, which gave me the confidence to continue. What would have taken me months to build, I was able to accomplish in just two days with Amazon Q's assistance!&lt;/p&gt;

&lt;p&gt;Another point worth mentioning: I saw that there was extensive documentation on the Amazon Q website, but I skipped it all and jumped straight to the installation. The CLI itself has a &lt;code&gt;/help&lt;/code&gt; command that explains what features are available, making it effortless to figure out how to use the tool. This ease of use was a pleasant surprise.&lt;/p&gt;

&lt;h3&gt;
  
  
  Architectural Decisions
&lt;/h3&gt;

&lt;p&gt;I started from scratch, first creating the foundation and then asking Amazon Q to help build feature by feature. For straightforward tasks, I wrote the code myself, but for more complex challenges that would typically require significant time to figure out the best approach, I asked Amazon Q to provide multiple options.&lt;/p&gt;

&lt;p&gt;What impressed me most was Amazon Q's ability to generate substantial amounts of code while maintaining a step-by-step approach. I could discuss concerns like, "&lt;em&gt;I think if you change the struct this way, it would require a lifetime parameter and could cause trouble when implementing X Y Z&lt;/em&gt;" and it would understand my concerns and propose alternative approaches that aligned with the existing codebase.&lt;/p&gt;

&lt;h3&gt;
  
  
  Advanced Rust Features
&lt;/h3&gt;

&lt;p&gt;Once my first version was working, I wanted to simplify the codebase to reduce boilerplate and create a more ergonomic library. That's when I asked Amazon Q to create a procedural macro to generate all the necessary boilerplate. Amazon Q condensed it into a single &lt;code&gt;#[derive(Node)]&lt;/code&gt; attribute and made the code super clean to use! Admittedly, procedural macros in Rust are hard to write and would have taken me a long time to implement myself, but with a single prompt, Amazon Q was able to build it and make it work on the first try!&lt;/p&gt;

&lt;p&gt;I also discussed with Amazon Q what would be the best design patterns to apply to this project for better performance, parallel execution, and multi-threading. Initially, I implemented it using channels, but Amazon Q recommended using the ECS pattern instead, which made the implementation much simpler. This guidance on architectural decisions was invaluable.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tips for Using Amazon Q Developer
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Work incrementally&lt;/strong&gt;: Ask Amazon Q to write or change code in small increments rather than tackling large features all at once. This isn't because it can't handle complex tasks, but because smaller changes are easier to understand and review. You won't be able to effectively follow along, make changes, or fix bugs if you don't understand the code being produced.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Commit frequently&lt;/strong&gt;: Always commit your code before asking Amazon Q to make changes. Even though it shows and confirms changes before implementing them, having a git history makes it easier to understand what was modified and where.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Provide clear direction&lt;/strong&gt;: Give Amazon Q hints about what you want to accomplish. Specify which files you want to modify, what changes you want to make, and how they should be implemented. It works best when you provide specific file and function names.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Create an AmazonQ.md file&lt;/strong&gt;: Write a guide explaining what Amazon Q should and shouldn't do, along with project context. Including instructions to always run tests or compile the project helps ensure that Amazon Q will verify its changes and fix any errors before finalizing them.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
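&lt;p&gt;To make the last tip concrete, here is a minimal, hypothetical AmazonQ.md; the contents are purely illustrative, so adapt them to your own project:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# AmazonQ.md

## Project context
A Rust workspace with two crates: a parser/executor crate and a derive-macro crate.

## Rules
- Always run "cargo test" after making changes and fix any failures before finishing.
- Keep changes small and explain each change before applying it.
- Do not edit the grammar file unless explicitly asked.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;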




&lt;h2&gt;
  
  
  Future Development
&lt;/h2&gt;

&lt;p&gt;If you like the project and want to help it move forward, contributions are welcome! There are several features and improvements on the roadmap:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Async Runtime/Scheduler&lt;/strong&gt;: Implement support for nodes that can run asynchronously&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enhanced Error Handling&lt;/strong&gt;: Complete the integration with &lt;code&gt;thiserror&lt;/code&gt; for more robust error management&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Arithmetic Expressions&lt;/strong&gt;: Update the DSL to support arithmetic operations:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  node { property &amp;lt;- (500 * 5) / (10 * another::module) }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Conditional Logic&lt;/strong&gt;: Add support for conditional nodes:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  test { property &amp;lt;- if (true &amp;amp;&amp;amp; false, "when-true", "when-false") } 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Expanded Node Library&lt;/strong&gt;: Implement more nodes for various use cases&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Pull requests addressing any of these areas would be greatly appreciated!&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>awschallenge</category>
      <category>ai</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Imaginarium - Build. Explore. Leave Your Mark.</title>
      <dc:creator>Rafael Milewski</dc:creator>
      <pubDate>Sun, 27 Apr 2025 18:19:32 +0000</pubDate>
      <link>https://forem.com/milewski/imaginarium-build-explore-leave-your-mark-1p3i</link>
      <guid>https://forem.com/milewski/imaginarium-build-explore-leave-your-mark-1p3i</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://int.alibabacloud.com/m/1000402443/" rel="noopener noreferrer"&gt;Alibaba Cloud&lt;/a&gt; Challenge: &lt;a href="https://dev.to/challenges/alibaba"&gt;Build a Web Game&lt;/a&gt;.&lt;/em&gt;*&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I created a multiplayer open-world game where you control a robot in an infinite, isometric grid-based world. Your mission is to use your imagination to build unique monuments that decorate the landscape. Each monument is dynamically generated with the help of AI (powered by ComfyUI), and is permanently placed at the location where it was created. As players explore the world, they can discover the creations of others and encounter fellow travelers cruising through the ever-evolving map, making it a vibrant and shared experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://imaginarium.monster" rel="noopener noreferrer"&gt;https://imaginarium.monster&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmm04m7ugj16ku7pjeno4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmm04m7ugj16ku7pjeno4.png" alt="Game Play" width="800" height="458"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxl15gmphyrkt6261nxu3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxl15gmphyrkt6261nxu3.png" alt="Game Play" width="800" height="531"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Source Code
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/milewski/imaginarium-challange" rel="noopener noreferrer"&gt;https://github.com/milewski/imaginarium-challange&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Repository Contains:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The game source code located in &lt;a href="https://github.com/milewski/imaginarium-challange/tree/main/game" rel="noopener noreferrer"&gt;&lt;code&gt;/game&lt;/code&gt;&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;The WebSocket server and API located in &lt;a href="https://github.com/milewski/imaginarium-challange/tree/main/server" rel="noopener noreferrer"&gt;&lt;code&gt;/server&lt;/code&gt;&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;The Terraform setup files for the infrastructure located in &lt;a href="https://github.com/milewski/imaginarium-challange/tree/main/infrastructure" rel="noopener noreferrer"&gt;&lt;code&gt;/infrastructure&lt;/code&gt;&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Alibaba Cloud Services Implementation
&lt;/h2&gt;

&lt;p&gt;To support the game's backend and dynamic content delivery, I utilized several Alibaba Cloud services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Elastic Compute Service (ECS):&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Used for hosting the backend servers, including the API and WebSocket services. ECS provided the flexibility and performance needed to run Rust-based backend components efficiently.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Object Storage Service (OSS):&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Used to store and serve static assets generated by players, such as monument images. OSS offered scalable and reliable storage with easy access over the web.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Content Delivery Network (CDN):&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Integrated to accelerate the delivery of frontend assets, improving load times and the overall user experience, especially for a browser-based WASM game.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Container Registry (CR):&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Used to host and manage the container images for the server components. Leveraging Alibaba Cloud’s container registry streamlined the deployment process. &lt;/p&gt;




&lt;p&gt;While there were additional Alibaba Cloud services that could have further enhanced the project, integrating them would have required more development time. I effectively had only four days to work on this project, and it was an incredibly fun experience. Overall, working with Alibaba Cloud was smooth and straightforward, largely thanks to the official Terraform provider, which made it easy to explore available services and quickly provision and configure the necessary infrastructure.&lt;/p&gt;

&lt;h2&gt;
  
  
  Game Development Highlights
&lt;/h2&gt;

&lt;p&gt;The game was developed with the &lt;a href="https://bevyengine.org/" rel="noopener noreferrer"&gt;Bevy game engine&lt;/a&gt;, built entirely in Rust and compiled to WebAssembly (WASM) to run seamlessly in the browser. User-generated monument images are dynamically created using a Stable Diffusion XL model, enhanced with several LoRAs via ComfyUI. All generation services are containerized with Docker and deployed to Alibaba Cloud on a single-node Docker Swarm cluster.&lt;/p&gt;

&lt;h3&gt;
  
  
  Technical Highlights:
&lt;/h3&gt;

&lt;p&gt;Bevy, Rust, Vue.js, Alibaba Cloud, Docker, ComfyUI, WebSocket, Axum, Terraform&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The game runs entirely in the browser via WebAssembly.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Real-time communication with the server is handled through a WebSocket connection using a custom binary protocol.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Both the server and the client frontend are built in Rust, sharing code between them for efficiency and consistency.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A lightweight Vue.js wrapper manages loading logic, WASM communication, modal handling, and audio token interactions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Monument generation is dynamic, leveraging ComfyUI pipelines for on-the-fly creation of assets.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
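&lt;p&gt;To illustrate the kind of binary framing involved (the layout below is hypothetical, not the game's actual wire format), a tagged position-update message could be packed like this:&lt;/p&gt;

```python
import struct

# Hypothetical wire format: a 1-byte message tag followed by two f32 coordinates.
# This illustrates the idea of a custom binary protocol, not the game's real one.
MSG_MOVE = 1

def encode_move(x, y):
    # "!" selects network (big-endian) byte order; B = u8 tag, f = f32 coordinate
    return struct.pack("!Bff", MSG_MOVE, x, y)

def decode_move(payload):
    return struct.unpack("!Bff", payload)

frame = encode_move(12.0, 7.5)
print(len(frame))          # 9 bytes per move message
print(decode_move(frame))  # (1, 12.0, 7.5)
```

&lt;p&gt;A fixed binary layout like this keeps per-message overhead far below a JSON equivalent, which matters when many players stream frequent position updates.&lt;/p&gt;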

</description>
      <category>alibabachallenge</category>
      <category>devchallenge</category>
      <category>gamedev</category>
      <category>webdev</category>
    </item>
    <item>
      <title>EchoSense: Your Pocket-Sized Companion for Smarter Meetings</title>
      <dc:creator>Rafael Milewski</dc:creator>
      <pubDate>Sat, 23 Nov 2024 17:02:33 +0000</pubDate>
      <link>https://forem.com/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71</link>
      <guid>https://forem.com/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/assemblyai"&gt;AssemblyAI Challenge &lt;/a&gt;: Sophisticated Speech-to-Text and No More Monkey Business.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I developed EchoSense, a portable hardware device that captures spoken content in settings like meetings, classes, brainstorming sessions, and conferences. It features a web interface with real-time transcriptions of everything echoing through its microphone. Users can ask questions about the discussion or generate summaries in real-time, making it an invaluable tool for live events.&lt;/p&gt;

&lt;p&gt;The device operates on a modest 40MHz SoC with 4MB of RAM. It’s lightweight, efficient, and can run on a tiny lithium battery, making it highly portable.&lt;/p&gt;

&lt;h2&gt;
  
  
  Tech Used
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Vue, TypeScript, shadcn/ui&lt;/li&gt;
&lt;li&gt;ESP32, Rust, Espressif IoT Development Framework (IDF)&lt;/li&gt;
&lt;li&gt;WebSocket, SendGrid, AssemblyAI&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;Since this is a hardware device, providing a link to a demo isn’t feasible. However, I’ve recorded a video showcasing it in action, along with instructions on how to build one yourself.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/eDMwni7t40E"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;and here is the GitHub repository with the source code:&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.dev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/milewski" rel="noopener noreferrer"&gt;
        milewski
      &lt;/a&gt; / &lt;a href="https://github.com/milewski/echosense-challenge" rel="noopener noreferrer"&gt;
        echosense-challenge
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Making sense of echoes and delivering insights
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;EchoSense&lt;/h1&gt;

&lt;/div&gt;
&lt;p&gt;
Portable device for real-time audio transcription and interactive summaries.
&lt;/p&gt;

&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://private-user-images.githubusercontent.com/2874967/389211676-ddc9d8fc-b195-46de-875c-10c86c9c9d3b.png?jwt=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NTc0OTY0ODAsIm5iZiI6MTc1NzQ5NjE4MCwicGF0aCI6Ii8yODc0OTY3LzM4OTIxMTY3Ni1kZGM5ZDhmYy1iMTk1LTQ2ZGUtODc1Yy0xMGM4NmM5YzlkM2IucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI1MDkxMCUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNTA5MTBUMDkyMzAwWiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9YzNiZjZkMDk4OWYxNTRmZDI3MDJiZTgwMjc2MTU5ZjQ4NjEyYjU2YWFiOGMwNWRmOTEwNzI4ZjZjZWRmZWQ1ZSZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.96WZ9bP3DZ6v4nPYLmPrm2x8hCrQNT6ZSGZ5K7rSs7A"&gt;&lt;img width="1000" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-user-images.githubusercontent.com%2F2874967%2F389211676-ddc9d8fc-b195-46de-875c-10c86c9c9d3b.png%3Fjwt%3DeyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NTc0OTY0ODAsIm5iZiI6MTc1NzQ5NjE4MCwicGF0aCI6Ii8yODc0OTY3LzM4OTIxMTY3Ni1kZGM5ZDhmYy1iMTk1LTQ2ZGUtODc1Yy0xMGM4NmM5YzlkM2IucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI1MDkxMCUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNTA5MTBUMDkyMzAwWiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9YzNiZjZkMDk4OWYxNTRmZDI3MDJiZTgwMjc2MTU5ZjQ4NjEyYjU2YWFiOGMwNWRmOTEwNzI4ZjZjZWRmZWQ1ZSZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.96WZ9bP3DZ6v4nPYLmPrm2x8hCrQNT6ZSGZ5K7rSs7A"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is the main repository for my submission to &lt;a href="https://dev.to/challenges/assemblyai" rel="nofollow"&gt;AssemblyAI Challenge&lt;/a&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/milewski/echosense-challenge./esp32" rel="noopener noreferrer"&gt;Esp32&lt;/a&gt;&lt;/strong&gt;: The firmware source code for the ESP32-S3-Zero device.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/milewski/echosense-challenge./frontend" rel="noopener noreferrer"&gt;Frontend&lt;/a&gt;&lt;/strong&gt;: The UI that communicates with the device via websocket.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each subfolder includes instructions for running the project locally.&lt;/p&gt;




&lt;p&gt;For a more detailed overview, including screenshots, you can read the submission sent to the challenge here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71" rel="nofollow"&gt;https://dev.to/milewski/echosense-your-pocket-sized-companion-for-smarter-meetings-3i71&lt;/a&gt;&lt;/p&gt;

&lt;/div&gt;
&lt;br&gt;
&lt;br&gt;
  &lt;/div&gt;
&lt;br&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/milewski/echosense-challenge" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;br&gt;
&lt;/div&gt;
&lt;br&gt;


&lt;h2&gt;
  
  
  Screenshots
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnnmmbpb39o3p2y0pk4g.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnnmmbpb39o3p2y0pk4g.jpg" alt="demo-1" width="800" height="623"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzdqibcu2qpwlym8nvu9e.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzdqibcu2qpwlym8nvu9e.jpg" alt="demo-2" width="800" height="623"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4svrtiynjivoo02mktwy.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4svrtiynjivoo02mktwy.jpg" alt="demo-3" width="800" height="327"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Journey
&lt;/h2&gt;

&lt;p&gt;When powered on, the device automatically connects to the configured Wi-Fi network and requests a temporary token from AssemblyAI, valid for one hour. It establishes a real-time transcription WebSocket connection and generates a local network URL, displayed as a QR code on the OLED screen.&lt;/p&gt;

&lt;p&gt;The QR code directs users to the device’s IP address, where a web server runs on port 80. The server hosts a Vue.js-based interface, with all assets (CSS, JS, images) inlined into a single minified and mangled HTML file. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;This optimization ensures minimal memory usage—essential in a resource-constrained environment where every byte counts.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;As the user speaks, audio is streamed in ~500ms chunks, sampled at 16000Hz in PCM16 format, via the WebSocket connection to AssemblyAI. Transcriptions are returned and displayed live to any user who scans the QR code. Simultaneously, the audio is saved locally on the device’s SD card for further use.&lt;/p&gt;
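&lt;p&gt;From the parameters stated above, the size of each streamed chunk follows directly; as a quick sanity check:&lt;/p&gt;

```python
SAMPLE_RATE_HZ = 16_000  # sampling rate stated above
BYTES_PER_SAMPLE = 2     # PCM16 = 16-bit samples
CHUNK_SECONDS = 0.5      # ~500ms per chunk

samples_per_chunk = int(SAMPLE_RATE_HZ * CHUNK_SECONDS)
chunk_bytes = samples_per_chunk * BYTES_PER_SAMPLE
print(samples_per_chunk)  # 8000
print(chunk_bytes)        # 16000, i.e. roughly 16 KB every half second
```

&lt;p&gt;That modest, steady bandwidth is what makes streaming feasible from such a constrained device.&lt;/p&gt;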

&lt;p&gt;The following diagrams illustrate this functionality:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnvpxq0dfc7y0zj0t49np.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnvpxq0dfc7y0zj0t49np.jpg" alt="diagram-1" width="800" height="637"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7f5gva9jjblpph640q7m.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7f5gva9jjblpph640q7m.jpg" alt="diagram-2" width="800" height="560"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Prompts Qualification
&lt;/h2&gt;

&lt;p&gt;My submission qualifies for two prompts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Really Rad Real-Time&lt;/li&gt;
&lt;li&gt;No More Monkey Business&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Incomplete Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;The SD card was initially intended to store recordings and later attach them to emails. However, I realized that file sizes would grow too large, exceeding email attachment limits. To address this, a backend would be required to receive and convert the files from raw PCM16 to MP3. Since this wasn't the main focus of the challenge, I left this feature unfinished, as it would require building and hosting a backend.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Currently, there’s no way to configure Wi-Fi, API keys, or recording options via the web UI. All keys are injected at build time during compilation. Ideally, users would set up the device via a local Wi-Fi connection between their phone and the device, but this setup would require additional work.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I had planned to design and 3D print a case, possibly as a cube, to align with names like MeetingBox or MetaCube. Unfortunately, I didn’t have time to complete this, so the prototype was built and presented on a breadboard.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;If anyone has any questions, feel free to ask below or open an issue on GitHub—I’ll be happy to help!&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>assemblyaichallenge</category>
      <category>ai</category>
      <category>api</category>
    </item>
    <item>
      <title>DearBook: Create Magical Illustrated Children's Stories with AI</title>
      <dc:creator>Rafael Milewski</dc:creator>
      <pubDate>Sun, 10 Nov 2024 05:43:59 +0000</pubDate>
      <link>https://forem.com/milewski/dearbook-create-magical-illustrated-childrens-stories-with-ai-4mpe</link>
      <guid>https://forem.com/milewski/dearbook-create-magical-illustrated-childrens-stories-with-ai-4mpe</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/pgai"&gt;Open Source AI Challenge with pgai and Ollama &lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I built an AI-powered children's book generator that creates fully illustrated stories based on user input. The user provides a brief prompt outlining the desired storyline, such as:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Create a fun story about a bird who was afraid to fly.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;In response, an entire illustrated book is generated!&lt;/p&gt;

&lt;p&gt;Users can also explore and read stories created by others. All stories are public and anonymous.&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://dearbook.fun" rel="noopener noreferrer"&gt;https://dearbook.fun&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/JdkwNeeGhNc"&gt;
&lt;/iframe&gt;
&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.dev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/milewski" rel="noopener noreferrer"&gt;
        milewski
      &lt;/a&gt; / &lt;a href="https://github.com/milewski/dearbook-challenge" rel="noopener noreferrer"&gt;
        dearbook-challenge
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Create Magical Illustrated Children's Stories with AI
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;DearBook&lt;/h1&gt;

&lt;/div&gt;
&lt;p&gt;
Create Magical Illustrated Children's Stories with AI
&lt;/p&gt;

&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://github.com/user-attachments/assets/ca158f7f-1966-4be3-8daf-74cd79baca8e"&gt;&lt;img width="1000" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Fuser-attachments%2Fassets%2Fca158f7f-1966-4be3-8daf-74cd79baca8e"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is the main repository for my submission to &lt;a href="https://dev.to/challenges/pgai" rel="nofollow"&gt;The Open Source AI Challenge&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The project is organized into three folders:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/milewski/dearbook-challenge/tree/master/backend" rel="noopener noreferrer"&gt;Backend&lt;/a&gt;&lt;/strong&gt;: Contains the API, Queue, Database, ComfyUI, and Ollama.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/milewski/dearbook-challenge/tree/master/frontend" rel="noopener noreferrer"&gt;Frontend&lt;/a&gt;&lt;/strong&gt;: The UI that communicates with the API.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/milewski/dearbook-challenge/tree/master/infrastructure" rel="noopener noreferrer"&gt;Infrastructure&lt;/a&gt;&lt;/strong&gt;: The Terraform and stack files used to deploy the application on a Docker Swarm cluster.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each subfolder includes instructions for running the project locally. Setup is straightforward: everything has been containerized, so running &lt;code&gt;docker compose up&lt;/code&gt; is all that’s needed.&lt;/p&gt;
&lt;div class="markdown-alert markdown-alert-warning"&gt;
&lt;p class="markdown-alert-title"&gt;Warning&lt;/p&gt;
&lt;p&gt;You need a good NVIDIA GPU to run this project!&lt;/p&gt;
&lt;/div&gt;
&lt;p&gt;For a more detailed overview, including screenshots, you can read the submission sent to the challenge here:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://dev.to/milewski/dearbook-create-magical-illustrated-childrens-stories-with-ai-4mpe" rel="nofollow"&gt;https://dev.to/milewski/dearbook-create-magical-illustrated-childrens-stories-with-ai-4mpe&lt;/a&gt;&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h3 class="heading-element"&gt;Winner announcement:&lt;/h3&gt;

&lt;/div&gt;
&lt;p&gt;&lt;a href="https://dev.to/devteam/congrats-to-the-winners-of-the-open-source-ai-challenge-with-pgai-and-ollama-46b6" rel="nofollow"&gt;https://dev.to/devteam/congrats-to-the-winners-of-the-open-source-ai-challenge-with-pgai-and-ollama-46b6&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;



&lt;/div&gt;
&lt;br&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/milewski/dearbook-challenge" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;br&gt;
&lt;/div&gt;
&lt;br&gt;


&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpebracnvsskq7wny540a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpebracnvsskq7wny540a.png" alt="Home Page" width="800" height="608"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flr1pwsxthvfh3fbtdlod.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flr1pwsxthvfh3fbtdlod.png" alt="Book Creation" width="800" height="608"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbfj3rarbdudqea8b89nx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbfj3rarbdudqea8b89nx.png" alt="Book View" width="800" height="608"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Tools Used
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;TimescaleDB&lt;/strong&gt;: The self-hosted Docker version was used.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;pgvector&lt;/strong&gt;: Used to store the embedding of the book's storyline, along with other context that is useful for search. (&lt;a href="https://github.com/milewski/dearbook-challenge/blob/6847dafa2ab9dfa0f99ba22434e665113dd1eabb/backend/database/migrations/2024_11_01_131325_create_books_table.php#L31" rel="noopener noreferrer"&gt;source&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;pgvectorscale&lt;/strong&gt;: The &lt;strong&gt;StreamingDiskANN&lt;/strong&gt; index type was applied to the book's searchable embedding to speed up similarity search. (&lt;a href="https://github.com/milewski/dearbook-challenge/blob/6847dafa2ab9dfa0f99ba22434e665113dd1eabb/backend/database/migrations/2024_11_01_131325_create_books_table.php#L37" rel="noopener noreferrer"&gt;source&lt;/a&gt;)&lt;/p&gt;
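
&lt;p&gt;For reference, a StreamingDiskANN index on a vector column is declared like this. The table and column names below are illustrative; the actual definition lives in the linked migration:&lt;/p&gt;

```sql
-- Hypothetical table/column names, sketched from the pgvectorscale docs.
CREATE EXTENSION IF NOT EXISTS vectorscale CASCADE;

CREATE INDEX books_embedding_idx
    ON books
    USING diskann (embedding vector_cosine_ops);
```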

&lt;p&gt;&lt;strong&gt;pgai&lt;/strong&gt;: The &lt;code&gt;ai.ollama_embed()&lt;/code&gt; function was used to generate embeddings of the user's input during search. (&lt;a href="https://github.com/milewski/dearbook-challenge/blob/6847dafa2ab9dfa0f99ba22434e665113dd1eabb/backend/app/Services/BookService.php#L34" rel="noopener noreferrer"&gt;source&lt;/a&gt;)&lt;/p&gt;
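
&lt;p&gt;Putting the three extensions together, the search boils down to a single query along these lines. This is a sketch with illustrative table and column names, not the project's exact SQL:&lt;/p&gt;

```sql
-- Embed the user's search text with pgai, then rank books by cosine
-- distance using pgvector's &lt;=&gt; operator (accelerated by the diskann index).
SELECT id, title
FROM books
ORDER BY embedding &lt;=&gt; ai.ollama_embed('mxbai-embed-large', 'a bird afraid to fly')
LIMIT 10;
```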

&lt;p&gt;&lt;strong&gt;Ollama&lt;/strong&gt;: &lt;/p&gt;

&lt;p&gt;It was used for &lt;a href="https://github.com/milewski/dearbook-challenge/blob/6847dafa2ab9dfa0f99ba22434e665113dd1eabb/backend/app/Services/BookService.php#L148" rel="noopener noreferrer"&gt;creating embeddings&lt;/a&gt;, &lt;a href="https://github.com/milewski/dearbook-challenge/blob/6847dafa2ab9dfa0f99ba22434e665113dd1eabb/backend/app/Services/BookService.php#L117-L141" rel="noopener noreferrer"&gt;generating the story of the book&lt;/a&gt;, analyzing whether the user input content &lt;a href="https://github.com/milewski/dearbook-challenge/blob/6847dafa2ab9dfa0f99ba22434e665113dd1eabb/backend/app/Services/BookService.php#L171-L186" rel="noopener noreferrer"&gt;is safe for children&lt;/a&gt;, and &lt;a href="https://github.com/milewski/dearbook-challenge/blob/6847dafa2ab9dfa0f99ba22434e665113dd1eabb/backend/app/Services/BookService.php#L87-L110" rel="noopener noreferrer"&gt;generating an idea prompt for an image generation model.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Two main models were used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://dev.tourl"&gt;llama3.1:8b&lt;/a&gt;: for text generation.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://ollama.com/library/mxbai-embed-large" rel="noopener noreferrer"&gt;mxbai-embed-large&lt;/a&gt;: for embedding.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;ComfyUI&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;It was used to generate the logo, the book cover, and every page, based on the prompts generated by Ollama.&lt;/p&gt;

&lt;p&gt;The main base model utilized was &lt;a href="https://civitai.com/models/133005/juggernaut-xl" rel="noopener noreferrer"&gt;juggernaut-xl&lt;/a&gt;, alongside a few LoRAs. They can all be found in the project &lt;a href="https://github.com/milewski/dearbook-challenge/tree/4e40acaf502197069d4c0903288261a0527eab91/backend/app/Services/ComfyUI/Workflows" rel="noopener noreferrer"&gt;workflow file&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Stack
&lt;/h3&gt;

&lt;p&gt;Overview of other technologies used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;PHP / Laravel / FrankenPHP / WebSocket&lt;/li&gt;
&lt;li&gt;VueJs /  Typescript / Tailwind&lt;/li&gt;
&lt;li&gt;Docker / Docker Swarm / Traefik / Redis&lt;/li&gt;
&lt;li&gt;Terraform / Vultr GPU Cloud&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Technical Description
&lt;/h2&gt;

&lt;p&gt;The book generation process is as follows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5xca4dg6cgl9nabrz3z5.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5xca4dg6cgl9nabrz3z5.jpg" alt="Technical Diagram" width="800" height="894"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The user creates a prompt, which is optional. If left blank, Ollama generates a new story independently.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A preliminary check is performed to ensure the prompt does not contain any violent or inappropriate content for children. If such content is found, the generation process is aborted immediately. &lt;a href="https://github.com/milewski/dearbook-challenge/blob/4e40acaf502197069d4c0903288261a0527eab91/backend/app/Services/BookService.php#L297-L312" rel="noopener noreferrer"&gt;(prompt)&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If the user provides input, the 10 records most similar to the user’s prompt are retrieved based on their embeddings. If no input is provided, the top 10 closest embeddings to each other are retrieved. This is done to prevent the LLM from generating duplicate content. &lt;em&gt;In previous tests, it often produced stories about Max and the paintbrush. With this control, if a story about a paintbrush is generated, it ensures that it is at least not related to Max.&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The LLM is then instructed to create a story with at least 10 paragraphs, ensuring that it doesn’t closely resemble any of the top 10 stories already in the database. &lt;a href="https://github.com/milewski/dearbook-challenge/blob/4e40acaf502197069d4c0903288261a0527eab91/backend/app/Services/BookService.php#L191-L224" rel="noopener noreferrer"&gt;(prompt)&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once the main story is created, each paragraph on its own doesn’t provide enough context for an image generation model to maintain consistency across pages. To address this, a new prompt is given to the LLM to generate context-rich descriptions for each paragraph, including details about the story, main characters, gender, and other relevant information. &lt;a href="https://github.com/milewski/dearbook-challenge/blob/4e40acaf502197069d4c0903288261a0527eab91/backend/app/Services/BookService.php#L269-L290" rel="noopener noreferrer"&gt;(prompt)&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;These prompts are then sent to ComfyUI, which generates the necessary assets.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
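
&lt;p&gt;The control flow above can be sketched as follows. This is a minimal TypeScript sketch with stubbed, hypothetical function names, not the repository's actual (PHP) API:&lt;/p&gt;

```typescript
// Minimal sketch of the generation pipeline described above.
// Every step is injected as a dependency, so only the flow is shown here;
// all names are illustrative, not the repository's actual API.
type Deps = {
    isSafeForChildren: (prompt: string) => boolean
    findSimilarStories: (prompt: string, limit: number) => string[]
    generateStory: (prompt: string, avoid: string[]) => string[]    // paragraphs
    describeForIllustration: (paragraph: string) => string          // image prompt
}

function generateBook(prompt: string, deps: Deps) {
    // 2. Abort immediately if the prompt is not child-safe.
    if (!deps.isSafeForChildren(prompt)) {
        throw new Error('Prompt rejected: not suitable for children')
    }

    // 3. Fetch the 10 nearest stories so the LLM can avoid duplicating them.
    const similar = deps.findSimilarStories(prompt, 10)

    // 4. Generate the story, steering away from the retrieved ones.
    const paragraphs = deps.generateStory(prompt, similar)

    // 5. Turn each paragraph into a context-rich image prompt for ComfyUI.
    const imagePrompts = paragraphs.map(deps.describeForIllustration)

    return { paragraphs, imagePrompts }
}
```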

&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;I certainly learned a lot through this experience. I hadn’t had much exposure to vector embeddings before, but the concept has finally clicked.&lt;/p&gt;

&lt;p&gt;Also, knowing that pgai lets me interact with LLMs directly in SQL gives me more ideas than I have time to execute. I had also never configured a server with NVIDIA GPUs, and now I understand many of the challenges involved. My current setup is a Docker Swarm cluster with three nodes: one for ComfyUI, one for Ollama, and one for CPU-based apps. Getting the NVIDIA runtime working on Swarm was troublesome, but it was a valuable learning experience.&lt;/p&gt;

&lt;p&gt;I intend to keep this demo app running until the end of the challenge; after that, anyone curious to see it can host it on their own computer. All the Docker files and instructions for running it are in the GitHub repository.&lt;/p&gt;




&lt;h3&gt;
  
  
  Prize qualifications
&lt;/h3&gt;

&lt;p&gt;I believe my submission qualifies for the following additional prize categories:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Open-source Models from Ollama&lt;/strong&gt;: I used &lt;a href="https://ollama.com/library/llama3.1" rel="noopener noreferrer"&gt;llama3.1:8b&lt;/a&gt; and &lt;a href="https://ollama.com/library/mxbai-embed-large" rel="noopener noreferrer"&gt;mxbai-embed-large&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;All the Extensions&lt;/strong&gt;: I used &lt;strong&gt;pgvector&lt;/strong&gt;, &lt;strong&gt;pgai&lt;/strong&gt; and &lt;strong&gt;pgvectorscale&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;del&gt;Vectorizer Vibe: I could not use pgai Vectorizer because &lt;a href="https://github.com/timescale/pgai/blob/a01e6208e81942b289970feebfc96bafb95c3fcc/projects/extension/sql/ai--0.4.0.sql#L1592-L1601" rel="noopener noreferrer"&gt;it was not implemented for Ollama&lt;/a&gt;. So I rolled my own queue solution.&lt;/del&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>devchallenge</category>
      <category>pgaichallenge</category>
      <category>database</category>
      <category>ai</category>
    </item>
    <item>
      <title>Noisy Monsters 🎶👾</title>
      <dc:creator>Rafael Milewski</dc:creator>
      <pubDate>Sun, 13 Oct 2024 18:01:24 +0000</pubDate>
      <link>https://forem.com/milewski/noisy-monsters-4lki</link>
      <guid>https://forem.com/milewski/noisy-monsters-4lki</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/pinata"&gt;The Pinata Challenge &lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;I built a music app where you place monsters on a scene and record custom audio for each of them. The recordings can be anything: mouth beats, hand-made noises, or whatever sound you can think of. Each monster loops its sound, turning the scene into a lively, messy audio experience!&lt;/p&gt;

&lt;p&gt;All visual assets were created using ComfyUI.&lt;/p&gt;

&lt;p&gt;Tech used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;PixiJs&lt;/li&gt;
&lt;li&gt;ToneJs&lt;/li&gt;
&lt;li&gt;Vue&lt;/li&gt;
&lt;li&gt;ComfyUI (SDXL &amp;amp; Flux Dev)&lt;/li&gt;
&lt;li&gt;Pinata&lt;/li&gt;
&lt;li&gt;Node for the backend API&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;Try it out here: &lt;a href="https://noisy-monsters.onrender.com" rel="noopener noreferrer"&gt;https://noisy-monsters.onrender.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can also share your creations and let others experience them:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://noisy-monsters.onrender.com?share=QmUzrpzHcwDfKuTqpV19L3mdZ8J84XDE8WCi3Fthx1z8Ub" rel="noopener noreferrer"&gt;https://noisy-monsters.onrender.com?share=QmUzrpzHcwDfKuTqpV19L3mdZ8J84XDE8WCi3Fthx1z8Ub&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  My Code
&lt;/h2&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.dev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/milewski" rel="noopener noreferrer"&gt;
        milewski
      &lt;/a&gt; / &lt;a href="https://github.com/milewski/pinata-challenge" rel="noopener noreferrer"&gt;
        pinata-challenge
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;Pinata Challenge&lt;/h1&gt;

&lt;/div&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://private-user-images.githubusercontent.com/2874967/376057878-3514ba7e-911b-4470-a83c-7e0230ad94fa.png?jwt=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NTc0OTY0NzgsIm5iZiI6MTc1NzQ5NjE3OCwicGF0aCI6Ii8yODc0OTY3LzM3NjA1Nzg3OC0zNTE0YmE3ZS05MTFiLTQ0NzAtYTgzYy03ZTAyMzBhZDk0ZmEucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI1MDkxMCUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNTA5MTBUMDkyMjU4WiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9ZTRiMTk4NjUyZDdhZWVhYjY5NjkzNjZkOGI2YzhkN2VkN2ViMTFmMGM2Njc0NmI3OTlhMmM3MGE3OGEzZjg4OSZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.nl4lz7BEcnnl9Pj781quwGHJEgAApJqt59rueoHEygg"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fprivate-user-images.githubusercontent.com%2F2874967%2F376057878-3514ba7e-911b-4470-a83c-7e0230ad94fa.png%3Fjwt%3DeyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3NTc0OTY0NzgsIm5iZiI6MTc1NzQ5NjE3OCwicGF0aCI6Ii8yODc0OTY3LzM3NjA1Nzg3OC0zNTE0YmE3ZS05MTFiLTQ0NzAtYTgzYy03ZTAyMzBhZDk0ZmEucG5nP1gtQW16LUFsZ29yaXRobT1BV1M0LUhNQUMtU0hBMjU2JlgtQW16LUNyZWRlbnRpYWw9QUtJQVZDT0RZTFNBNTNQUUs0WkElMkYyMDI1MDkxMCUyRnVzLWVhc3QtMSUyRnMzJTJGYXdzNF9yZXF1ZXN0JlgtQW16LURhdGU9MjAyNTA5MTBUMDkyMjU4WiZYLUFtei1FeHBpcmVzPTMwMCZYLUFtei1TaWduYXR1cmU9ZTRiMTk4NjUyZDdhZWVhYjY5NjkzNjZkOGI2YzhkN2VkN2ViMTFmMGM2Njc0NmI3OTlhMmM3MGE3OGEzZjg4OSZYLUFtei1TaWduZWRIZWFkZXJzPWhvc3QifQ.nl4lz7BEcnnl9Pj781quwGHJEgAApJqt59rueoHEygg" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;This Repo is my submission to the dev.to challenge: &lt;a href="https://dev.to/challenges/pinata" rel="nofollow"&gt;https://dev.to/challenges/pinata&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Demo: &lt;a href="https://noisy-monsters.onrender.com?share=QmUzrpzHcwDfKuTqpV19L3mdZ8J84XDE8WCi3Fthx1z8Ub" rel="nofollow noopener noreferrer"&gt;https://noisy-monsters.onrender.com&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;



&lt;/div&gt;
&lt;br&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/milewski/pinata-challenge" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;br&gt;
&lt;/div&gt;
&lt;br&gt;


&lt;h2&gt;
  
  
  More Details
&lt;/h2&gt;

&lt;p&gt;The app uses Pinata to store and share the state of the scene along with the user-recorded audio. When you click "Share", all recorded audio is uploaded to Pinata's IPFS, along with the app's state (the monster positions and which sound is associated with each monster) in JSON format. The resulting CID is embedded in the URL as &lt;code&gt;?share=cid&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;When someone opens the app with a &lt;code&gt;?share=cid&lt;/code&gt; link, the app fetches the JSON data from IPFS and reconstructs the scene with all the monsters and sounds intact, allowing users to share their creations easily!&lt;/p&gt;
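
&lt;p&gt;The share round-trip is simple enough to sketch. The Pinata upload itself is left out here; the state shape and the function names are illustrative, not the app's actual code:&lt;/p&gt;

```typescript
// Sketch of the share/restore flow described above: the scene state is
// serialized to JSON, pinned to IPFS (not shown), and the returned CID
// travels in the URL as ?share=cid.
type SceneState = {
    monsters: { x: number; y: number; soundCid: string }[]
}

// Serialize the scene before uploading it to Pinata.
function encodeState(state: SceneState): string {
    return JSON.stringify(state)
}

// Build the shareable link once Pinata returns a CID.
function buildShareUrl(base: string, cid: string): string {
    const url = new URL(base)
    url.searchParams.set('share', cid)
    return url.toString()
}

// On page load, recover the CID (if any) so the scene can be fetched from IPFS.
function cidFromUrl(href: string): string | null {
    return new URL(href).searchParams.get('share')
}
```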

&lt;h2&gt;
  
  
  Notes
&lt;/h2&gt;

&lt;p&gt;I only discovered this challenge two days before the submission deadline, so I had to finish the app as quickly as possible. As a result, there are several features I'd like to improve or add, such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Intro screens and clearer instructions on how to play.&lt;/li&gt;
&lt;li&gt;Hover states for better UI feedback.&lt;/li&gt;
&lt;li&gt;Feedback when the user starts recording.&lt;/li&gt;
&lt;li&gt;Error handling.&lt;/li&gt;
&lt;li&gt;Various code optimizations.&lt;/li&gt;
&lt;li&gt;Add different audio effects / modifiers for each monster.&lt;/li&gt;
&lt;li&gt;Rewrite the code; Vue really wasn't necessary.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Despite the time crunch, I'm excited to share this project and plan to keep improving it!&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>pinatachallenge</category>
      <category>webdev</category>
      <category>api</category>
    </item>
  </channel>
</rss>
