<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Sayed Ali Alkamel</title>
    <description>The latest articles on Forem by Sayed Ali Alkamel (@sayed_ali_alkamel).</description>
    <link>https://forem.com/sayed_ali_alkamel</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg</url>
      <title>Forem: Sayed Ali Alkamel</title>
      <link>https://forem.com/sayed_ali_alkamel</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/sayed_ali_alkamel"/>
    <language>en</language>
    <item>
      <title>[Boost]</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Sun, 15 Mar 2026 12:20:17 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/-3fgm</link>
      <guid>https://forem.com/sayed_ali_alkamel/-3fgm</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/sayed_ali_alkamel" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" alt="sayed_ali_alkamel"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://dev.to/sayed_ali_alkamel/vibe-coding-flutter-the-senior-devs-honest-take-1k0f" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;Vibe Coding Flutter: The Senior Dev's Honest Take&lt;/h2&gt;
      &lt;h3&gt;Sayed Ali Alkamel ・ Mar 13&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#flutter&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#dart&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#ai&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#vibecoding&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
      <category>flutter</category>
      <category>dart</category>
      <category>ai</category>
      <category>vibecoding</category>
    </item>
    <item>
      <title>Vibe Coding Flutter: The Senior Dev's Honest Take</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Fri, 13 Mar 2026 12:46:50 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/vibe-coding-flutter-the-senior-devs-honest-take-1k0f</link>
      <guid>https://forem.com/sayed_ali_alkamel/vibe-coding-flutter-the-senior-devs-honest-take-1k0f</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;I'm a Google Developer Expert in Dart &amp;amp; Flutter. This is my honest, research-backed take on vibe coding — with real tweets, real code, and a real workflow you can use Monday morning.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  ⚡ TL;DR — The Five Things You Need to Know
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Vibe coding is AI-assisted dev where &lt;strong&gt;intent replaces syntax&lt;/strong&gt; — coined by Andrej Karpathy in Feb 2025, now a legitimate workflow&lt;/li&gt;
&lt;li&gt;Flutter is one of the &lt;strong&gt;best frameworks for vibe coding&lt;/strong&gt; because Dart is strongly-typed, widget trees are predictable, and hot-reload is unbeatable&lt;/li&gt;
&lt;li&gt;It genuinely crushes boilerplate, scaffolding, tests, and Figma-to-widget conversion&lt;/li&gt;
&lt;li&gt;It breaks badly on state management, platform channels, and performance tuning — &lt;strong&gt;you still need to read the code&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;The power move: pair &lt;code&gt;AGENTS.md&lt;/code&gt; + MCP servers + a clear PRD, and treat AI as your most productive junior dev&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Where It Started: A Throwaway Tweet That Changed Everything
&lt;/h2&gt;

&lt;p&gt;On February 2, 2025, Andrej Karpathy — OpenAI co-founder, former Tesla AI Senior Director — posted something that wasn't supposed to be a manifesto. It was a shower thought about his weekend hobby.&lt;/p&gt;

&lt;p&gt;It got &lt;strong&gt;millions of views&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-1886192184808149383-570" src="https://platform.twitter.com/embed/Tweet.html?id=1886192184808149383"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"There's a new kind of coding I call 'vibe coding', where you fully give in to the vibes, embrace exponentials, and forget that the code even exists. It's possible because the LLMs are getting too good. I just talk to Composer with SuperWhisper so I barely even touch the keyboard. I accept all, I don't read the diffs anymore. I just see stuff, say stuff, run stuff, and copy paste stuff, and it mostly works."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@karpathy&lt;/strong&gt;, Feb 2, 2025 · &lt;a href="https://x.com/karpathy/status/1886192184808149383" rel="noopener noreferrer"&gt;x.com/karpathy/status/1886192184808149383&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Exactly one year later, Karpathy revisited the post. By then, "vibe coding" had earned a Merriam-Webster entry (March 2025), been named the Collins English Dictionary Word of the Year 2025, and spawned university courses. But his preferred term had quietly evolved:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Today, programming via LLM agents is increasingly becoming a default workflow for professionals, except with more oversight and scrutiny. My current favorite term is 'agentic engineering' — agentic because you're orchestrating agents who write the code, engineering because there's an art &amp;amp; science to it. It's something you can learn and become better at."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@karpathy&lt;/strong&gt;, Feb 2026 retrospective · &lt;a href="https://x.com/karpathy/status/2019137879310836075" rel="noopener noreferrer"&gt;x.com/karpathy/status/2019137879310836075&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This evolution matters. What started as a "forget the code exists" attitude has matured into &lt;strong&gt;deliberate orchestration with oversight&lt;/strong&gt;. And that nuance is exactly what Flutter developers need to internalize.&lt;/p&gt;




&lt;h2&gt;
  
  
  What the Flutter Community Is Actually Saying on X
&lt;/h2&gt;

&lt;p&gt;Twitter / X has been the real-time laboratory for Flutter vibe coding opinions. Here's an honest cross-section — hype, skepticism, and people actually shipping things.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Enthusiasts: "It Unlocks Everything"
&lt;/h3&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-1906004591570825422-677" src="https://platform.twitter.com/embed/Tweet.html?id=1906004591570825422"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Vibe coding with #Flutter basically lets anyone build whatever niche app they want in a couple of hours."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@FlutterCarl&lt;/strong&gt; · &lt;a href="https://x.com/FlutterCarl/status/1906004591570825422" rel="noopener noreferrer"&gt;x.com/FlutterCarl/status/1906004591570825422&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-1900578339770843172-196" src="https://platform.twitter.com/embed/Tweet.html?id=1900578339770843172"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Preparing a complete Vibe coding with Flutter video... Stay tuned 👀"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@mcflyDev (Gautier 💙)&lt;/strong&gt; · &lt;a href="https://x.com/mcflyDev/status/1900578339770843172" rel="noopener noreferrer"&gt;x.com/mcflyDev/status/1900578339770843172&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Can't wait to try Vibe coding Flutter apps in the Vide!"&lt;/em&gt; — reacting to Norbert Kozsir's Flutter AI-IDE that runs and tests widgets it creates, implements pixel-perfect widgets from screenshots, and writes code exactly the way you want.&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;&lt;a class="mentioned-user" href="https://dev.to/csells"&gt;@csells&lt;/a&gt; (Chris Sells)&lt;/strong&gt; · &lt;a href="https://x.com/csells/status/1903182124141908165" rel="noopener noreferrer"&gt;x.com/csells/status/1903182124141908165&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"🔴 LIVE Vibe Coding with @norbertkozsir @devangelslondon and @esratech! #VibeCoding #Flutter #Dart #FlutterCommunity"&lt;/em&gt; — 2,729 Views on the live session.&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@FlutterComm&lt;/strong&gt; · &lt;a href="https://x.com/FlutterComm/status/1914001730355814887" rel="noopener noreferrer"&gt;x.com/FlutterComm/status/1914001730355814887&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  The Pragmatists: "It's a Multiplier, Not a Magic Wand"
&lt;/h3&gt;

&lt;p&gt;Andrea Bizzotto (codewithandrea.com), who maintains one of the most respected Flutter newsletters (22k+ subscribers), captured the nuanced position that most senior Flutter developers share:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"AI is a multiplier that amplifies both your skills and your mistakes. So learn to use it well, and don't feel like you need to go all-in. Sometimes the old-fashioned way of writing code manually is still the right call. The decision matrix is simple: compare prompting effort vs coding effort."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;Andrea Bizzotto&lt;/strong&gt; · &lt;a href="https://codewithandrea.com/newsletter/november-2025/" rel="noopener noreferrer"&gt;codewithandrea.com/newsletter/november-2025&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  The Skeptics: "Don't Actually Forget the Code Exists"
&lt;/h3&gt;

&lt;p&gt;Andrew Chen from a16z pointed out the macro trajectory that makes seasoned engineers uncomfortable:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Random thoughts/predictions on where vibe coding might go: most code will be written by the time-rich. Thus, most code will be written by kids/students rather than software engineers. This is the same trend as video, photos, and other social media — we are in the early innings..."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@andrewchen&lt;/strong&gt; · March 9, 2025&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;And the Dart language team itself signaled where things are heading — not pure vibe coding, but structured, type-safe AI tooling for Dart specifically:&lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-2031499569096528324-155" src="https://platform.twitter.com/embed/Tweet.html?id=2031499569096528324"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"AI apps just got a lot easier to build 🏗️ Genkit Dart (Preview) is officially out, bringing type-safety and a model-agnostic API to the Dart side. Support for Gemini, Claude, OpenAI. Type-safe AI flows. Dev UI for AI testing and traces. #Genkit #Dart #Flutter"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@dart_lang&lt;/strong&gt; · &lt;a href="https://x.com/dart_lang/status/2031499569096528324" rel="noopener noreferrer"&gt;x.com/dart_lang/status/2031499569096528324&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  Why Flutter Is Unusually Well-Suited for Vibe Coding
&lt;/h2&gt;

&lt;p&gt;Most vibe coding discussion centers on web (React, Next.js) or Python backends. But Flutter has structural properties that make it arguably &lt;strong&gt;better suited&lt;/strong&gt; for AI-assisted development than most frameworks. Here's why.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Dart's Strong Typing Catches AI Mistakes Early
&lt;/h3&gt;

&lt;p&gt;When an LLM generates incorrect Flutter code, Dart's type system screams immediately. You don't get mysterious runtime crashes at 2 AM — the &lt;strong&gt;compiler tells you exactly where the AI hallucinated&lt;/strong&gt; a method that doesn't exist.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="c1"&gt;// ❌ AI hallucinated this parameter — Dart catches it at compile time&lt;/span&gt;
&lt;span class="n"&gt;Widget&lt;/span&gt; &lt;span class="nf"&gt;buildCard&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;BuildContext&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;Card&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="nl"&gt;roundedCorners:&lt;/span&gt; &lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// Error: "The named parameter 'roundedCorners' isn't defined"&lt;/span&gt;
    &lt;span class="nl"&gt;child:&lt;/span&gt; &lt;span class="n"&gt;ListTile&lt;/span&gt;&lt;span class="p"&gt;(...),&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// ✅ What it should be — and you know immediately&lt;/span&gt;
&lt;span class="n"&gt;Widget&lt;/span&gt; &lt;span class="nf"&gt;buildCard&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;BuildContext&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;Card&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="nl"&gt;shape:&lt;/span&gt; &lt;span class="n"&gt;RoundedRectangleBorder&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
      &lt;span class="nl"&gt;borderRadius:&lt;/span&gt; &lt;span class="n"&gt;BorderRadius&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;circular&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nl"&gt;child:&lt;/span&gt; &lt;span class="n"&gt;ListTile&lt;/span&gt;&lt;span class="p"&gt;(...),&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Widget Trees Are Declarative — Perfect for AI Reasoning
&lt;/h3&gt;

&lt;p&gt;Flutter's declarative UI model maps almost 1:1 with how LLMs "think" about layout. When you say &lt;em&gt;"add a gradient background to this container with rounded corners and a shadow,"&lt;/em&gt; there's a direct, unambiguous translation to a &lt;code&gt;DecoratedBox&lt;/code&gt; with a &lt;code&gt;BoxDecoration&lt;/code&gt;. The AI doesn't have to guess about lifecycle methods or imperative DOM manipulation.&lt;/p&gt;
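&lt;p&gt;That translation is mechanical enough to verify at a glance. Here's a minimal sketch of the widget that prompt describes — the gradient colors, radius, and shadow values are illustrative, not from any real design:&lt;/p&gt;

```dart
import 'package:flutter/material.dart';

// Sketch: "gradient background, rounded corners, and a shadow" maps
// directly to a DecoratedBox with a BoxDecoration. Values are illustrative.
class GradientPanel extends StatelessWidget {
  const GradientPanel({super.key, required this.child});

  final Widget child;

  @override
  Widget build(BuildContext context) {
    return DecoratedBox(
      decoration: BoxDecoration(
        gradient: const LinearGradient(
          begin: Alignment.topLeft,
          end: Alignment.bottomRight,
          colors: [Colors.indigo, Colors.purple],
        ),
        borderRadius: BorderRadius.circular(16),
        boxShadow: const [
          BoxShadow(blurRadius: 8, offset: Offset(0, 4), color: Colors.black26),
        ],
      ),
      child: child,
    );
  }
}
```

&lt;p&gt;One prompt clause per constructor argument — that's the 1:1 mapping that makes AI output easy to audit visually.&lt;/p&gt;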

&lt;h3&gt;
  
  
  3. Hot Reload Is the Tightest Feedback Loop in Mobile Dev
&lt;/h3&gt;

&lt;p&gt;Vibe coding thrives on fast iteration: &lt;strong&gt;prompt → generate → verify → adjust&lt;/strong&gt;. Flutter's hot reload (now including &lt;a href="https://codewithandrea.com/newsletter/march-2025/" rel="noopener noreferrer"&gt;Flutter Web hot reload (stable since 3.35)&lt;/a&gt;) makes the verify step nearly instantaneous. You see AI output materialise on your device in under a second.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Dart Is Expressive but Readable
&lt;/h3&gt;

&lt;p&gt;Unlike Java or Kotlin, Dart is concise enough that AI-generated code stays readable. Unlike TypeScript in a React codebase, there's no JSX/CSS-in-JS impedance mismatch. A Flutter file generated by AI tends to look like Flutter code a human would write — which makes review dramatically faster.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Truth Matrix: When to Vibe, When to Think
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Task&lt;/th&gt;
&lt;th&gt;Approach&lt;/th&gt;
&lt;th&gt;Why&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Generate 10 model classes from API spec&lt;/td&gt;
&lt;td&gt;✅ Use AI&lt;/td&gt;
&lt;td&gt;Pure mechanical translation, zero judgment needed&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Build a multi-step onboarding flow UI&lt;/td&gt;
&lt;td&gt;✅ Use AI&lt;/td&gt;
&lt;td&gt;Declarative widgets, verify visually with hot reload&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Write 40 widget tests for existing screens&lt;/td&gt;
&lt;td&gt;✅ Use AI&lt;/td&gt;
&lt;td&gt;Pattern-heavy, AI is exceptionally good at this&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Fix a 2px pixel misalignment&lt;/td&gt;
&lt;td&gt;✋ Code it&lt;/td&gt;
&lt;td&gt;Low coding effort, high prompt effort — just fix it&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Architect a real-time sync feature&lt;/td&gt;
&lt;td&gt;✋ Code it&lt;/td&gt;
&lt;td&gt;Requires domain knowledge of your specific constraints&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Implement fingerprint auth with fallback&lt;/td&gt;
&lt;td&gt;🔍 AI draft + review&lt;/td&gt;
&lt;td&gt;AI can scaffold, but you must audit every security line&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Optimise a ListView with 10,000 items&lt;/td&gt;
&lt;td&gt;🔍 AI draft + review&lt;/td&gt;
&lt;td&gt;AI knows the patterns, you need to profile the output&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Rule of thumb from Andrea Bizzotto:&lt;/strong&gt; If the prompting effort is lower than the coding effort → use AI. If manually fixing it is faster than explaining it → just fix it manually.&lt;/p&gt;

&lt;h3&gt;
  
  
  What AI crushes ✅
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Scaffold generation (routes, services, models)&lt;/li&gt;
&lt;li&gt;Figma design → Flutter widget conversion&lt;/li&gt;
&lt;li&gt;Boilerplate (copyWith, fromJson, toJson, Freezed classes)&lt;/li&gt;
&lt;li&gt;Responsive layout skeletons&lt;/li&gt;
&lt;li&gt;Animation scaffolding (implicit animations)&lt;/li&gt;
&lt;li&gt;Simple CRUD screens with Firebase&lt;/li&gt;
&lt;li&gt;README and documentation generation&lt;/li&gt;
&lt;/ul&gt;
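&lt;p&gt;The boilerplate bullet is the clearest win. Hand-written, a model like this is fifteen minutes of tedium; an AI drafts it in seconds, and the shape is trivial to verify. A plain-Dart sketch (field names illustrative — a real project would likely reach for Freezed instead):&lt;/p&gt;

```dart
// Sketch of the model boilerplate AI reliably generates: copyWith,
// fromJson, toJson. Field names are illustrative.
class Product {
  const Product({required this.id, required this.name, required this.price});

  final String id;
  final String name;
  final double price;

  Product copyWith({String? id, String? name, double? price}) => Product(
        id: id ?? this.id,
        name: name ?? this.name,
        price: price ?? this.price,
      );

  factory Product.fromJson(Map json) => Product(
        id: json['id'] as String,
        name: json['name'] as String,
        price: (json['price'] as num).toDouble(),
      );

  Map toJson() => {'id': id, 'name': name, 'price': price};
}
```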

&lt;h3&gt;
  
  
  Where to tread carefully ⚠️
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Complex Riverpod / BLoC state logic&lt;/li&gt;
&lt;li&gt;Platform channel implementations&lt;/li&gt;
&lt;li&gt;Performance tuning (jank, Impeller issues)&lt;/li&gt;
&lt;li&gt;Custom painters and shaders&lt;/li&gt;
&lt;li&gt;Security-sensitive code (auth, storage, network)&lt;/li&gt;
&lt;li&gt;Deep navigation flows with guards&lt;/li&gt;
&lt;li&gt;Accessibility semantics&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  The Production-Grade Vibe Coding Workflow for Flutter
&lt;/h2&gt;

&lt;p&gt;Based on real-world patterns from Viktor Lidholt (Serverpod), the Globe.dev team, and senior Flutter practitioners — here's the workflow that actually holds up.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Write a PRD Before You Touch the AI
&lt;/h3&gt;

&lt;p&gt;A Product Requirements Document doesn't have to be formal — a Markdown file describing the feature, expected behavior, and constraints is enough. AI with context is an entirely different animal from AI without it.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Context is everything.&lt;/strong&gt; It's like assigning a task to a junior engineer without any context — poor delivery is almost guaranteed. — &lt;a href="https://globe.dev/blog/beyond-vibe-coding-production-flutter-dart-ai/" rel="noopener noreferrer"&gt;globe.dev/blog/beyond-vibe-coding-production-flutter-dart-ai&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Step 2: Create an &lt;code&gt;AGENTS.md&lt;/code&gt; in Your Repo Root
&lt;/h3&gt;

&lt;p&gt;This file tells every AI agent about your architecture, state management choice, folder structure, code style, and package preferences. It's the &lt;strong&gt;single highest-leverage action&lt;/strong&gt; you can take. Without it, agents default to whatever they were trained on — which probably isn't your codebase.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="gh"&gt;# AGENTS.md — Flutter App Agent Instructions&lt;/span&gt;

&lt;span class="gu"&gt;## Architecture&lt;/span&gt;
&lt;span class="p"&gt;-&lt;/span&gt; State management: Riverpod (hooks_riverpod 2.x)
&lt;span class="p"&gt;-&lt;/span&gt; Navigation: go_router 13.x with nested routes
&lt;span class="p"&gt;-&lt;/span&gt; Data layer: Repository pattern, Freezed models
&lt;span class="p"&gt;-&lt;/span&gt; Network: Dio with interceptors, no direct http package

&lt;span class="gu"&gt;## Folder Structure&lt;/span&gt;
lib/
  features/          # One folder per feature
    auth/
      data/          # Repositories, DTOs
      domain/        # Models, use cases
      presentation/  # Widgets, controllers
  core/              # Shared utils, theme, routing

&lt;span class="gu"&gt;## Rules&lt;/span&gt;
&lt;span class="p"&gt;-&lt;/span&gt; NEVER use setState in feature widgets, always Riverpod
&lt;span class="p"&gt;-&lt;/span&gt; ALL models must be Freezed + json_serializable
&lt;span class="p"&gt;-&lt;/span&gt; Widget tests are REQUIRED for all new screens
&lt;span class="p"&gt;-&lt;/span&gt; Use const constructors wherever possible
&lt;span class="p"&gt;-&lt;/span&gt; Follow existing theme tokens in core/theme/

&lt;span class="gu"&gt;## MCP Servers in Use&lt;/span&gt;
&lt;span class="p"&gt;-&lt;/span&gt; Figma MCP: for any UI implementation task
&lt;span class="p"&gt;-&lt;/span&gt; Dart MCP: for pub.dev package lookups
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 3: Connect the Right MCP Servers
&lt;/h3&gt;

&lt;p&gt;The &lt;strong&gt;Dart MCP server&lt;/strong&gt;, &lt;strong&gt;Figma MCP server&lt;/strong&gt;, and &lt;strong&gt;Firebase Studio&lt;/strong&gt; dramatically improve output quality. They give agents access to your actual APIs, your actual design specs — instead of making plausible-sounding things up.&lt;/p&gt;

&lt;p&gt;Very Good Ventures published a great resource: &lt;a href="https://verygood.ventures/blog/7-mcp-servers-every-dart-and-flutter-developer-should-know/" rel="noopener noreferrer"&gt;7 MCP Servers Every Dart and Flutter Developer Should Know&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: Plan Mode First, Then Execute
&lt;/h3&gt;

&lt;p&gt;Use Cursor's or Claude Code's planning mode to &lt;strong&gt;generate a plan before any code gets written&lt;/strong&gt;. Review it. Catch architectural misalignments before they become 300-line mistakes. This one habit catches 80% of problems.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: Hot Reload, Inspect, Iterate
&lt;/h3&gt;

&lt;p&gt;Run the generated code immediately. Flutter's hot reload plus DevTools makes verifying AI output faster than reading it line-by-line — for simple UI changes. But &lt;strong&gt;always read state and logic code&lt;/strong&gt; before accepting it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 6: Audit. Don't Just Accept All.
&lt;/h3&gt;

&lt;p&gt;Karpathy's original "Accept All, don't read the diffs" was fine for his personal weekend projects. For anything that ships to users, AI-generated code is your technical debt. Read it. If it's messy, ask the AI to clean it up before accepting.&lt;/p&gt;




&lt;h2&gt;
  
  
  Real Vibe Coding Scenarios That Actually Work in Flutter
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Scenario 1: Weekend Proof of Concept — Liquid Glass iOS Effect
&lt;/h3&gt;

&lt;p&gt;Viktor Lidholt from the Serverpod team vibe-coded a Flutter proof-of-concept for Apple's Liquid Glass effect (introduced in iOS 26) over a weekend.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Obviously not production-level code, but it shows the viability of the approach."&lt;/em&gt; — Viktor Lidholt, &lt;a href="https://serverpod.dev/blog/vibe-coding-flutter" rel="noopener noreferrer"&gt;serverpod.dev/blog/vibe-coding-flutter&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Perfect use case: time-boxed, exploratory, visual feedback. Ship the vibe, then decide if it's worth productionising.&lt;/p&gt;

&lt;h3&gt;
  
  
  Scenario 2: Figma → Flutter with the Figma MCP Server
&lt;/h3&gt;

&lt;p&gt;Using the Figma MCP server + Claude Code, developers are mapping complete design files to Flutter widget trees. The MCP reliably captures dimensional requirements — sizes, spacing, font scales — so the AI output is close enough that human refinement takes minutes, not hours.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Prompt: "Implement the ProductCard from the Figma design"&lt;/span&gt;
&lt;span class="c1"&gt;// Figma MCP provides exact sizes, spacing, colors from the design file&lt;/span&gt;

&lt;span class="kd"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ProductCard&lt;/span&gt; &lt;span class="kd"&gt;extends&lt;/span&gt; &lt;span class="n"&gt;StatelessWidget&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="n"&gt;ProductCard&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="k"&gt;super&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kd"&gt;required&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;product&lt;/span&gt;&lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="kd"&gt;final&lt;/span&gt; &lt;span class="n"&gt;Product&lt;/span&gt; &lt;span class="n"&gt;product&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="nd"&gt;@override&lt;/span&gt;
  &lt;span class="n"&gt;Widget&lt;/span&gt; &lt;span class="n"&gt;build&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;BuildContext&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;Container&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
      &lt;span class="nl"&gt;width:&lt;/span&gt; &lt;span class="mi"&gt;160&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;// from Figma&lt;/span&gt;
      &lt;span class="nl"&gt;decoration:&lt;/span&gt; &lt;span class="n"&gt;BoxDecoration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nl"&gt;color:&lt;/span&gt; &lt;span class="n"&gt;Theme&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;of&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;colorScheme&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;surface&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="nl"&gt;borderRadius:&lt;/span&gt; &lt;span class="n"&gt;BorderRadius&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;circular&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;  &lt;span class="c1"&gt;// from Figma&lt;/span&gt;
        &lt;span class="nl"&gt;boxShadow:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;AppShadows&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;card&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
      &lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="nl"&gt;child:&lt;/span&gt; &lt;span class="n"&gt;Column&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nl"&gt;crossAxisAlignment:&lt;/span&gt; &lt;span class="n"&gt;CrossAxisAlignment&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;start&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="nl"&gt;children:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
          &lt;span class="n"&gt;ClipRRect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="nl"&gt;borderRadius:&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="n"&gt;BorderRadius&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;vertical&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
              &lt;span class="nl"&gt;top:&lt;/span&gt; &lt;span class="n"&gt;Radius&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;circular&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="nl"&gt;child:&lt;/span&gt; &lt;span class="n"&gt;CachedNetworkImage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
              &lt;span class="nl"&gt;imageUrl:&lt;/span&gt; &lt;span class="n"&gt;product&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;imageUrl&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
              &lt;span class="nl"&gt;height:&lt;/span&gt; &lt;span class="mi"&gt;120&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;// from Figma&lt;/span&gt;
              &lt;span class="nl"&gt;fit:&lt;/span&gt; &lt;span class="n"&gt;BoxFit&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;cover&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="p"&gt;),&lt;/span&gt;
          &lt;span class="p"&gt;),&lt;/span&gt;
          &lt;span class="n"&gt;Padding&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="nl"&gt;padding:&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="n"&gt;EdgeInsets&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;all&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;  &lt;span class="c1"&gt;// from Figma&lt;/span&gt;
            &lt;span class="nl"&gt;child:&lt;/span&gt; &lt;span class="n"&gt;Column&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="cm"&gt;/* title, price, rating */&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
          &lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="p"&gt;],&lt;/span&gt;
      &lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Scenario 3: Full Test Suite from Existing Screens
&lt;/h3&gt;

&lt;p&gt;Prompt: &lt;em&gt;"Write widget tests for every screen in the &lt;code&gt;lib/features/&lt;/code&gt; folder, following the existing patterns in &lt;code&gt;test/&lt;/code&gt;."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The AI finds your patterns, replicates them, and generates a scaffold for 20–40 tests in a single pass. Review them, run them, patch the failures. What would take a full day takes two hours.&lt;/p&gt;
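&lt;p&gt;As a rough sketch of what one generated test might look like (the screen, title, and file layout here are hypothetical placeholders, not from a real project):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// Illustrative generated widget test (e.g. in test/features/).
// `ProductListScreen` and the 'Products' title are placeholder names.
testWidgets('ProductListScreen renders a list of products', (tester) async {
  await tester.pumpWidget(
    const MaterialApp(home: ProductListScreen()),
  );
  await tester.pumpAndSettle();

  expect(find.byType(ListView), findsOneWidget);
  expect(find.text('Products'), findsOneWidget);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;The review pass is where your time goes: the scaffold is cheap, but each assertion still needs a human to confirm it tests the right behavior.&lt;/p&gt;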




&lt;h2&gt;
  
  
  The Anti-Patterns That Will Burn You
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;"If you let too much bad code into your project, the models will perform worse over time, and your ability to keep the project clean will suffer."&lt;/strong&gt; — Viktor Lidholt, Serverpod&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;❌ No &lt;code&gt;AGENTS.md&lt;/code&gt;&lt;/strong&gt; — Without explicit instructions, the AI defaults to its training data. It might use BLoC when your project uses Riverpod, old &lt;code&gt;go_router&lt;/code&gt; patterns, or target the wrong platform entirely.&lt;/p&gt;
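&lt;p&gt;A minimal &lt;code&gt;AGENTS.md&lt;/code&gt; for a Flutter project might look something like this (the contents are illustrative, adapt them to your own stack):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# AGENTS.md (illustrative example)

- State management: Riverpod only. Do not introduce BLoC or Provider.
- Navigation: go_router, following the typed-route patterns in lib/router/.
- Run `dart format` and `flutter analyze` before proposing any diff.
- Targets: iOS and Android only. Do not add web-specific code.
&lt;/code&gt;&lt;/pre&gt;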

&lt;p&gt;&lt;strong&gt;❌ Accepting security-sensitive code without audit&lt;/strong&gt; — Documented cases exist of apps deployed with hardcoded secrets, missing authentication checks, and insecure data storage — all AI-generated, none reviewed. Rule: &lt;strong&gt;any code that touches auth, storage, or network gets a human eyeball every time, no exceptions.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;❌ Vibe coding your state management&lt;/strong&gt; — Riverpod, BLoC, and Provider have subtleties around lifecycle, disposal, and async state that LLMs frequently get wrong in non-trivial cases. The AI will generate code that looks correct, compiles cleanly, but leaks memory or produces incorrect state transitions. Tests often don't catch this.&lt;/p&gt;
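&lt;p&gt;One concrete example of the kind of lifecycle detail that gets missed, sketched with Riverpod (the provider, &lt;code&gt;Product&lt;/code&gt;, and &lt;code&gt;apiProvider&lt;/code&gt; names are hypothetical):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// Without autoDispose, this provider outlives the screen and the
// in-flight request is never cancelled -- exactly the kind of leak
// that compiles cleanly and passes naive tests.
final productProvider = FutureProvider.autoDispose&amp;lt;Product&amp;gt;((ref) async {
  final cancelToken = CancelToken(); // from package:dio
  ref.onDispose(cancelToken.cancel); // tie the request to the provider's lifetime
  return ref.read(apiProvider).fetchProduct(cancelToken: cancelToken);
});
&lt;/code&gt;&lt;/pre&gt;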

&lt;p&gt;&lt;strong&gt;❌ No stopping condition on agentic loops&lt;/strong&gt; — Autonomous agents can spiral. Give them bounded tasks with clear acceptance criteria. &lt;code&gt;"Implement the login screen per the Figma design and the PRD"&lt;/code&gt; is a good prompt. &lt;code&gt;"Build the entire app"&lt;/code&gt; is a recipe for a mess you'll spend three days untangling.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Flutter Vibe Coding Stack in 2026
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Claude Code&lt;/strong&gt; — Karpathy's own 2025 year-in-review called it &lt;em&gt;"the first convincing demonstration of what an LLM Agent looks like."&lt;/em&gt; It runs in your environment with your private context, making it the most Flutter-codebase-aware option when paired with &lt;code&gt;AGENTS.md&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cursor&lt;/strong&gt; — The dominant AI IDE with excellent multi-file operations and planning mode. Pairs well with the Dart MCP server.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Gemini CLI + Flutter Extension&lt;/strong&gt; — Google's own answer. The Flutter Extension for Gemini CLI combines the Dart and Flutter MCP Server with additional context and commands — natural choice for Firebase-heavy Flutter projects.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Firebase Studio / DreamFlow&lt;/strong&gt; — Higher-level "vibey" tools where you interact only with the generated app, not the code. Best for non-engineers or pure prototyping, not for production Flutter development.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Dart MCP Server&lt;/strong&gt; — Not a standalone tool but the connective tissue. Every AI agent for Flutter becomes measurably better with it. Add it to your setup before anything else.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Key Insight from Karpathy's 1-Year Retrospective
&lt;/h2&gt;

&lt;p&gt;Karpathy's 2026 anniversary post is the most important document in this space right now. He named what skilled developers are actually doing — and it's not "vibe coding" in the throwaway sense:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Agentic engineering: agentic because the new default is that you are not writing the code directly 99% of the time, you are orchestrating agents who do, and acting as oversight. Engineering to emphasize that there is an art &amp;amp; science and expertise to it."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— &lt;strong&gt;@karpathy&lt;/strong&gt;, Feb 2026 · &lt;a href="https://x.com/karpathy/status/2019137879310836075" rel="noopener noreferrer"&gt;x.com/karpathy/status/2019137879310836075&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This reframe is everything for Flutter developers. You're not abdicating engineering judgment — you're &lt;strong&gt;operating at a higher level of abstraction&lt;/strong&gt;. Your expertise shifts from &lt;em&gt;"how do I write this AnimationController"&lt;/em&gt; to &lt;em&gt;"what is the right interaction model and how do I verify the AI implemented it correctly."&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  The Bottom Line
&lt;/h2&gt;

&lt;p&gt;Vibe coding is real, it's here, and dismissing it is as silly as dismissing hot reload in 2018. But the breathless "just describe your app and it builds itself" narrative sets up a failure mode that is very real for production Flutter development.&lt;/p&gt;

&lt;p&gt;Flutter's architecture gives you a genuine edge. The widget tree is predictable, Dart's type system is loud about mistakes, and hot reload gives you the tightest feedback loop in mobile development. These properties mean you can move fast &lt;em&gt;and&lt;/em&gt; maintain visibility into what the AI is generating.&lt;/p&gt;

&lt;p&gt;Karpathy evolved from "forget the code exists" to "agentic engineering with oversight." &lt;strong&gt;That's the right frame.&lt;/strong&gt; You're the architect, the QA engineer, the tech lead. The AI is your most productive junior developer — high throughput, close review, clear instructions, and never unsupervised in your security code.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Use vibe coding wisely: accelerate where it helps, but don't let it erode your developer skills."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;— Viktor Lidholt, Serverpod · &lt;a href="https://serverpod.dev/blog/vibe-coding-flutter" rel="noopener noreferrer"&gt;serverpod.dev/blog/vibe-coding-flutter&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The Flutter developers who learn to &lt;strong&gt;orchestrate agents well&lt;/strong&gt; — not just prompt and accept — are the ones who will ship extraordinarily fast without accruing the technical debt that kills velocity later.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Have you been vibe coding in Flutter? What's your workflow? Drop it in the comments — I'm genuinely curious what the community has converged on.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If this was useful, follow me here on dev.to and on X [@YourHandle] for more Flutter content.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Tags:&lt;/strong&gt; &lt;code&gt;#flutter&lt;/code&gt; &lt;code&gt;#dart&lt;/code&gt; &lt;code&gt;#ai&lt;/code&gt; &lt;code&gt;#vibecoding&lt;/code&gt; &lt;code&gt;#productivity&lt;/code&gt;&lt;/p&gt;

</description>
      <category>flutter</category>
      <category>dart</category>
      <category>ai</category>
      <category>vibecoding</category>
    </item>
    <item>
      <title>I Built an AI Boardroom App in 8 Hours with Flutter &amp; AI 🚀 (Open Source)</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Mon, 05 Jan 2026 18:43:02 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/i-built-an-ai-boardroom-app-in-8-hours-with-flutter-ai-open-source-1fjm</link>
      <guid>https://forem.com/sayed_ali_alkamel/i-built-an-ai-boardroom-app-in-8-hours-with-flutter-ai-open-source-1fjm</guid>
      <description>&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg0tg12pwhotgtzro60fn.png" alt="App Screenshot 1" width="800" height="1734"&gt;&lt;/th&gt;
&lt;th&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd9xixw662t55riyg93le.png" alt="App Screenshot 2" width="800" height="1734"&gt;&lt;/th&gt;
&lt;th&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2an2qt3bgjscdjgqcew4.png" alt="App Screenshot 3" width="800" height="1734"&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;blockquote&gt;
&lt;p&gt;"What if you could have a board of directors made up of the world's smartest AI models, debating your problems in real-time?"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That was the question. The result? &lt;strong&gt;LLM Council&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;And the craziest part? I built the entire mobile app between &lt;strong&gt;lunch and late dinner&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;As a &lt;strong&gt;Google Developer Expert in Flutter&lt;/strong&gt;, I've built hundreds of apps, but the speed at which we can now ship software using AI tools like &lt;strong&gt;Antigravity&lt;/strong&gt; is frankly mind-blowing.&lt;/p&gt;

&lt;p&gt;Here’s the story of how I took inspiration from an AI legend, fired up my IDE, and shipped a premium cross-platform app in a single day. 👇&lt;/p&gt;




&lt;h2&gt;
  
  
  💡 The Inspiration
&lt;/h2&gt;

&lt;p&gt;It started when I saw &lt;strong&gt;Andrej Karpathy&lt;/strong&gt; (founding member of OpenAI, former Director of AI at Tesla) tweet about his project: &lt;a href="https://github.com/karpathy/llm-council" rel="noopener noreferrer"&gt;llm-council&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;He built a web interface where you can ask a question, and multiple LLMs (GPT-4, Claude, etc.) answer it. Then, they "read" each other's answers and a "Chairman" model synthesizes the best advice.&lt;/p&gt;

&lt;p&gt;I loved the concept. &lt;strong&gt;But I wanted it in my pocket.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I wanted a premium, executive-tier mobile experience. Something that felt like walking into a boardroom. Dark mode, gold accents, smooth animations.&lt;/p&gt;

&lt;p&gt;So I challenged myself: &lt;strong&gt;Can I build this before dinner?&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  🛠️ The Tech Stack
&lt;/h2&gt;

&lt;p&gt;To move fast without breaking things, I stuck to a battle-tested stack:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Flutter 3.6+&lt;/strong&gt;: For that silky smooth 60fps UI on iOS and Android.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bloc &amp;amp; Clean Architecture&lt;/strong&gt;: Because "fast" shouldn't mean "messy code".&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OpenRouter API&lt;/strong&gt;: To access all models (Claude 3.5 Sonnet, GPT-4o, Gemini 1.5 Pro) with one key.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Antigravity&lt;/strong&gt;: The AI coding assistant that acted as my pair programmer on steroids.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  ⚡ The "Lunch to Late Dinner" Sprint (1:00 PM - 9:00 PM)
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1:00 PM - The Setup 🏗️
&lt;/h3&gt;

&lt;p&gt;I didn't waste time on boilerplate. I initialized the Flutter project and set up the domain layer.&lt;br&gt;
&lt;em&gt;User -&amp;gt; Question -&amp;gt; Council -&amp;gt; Deliberation -&amp;gt; Synthesis.&lt;/em&gt;&lt;/p&gt;
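<p>That flow maps almost one-to-one onto the domain entities. A sketch of the shape (names here are illustrative, not necessarily the ones in the repo):</p>

&lt;pre&gt;&lt;code&gt;// Illustrative domain-layer sketch of the flow above.
class Question {
  final String text;
  const Question(this.text);
}

class Deliberation {
  final List&amp;lt;String&amp;gt; memberAnswers; // one answer per council model
  const Deliberation(this.memberAnswers);
}

class Synthesis {
  final String chairmanSummary; // the final, merged answer
  const Synthesis(this.chairmanSummary);
}
&lt;/code&gt;&lt;/pre&gt;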
&lt;h3&gt;
  
  
  2:30 PM - The "Antigravity" Boost 🚀
&lt;/h3&gt;

&lt;p&gt;This is where things got wild. Instead of manually typing out every model class and repository, I used &lt;strong&gt;Antigravity&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Me: &lt;em&gt;"Generate a repository that hits OpenRouter. It needs to handle streaming responses from 4 different models simultaneously."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Antigravity: &lt;em&gt;Done.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;It didn't just write code; it wrote &lt;em&gt;good&lt;/em&gt; code. It handled the &lt;code&gt;Dio&lt;/code&gt; interceptors, the error parsing, and the concurrent &lt;code&gt;Future.wait&lt;/code&gt; calls for the council members.&lt;/p&gt;
&lt;h3&gt;
  
  
  4:30 PM - The UI Polish ✨
&lt;/h3&gt;

&lt;p&gt;A "Council" implies prestige. A standard Material Design look wouldn't cut it.&lt;br&gt;
I went for a "Succession-style" aesthetic:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Deep Navy Backgrounds&lt;/strong&gt; (&lt;code&gt;#0F172A&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gold Accents&lt;/strong&gt; for the active speaker.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Anonymized Peer Reviews&lt;/strong&gt;: Models rank each other blindly (Model A doesn't know Model B wrote the answer).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I implemented &lt;code&gt;flutter_animate&lt;/code&gt; to make the messages slide in. It felt alive.&lt;/p&gt;
&lt;h3&gt;
  
  
  6:30 PM - The Synthesis Logic 🧠
&lt;/h3&gt;

&lt;p&gt;The magic of this app is the &lt;strong&gt;Chairman&lt;/strong&gt;.&lt;br&gt;
The app aggregates all the answers, strips the names, and feeds them back to the Chairman model with the prompt:&lt;br&gt;
&lt;em&gt;"Review these perspectives and provide a synthesized, executive summary."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The result? Answers that are significantly more balanced and nuanced than any single model could provide.&lt;/p&gt;
&lt;h3&gt;
  
  
  8:00 PM - Final Optimizations &amp;amp; Testing 🏁
&lt;/h3&gt;

&lt;p&gt;The last hour was spent on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Adding local persistence with &lt;code&gt;sqflite&lt;/code&gt; so conversations are saved.&lt;/li&gt;
&lt;li&gt;Securing the API key storage.&lt;/li&gt;
&lt;li&gt;Ensuring the "Chairman" animation was buttery smooth.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;9:00 PM&lt;/strong&gt;: Commit, Push, Done. Dinner time.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  🧑‍💻 The Code (Open Source)
&lt;/h2&gt;

&lt;p&gt;I’m making the whole thing open source. You can clone it, put in your own keys, and have your personal AI board of directors.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Check out the repo here:&lt;/strong&gt;&lt;br&gt;
👉 &lt;strong&gt;&lt;a href="https://github.com/sayed3li97/llm_council_app" rel="noopener noreferrer"&gt;github.com/sayed3li97/llm_council_app&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;(Note: If the link is 404, I'm just polishing the README! Check back in 5 mins)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Here is a snippet of how we handle the parallel consultation using Dart's concurrency:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;CouncilSession&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;consult&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;String&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// 1. Fire off requests to all members in parallel&lt;/span&gt;
  &lt;span class="kd"&gt;final&lt;/span&gt; &lt;span class="n"&gt;responses&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;wait&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;members&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;_api&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;ask&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="c1"&gt;// 2. Anonymize and Request Peer Reviews&lt;/span&gt;
  &lt;span class="kd"&gt;final&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;_conductPeerReviews&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="c1"&gt;// 3. Synthesis by Chairman&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;_chairman&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;synthesize&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;responses&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;reviews&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  🚀 Why This Matters
&lt;/h2&gt;

&lt;p&gt;We are entering a new era of development. It's not about typing speed anymore; it's about &lt;strong&gt;architectural vision&lt;/strong&gt; and &lt;strong&gt;tool leverage&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;As a GDE, my advice to developers in 2026 is simple: &lt;strong&gt;Embrace the tools.&lt;/strong&gt;&lt;br&gt;
By using an AI agent like Antigravity, I focused on the &lt;em&gt;product experience&lt;/em&gt;: the animations, the flow, the value, while the AI handled the plumbing.&lt;/p&gt;

&lt;p&gt;I built a production-ready app in 8 hours.&lt;br&gt;
&lt;strong&gt;What will you build?&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;If you enjoyed this, drop a star on the repo and follow me for more Flutter &amp;amp; AI experiments!&lt;/em&gt;&lt;/p&gt;

</description>
      <category>flutter</category>
      <category>dart</category>
      <category>ai</category>
      <category>mobile</category>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Thu, 04 Sep 2025 22:52:31 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/-41g0</link>
      <guid>https://forem.com/sayed_ali_alkamel/-41g0</guid>
      <description>&lt;div class="ltag__link--embedded"&gt;
  &lt;div class="crayons-story "&gt;
  &lt;a href="https://dev.to/sayed_ali_alkamel/the-philosophical-choice-between-sqlite-and-duckdb-for-flutter-developers-43hj" class="crayons-story__hidden-navigation-link"&gt;The Philosophical Choice Between SQLite and DuckDB for Flutter Developers&lt;/a&gt;


  &lt;div class="crayons-story__body crayons-story__body-full_post"&gt;
    &lt;div class="crayons-story__top"&gt;
      &lt;div class="crayons-story__meta"&gt;
        &lt;div class="crayons-story__author-pic"&gt;

          &lt;a href="/sayed_ali_alkamel" class="crayons-avatar  crayons-avatar--l  "&gt;
            &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" alt="sayed_ali_alkamel profile" class="crayons-avatar__image"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
        &lt;div&gt;
          &lt;div&gt;
            &lt;a href="/sayed_ali_alkamel" class="crayons-story__secondary fw-medium m:hidden"&gt;
              Sayed Ali Alkamel
            &lt;/a&gt;
            &lt;div class="profile-preview-card relative mb-4 s:mb-0 fw-medium hidden m:inline-block"&gt;
              
                Sayed Ali Alkamel
                
              
              &lt;div id="story-author-preview-content-2820710" class="profile-preview-card__content crayons-dropdown branded-7 p-4 pt-0"&gt;
                &lt;div class="gap-4 grid"&gt;
                  &lt;div class="-mt-4"&gt;
                    &lt;a href="/sayed_ali_alkamel" class="flex"&gt;
                      &lt;span class="crayons-avatar crayons-avatar--xl mr-2 shrink-0"&gt;
                        &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" class="crayons-avatar__image" alt=""&gt;
                      &lt;/span&gt;
                      &lt;span class="crayons-link crayons-subtitle-2 mt-5"&gt;Sayed Ali Alkamel&lt;/span&gt;
                    &lt;/a&gt;
                  &lt;/div&gt;
                  &lt;div class="print-hidden"&gt;
                    
                      Follow
                    
                  &lt;/div&gt;
                  &lt;div class="author-preview-metadata-container"&gt;&lt;/div&gt;
                &lt;/div&gt;
              &lt;/div&gt;
            &lt;/div&gt;

          &lt;/div&gt;
          &lt;a href="https://dev.to/sayed_ali_alkamel/the-philosophical-choice-between-sqlite-and-duckdb-for-flutter-developers-43hj" class="crayons-story__tertiary fs-xs"&gt;&lt;time&gt;Sep 4 '25&lt;/time&gt;&lt;span class="time-ago-indicator-initial-placeholder"&gt;&lt;/span&gt;&lt;/a&gt;
        &lt;/div&gt;
      &lt;/div&gt;

    &lt;/div&gt;

    &lt;div class="crayons-story__indention"&gt;
      &lt;h2 class="crayons-story__title crayons-story__title-full_post"&gt;
        &lt;a href="https://dev.to/sayed_ali_alkamel/the-philosophical-choice-between-sqlite-and-duckdb-for-flutter-developers-43hj" id="article-link-2820710"&gt;
          The Philosophical Choice Between SQLite and DuckDB for Flutter Developers
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;div class="crayons-story__tags"&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/flutter"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;flutter&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/sql"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;sql&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/duckdb"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;duckdb&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/mobile"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;mobile&lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="crayons-story__bottom"&gt;
        &lt;div class="crayons-story__details"&gt;
          &lt;a href="https://dev.to/sayed_ali_alkamel/the-philosophical-choice-between-sqlite-and-duckdb-for-flutter-developers-43hj" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left"&gt;
            &lt;div class="multiple_reactions_aggregate"&gt;
              &lt;span class="multiple_reactions_icons_container"&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/exploding-head-daceb38d627e6ae9b730f36a1e390fca556a4289d5a41abb2c35068ad3e2c4b5.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/multi-unicorn-b44d6f8c23cdd00964192bedc38af3e82463978aa611b4365bd33a0f1f4f3e97.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/sparkle-heart-5f9bee3767e18deb1bb725290cb151c25234768a0e9a2bd39370c382d02920cf.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
              &lt;/span&gt;
              &lt;span class="aggregate_reactions_counter"&gt;5&lt;span class="hidden s:inline"&gt; reactions&lt;/span&gt;&lt;/span&gt;
            &lt;/div&gt;
          &lt;/a&gt;
            &lt;a href="https://dev.to/sayed_ali_alkamel/the-philosophical-choice-between-sqlite-and-duckdb-for-flutter-developers-43hj#comments" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left flex items-center"&gt;
              Comments


              &lt;span class="hidden s:inline"&gt;Add Comment&lt;/span&gt;
            &lt;/a&gt;
        &lt;/div&gt;
        &lt;div class="crayons-story__save"&gt;
          &lt;small class="crayons-story__tertiary fs-xs mr-2"&gt;
            3 min read
          &lt;/small&gt;
            
              &lt;span class="bm-initial"&gt;
                

              &lt;/span&gt;
              &lt;span class="bm-success"&gt;
                

              &lt;/span&gt;
            
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;

&lt;/div&gt;


</description>
      <category>flutter</category>
      <category>sql</category>
      <category>duckdb</category>
      <category>mobile</category>
    </item>
    <item>
      <title>The Philosophical Choice Between SQLite and DuckDB for Flutter Developers</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Thu, 04 Sep 2025 22:52:16 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/the-philosophical-choice-between-sqlite-and-duckdb-for-flutter-developers-43hj</link>
      <guid>https://forem.com/sayed_ali_alkamel/the-philosophical-choice-between-sqlite-and-duckdb-for-flutter-developers-43hj</guid>
      <description>&lt;p&gt;In the grand architecture of software, our apps are more than just a collection of screens and buttons; they are mechanisms for organizing, interpreting, and presenting information. But what is the fundamental nature of that information? Is it a series of rapid, fleeting moments, or a vast, intricate tapestry of historical record? This is the central question Flutter developers must face when choosing a local data store.&lt;/p&gt;

&lt;p&gt;It is a duality of purpose. You are not just picking a piece of software; you are selecting an engine whose core design philosophy is either to manage the constant, chaotic flow of everyday operations, or to distill profound insights from the mountain of data that accumulates over time.&lt;/p&gt;

&lt;h2&gt;
  
  
  SQLite: The Guardian of the Moment
&lt;/h2&gt;

&lt;p&gt;Imagine a bustling city hall, where a single clerk handles a never-ending queue of citizens. Each interaction is a small, atomic transaction: a birth certificate request, a change of address, a new permit. The clerk’s job is to execute each request with perfect fidelity, ensuring no two people interfere with one another. The system is optimized for exactly this kind of work: fast, reliable, and frequent changes to small pieces of data.&lt;/p&gt;

&lt;p&gt;This, in essence, is the architectural soul of SQLite. As a row-based database, it writes an entire row of data at once, a perfect fit for a note-taking app, a user preferences screen, or a chat history. Its very design is a testament to the power of transactional integrity. It is the perfect tool for when the truth of the matter is about the present state of a single thing.&lt;/p&gt;

&lt;h2&gt;
  
  
  DuckDB: The Architect of History
&lt;/h2&gt;

&lt;p&gt;Now consider a different setting: a grand archive of human knowledge, where a team of researchers is asked to find patterns across millennia of records. They don't just pull up one file; they look for all mentions of a specific phrase, or all documents written on a particular type of paper. Their work is about aggregation, comparison, and analysis across the entire collection. They can't afford to read every single word in every single document. They need a system designed for large-scale investigation.&lt;/p&gt;

&lt;p&gt;This is the brilliant innovation of DuckDB. As a columnar database, it stores data vertically. Instead of storing a full record together, it groups all &lt;code&gt;age&lt;/code&gt; values, all &lt;code&gt;city&lt;/code&gt; values, and all &lt;code&gt;purchase_price&lt;/code&gt; values. When you ask a question like "What is the average price of all purchases in New York?", the engine doesn't have to read every single piece of data in every single row. It goes straight to the &lt;code&gt;city&lt;/code&gt; column and then the &lt;code&gt;purchase_price&lt;/code&gt; column, vectorizing the calculation with blinding speed. It is the perfect tool for when the truth is not in a single point, but in the overarching trend.&lt;/p&gt;
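&lt;p&gt;That question, written as the kind of query DuckDB is built for (table and column names are illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- Only the city and purchase_price columns are scanned;
-- every other column in the table is never touched.
SELECT AVG(purchase_price)
FROM purchases
WHERE city = 'New York';
&lt;/code&gt;&lt;/pre&gt;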

&lt;h2&gt;
  
  
  The Tipping Point: A Matter of Scale and Intention
&lt;/h2&gt;

&lt;p&gt;So, which do you choose? The answer depends entirely on your application’s deepest purpose.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;If your Flutter app's primary function is to read, write, and update discrete, single records, like a personal finance tracker logging a daily expense or a social media feed refreshing a single post, then SQLite is the unequivocal choice. Its maturity, small footprint, and transactional efficiency are unmatched for these tasks.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If your app must perform complex aggregations, generate analytical dashboards, or process large, immutable datasets, such as an app that visualizes thousands of sensor readings or a tool that helps users find trends in their workout history, then DuckDB is the revolutionary engine you need. It will slice through data with a velocity that a transactional database simply cannot replicate.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
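&lt;p&gt;On the SQLite side, the &lt;code&gt;sqflite&lt;/code&gt; package makes the single-record workflow very direct (table and column names are illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;import 'package:sqflite/sqflite.dart';

// One atomic, row-oriented write: the whole expense record
// lands together in a single transaction.
Future&amp;lt;void&amp;gt; logExpense(Database db, double amount, String note) async {
  await db.insert('expenses', {
    'amount': amount,
    'note': note,
    'created_at': DateTime.now().toIso8601String(),
  });
}
&lt;/code&gt;&lt;/pre&gt;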

&lt;p&gt;The ultimate lesson here is one of purpose-built design. You wouldn't use a scalpel to hammer a nail, nor a sledgehammer to perform surgery. The same is true for your database. Recognize the true nature of your app's data, and you will find the correct tool for the job. The path to a truly great application lies not in blindly choosing the most popular solution, but in understanding the beautiful, fundamental duality that governs the very heart of data.&lt;/p&gt;

</description>
      <category>flutter</category>
      <category>sql</category>
      <category>duckdb</category>
      <category>mobile</category>
    </item>
    <item>
      <title>What's New in Flutter 3.32.0? Your Dev Workflow Just Got an Upgrade!</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Tue, 22 Jul 2025 06:34:46 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/whats-new-in-flutter-3320-your-dev-workflow-just-got-an-upgrade-okk</link>
      <guid>https://forem.com/sayed_ali_alkamel/whats-new-in-flutter-3320-your-dev-workflow-just-got-an-upgrade-okk</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/sayed_ali_alkamel" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" alt="sayed_ali_alkamel"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://dev.to/sayed_ali_alkamel/whats-new-in-flutter-3320-your-dev-workflow-just-got-an-upgrade-49h4" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;What's New in Flutter 3.32.0? Your Dev Workflow Just Got an Upgrade!&lt;/h2&gt;
      &lt;h3&gt;Sayed Ali Alkamel ・ Jul 20&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#flutter&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#dart&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#mobile&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#programming&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
      <category>flutter</category>
      <category>dart</category>
      <category>mobile</category>
      <category>programming</category>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Sun, 20 Jul 2025 15:28:53 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/-4ch6</link>
      <guid>https://forem.com/sayed_ali_alkamel/-4ch6</guid>
      <description>&lt;div class="ltag__link--embedded"&gt;
  &lt;div class="crayons-story "&gt;
  &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" class="crayons-story__hidden-navigation-link"&gt;Dive into Google's Agent Development Kit (ADK) to build production-ready AI agents&lt;/a&gt;


  &lt;div class="crayons-story__body crayons-story__body-full_post"&gt;
    &lt;div class="crayons-story__top"&gt;
      &lt;div class="crayons-story__meta"&gt;
        &lt;div class="crayons-story__author-pic"&gt;

          &lt;a href="/sayed_ali_alkamel" class="crayons-avatar  crayons-avatar--l  "&gt;
            &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" alt="sayed_ali_alkamel profile" class="crayons-avatar__image"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
        &lt;div&gt;
          &lt;div&gt;
            &lt;a href="/sayed_ali_alkamel" class="crayons-story__secondary fw-medium m:hidden"&gt;
              Sayed Ali Alkamel
            &lt;/a&gt;
            &lt;div class="profile-preview-card relative mb-4 s:mb-0 fw-medium hidden m:inline-block"&gt;
              
                Sayed Ali Alkamel
                
              
              &lt;div id="story-author-preview-content-2674524" class="profile-preview-card__content crayons-dropdown branded-7 p-4 pt-0"&gt;
                &lt;div class="gap-4 grid"&gt;
                  &lt;div class="-mt-4"&gt;
                    &lt;a href="/sayed_ali_alkamel" class="flex"&gt;
                      &lt;span class="crayons-avatar crayons-avatar--xl mr-2 shrink-0"&gt;
                        &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" class="crayons-avatar__image" alt=""&gt;
                      &lt;/span&gt;
                      &lt;span class="crayons-link crayons-subtitle-2 mt-5"&gt;Sayed Ali Alkamel&lt;/span&gt;
                    &lt;/a&gt;
                  &lt;/div&gt;
                  &lt;div class="print-hidden"&gt;
                    
                      Follow
                    
                  &lt;/div&gt;
                  &lt;div class="author-preview-metadata-container"&gt;&lt;/div&gt;
                &lt;/div&gt;
              &lt;/div&gt;
            &lt;/div&gt;

          &lt;/div&gt;
          &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" class="crayons-story__tertiary fs-xs"&gt;&lt;time&gt;Jul 10 '25&lt;/time&gt;&lt;span class="time-ago-indicator-initial-placeholder"&gt;&lt;/span&gt;&lt;/a&gt;
        &lt;/div&gt;
      &lt;/div&gt;

    &lt;/div&gt;

    &lt;div class="crayons-story__indention"&gt;
      &lt;h2 class="crayons-story__title crayons-story__title-full_post"&gt;
        &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" id="article-link-2674524"&gt;
          Dive into Google's Agent Development Kit (ADK) to build production-ready AI agents
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;div class="crayons-story__tags"&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/ai"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;ai&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/googlecloud"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;googlecloud&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/aiagents"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;aiagents&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/python"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;python&lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="crayons-story__bottom"&gt;
        &lt;div class="crayons-story__details"&gt;
          &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left"&gt;
            &lt;div class="multiple_reactions_aggregate"&gt;
              &lt;span class="multiple_reactions_icons_container"&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/exploding-head-daceb38d627e6ae9b730f36a1e390fca556a4289d5a41abb2c35068ad3e2c4b5.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/multi-unicorn-b44d6f8c23cdd00964192bedc38af3e82463978aa611b4365bd33a0f1f4f3e97.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/sparkle-heart-5f9bee3767e18deb1bb725290cb151c25234768a0e9a2bd39370c382d02920cf.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
              &lt;/span&gt;
              &lt;span class="aggregate_reactions_counter"&gt;5&lt;span class="hidden s:inline"&gt; reactions&lt;/span&gt;&lt;/span&gt;
            &lt;/div&gt;
          &lt;/a&gt;
            &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa#comments" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left flex items-center"&gt;
              Comments


              1&lt;span class="hidden s:inline"&gt; comment&lt;/span&gt;
            &lt;/a&gt;
        &lt;/div&gt;
        &lt;div class="crayons-story__save"&gt;
          &lt;small class="crayons-story__tertiary fs-xs mr-2"&gt;
            6 min read
          &lt;/small&gt;
            
              &lt;span class="bm-initial"&gt;
                

              &lt;/span&gt;
              &lt;span class="bm-success"&gt;
                

              &lt;/span&gt;
            
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;

&lt;/div&gt;


</description>
      <category>ai</category>
      <category>googlecloud</category>
      <category>aiagents</category>
      <category>python</category>
    </item>
    <item>
      <title>What's New in Flutter 3.32.0? Your Dev Workflow Just Got an Upgrade!</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Sun, 20 Jul 2025 15:20:04 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/whats-new-in-flutter-3320-your-dev-workflow-just-got-an-upgrade-49h4</link>
      <guid>https://forem.com/sayed_ali_alkamel/whats-new-in-flutter-3320-your-dev-workflow-just-got-an-upgrade-49h4</guid>
      <description>&lt;p&gt;Hey Flutter fanatics and dev enthusiasts! Get ready to level up your app-building game because Flutter 3.32.0 has landed, packed with exciting enhancements and thoughtful refinements designed to make your development journey smoother, faster, and more enjoyable than ever before. Let's dive into the highlights of this fantastic new release!&lt;/p&gt;

&lt;h2&gt;
  
  
  See Your Widgets Shine with Enhanced Previews! 🎨
&lt;/h2&gt;

&lt;p&gt;One of the most exciting updates in 3.32.0 is the significant leap forward in Widget Previews. With Flutter Web as the default preview environment and new layouts such as &lt;strong&gt;GridView&lt;/strong&gt; and &lt;strong&gt;ListView&lt;/strong&gt;, visualizing and iterating on your UI components has never been easier. Imagine seeing your designs come to life instantly, allowing for rapid adjustments and a truly iterative development process. This is a game-changer for crafting pixel-perfect interfaces!&lt;/p&gt;

&lt;h2&gt;
  
  
  Impeller Gets Even More Impressive! 🚀
&lt;/h2&gt;

&lt;p&gt;For those who crave buttery-smooth animations and top-tier graphics performance, you'll be thrilled with the latest Impeller updates. This release brings a host of optimizations, including backfilling &lt;strong&gt;TextContents&lt;/strong&gt; unit tests, migrating unit tests off Skia geometry classes, and improving rendering efficiency by adjusting UVs for pixel snapping and increasing glyph atlas resolution. The result? A noticeable boost in performance and even more beautiful visuals for your applications. Get ready for an experience that's truly "impeller-fect"!&lt;/p&gt;

&lt;h2&gt;
  
  
  Cupertino Widgets: A Touch of Native Polish! 🍎
&lt;/h2&gt;

&lt;p&gt;Flutter continues its commitment to providing a truly native look and feel across platforms. In 3.32.0, Cupertino (iOS) widgets receive a delightful set of fixes and additions. You'll find improvements like preventing vertical drag gestures from being blocked in the &lt;strong&gt;CupertinoSheetRoute&lt;/strong&gt; body, the addition of &lt;strong&gt;minWidth&lt;/strong&gt; and &lt;strong&gt;minHeight&lt;/strong&gt; to &lt;strong&gt;CupertinoButton&lt;/strong&gt;, and ensuring that &lt;strong&gt;CupertinoAlertDialog&lt;/strong&gt; dividers span the full width. These refinements add a layer of native elegance to your iOS applications.&lt;/p&gt;

&lt;h2&gt;
  
  
  Tooling That Works Smarter, Not Harder! 🛠️
&lt;/h2&gt;

&lt;p&gt;We all appreciate tools that make our lives easier, and Flutter 3.32.0 delivers on this front with excellent tooling improvements. Expect streamlined development workflows with correctly selected entrypoint targets for web builds, inferred placeholder types for &lt;strong&gt;Gen-l10n&lt;/strong&gt;, and the handy &lt;code&gt;--ignore-timeouts&lt;/code&gt; flag for the &lt;code&gt;flutter test&lt;/code&gt; command. These enhancements are all about making your coding process more efficient and less prone to hiccups.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cross-Platform Stability and Performance! 🌐
&lt;/h2&gt;

&lt;p&gt;Beyond the headline features, Flutter 3.32.0 also includes a variety of &lt;strong&gt;platform-specific fixes&lt;/strong&gt; for Android, iOS, Windows, and Linux. These behind-the-scenes improvements contribute to more stable and performant applications across all supported environments, ensuring a high-quality experience for your users no matter what device they're on.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ready to Dive Deeper?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This release is a testament to Flutter's continuous evolution, focusing on refining the core framework, enhancing developer tools, and ensuring a consistent, high-quality experience across all supported platforms.&lt;/p&gt;

&lt;p&gt;For a comprehensive breakdown of every change, bug fix, and new feature, be sure to check out the official release notes:&lt;/p&gt;

&lt;p&gt;Flutter 3.32.0 Release Notes: &lt;a href="https://docs.flutter.dev/release/release-notes/release-notes-3.32.0" rel="noopener noreferrer"&gt;https://docs.flutter.dev/release/release-notes/release-notes-3.32.0&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go forth, create, and let Flutter 3.32.0 empower your next amazing app! Happy coding!&lt;/p&gt;

</description>
      <category>flutter</category>
      <category>dart</category>
      <category>mobile</category>
      <category>programming</category>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Thu, 10 Jul 2025 14:03:17 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/-53ca</link>
      <guid>https://forem.com/sayed_ali_alkamel/-53ca</guid>
      <description>&lt;div class="ltag__link--embedded"&gt;
  &lt;div class="crayons-story "&gt;
  &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" class="crayons-story__hidden-navigation-link"&gt;Dive into Google's Agent Development Kit (ADK) to build production-ready AI agents&lt;/a&gt;


  &lt;div class="crayons-story__body crayons-story__body-full_post"&gt;
    &lt;div class="crayons-story__top"&gt;
      &lt;div class="crayons-story__meta"&gt;
        &lt;div class="crayons-story__author-pic"&gt;

          &lt;a href="/sayed_ali_alkamel" class="crayons-avatar  crayons-avatar--l  "&gt;
            &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" alt="sayed_ali_alkamel profile" class="crayons-avatar__image"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
        &lt;div&gt;
          &lt;div&gt;
            &lt;a href="/sayed_ali_alkamel" class="crayons-story__secondary fw-medium m:hidden"&gt;
              Sayed Ali Alkamel
            &lt;/a&gt;
            &lt;div class="profile-preview-card relative mb-4 s:mb-0 fw-medium hidden m:inline-block"&gt;
              
                Sayed Ali Alkamel
                
              
              &lt;div id="story-author-preview-content-2674524" class="profile-preview-card__content crayons-dropdown branded-7 p-4 pt-0"&gt;
                &lt;div class="gap-4 grid"&gt;
                  &lt;div class="-mt-4"&gt;
                    &lt;a href="/sayed_ali_alkamel" class="flex"&gt;
                      &lt;span class="crayons-avatar crayons-avatar--xl mr-2 shrink-0"&gt;
                        &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2652218%2F63a5dfd1-8229-48c1-85eb-54a58560297f.jpg" class="crayons-avatar__image" alt=""&gt;
                      &lt;/span&gt;
                      &lt;span class="crayons-link crayons-subtitle-2 mt-5"&gt;Sayed Ali Alkamel&lt;/span&gt;
                    &lt;/a&gt;
                  &lt;/div&gt;
                  &lt;div class="print-hidden"&gt;
                    
                      Follow
                    
                  &lt;/div&gt;
                  &lt;div class="author-preview-metadata-container"&gt;&lt;/div&gt;
                &lt;/div&gt;
              &lt;/div&gt;
            &lt;/div&gt;

          &lt;/div&gt;
          &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" class="crayons-story__tertiary fs-xs"&gt;&lt;time&gt;Jul 10 '25&lt;/time&gt;&lt;span class="time-ago-indicator-initial-placeholder"&gt;&lt;/span&gt;&lt;/a&gt;
        &lt;/div&gt;
      &lt;/div&gt;

    &lt;/div&gt;

    &lt;div class="crayons-story__indention"&gt;
      &lt;h2 class="crayons-story__title crayons-story__title-full_post"&gt;
        &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" id="article-link-2674524"&gt;
          Dive into Google's Agent Development Kit (ADK) to build production-ready AI agents
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;div class="crayons-story__tags"&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/ai"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;ai&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/googlecloud"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;googlecloud&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/aiagents"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;aiagents&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/python"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;python&lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="crayons-story__bottom"&gt;
        &lt;div class="crayons-story__details"&gt;
          &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left"&gt;
            &lt;div class="multiple_reactions_aggregate"&gt;
              &lt;span class="multiple_reactions_icons_container"&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/exploding-head-daceb38d627e6ae9b730f36a1e390fca556a4289d5a41abb2c35068ad3e2c4b5.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/multi-unicorn-b44d6f8c23cdd00964192bedc38af3e82463978aa611b4365bd33a0f1f4f3e97.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/sparkle-heart-5f9bee3767e18deb1bb725290cb151c25234768a0e9a2bd39370c382d02920cf.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
              &lt;/span&gt;
              &lt;span class="aggregate_reactions_counter"&gt;5&lt;span class="hidden s:inline"&gt; reactions&lt;/span&gt;&lt;/span&gt;
            &lt;/div&gt;
          &lt;/a&gt;
            &lt;a href="https://dev.to/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa#comments" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left flex items-center"&gt;
              Comments


              1&lt;span class="hidden s:inline"&gt; comment&lt;/span&gt;
            &lt;/a&gt;
        &lt;/div&gt;
        &lt;div class="crayons-story__save"&gt;
          &lt;small class="crayons-story__tertiary fs-xs mr-2"&gt;
            6 min read
          &lt;/small&gt;
            
              &lt;span class="bm-initial"&gt;
                

              &lt;/span&gt;
              &lt;span class="bm-success"&gt;
                

              &lt;/span&gt;
            
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;

&lt;/div&gt;


</description>
      <category>ai</category>
      <category>googlecloud</category>
      <category>aiagents</category>
      <category>python</category>
    </item>
    <item>
      <title>Dive into Google's Agent Development Kit (ADK) to build production-ready AI agents</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Thu, 10 Jul 2025 10:06:26 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa</link>
      <guid>https://forem.com/sayed_ali_alkamel/dive-into-googles-agent-development-kit-adk-to-build-production-ready-ai-agents-goa</guid>
      <description>&lt;p&gt;The landscape of artificial intelligence is undergoing a profound transformation. What began with simple chatbots and reactive AI assistants is rapidly evolving into a world dominated by &lt;strong&gt;agentic AI&lt;/strong&gt; – autonomous systems capable of understanding complex goals, planning their own steps, executing tasks, and even self-correcting without constant human intervention. This evolution positions AI agents not merely as tools but as digital collaborators, poised to redefine workflows across industries. The future of AI agents is characterized by sophisticated capabilities such as reflection, advanced reasoning through chain-of-thought processes, robust memory systems, and enhanced user experiences.   &lt;/p&gt;

&lt;p&gt;At the forefront of this shift is Google's &lt;strong&gt;Agent Development Kit (ADK)&lt;/strong&gt;, an open-source framework designed to empower developers in building these intelligent, production-ready agentic systems. ADK is not just a theoretical construct; it powers critical components within Google's own ecosystem, including Agentspace and the Google Customer Engagement Suite. Its emergence addresses the growing need for structured, scalable solutions in the increasingly complex field of AI agent development.&lt;/p&gt;

&lt;p&gt;This progression from simple AI assistants to autonomous AI agents marks a fundamental shift in how AI is leveraged. It moves beyond merely responding to commands to enabling autonomous entities that proactively act on defined goals. This implies a higher level of trust and sophistication required from both developers in designing these systems and users in interacting with them. The frameworks supporting this transition, like ADK, must therefore provide robust mechanisms for control, evaluation, and safety, which ADK aims to address through its structured approach and built-in features. This also suggests that developers will increasingly focus on "agent orchestration" and "tool integration" rather than solely on "prompt engineering."   &lt;/p&gt;

&lt;p&gt;Furthermore, the increasing complexity of real-world tasks and the demand for "intelligent automation" directly drive the need for multi-agent systems and frameworks such as ADK. A single agent often becomes a bottleneck when faced with intricate problems. The development community has recognized this, leading to a focus on "multi-agent by design" architectures, where "teams of specialized bots collaborate" and frameworks facilitate "orchestrating complex workflows". This signifies that as AI applications mature and tackle more intricate real-world problems—such as supply chain optimization, disaster response, or financial fraud detection—a single, monolithic AI is insufficient. The clear causal link here is that complex problems necessitate collaborative, specialized AI entities, which in turn require frameworks specifically built for multi-agent orchestration like ADK.&lt;/p&gt;




&lt;h2&gt;
  
  
  What is Google Agent Development Kit (ADK)?
&lt;/h2&gt;

&lt;p&gt;Google's Agent Development Kit (ADK) is an open-source, flexible, and modular framework specifically engineered for building, running, and evaluating AI agents. Its design philosophy centers on making agent development feel more akin to traditional software development, providing developers with familiar paradigms for creating sophisticated AI solutions.   &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Building Blocks&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;ADK is constructed around several core components that enable its powerful capabilities:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Agent&lt;/strong&gt;: This is the fundamental unit within ADK, designed to perform specific jobs. ADK offers a diverse range of agent types to suit various needs. The &lt;code&gt;LlmAgent&lt;/code&gt; is driven by Large Language Models (LLMs) for planning and reasoning, while &lt;code&gt;WorkflowAgent&lt;/code&gt; types, such as &lt;code&gt;SequentialAgent&lt;/code&gt;, &lt;code&gt;ParallelAgent&lt;/code&gt;, and &lt;code&gt;LoopAgent&lt;/code&gt;, enable deterministic control over task execution. For maximum adaptability, developers can also create &lt;code&gt;Custom Agents&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Tools&lt;/strong&gt;: Tools are functions or capabilities that extend an agent's abilities, allowing it to interact with external systems. This includes capabilities like searching the web, executing code, reading documents, and making API calls. ADK provides a rich tool ecosystem, supporting pre-built tools, Model Context Protocol (MCP) tools, integrations with third-party libraries like LangChain and LlamaIndex, and even the unique ability for agents to call other agents as if they were simple functions, enabling multi-agent hierarchies.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Memory &amp;amp; State&lt;/strong&gt;: To facilitate coherent and continuous interactions, ADK incorporates robust memory management. A &lt;code&gt;Session&lt;/code&gt; represents an ongoing interaction, managing its short-term &lt;code&gt;State&lt;/code&gt; (current context). &lt;code&gt;Memory&lt;/code&gt; provides longer-term recall across sessions, preventing the common "context amnesia" often observed in basic chatbots.&lt;br&gt;
&lt;code&gt;Artifact Management&lt;/code&gt; further enhances this by providing a mechanism for agents to manage and store files and data blobs (e.g., generated CSVs or images) associated with a session.  &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Orchestration/Runner&lt;/strong&gt;: The &lt;code&gt;Runner&lt;/code&gt; acts as the orchestrator within ADK, managing the flow of events between the user, the agent, and its tools, ensuring everything executes in the correct order. ADK supports flexible orchestration patterns. This can be achieved through &lt;code&gt;WorkflowAgents&lt;/code&gt; for predictable pipelines (e.g., Sequential, Parallel, Loop execution) or through LLM-driven dynamic routing, where an &lt;code&gt;LlmAgent&lt;/code&gt; can transfer control or delegate tasks. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Code Execution&lt;/strong&gt;: A powerful capability within ADK is its support for dynamic code execution, allowing agents to write and run code during their process to solve complex problems or automate tasks.   &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Planning&lt;/strong&gt;: ADK agents are equipped with advanced planning features, enabling them to break down complex goals into a sequence of steps, determining the optimal approach using their available tools and reasoning capabilities.   &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Models&lt;/strong&gt;: ADK is designed to be model-agnostic, supporting various Large Language Model (LLM) providers, including OpenAI, Anthropic, and local models, often facilitated via LiteLLM. However, it is specifically optimized for seamless integration with Google's Gemini models and the broader Google Cloud ecosystem.   &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
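The Agent/Tools/Runner split above can be sketched in a few lines of plain Python. Note this is not the ADK API — the names here (SimpleAgent, run_sequential) are hypothetical, illustrating only the pattern of tools-as-functions and a sequential workflow:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class SimpleAgent:
    """A stand-in for an agent: a name plus a registry of callable tools."""
    name: str
    tools: dict[str, Callable[[str], str]] = field(default_factory=dict)

    def act(self, task: str) -> str:
        # A real LlmAgent would let the model decide which tool to call;
        # this sketch dispatches on a keyword to stay deterministic.
        for keyword, tool in self.tools.items():
            if keyword in task:
                return tool(task)
        return f"{self.name}: no tool for '{task}'"

def run_sequential(agents: list[SimpleAgent], task: str) -> list[str]:
    # Mirrors a SequentialAgent-style workflow: each agent handles
    # the task in a fixed order, like a deterministic pipeline.
    return [agent.act(task) for agent in agents]

searcher = SimpleAgent("searcher", {"search": lambda t: "found 3 results"})
writer = SimpleAgent("writer", {"summarize": lambda t: "summary ready"})
print(run_sequential([searcher, writer], "search the web"))
```

In real ADK code the Runner additionally threads Session state and Memory through each step; the sketch only shows the ordering and tool-dispatch responsibilities.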

&lt;h2&gt;
  
  
  Why Master ADK Today?
&lt;/h2&gt;

&lt;p&gt;The rapid evolution and widespread adoption of AI agents make mastering frameworks like Google ADK a critical skill for developers. The industry is experiencing unprecedented growth and transformation, and ADK is positioned to be a key enabler.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Current Trends in AI Agent Development&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Several significant trends underscore the importance of agent development skills:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Rise of Agentic AI for Autonomous Goal Fulfillment&lt;/strong&gt;: The next phase of AI autonomy is here. AI agents are no longer merely reactive; they are proactive, capable of setting sub-goals, executing complex plans, and self-correcting along the way. This shift is driving agents from being simple tools to becoming intelligent teammates.   &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Growth of Multimodal Agents&lt;/strong&gt;: Modern applications demand fluid interactions across various data types. Multimodal AI agents can understand and generate content using not just text, but also images, audio, and video, leading to more natural and intuitive user experiences.   &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Shift Toward Specialized, Microservice-Based Agents&lt;/strong&gt;: The trend is moving away from generic, monolithic agents towards specialized AI agents, akin to microservices, designed for domain-specific tasks. This approach enhances efficiency, optimizes resource utilization, and accelerates software development and scalable deployments.   &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Emergence of Collaborative Multi-Agent Intelligence&lt;/strong&gt;: Complex problems often exceed the capabilities of a single agent. The industry is increasingly embracing an "AI workforce" where multiple agents collaborate, delegate tasks, and communicate to solve distributed problems, offering flexibility, parallelism, and robustness.   &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Advancement of Memory-Augmented Agents&lt;/strong&gt;: To provide truly personalized and continuous interactions, AI agent design now incorporates advanced memory retention. Agents can remember past interactions, user preferences, and long-term goals, leading to highly personalized and efficient experiences in areas like customer support and e-commerce.   &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Increasing Focus on Privacy-First and Explainable AI&lt;/strong&gt;: As AI agents handle sensitive data and critical operations, trust, transparency, data security, and explainability are non-negotiable. Frameworks are increasingly integrating ethical AI guidelines, bias detection, fairness checks, and compliance monitoring directly into development pipelines.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
      <category>googlecloud</category>
      <category>aiagents</category>
      <category>python</category>
    </item>
    <item>
      <title>Google Gemma 3 Unlocked: The 128K-Token Multimodal AI Breakthrough Every Developer Must Explore</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Fri, 14 Mar 2025 13:03:06 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/google-gemma-3-unlocked-the-128k-token-multimodal-ai-breakthrough-every-developer-must-explore-n5k</link>
      <guid>https://forem.com/sayed_ali_alkamel/google-gemma-3-unlocked-the-128k-token-multimodal-ai-breakthrough-every-developer-must-explore-n5k</guid>
<description>&lt;p&gt;Welcome, fellow AI explorers, to a journey into the heart of Google’s latest marvel—&lt;strong&gt;Gemma 3&lt;/strong&gt;. In a universe where advanced language models often feel as distant as galaxies, Gemma 3 brings state-of-the-art intelligence within arm’s reach. Today, we’ll explore its architecture, use cases, and how you can harness its power, all while sprinkling in a little technical stardust.&lt;/p&gt;




&lt;h2&gt;
  
  
  A New Era in AI Architecture
&lt;/h2&gt;

&lt;h3&gt;
  
  
  The Cosmic Context Window
&lt;/h3&gt;

&lt;p&gt;Gemma 3 is designed to manage an astronomical &lt;strong&gt;128K-token context window&lt;/strong&gt;—roughly equivalent to an entire novel’s worth of text. For perspective, the original GPT-4 topped out at a 32K-token context, and Gemma 3’s extended window allows it to maintain both the &lt;em&gt;big picture&lt;/em&gt; and minute details simultaneously. This is achieved by blending &lt;strong&gt;global attention layers&lt;/strong&gt; (which capture long-range dependencies) with &lt;strong&gt;local attention layers&lt;/strong&gt; (which focus on shorter spans of text). In effect, Gemma 3 navigates complex tasks without falling prey to the “KV-cache memory explosion” that can plague traditional transformers.&lt;/p&gt;
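&lt;p&gt;The global/local split can be sketched with plain attention masks. Here is a minimal illustration in pure Python (the window size is made up for demonstration, not Gemma 3’s actual hyperparameter):&lt;/p&gt;

```python
# Plain-Python sketch of the two attention patterns described above.
# A global layer lets token i attend to every earlier token; a local
# layer restricts it to a short sliding window, so the KV cache on
# local layers stays small. The window size is illustrative only.

def causal_global_mask(seq_len):
    # Token i attends to all tokens j with j in 0..i.
    return [[j in range(i + 1) for j in range(seq_len)]
            for i in range(seq_len)]

def sliding_window_mask(seq_len, window):
    # Token i attends only to the last `window` tokens, itself included.
    return [[j in range(max(0, i + 1 - window), i + 1) for j in range(seq_len)]
            for i in range(seq_len)]

n = 8
global_entries = sum(map(sum, causal_global_mask(n)))
local_entries = sum(map(sum, sliding_window_mask(n, 3)))
print(global_entries)  # 36: grows quadratically with sequence length
print(local_entries)   # 21: grows roughly linearly past the window
```

&lt;p&gt;The global mask grows quadratically with sequence length, while the local mask grows roughly linearly once the window is full, which is why interleaving the two keeps the KV cache manageable at 128K tokens.&lt;/p&gt;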

&lt;h3&gt;
  
  
  Multimodal Vision
&lt;/h3&gt;

&lt;p&gt;Not content with just textual prowess, Gemma 3 integrates a &lt;strong&gt;vision encoder&lt;/strong&gt; (based on a variant of SigLIP) that remains frozen during training. This enables the model to process images alongside text. Imagine feeding in an image of a device and asking for its function—the model can interpret the visual cues and provide a coherent answer. This cross-modal capability heralds a future where our AIs can both read and see, expanding their realm of understanding.&lt;/p&gt;

&lt;h3&gt;
  
  
  Stellar Training Techniques
&lt;/h3&gt;

&lt;p&gt;Google’s engineering team employed a cutting-edge training regimen that includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Multilingual Tokenizer:&lt;/strong&gt; Supporting over 140 languages, ensuring Gemma 3 is a true polyglot in the digital cosmos.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Four-Phase Post-Training Finetuning:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Distillation:&lt;/strong&gt; Learning from a larger “teacher” model.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reinforcement Learning from Human Feedback (RLHF):&lt;/strong&gt; Aligning outputs with human expectations.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mathematical Enhancements (RLMF):&lt;/strong&gt; Sharpening its reasoning and numerical capabilities.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Coding Enhancements (RLEF):&lt;/strong&gt; Boosting its programming proficiency.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;These measures have resulted in a model that, in head-to-head benchmarks, often outperforms larger contemporaries—demonstrating that clever training can triumph over sheer scale.&lt;/p&gt;

&lt;h3&gt;
  
  
  Quantization for Efficiency
&lt;/h3&gt;

&lt;p&gt;Efficiency is key in any universe. Gemma 3 is available in &lt;strong&gt;official quantized versions&lt;/strong&gt; that significantly reduce memory and compute requirements while maintaining near-peak performance. This means you can deploy Gemma 3 on consumer-grade hardware—bringing supercharged AI out of the exclusive realm of data centers and into your local environment. &lt;/p&gt;
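&lt;p&gt;A back-of-envelope sketch of what quantization buys you, counting weights only (KV cache, activations, and framework overhead are ignored, so real usage is higher):&lt;/p&gt;

```python
# Rough weight-memory estimate for each advertised Gemma 3 size at
# 16-bit vs. 4-bit precision. Weights only; runtime overhead excluded.

def weight_gib(params_billion, bytes_per_param):
    return params_billion * 1e9 * bytes_per_param / 2**30

for size in (1, 4, 12, 27):
    fp16 = weight_gib(size, 2)    # bfloat16 / float16
    int4 = weight_gib(size, 0.5)  # 4-bit quantized
    print(f"{size}B: ~{fp16:.1f} GiB at 16-bit, ~{int4:.1f} GiB at 4-bit")
```

&lt;p&gt;By this estimate the 27B model drops from roughly 50 GiB of weights at 16-bit to under 13 GiB at 4-bit, which is what makes consumer-grade deployment plausible.&lt;/p&gt;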

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb52pwkwxu9kiiy4c5fdz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb52pwkwxu9kiiy4c5fdz.png" alt="Image description" width="800" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Figure: Gemma 3 (27B) achieving competitive Elo scores against larger models.&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Celestial Use Cases
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Personal AI Assistant
&lt;/h3&gt;

&lt;p&gt;Gemma 3’s compact footprint allows it to run on a single GPU, or even on mobile devices. Imagine having a sophisticated assistant capable of handling complex inquiries, brainstorming creative ideas, or simply engaging in profound conversation—all without sending your data off to a remote server.&lt;/p&gt;

&lt;h3&gt;
  
  
  Multilingual Communication
&lt;/h3&gt;

&lt;p&gt;Its support for over 140 languages makes Gemma 3 ideal for building translation apps, language tutors, or customer support chatbots. This global capability ensures that language is no barrier to accessing high-quality AI.&lt;/p&gt;

&lt;h3&gt;
  
  
  Code Companion and Problem Solver
&lt;/h3&gt;

&lt;p&gt;Thanks to its refined training in mathematics and programming, Gemma 3 can serve as a robust coding assistant. It can generate code snippets, explain algorithms, or debug your scripts. For developers, it’s like having a seasoned co-pilot who’s as comfortable with Python as they are with astrophysics.&lt;/p&gt;

&lt;h3&gt;
  
  
  Visual Analysis and Beyond
&lt;/h3&gt;

&lt;p&gt;Gemma 3’s vision capabilities open doors to applications like image captioning, visual troubleshooting, and content moderation. It’s not just about reading the text; it’s about understanding the visual world, too.&lt;/p&gt;

&lt;h3&gt;
  
  
  Long-Form Analysis
&lt;/h3&gt;

&lt;p&gt;The vast context window allows researchers, lawyers, or authors to feed in entire documents or datasets for thorough analysis. This “memory of an elephant” capability ensures a coherent grasp of complex or lengthy materials.&lt;/p&gt;

&lt;h3&gt;
  
  
  Agentic AI and Tool Integration
&lt;/h3&gt;

&lt;p&gt;Gemma 3 supports structured outputs and function calling. This means it can not only answer questions but also perform actions—whether it’s formatting its responses as JSON or invoking predefined functions. This integration is pivotal in creating interactive AI systems that can actively engage with other tools and APIs.&lt;/p&gt;
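&lt;p&gt;To make the pattern concrete, here is a minimal sketch of tool dispatch. The exact request and response format Gemma 3 expects is defined by its chat template; the &lt;code&gt;get_weather&lt;/code&gt; function and the schema below are made-up illustrations, not a real API:&lt;/p&gt;

```python
import json

# General tool-use pattern: describe a function to the model, let it
# reply with JSON, then dispatch the call. get_weather is a stand-in.

def get_weather(city):
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

tool_schema = {
    "name": "get_weather",
    "description": "Look up the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}

# Pretend this JSON string came back from the model:
model_reply = '{"tool": "get_weather", "arguments": {"city": "Manama"}}'

call = json.loads(model_reply)
result = TOOLS[call["tool"]](**call["arguments"])
print(result)  # Sunny in Manama
```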




&lt;h2&gt;
  
  
  Getting Started with Gemma 3
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Experiment in the Browser
&lt;/h3&gt;

&lt;p&gt;Head over to &lt;a href="https://aistudio.google.com/" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt; to try out Gemma 3 in a web interface. No extensive setup is needed—just a few clicks and you’re interacting with cutting-edge AI.&lt;/p&gt;


&lt;h3&gt;
  
  
  Downloading the Model Weights
&lt;/h3&gt;

&lt;p&gt;Gemma 3 is open and available via platforms like &lt;a href="https://huggingface.co/" rel="noopener noreferrer"&gt;Hugging Face&lt;/a&gt;. Google has released multiple sizes (1B, 4B, 12B, 27B), both pre-trained and instruction-tuned. Choose the version that suits your hardware and begin exploring.&lt;/p&gt;

&lt;h3&gt;
  
  
  Example: Running Gemma 3 with Transformers
&lt;/h3&gt;

&lt;p&gt;Here’s a quick example using the Hugging Face Transformers library:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;transformers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;AutoTokenizer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;AutoModelForCausalLM&lt;/span&gt;

&lt;span class="c1"&gt;# Load the instruction-tuned 4B model (for chat) – ensure your hardware is capable
&lt;/span&gt;&lt;span class="n"&gt;model_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;google/gemma-3-4b-it&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="n"&gt;tokenizer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;AutoTokenizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_pretrained&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;AutoModelForCausalLM&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_pretrained&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;device_map&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;auto&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;prompt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;User: How does Gemma 3 compare to GPT-4?&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;Assistant:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="n"&gt;inputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tokenizer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;return_tensors&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;pt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;to&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;cuda&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;generate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;inputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;max_new_tokens&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;tokenizer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;decode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;skip_special_tokens&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Fine-Tuning and Deployment
&lt;/h2&gt;

&lt;p&gt;Gemma 3 is designed for flexibility and adaptability. You can fine-tune the model on your domain-specific data using frameworks like Hugging Face’s Trainer or LoRA. This opens up possibilities for specialized applications such as medical Q&amp;amp;A systems, coding assistants, or customer support chatbots.&lt;/p&gt;
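&lt;p&gt;A quick sketch of why LoRA fine-tuning is so much cheaper than full fine-tuning: it trains two small low-rank factors per adapted weight matrix instead of the whole matrix. The dimensions and rank below are illustrative, not Gemma 3’s actual shapes:&lt;/p&gt;

```python
# For a d_out x d_in weight matrix, LoRA trains only B (d_out x r)
# and A (r x d_in) with a small rank r, instead of every entry.

def full_params(d_out, d_in):
    return d_out * d_in

def lora_params(d_out, d_in, rank):
    return d_out * rank + rank * d_in

d = 4096   # a typical projection size in a transformer block
rank = 8
print(full_params(d, d))        # 16777216 weights touched by full FT
print(lora_params(d, d, rank))  # 65536 trainable LoRA weights
print(lora_params(d, d, rank) / full_params(d, d))  # 0.00390625
```

&lt;p&gt;Training well under one percent of the weights per layer is what lets a single consumer GPU handle domain adaptation.&lt;/p&gt;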

&lt;p&gt;For deployment, consider these robust options:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Google Vertex AI:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Easily deploy and manage machine learning models at scale with Google’s infrastructure.&lt;br&gt;&lt;br&gt;
&lt;a href="https://cloud.google.com/vertex-ai" rel="noopener noreferrer"&gt;Google Vertex AI&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Google Cloud Run:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Run your containerized Gemma 3 applications serverlessly, ensuring efficient scaling as needed.&lt;br&gt;&lt;br&gt;
&lt;a href="https://cloud.google.com/run" rel="noopener noreferrer"&gt;Google Cloud Run&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These platforms provide the reliability and scalability required to power applications ranging from personal projects to enterprise-grade solutions.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion: A New Star in the AI Constellation
&lt;/h2&gt;

&lt;p&gt;In the vast universe of artificial intelligence, &lt;strong&gt;Gemma 3&lt;/strong&gt; shines as a new star—merging advanced reasoning, multimodal capabilities, and efficiency in one compact package. This model not only pushes the boundaries of what AI can achieve but also democratizes access to high-performance AI, empowering developers and researchers alike.&lt;/p&gt;

&lt;p&gt;Whether you're tinkering in a garage or innovating in a high-tech lab, Gemma 3 invites you to harness its power and redefine what's possible. Embrace this opportunity to build groundbreaking applications, contribute to an ever-expanding AI community, and be part of a movement that brings the cosmos of AI right to your fingertips.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Clear skies, and happy coding!&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Google AI Blog&lt;/strong&gt; – Official updates and insights into Google’s AI research.&lt;br&gt;&lt;br&gt;
&lt;a href="https://ai.googleblog.com/" rel="noopener noreferrer"&gt;https://ai.googleblog.com/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Google AI Studio&lt;/strong&gt; – Experiment with the latest Google AI models directly in your browser.&lt;br&gt;&lt;br&gt;
&lt;a href="https://aistudio.google.com/" rel="noopener noreferrer"&gt;https://aistudio.google.com/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Hugging Face&lt;/strong&gt; – Discover and download Gemma 3 alongside other cutting-edge models.&lt;br&gt;&lt;br&gt;
&lt;a href="https://huggingface.co/" rel="noopener noreferrer"&gt;https://huggingface.co/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Google Vertex AI&lt;/strong&gt; – Deploy and manage machine learning models at scale.&lt;br&gt;&lt;br&gt;
&lt;a href="https://cloud.google.com/vertex-ai" rel="noopener noreferrer"&gt;https://cloud.google.com/vertex-ai&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Google Cloud Run&lt;/strong&gt; – Run containerized applications on a serverless platform.&lt;br&gt;&lt;br&gt;
&lt;a href="https://cloud.google.com/run" rel="noopener noreferrer"&gt;https://cloud.google.com/run&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
      <category>machinelearning</category>
      <category>opensource</category>
      <category>llm</category>
    </item>
    <item>
      <title>Manus AI: A Technical Deep Dive into China's First Autonomous AI Agent</title>
      <dc:creator>Sayed Ali Alkamel</dc:creator>
      <pubDate>Tue, 11 Mar 2025 13:45:47 +0000</pubDate>
      <link>https://forem.com/sayed_ali_alkamel/manus-ai-a-technical-deep-dive-into-chinas-first-autonomous-ai-agent-30d3</link>
      <guid>https://forem.com/sayed_ali_alkamel/manus-ai-a-technical-deep-dive-into-chinas-first-autonomous-ai-agent-30d3</guid>
<description>&lt;p&gt;Manus AI is a pioneering autonomous AI agent developed by Monica, a Chinese AI startup. Unlike traditional AI assistants that rely on continuous user prompts, Manus AI can independently plan and execute tasks, moving beyond reactive responses to proactive problem-solving. This represents a paradigm shift in AI interaction, moving towards systems that can function as true digital assistants capable of making informed decisions. This article provides a technical deep dive into Manus AI, exploring its architecture, algorithms, capabilities, and limitations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmijesqaeggbqbqnjjmew.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmijesqaeggbqbqnjjmew.jpeg" alt="Image description" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  How Manus AI Works
&lt;/h2&gt;

&lt;p&gt;Manus AI functions as a multi-agent system, where each agent specializes in a specific aspect of task completion. This architecture allows Manus AI to break down complex tasks into smaller, more manageable steps and solve problems in sequence. Imagine providing Manus AI with a complex request, such as "plan a trip to Japan in April." The AI would then delegate sub-tasks to specialized agents, such as researching destinations, comparing flight prices, and creating a detailed itinerary, all while operating autonomously.&lt;/p&gt;

&lt;p&gt;The system operates within an agent loop, iteratively completing tasks through the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Analyze Events&lt;/strong&gt;: Manus AI analyzes user requests and the current state of the task by processing an event stream that includes user messages, execution results, and other relevant information. This analysis helps the AI understand the user's needs and the context of the task.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Select Tools&lt;/strong&gt;: Based on the analysis, Manus AI selects the appropriate tool or API call for the next step. This selection considers task planning, relevant knowledge, and available data APIs. For example, if the task involves web research, Manus AI might select its integrated web browser to gather information.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Execute Commands&lt;/strong&gt;: Manus AI executes the selected tool action within a secure sandbox environment. This environment allows the AI to run shell scripts, web automation, or data processing without compromising system security. For instance, Manus AI can write and execute Python code to automate data analysis tasks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Iterate&lt;/strong&gt;: Manus AI refines its actions based on new data and observations generated from the executed commands. It repeats the cycle of analyzing events, selecting tools, and executing commands until the task is completed. This iterative process allows the AI to adapt to changing circumstances and optimize its approach.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Submit Results&lt;/strong&gt;: Once the task is complete, Manus AI submits the results to the user in the form of messages, reports, or deployed applications. For example, after planning the trip to Japan, Manus AI might provide the user with a detailed itinerary, including flight information, hotel reservations, and suggested activities.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enter Standby&lt;/strong&gt;: After submitting the results, Manus AI enters an idle state and waits for new tasks or user input. This allows the AI to conserve resources and be ready for the next request.&lt;/li&gt;
&lt;/ol&gt;
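&lt;p&gt;The loop above can be sketched in a few lines of Python. Manus AI's real implementation is not public, so the "tools" and the fixed plan here are stand-ins for its autonomous planner:&lt;/p&gt;

```python
# Minimal sketch of the analyze / select / execute / iterate loop
# described above, with made-up tools in place of real agents.

def search_flights(query):
    return f"flight options for {query}"

def build_itinerary(notes):
    return f"itinerary drafted from {notes}"

TOOLS = {"search": search_flights, "plan": build_itinerary}

def agent_loop(task, plan):
    events = [("user", task)]               # 1. analyze the event stream
    result = ""
    for tool_name, arg in plan:             # 2. select the next tool
        result = TOOLS[tool_name](arg)      # 3. execute in the sandbox
        events.append((tool_name, result))  # 4. iterate on new data
    return result                           # 5. submit the final result

print(agent_loop("plan a trip to Japan in April",
                 [("search", "Japan in April"),
                  ("plan", "flight options for Japan in April")]))
```

&lt;p&gt;A real agent would pick each next tool from the event stream rather than follow a fixed plan, but the control flow has the same shape.&lt;/p&gt;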

&lt;p&gt;Furthermore, Manus AI operates asynchronously in the cloud, meaning it can continue working on tasks even when the user's device is offline. This allows users to assign tasks to Manus AI and focus on other activities while the AI works in the background.&lt;/p&gt;




&lt;h2&gt;
  
  
  Core Architectural Features
&lt;/h2&gt;

&lt;p&gt;Manus AI operates within a Linux sandbox environment that provides a controlled execution space for installing software, running scripts, and manipulating files. This sandboxed environment ensures that the AI's actions are secure and do not compromise the user's system. Key architectural features include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Shell and Command-Line Execution&lt;/strong&gt;: Manus AI can execute shell commands, manage processes, and automate system tasks, providing flexibility and control over the execution environment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integrated Web Browser Control&lt;/strong&gt;: Manus AI can navigate websites, extract data, interact with web elements, and execute JavaScript within a browser console, enabling it to gather information and interact with web applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;File System Management&lt;/strong&gt;: Manus AI can read, write, and organize files, enabling it to handle document-based workflows and manage data efficiently.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment Capabilities&lt;/strong&gt;: Manus AI can deploy applications, including setting up websites and hosting services on public URLs, allowing it to create and share interactive content and tools.&lt;/li&gt;
&lt;/ul&gt;
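&lt;p&gt;The shell-execution feature, in particular, boils down to running commands with guardrails. A minimal sketch follows; a production sandbox would add real isolation such as containers, user namespaces, and resource limits, so this only shows the shape of guarded execution:&lt;/p&gt;

```python
import subprocess

# Run a shell command with a timeout and capture its output, in the
# spirit of the sandboxed command execution described above.

def run_command(cmd, timeout_s=5):
    proc = subprocess.run(
        cmd, shell=True, capture_output=True, text=True, timeout=timeout_s
    )
    return proc.returncode, proc.stdout.strip()

status, output = run_command("echo hello from the sandbox")
print(status, output)  # 0 hello from the sandbox
```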




&lt;h2&gt;
  
  
  Algorithms and Technologies
&lt;/h2&gt;

&lt;p&gt;Manus AI leverages a combination of advanced AI models and technologies to achieve its autonomous capabilities. While the exact details of its architecture are not publicly disclosed, research suggests that Manus AI integrates Claude 3.6 Sonnet, Alibaba's Qwen line of models, and open-source scaffolding. This combination is notable because it utilizes Claude, despite Anthropic's restrictions on its use in China, potentially highlighting a strategic move by Monica to leverage cutting-edge AI technologies. It also utilizes prompt engineering and other techniques common in AI agent development.&lt;br&gt;
 &lt;br&gt;
Some of the key algorithms and technologies used by Manus AI include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Advanced Neural Network Designs&lt;/strong&gt;: Manus AI incorporates advanced neural network designs, such as transformer networks, to process and generate text, images, and code. These networks allow the AI to understand and generate human-like text, analyze visual content, and automate programming tasks.   &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Optimized Training Algorithms&lt;/strong&gt;: Manus AI utilizes optimized training algorithms, such as reinforcement learning, to learn from past interactions and improve its performance over time. This allows the AI to adapt to user preferences and optimize its responses for specific tasks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Long-Term Memory (LTM)&lt;/strong&gt;: Manus AI employs LTM mechanisms, such as hierarchical memory networks and attention-based memory retrieval, to learn from both short-term and historical data. This enables the AI to retain information from past interactions and use it to improve future performance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Memory Augmented Neural Networks (MANNs)&lt;/strong&gt;: Manus AI utilizes MANNs to enhance information retention and efficiently access vast amounts of information. This has resulted in a 30% increase in performance for complex tasks such as multi-step reasoning and problem-solving.  &lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Data Sources and Training Methods
&lt;/h2&gt;

&lt;p&gt;Manus AI's training data and methods are not publicly available. However, it is likely trained on a massive dataset of text and code, similar to other large language models. The AI's ability to learn from user interactions and feedback suggests that it may also incorporate online learning or reinforcement learning techniques to continuously improve its performance. This adaptive learning allows Manus AI to become more tailored to the specific needs of the user over time.&lt;/p&gt;




&lt;h2&gt;
  
  
  Programming Languages
&lt;/h2&gt;

&lt;p&gt;Manus AI has demonstrated the ability to write and execute code in Python. It can also interact with web browsers and execute JavaScript within a browser console. This suggests that Manus AI may have capabilities in other programming languages as well, making it a versatile tool for developers and programmers.&lt;/p&gt;




&lt;h2&gt;
  
  
  Real-World Applications
&lt;/h2&gt;

&lt;p&gt;Manus AI has shown promising capabilities in various real-world applications, demonstrating its potential to automate tasks and improve productivity across different domains. Some notable examples include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Screening Resumes&lt;/strong&gt;: Manus AI can analyze resumes, extract key information, and rank candidates based on specific criteria. This can significantly reduce the time and effort required for recruiters to shortlist candidates.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Researching Real Estate&lt;/strong&gt;: Manus AI can research properties, analyze market trends, and generate comprehensive reports based on user preferences, such as budget, location, and desired features. This can help users make informed decisions when buying or renting properties.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Creating Travel Itineraries&lt;/strong&gt;: Manus AI can plan trips, including booking flights, reserving hotels, and suggesting activities, based on user preferences and constraints. This can save users time and effort in planning their travels.  &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Analyzing Financial Data&lt;/strong&gt;: Manus AI can analyze financial data, generate reports, and create interactive dashboards to provide insights into market trends and investment opportunities.   &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These examples demonstrate Manus AI's potential to impact knowledge work and productivity across various sectors. By automating complex tasks, Manus AI can free up human workers to focus on more creative and strategic endeavors.&lt;/p&gt;




&lt;h2&gt;
  
  
  Human-Machine Collaboration
&lt;/h2&gt;

&lt;p&gt;Despite its autonomous capabilities, Manus AI was designed with human collaboration in mind. The system maintains feedback channels that allow for human oversight and intervention when needed. This collaboration model represents a balance between independence and control, allowing users to guide the AI's decision-making process without micromanaging every action.&lt;/p&gt;




&lt;h2&gt;
  
  
  Performance Evaluation
&lt;/h2&gt;

&lt;p&gt;Manus AI's performance was evaluated using the GAIA benchmark, a comprehensive test for general AI assistants developed by Meta AI, Hugging Face, and AutoGPT. This benchmark evaluates AI agents on practical, real-world tasks that require reasoning, problem-solving, and interaction with external tools or data sources.&lt;/p&gt;

&lt;p&gt;The results suggest that Manus AI performs significantly better than previous state-of-the-art models, including OpenAI's Deep Research system, on the GAIA benchmark.&lt;/p&gt;

&lt;p&gt;While these benchmark results are impressive, it's important to note that real-world performance can differ from controlled testing environments.&lt;/p&gt;




&lt;h2&gt;
  
  
  Limitations and Challenges
&lt;/h2&gt;

&lt;p&gt;While Manus AI represents a significant advancement in AI agent technology, it still faces limitations and challenges:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Scalability and Server Capacity&lt;/strong&gt;: The initial release of Manus AI faced scalability issues due to high demand and limited server capacity. Monica is actively working to address these issues to ensure a smoother user experience.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ethical and Regulatory Considerations&lt;/strong&gt;: As AI agents become more autonomous, ethical and regulatory considerations become increasingly important. Ensuring responsible AI development and usage is crucial to mitigate potential risks and biases.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Risks&lt;/strong&gt;: Manus AI's ability to interact with external systems and execute code introduces potential security risks. Robust security measures are essential to protect user data and prevent unauthorized access.   &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Glitches and Inconsistencies&lt;/strong&gt;: Some users have reported glitches, looping errors, and performance inconsistencies, particularly with complex or poorly defined tasks. Further development and refinement are needed to improve the AI's reliability and robustness.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Over-Reliance on Existing Models&lt;/strong&gt;: Investigations suggest that Manus AI heavily relies on existing models like Claude Sonnet and Qwen finetunes, raising concerns about its originality and potential limitations.   &lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Manus AI and the Global AI Landscape
&lt;/h2&gt;

&lt;p&gt;Manus AI's development is significant not only for its technological advancements but also for its role in the broader context of China's AI landscape. China has been actively investing in AI research and development, and Manus AI represents a notable achievement in its pursuit of AI leadership.&lt;/p&gt;

&lt;p&gt;Manus AI's emergence also highlights the increasing competition between China and US-based AI labs. While US companies like OpenAI and Google have been at the forefront of AI innovation, Manus AI demonstrates China's growing capabilities in developing advanced AI systems. This competition could lead to accelerated innovation and a more diverse AI landscape in the future.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Manus AI is a pioneering AI agent that pushes the boundaries of autonomous task execution. Its multi-agent architecture, advanced algorithms, and ability to interact with external systems make it a powerful tool for automating complex tasks and improving productivity. While challenges remain in terms of scalability, security, and reliability, Manus AI represents a significant step towards the development of truly autonomous AI agents.&lt;/p&gt;

&lt;p&gt;The implications of Manus AI's development are far-reaching. Its ability to automate complex tasks could significantly impact various industries, from customer service and human resources to finance and software development. By taking over tedious and time-consuming tasks, Manus AI can free up human workers to focus on more creative and strategic endeavors, potentially leading to increased efficiency and productivity.&lt;/p&gt;

&lt;p&gt;However, the rise of autonomous AI agents also raises important ethical and societal questions. As AI systems become more capable and independent, it's crucial to ensure responsible development and usage, address potential biases, and mitigate risks to privacy and security.&lt;/p&gt;




&lt;h2&gt;
  
  
  Future Directions
&lt;/h2&gt;

&lt;p&gt;Manus AI is still under development, and Monica has outlined plans for future enhancements and expansions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Open-Sourcing Components&lt;/strong&gt;: Monica plans to open-source parts of Manus AI's technology stack by late 2025, fostering collaborative innovation and community engagement. This could potentially lead to an open-weight release, similar to DeepSeek-R1, further accelerating AI research and development.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Global Expansion&lt;/strong&gt;: Monica is working to address scalability issues and expand Manus AI's availability to a wider audience.   &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Enhanced Capabilities&lt;/strong&gt;: Ongoing development efforts focus on improving Manus AI's performance, reliability, and security, as well as expanding its capabilities to handle even more complex tasks.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With continued development and refinement, Manus AI has the potential to revolutionize the way we interact with AI and automate tasks across various domains. Its emergence marks a significant step towards a future where AI agents play an increasingly important role in our work and daily lives.&lt;/p&gt;




&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Manus AI: Features, Architecture, Access, Early Issues &amp;amp; More ..., accessed March 11, 2025, &lt;a href="https://www.datacamp.com/blog/manus-ai" rel="noopener noreferrer"&gt;https://www.datacamp.com/blog/manus-ai&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Manus AI: The World's First Truly Autonomous AI Agent? | by Cogni Down Under - Medium, accessed March 11, 2025, &lt;a href="https://medium.com/@cognidownunder/manus-ai-the-worlds-first-truly-autonomous-ai-agent-16ebb065bb0a" rel="noopener noreferrer"&gt;https://medium.com/@cognidownunder/manus-ai-the-worlds-first-truly-autonomous-ai-agent-16ebb065bb0a&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Manus AI Explained: The New Chinese Ai Agent That Rivals DeepSeek - 9meters, accessed March 11, 2025, &lt;a href="https://9meters.com/technology/ai/manus-ai-explained-the-new-chinese-ai-agent-that-rivals-deepseek" rel="noopener noreferrer"&gt;https://9meters.com/technology/ai/manus-ai-explained-the-new-chinese-ai-agent-that-rivals-deepseek&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Manus tools and prompts - GitHub Gist, accessed March 11, 2025, &lt;a href="https://gist.github.com/jlia0/db0a9695b3ca7609c9b1a08dcbf872c9" rel="noopener noreferrer"&gt;https://gist.github.com/jlia0/db0a9695b3ca7609c9b1a08dcbf872c9&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;What is Manus AI? How to Use and Key Features - Great Learning, accessed March 11, 2025, &lt;a href="https://www.mygreatlearning.com/blog/what-is-manus-ai/" rel="noopener noreferrer"&gt;https://www.mygreatlearning.com/blog/what-is-manus-ai/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Why Manus Matters - by Dean W. Ball - Hyperdimensional, accessed March 11, 2025, &lt;a href="https://www.hyperdimensional.co/p/why-manus-matters" rel="noopener noreferrer"&gt;https://www.hyperdimensional.co/p/why-manus-matters&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Manus AI: The Best Autonomous AI Agent Redefining Automation and Productivity, accessed March 11, 2025, &lt;a href="https://huggingface.co/blog/LLMhacker/manus-ai-best-ai-agent" rel="noopener noreferrer"&gt;https://huggingface.co/blog/LLMhacker/manus-ai-best-ai-agent&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Manus AI: A Comprehensive Overview | by ByteBridge - Medium, accessed March 11, 2025, &lt;a href="https://bytebridge.medium.com/manus-ai-a-comprehensive-overview-c87c9dad32f0" rel="noopener noreferrer"&gt;https://bytebridge.medium.com/manus-ai-a-comprehensive-overview-c87c9dad32f0&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Introducing Manus: The World's First Universal AI Agent - Another Chinese AI Product Splashes - AInvest, accessed March 11, 2025, &lt;a href="https://www.ainvest.com/news/introducing-manus-world-universal-ai-agent-chinese-ai-product-splashes-2503/" rel="noopener noreferrer"&gt;https://www.ainvest.com/news/introducing-manus-world-universal-ai-agent-chinese-ai-product-splashes-2503/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;What is Manus AI?: The First General AI Agent Unveiled | by Tahir | Mar, 2025 | Medium, accessed March 11, 2025, &lt;a href="https://medium.com/@tahirbalarabe2/what-is-manus-ai-the-first-general-ai-agent-unveiled-39a2c5702f91" rel="noopener noreferrer"&gt;https://medium.com/@tahirbalarabe2/what-is-manus-ai-the-first-general-ai-agent-unveiled-39a2c5702f91&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Manus AI, accessed March 11, 2025, &lt;a href="https://manus.im/" rel="noopener noreferrer"&gt;https://manus.im/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;What is Manus, the AI agent taking on OpenAI Deep Research - TechTalks, accessed March 11, 2025, &lt;a href="https://bdtechtalks.com/2025/03/10/manus-ai-agent/" rel="noopener noreferrer"&gt;https://bdtechtalks.com/2025/03/10/manus-ai-agent/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Manus AI Agent Revolutionizes Automation in 2025 - SEOpital, accessed March 11, 2025, &lt;a href="https://www.seopital.co/blog/manus-ai-agent" rel="noopener noreferrer"&gt;https://www.seopital.co/blog/manus-ai-agent&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;What Is Manus AI? - Blockchain Council, accessed March 11, 2025, &lt;a href="https://www.blockchain-council.org/ai/what-is-manus-ai/" rel="noopener noreferrer"&gt;https://www.blockchain-council.org/ai/what-is-manus-ai/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Chinese AI agent Manus uses Claude Sonnet and open-source technology - The Decoder, accessed March 11, 2025, &lt;a href="https://the-decoder.com/chinese-ai-agent-manus-uses-claude-sonnet-and-open-source-technology/" rel="noopener noreferrer"&gt;https://the-decoder.com/chinese-ai-agent-manus-uses-claude-sonnet-and-open-source-technology/&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;


</description>
      <category>ai</category>
      <category>machinelearning</category>
      <category>gai</category>
      <category>manus</category>
    </item>
  </channel>
</rss>
