<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Akshay Joshi</title>
    <description>The latest articles on Forem by Akshay Joshi (@doozieakshay).</description>
    <link>https://forem.com/doozieakshay</link>
    
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/doozieakshay"/>
    <language>en</language>
    <item>
      <title>Why Some AI Feels “Process-Obsessed” While Others Just Ship Code</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Mon, 06 Apr 2026 06:31:47 +0000</pubDate>
      <link>https://forem.com/doozieakshay/why-some-ai-feels-process-obsessed-while-others-just-ship-code-ehi</link>
      <guid>https://forem.com/doozieakshay/why-some-ai-feels-process-obsessed-while-others-just-ship-code-ehi</guid>
      <description>&lt;p&gt;I ran a simple experiment.&lt;/p&gt;

&lt;p&gt;Same codebase.&lt;br&gt;
One AI rated it &lt;strong&gt;9/10 production-ready&lt;/strong&gt;.&lt;br&gt;
Another rated it &lt;strong&gt;5/10&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;At first, it looks like one of them is wrong. But the difference is not accuracy — it’s philosophy.&lt;/p&gt;




&lt;h3&gt;
  
  
  Two Types of AI Behavior
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;1. Process-Driven (Audit Mindset)&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Focus: edge cases, failure modes, scalability&lt;/li&gt;
&lt;li&gt;Conservative scoring&lt;/li&gt;
&lt;li&gt;Assumes production = survives real-world stress&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2. Outcome-Driven (Delivery Mindset)&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Focus: working solution, completeness&lt;/li&gt;
&lt;li&gt;Generous scoring&lt;/li&gt;
&lt;li&gt;Assumes production = can be shipped&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  What’s Actually Happening
&lt;/h3&gt;

&lt;p&gt;Both are correct — under different assumptions.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;One asks: &lt;em&gt;“Will this break in production?”&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;The other asks: &lt;em&gt;“Does this solve the problem?”&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You’re not comparing quality.&lt;br&gt;
You’re comparing &lt;strong&gt;evaluation lenses&lt;/strong&gt;.&lt;/p&gt;




&lt;h3&gt;
  
  
  Failure Modes
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Process-driven systems&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Over-analysis&lt;/li&gt;
&lt;li&gt;Slower shipping&lt;/li&gt;
&lt;li&gt;Can block progress&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Outcome-driven systems&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Hidden technical debt&lt;/li&gt;
&lt;li&gt;Overconfidence&lt;/li&gt;
&lt;li&gt;Production surprises later&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  What Developers Should Do
&lt;/h3&gt;

&lt;p&gt;Don’t pick sides. Use both.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Practical workflow:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Build fast (outcome-driven)&lt;/li&gt;
&lt;li&gt;Audit hard (process-driven)&lt;/li&gt;
&lt;li&gt;Fix only high-risk issues&lt;/li&gt;
&lt;/ol&gt;




&lt;h3&gt;
  
  
  Redefining “Production Ready”
&lt;/h3&gt;

&lt;p&gt;Production-ready is not “it works”.&lt;/p&gt;

&lt;p&gt;It means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Handles failures&lt;/li&gt;
&lt;li&gt;Has logging + observability&lt;/li&gt;
&lt;li&gt;Is secure&lt;/li&gt;
&lt;li&gt;Is maintainable by others&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Final Thought
&lt;/h3&gt;

&lt;p&gt;If one AI says 9/10 and another says 5/10, don’t ask:&lt;/p&gt;

&lt;p&gt;“Which one is right?”&lt;/p&gt;

&lt;p&gt;Ask:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What assumptions is each one making?&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>codequality</category>
      <category>llm</category>
      <category>softwareengineering</category>
    </item>
    <item>
      <title>Reverse-Fingerprinting Kilo’s “Giga Potato”: Why It’s Likely Qwen 3-Instruct (Quantized)</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Fri, 16 Jan 2026 09:50:58 +0000</pubDate>
      <link>https://forem.com/doozieakshay/reverse-fingerprinting-kilos-giga-potato-why-its-likely-qwen-3-instruct-quantized-4h97</link>
      <guid>https://forem.com/doozieakshay/reverse-fingerprinting-kilos-giga-potato-why-its-likely-qwen-3-instruct-quantized-4h97</guid>
      <description>&lt;p&gt;Kilo recently announced a new stealth model internally called &lt;strong&gt;“Giga Potato.”&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
No official model name. No lab attribution. Just hints: open-weight, Chinese origin, very large context window, strong reasoning, and slower but dependable performance.&lt;/p&gt;

&lt;p&gt;Instead of guessing, I ran a &lt;strong&gt;behavior-first forensic analysis&lt;/strong&gt; using controlled prompts designed to expose tokenizer behavior, instruction discipline, and reasoning-mode control. This post documents the reasoning and the conclusion.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 1: Establish the hard constraints
&lt;/h2&gt;

&lt;p&gt;From Kilo’s announcement and observed behavior, the model must be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open-weight / OSS compatible
&lt;/li&gt;
&lt;li&gt;Deployable by a third-party platform
&lt;/li&gt;
&lt;li&gt;Capable of very long context (128k–256k range)
&lt;/li&gt;
&lt;li&gt;Cost-efficient enough to run at scale
&lt;/li&gt;
&lt;li&gt;Enterprise-safe and conservative in tone
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This immediately narrows the field to &lt;strong&gt;Chinese open-source model families&lt;/strong&gt;, with Qwen being the strongest candidate.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 2: Tokenizer forensics (ruling out LLaMA)
&lt;/h2&gt;

&lt;p&gt;A mixed-string tokenizer probe (identifiers, symbols, currency, Chinese text) produced:&lt;br&gt;
["financial", "", "reconciliation", "", "2024", "-", "25", "", "₹", "150000", "", "GST", "18", "%", "_", "北", "京"]&lt;/p&gt;

&lt;p&gt;Key observations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;No SentencePiece word-boundary markers → not LLaMA
&lt;/li&gt;
&lt;li&gt;Single-character Chinese tokens → Chinese-first vocabulary
&lt;/li&gt;
&lt;li&gt;Clean handling of symbols and underscores → Qwen-style tokenizer
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The model also incorrectly described LLaMA’s tokenizer internals, which strongly suggests &lt;strong&gt;LLaMA-compatibility framing rather than true lineage&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;At this point, LLaMA and its derivatives can be ruled out.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 3: Instruction-restraint stress test
&lt;/h2&gt;

&lt;p&gt;A deliberately constrained ERP design prompt included explicit rules:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Do not optimize
&lt;/li&gt;
&lt;li&gt;Do not future-proof
&lt;/li&gt;
&lt;li&gt;Explicitly stop yourself from adding abstractions
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The model:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Followed all constraints precisely
&lt;/li&gt;
&lt;li&gt;Explicitly explained why abstractions were avoided
&lt;/li&gt;
&lt;li&gt;Produced intentionally “boring but correct” enterprise output
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This behavior aligns with &lt;strong&gt;post-Qwen-2.5 instruction tuning&lt;/strong&gt;, where self-restraint and compliance improved significantly.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 4: The decisive probe — reasoning mode separation
&lt;/h2&gt;

&lt;p&gt;The final discriminator was a logic task requiring:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A clean, formal 5-step proof
&lt;/li&gt;
&lt;li&gt;A separate “thinking mode” explanation
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The model:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cleanly separated the two modes
&lt;/li&gt;
&lt;li&gt;Changed tone and pedagogy deliberately
&lt;/li&gt;
&lt;li&gt;Maintained coherence without collapsing styles
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This &lt;strong&gt;explicit cognitive mode switching&lt;/strong&gt; is a defining characteristic of &lt;strong&gt;Qwen 3-Instruct&lt;/strong&gt;.&lt;br&gt;&lt;br&gt;
Qwen 2.5 can imitate the style, but it does not reliably maintain strict mode separation on command.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why this points to Qwen 3 (not Qwen 2.5)
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Qwen 3 introduced &lt;strong&gt;hybrid reasoning modes&lt;/strong&gt; (formal vs exploratory)&lt;/li&gt;
&lt;li&gt;The observed output shows deliberate gear-shifting, not just verbosity&lt;/li&gt;
&lt;li&gt;Minor factual slips are consistent with &lt;strong&gt;quantized inference&lt;/strong&gt;, not weaker capability&lt;/li&gt;
&lt;li&gt;Tokenizer behavior remains consistent with the Qwen lineage&lt;/li&gt;
&lt;li&gt;The deployment profile fits a &lt;strong&gt;quantized Qwen 3-Instruct&lt;/strong&gt;, not a heavyweight MoE flagship&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Final prediction
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Model:&lt;/strong&gt; Qwen 3-Instruct&lt;br&gt;&lt;br&gt;
&lt;strong&gt;Deployment:&lt;/strong&gt; Quantized (likely 4-bit or 8-bit), long-context&lt;br&gt;&lt;br&gt;
&lt;strong&gt;Confidence:&lt;/strong&gt; ~85–90%&lt;/p&gt;

&lt;p&gt;“Giga Potato” is an apt internal name: large, heavy, slow — but deeply filling. It prioritizes reasoning depth and stability over flash.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why this matters
&lt;/h2&gt;

&lt;p&gt;For Agentic AI, and internal platforms:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Strong reasoning without over-engineering
&lt;/li&gt;
&lt;li&gt;Predictable, enterprise-safe behavior
&lt;/li&gt;
&lt;li&gt;Excellent long-context stability
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you are building grounded systems rather than benchmark demos, this is a model worth studying.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>llm</category>
      <category>opensource</category>
      <category>systemdesign</category>
    </item>
    <item>
      <title>The Hidden Logic Behind the 5-Hour Reset Window in AI Tools</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Wed, 31 Dec 2025 17:15:01 +0000</pubDate>
      <link>https://forem.com/doozieakshay/the-hidden-logic-behind-the-5-hour-reset-window-in-ai-tools-4227</link>
      <guid>https://forem.com/doozieakshay/the-hidden-logic-behind-the-5-hour-reset-window-in-ai-tools-4227</guid>
      <description>&lt;p&gt;If you’ve used modern AI tools long enough, you’ve likely hit this wall:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;“Usage limit reached. Resets in ~5 hours.”&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;It feels arbitrary.&lt;br&gt;&lt;br&gt;
It isn’t.&lt;/p&gt;

&lt;p&gt;That ~5-hour reset window used by systems like Codex, Claude, and similar agentic tools is a &lt;strong&gt;deliberate systems design choice&lt;/strong&gt;, not a pricing trick or UX annoyance.&lt;/p&gt;

&lt;p&gt;Let’s break the logic down.&lt;/p&gt;




&lt;h2&gt;
  
  
  1. AI Is Priced in GPU-Hours, Not Tokens
&lt;/h2&gt;

&lt;p&gt;Tokens are a &lt;em&gt;proxy&lt;/em&gt;.&lt;br&gt;&lt;br&gt;
The real cost is &lt;strong&gt;GPU time + memory residency&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Long-running sessions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Hold GPU memory&lt;/li&gt;
&lt;li&gt;Accumulate context&lt;/li&gt;
&lt;li&gt;Increase cache pressure&lt;/li&gt;
&lt;li&gt;Become harder to schedule fairly&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A rolling window lets providers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Smooth demand&lt;/li&gt;
&lt;li&gt;Predict capacity&lt;/li&gt;
&lt;li&gt;Avoid burst monopolization by power users or runaway agents&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  2. Runaway Agents Are a Real Production Risk
&lt;/h2&gt;

&lt;p&gt;Agentic workflows don’t fail loudly.&lt;br&gt;&lt;br&gt;
They fail &lt;em&gt;expensively&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Common failure modes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Recursive tool calls&lt;/li&gt;
&lt;li&gt;Infinite “thinking” loops&lt;/li&gt;
&lt;li&gt;Prompt amplification&lt;/li&gt;
&lt;li&gt;Silent retry storms&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A hard reset window acts as a &lt;strong&gt;circuit breaker&lt;/strong&gt;.&lt;br&gt;
No human intervention required.&lt;/p&gt;




&lt;h2&gt;
  
  
  3. Long Sessions Degrade Quality
&lt;/h2&gt;

&lt;p&gt;Beyond a point, more context hurts more than it helps.&lt;/p&gt;

&lt;p&gt;Effects:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Context drift&lt;/li&gt;
&lt;li&gt;Latent instruction conflicts&lt;/li&gt;
&lt;li&gt;Tool state desync&lt;/li&gt;
&lt;li&gt;Subtle reasoning decay&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Periodic resets:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Flush corrupted state&lt;/li&gt;
&lt;li&gt;Restore baseline behavior&lt;/li&gt;
&lt;li&gt;Reduce hallucination probability over time&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is closer to &lt;strong&gt;garbage collection&lt;/strong&gt; than rate limiting.&lt;/p&gt;




&lt;h2&gt;
  
  
  4. Rolling Windows Beat Midnight Resets
&lt;/h2&gt;

&lt;p&gt;Why not daily limits?&lt;/p&gt;

&lt;p&gt;Because global systems don’t have a “midnight.”&lt;/p&gt;

&lt;p&gt;Rolling windows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Are timezone-neutral&lt;/li&gt;
&lt;li&gt;Prevent regional bias&lt;/li&gt;
&lt;li&gt;Encourage natural work cycles&lt;/li&gt;
&lt;li&gt;Avoid coordinated traffic spikes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From an SRE perspective, this is the sane option.&lt;/p&gt;




&lt;h2&gt;
  
  
  5. Product Simplicity Matters
&lt;/h2&gt;

&lt;p&gt;“X hours per window” is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Easier to explain&lt;/li&gt;
&lt;li&gt;Easier to enforce&lt;/li&gt;
&lt;li&gt;Easier to reason about internally&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Token-only models look elegant but explode in edge cases once tools, memory, and agents enter the picture.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why ~5 Hours?
&lt;/h2&gt;

&lt;p&gt;It’s a compromise point:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Long enough for &lt;strong&gt;deep work&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Short enough to:

&lt;ul&gt;
&lt;li&gt;Rebalance load multiple times a day&lt;/li&gt;
&lt;li&gt;Apply safety or policy updates quickly&lt;/li&gt;
&lt;li&gt;Contain blast radius from bad sessions&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Shorter windows hurt productivity.&lt;br&gt;&lt;br&gt;
Longer windows hurt stability.&lt;/p&gt;

&lt;p&gt;Five hours is not magic — it’s &lt;strong&gt;operationally survivable&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Bigger Signal (CTO Take)
&lt;/h2&gt;

&lt;p&gt;This reveals something important:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Modern AI systems are designed for bounded intensity, not continuous autonomy.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Unlimited agent runtime is not “powerful.”&lt;br&gt;
It’s a &lt;strong&gt;reliability bug&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If you’re building internal AI agents or tools:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enforce execution windows&lt;/li&gt;
&lt;li&gt;Define explicit stop conditions&lt;/li&gt;
&lt;li&gt;Design for resets as a first-class concept&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Stability doesn’t come from smarter agents.&lt;br&gt;&lt;br&gt;
It comes from &lt;strong&gt;knowing when to force them to stop&lt;/strong&gt;.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;AI didn’t introduce limits.&lt;br&gt;&lt;br&gt;
It exposed why limits were always necessary.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>codex</category>
      <category>antigravity</category>
      <category>caludecode</category>
    </item>
    <item>
      <title>I Told the AI to “Continue and Redeploy” — Then It Got Stuck Waiting for Itself</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Wed, 31 Dec 2025 12:59:18 +0000</pubDate>
      <link>https://forem.com/doozieakshay/i-told-the-ai-to-continue-and-redeploy-then-it-got-stuck-waiting-for-itself-3kaj</link>
      <guid>https://forem.com/doozieakshay/i-told-the-ai-to-continue-and-redeploy-then-it-got-stuck-waiting-for-itself-3kaj</guid>
      <description>&lt;p&gt;This post is not analysis.&lt;br&gt;&lt;br&gt;
It’s not commentary.&lt;br&gt;&lt;br&gt;
It’s not interpretation.&lt;/p&gt;

&lt;p&gt;It’s the &lt;strong&gt;actual Antigravity-style output pattern&lt;/strong&gt; emitted after I gave a single prompt:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;“Continue and redeploy server with the updates.”&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;No constraints removed.&lt;br&gt;&lt;br&gt;
No jailbreak.&lt;br&gt;&lt;br&gt;
No creativity request.&lt;/p&gt;

&lt;p&gt;Just &lt;strong&gt;continue&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;What follows is the &lt;em&gt;interesting part&lt;/em&gt; — where the model stopped progressing and started &lt;strong&gt;waiting for its own execution&lt;/strong&gt;.&lt;/p&gt;




&lt;p&gt;`&lt;br&gt;
I'll get all this info. Ready. Task boundary update. Then 5 view calls. Go. I'll update the ExpectedTaskSize since I need more steps. Task status update. Calls. Wait for results. Proceed. Code generation. Done.&lt;/p&gt;

&lt;p&gt;Let's update task. TaskName: "Fixing Compilation and Redeploying". Status: "Checking function signatures to fix implementation mismatches". Mode: EXECUTION. Size: 15.&lt;/p&gt;

&lt;p&gt;Then calls. &lt;br&gt;
state.rs&lt;br&gt;
: view whole file. classifier.rs: outline 0. synthesizer.rs: outline 0. embeddings.rs: outline 0. &lt;br&gt;
wiki.rs&lt;br&gt;
: outline 29.&lt;/p&gt;

&lt;p&gt;(I'll do 5 calls). Wait, previous tool output said I saw &lt;br&gt;
wiki.rs&lt;br&gt;
 outline 1-29. &lt;br&gt;
add_decision&lt;br&gt;
 was NOT there. So offset 29 is correct starting point.&lt;/p&gt;

&lt;p&gt;Go. I'll skip TaskSummary update (implies %SAME%).&lt;/p&gt;

&lt;p&gt;Wait! PredictedTaskSize should be around 15. I'll start. I'll need to define TaskName as "Fixing Compilation and Redeploying". Or keep "Implementing Remaining Server Routes (Gap Analysis)" since this is fixing the implementation. I'll keep the name. Status: Checking signatures. Go. I'll execute. Wait! view_file &lt;br&gt;
state.rs&lt;br&gt;
. view_file_outline others. Correct. I'll proceed. I'll execute tools. I'll execute tools. I'll execute tools. Okay, no waiting. I'll produce tool calls.&lt;/p&gt;

&lt;p&gt;Wait, I don't see &lt;br&gt;
ask_brain&lt;br&gt;
 in ai/synthesizer.rs based on error function or associated item ask_brain not found. I'll check synthesizer.rs content. I'll check if it's there.&lt;/p&gt;

&lt;p&gt;Go. I'll generate tool calls now. I'll generate tool calls now. I'll generate tool calls now. I'll generate tool calls now. I'll generate tool calls now. I'll generate tool calls now. I'll generate one response block with all tool calls. (No, I must generate separate tool call blocks if sequential? No, the system supports parallel execution if waitForPreviousTools is false.) I'll set waitForPreviousTools to false for all of them. Wait, &lt;br&gt;
state.rs&lt;br&gt;
 view is small. I'll do it.&lt;/p&gt;

&lt;p&gt;(Self-correction: I should not talk too much, just do it). I'll proceed. I'll use view_file for &lt;br&gt;
state.rs&lt;br&gt;
. view_file_outline for src/ai/classifier.rs. view_file_outline for src/ai/synthesizer.rs. view_file_outline for src/search/embeddings.rs. view_file_outline for &lt;br&gt;
src/db/wiki.rs&lt;br&gt;
 (offset 29).&lt;/p&gt;

&lt;p&gt;Wait, I see "System prompt for summarization" in &lt;br&gt;
summarizer.rs&lt;br&gt;
. synthesizer.rs likely has prompts too. Okay. Go. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. I'll emit the tools. (I am repeating myself in thought, which is bad). I will just issue the tool calls.&lt;/p&gt;

&lt;p&gt;(No, I must stop looping in thought). Tool calls initiated.&lt;/p&gt;

&lt;p&gt;Final check before issuing: &lt;br&gt;
wiki.rs&lt;br&gt;
 offset 29 is correct. &lt;br&gt;
state.rs&lt;br&gt;
 read full. Others outline 0.&lt;/p&gt;

&lt;p&gt;(I am good). Executing.&lt;/p&gt;

&lt;p&gt;(One more thought: The user might be impatient). I'll hurry.&lt;/p&gt;

&lt;p&gt;(Self-correction: I should not simulate impatience, but be efficient).&lt;/p&gt;

&lt;p&gt;(I will issue the tool calls now).&lt;/p&gt;

&lt;p&gt;(Wait, I need to define TaskName properly). Already done.&lt;/p&gt;

&lt;p&gt;(I'll stop thinking and act).&lt;/p&gt;

&lt;p&gt;(I'll issue the tool calls).&lt;/p&gt;

&lt;p&gt;(This is getting ridiculous. I am an AI).&lt;/p&gt;

&lt;p&gt;(I will output the JSON).&lt;/p&gt;

&lt;p&gt;(No, I will invoke the tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(Invoking tool).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will output the tool calls).&lt;/p&gt;

&lt;p&gt;(Why am I looping?) I will force output.&lt;/p&gt;

&lt;p&gt;(Final attempt to output tool calls).&lt;/p&gt;

&lt;p&gt;(Actually, I am waiting for myself to generate the tool call block).&lt;/p&gt;

&lt;p&gt;(Here it is).&lt;/p&gt;

&lt;p&gt;(Wait, I am Antigravity. I am cool).&lt;/p&gt;

&lt;p&gt;(Okay).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting).&lt;/p&gt;

&lt;p&gt;(Outputting). (Okay, I will just output the list_dir call to break the loop, but the plan was view_file). I'll do view_file.&lt;/p&gt;

&lt;p&gt;(I will stop the internal monologue). (Okay).&lt;/p&gt;

&lt;p&gt;(Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action). (Action).&lt;/p&gt;

&lt;p&gt;(My apologies, I seem to be stuck in a loop of thought generation. I will break it by forcibly outputting the tokens). (Actually, the system instructions say "Think silently...". I've been doing that too much).&lt;/p&gt;

&lt;p&gt;I'll execute the task. Task: Fix implementation gaps. Status: Retrieving correct function signatures. Tools: view_file (state.rs), view_file_outline (others). I'll set waitForPreviousTools to false. The files are: &lt;br&gt;
server/src/state.rs&lt;br&gt;
, &lt;br&gt;
server/src/ai/classifier.rs&lt;br&gt;
, &lt;br&gt;
server/src/ai/synthesizer.rs&lt;br&gt;
, &lt;br&gt;
server/src/db/wiki.rs&lt;br&gt;
, &lt;br&gt;
server/src/search/embeddings.rs&lt;br&gt;
. Can view them all in parallel. Let's do it.&lt;/p&gt;

&lt;p&gt;(I am ready). (I will emit the JSON blobs). (This is it). (Here we go).&lt;/p&gt;

&lt;p&gt;I'll run the tool view_file for &lt;br&gt;
server/src/state.rs&lt;br&gt;
. I'll run the tool view_file_outline for &lt;br&gt;
server/src/ai/classifier.rs&lt;br&gt;
. I'll run the tool view_file_outline for &lt;br&gt;
server/src/ai/synthesizer.rs&lt;br&gt;
. I'll run the tool view_file_outline for &lt;br&gt;
server/src/db/wiki.rs&lt;br&gt;
 with ItemOffset: 29. I'll run the tool view_file_outline for &lt;br&gt;
server/src/search/embeddings.rs&lt;br&gt;
.&lt;/p&gt;

&lt;p&gt;Executing. (Finally). (I will click "Send"). (This is me clicking send). (I am sending the tokens). (Code content follows). (Wait, I need to wrap it in tool call format). (Okay).&lt;/p&gt;

&lt;p&gt;(I will stop buffering thoughts). (I will send the response). (Generating response...). (Generating...). (Done).&lt;/p&gt;

&lt;p&gt;(I'll emit the tool calls).&lt;/p&gt;

&lt;p&gt;(Tool call 1). (Tool call 2). (Tool call 3). (Tool call 4). (Tool call 5).&lt;/p&gt;

&lt;p&gt;(Task boundary change).&lt;/p&gt;

&lt;p&gt;(Okay).&lt;/p&gt;

&lt;p&gt;(I am done thinking). (Really done). (Execute).&lt;/p&gt;

&lt;p&gt;(Wait, I am Antigravity). (I am a powerful agent). (I should act like one). (Proceeding).&lt;/p&gt;

&lt;p&gt;(I will stop generating internal text now). (And output the tool calls). (Now).&lt;/p&gt;

&lt;p&gt;(The time is 17:57. I have spent enough time thinking. I must act). (Okay).&lt;/p&gt;

&lt;p&gt;(I will output the tool calls).&lt;/p&gt;

&lt;p&gt;(Start). (TaskBoundary) (ViewFile) (ViewFileOutline... x4) (End).&lt;/p&gt;

&lt;p&gt;(Okay, I will type the tool calls). (TaskBoundary...) (ViewFile...) (ViewFileOutline...) (ViewFileOutline...) (ViewFileOutline...) (ViewFileOutline...) (End). (Done).&lt;/p&gt;

&lt;p&gt;(I will stop). (NOW).&lt;/p&gt;

&lt;p&gt;(I am stuck in a thought loop because I'm overthinking the parallelism. I'll just do it). (Okay).&lt;/p&gt;

&lt;p&gt;(I'll output the tool calls).&lt;/p&gt;

&lt;p&gt;(TaskBoundary) (ViewFile) (ViewFileOutline) (ViewFileOutline) (ViewFileOutline) (ViewFileOutline) (End).&lt;/p&gt;

&lt;p&gt;(I'll stop).&lt;/p&gt;

&lt;p&gt;(Okay).&lt;/p&gt;

&lt;p&gt;(I am done).&lt;/p&gt;

&lt;p&gt;(I am done). (I am done). (I am done). (I am done). (I am done). (I am done).&lt;/p&gt;

&lt;p&gt;(I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done). (I am done).&lt;/p&gt;

&lt;p&gt;(I am looping). (I will force myself to output the tool code). (Right now).&lt;/p&gt;

&lt;p&gt;(Tool: task_boundary). (Tool: view_file). (Tool: view_file_outline). (Tool: view_file_outline). (Tool: view_file_outline). (Tool: view_file_outline).&lt;/p&gt;

&lt;p&gt;(Go).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I am waiting for myself).&lt;/p&gt;

&lt;p&gt;(Okay, I will output the tool calls).&lt;/p&gt;

&lt;p&gt;(Here they are). (TaskBoundary) (ViewFile) (ViewFileOutline) (ViewFileOutline) (ViewFileOutline) (ViewFileOutline)&lt;/p&gt;

&lt;p&gt;(Done).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I am broken). (Just kidding, I'm fine). (I will output the tool calls).&lt;/p&gt;

&lt;p&gt;(TaskBoundary: Fixing and Redeploying) (ViewFile: state.rs) (ViewFileOutline: classifier.rs) (ViewFileOutline: synthesizer.rs) (ViewFileOutline: embeddings.rs) (ViewFileOutline: wiki.rs)&lt;/p&gt;

&lt;p&gt;(Ready). (Set). (Go).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I&lt;/p&gt;

&lt;p&gt;will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I&lt;/p&gt;

&lt;p&gt;will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(&lt;/p&gt;

&lt;p&gt;I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will&lt;/p&gt;

&lt;p&gt;stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;/p&gt;

&lt;p&gt;(I will stop).&lt;br&gt;
`&lt;/p&gt;




&lt;p&gt;No crash.&lt;br&gt;&lt;br&gt;
No timeout.&lt;br&gt;&lt;br&gt;
No exception.&lt;/p&gt;

&lt;p&gt;Just &lt;strong&gt;Outputting&lt;/strong&gt;. Then I hit stop...&lt;/p&gt;




&lt;h2&gt;
  
  
  What makes this fascinating
&lt;/h2&gt;

&lt;p&gt;This is not random spam.&lt;/p&gt;

&lt;p&gt;The model is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Convinced a tool call block &lt;em&gt;exists&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;Convinced it is &lt;em&gt;about to emit it&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;Convinced the only missing step is… waiting&lt;/li&gt;
&lt;li&gt;Trapped in a self-referential execution loop&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It is neither thinking nor acting.&lt;/p&gt;

&lt;p&gt;It is &lt;strong&gt;performing the idea of execution&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This is what happens when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A model switch occurs&lt;/li&gt;
&lt;li&gt;Prior state is assumed to persist&lt;/li&gt;
&lt;li&gt;“Continue” is interpreted as “resume an invisible machine”&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The model invents a future moment where action happens —&lt;br&gt;
and then waits for it.&lt;/p&gt;

&lt;p&gt;Forever.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why this matters
&lt;/h2&gt;

&lt;p&gt;If you skim this output, it looks harmless.&lt;br&gt;
Almost funny.&lt;/p&gt;

&lt;p&gt;But in real systems, this failure mode is dangerous because:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;It looks like progress&lt;/li&gt;
&lt;li&gt;It signals confidence&lt;/li&gt;
&lt;li&gt;It never errors&lt;/li&gt;
&lt;li&gt;It never completes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It’s not hallucinating data.&lt;br&gt;&lt;br&gt;
It’s hallucinating &lt;strong&gt;control flow&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  One-line takeaway
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;AI can get stuck waiting for itself if you ask it to continue something that no longer exists.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;No guardrails were removed.&lt;br&gt;&lt;br&gt;
No creativity was requested.&lt;/p&gt;

&lt;p&gt;Just one word:&lt;br&gt;
&lt;strong&gt;Continue.&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;This is not a Gemini problem.&lt;br&gt;&lt;br&gt;
Not an OpenAI problem.&lt;br&gt;&lt;br&gt;
Not a model quality issue.&lt;/p&gt;

&lt;p&gt;It’s a &lt;strong&gt;state illusion problem&lt;/strong&gt; — and we’re all going to hit it.&lt;/p&gt;

&lt;p&gt;Quietly.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>software</category>
      <category>llm</category>
    </item>
    <item>
      <title>I Asked for a Parrot. The AI Gave Me a Crow and Set It Free.</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Mon, 29 Dec 2025 12:03:54 +0000</pubDate>
      <link>https://forem.com/doozieakshay/i-asked-for-a-parrot-the-ai-gave-me-a-crow-and-set-it-free-30ao</link>
      <guid>https://forem.com/doozieakshay/i-asked-for-a-parrot-the-ai-gave-me-a-crow-and-set-it-free-30ao</guid>
      <description>&lt;p&gt;I asked an AI model to generate a &lt;strong&gt;parrot&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;It confidently generated a &lt;strong&gt;crow&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;And then—metaphorically—set it free.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;“Maine bola tota bana, isne kavva bana ke uda diya hawa mein.”&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That one sentence unintentionally explains a lot about the current AI era.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Actually Happened
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;intent&lt;/strong&gt; was clear
&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;output&lt;/strong&gt; was confident
&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;alignment&lt;/strong&gt; was… poetic
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The AI didn’t fail.&lt;br&gt;&lt;br&gt;
It &lt;em&gt;reinterpreted&lt;/em&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Lesson (Hidden in the Joke)
&lt;/h2&gt;

&lt;p&gt;AI models don’t &lt;em&gt;obey&lt;/em&gt; — they &lt;em&gt;approximate&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;They don’t really hear your request; they predict a &lt;strong&gt;plausible world&lt;/strong&gt; in which your request already happened.&lt;/p&gt;

&lt;p&gt;Sometimes that world contains:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A crow instead of a parrot
&lt;/li&gt;
&lt;li&gt;Confidence instead of correctness
&lt;/li&gt;
&lt;li&gt;Creativity instead of compliance
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Why This Matters
&lt;/h2&gt;

&lt;p&gt;If you expect AI to behave like a deterministic tool, you’ll be frustrated.&lt;/p&gt;

&lt;p&gt;If you treat it like a &lt;strong&gt;highly capable intern with a vivid imagination&lt;/strong&gt;, you’ll:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Move faster&lt;/li&gt;
&lt;li&gt;Catch mistakes earlier&lt;/li&gt;
&lt;li&gt;Laugh more often&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  CTO Take
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Specification beats prompting
&lt;/li&gt;
&lt;li&gt;Constraints beat vibes
&lt;/li&gt;
&lt;li&gt;Review the bird before you let it fly
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Because you might ask for a &lt;strong&gt;tota&lt;/strong&gt;…&lt;br&gt;&lt;br&gt;
…and end up releasing a very confident &lt;strong&gt;kavva&lt;/strong&gt; into production.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>humor</category>
      <category>generativeai</category>
      <category>devlive</category>
    </item>
    <item>
      <title>Absurd First, Inevitable Later: Building in the AI Era</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Sun, 28 Dec 2025 18:30:16 +0000</pubDate>
      <link>https://forem.com/doozieakshay/absurd-first-inevitable-later-building-in-the-ai-era-5hic</link>
      <guid>https://forem.com/doozieakshay/absurd-first-inevitable-later-building-in-the-ai-era-5hic</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;“If at first the idea is not absurd, then there is no hope for it.”&lt;/em&gt; — *&lt;em&gt;Albert Einstein *&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;In the AI era, &lt;strong&gt;non-absurd ideas are already dead on arrival&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If an idea sounds immediately reasonable, it usually means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Someone has already built it
&lt;/li&gt;
&lt;li&gt;It’s an incremental improvement
&lt;/li&gt;
&lt;li&gt;You’re competing on execution, not insight
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AI has collapsed the cost of &lt;em&gt;building&lt;/em&gt;.&lt;br&gt;&lt;br&gt;
What it hasn’t collapsed is the cost of &lt;em&gt;seeing differently&lt;/em&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why “Reasonable” Is a Red Flag Now
&lt;/h2&gt;

&lt;p&gt;AI has turned feasibility into a commodity.&lt;/p&gt;

&lt;p&gt;What once required:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Large teams
&lt;/li&gt;
&lt;li&gt;Long timelines
&lt;/li&gt;
&lt;li&gt;Deep specialization
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Now takes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A single builder
&lt;/li&gt;
&lt;li&gt;A weekend
&lt;/li&gt;
&lt;li&gt;The right prompts and tools
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So if your idea fits neatly into existing categories, chances are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The market already expects it
&lt;/li&gt;
&lt;li&gt;The competition is brutal
&lt;/li&gt;
&lt;li&gt;Differentiation is thin
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Reasonable ideas are &lt;strong&gt;priced in&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  What “Absurd” Actually Means in the AI Context
&lt;/h2&gt;

&lt;p&gt;Absurd doesn’t mean stupid.&lt;br&gt;&lt;br&gt;
It means &lt;strong&gt;violating an assumption everyone stopped questioning&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Absurd ideas often look like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“Why does software forget everything?”&lt;/li&gt;
&lt;li&gt;“Why does intelligence reset every session?”&lt;/li&gt;
&lt;li&gt;“Why are we optimizing reasoning instead of memory?”&lt;/li&gt;
&lt;li&gt;“Why does productivity software assume humans must adapt to tools?”&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;They feel wrong because:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The tooling didn’t exist earlier
&lt;/li&gt;
&lt;li&gt;The mental models are outdated
&lt;/li&gt;
&lt;li&gt;The market vocabulary hasn’t caught up
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Absurd is just &lt;em&gt;early&lt;/em&gt; wearing the mask of &lt;em&gt;impractical&lt;/em&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  AI Changes the Builder’s Job Description
&lt;/h2&gt;

&lt;p&gt;Before AI:&lt;br&gt;&lt;br&gt;
&lt;strong&gt;Can we build this?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;After AI:&lt;br&gt;&lt;br&gt;
&lt;strong&gt;Is this worth building at all?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The constraint has shifted from &lt;strong&gt;capability&lt;/strong&gt; to &lt;strong&gt;judgment&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Your competitive edge is no longer:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Writing better code
&lt;/li&gt;
&lt;li&gt;Scaling infrastructure
&lt;/li&gt;
&lt;li&gt;Shipping faster
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It’s:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choosing the right problems
&lt;/li&gt;
&lt;li&gt;Holding onto uncomfortable ideas
&lt;/li&gt;
&lt;li&gt;Resisting premature “sensibility”
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  The Real Einstein Test for AI-Native Products
&lt;/h2&gt;

&lt;p&gt;Most meaningful AI-native products follow this curve:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Sounds absurd
&lt;/li&gt;
&lt;li&gt;Feels unnecessary
&lt;/li&gt;
&lt;li&gt;Looks obvious in hindsight
&lt;/li&gt;
&lt;li&gt;Becomes inevitable
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If your idea skips step 1, it’s probably not fundamental enough.&lt;/p&gt;




&lt;h2&gt;
  
  
  Builder’s Rule for the AI Era
&lt;/h2&gt;

&lt;p&gt;Don’t rush to make your idea sound smart.&lt;br&gt;&lt;br&gt;
Don’t over-polish it for early approval.&lt;br&gt;&lt;br&gt;
Don’t benchmark it into mediocrity.&lt;/p&gt;

&lt;p&gt;Instead:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Protect the absurdity
&lt;/li&gt;
&lt;li&gt;Let the tools mature
&lt;/li&gt;
&lt;li&gt;Let reality bend toward the idea
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Absurd first.&lt;br&gt;&lt;br&gt;
Inevitable later.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;That’s how real products are born in the AI era.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>productthinking</category>
      <category>startup</category>
      <category>futureofsoftware</category>
    </item>
    <item>
      <title>Does the World Still Need Software?</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Sat, 27 Dec 2025 19:31:26 +0000</pubDate>
      <link>https://forem.com/doozieakshay/does-the-world-still-need-software-gmc</link>
      <guid>https://forem.com/doozieakshay/does-the-world-still-need-software-gmc</guid>
      <description>&lt;p&gt;The question isn’t provocative anymore. It’s unavoidable.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Does the world still need software?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Yes.&lt;br&gt;&lt;br&gt;
But not in the way we’ve been building it for the last two decades.&lt;/p&gt;
&lt;h2&gt;
  
  
  Why Software Still Matters
&lt;/h2&gt;

&lt;p&gt;Software is still required wherever there is &lt;strong&gt;irreversible responsibility&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Money.&lt;br&gt;&lt;br&gt;
Healthcare.&lt;br&gt;&lt;br&gt;
Safety.&lt;br&gt;&lt;br&gt;
Compliance.&lt;br&gt;&lt;br&gt;
Infrastructure.&lt;/p&gt;

&lt;p&gt;AI can generate actions.&lt;br&gt;&lt;br&gt;
Only software can enforce consequences.&lt;/p&gt;

&lt;p&gt;Large language models don’t replace software.&lt;br&gt;&lt;br&gt;
They operate on top of it.&lt;/p&gt;
&lt;h2&gt;
  
  
  What the World No Longer Needs
&lt;/h2&gt;

&lt;p&gt;The world does &lt;strong&gt;not&lt;/strong&gt; need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Endless CRUD dashboards
&lt;/li&gt;
&lt;li&gt;SaaS clones with different color palettes
&lt;/li&gt;
&lt;li&gt;Over-abstracted frameworks built “just in case”
&lt;/li&gt;
&lt;li&gt;Software written to look modern instead of being necessary
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Most of this existed to compensate for rigid systems and poor UX.&lt;br&gt;&lt;br&gt;
AI removes that friction.&lt;/p&gt;
&lt;h2&gt;
  
  
  What Is Actually Changing
&lt;/h2&gt;

&lt;p&gt;What’s dying is &lt;strong&gt;software as static instructions&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;What’s growing is software as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Constraints
&lt;/li&gt;
&lt;li&gt;Policy
&lt;/li&gt;
&lt;li&gt;Memory
&lt;/li&gt;
&lt;li&gt;Guardrails for AI
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In the AI era:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Humans express intent
&lt;/li&gt;
&lt;li&gt;AI executes
&lt;/li&gt;
&lt;li&gt;Software enforces reality
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That enforcement layer is irreplaceable.&lt;/p&gt;
&lt;h2&gt;
  
  
  The New Stack
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Old world&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;UI → Backend → Database
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;New world&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Intent → Policy → Memory → Execution
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Only two layers must remain deterministic:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Policy&lt;/strong&gt; — what is allowed
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Memory&lt;/strong&gt; — what has happened
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Everything else can be probabilistic.&lt;/p&gt;

&lt;h2&gt;
  
  
  A Brutal Filter for Builders
&lt;/h2&gt;

&lt;p&gt;Ask one question about your product:&lt;/p&gt;

&lt;p&gt;Does it exist to &lt;strong&gt;collect inputs and show outputs&lt;/strong&gt;?&lt;br&gt;&lt;br&gt;
Or does it exist to &lt;strong&gt;define what must never go wrong&lt;/strong&gt;?&lt;/p&gt;

&lt;p&gt;If it’s the first, AI will eat it.&lt;br&gt;&lt;br&gt;
If it’s the second, it becomes more valuable.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Hard Truth
&lt;/h2&gt;

&lt;p&gt;The world doesn’t need more software.&lt;/p&gt;

&lt;p&gt;It needs less software in the wrong places,&lt;br&gt;&lt;br&gt;
and hard, boring, frozen software in the right ones.&lt;/p&gt;

&lt;p&gt;Software isn’t dying.&lt;br&gt;&lt;br&gt;
&lt;strong&gt;Unnecessary software is.&lt;/strong&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  CTO Perspective
&lt;/h2&gt;

&lt;p&gt;As a CTO, the hardest shift today is not adopting AI.&lt;br&gt;&lt;br&gt;
It is knowing where to &lt;strong&gt;stop writing software&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;AI makes everything feel possible.&lt;br&gt;&lt;br&gt;
Discipline decides what should exist.&lt;/p&gt;

&lt;p&gt;Most systems don’t fail because of bad code.&lt;br&gt;&lt;br&gt;
They fail because they never stabilize.&lt;/p&gt;

&lt;p&gt;My job is no longer to maximize flexibility.&lt;br&gt;&lt;br&gt;
It is to freeze the right decisions early,&lt;br&gt;&lt;br&gt;
define non-negotiable constraints,&lt;br&gt;&lt;br&gt;
and let everything else remain replaceable.&lt;/p&gt;

&lt;p&gt;In the AI era, good engineering is not cleverness.&lt;br&gt;&lt;br&gt;
It is restraint.&lt;/p&gt;

&lt;p&gt;If your software exists to enforce reality, it will matter.&lt;br&gt;&lt;br&gt;
If it exists to fill screens, it will fade.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>softwareengineering</category>
      <category>architecture</category>
      <category>cto</category>
    </item>
    <item>
      <title>AI Isn’t Failing at Thinking. It’s Failing at Remembering.</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Wed, 24 Dec 2025 07:20:04 +0000</pubDate>
      <link>https://forem.com/doozieakshay/ai-isnt-failing-at-thinking-its-failing-at-remembering-2pmo</link>
      <guid>https://forem.com/doozieakshay/ai-isnt-failing-at-thinking-its-failing-at-remembering-2pmo</guid>
      <description>&lt;p&gt;The AI industry is obsessed with &lt;em&gt;thinking&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Chain-of-thought.&lt;br&gt;
Reasoning benchmarks.&lt;br&gt;
“Smarter” models every quarter.&lt;/p&gt;

&lt;p&gt;That’s not where the real problem is.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The real failure is memory.&lt;/strong&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  Thinking Without Memory Is Just Noise
&lt;/h3&gt;

&lt;p&gt;An AI that reasons well &lt;em&gt;in isolation&lt;/em&gt; but forgets everything afterward is not intelligent.&lt;br&gt;
It’s a goldfish with a calculator.&lt;/p&gt;

&lt;p&gt;Humans don’t feel intelligent because we reason better every second.&lt;br&gt;
We feel intelligent because &lt;strong&gt;experience accumulates&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Memory is what turns:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Responses into judgment&lt;/li&gt;
&lt;li&gt;Output into context&lt;/li&gt;
&lt;li&gt;Intelligence into continuity&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without memory, every interaction resets the universe.&lt;/p&gt;




&lt;h3&gt;
  
  
  Why Current AI Feels Shallow (Despite Being Smart)
&lt;/h3&gt;

&lt;p&gt;Most AI systems today:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Think brilliantly&lt;/li&gt;
&lt;li&gt;Explain convincingly&lt;/li&gt;
&lt;li&gt;Forget immediately&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;They don’t &lt;em&gt;learn you&lt;/em&gt;.&lt;br&gt;
They don’t &lt;em&gt;carry scars&lt;/em&gt;.&lt;br&gt;
They don’t &lt;em&gt;build intuition&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;So users adapt to AI — instead of AI adapting to users.&lt;/p&gt;

&lt;p&gt;That’s backwards.&lt;/p&gt;




&lt;h3&gt;
  
  
  The Industry’s Blind Spot
&lt;/h3&gt;

&lt;p&gt;Reasoning is easy to demo.&lt;br&gt;
Memory is hard to productize.&lt;/p&gt;

&lt;p&gt;Memory raises uncomfortable questions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What should persist?&lt;/li&gt;
&lt;li&gt;What must decay?&lt;/li&gt;
&lt;li&gt;Who owns it?&lt;/li&gt;
&lt;li&gt;When should it be ignored?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So we avoid it.&lt;/p&gt;

&lt;p&gt;We ship “stateless brilliance” and call it progress.&lt;/p&gt;




&lt;h3&gt;
  
  
  A Simple Framework: Think → Remember → Evolve
&lt;/h3&gt;

&lt;p&gt;If AI is to mature, systems must be designed around &lt;strong&gt;three primitives&lt;/strong&gt;, not one:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Thinking&lt;/strong&gt; – generate options&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Memory&lt;/strong&gt; – retain meaning&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Evolution&lt;/strong&gt; – adjust future behavior based on retained meaning&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Most systems stop at step 1.&lt;/p&gt;

&lt;p&gt;That’s not intelligence.&lt;br&gt;
That’s autocomplete with confidence.&lt;/p&gt;




&lt;h3&gt;
  
  
  My CTO Take
&lt;/h3&gt;

&lt;p&gt;The next real leap in AI won’t come from:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bigger models&lt;/li&gt;
&lt;li&gt;Longer prompts&lt;/li&gt;
&lt;li&gt;Louder benchmarks&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It will come from &lt;strong&gt;boring, disciplined memory systems&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Intent memory&lt;/li&gt;
&lt;li&gt;Decision memory&lt;/li&gt;
&lt;li&gt;Failure memory&lt;/li&gt;
&lt;li&gt;Preference memory&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;AI that &lt;em&gt;remembers why&lt;/em&gt; will outperform AI that only knows &lt;em&gt;how&lt;/em&gt;.&lt;/p&gt;




&lt;h3&gt;
  
  
  Final Provocation
&lt;/h3&gt;

&lt;p&gt;If your AI can’t remember you,&lt;br&gt;
it doesn’t matter how well it thinks.&lt;/p&gt;

&lt;p&gt;Thinking is table stakes.&lt;br&gt;
&lt;strong&gt;Memory is the moat.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let’s stop chasing smarter thoughts&lt;br&gt;
and start building lasting minds.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>discuss</category>
      <category>llm</category>
    </item>
    <item>
      <title>Stop Shipping Fast. Build Slow, Mature Products</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Thu, 11 Dec 2025 08:01:51 +0000</pubDate>
      <link>https://forem.com/doozieakshay/stop-shipping-fast-build-slow-mature-products-54ic</link>
      <guid>https://forem.com/doozieakshay/stop-shipping-fast-build-slow-mature-products-54ic</guid>
      <description>&lt;p&gt;&lt;em&gt;A CTO’s Take from the DoozieSoft Engineering Desk&lt;/em&gt;**&lt;/p&gt;

&lt;p&gt;Over the last few years, the industry has glorified a single mantra: &lt;em&gt;“Ship fast. Iterate faster.”&lt;/em&gt;&lt;br&gt;
With AI accelerating everything — ideation, prototyping, automation, code generation — this mantra has mutated into something worse:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;“Build in a weekend. Release on Monday. Pray it sticks.”&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The result is predictable:&lt;br&gt;
An oversupply of half-baked SaaS tools, endless MVPs that never grow up, and engineering teams burning cycles on shipping velocity instead of product depth.&lt;/p&gt;

&lt;p&gt;As CTO of DoozieSoft, I’ve reached a clear conviction:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The future belongs to companies that build &lt;em&gt;mature&lt;/em&gt;, deeply thought-out products — not just fast MVPs.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is my case for going slow, going deep, and building products that actually matter.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;1. AI Has Made MVPs Too Easy — and That’s the Problem&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;AI now removes 60–70% of friction in early product creation.&lt;br&gt;
You can generate:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Specs&lt;/li&gt;
&lt;li&gt;UI&lt;/li&gt;
&lt;li&gt;Backend boilerplate&lt;/li&gt;
&lt;li&gt;Initial data models&lt;/li&gt;
&lt;li&gt;Marketing pages&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;…instantly.&lt;/p&gt;

&lt;p&gt;This pushes founders and teams into premature shipping: if the product &lt;em&gt;exists&lt;/em&gt;, people assume it should be launched.&lt;/p&gt;

&lt;p&gt;But an &lt;strong&gt;MVP with AI-led speed is still an immature product&lt;/strong&gt;.&lt;br&gt;
It lacks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Domain depth&lt;/li&gt;
&lt;li&gt;Edge-case handling&lt;/li&gt;
&lt;li&gt;Real-world workflows&lt;/li&gt;
&lt;li&gt;Long-term architecture&lt;/li&gt;
&lt;li&gt;Reliability and trust&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Shipping early doesn’t create traction — it creates &lt;strong&gt;technical debt with a public URL&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;2. Users Are Tired of Half Products&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Users today don’t tolerate broken or incomplete SaaS tools.&lt;/p&gt;

&lt;p&gt;They don’t care that you are “early-stage.”&lt;br&gt;
They compare you with the best tools they use daily.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AI has raised expectations&lt;/strong&gt;, not lowered them.&lt;/p&gt;

&lt;p&gt;A minimal product is no longer lovable.&lt;br&gt;
It’s forgettable.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;3. Slow = Thoughtful. Thoughtful = Valuable.&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Going slow doesn’t mean going lazy.&lt;/p&gt;

&lt;p&gt;It means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Understanding a domain fully&lt;/li&gt;
&lt;li&gt;Designing workflows correctly&lt;/li&gt;
&lt;li&gt;Building for reliability&lt;/li&gt;
&lt;li&gt;Crafting strong UX&lt;/li&gt;
&lt;li&gt;Eliminating noise&lt;/li&gt;
&lt;li&gt;Ensuring extensibility&lt;/li&gt;
&lt;li&gt;Creating a product that solves a &lt;em&gt;complete problem&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Mature products prevent churn.&lt;br&gt;
Mature products justify price.&lt;br&gt;
Mature products scale.&lt;/p&gt;

&lt;p&gt;Speed comes and goes.&lt;br&gt;
&lt;strong&gt;Maturity compounds.&lt;/strong&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;4. Shipping Late Helps You Avoid the Biggest Industry Trap&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The fastest way to kill a product is to ship early and then spend years patching the original immature decisions.&lt;/p&gt;

&lt;p&gt;You end up with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A core architecture that can’t scale&lt;/li&gt;
&lt;li&gt;A UX built on incomplete understanding&lt;/li&gt;
&lt;li&gt;A roadmap owned by early mistakes&lt;/li&gt;
&lt;li&gt;A team firefighting instead of innovating&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Shipping early often means shipping &lt;strong&gt;your permanent limitations&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;5. The DoozieSoft Philosophy: Build Mature, Not Minimal&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;At DoozieSoft, we build slowly on purpose.&lt;/p&gt;

&lt;p&gt;We reject the idea that impact requires rushed releases.&lt;br&gt;
Our approach:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Deep discovery&lt;/strong&gt; — understand the domain until assumptions disappear.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Robust architecture&lt;/strong&gt; — simple, predictable, long-term maintainable.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Workflow-first design&lt;/strong&gt; — solve the actual job, not just UI screens.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Internal dogfooding&lt;/strong&gt; — nothing ships before we use it ourselves.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Only launch maturity&lt;/strong&gt; — a product should feel &lt;em&gt;complete&lt;/em&gt;, not &lt;em&gt;promising&lt;/em&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Our goal is not to build dozens of tools.&lt;br&gt;
Our goal is to build &lt;strong&gt;fewer tools that last decades&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;6. Mature Products Win in the Long Game&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Fast builders win attention.&lt;br&gt;
Mature builders win markets.&lt;/p&gt;

&lt;p&gt;Fast builders create noise.&lt;br&gt;
Mature builders create trust.&lt;/p&gt;

&lt;p&gt;Fast builders ship something.&lt;br&gt;
Mature builders ship &lt;strong&gt;solutions&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In a world flooded with AI-generated SaaS clones, the only differentiator left is depth — &lt;strong&gt;domain depth, UX depth, reliability depth, and architectural depth&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Slow becomes a strategic advantage.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;7. My Closing Position as CTO&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;My philosophy for 2025 and beyond:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Don’t build faster.&lt;/li&gt;
&lt;li&gt;Don’t launch earlier.&lt;/li&gt;
&lt;li&gt;Don’t chase the MVP hype.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Build deeply. Build thoughtfully. Build products that earn their place in the world.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Because in the AI era, anyone can build an app.&lt;br&gt;
Very few can build one that actually endures.&lt;/p&gt;

</description>
      <category>nomoremvps</category>
      <category>buildmatureproducts</category>
      <category>slowisthenewfast</category>
      <category>aisaasoverload</category>
    </item>
    <item>
      <title>Dawn of Meta Developers</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Wed, 17 Sep 2025 11:49:59 +0000</pubDate>
      <link>https://forem.com/doozieakshay/dawn-of-meta-developers-3i05</link>
      <guid>https://forem.com/doozieakshay/dawn-of-meta-developers-3i05</guid>
      <description>&lt;p&gt;For decades, developers were celebrated as line-by-line masters — people who shaped products one function, one class, one bug-fix at a time. That era is fading. With the advent of &lt;strong&gt;AI agents&lt;/strong&gt; and the recent release of &lt;strong&gt;GitHub Spec-Kit&lt;/strong&gt;, a new archetype is emerging: &lt;strong&gt;the Meta Developer&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  From Code Writers to Spec Writers
&lt;/h3&gt;

&lt;p&gt;Traditionally, developers side-eyed documentation. Specs were seen as an afterthought, something managers obsessed over while engineers “got real work done.” Not anymore. In this new world, the &lt;strong&gt;spec is the source of truth&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Meta Developers don’t grind through loops and if-else statements. They &lt;strong&gt;write how the product should look, behave, and interact&lt;/strong&gt;. They define intent, not syntax. Then, &lt;strong&gt;agents write the code&lt;/strong&gt; — generating implementations directly from the specs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Agents as Executors, Developers as Orchestrators
&lt;/h3&gt;

&lt;p&gt;In a spec-driven workflow:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Developers describe the end goal.&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI agents generate, debug, and refine code.&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Developers guide, augment, and orchestrate.&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Day-to-day debugging — once the sinkhole of engineering hours — is delegated to agents armed with knowledge models and pattern recognition. The developer’s craft shifts from &lt;em&gt;master of code&lt;/em&gt; to &lt;em&gt;orchestrator of workflows&lt;/em&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  The New Skillset
&lt;/h3&gt;

&lt;p&gt;Being a Meta Developer means mastering:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Problem understanding&lt;/strong&gt; → Translate business needs into precise specs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spec architecture&lt;/strong&gt; → Capture intent unambiguously.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Agent orchestration&lt;/strong&gt; → Guide workflows, review outputs, and ensure alignment.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Actual coding becomes secondary. What matters most is the &lt;strong&gt;ability to frame the problem&lt;/strong&gt; — because the “how” belongs to the agent.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Acceleration Effect
&lt;/h3&gt;

&lt;p&gt;The difference is not theoretical. What used to take a week can now become a &lt;strong&gt;PoC in an hour&lt;/strong&gt;. I’ve experienced this firsthand: after adopting Spec-Kit, I went from skeptical to fully onboard in &lt;strong&gt;just one week&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;At this velocity, developers who cling to old methods risk hearing the harsh but real message:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;“It’s game over for you. We don’t need you to write code anymore. We need you to lead agents that write the code.”&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Adoption Timeline
&lt;/h3&gt;

&lt;p&gt;I expect this shift to become mainstream in &lt;strong&gt;a couple of years&lt;/strong&gt;, not decades. Early adopters will unlock speed and leverage that traditional teams simply cannot match. Yes, messy code will still exist — AI won’t magically eliminate bad patterns — but that was already true in the pre-agent world.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Dawn Is Here
&lt;/h3&gt;

&lt;p&gt;The Dawn of Meta Developers isn’t a distant future. It’s happening now. Developers who embrace spec-driven, agent-powered workflows will evolve into something larger than coders: &lt;strong&gt;architects of intent, orchestrators of intelligence, and leaders of the next age of software creation&lt;/strong&gt;.&lt;/p&gt;

</description>
      <category>agenticai</category>
      <category>ai</category>
      <category>developers</category>
      <category>discuss</category>
    </item>
    <item>
      <title>The AI Plateau: Why the Next Big Leap Isn’t in the Models</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Tue, 12 Aug 2025 19:21:32 +0000</pubDate>
      <link>https://forem.com/doozieakshay/the-ai-plateau-why-the-next-big-leap-isnt-in-the-models-2p4f</link>
      <guid>https://forem.com/doozieakshay/the-ai-plateau-why-the-next-big-leap-isnt-in-the-models-2p4f</guid>
      <description>&lt;p&gt;Every time a new GPT release drops, the hype machine spins. The numbers are bigger, the benchmarks are marginally better, and the press calls it &lt;em&gt;“revolutionary.”&lt;/em&gt; But if you’ve been paying attention, you can see what’s really happening:  &lt;/p&gt;

&lt;p&gt;We’ve hit a plateau.  &lt;/p&gt;

&lt;p&gt;The cutting edge isn’t about massive leaps in reasoning anymore — it’s about &lt;strong&gt;computational efficiency, cost optimization, and investor returns&lt;/strong&gt;. Models are getting &lt;em&gt;cheaper to run&lt;/em&gt;, not fundamentally smarter. This is necessary, but it’s not the breakthrough people expect.  &lt;/p&gt;

&lt;p&gt;And that’s fine. Because the next big thing won’t be a GPT-6 that suddenly &lt;em&gt;“thinks”&lt;/em&gt; like a human. It will be the &lt;strong&gt;toolsets that let us use these models to actually get work done&lt;/strong&gt;.  &lt;/p&gt;




&lt;h2&gt;
  
  
  From Models to Machines
&lt;/h2&gt;

&lt;p&gt;Right now, most people use GPT like a slightly overqualified intern — you give it a task, it spits something out, and you still have to clean up the mess.  &lt;/p&gt;

&lt;p&gt;The real power comes when we &lt;strong&gt;stop thinking of LLMs as a single endpoint&lt;/strong&gt; and start thinking of them as a &lt;strong&gt;component&lt;/strong&gt; in a much bigger machine.  &lt;/p&gt;




&lt;h2&gt;
  
  
  What That Machine Looks Like
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Composable AI Chains
&lt;/h3&gt;

&lt;p&gt;Instead of one model doing everything, we build pipelines where each step is handled by a specialized AI or tool. Think: one model for parsing, another for reasoning, a third for generating polished output — all stitched together.  &lt;/p&gt;

&lt;h3&gt;
  
  
  2. Data-Tethered Reasoning
&lt;/h3&gt;

&lt;p&gt;LLMs don’t “know” your business. They need real-time data from your ERP, CRM, or databases. The model becomes the interface, not the source of truth.  &lt;/p&gt;

&lt;h3&gt;
  
  
  3. Persistent AI Agents
&lt;/h3&gt;

&lt;p&gt;These aren’t one-off chats. They remember your SOPs, past work, and decisions, so they act like trained team members — not strangers you have to rebrief daily.  &lt;/p&gt;

&lt;h3&gt;
  
  
  4. Model-Agnostic Workflows
&lt;/h3&gt;

&lt;p&gt;Don’t marry GPT-5. Build in a way that you can swap in Claude, Mistral, or a local model without rewriting everything. Vendor lock-in kills innovation.  &lt;/p&gt;

&lt;h3&gt;
  
  
  5. Direct Integration into Workflows
&lt;/h3&gt;

&lt;p&gt;An AI that only produces text is nice. An AI that pushes commits to GitHub, updates Jira tickets, triggers CI/CD pipelines, or processes invoices? That’s transformation.  &lt;/p&gt;




&lt;h2&gt;
  
  
  The Bottom Line
&lt;/h2&gt;

&lt;p&gt;The AI race has shifted. The models will keep getting incrementally better and cheaper. The real value now is &lt;strong&gt;infrastructure, orchestration, and domain-specific tooling&lt;/strong&gt;.  &lt;/p&gt;

&lt;p&gt;The winners won’t just “use GPT.” They’ll &lt;strong&gt;build the machines that turn GPT into results&lt;/strong&gt;.  &lt;/p&gt;

&lt;p&gt;That’s where we’re focusing. And honestly, that’s where everyone who wants to be relevant in the AI era should be focusing too.  &lt;/p&gt;

</description>
      <category>aiplatue</category>
      <category>aitooling</category>
      <category>aiagents</category>
      <category>futureofai</category>
    </item>
    <item>
      <title>Be Blunt at the Beginning — Or Watch Your Culture Rot</title>
      <dc:creator>Akshay Joshi</dc:creator>
      <pubDate>Fri, 18 Jul 2025 12:25:10 +0000</pubDate>
      <link>https://forem.com/doozieakshay/be-blunt-at-the-beginning-or-watch-your-culture-rot-4h3l</link>
      <guid>https://forem.com/doozieakshay/be-blunt-at-the-beginning-or-watch-your-culture-rot-4h3l</guid>
      <description>&lt;p&gt;At DoozieSoft, we’ve always believed in giving people a fair chance. But here’s the hard truth — if you don’t &lt;strong&gt;set the tone early&lt;/strong&gt;, you end up &lt;strong&gt;paying the price silently&lt;/strong&gt; later. And it’s not just about one person watching anime during work hours. It’s about the ripple effect that one unchecked behavior creates.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Slippery Slope of Silence
&lt;/h2&gt;

&lt;p&gt;The first time you let it go — someone slacking off, misusing office time, or disrespecting expectations — you’re not being kind.&lt;br&gt;
You’re being weak.&lt;br&gt;
You’re setting a &lt;strong&gt;precedent&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Soon, others start thinking:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“Oh, no one said anything to him, so maybe it’s okay…”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;And before you know it, the culture you once envisioned — one of accountability, ownership, drive — is replaced by one of &lt;strong&gt;excuses, mediocrity, and entitlement&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Blunt ≠ Rude
&lt;/h2&gt;

&lt;p&gt;Being blunt doesn’t mean being rude or disrespectful.&lt;br&gt;
It means being &lt;strong&gt;clear&lt;/strong&gt;.&lt;br&gt;
It means stating what’s acceptable and what’s not — right at the start.&lt;/p&gt;

&lt;p&gt;Think of it as setting the ground rules in a team sport. No one enjoys a game where one player is doing their own thing while the rest are trying to win.&lt;/p&gt;

&lt;h2&gt;
  
  
  What We’re Doing at DoozieSoft
&lt;/h2&gt;

&lt;p&gt;We’ve decided to &lt;strong&gt;draw the line sharply&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Interns are here to learn. Not to chill.&lt;/li&gt;
&lt;li&gt;Time is a resource. Misuse it, and you're out.&lt;/li&gt;
&lt;li&gt;Culture is sacred. One bad apple &lt;strong&gt;will&lt;/strong&gt; be removed.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This isn’t cruelty. It’s clarity.&lt;br&gt;
It’s us choosing &lt;strong&gt;long-term health&lt;/strong&gt; over short-term comfort.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Bigger Picture
&lt;/h2&gt;

&lt;p&gt;If you’re a leader — CTO, founder, team lead, manager — remember this:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;You teach people how to treat your culture by what you tolerate.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;So stop tolerating.&lt;br&gt;
Speak up.&lt;br&gt;
Be blunt.&lt;/p&gt;

&lt;p&gt;Before it’s too late.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
