<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Deveshwar Jaiswal</title>
    <description>The latest articles on Forem by Deveshwar Jaiswal (@svssdeva).</description>
    <link>https://forem.com/svssdeva</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3876447%2F85e102c9-1eea-49ce-9402-57e394a5170a.jpeg</url>
      <title>Forem: Deveshwar Jaiswal</title>
      <link>https://forem.com/svssdeva</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/svssdeva"/>
    <language>en</language>
    <item>
      <title>Bun replaced 4 tools in my stack — here's what actually held up and what didn't</title>
      <dc:creator>Deveshwar Jaiswal</dc:creator>
      <pubDate>Sun, 19 Apr 2026 11:40:00 +0000</pubDate>
      <link>https://forem.com/svssdeva/bun-replaced-4-tools-in-my-stack-heres-what-actually-held-up-and-what-didnt-2ik8</link>
      <guid>https://forem.com/svssdeva/bun-replaced-4-tools-in-my-stack-heres-what-actually-held-up-and-what-didnt-2ik8</guid>
      <description>&lt;p&gt;Vishwakarma is the divine architect in Vedic tradition. He doesn't fight battles or write&lt;br&gt;
laws. He builds the instruments that others use to do those things — weapons for the gods,&lt;br&gt;
chariots for the heroes, the celestial city of Dwaraka. His work is invisible in the final&lt;br&gt;
story because it's structural.&lt;/p&gt;

&lt;p&gt;That's Bun.&lt;/p&gt;




&lt;h2&gt;What Bun actually is&lt;/h2&gt;

&lt;p&gt;Not a Node killer. Not a React competitor. Not a framework.&lt;/p&gt;

&lt;p&gt;Bun is a runtime that also ships as a package manager, a bundler, and a test runner — all&lt;br&gt;
from one binary, with one install, and one config surface. Its value isn't that it makes&lt;br&gt;
your app faster. It's that it removes the toolchain you were managing around your app.&lt;/p&gt;
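
&lt;p&gt;Concretely, the consolidation looks like this. A sketch of the CLI surface, with flags kept to defaults (the &lt;code&gt;dev&lt;/code&gt; script and the entry path are placeholders for whatever your project defines):&lt;/p&gt;

```shell
# One binary covers what previously took four tools:
bun install                                # package manager (replaces npm/yarn/pnpm)
bun run dev                                # runtime + script runner (replaces node + npm scripts)
bun build ./src/index.ts --outdir ./dist   # bundler (replaces esbuild/webpack)
bun test                                   # test runner (replaces Jest/Vitest)
```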




&lt;h2&gt;What changed when I switched&lt;/h2&gt;

&lt;p&gt;My baseline stack before Bun: Node, npm, esbuild, Jest. Four tools. Four version pins.&lt;br&gt;
Four separate failure modes in CI. The kind of setup that's industry-standard and quietly&lt;br&gt;
expensive to maintain.&lt;/p&gt;

&lt;p&gt;After switching to Bun:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;CI install time dropped.&lt;/strong&gt; Not purely because of speed — because there's one process&lt;br&gt;
fetching and linking dependencies instead of npm orchestrating multiple sub-processes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The config surface shrank.&lt;/strong&gt; No separate Jest config, no esbuild config, no &lt;code&gt;.nvmrc&lt;/code&gt;&lt;br&gt;
to keep in sync with CI. One runtime, one set of assumptions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hot reload in dev got faster.&lt;/strong&gt; Consistent, not dramatic. The kind of improvement that&lt;br&gt;
compounds over a workday.&lt;/p&gt;




&lt;h2&gt;Where the hype is accurate&lt;/h2&gt;

&lt;p&gt;The benchmark numbers are real — in benchmarking conditions. HTTP throughput, cold start&lt;br&gt;
time, install speed. If those are your bottlenecks, Bun helps.&lt;/p&gt;

&lt;p&gt;The TypeScript support is native. No ts-node, no esbuild wrapper, no compilation step in&lt;br&gt;
dev. You write &lt;code&gt;.ts&lt;/code&gt; and it runs. That alone removes a category of configuration friction.&lt;/p&gt;
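
&lt;p&gt;A minimal illustration (the file name is hypothetical):&lt;/p&gt;

```typescript
// greet.ts — run it directly with `bun greet.ts`; no tsc, ts-node, or build step.
// The type annotations are stripped by Bun's transpiler at load time.
function greet(name: string): string {
  return `Hello, ${name}!`;
}

console.log(greet("Bun"));
```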




&lt;h2&gt;Where I'd push back&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The 3x speed claims are benchmarking conditions, not production conditions.&lt;/strong&gt; In a&lt;br&gt;
containerised environment with real I/O — database calls, external APIs, filesystem ops —&lt;br&gt;
the gap narrows. Still faster. Not 3x faster.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Native addon compatibility is not complete.&lt;/strong&gt; The Node compatibility layer has improved&lt;br&gt;
significantly in the last year, but anything touching native Node addons (N-API, node-gyp)&lt;br&gt;
needs to be tested before you commit. Don't assume.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Alpine Linux / musl libc.&lt;/strong&gt; Bun's Linux binary targets glibc. If your Docker images are&lt;br&gt;
Alpine-based, you'll need to swap to a Debian-based image. Minor friction, worth knowing&lt;br&gt;
before you're debugging it in production.&lt;/p&gt;
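
&lt;p&gt;The fix is usually a one-line base-image change. A sketch, assuming the official &lt;code&gt;oven/bun&lt;/code&gt; image (the entry path is a placeholder):&lt;/p&gt;

```dockerfile
# Debian-based (glibc) image; an alpine/musl base may not run Bun's standard binary
FROM oven/bun:1
WORKDIR /app
COPY . .
RUN bun install --frozen-lockfile
CMD ["bun", "run", "src/index.ts"]
```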




&lt;h2&gt;Who should switch now&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Greenfield TypeScript projects with no native addon dependencies&lt;/li&gt;
&lt;li&gt;Side projects and internal tools where you control the full stack&lt;/li&gt;
&lt;li&gt;Anyone whose CI bottleneck is actually npm install time&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Who should wait&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Projects with deep native module dependencies&lt;/li&gt;
&lt;li&gt;Teams where Node expertise is load-bearing and switching cost is high&lt;/li&gt;
&lt;li&gt;Anything running on musl unless you're willing to change your base image&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;The Vishwakarma frame&lt;/h2&gt;

&lt;p&gt;Vishwakarma doesn't appear in the climax of the Mahabharata. He built the instruments&lt;br&gt;
that made it possible. That's the right mental model for a runtime.&lt;/p&gt;

&lt;p&gt;Bun didn't write your application. It's not trying to. It built the forge.&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://beyondcodekarma.in/blogs/tech/bun-the-visvakarma-of-javascript" rel="noopener noreferrer"&gt;Full write-up →&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;Building something with tight performance requirements and want a second opinion on the&lt;br&gt;
stack → &lt;a href="https://beyondcodekarma.in/hire/" rel="noopener noreferrer"&gt;I work with teams on this&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>bunjs</category>
      <category>npm</category>
      <category>webdev</category>
      <category>programming</category>
    </item>
    <item>
      <title>The JS Event Loop Has a Model Gap, Here's What Most Tutorials Don't Show You</title>
      <dc:creator>Deveshwar Jaiswal</dc:creator>
      <pubDate>Mon, 13 Apr 2026 12:45:36 +0000</pubDate>
      <link>https://forem.com/svssdeva/the-js-event-loop-has-a-model-gap-heres-what-most-tutorials-dont-show-you-2pjn</link>
      <guid>https://forem.com/svssdeva/the-js-event-loop-has-a-model-gap-heres-what-most-tutorials-dont-show-you-2pjn</guid>
      <description>&lt;p&gt;The standard explanation of the JavaScript Event Loop goes like this: there's a call stack,&lt;br&gt;
there's a queue, and when the stack is empty the loop pulls the next item from the queue.&lt;/p&gt;

&lt;p&gt;That model isn't wrong. It's incomplete in ways that will eventually burn you.&lt;/p&gt;


&lt;h2&gt;What the standard explanation misses&lt;/h2&gt;

&lt;h3&gt;1. The microtask queue is not part of the loop cycle in the way the task queue is&lt;/h3&gt;

&lt;p&gt;Most diagrams show two queues with different priorities. The actual model is different.&lt;/p&gt;

&lt;p&gt;After every task completes, the runtime runs the &lt;strong&gt;microtask checkpoint&lt;/strong&gt;. This checkpoint&lt;br&gt;
drains the microtask queue completely — including any microtasks queued by microtasks — before&lt;br&gt;
the event loop selects the next task.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nf"&gt;setTimeout&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;task&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;resolve&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;microtask 1&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;resolve&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="p"&gt;})&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;microtask 2&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;

&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;sync&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output: &lt;code&gt;sync&lt;/code&gt; → &lt;code&gt;microtask 1&lt;/code&gt; → &lt;code&gt;microtask 2&lt;/code&gt; → &lt;code&gt;task&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The setTimeout callback doesn't run until every microtask — including chains — has settled.&lt;/p&gt;

&lt;h3&gt;2. The render step sits between tasks, not between microtasks&lt;/h3&gt;

&lt;p&gt;The browser's render pipeline (style recalc, layout, paint) runs between tasks. Not between&lt;br&gt;
microtasks.&lt;/p&gt;

&lt;p&gt;This means: a long Promise chain is as harmful to your frame rate as a long synchronous&lt;br&gt;
function. No heavy computation required. If you're queueing work in Promises and wondering&lt;br&gt;
why your animations stutter, this is likely the reason.&lt;/p&gt;
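
&lt;p&gt;You can watch the starvation happen even outside a browser. In this sketch (runnable in Node or Bun), &lt;code&gt;setTimeout&lt;/code&gt; stands in for the render opportunity: the task cannot fire until the entire chain has drained:&lt;/p&gt;

```javascript
let chainDone = false;

// Build a 100,000-link promise chain: trivial work per link, but all of it is microtasks.
let p = Promise.resolve();
for (let i = 0; i !== 100000; i++) {
  p = p.then(() => {});
}
p.then(() => { chainDone = true; });

// This task (and any render step that would precede it) waits for the whole chain.
setTimeout(() => console.log('task ran; chain already done?', chainDone)); // true
```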

&lt;h3&gt;3. &lt;code&gt;setTimeout(fn, 0)&lt;/code&gt; and &lt;code&gt;Promise.resolve().then(fn)&lt;/code&gt; are different lanes&lt;/h3&gt;

&lt;p&gt;Not different speeds. Different lanes with different rules.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;setTimeout&lt;/code&gt; → task queue. One per loop iteration. Render can happen before the next one.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;Promise.resolve().then&lt;/code&gt; → microtask queue. Entire chain runs before the loop advances.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Swapping one for the other is not a performance micro-optimisation. It changes execution&lt;br&gt;
semantics.&lt;/p&gt;
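
&lt;p&gt;A small runnable sketch of the two lanes (Node or Bun), using &lt;code&gt;queueMicrotask&lt;/code&gt; to queue into the microtask lane directly:&lt;/p&gt;

```javascript
const order = [];

setTimeout(() => order.push('task A'));
setTimeout(() => order.push('task B'));

queueMicrotask(() => {
  order.push('micro 1');
  // A microtask queued by a microtask still drains before any task runs.
  queueMicrotask(() => order.push('micro 2'));
});

// Report after both tasks have had their loop iterations.
setTimeout(() => console.log(order.join(' → '))); // micro 1 → micro 2 → task A → task B
```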




&lt;h2&gt;The mental model I use&lt;/h2&gt;

&lt;p&gt;I mapped this to Vedic karma and dharma — unconventional framing, but it makes the model stick.&lt;/p&gt;

&lt;p&gt;Every async operation you schedule is &lt;strong&gt;karma&lt;/strong&gt;: a consequence set in motion, deferred but&lt;br&gt;
not cancelled. The event loop is &lt;strong&gt;dharma&lt;/strong&gt;: the rule governing when consequences are processed.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Call stack&lt;/strong&gt; = the present moment. One thing at a time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Microtask queue&lt;/strong&gt; = urgent karma. Drains immediately after the present resolves.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Task queue&lt;/strong&gt; = future karma. Waits for a full loop cycle.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Render step&lt;/strong&gt; = the breath between cycles. Can be held by urgent karma flooding.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It's a mnemonic layer, not a replacement for reading the WHATWG HTML spec. But it's why&lt;br&gt;
the model stuck for me in a way no diagram had managed.&lt;/p&gt;




&lt;h2&gt;Interactive visualizer&lt;/h2&gt;

&lt;p&gt;I built a step-by-step visualizer: write any snippet, step through execution, and watch&lt;br&gt;
every queue animate in real time. Microtask drain, render step timing, task queue behaviour&lt;br&gt;
— all visible.&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://beyondcodekarma.in/javascript/js-event-loop/" rel="noopener noreferrer"&gt;Try the interactive visualizer →&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;If you're building production frontend and want to dig into performance or architecture →&lt;br&gt;
&lt;a href="https://beyondcodekarma.in/hire/" rel="noopener noreferrer"&gt;I'm available for hire&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>performance</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
