<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Sushan</title>
    <description>The latest articles on Forem by Sushan (@sushan).</description>
    <link>https://forem.com/sushan</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3059755%2F47973540-f8a5-4459-83db-cd0149cbc310.jpg</url>
      <title>Forem: Sushan</title>
      <link>https://forem.com/sushan</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/sushan"/>
    <language>en</language>
    <item>
      <title>Running Claude Code with Ollama models (Local / Cloud)</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Thu, 12 Mar 2026 06:05:45 +0000</pubDate>
      <link>https://forem.com/sushan/running-claude-code-with-ollama-models-local-cloud-4eha</link>
      <guid>https://forem.com/sushan/running-claude-code-with-ollama-models-local-cloud-4eha</guid>
      <description>&lt;h1&gt;
  
  
  Run Claude Code with Ollama (Local, Cloud, or Any Model)
&lt;/h1&gt;

&lt;p&gt;This guide shows how to run &lt;strong&gt;Claude Code using Ollama&lt;/strong&gt;, allowing you to use &lt;strong&gt;local models, cloud models, or any Ollama-supported model&lt;/strong&gt; directly from your terminal.&lt;/p&gt;




&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Make sure the following tools are installed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Ollama&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Claude Code&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Install Ollama
&lt;/h2&gt;

&lt;p&gt;If Ollama is not installed, you can install it using the commands below.&lt;/p&gt;

&lt;p&gt;You can also follow this guide:&lt;br&gt;&lt;br&gt;
&lt;a href="https://dev.to/sushan/how-to-connect-a-local-ai-model-to-vs-code-1g8d"&gt;https://dev.to/sushan/how-to-connect-a-local-ai-model-to-vs-code-1g8d&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Windows (PowerShell)
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;irm&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;https://ollama.com/install.ps1&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;iex&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  macOS / Linux
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-fsSL&lt;/span&gt; https://ollama.com/install.sh | sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Verify installation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Install Claude Code
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Windows (PowerShell)
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight powershell"&gt;&lt;code&gt;&lt;span class="n"&gt;irm&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nx"&gt;https://claude.ai/install.ps1&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="o"&gt;|&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;iex&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  macOS / Linux
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-fsSL&lt;/span&gt; https://claude.ai/install.sh | bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify installation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;claude &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Running Claude Code with Ollama
&lt;/h2&gt;

&lt;p&gt;Once both tools are installed, you can start Claude Code through Ollama.&lt;/p&gt;

&lt;p&gt;The commands work the same on &lt;strong&gt;Windows, macOS, and Linux&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Option 1: Launch and Select a Model
&lt;/h2&gt;

&lt;p&gt;Run the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama launch claude
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will open a &lt;strong&gt;model selection menu&lt;/strong&gt; where you can choose a model using the &lt;strong&gt;arrow keys&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1jtksza2hc3p9cqnsolu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1jtksza2hc3p9cqnsolu.png" alt="Ollama launch menu" width="800" height="519"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Option 2: Launch with a Specific Model
&lt;/h2&gt;

&lt;p&gt;You can also specify the model directly.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama launch claude &lt;span class="nt"&gt;--model&lt;/span&gt; kimi-k2.5:cloud
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Other examples:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama launch claude &lt;span class="nt"&gt;--model&lt;/span&gt; llama3
ollama launch claude &lt;span class="nt"&gt;--model&lt;/span&gt; qwen2.5
ollama launch claude &lt;span class="nt"&gt;--model&lt;/span&gt; kimi-k2.5:cloud
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace the model name with any model available in your Ollama environment.&lt;/p&gt;




&lt;h2&gt;
  
  
  Grant Folder Access
&lt;/h2&gt;

&lt;p&gt;When Claude Code starts, it will ask for permission to access the current folder.&lt;/p&gt;

&lt;p&gt;Select &lt;strong&gt;Yes&lt;/strong&gt; to allow Claude to read and modify files in the directory.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0sy75nc4lvp68to4cyto.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0sy75nc4lvp68to4cyto.png" alt="Allowing Claude access" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Done
&lt;/h2&gt;

&lt;p&gt;Claude Code will now start and connect to the selected model.&lt;/p&gt;

&lt;p&gt;You can start interacting with your codebase immediately.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F20i53mzv176hpkxgqvn5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F20i53mzv176hpkxgqvn5.png" alt="Claude Code running" width="800" height="399"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Official Documentation
&lt;/h2&gt;

&lt;p&gt;For more details, see the official docs:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.ollama.com/integrations/claude-code" rel="noopener noreferrer"&gt;https://docs.ollama.com/integrations/claude-code&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>llm</category>
      <category>tooling</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>How to connect a local AI model (with Ollama) to VS Code (Updated)</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Mon, 01 Dec 2025 14:36:16 +0000</pubDate>
      <link>https://forem.com/sushan/how-to-connect-a-local-ai-model-to-vs-code-1g8d</link>
      <guid>https://forem.com/sushan/how-to-connect-a-local-ai-model-to-vs-code-1g8d</guid>
      <description>&lt;p&gt;You can try out the latest Ollama models in VS Code for free.&lt;/p&gt;

&lt;p&gt;We are using Ollama, a free, open-source application for running AI models locally.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing and using Ollama
&lt;/h2&gt;

&lt;p&gt;You can download Ollama from its &lt;a href="https://ollama.com/download" rel="noopener noreferrer"&gt;website&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Once installed, you'll be able to access Ollama from your terminal.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open your terminal.&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Type &lt;code&gt;ollama&lt;/code&gt; to verify that it's installed.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsy2rd94qd46iavdbspna.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsy2rd94qd46iavdbspna.png" alt="Ollama run" width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run a model you like (depending on your hardware), using the command:&lt;br&gt;
&lt;code&gt;ollama run qwen3:4b&lt;/code&gt;&lt;br&gt;
This command will pull and run the model.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd857whehrc04o6gdl9y1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd857whehrc04o6gdl9y1.png" alt="Model Running" width="800" height="371"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Change the model name to your preferred model and install it.&lt;br&gt;
To view all the available models, go to &lt;a href="https://ollama.com/search" rel="noopener noreferrer"&gt;ollama.com/search&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If you want to run a high-end AI model, you can use Ollama Cloud for free.&lt;br&gt;
Run one like this: &lt;code&gt;ollama pull qwen3-coder:480b-cloud&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Integrating with VS Code
&lt;/h2&gt;

&lt;p&gt;Make sure the Ollama server is running in the background.&lt;br&gt;
Verification: Open &lt;a href="http://localhost:11434" rel="noopener noreferrer"&gt;localhost:11434&lt;/a&gt; in your browser; if the server is up, it responds with "Ollama is running".&lt;br&gt;
If not, start it with the command &lt;code&gt;ollama serve&lt;/code&gt;.&lt;/p&gt;
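&lt;p&gt;If you'd rather verify from a script, here is a minimal sketch in Python (assuming Ollama's default port 11434; the helper simply reports whether anything answers there):&lt;/p&gt;

```python
from urllib.request import urlopen
from urllib.error import URLError

def ollama_running(url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at the given URL."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            # Ollama's root endpoint replies with a short status message.
            return resp.status == 200
    except (URLError, OSError):
        return False

print(ollama_running())  # True if the server is up, False otherwise
```

&lt;p&gt;If it prints &lt;code&gt;False&lt;/code&gt;, run &lt;code&gt;ollama serve&lt;/code&gt; and try again.&lt;/p&gt;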

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkatkq0crdkq1os7icwfi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkatkq0crdkq1os7icwfi.png" alt="Ollama serve command" width="800" height="181"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffy9w1wrwm0gzo39xjogk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffy9w1wrwm0gzo39xjogk.png" alt="Ollama run check" width="800" height="284"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;ol&gt;
&lt;li&gt;Open VS Code -&amp;gt; Copilot chat sidebar.&lt;/li&gt;
&lt;li&gt;Select the model dropdown -&amp;gt; Manage models -&amp;gt; Add Models (top right) -&amp;gt; Select Ollama -&amp;gt; Hit Enter ("Ollama" is prewritten) -&amp;gt; Type '&lt;a href="http://localhost:11434/" rel="noopener noreferrer"&gt;http://localhost:11434/&lt;/a&gt;' and press Enter -&amp;gt; Select the desired models.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyc1kxxoyiippz8bl1c8i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyc1kxxoyiippz8bl1c8i.png" alt="Manage Models Option" width="601" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Ollama will be added to the list as shown below.&lt;/li&gt;
&lt;li&gt;Make sure the model is visible by clicking the eye 👁️ icon.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp167v0nfodfnn57zs8rm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp167v0nfodfnn57zs8rm.png" alt="Configuring Ollama" width="800" height="478"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Demo:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;If you still don't see your models in the model drop-down in the chat bar, make sure you're in Ask mode.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6oxmtkubrhk1b0whpxbp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6oxmtkubrhk1b0whpxbp.png" alt="Ask Mode in Ollama" width="599" height="239"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F72ia7np5qg0qrwdc8d1h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F72ia7np5qg0qrwdc8d1h.png" alt="Ollama Models" width="800" height="482"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The option will disappear once you turn the Ollama server off.&lt;/p&gt;

&lt;h2&gt;
  
  
  And if it still doesn't work:
&lt;/h2&gt;

&lt;p&gt;Get a free Gemini API key from &lt;a href="https://aistudio.google.com/api-keys" rel="noopener noreferrer"&gt;https://aistudio.google.com/api-keys&lt;/a&gt;.&lt;br&gt;
Select a Google model, add the API key, and you're done.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>vscode</category>
      <category>ai</category>
      <category>tooling</category>
    </item>
    <item>
      <title>Use Ollama with VS Code for free.</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Thu, 16 Oct 2025 13:43:10 +0000</pubDate>
      <link>https://forem.com/sushan/use-ollama-with-vs-code-for-free-14m2</link>
      <guid>https://forem.com/sushan/use-ollama-with-vs-code-for-free-14m2</guid>
      <description>&lt;p&gt;You can try out the latest Ollama models in VS Code for free.&lt;br&gt;
Ollama also offers Ollama Cloud, which lets you run powerful models remotely.&lt;/p&gt;

&lt;p&gt;More: &lt;a href="https://ollama.com/cloud" rel="noopener noreferrer"&gt;Ollama Cloud&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, pull the coding models so they can be accessed via VS Code:&lt;br&gt;
&lt;code&gt;ollama pull qwen3-coder:480b-cloud&lt;/code&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the Copilot chat sidebar&lt;/li&gt;
&lt;li&gt;Select the model dropdown -&amp;gt; Manage models -&amp;gt; Select Ollama&lt;/li&gt;
&lt;li&gt;Select the desired models.&lt;/li&gt;
&lt;li&gt;Open the model dropdown and choose the model.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flj5lrb17n17y2l0rl15n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flj5lrb17n17y2l0rl15n.png" alt="Selecting models in Ollama" width="468" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>programming</category>
      <category>ai</category>
    </item>
    <item>
      <title>But Why Serverless? When Should You Use It?</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Sat, 13 Sep 2025 15:22:18 +0000</pubDate>
      <link>https://forem.com/sushan/but-why-is-serverless-when-should-you-use-it-9n1</link>
      <guid>https://forem.com/sushan/but-why-is-serverless-when-should-you-use-it-9n1</guid>
      <description>&lt;p&gt;Serverless sounds like "no servers," but of course, servers still exist in the process.&lt;br&gt;
The point is to &lt;strong&gt;reduce the workload of managing servers&lt;/strong&gt; for developers.&lt;/p&gt;

&lt;h3&gt;
  
  
  But why do we need it? Why does it even exist? And how did it come into existence?
&lt;/h3&gt;


&lt;h2&gt;
  
  
  The Beginning:
&lt;/h2&gt;

&lt;p&gt;In 2006, during the cloud revolution, companies like AWS started offering VMs (virtual machines) on the cloud (with services like EC2 and S3) that developers could rent. This was a &lt;strong&gt;low-cost and efficient alternative to purchasing your own physical servers&lt;/strong&gt;.&lt;br&gt;
This model is called Infrastructure as a Service (&lt;strong&gt;IaaS&lt;/strong&gt;).&lt;/p&gt;


&lt;h2&gt;
  
  
  The Simpler Era:
&lt;/h2&gt;

&lt;p&gt;But even though devs could rent VMs, they &lt;strong&gt;didn't want the hassle of setting up and managing these servers&lt;/strong&gt;, and that's when Platform as a Service (&lt;strong&gt;PaaS&lt;/strong&gt;) showed up.&lt;br&gt;
In 2008, GAE (Google App Engine) was released with the PaaS model. Around the same time, Heroku (founded in 2007) launched publicly in 2009 and became a widely admired PaaS platform, letting users quickly deploy their apps.&lt;/p&gt;


&lt;h2&gt;
  
  
  The Game Changer:
&lt;/h2&gt;

&lt;p&gt;In 2014, AWS launched &lt;strong&gt;AWS Lambda&lt;/strong&gt;, marking a major shift in cloud computing. You no longer needed to provision or run servers. It enabled the &lt;strong&gt;execution of discrete code snippets (functions)&lt;/strong&gt; triggered by events.&lt;br&gt;
Strictly speaking, this model is called Function as a Service (&lt;strong&gt;FaaS&lt;/strong&gt;).&lt;/p&gt;
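&lt;p&gt;To make "functions triggered by events" concrete, here is a minimal sketch of a Lambda-style handler in Python (the &lt;code&gt;event&lt;/code&gt;/&lt;code&gt;context&lt;/code&gt; signature and the HTTP-style response shape follow AWS Lambda conventions; the payload itself is made up for illustration):&lt;/p&gt;

```python
import json

def lambda_handler(event, context):
    """Runs only when an event arrives (an HTTP request, an S3 upload, ...).

    The provider spins up an instance, invokes this function with the
    trigger payload, and bills only for the execution time.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally we can simulate an invocation; in the cloud the platform calls it.
print(lambda_handler({"name": "serverless"}, None))
```

&lt;p&gt;Notice that no server process of your own appears anywhere in this code; that is the point of the model.&lt;/p&gt;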

&lt;p&gt;Lambda marked the birth of serverless computing, and soon other players started catching up. Developers could now write code, and the &lt;strong&gt;cloud providers would handle execution, scaling, and billing&lt;/strong&gt;.&lt;br&gt;
It was not only reliable but damn &lt;strong&gt;cheap too&lt;/strong&gt;, allowing wider adoption among developers.&lt;/p&gt;

&lt;p&gt;Now we have:&lt;br&gt;
Microsoft Azure Functions,&lt;br&gt;
Google Cloud Functions, and &lt;br&gt;
Cloudflare Workers...&lt;/p&gt;

&lt;p&gt;All of them appeared after AWS introduced Lambda, and they tried to decrease the cost of serverless further.&lt;br&gt;
Serverless was cheap, reliable, and removed a ton of operational overhead.&lt;/p&gt;

&lt;p&gt;And now you can see how much Lambda costs.&lt;br&gt;
&lt;a href="https://aws.amazon.com/lambda/pricing/" rel="noopener noreferrer"&gt;Source: Amazon AWS&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm4nrymsnqhv8bpotlwdk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm4nrymsnqhv8bpotlwdk.png" alt="Showing pricing of Lambda" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Drawbacks of Serverless:
&lt;/h2&gt;

&lt;p&gt;After all, no technology is complete. Serverless has many benefits, but it also comes with a few tradeoffs.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cold Start&lt;/strong&gt;: This is the main issue with serverless. When a function hasn’t been used recently, the cloud provider &lt;strong&gt;shuts its instance down to save resources&lt;/strong&gt;, which helps keep costs low. The next invocation then has to wait for a fresh instance to spin up, adding latency.
This isn't ideal for real-time applications like trading platforms, games, etc.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limited Control&lt;/strong&gt;: Serverless might not give you much control over your infrastructure due to platform restrictions.&lt;/li&gt;
&lt;li&gt;Cost Predictability &amp;amp; &lt;strong&gt;Vendor Lock-in&lt;/strong&gt; Concerns: If you need predictable billing (for budgeting) or want to avoid being tied to one cloud provider, serverless may introduce some challenges.&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;
  
  
  When Should You Use It?
&lt;/h2&gt;

&lt;p&gt;Serverless is good when your service runs &lt;strong&gt;sporadically&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Event-driven tasks like file uploads, DB change triggers, etc.&lt;/li&gt;
&lt;li&gt;APIs or microservices with variable traffic.&lt;/li&gt;
&lt;li&gt;Prototyping, MVPs, or experimental features where you want to move quickly and minimize initial infrastructure cost and risk.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  When not to use it:
&lt;/h3&gt;

&lt;p&gt;Just because serverless seems to be cheap doesn't mean that it's the cheapest for every kind of task.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Long-running or compute-intensive&lt;/strong&gt; tasks that can exceed the execution time, memory, or CPU limits of serverless platforms.&lt;/li&gt;
&lt;li&gt;High, steady traffic where functions are invoked constantly; per-invocation billing may then exceed the cost of always-on alternatives.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Latency-sensitive applications&lt;/strong&gt;, because of cold starts.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>cloud</category>
      <category>architecture</category>
      <category>serverless</category>
      <category>devops</category>
    </item>
    <item>
      <title>The Birth of Python</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Sun, 07 Sep 2025 13:43:10 +0000</pubDate>
      <link>https://forem.com/sushan/the-birth-of-python-9fk</link>
      <guid>https://forem.com/sushan/the-birth-of-python-9fk</guid>
      <description>&lt;p&gt;In the &lt;strong&gt;late 1980s&lt;/strong&gt;, computers were buzzing all around, programmers were wrestling with clunky programming languages, and one guy in Amsterdam was getting a little frustrated. It was &lt;strong&gt;Guido van Rossum&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Guido van Rossum was employed at a research institute where he was using a &lt;strong&gt;programming language called ABC&lt;/strong&gt;. ABC had some neat concepts as it was accessible for beginners and was sweet and clean, but it was also inflexible.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6ro8vtv3stkqpqg3wtr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6ro8vtv3stkqpqg3wtr.png" alt="Guido Van Rossum in 1989&amp;lt;br&amp;gt;
" width="800" height="516"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Try to extend it? No chance!&lt;/p&gt;

&lt;p&gt;Try connecting it with system calls? Nope.&lt;/p&gt;

&lt;p&gt;Guido appreciated the spirit of ABC, but he wanted something more useful, something with enough flexibility to work through real-life problems.&lt;/p&gt;

&lt;p&gt;So, during the &lt;strong&gt;Christmas break in 1989&lt;/strong&gt;, while most were all about unwrapping gifts or singing Christmas carols, Guido decided to treat himself with &lt;strong&gt;a new programming language&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Guido wanted it to be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;As easy to read as plain English&lt;/strong&gt;,&lt;/li&gt;
&lt;li&gt;As fun as ABC, but not as restrictive,&lt;/li&gt;
&lt;li&gt;Powerful enough to connect to systems.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And of course, because Guido had a ridiculous sense of humor, he named it after the British comedy ‘Monty Python’s Flying Circus.’ (Yup, Python has nothing to do with 🐍 snakes!)&lt;/p&gt;

&lt;p&gt;At first, it was named &lt;strong&gt;Monty Python&lt;/strong&gt;, but it didn’t sound quite right, so Guido shortened it to Python.&lt;/p&gt;

&lt;p&gt;A few months later, this little holiday side project began to grow legs. &lt;strong&gt;By 1991&lt;/strong&gt;, Python had an &lt;strong&gt;official release, 0.9.0,&lt;/strong&gt; with core features like functions, exceptions, and even modules.&lt;/p&gt;

&lt;p&gt;It quickly gained some following. Programmers were loving it. It was easy without being trivial. It was flexible without being unstructured. Other languages made you feel like you were solving complicated math puzzles. &lt;strong&gt;Python made you feel like you were well… talking to computers.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbob4u4sfi91grt04it8y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbob4u4sfi91grt04it8y.png" alt="Vibrant Python logo" width="800" height="689"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Eventually, what started as a Christmas experiment by Guido van Rossum became one of the most beloved programming languages on the planet. From powering &lt;strong&gt;Instagram&lt;/strong&gt; to helping in &lt;strong&gt;NASA&lt;/strong&gt; space missions and &lt;strong&gt;AI&lt;/strong&gt;, Python slithered into &lt;strong&gt;almost every corner of technology&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In 2024, it became the &lt;strong&gt;most popular programming language&lt;/strong&gt; worldwide. (Though some surveys still place JavaScript just ahead.)&lt;/p&gt;

&lt;p&gt;The best part? It never lost that human touch Guido was going for. Python wasn’t meant to make machines happy. &lt;strong&gt;It was meant to make programmers happy.&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>python</category>
      <category>programming</category>
      <category>programmers</category>
    </item>
    <item>
      <title>The Accidental Birth of NodeJS!</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Wed, 13 Aug 2025 15:21:14 +0000</pubDate>
      <link>https://forem.com/sushan/the-accidental-birth-of-nodejs-5eoj</link>
      <guid>https://forem.com/sushan/the-accidental-birth-of-nodejs-5eoj</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Ever wonder how Node.js came to be? Here’s the story of a frustrated developer and a wild idea that changed backend dev forever.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;It’s 2009.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ryan Dahl is&lt;/strong&gt; sitting at his desk, &lt;strong&gt;frustrated&lt;/strong&gt;. He’s been wrestling with the same problem for days:&lt;br&gt;
&lt;strong&gt;Why do web servers just stop and wait?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;He's building a system to handle real-time file uploads, but the servers of the time are blocking — they freeze until each file operation finishes.&lt;br&gt;
Ryan wants something faster, more fluid, something that can juggle thousands of connections at once.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feirv8rws0eajydwy1eo3.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feirv8rws0eajydwy1eo3.jpg" alt="Frustrated Ryan Dahl" width="715" height="694"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, &lt;strong&gt;he tries C&lt;/strong&gt;.&lt;br&gt;
Too low-level and too much pain.&lt;/p&gt;

&lt;p&gt;Then he tries &lt;strong&gt;Ruby and Python&lt;/strong&gt;.&lt;br&gt;
Still blocking and still slow for what he needs.&lt;/p&gt;

&lt;p&gt;Then, almost by accident, he stumbles upon something new:&lt;br&gt;
&lt;strong&gt;Google’s V8 JavaScript engine&lt;/strong&gt;, which was built to make Chrome blazing fast.&lt;/p&gt;

&lt;p&gt;JavaScript on the server seemed to be an absurd idea at the time.&lt;br&gt;
JS was the language for pop-ups, form validation, etc., not heavy backend work.&lt;/p&gt;

&lt;p&gt;But he thinks V8 is fast.&lt;br&gt;
So Ryan does something unusual — &lt;strong&gt;he embeds the V8 JavaScript engine inside C++ code&lt;/strong&gt;, then writes an event loop and hooks it up to the system’s I/O.&lt;/p&gt;

&lt;p&gt;Finally, the result? &lt;strong&gt;A platform where JavaScript can run outside the browser&lt;/strong&gt;, talking directly to the operating system — and doing it without blocking.&lt;br&gt;
He had never planned to create a backend revolution. What &lt;strong&gt;he needed was just a tool for his own problem&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7fok9n1e07qxdyupvpyi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7fok9n1e07qxdyupvpyi.png" alt="NodeJS logo" width="512" height="512"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;But later, when he released Node.js, developers around the world saw its potential.&lt;br&gt;
And what began as &lt;strong&gt;an experiment became one of the most widely used backend technologies&lt;/strong&gt; in history.&lt;br&gt;
Sometimes, the future is built by accident.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Basics of MCP</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Tue, 12 Aug 2025 13:41:04 +0000</pubDate>
      <link>https://forem.com/sushan/basics-of-mcp-1npn</link>
      <guid>https://forem.com/sushan/basics-of-mcp-1npn</guid>
      <description>&lt;p&gt;Here are some helpful terms for MCP users.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzo76eu2rmd8wnolag7fh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzo76eu2rmd8wnolag7fh.png" alt="MCP Basics" width="800" height="1108"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>mcp</category>
      <category>api</category>
      <category>llm</category>
    </item>
    <item>
      <title>Why are so many devs ditching WordPress?</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Thu, 07 Aug 2025 15:51:17 +0000</pubDate>
      <link>https://forem.com/sushan/why-are-so-many-devs-ditching-wordpress-3299</link>
      <guid>https://forem.com/sushan/why-are-so-many-devs-ditching-wordpress-3299</guid>
      <description>&lt;p&gt;Because they’ve met: Headless CMS&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg5lquckf6kfg9fkfg2kx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg5lquckf6kfg9fkfg2kx.png" alt="Headless CMS vs Traditional CMS" width="679" height="645"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>wordpress</category>
      <category>cms</category>
      <category>webdev</category>
    </item>
    <item>
      <title>But, what's the difference b/w Docker and VMs?</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Tue, 05 Aug 2025 16:21:21 +0000</pubDate>
      <link>https://forem.com/sushan/but-whats-the-difference-bw-docker-and-vms-29p3</link>
      <guid>https://forem.com/sushan/but-whats-the-difference-bw-docker-and-vms-29p3</guid>
      <description>&lt;p&gt;But, what's the difference b/w Docker (a containerization platform) and VMs (Virtual Machines)?&lt;/p&gt;

&lt;p&gt;Here's the difference b/w the architecture of &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;VMs&lt;/li&gt;
&lt;li&gt;Docker (A containerization platform)&lt;/li&gt;
&lt;li&gt;Docker (on non-Linux-based systems)&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>docker</category>
      <category>virtualmachine</category>
      <category>discuss</category>
      <category>architecture</category>
    </item>
    <item>
      <title>But, what does Qwen3-235B-A22B-Thinking-2507-FP8 mean?</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Sat, 02 Aug 2025 15:29:28 +0000</pubDate>
      <link>https://forem.com/sushan/but-what-does-qwen3-235b-a22b-thinking-2507-fp8-mean-44g4</link>
      <guid>https://forem.com/sushan/but-what-does-qwen3-235b-a22b-thinking-2507-fp8-mean-44g4</guid>
      <description>&lt;p&gt;Here's the information it holds.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;235B: 235 billion parameters.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;A22B:  22B is active at any given time.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;2507: Release date — July 2025.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;FP8: Uses 8-bit floating point precision.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
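&lt;p&gt;A back-of-the-envelope sketch of why FP8 matters: weight memory is roughly parameters times bytes per parameter (weights only; activations, KV cache, and runtime overhead come on top):&lt;/p&gt;

```javascript
// Rough weight-memory footprint: parameters times bytes per parameter.
// Weights only; activations, KV cache, and runtime overhead are extra.
const totalParams = 235e9; // 235 billion parameters
const gb = 1e9;

const fp16Gb = (totalParams * 2) / gb; // FP16: 2 bytes per parameter
const fp8Gb = (totalParams * 1) / gb;  // FP8: 1 byte per parameter

console.log("FP16 weights: about " + fp16Gb + " GB"); // about 470 GB
console.log("FP8 weights:  about " + fp8Gb + " GB");  // about 235 GB
```

Halving the bytes per parameter halves the memory needed just to load the weights, which is what makes a 235B-parameter model feasible on far less hardware.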

</description>
      <category>ai</category>
      <category>deeplearning</category>
      <category>mlh</category>
      <category>announcement</category>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Thu, 31 Jul 2025 15:02:08 +0000</pubDate>
      <link>https://forem.com/sushan/-8a3</link>
      <guid>https://forem.com/sushan/-8a3</guid>
      <description>&lt;div class="ltag__link--embedded"&gt;
  &lt;div class="crayons-story "&gt;
  &lt;a href="https://dev.to/sushan/why-use-mcp-over-apis-289l" class="crayons-story__hidden-navigation-link"&gt;Why use MCP over APIs?&lt;/a&gt;


  &lt;div class="crayons-story__body crayons-story__body-full_post"&gt;
    &lt;div class="crayons-story__top"&gt;
      &lt;div class="crayons-story__meta"&gt;
        &lt;div class="crayons-story__author-pic"&gt;

          &lt;a href="/sushan" class="crayons-avatar  crayons-avatar--l  "&gt;
            &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3059755%2F47973540-f8a5-4459-83db-cd0149cbc310.jpg" alt="sushan profile" class="crayons-avatar__image"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
        &lt;div&gt;
          &lt;div&gt;
            &lt;a href="/sushan" class="crayons-story__secondary fw-medium m:hidden"&gt;
              Sushan
            &lt;/a&gt;
            &lt;div class="profile-preview-card relative mb-4 s:mb-0 fw-medium hidden m:inline-block"&gt;
              
                Sushan
                
              
              &lt;div id="story-author-preview-content-2744134" class="profile-preview-card__content crayons-dropdown branded-7 p-4 pt-0"&gt;
                &lt;div class="gap-4 grid"&gt;
                  &lt;div class="-mt-4"&gt;
                    &lt;a href="/sushan" class="flex"&gt;
                      &lt;span class="crayons-avatar crayons-avatar--xl mr-2 shrink-0"&gt;
                        &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3059755%2F47973540-f8a5-4459-83db-cd0149cbc310.jpg" class="crayons-avatar__image" alt=""&gt;
                      &lt;/span&gt;
                      &lt;span class="crayons-link crayons-subtitle-2 mt-5"&gt;Sushan&lt;/span&gt;
                    &lt;/a&gt;
                  &lt;/div&gt;
                  &lt;div class="print-hidden"&gt;
                    
                      Follow
                    
                  &lt;/div&gt;
                  &lt;div class="author-preview-metadata-container"&gt;&lt;/div&gt;
                &lt;/div&gt;
              &lt;/div&gt;
            &lt;/div&gt;

          &lt;/div&gt;
          &lt;a href="https://dev.to/sushan/why-use-mcp-over-apis-289l" class="crayons-story__tertiary fs-xs"&gt;&lt;time&gt;Jul 31 '25&lt;/time&gt;&lt;span class="time-ago-indicator-initial-placeholder"&gt;&lt;/span&gt;&lt;/a&gt;
        &lt;/div&gt;
      &lt;/div&gt;

    &lt;/div&gt;

    &lt;div class="crayons-story__indention"&gt;
      &lt;h2 class="crayons-story__title crayons-story__title-full_post"&gt;
        &lt;a href="https://dev.to/sushan/why-use-mcp-over-apis-289l" id="article-link-2744134"&gt;
          Why use MCP over APIs?
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;div class="crayons-story__tags"&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/mcp"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;mcp&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/api"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;api&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/ai"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;ai&lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="crayons-story__bottom"&gt;
        &lt;div class="crayons-story__details"&gt;
            &lt;a href="https://dev.to/sushan/why-use-mcp-over-apis-289l#comments" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left flex items-center"&gt;
              Comments


              &lt;span class="hidden s:inline"&gt;Add Comment&lt;/span&gt;
            &lt;/a&gt;
        &lt;/div&gt;
        &lt;div class="crayons-story__save"&gt;
          &lt;small class="crayons-story__tertiary fs-xs mr-2"&gt;
            1 min read
          &lt;/small&gt;
            
              &lt;span class="bm-initial"&gt;
                

              &lt;/span&gt;
              &lt;span class="bm-success"&gt;
                

              &lt;/span&gt;
            
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;

&lt;/div&gt;


</description>
      <category>mcp</category>
      <category>api</category>
      <category>ai</category>
    </item>
    <item>
      <title>Why use MCP over APIs?</title>
      <dc:creator>Sushan</dc:creator>
      <pubDate>Thu, 31 Jul 2025 15:01:17 +0000</pubDate>
      <link>https://forem.com/sushan/why-use-mcp-over-apis-289l</link>
      <guid>https://forem.com/sushan/why-use-mcp-over-apis-289l</guid>
      <description>&lt;p&gt;Here's a summary of MCP and its advantages.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4xscaql3k09cuwn62pqm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4xscaql3k09cuwn62pqm.png" alt="MCP Advantages" width="700" height="692"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>mcp</category>
      <category>api</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
