<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Ahod26</title>
    <description>The latest articles on Forem by Ahod26 (@ahod26).</description>
    <link>https://forem.com/ahod26</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3669470%2F22a7dd00-b6b7-47b6-b24c-f6878c545f87.png</url>
      <title>Forem: Ahod26</title>
      <link>https://forem.com/ahod26</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/ahod26"/>
    <language>en</language>
    <item>
      <title>Built an MCP server for .NET developers working with AI</title>
      <dc:creator>Ahod26</dc:creator>
      <pubDate>Thu, 18 Dec 2025 18:19:03 +0000</pubDate>
      <link>https://forem.com/ahod26/built-an-mcp-server-for-net-developers-working-with-ai-578k</link>
      <guid>https://forem.com/ahod26/built-an-mcp-server-for-net-developers-working-with-ai-578k</guid>
      <description>&lt;p&gt;If you're building AI applications with .NET or constantly learning, you've noticed LLMs confidently give you code that doesn't compile or completely wrong explanations about how things work.&lt;/p&gt;

&lt;p&gt;I got tired of it, so I built &lt;a href="https://github.com/Ahod26/dotnet-ai-mcp-server" rel="noopener noreferrer"&gt;DotNet AI MCP Server&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;What it does&lt;/h2&gt;

&lt;p&gt;It connects your favorite MCP client to two sources:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Live GitHub repos&lt;/strong&gt; - Semantic Kernel, OpenAI .NET SDK, MCP C# SDK, AutoGen, and more. Real code and documentation from the actual repos.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Microsoft Learn&lt;/strong&gt; - I proxied the official Microsoft Learn MCP tools but optimized them: better token efficiency, clearer descriptions, and improved argument names so the LLM actually picks the right tool.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;The key difference&lt;/h2&gt;

&lt;p&gt;Zero prompt engineering. Just ask your question naturally - "How do Semantic Kernel agents work?" or "Show me how to build an MCP server with C#" - and the right tools trigger automatically. There's no need to tell it "use this tool" or "search the documentation" the way other MCP servers require.&lt;/p&gt;

&lt;p&gt;It uses progressive file exposure (repos → folders → files → content), which saves tokens and avoids flooding your context with irrelevant data.&lt;/p&gt;

&lt;h2&gt;Currently tracking&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;AI Frameworks &amp;amp; LLM SDKs&lt;/strong&gt;: Semantic Kernel • AutoGen • Kernel Memory • OpenAI .NET • Google Gemini • Anthropic Claude • MCP C# SDK • LangChain.NET • OllamaSharp&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Vector Database C# SDKs&lt;/strong&gt;: Pinecone • Qdrant • Weaviate • Redis Stack&lt;/p&gt;

&lt;p&gt;Setup takes 30 seconds. If it helps, drop a ⭐ so other .NET devs can find it.&lt;/p&gt;

&lt;p&gt;Try it: &lt;a href="https://github.com/Ahod26/dotnet-ai-mcp-server" rel="noopener noreferrer"&gt;https://github.com/Ahod26/dotnet-ai-mcp-server&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Roast me if it sucks. 🔥&lt;/p&gt;

</description>
      <category>csharp</category>
    </item>
  </channel>
</rss>
