<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Yaman Ahlawat</title>
    <description>The latest articles on Forem by Yaman Ahlawat (@yaman_ahlawat).</description>
    <link>https://forem.com/yaman_ahlawat</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3774766%2F1e366696-87ab-4eb3-a72b-0d615ca061a9.png</url>
      <title>Forem: Yaman Ahlawat</title>
      <link>https://forem.com/yaman_ahlawat</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/yaman_ahlawat"/>
    <language>en</language>
    <item>
      <title>LLM Registry - A CLI Tool to Track Model Capabilities Across Providers</title>
      <dc:creator>Yaman Ahlawat</dc:creator>
      <pubDate>Mon, 16 Feb 2026 03:14:02 +0000</pubDate>
      <link>https://forem.com/yaman_ahlawat/llm-registry-a-cli-tool-to-track-model-capabilities-across-providers-1c2f</link>
      <guid>https://forem.com/yaman_ahlawat/llm-registry-a-cli-tool-to-track-model-capabilities-across-providers-1c2f</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/github-2026-01-21"&gt;GitHub Copilot CLI Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;What I Built&lt;/h2&gt;

&lt;p&gt;LLM Registry is a Python package and CLI tool that tracks LLM model metadata - pricing, features, context windows, API parameters - across different providers.&lt;/p&gt;

&lt;p&gt;The problem is simple: if you're working with models from OpenAI, Anthropic, Google, and others, there's no single place to check things like "does this model support vision?" or "what's the input cost per million tokens?" You end up digging through multiple provider docs every time. I got tired of that, so I built this.&lt;/p&gt;

&lt;p&gt;It ships with 140+ models across 16 providers out of the box. You can query them via a Python API or a CLI tool called &lt;code&gt;llmr&lt;/code&gt;. You can also add your own custom models, which are stored locally and override the built-in ones when needed.&lt;/p&gt;
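To illustrate the core idea (built-in model metadata merged with locally stored user entries, where local entries win), here is a minimal conceptual sketch. The class name, field names, and pricing figures below are illustrative assumptions, not the actual llm-registry API:

```python
# Conceptual sketch of a model registry with local overrides.
# Names, fields, and prices are illustrative, not the real llm-registry API.
from dataclasses import dataclass

@dataclass
class ModelInfo:
    provider: str
    model_id: str
    supports_vision: bool
    context_window: int
    input_cost_per_million: float  # USD per 1M input tokens (illustrative)

# Metadata shipped with the package.
BUILT_IN = {
    "example-model": ModelInfo("example-provider", "example-model",
                               True, 128_000, 2.50),
}

# User-defined entries stored locally; this one overrides a built-in record.
LOCAL = {
    "example-model": ModelInfo("example-provider", "example-model",
                               True, 128_000, 2.00),
}

def lookup(model_id: str) -> ModelInfo:
    """Merge registries; local (user) entries take precedence."""
    merged = {**BUILT_IN, **LOCAL}
    return merged[model_id]

info = lookup("example-model")
print(info.input_cost_per_million)  # the local override wins
```

The override rule falls out of plain dict merging: entries listed later in `{**BUILT_IN, **LOCAL}` replace earlier ones with the same key, which is one simple way to let a local store shadow packaged defaults.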

&lt;p&gt;It's published on PyPI (pip install &lt;a href="https://pypi.org/project/llm-registry/" rel="noopener noreferrer"&gt;llm-registry&lt;/a&gt;) and the source is on GitHub (&lt;a href="https://github.com/yamanahlawat/llm-registry" rel="noopener noreferrer"&gt;https://github.com/yamanahlawat/llm-registry&lt;/a&gt;).&lt;/p&gt;

&lt;h2&gt;Demo&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fimacjs57df487pgcykw4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fimacjs57df487pgcykw4.png" alt=" " width="800" height="463"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvus3ksc6dajq41cxchot.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvus3ksc6dajq41cxchot.png" alt=" " width="800" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhryljiq9vue48121fple.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhryljiq9vue48121fple.png" alt=" " width="800" height="291"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;My Experience with GitHub Copilot CLI&lt;/h2&gt;

&lt;p&gt;I used Copilot CLI throughout building this project. Since LLM Registry is itself a CLI tool, it made sense to build it with a terminal-native agent rather than relying on an IDE.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>githubchallenge</category>
      <category>cli</category>
      <category>githubcopilot</category>
    </item>
  </channel>
</rss>
