<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Saad Nasir</title>
    <description>The latest articles on Forem by Saad Nasir (@saad_nasir_3734a6f23ad395).</description>
    <link>https://forem.com/saad_nasir_3734a6f23ad395</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3889228%2Ffeb109da-9932-4e26-ab26-4657f9c1b01b.png</url>
      <title>Forem: Saad Nasir</title>
      <link>https://forem.com/saad_nasir_3734a6f23ad395</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/saad_nasir_3734a6f23ad395"/>
    <language>en</language>
    <item>
      <title>I Built a "Safety Belt" for AI Code Generation. Here's Why</title>
      <dc:creator>Saad Nasir</dc:creator>
      <pubDate>Mon, 20 Apr 2026 15:17:45 +0000</pubDate>
      <link>https://forem.com/saad_nasir_3734a6f23ad395/i-built-a-safety-belt-for-ai-code-generation-heres-why-2b45</link>
      <guid>https://forem.com/saad_nasir_3734a6f23ad395/i-built-a-safety-belt-for-ai-code-generation-heres-why-2b45</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8y28xom0iijypnhxzevb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8y28xom0iijypnhxzevb.png" alt=" " width="800" height="619"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  I Built a "Safety Belt" for AI Code Generation
&lt;/h1&gt;

&lt;p&gt;AI coding tools are incredible. They're also terrifying.&lt;/p&gt;

&lt;p&gt;Last month, I asked Cursor to add a simple caching layer to my API. It generated 200 lines of code, imported three new libraries, and refactored two functions I didn't ask it to touch.&lt;/p&gt;

&lt;p&gt;It worked. But I had no idea &lt;strong&gt;why&lt;/strong&gt; it chose Redis over Memcached. Or why it rewrote my error handler.&lt;/p&gt;

&lt;p&gt;I stared at the diff and realized: &lt;strong&gt;I didn't fully understand my own codebase anymore.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;That's when I built &lt;strong&gt;Verif.ai&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  The Problem Nobody's Talking About
&lt;/h2&gt;

&lt;p&gt;We're entering the era of "vibe coding"—telling an AI what we want and accepting whatever it spits out as long as tests pass.&lt;/p&gt;

&lt;p&gt;Here's what's happening under the surface:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Symptom&lt;/th&gt;
&lt;th&gt;What It Really Means&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;"Why did it use that library?"&lt;/td&gt;
&lt;td&gt;You're accumulating dependency debt you don't understand&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;"Who approved this change?"&lt;/td&gt;
&lt;td&gt;No one. The AI just did it&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;"Can we prove this is compliant?"&lt;/td&gt;
&lt;td&gt;No audit trail exists&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;"I don't remember writing this"&lt;/td&gt;
&lt;td&gt;You didn't. And neither did anyone else&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;This is &lt;strong&gt;comprehension debt&lt;/strong&gt;. Code that works but nobody fully understands.&lt;/p&gt;




&lt;h2&gt;
  What Verif.ai Does
&lt;/h2&gt;

&lt;p&gt;Verif.ai does three simple things:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. It Pauses the AI&lt;/strong&gt;&lt;br&gt;
Before AI-generated code touches your files, Verif.ai intercepts and says: &lt;em&gt;"Hold on. Explain yourself first."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. It Demands a Case File&lt;/strong&gt;&lt;br&gt;
The AI must document:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What it's about to do&lt;/li&gt;
&lt;li&gt;Why it chose that approach&lt;/li&gt;
&lt;li&gt;What alternatives it considered&lt;/li&gt;
&lt;li&gt;Where it got its information&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;3. It Waits for Human Approval&lt;/strong&gt;&lt;br&gt;
You review. You approve. Only then does code land.&lt;/p&gt;
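&lt;p&gt;Here's a minimal sketch of that gate in Python. Everything in it is hypothetical (the function and field names are mine, not Verif.ai's API); it just shows the shape of the flow: refuse an incomplete case file, and let code land only after a human callback says yes.&lt;/p&gt;

```python
# Hypothetical sketch of the three-step gate described above.
# These names are illustrative; they are not Verif.ai's actual API.

def review_gate(case_file, approve):
    """Block an AI-generated change until a human approves its case file."""
    required = ["what", "why", "alternatives", "sources"]
    missing = [k for k in required if not case_file.get(k)]
    if missing:
        # Steps 1 and 2: pause, and refuse to proceed without a full case file.
        raise ValueError(f"incomplete case file, missing: {missing}")
    # Step 3: a human decides; the change is applied only on True.
    return bool(approve(case_file))
```

&lt;p&gt;In a real tool the &lt;code&gt;approve&lt;/code&gt; callback would be an interactive prompt or a review UI; here it's a plain function so the control flow stays obvious.&lt;/p&gt;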

&lt;p&gt;*&lt;em&gt;Every approval is cryptographically signed. You get a tamper-proof audit trail.&lt;br&gt;
*&lt;/em&gt;&lt;br&gt;
Check out the repo, try the quickstart, and tear it apart in the issues.&lt;/p&gt;
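&lt;p&gt;To make "tamper-proof" concrete, here's a toy version of a signed approval record in Python, using an HMAC as a stand-in for a real signature scheme (Verif.ai's actual format may differ): any edit to an approved record invalidates its signature.&lt;/p&gt;

```python
import hashlib
import hmac
import json
import time

# Illustrative only: a real tool would use per-reviewer keys, not a constant.
SECRET = b"reviewer-key"

def sign_approval(case_file, reviewer):
    """Produce a tamper-evident audit record for an approved change."""
    record = {"case": case_file, "reviewer": reviewer, "ts": time.time()}
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return record

def verify(record):
    """Recompute the signature; any edit to the record breaks the match."""
    unsigned = {k: v for k, v in record.items() if k != "sig"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record["sig"], expected)
```

&lt;p&gt;Serializing with &lt;code&gt;sort_keys=True&lt;/code&gt; gives a canonical byte string, so signing and verification always hash the same payload.&lt;/p&gt;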

&lt;p&gt;GitHub: &lt;a href="https://github.com/saadnasirajk5-tech/Verif" rel="noopener noreferrer"&gt;https://github.com/saadnasirajk5-tech/Verif&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Even a star helps more than you know. It tells me I'm not crazy for caring about this stuff.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>opensource</category>
      <category>python</category>
    </item>
  </channel>
</rss>
