<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Marius Gherasim</title>
    <description>The latest articles on Forem by Marius Gherasim (@mariuscg).</description>
    <link>https://forem.com/mariuscg</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3658956%2Fd0361008-c065-4c9e-bf5f-dbd66ea1bde9.jpg</url>
      <title>Forem: Marius Gherasim</title>
      <link>https://forem.com/mariuscg</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/mariuscg"/>
    <language>en</language>
    <item>
      <title>Goodbye SEO, Hello GEO: Why your stack needs an /llms.txt file in 2026</title>
      <dc:creator>Marius Gherasim</dc:creator>
      <pubDate>Tue, 06 Jan 2026 09:50:44 +0000</pubDate>
      <link>https://forem.com/mariuscg/goodbye-seo-hello-geo-why-your-stack-needs-an-llmstxt-file-in-2026-2e73</link>
      <guid>https://forem.com/mariuscg/goodbye-seo-hello-geo-why-your-stack-needs-an-llmstxt-file-in-2026-2e73</guid>
      <description>&lt;p&gt;We have spent the last two decades optimizing the DOM for Google's crawlers. We obsess over semantic HTML, hydration, and Core Web Vitals. But in 2026, the game has changed. We aren't just building for browsers anymore; we are building for Agents.&lt;/p&gt;

&lt;p&gt;Welcome to the era of GEO (Generative Engine Optimization).&lt;/p&gt;

&lt;h2&gt;The Problem: HTML is "Expensive"&lt;/h2&gt;

&lt;p&gt;When an AI agent (like Gemini, ChatGPT, or a custom RAG pipeline) crawls your site, it isn't looking for visual layout. It’s looking for context.&lt;/p&gt;

&lt;p&gt;Standard websites are heavy. Parsing a modern React app to extract a simple value proposition wastes tokens. If your site structure is confusing, AI models will hallucinate your services or, worse, skip you entirely to save compute costs. A low GEO score means you are effectively invisible to the "new internet."&lt;/p&gt;

&lt;h2&gt;The Solution: The llms.txt Standard&lt;/h2&gt;

&lt;p&gt;Just as robots.txt tells crawlers what they can access, llms.txt tells AI agents what matters.&lt;/p&gt;

&lt;p&gt;This is a standardized Markdown file located at the root of your domain. It strips away the UI/UX fluff and presents your site's logic in raw, token-efficient text.&lt;/p&gt;
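&lt;p&gt;Per the llms.txt proposal, the file is plain Markdown: an H1 with the site name, a blockquote summary, then H2 sections of annotated links. A minimal sketch (all names and URLs below are placeholders):&lt;/p&gt;

```markdown
# Acme Widgets

> Acme sells self-hosted widget dashboards for small teams.

## Docs

- [Quickstart](https://acme.example/docs/quickstart): install and first run
- [API reference](https://acme.example/docs/api): REST endpoints and auth

## Pricing

- [Plans](https://acme.example/pricing): free tier, Pro at a flat monthly rate
```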

&lt;h3&gt;Why it matters&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Token Efficiency:&lt;/strong&gt; Without it, AI agents can waste up to 400% more tokens traversing your DOM.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Context Window:&lt;/strong&gt; It ensures your core value prop fits within the context window of smaller, faster models.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Accuracy:&lt;/strong&gt; It prevents hallucination by providing a source of truth.&lt;/p&gt;

&lt;h2&gt;How to Implement It&lt;/h2&gt;

&lt;p&gt;You don't need a data scientist to fix this. You just need to refactor your content for machines.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Audit your current visibility:&lt;/strong&gt; See how an LLM currently parses your homepage.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create the file:&lt;/strong&gt; Generate a structured Markdown summary of your documentation and pricing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deploy:&lt;/strong&gt; Place it at yourdomain.com/llms.txt.&lt;/p&gt;
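&lt;p&gt;After deploying, a quick sanity check helps. The sketch below is a hypothetical heuristic of my own (an H1 title followed by a blockquote summary), not an official validator:&lt;/p&gt;

```python
def looks_like_llms_txt(text: str) -> bool:
    """Heuristic check: an H1 site title first, then a blockquote summary."""
    lines = [l for l in text.splitlines() if l.strip()]
    if not lines or not lines[0].startswith("# "):
        return False  # must open with a single H1 site title
    # expect a short "> ..." summary somewhere after the title
    return any(l.startswith("> ") for l in lines[1:])

sample = "# Acme Widgets\n\n> Self-hosted widget dashboards.\n\n## Docs\n"
print(looks_like_llms_txt(sample))  # True
```

In practice you would fetch yourdomain.com/llms.txt and run the response body through a check like this.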

&lt;p&gt;I built Refactor.Tools to automate this process. It runs a GEO audit to show you exactly how AI sees your site (scoring readability and token efficiency) and helps you generate a compliant llms.txt instantly.&lt;/p&gt;

&lt;p&gt;The future of the web is headless and agent-driven. Don't let your site get left behind.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6tu3f429xgxeeahcod9r.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6tu3f429xgxeeahcod9r.jpeg" alt=" " width="800" height="543"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>ai</category>
      <category>seo</category>
      <category>productivity</category>
    </item>
    <item>
      <title>A checklist for 2026 Web Security: Why TLS 1.3 and DMARC are no longer optional for SEO and GEO (AI visibility)</title>
      <dc:creator>Marius Gherasim</dc:creator>
      <pubDate>Tue, 30 Dec 2025 15:45:37 +0000</pubDate>
      <link>https://forem.com/mariuscg/a-checklist-for-2026-web-security-why-tls-13-and-dmarc-are-no-longer-optional-for-seo-and-geo-ai-4mo7</link>
      <guid>https://forem.com/mariuscg/a-checklist-for-2026-web-security-why-tls-13-and-dmarc-are-no-longer-optional-for-seo-and-geo-ai-4mo7</guid>
      <description>&lt;p&gt;In 2026, website security is no longer an optional add-on; it is your digital foundation. Whether your visitors are humans, search engine bots, or AI agents, a secure site is the only way to maintain trust and visibility. &lt;/p&gt;

&lt;p&gt;Think of security as the locks on your front door: if they are broken, you risk losing your data, your customers, and your search engine rankings. To ensure your "locks" are solid, you need to instantly audit your website, identifying and fixing security flaws before they become liabilities. &lt;/p&gt;

&lt;h2&gt;🛡️ Why Security Matters Now&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;For Humans:&lt;/strong&gt; A "Not Secure" warning kills brand trust. Regularly scanning your SSL/TLS certificates ensures visitors always feel safe.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For Search Engines:&lt;/strong&gt; Google penalizes sites with security gaps. A clean audit helps maintain your SEO rankings and prevents manual actions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For AI Agents:&lt;/strong&gt; AI crawlers (like Gemini or ChatGPT) prioritize secure, high-authority sources. If flaws are detected, AI agents may skip your content entirely.&lt;/p&gt;

&lt;h2&gt;⚠️ The Silent Risks&lt;/h2&gt;

&lt;p&gt;Most site owners don't realize they are at risk until it's too late. It is vital to shine a light on "hidden" issues like:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Vulnerable Libraries:&lt;/strong&gt; Identifying outdated code with known exploits.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;DNS Gaps:&lt;/strong&gt; Flagging missing SPF or DMARC records to prevent hackers from spoofing your email.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Script Risks:&lt;/strong&gt; Auditing third-party tools (like chats or cookie banners) that might be leaking data.&lt;/p&gt;

&lt;h2&gt;✅ Fix it Fast&lt;/h2&gt;

&lt;p&gt;Securing your site doesn't have to be a manual headache. You can turn a complex audit into an easy, actionable process:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Check Your Score:&lt;/strong&gt; Instantly see your Threat Intelligence breakdown. If your score isn't a perfect green, you need to know exactly why.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Seal DNS Leaks:&lt;/strong&gt; Identify missing CAA or DMARC records and get the data you need to fix them in your domain settings.&lt;/p&gt;
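&lt;p&gt;For illustration, hardened records of the kind an audit flags look roughly like this as DNS zone entries (the domain, policy values, and certificate authority are placeholders; tune them to your setup):&lt;/p&gt;

```text
example.com.        TXT  "v=spf1 include:_spf.google.com ~all"
_dmarc.example.com. TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com"
example.com.        CAA  0 issue "letsencrypt.org"
```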

&lt;p&gt;&lt;strong&gt;Audit Scripts:&lt;/strong&gt; Easily review "Medium Risk" scripts. If you don’t recognize a third-party script, remove it to keep your site clean.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Modern Encryption:&lt;/strong&gt; Ensure you are using TLS 1.3, the gold standard for secure communication in 2026.&lt;/p&gt;
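&lt;p&gt;You can verify this from the client side with Python's standard ssl module; a minimal sketch (the helper function name is my own):&lt;/p&gt;

```python
import socket
import ssl

# Client context that refuses anything older than TLS 1.3:
# the handshake simply fails against TLS 1.2-only servers.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_3

def tls13_ok(host: str, port: int = 443) -> bool:
    """Return True if the server completes a TLS 1.3 handshake."""
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version() == "TLSv1.3"
    except (OSError, ssl.SSLError):
        return False
```

Run it against your own domain; a False result means the server either negotiated an older protocol or failed the handshake outright.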

&lt;p&gt;Is your site ready for the year ahead? Don't wait for a breach to find out. Head over to &lt;a href="https://refactor.tools" rel="noopener noreferrer"&gt;Refactor.Tools&lt;/a&gt;, run an audit, and keep your digital presence safe, visible, and trusted.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>security</category>
      <category>seo</category>
      <category>ai</category>
    </item>
    <item>
      <title>Why Your Site Needs /llms.txt (and How to Create One)</title>
      <dc:creator>Marius Gherasim</dc:creator>
      <pubDate>Thu, 18 Dec 2025 12:54:08 +0000</pubDate>
      <link>https://forem.com/mariuscg/why-your-site-needs-llmstxt-and-how-to-create-one-4jgp</link>
      <guid>https://forem.com/mariuscg/why-your-site-needs-llmstxt-and-how-to-create-one-4jgp</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmziaupvadz0btlyzbdoy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmziaupvadz0btlyzbdoy.png" alt="A confused robot looking at a book (robots.txt), a labyrinth (Sitemap.xml) and a document (llms.txt)" width="800" height="466"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>ai</category>
      <category>productivity</category>
      <category>seo</category>
    </item>
    <item>
      <title>I got tired of Audit tools that just list problems, so I built one that writes the fixes (and roasts you)</title>
      <dc:creator>Marius Gherasim</dc:creator>
      <pubDate>Fri, 12 Dec 2025 18:08:48 +0000</pubDate>
      <link>https://forem.com/mariuscg/i-got-tired-of-audit-tools-that-just-list-problems-so-i-built-one-that-writes-the-fixes-and-25bh</link>
      <guid>https://forem.com/mariuscg/i-got-tired-of-audit-tools-that-just-list-problems-so-i-built-one-that-writes-the-fixes-and-25bh</guid>
      <description>&lt;p&gt;The Problem We've all been there. You run a Lighthouse audit or check PageSpeed Insights. It gives you a red score and a vague list of complaints:&lt;/p&gt;

&lt;p&gt;"Eliminate render-blocking resources"&lt;/p&gt;

&lt;p&gt;"Ensure text remains visible during webfont load"&lt;/p&gt;

&lt;p&gt;"Serve static assets with an efficient cache policy"&lt;/p&gt;

&lt;p&gt;Great. Thanks. But how do I fix it? Usually, that means 2 hours of Googling and messing with config files.&lt;/p&gt;
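&lt;p&gt;For context, two of those three complaints usually reduce to one-liners once you know where they go. Illustrative sketches only (the font name and cache lifetime are placeholders, and the second snippet is a raw HTTP response header):&lt;/p&gt;

```text
/* "Ensure text remains visible during webfont load" -- CSS */
@font-face {
  font-family: "Inter";
  src: url("/fonts/inter.woff2") format("woff2");
  font-display: swap;  /* show fallback text while the webfont loads */
}

# "Serve static assets with an efficient cache policy" -- response header
Cache-Control: public, max-age=31536000, immutable
```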

&lt;h2&gt;The Solution&lt;/h2&gt;

&lt;p&gt;I spent the last few months building Refactor.tools. I wanted to build the "ultimate" website audit tool that doesn't just complain, but actually helps you fix the code.&lt;/p&gt;

&lt;p&gt;What makes it different? Unlike standard validators, it is built with a "Fixer First" mentality, using a modern stack (Next.js, Supabase, Puppeteer):&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Real User Data (CrUX):&lt;/strong&gt; It doesn't just run a lab test; it pulls real-world Chrome User Experience data (LCP, INP, CLS) so you know how actual users feel.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;GEO (Generative Engine Optimization):&lt;/strong&gt; This is new. It checks if your site is ready for AI Search (ChatGPT, Perplexity), including llms.txt detection and AI readability scoring.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The AI Fixer:&lt;/strong&gt; This is the killer feature. If it finds a missing meta tag, a bad CSP header, or broken JSON-LD, it generates the copy-paste code to solve it.&lt;/p&gt;

&lt;p&gt;💀 The "Roast Mode" (Proceed with Caution) I got a little bored during development, so I added a "Roast My Site" tab. It uses vision-capable AI to look at your landing page and critique your design, UX, and copy. It does not hold back. It’s hilarious, but honestly... it’s usually right.&lt;/p&gt;

&lt;h2&gt;The Tech Stack&lt;/h2&gt;

&lt;p&gt;For those interested in how it's built:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Frontend:&lt;/strong&gt; Next.js on Vercel.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Backend:&lt;/strong&gt; Supabase (Auth/DB) + Google Cloud Run for the heavy lifting.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Analysis:&lt;/strong&gt; Headless Puppeteer for DOM analysis + CrUX API.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Security:&lt;/strong&gt; We use Vercel Edge Middleware to inject dynamic nonces for strict CSP compliance (a pain to build, but worth it for security).&lt;/p&gt;
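&lt;p&gt;A strict nonce-based policy of that kind looks roughly like the header below (the nonce value is illustrative and must be freshly generated per response; exact directives depend on your app):&lt;/p&gt;

```text
Content-Security-Policy: script-src 'nonce-rAnd0m123' 'strict-dynamic'; object-src 'none'; base-uri 'none'
```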

&lt;h2&gt;🚀 I need your help to break it&lt;/h2&gt;

&lt;p&gt;It’s almost ready for the public, but I need honest feedback to make it bulletproof.&lt;/p&gt;

&lt;p&gt;I am looking for Founding Members. If you try it out and send me an honest email about your experience (what broke, what confused you), I’ll give you the PRO Plan for free (1 year).&lt;/p&gt;

&lt;p&gt;Check it out here: &lt;a href="https://refactor.tools" rel="noopener noreferrer"&gt;https://refactor.tools&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let me know in the comments if the AI Roast was too mean. 🌶️&lt;/p&gt;

</description>
      <category>tooling</category>
      <category>performance</category>
      <category>showdev</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
