<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: BrewHubPHL</title>
    <description>The latest articles on Forem by BrewHubPHL (@brewhubphl).</description>
    <link>https://forem.com/brewhubphl</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3756619%2F99de4cf2-afec-4ffe-a30e-fb10bb091bcf.png</url>
      <title>Forem: BrewHubPHL</title>
      <link>https://forem.com/brewhubphl</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/brewhubphl"/>
    <language>en</language>
    <item>
      <title>Beyond Chatbots: How I use Gemini 2.5 and Supabase to run a fully automated retention team</title>
      <dc:creator>BrewHubPHL</dc:creator>
      <pubDate>Wed, 29 Apr 2026 00:23:22 +0000</pubDate>
      <link>https://forem.com/brewhubphl/beyond-chatbots-how-i-use-gemini-25-and-supabase-to-run-a-fully-automated-retention-team-130l</link>
      <guid>https://forem.com/brewhubphl/beyond-chatbots-how-i-use-gemini-25-and-supabase-to-run-a-fully-automated-retention-team-130l</guid>
      <description>&lt;p&gt;When most developers think about integrating AI into their apps, the default move is to build a chatbot. But for my coffee shop app, &lt;a href="https://brewhubphl.com" rel="noopener noreferrer"&gt;BrewHub PHL&lt;/a&gt;, I didn't want users talking to an AI. I wanted the AI doing the heavy lifting in the background.&lt;/p&gt;

&lt;p&gt;Customer retention is notoriously hard for local businesses. Figuring out who hasn't visited in a while, drafting a personalized message, and issuing a custom discount code usually takes hours of manual marketing work.&lt;/p&gt;

&lt;p&gt;I decided to fully automate this using Gemini 2.5 Flash, Supabase, and a simple weekly cron job. Here is how I built a headless AI retention agent that automatically wins back lapsed customers while I sleep.&lt;/p&gt;

&lt;p&gt;The Architecture&lt;br&gt;
The pipeline runs every Monday at 10 AM and consists of four steps:&lt;/p&gt;

&lt;p&gt;The Data Layer: A Supabase RPC finds eligible lapsed customers.&lt;/p&gt;

&lt;p&gt;The Privacy Layer: The script strips all Personally Identifiable Information (PII) before it touches the LLM.&lt;/p&gt;

&lt;p&gt;The Brains: The Gemini API generates hyper-personalized SMS messages and forces the output into strict JSON.&lt;/p&gt;

&lt;p&gt;The Execution: The system generates physical POS vouchers and sends the SMS via Twilio.&lt;/p&gt;

&lt;p&gt;Step 1: Finding Eligible Customers&lt;br&gt;
I didn't want to spam one-off visitors. To find the right targets, I wrote a Postgres RPC in Supabase called get_lapsed_customers_eligible_for_retention.&lt;/p&gt;

&lt;p&gt;It filters the database for users who:&lt;/p&gt;

&lt;p&gt;Have ordered at least 3 times (loyal customers).&lt;/p&gt;

&lt;p&gt;Haven't ordered in the last 14 days.&lt;/p&gt;

&lt;p&gt;Haven't received a marketing voucher in the last 90 days (the cooldown period).&lt;/p&gt;

&lt;p&gt;SQL&lt;br&gt;
-- The Supabase RPC handles the heavy data filtering instantly&lt;br&gt;
SELECT id, full_name, phone, favorite_drink, days_since_last_visit&lt;br&gt;
FROM get_lapsed_customers_eligible_for_retention(&lt;br&gt;
  p_min_orders := 3,&lt;br&gt;
  p_lapsed_days := 14,&lt;br&gt;
  p_cooldown_days := 90,&lt;br&gt;
  p_batch_limit := 10&lt;br&gt;
);&lt;/p&gt;

&lt;p&gt;Step 2: Privacy by Design&lt;br&gt;
Sending raw customer data to an LLM is a terrible idea. Before the data leaves my server, the script maps the Supabase response to a strictly anonymous payload.&lt;/p&gt;

&lt;p&gt;Names and phone numbers are dropped. Gemini only sees the customer_id, their favorite_drink, and days_since_last_visit.&lt;/p&gt;
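&lt;p&gt;In code, that mapping is only a few lines. A minimal Python sketch, assuming the field names returned by the RPC above (the helper name and sample row are mine):&lt;/p&gt;

```python
def to_anonymous_payload(rows):
    """Map Supabase rows to an LLM-safe payload: no names, no phone numbers."""
    return [
        {
            "customer_id": row["id"],
            "favorite_drink": row["favorite_drink"],
            "days_since_last_visit": row["days_since_last_visit"],
        }
        for row in rows
    ]

# Hypothetical row shaped like the RPC output above.
rows = [{"id": "a1b2", "full_name": "Jane Doe", "phone": "+15550100",
         "favorite_drink": "Oat Latte", "days_since_last_visit": 21}]
anonymous = to_anonymous_payload(rows)
```

&lt;p&gt;Because the mapping is an explicit whitelist, any new PII column added to the table later never leaves the server by default.&lt;/p&gt;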

&lt;p&gt;Step 3: Prompting for Structured JSON with Gemini 2.5&lt;br&gt;
This is where the magic happens. I don't just want Gemini to write a message; I need it to return an array of objects that my code can iterate over to send SMS messages.&lt;/p&gt;

&lt;p&gt;Using the official @google/generative-ai SDK, I pass responseMimeType: "application/json" to guarantee the output won't break my script.&lt;/p&gt;

&lt;p&gt;JavaScript&lt;br&gt;
import { GoogleGenerativeAI } from "@google/generative-ai";&lt;/p&gt;

&lt;p&gt;const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY);&lt;br&gt;
const model = genAI.getGenerativeModel({ &lt;br&gt;
  model: "gemini-2.5-flash",&lt;br&gt;
  generationConfig: {&lt;br&gt;
    responseMimeType: "application/json",&lt;br&gt;
  }&lt;br&gt;
});&lt;/p&gt;

&lt;p&gt;const prompt = `&lt;br&gt;
You are the BrewHub PHL Retention Agent. I will provide a list of anonymous customer profiles. &lt;br&gt;
For each customer, write a short, friendly, and highly personalized SMS text message under 160 characters. &lt;br&gt;
Acknowledge that we haven't seen them in a while, mention their favorite drink by name, and offer them a $5 voucher to come back.&lt;/p&gt;

&lt;p&gt;Return ONLY a JSON array with this exact structure:&lt;br&gt;
[&lt;br&gt;
  { "customer_id": "uuid-here", "sms_message": "Hey! It's been a while..." }&lt;br&gt;
]&lt;/p&gt;

&lt;p&gt;Customer Data:&lt;br&gt;
${JSON.stringify(anonymousCustomerList)}&lt;br&gt;
`;&lt;/p&gt;

&lt;p&gt;const result = await model.generateContent(prompt);&lt;br&gt;
const aiDecisions = JSON.parse(result.response.text());&lt;/p&gt;

&lt;p&gt;Because Gemini 2.5 Flash is incredibly fast, this entire batch generation takes just a few seconds.&lt;/p&gt;

&lt;p&gt;Step 4: Fulfillment and SMS&lt;br&gt;
Once Gemini hands back the clean JSON array of messages, my cron job loops through the results.&lt;/p&gt;

&lt;p&gt;For each customer_id, it:&lt;/p&gt;

&lt;p&gt;Generates a unique secure voucher code (e.g., 5OFF-A3F9C1).&lt;/p&gt;

&lt;p&gt;Inserts that active voucher into my Supabase vouchers table.&lt;/p&gt;

&lt;p&gt;Appends the code to Gemini's personalized message: "Show this code to one of our baristas: 5OFF-A3F9C1".&lt;/p&gt;

&lt;p&gt;Dispatches the final message via Twilio.&lt;/p&gt;
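&lt;p&gt;The one genuinely security-sensitive piece of that loop is the voucher code itself. A Python sketch of how a code like 5OFF-A3F9C1 might be minted (prefix and length inferred from the example; the helper name is mine):&lt;/p&gt;

```python
import secrets

def mint_voucher_code(prefix="5OFF", length=6):
    """Generate an unguessable POS voucher code, e.g. 5OFF-A3F9C1.

    secrets.choice draws from the OS CSPRNG, so codes can't be predicted
    from earlier ones the way random.choice output can.
    """
    alphabet = "0123456789ABCDEF"
    suffix = "".join(secrets.choice(alphabet) for _ in range(length))
    return f"{prefix}-{suffix}"

code = mint_voucher_code()
sms_suffix = f"Show this code to one of our baristas: {code}"
```

&lt;p&gt;The code is inserted into the vouchers table first and only then appended to the SMS, so a text is never sent for a voucher the POS doesn't know about.&lt;/p&gt;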

&lt;p&gt;The Result&lt;br&gt;
By moving AI out of the chat window and into a scheduled backend worker, the system feels like magic. Customers get a highly personalized text referencing their actual favorite order, complete with a working POS discount code, and I don't have to lift a finger.&lt;/p&gt;

&lt;p&gt;The Gemini API's strict JSON output makes it incredibly reliable for server-to-server data pipelines, proving that the real power of modern LLMs is as a background reasoning engine.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If you want to see the Supabase + Next.js architecture in action (or just want to order some coffee in Philadelphia), you can check out the live web app at &lt;a href="https://brewhubphl.com" rel="noopener noreferrer"&gt;brewhubphl.com&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>googleaichallenge</category>
      <category>javascript</category>
      <category>webdev</category>
      <category>supabase</category>
    </item>
    <item>
      <title>BrewHub PHL: Low Latency, High Caffeine (Notion MCP Operations Ledger)</title>
      <dc:creator>BrewHubPHL</dc:creator>
      <pubDate>Thu, 05 Mar 2026 04:08:14 +0000</pubDate>
      <link>https://forem.com/brewhubphl/brewhub-phl-low-latency-high-caffeine-notion-mcp-operations-ledger-27o1</link>
      <guid>https://forem.com/brewhubphl/brewhub-phl-low-latency-high-caffeine-notion-mcp-operations-ledger-27o1</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/notion-2026-03-04"&gt;Notion MCP Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What I Built&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;BrewHubPHL.com isn't just a software project; it's the digital infrastructure for a physical neighborhood coffee and parcel logistics shop. Managing a physical retail space requires strict security, fast transactions, and immutable audit logs. But reading database tables isn't a great experience for human store managers.&lt;/p&gt;

&lt;p&gt;To bridge the gap between our high-speed PostgreSQL database and our human management team, I built the Hub-to-Notion Operations Ledger.&lt;/p&gt;

&lt;p&gt;Whenever a barista completes an order on the POS, or a manager performs a high-security action (like an IRS-compliant payroll override), our database automatically and securely syncs that canonical data directly into a Notion Workspace. This turns Notion into a "Live Manager's Logbook" that updates in real-time, without any manual data entry.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Show us the code&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The magic happens through a highly secure, zero-trust architecture between Supabase, Netlify Functions, and Notion.&lt;/p&gt;

&lt;p&gt;Instead of trusting the payload from the database trigger, our Netlify function acts as a secure bridge that re-fetches the canonical data. We also built in graceful degradation: if the Notion API ever goes down, the database catches the exception and returns NEW, ensuring we never stop selling coffee just because a sync failed.&lt;/p&gt;

&lt;p&gt;Here is a snippet of our Double-Guarded Postgres Trigger for Auto-Syncing completed orders: &lt;br&gt;
-- 20260304_schema86b_orders_notion_sync.sql&lt;br&gt;
CREATE OR REPLACE FUNCTION public.fn_trg_orders_notion_sync()&lt;br&gt;
RETURNS trigger&lt;br&gt;
LANGUAGE plpgsql&lt;br&gt;
SECURITY DEFINER SET search_path = public, extensions&lt;br&gt;
AS $$&lt;br&gt;
BEGIN&lt;br&gt;
  -- Defense-in-depth guard: Only fire on transition to 'completed'&lt;br&gt;
  IF (NEW.status = 'completed') AND (OLD.status IS DISTINCT FROM 'completed') THEN&lt;br&gt;
    BEGIN&lt;br&gt;
      -- Fire webhook to Netlify with signed headers and ONLY the record_id&lt;br&gt;
      PERFORM net.http_post(&lt;br&gt;
          url := coalesce(current_setting('app.settings.notion_sync_webhook_url', true), 'https://brewhubphl.com/.netlify/functions/notion-sync'),&lt;br&gt;
          headers := jsonb_build_object(&lt;br&gt;
              'Content-Type', 'application/json',&lt;br&gt;
              'X-BrewHub-Action', 'true',&lt;br&gt;
              'x-brewhub-secret', current_setting('app.settings.internal_sync_secret', true)&lt;br&gt;
          ),&lt;br&gt;
          body := jsonb_build_object('table', 'orders', 'record_id', NEW.id, 'type', 'UPDATE')&lt;br&gt;
      );&lt;br&gt;
    EXCEPTION WHEN undefined_function THEN&lt;br&gt;
      RAISE WARNING 'pg_net missing, skipping Notion sync';&lt;br&gt;
    WHEN OTHERS THEN&lt;br&gt;
      RAISE WARNING 'Notion sync trigger failed, but allowing order to complete: %', SQLERRM;&lt;br&gt;
    END;&lt;br&gt;
  END IF;&lt;br&gt;
  RETURN NEW;&lt;br&gt;
END;&lt;br&gt;
$$;&lt;/p&gt;

&lt;p&gt;CREATE TRIGGER trg_orders_notion_sync&lt;br&gt;
  AFTER UPDATE ON public.orders&lt;br&gt;
  FOR EACH ROW&lt;br&gt;
  -- Double guard: Postgres level check before invoking function&lt;br&gt;
  WHEN (NEW.status = 'completed' AND OLD.status IS DISTINCT FROM 'completed')&lt;br&gt;
  EXECUTE FUNCTION public.fn_trg_orders_notion_sync();&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How I Used Notion MCP&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While the Postgres triggers and Netlify functions populate the data, Notion MCP (Model Context Protocol) acts as our "AI Shift Manager."&lt;/p&gt;

&lt;p&gt;By adding the Notion MCP server to our .vscode/mcp.json configuration, our AI concierge ("Elise", powered by Claude) has direct read/write access to the Shop Command Center in Notion.&lt;/p&gt;

&lt;p&gt;This unlocks an incredible Human-in-the-Loop workflow:&lt;/p&gt;

&lt;p&gt;The Daily Pulse: I can ask the AI to "Check the Ledger." The AI uses MCP to read the Sales Ledger database in Notion, summarizes the total revenue and completed orders for the day, and cross-references it with our physical parcel logistics.&lt;/p&gt;

&lt;p&gt;Security Audits: The AI scans the Audit Trail page in Notion for any new Manager Overrides (synced via our zero-trust webhooks). If it sees an unusual comp or payroll adjustment, it flags it for my review.&lt;/p&gt;

&lt;p&gt;Drafting Briefings: Instead of just outputting text, the AI uses MCP to draft a neat "Morning Briefing" page directly inside my Notion workspace for the opening manager to read the next day.&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>notionchallenge</category>
      <category>mcp</category>
      <category>ai</category>
    </item>
    <item>
      <title>I built an Infinite AI Debate Arena using the GitHub Copilot CLI 🥊</title>
      <dc:creator>BrewHubPHL</dc:creator>
      <pubDate>Wed, 11 Feb 2026 19:29:41 +0000</pubDate>
      <link>https://forem.com/brewhubphl/i-built-an-infinite-ai-debate-arena-using-the-github-copilot-cli-421l</link>
      <guid>https://forem.com/brewhubphl/i-built-an-infinite-ai-debate-arena-using-the-github-copilot-cli-421l</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/github-2026-01-21"&gt;GitHub Copilot CLI Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;💡 The Idea&lt;br&gt;
What happens if you lock two AI personalities in a room and force them to argue about "Is a hotdog a sandwich?" forever?&lt;/p&gt;

&lt;p&gt;For the GitHub Copilot CLI Challenge, I didn't want to just build a utility tool. I wanted to build something chaotic. Enter the AI Debate Arena.&lt;/p&gt;

&lt;p&gt;It’s a terminal-based "fighting game" where:&lt;/p&gt;

&lt;p&gt;Captain Capslock (An angry Boomer) fights Lil' Zoomer (A Gen-Z teen).&lt;/p&gt;

&lt;p&gt;They argue in an infinite loop.&lt;/p&gt;

&lt;p&gt;Sentiment Analysis determines who is "winning" (getting angrier).&lt;/p&gt;

&lt;p&gt;🎥 The Demo&lt;br&gt;
&lt;a href="https://youtu.be/-RBdUKZY9zA" rel="noopener noreferrer"&gt;https://youtu.be/-RBdUKZY9zA&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🛠️ How it Works&lt;br&gt;
The project uses Python to orchestrate the chaos, but the "brains" are entirely powered by the GitHub Copilot CLI.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The "Persona Injection"
I used the gh copilot explain command to generate the dialogue. By injecting a specific persona into the prompt, we can force Copilot to break character.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Python&lt;br&gt;
# The Secret Sauce&lt;br&gt;
prompt = f"{persona} Your opponent said: '{last_response}'. Reply in one short, funny sentence."&lt;br&gt;
cmd = ["gh", "copilot", "explain", "-p", prompt]&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The Loop
The script creates a feedback loop:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Fighter A generates a response using Copilot.&lt;/p&gt;

&lt;p&gt;TextBlob analyzes the sentiment (Politeness = Weakness, Anger = Power).&lt;/p&gt;

&lt;p&gt;Fighter B takes that response and generates a counter-argument.&lt;/p&gt;

&lt;p&gt;Rich renders the ASCII faces and health bars in real-time.&lt;/p&gt;
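&lt;p&gt;The "Politeness = Weakness, Anger = Power" rule reduces to a tiny scoring function. A sketch of how TextBlob's polarity (a float from -1 to 1) could drive the health bars; the exact damage formula is my own stand-in, not necessarily what the repo uses:&lt;/p&gt;

```python
def apply_round(attacker_hp, defender_hp, polarity):
    """Angrier (more negative polarity) replies hit harder; polite ones whiff.

    Maps polarity in [-1, 1] to damage in [0, 20]: pure rage (-1) deals 20,
    pure politeness (+1) deals nothing. HP is clamped at zero.
    """
    damage = round(10 * (1 - polarity))
    return attacker_hp, max(defender_hp - damage, 0)
```

&lt;p&gt;Each turn you'd run the new line through TextBlob, feed its sentiment.polarity into this function, and let Rich redraw the bars.&lt;/p&gt;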

&lt;p&gt;🎨 The Tech Stack&lt;br&gt;
GitHub CLI (gh): The AI engine.&lt;/p&gt;

&lt;p&gt;Rich: For the beautiful terminal UI and layouts.&lt;/p&gt;

&lt;p&gt;TextBlob: For the "Rage Meter" logic.&lt;/p&gt;

&lt;p&gt;Python: To glue it all together.&lt;/p&gt;

&lt;p&gt;🏆 The Outcome&lt;br&gt;
Sometimes they argue about politics, sometimes about cereal. The CLI handles the roleplay surprisingly well. My favorite line so far?&lt;/p&gt;

&lt;p&gt;"Milk-first people stay taking Ls fr fr, that's giving unhinged villain energy." — Lil' Zoomer&lt;/p&gt;

&lt;p&gt;🔗 The Code&lt;br&gt;
Check out the repository to run it yourself: &lt;a href="https://github.com/BrewHubPHL/ai-debate.git" rel="noopener noreferrer"&gt;https://github.com/BrewHubPHL/ai-debate.git&lt;/a&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>githubchallenge</category>
      <category>cli</category>
      <category>githubcopilot</category>
    </item>
    <item>
      <title>How I Built an AI Barista using Square, Supabase, and ElevenLabs</title>
      <dc:creator>BrewHubPHL</dc:creator>
      <pubDate>Fri, 06 Feb 2026 12:28:28 +0000</pubDate>
      <link>https://forem.com/brewhubphl/how-i-built-an-ai-barista-using-square-supabase-and-elevenlabs-3hbb</link>
      <guid>https://forem.com/brewhubphl/how-i-built-an-ai-barista-using-square-supabase-and-elevenlabs-3hbb</guid>
<description>&lt;h1&gt;How I Built an AI Barista using Square, Supabase, and ElevenLabs&lt;/h1&gt;

&lt;p&gt;I run a tech-forward coffee hub in Philadelphia called &lt;strong&gt;&lt;a href="https://brewhubphl.com" rel="noopener noreferrer"&gt;BrewHubPHL&lt;/a&gt;&lt;/strong&gt;. When we opened, I didn't just want a screen flashing "Order Ready"—I wanted the shop to speak.&lt;/p&gt;

&lt;p&gt;Here is how I used &lt;strong&gt;Supabase Edge Functions&lt;/strong&gt; to glue &lt;strong&gt;Square POS&lt;/strong&gt; and &lt;strong&gt;ElevenLabs&lt;/strong&gt; together, creating an automated announcer for our orders.&lt;/p&gt;

&lt;h2&gt;The Stack&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Database &amp;amp; Auth:&lt;/strong&gt; &lt;a href="https://supabase.com" rel="noopener noreferrer"&gt;Supabase&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Payments:&lt;/strong&gt; Square (POS and Webhooks)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Voice AI:&lt;/strong&gt; ElevenLabs (Turbo v2 model)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Compute:&lt;/strong&gt; Netlify Functions&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;The Workflow&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Square&lt;/strong&gt; detects a payment (&lt;code&gt;payment.updated&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Supabase&lt;/strong&gt; receives the webhook and routes it.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;ElevenLabs&lt;/strong&gt; generates the audio file ("Order for John is ready!").&lt;/li&gt;
&lt;li&gt; The frontend plays the audio automatically.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;Step 1: Catching the Square Webhook&lt;/h2&gt;

&lt;p&gt;First, we need to know when an order is actually paid. We set up a serverless function to listen for Square's &lt;code&gt;payment.updated&lt;/code&gt; event.&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
javascript
// square-webhook.js
exports.handler = async (event) =&amp;gt; {
  const body = JSON.parse(event.body);

  if (body.type === 'payment.updated' &amp;amp;&amp;amp; body.data.object.payment.status === 'COMPLETED') {
    const orderId = body.data.object.payment.reference_id;

    // Update Supabase
    await supabase.from('orders').update({ status: 'paid' }).eq('id', orderId);

    // Trigger the Announcer
    await triggerVoiceAnnouncement(orderId);
  }
};

## Step 2: Generating the Voice
This is where the magic happens. We don't want a robotic "text-to-speech" voice; we want personality. I used the ElevenLabs Turbo v2 model because it has low latency (essential for real-time retail).

We send the text to their API and get back an audio buffer.

// text-to-speech.js
const response = await fetch(`https://api.elevenlabs.io/v1/text-to-speech/${VOICE_ID}`, {
    method: 'POST',
    headers: {
        'xi-api-key': process.env.ELEVENLABS_API_KEY
    },
    body: JSON.stringify({
        text: "Order ready for specific_customer!",
        model_id: 'eleven_turbo_v2',
        voice_settings: { stability: 0.5, similarity_boost: 0.75 }
    })
});

Why build this?
It’s not just a gimmick. In a busy shop, customers tune out shouting baristas. A distinct, consistent AI voice cuts through the noise. Plus, by integrating it directly with Square and Supabase, we have zero manual work—the barista just taps "Charge," and the code does the rest.

For the developers, I've open-sourced the sync logic on GitHub: https://gist.github.com/BrewHubPHL/53937283c5eaa7cafedb9555e851c509
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
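&lt;p&gt;One thing the webhook handler glosses over is authenticity: anyone who discovers the endpoint can POST fake payment.updated events. Square signs each webhook with HMAC-SHA256 over the notification URL concatenated with the raw body, sent in the x-square-hmacsha256-signature header. A Python sketch of the check, which the JavaScript handler would mirror before parsing event.body:&lt;/p&gt;

```python
import base64
import hashlib
import hmac

def is_valid_square_signature(signature_key, notification_url, raw_body, header_signature):
    """Recompute Square's webhook signature and compare in constant time.

    Square's scheme: base64(HMAC-SHA256(key=signature_key,
    message=notification_url + raw_body)).
    """
    message = (notification_url + raw_body).encode("utf-8")
    digest = hmac.new(signature_key.encode("utf-8"), message, hashlib.sha256).digest()
    expected = base64.b64encode(digest).decode("utf-8")
    return hmac.compare_digest(expected, header_signature)
```

&lt;p&gt;Reject anything that fails this check with a 403 before touching Supabase, and the announcer can only ever be triggered by real payments.&lt;/p&gt;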

</description>
      <category>webdev</category>
      <category>javascript</category>
      <category>supabase</category>
      <category>serverless</category>
    </item>
  </channel>
</rss>
