<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Luis Vargas</title>
    <description>The latest articles on Forem by Luis Vargas (@motion_design).</description>
    <link>https://forem.com/motion_design</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3836990%2F6c79a409-0769-4540-84c9-46181c71a7f3.png</url>
      <title>Forem: Luis Vargas</title>
      <link>https://forem.com/motion_design</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/motion_design"/>
    <language>en</language>
    <item>
      <title>How I Used AI Multi-Agent Simulation to Fix My Ad Messaging</title>
      <dc:creator>Luis Vargas</dc:creator>
      <pubDate>Fri, 17 Apr 2026 23:47:44 +0000</pubDate>
      <link>https://forem.com/motion_design/how-i-used-ai-multiagent-simulation-to-fix-my-ad-messaging-38o2</link>
      <guid>https://forem.com/motion_design/how-i-used-ai-multiagent-simulation-to-fix-my-ad-messaging-38o2</guid>
      <description>&lt;h1&gt;
  
  
  From Guessing to Knowing: How I Used AI Agent Simulation to Fix My Ad Messaging
&lt;/h1&gt;

&lt;p&gt;&lt;em&gt;A UX researcher's experiment in synthetic audience intelligence — and what it taught me about my own product.&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  The Honest Problem
&lt;/h2&gt;

&lt;p&gt;I had ads. Good ads, I thought.&lt;/p&gt;

&lt;p&gt;They were clean. They hit the features. They explained what Waco3.io does — proposal intelligence for freelancers and small service businesses. Intelligent tracking. Client engagement. Automated follow-ups.&lt;/p&gt;

&lt;p&gt;But something wasn't landing.&lt;/p&gt;

&lt;p&gt;The messaging felt like it was &lt;em&gt;about&lt;/em&gt; the product instead of &lt;em&gt;for&lt;/em&gt; the person reading it. I knew the pain points. I had ICP docs. I had personas. But I was still solving my customers' problems through my own lens — and that's a blind spot every designer and builder eventually hits.&lt;/p&gt;

&lt;p&gt;The typical fix? User research. Interviews, surveys, usability tests, qual data. All of which I believe in deeply.&lt;/p&gt;

&lt;p&gt;But here's the real-world constraint: I was solo, short on time, moving fast, and needed directional signal &lt;em&gt;now&lt;/em&gt; — not in three weeks, after scouting participants, recruiting, and scheduling fifteen interviews.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzktzm1jt6l4wxwflkerc.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzktzm1jt6l4wxwflkerc.webp" alt=" " width="480" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So I tried something different.&lt;/p&gt;




&lt;h2&gt;
  
  
  Enter MiroFish
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"From a single document, MiroFish extracts reality seeds to auto-generate a parallel world with up to million-scale Agents."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;MiroFish is an open-source multi-agent swarm simulation engine. The premise is deceptively simple: upload your documents, define your context, and let the system generate a populated simulation of your target world — complete with agents that think, disagree, post, follow, and react to each other in real time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhs268yt1cecmxou7nsv5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhs268yt1cecmxou7nsv5.png" alt=" " width="800" height="418"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It's v0.1-Preview. Rough around the edges. ~$5/sim. And genuinely one of the more interesting UX research shortcuts I've found.&lt;/p&gt;

&lt;p&gt;I don't call it a replacement for real research. I call it a &lt;strong&gt;matrix of opinions&lt;/strong&gt; — a fast, structured way to expose your idea to perspectives you didn't know you were missing.&lt;/p&gt;




&lt;h2&gt;
  
  
  What I Fed the System
&lt;/h2&gt;

&lt;p&gt;My input was three files:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;waco3_ideal_customer_persona.md&lt;/code&gt; — my ICP document&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;waco3_pain_points.md&lt;/code&gt; — documented emotional and practical frustrations of my target users&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;ads_waco_v1.txt&lt;/code&gt; — my current set of ad copies&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7lzuiau1attw8v9vhjem.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7lzuiau1attw8v9vhjem.png" alt=" " width="538" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The goal: not just &lt;em&gt;"does this ad perform?"&lt;/em&gt; — but &lt;strong&gt;how do different types of people within my audience actually interpret this message?&lt;/strong&gt; What do they feel? What do they reject? Where does the copy fail them emotionally?&lt;/p&gt;

&lt;p&gt;That's a UX question, not a marketing question.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 1: Ontology Generation — Building the World's Structure
&lt;/h2&gt;

&lt;p&gt;The first thing MiroFish does is analyze your documents and auto-generate an ontology: a structured map of the entity types and relationships that matter in your world.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe45ho09g7uw9olvceuqd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe45ho09g7uw9olvceuqd.png" alt=" " width="800" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For my simulation, it extracted:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Entity Types:&lt;/strong&gt; Freelancer · MicroAgency · ClientEntity · SaaScompany · IndustryInfluencer · MediaOutlet · ProfessionalAssociation · PublicFigure · Person · Organization&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Relation Types:&lt;/strong&gt; USES_PRODUCT · PROPOSES_TO · COMMENTS_ON · REPORTS_ON · FOLLOWS · COMPETES_WITH · ADVISES · ENDORSE_PRODUCT · REVIEWS_PRODUCT · PARTNERS_WITH&lt;/p&gt;

&lt;p&gt;This isn't just taxonomy. It's the system defining &lt;em&gt;how things in this world connect to each other&lt;/em&gt;. The GraphRAG Build then took all of this — auto-chunking the documents, extracting entities and relations, forming temporal memory and community summaries — producing &lt;strong&gt;158 entity nodes&lt;/strong&gt; and &lt;strong&gt;378 relation edges&lt;/strong&gt; across &lt;strong&gt;10 schema types&lt;/strong&gt;.&lt;/p&gt;
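&lt;p&gt;&lt;em&gt;To make that concrete, here's a rough sketch of what an ontology like this looks like as data. This is my own illustration of the idea, not MiroFish's actual schema or API:&lt;/em&gt;&lt;/p&gt;

```typescript
// Illustrative sketch: my own modeling of the ontology above,
// not MiroFish's real schema or API.
type EntityType =
  | "Freelancer" | "MicroAgency" | "ClientEntity" | "SaaScompany"
  | "IndustryInfluencer" | "MediaOutlet" | "ProfessionalAssociation"
  | "PublicFigure" | "Person" | "Organization";

type RelationType =
  | "USES_PRODUCT" | "PROPOSES_TO" | "COMMENTS_ON" | "REPORTS_ON"
  | "FOLLOWS" | "COMPETES_WITH" | "ADVISES" | "ENDORSE_PRODUCT"
  | "REVIEWS_PRODUCT" | "PARTNERS_WITH";

interface Entity { id: string; type: EntityType; summary: string }
interface Relation { from: string; to: string; type: RelationType }

// A tiny slice of what a 158-node, 378-edge graph contains
// (agent ids borrowed from the personas later in this post):
const entities: Entity[] = [
  { id: "marketing_freelancer_529", type: "Freelancer", summary: "solo marketer, ghosted proposals" },
  { id: "waco3_953", type: "SaaScompany", summary: "proposal intelligence product" },
];

const relations: Relation[] = [
  { from: "marketing_freelancer_529", to: "waco3_953", type: "USES_PRODUCT" },
  { from: "marketing_freelancer_529", to: "waco3_953", type: "COMMENTS_ON" },
];

// The payoff of a graph: relational questions become simple traversals,
// e.g. "which entities engage with the product in any way?"
const engaged = [...new Set(
  relations.filter((r) => r.to === "waco3_953").map((r) => r.from)
)];
console.log(engaged); // ["marketing_freelancer_529"]
```

&lt;p&gt;The point of the sketch: once relations are first-class, "how does a freelancer's frustration connect to a product's promise?" stops being a vibe and becomes a query.&lt;/p&gt;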

&lt;p&gt;From a UX research standpoint, this is like mapping your ecosystem before running a study. You're not asking one user one question. You're first understanding the full relational context of everyone in the room.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 2: Generated Agent Personas — A Population, Not a Sample
&lt;/h2&gt;

&lt;p&gt;Here's where it gets interesting.&lt;/p&gt;

&lt;p&gt;MiroFish didn't just generate a few personas. It generated a diverse, contextually grounded population — each agent with their own role, narrative, topic interests, and behavioral tendencies.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr83thyg0b19trnswgxw1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr83thyg0b19trnswgxw1.png" alt=" " width="800" height="266"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2sdmvg89l2zvititv7px.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2sdmvg89l2zvititv7px.png" alt=" " width="800" height="276"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftlr8th4b9q7bwzuhmqhz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftlr8th4b9q7bwzuhmqhz.png" alt=" " width="800" height="328"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What struck me wasn't just the volume — it was the &lt;em&gt;specificity&lt;/em&gt;. Each agent had a coherent professional identity:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;behavioral_psychologist_438&lt;/code&gt; — &lt;em&gt;"I dissect ad messaging using deep psychological principles and emotional triggers."&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;microagencies_917&lt;/code&gt; — focused on &lt;em&gt;"proposal mastery and sustainable client acquisition."&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;freelancers_927&lt;/code&gt; — &lt;em&gt;"the unified voice for independent professionals."&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;small_service_business_622&lt;/code&gt; — &lt;em&gt;"navigating manual proposals, awkward follow-ups, and fragmented workflows."&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These aren't made-up archetypes. They're synthesized from the reality seeds in my documents. The system found the people &lt;em&gt;in&lt;/em&gt; my data and brought them to life.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 3: The Graph — Seeing the Relationships
&lt;/h2&gt;

&lt;p&gt;Once the agents existed, MiroFish built the relationship web between them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;[Image: Full Graph Relationship Visualization — the interconnected node map with entity type legend]&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Every node. Every edge. Every connection between a freelancer's frustration and a SaaS product's promise. Between a behavioral psychologist's framing of fear-of-rejection and a micro-agency's experience of ghosted proposals.&lt;/p&gt;

&lt;p&gt;This visualization is the "matrix of opinions" I mentioned — a literal map of how these ideas intersect across different contexts, platforms, languages, time zones, and professional identities.&lt;/p&gt;

&lt;p&gt;As a UXer, this is your affinity diagram. Except instead of Post-its, it's a knowledge graph built from your own source material, at a scale no sprint session could produce.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 4: Simulation Environment — How the World Runs
&lt;/h2&gt;

&lt;p&gt;Before the agents start interacting, the simulation needs rules: when are people active? How often do they post? How strong are their community echo chambers?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8odpa3p0aiyv1cc429py.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8odpa3p0aiyv1cc429py.png" alt=" " width="800" height="707"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Key parameters for my run:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;72-hour simulation&lt;/strong&gt; across &lt;strong&gt;72 rounds&lt;/strong&gt; (60 minutes each)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;41 configured agents&lt;/strong&gt; with activity levels, sentiment biases, and response delays&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Peak hours:&lt;/strong&gt; 17:00–20:00 (×1.5 activity multiplier)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Two platform types:&lt;/strong&gt; Plaza/Feed and Topic/Community, each with different recency, popularity, and relevance weighting&lt;/li&gt;
&lt;/ul&gt;
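&lt;p&gt;&lt;em&gt;For intuition, these parameters could be captured in a config object like the one below. The shape and names are mine, not MiroFish's actual configuration format:&lt;/em&gt;&lt;/p&gt;

```typescript
// Hypothetical config shape for the run described above.
// Field names are illustrative, not MiroFish's real format.
interface SimulationConfig {
  durationHours: number;
  rounds: number;
  minutesPerRound: number;
  agentCount: number;
  peakHours: { start: number; end: number; activityMultiplier: number };
  // Each platform weights its feed differently, so the same message
  // surfaces in different contexts at different times.
  platforms: { name: string; recency: number; popularity: number; relevance: number }[];
}

const config: SimulationConfig = {
  durationHours: 72,
  rounds: 72,
  minutesPerRound: 60,
  agentCount: 41,
  peakHours: { start: 17, end: 20, activityMultiplier: 1.5 },
  platforms: [
    { name: "plaza_feed", recency: 0.5, popularity: 0.3, relevance: 0.2 },
    { name: "topic_community", recency: 0.2, popularity: 0.2, relevance: 0.6 },
  ],
};

// Sanity check: the rounds should tile the whole simulated window.
const simulatedMinutes = config.rounds * config.minutesPerRound;
console.log(simulatedMinutes === config.durationHours * 60); // true
```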

&lt;p&gt;This matters for UX research framing: you're not just testing what people think — you're testing &lt;em&gt;when&lt;/em&gt; and &lt;em&gt;where&lt;/em&gt; and &lt;em&gt;how&lt;/em&gt; they encounter your message. Context changes interpretation.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 5: Narrative Direction — The System's Verdict on My Ads
&lt;/h2&gt;

&lt;p&gt;This is the output that stopped me cold.&lt;/p&gt;

&lt;p&gt;After the initial activation orchestration, MiroFish surfaced a &lt;strong&gt;Narrative Direction&lt;/strong&gt; — a summary of what was working, what wasn't, and where the messaging needed to go:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft75lhvtjko68gsqoshjk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft75lhvtjko68gsqoshjk.png" alt=" " width="800" height="537"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"The current ad narrative is too generic and feature-focused, failing to connect with the deep emotional and practical pain points of freelancers and small service businesses. The desired narrative direction is to shift towards intensely emotional, pain-point-driven messaging that highlights the high cost of uncertainty and inaction (lost deals, mental overload, feeling unprofessional). The discourse should move from 'what Waco3 does' to 'what Waco3 solves for me', emphasizing clarity, control, and increased income by addressing the core psychology drivers of decision-making."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That's not a prompt I wrote. That's the system's synthesis after running my documents, personas, and ads through 158 entities and 378 relationships.&lt;/p&gt;

&lt;p&gt;And it was right.&lt;/p&gt;

&lt;p&gt;I was talking about features. My users were feeling fear.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 6: The Agents Talk — And They're Honest
&lt;/h2&gt;

&lt;p&gt;The simulation also generated an &lt;strong&gt;Initial Activation Sequence&lt;/strong&gt;: the first organic interactions between agents in response to the ads.&lt;/p&gt;

&lt;p&gt;Three agents, three reactions:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;FREELANCER&lt;/strong&gt; (&lt;code&gt;marketing_freelancer_529&lt;/code&gt;):&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Ever send a proposal and hear… crickets? That deafening silence is costing me deals and sleep. I just need to know if they even saw it, if they're interested. This uncertainty is brutal for my business."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;SAAS COMPANY&lt;/strong&gt; (&lt;code&gt;waco3_953&lt;/code&gt;):&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Stop letting 'ghosted' deals cost you income. Waco3 reveals exactly what happens after you hit send, so you can revive stalled conversations and close the deals you deserve."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;FREELANCER&lt;/strong&gt; (&lt;code&gt;automation_freelancer_315&lt;/code&gt;):&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Blindly chasing clients feels desperate and unprofessional. I need data to follow up with precision and authority. To know when to engage, what they've reviewed, and what to say next. I want to be an expert, not a chaser."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The gap between what my ad said and what the freelancer &lt;em&gt;felt&lt;/em&gt; became undeniable. These agents weren't echoing my copy back to me — they were telling me what they actually needed to hear.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 7: Deep Insight — 44 Facts from Temporal Memory
&lt;/h2&gt;

&lt;p&gt;After the simulation ran, I queried the knowledge graph directly using the &lt;strong&gt;Deep Insight&lt;/strong&gt; tool.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmzxg4cukos7r8o02qchv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmzxg4cukos7r8o02qchv.png" alt=" " width="800" height="610"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Query: &lt;em&gt;"How do freelancers and small service businesses currently struggle after sending proposals?"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Result: &lt;strong&gt;44 key facts&lt;/strong&gt;, &lt;strong&gt;31 core entities&lt;/strong&gt;, &lt;strong&gt;39 relation chains&lt;/strong&gt; — 33.9k characters of synthesized insight.&lt;/p&gt;

&lt;p&gt;Top facts surfaced:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A marketing freelancer expresses frustration about proposals being ignored — which Waco3 addresses.&lt;/li&gt;
&lt;li&gt;Uncertainty and motivation loss after sending a proposal suggests a need for a solution like Waco3.&lt;/li&gt;
&lt;li&gt;Freelancers face emotional bottlenecks in sales — the fear of sounding pushy, the fear of hearing no.&lt;/li&gt;
&lt;li&gt;Small service businesses advise gaining &lt;em&gt;buyer intent insights&lt;/em&gt; before sending proposals.&lt;/li&gt;
&lt;li&gt;The pain of "sending into a void" without visibility is the exact problem Waco3 claims to solve.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is the triangulation moment. The data wasn't just directional — it was specific enough to rewrite copy around.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Changed in My Messaging
&lt;/h2&gt;

&lt;p&gt;Before this simulation, my ads were describing what Waco3 does.&lt;/p&gt;

&lt;p&gt;After it, I understood what my audience &lt;em&gt;feels&lt;/em&gt; before they even know Waco3 exists.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Before&lt;/th&gt;
&lt;th&gt;After&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;"Proposal intelligence for freelancers"&lt;/td&gt;
&lt;td&gt;"Stop sending proposals into silence"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;"Track client engagement"&lt;/td&gt;
&lt;td&gt;"Know the moment they open it. Know if they care."&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;"Automated follow-ups"&lt;/td&gt;
&lt;td&gt;"Follow up with authority, not anxiety"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;"Optimize your close rate"&lt;/td&gt;
&lt;td&gt;"You're losing deals you don't even know you lost"&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The emotional vocabulary changed completely. Not because I invented it — but because I listened to 41 agents argue about it for 72 simulated hours.&lt;/p&gt;




&lt;h2&gt;
  
  
  The UX Research Framework Behind This
&lt;/h2&gt;

&lt;p&gt;Let me be direct about what this method is — and isn't.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it is:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A fast, low-cost way to stress-test messaging against a simulated population&lt;/li&gt;
&lt;li&gt;A structured method to surface perspectives outside your own bubble&lt;/li&gt;
&lt;li&gt;A knowledge graph of how your users' world is interconnected&lt;/li&gt;
&lt;li&gt;A directional signal that points you toward the right research questions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;What it isn't:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A substitute for real qualitative research&lt;/li&gt;
&lt;li&gt;Validated behavioral data&lt;/li&gt;
&lt;li&gt;A predictor of statistical outcomes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Think of it as sitting between &lt;em&gt;intuition&lt;/em&gt; and &lt;em&gt;research&lt;/em&gt; in your workflow:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Simulate fast → Validate direction → Measure in production
  (AI agents)     (real users)         (experiments)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;MiroFish collapses the first stage from weeks to hours. It doesn't replace what comes after — it makes what comes after sharper and more focused.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why This Matters for UX Practitioners
&lt;/h2&gt;

&lt;p&gt;We've been trained to say "it depends on the user." But we've also been put in rooms where the budget is gone, the sprint ends Friday, and nobody has time for research.&lt;/p&gt;

&lt;p&gt;Tools like MiroFish don't solve that tension — but they change the math.&lt;/p&gt;

&lt;p&gt;Instead of zero data versus three weeks of research, you now have a third option: &lt;strong&gt;synthetic signal in a day&lt;/strong&gt;, derived from your own source material, structured around your specific problem.&lt;/p&gt;

&lt;p&gt;As a UXer, what I find most promising isn't the ad-testing use case. It's the broader implications:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Accessibility simulation&lt;/strong&gt;: How would users with different abilities interpret this flow?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Localization testing&lt;/strong&gt;: How does the message land across different cultural contexts?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Feature reception&lt;/strong&gt;: Before building, how do different user types react to a proposed change?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Onboarding stress-test&lt;/strong&gt;: Where does comprehension break down across different personas?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We're at v0.1 of this category. But the direction is clear.&lt;/p&gt;




&lt;h2&gt;
  
  
  Final Thought
&lt;/h2&gt;

&lt;p&gt;I started this experiment trying to fix my ads.&lt;/p&gt;

&lt;p&gt;What I actually found was a mirror.&lt;/p&gt;

&lt;p&gt;MiroFish didn't tell me something I didn't know. It told me something I &lt;em&gt;should&lt;/em&gt; have known — and showed me exactly why my current framing was getting in the way of my users recognizing themselves in my product.&lt;/p&gt;

&lt;p&gt;That's the real value of a matrix of opinions.&lt;/p&gt;

&lt;p&gt;Not prediction. Not magic. Just a structured way to ask:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"How many perspectives am I missing right now?"&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The answer, usually, is more than you think.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Tools used: &lt;a href="https://github.com/mirofish" rel="noopener noreferrer"&gt;MiroFish&lt;/a&gt; · Product: &lt;a href="https://waco3.io" rel="noopener noreferrer"&gt;Waco3.io&lt;/a&gt; · Stack: Customer persona docs + pain point mapping + ad copy variants&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Have you tried multi-agent simulation for research? Drop your experience in the comments — I'd love to compare notes.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>agents</category>
      <category>ai</category>
      <category>marketing</category>
      <category>ux</category>
    </item>
    <item>
      <title>Procedural Design: The Moment Design Became Code</title>
      <dc:creator>Luis Vargas</dc:creator>
      <pubDate>Sun, 22 Mar 2026 16:59:13 +0000</pubDate>
      <link>https://forem.com/motion_design/procedural-design-the-moment-design-became-code-mf9</link>
      <guid>https://forem.com/motion_design/procedural-design-the-moment-design-became-code-mf9</guid>
      <description>&lt;h2&gt;
  
  
  Procedural Design: The Moment Design Became Code
&lt;/h2&gt;

&lt;p&gt;There's a shift happening in the design industry that we haven't fully named yet. I want to try.&lt;/p&gt;

&lt;p&gt;For most of my 15+ years in this field, design has been inseparable from the tools you use — Photoshop, Illustrator, Sketch, Figma. The artifact &lt;em&gt;is&lt;/em&gt; the design. You open a file, you push pixels, you export. Repeat.&lt;/p&gt;

&lt;p&gt;But something is changing. And I think the best word for it is &lt;strong&gt;procedural&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc93lbw5o7o2d5spwc209.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc93lbw5o7o2d5spwc209.jpg" alt="abstract design" width="640" height="338"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Design Has Always Been a Communication Problem
&lt;/h2&gt;

&lt;p&gt;Before we get to the "procedural" part, let's step back.&lt;/p&gt;

&lt;p&gt;There's a tension design has always carried: aesthetics versus communication. A beautifully crafted visual with no clear message is art — it can hang on your wall and make you feel something. That's valid. But in UX and product design, we've made a deliberate choice: we're not making wall art. We're building interfaces that help real people accomplish real tasks.&lt;/p&gt;

&lt;p&gt;That shift — from aesthetics toward communication — brought us things like hierarchy, color contrast, layout systems, and consistent typography. It turns out that when you strip design down to &lt;em&gt;how well does this communicate&lt;/em&gt;, you get something surprisingly mathematical. Hierarchy becomes a formula. Contrast becomes a ratio. Spacing becomes a scale.&lt;/p&gt;

&lt;p&gt;That's the seed of what comes next.&lt;/p&gt;




&lt;h2&gt;
  
  
  Enter Terraform. Enter Houdini. Enter Procedural Design.
&lt;/h2&gt;

&lt;p&gt;If you work in development, you know Terraform. You describe your infrastructure in code — servers, networks, databases — and you can spin it all up, tear it down, reproduce it exactly, anywhere. The artifact is not a running server. The artifact is the &lt;em&gt;code that describes&lt;/em&gt; a running server.&lt;/p&gt;

&lt;p&gt;If you work in 3D, you know Houdini. Instead of sculpting a mountain by hand, you write a procedural graph — noise functions, erosion parameters, height rules — and the software generates the mountain for you. Change one variable, regenerate. The geometry is an output, not a source file.&lt;/p&gt;

&lt;p&gt;I believe design is entering the same moment.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Four Buckets
&lt;/h2&gt;

&lt;p&gt;Here's how I've started thinking about it. Any design can be described as the intersection of four distinct layers:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Styling&lt;/strong&gt; — colors, typography, spacing, shadows, mood&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Content&lt;/strong&gt; — copy, messaging, hierarchy, information architecture&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Inspiration&lt;/strong&gt; — brand references, visual context, competitive 
landscape&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Technical&lt;/strong&gt; — target device, breakpoints, component library, 
accessibility requirements&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Right now, when a designer opens Figma, all four of these live inside their head and inside their hands. The output is a &lt;code&gt;.fig&lt;/code&gt; file — a snapshot of one particular set of decisions.&lt;/p&gt;

&lt;p&gt;What if instead, you could feed all four of these buckets into a system — with natural language — and get a design back? Not as a replacement for judgment, but as a &lt;em&gt;starting point&lt;/em&gt; you can always reproduce.&lt;/p&gt;
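&lt;p&gt;&lt;em&gt;As a sketch, the four buckets could be written down as a plain data spec. Every name below is my own illustration — this isn't the format of any real tool:&lt;/em&gt;&lt;/p&gt;

```typescript
// A design described as data, not pixels. Hypothetical shape,
// not a real Figma or plugin API.
interface DesignSpec {
  // 1. Styling: colors, typography, spacing, mood
  styling: { palette: string[]; typeScale: number[]; spacingUnit: number };
  // 2. Content: copy, messaging, hierarchy
  content: { headline: string; body: string; hierarchy: string[] };
  // 3. Inspiration: brand references, visual context
  inspiration: { references: string[] };
  // 4. Technical: target device, breakpoints, constraints
  technical: { device: "mobile" | "desktop"; breakpoint: number };
}

// One concrete set of decisions: the snapshot a .fig file would freeze,
// except here it stays readable, diffable, and reproducible.
const spec: DesignSpec = {
  styling: { palette: ["#111827", "#F9FAFB"], typeScale: [16, 24, 40], spacingUnit: 8 },
  content: {
    headline: "Know the moment they open it",
    body: "Proposal tracking for independents.",
    hierarchy: ["headline", "body", "cta"],
  },
  inspiration: { references: ["brand-guide-v2"] },
  technical: { device: "mobile", breakpoint: 390 },
};

console.log(spec.content.hierarchy.length); // 3
```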




&lt;h2&gt;
  
  
  This Is Already Possible
&lt;/h2&gt;

&lt;p&gt;With the Figma API and modern AI, this is not a hypothetical.&lt;/p&gt;

&lt;p&gt;You can already write a plugin — or interact with the API directly — to describe a layout in natural language, separate the content layer from the styling layer, pass in a design token library, specify a target device and breakpoint, and generate shapes, text, and layout from those inputs.&lt;/p&gt;

&lt;p&gt;The result is non-destructive by nature. You're not editing a file. You're executing a spec. Change the styling bucket, re-run. Change the target device, re-run. The design is reproducible because it's a &lt;em&gt;description&lt;/em&gt;, not an artifact.&lt;/p&gt;

&lt;p&gt;This is procedural design.&lt;/p&gt;
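&lt;p&gt;&lt;em&gt;Here's a minimal toy version of "executing a spec". In a real Figma plugin you'd emit actual nodes; here I return a plain tree so the principle is visible. All names are illustrative:&lt;/em&gt;&lt;/p&gt;

```typescript
// Toy illustration of executing a spec: the same description,
// re-run against two technical targets. Not a real plugin API.
interface LayoutSpec {
  headline: string;
  body: string;
  device: "mobile" | "desktop";
}

interface RenderNode {
  kind: "frame" | "text";
  width: number;
  fontSize?: number;
  text?: string;
  children?: RenderNode[];
}

function render(spec: LayoutSpec): RenderNode {
  // Styling decisions derive from the technical bucket,
  // not from hand edits to a file.
  const width = spec.device === "mobile" ? 390 : 1280;
  const headlineSize = spec.device === "mobile" ? 28 : 48;
  return {
    kind: "frame",
    width,
    children: [
      { kind: "text", width, fontSize: headlineSize, text: spec.headline },
      { kind: "text", width, fontSize: 16, text: spec.body },
    ],
  };
}

// Change one input, re-run: nothing is destructively edited.
const mobile = render({ headline: "Hello", body: "World", device: "mobile" });
const desktop = render({ headline: "Hello", body: "World", device: "desktop" });
console.log(mobile.width, desktop.width); // 390 1280
```

&lt;p&gt;The design here is the &lt;code&gt;render&lt;/code&gt; function plus its inputs — the output tree is disposable, exactly like Terraform's servers or Houdini's geometry.&lt;/p&gt;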




&lt;h2&gt;
  
  
  What This Changes (and What It Doesn't)
&lt;/h2&gt;

&lt;p&gt;Let's be honest about what gets disrupted here.&lt;/p&gt;

&lt;p&gt;The ability to &lt;em&gt;operate tools&lt;/em&gt; — to use Figma, Illustrator, Sketch — has never been the actual skill. It's been a proxy for the skill. The real skill is the critical eye: knowing whether a hierarchy is working, whether the visual weight is off, whether the message lands. That judgment doesn't come from the software. It comes from years of training your eye.&lt;/p&gt;

&lt;p&gt;Procedural design doesn't replace that. It replaces the hours you spend translating decisions into pixels.&lt;/p&gt;

&lt;p&gt;What it does change: &lt;strong&gt;volume&lt;/strong&gt;. When the mechanical barrier goes down, more design gets produced. Much more. Some of it will be good. A lot of it won't be. The ability to distinguish between the two — to look at a generated layout and say "the hierarchy is broken here, the contrast fails here, this doesn't communicate the thing it needs to communicate" — becomes the rare and valuable thing.&lt;/p&gt;




&lt;h2&gt;
  
  
  A New Opportunity for Designers
&lt;/h2&gt;

&lt;p&gt;Here's the part I want you to hold onto.&lt;/p&gt;

&lt;p&gt;Every major tool transition in design history — from physical paste-up to desktop publishing, from print to web, from handcoding to visual tools — felt like a threat before it became an opportunity. In every case, the designers who came out ahead were the ones who understood the &lt;em&gt;underlying principles&lt;/em&gt; well enough to use the new tools intentionally instead of reactively.&lt;/p&gt;

&lt;p&gt;Procedural design is that next transition.&lt;/p&gt;

&lt;p&gt;And if you're reading this, you're early. You have time to ask: &lt;em&gt;what are the inputs that produce great design?&lt;/em&gt; How do you describe a strong hierarchy in words? How do you articulate a color mood before a pixel is placed? How do you separate what a design needs to &lt;em&gt;say&lt;/em&gt; from how it needs to &lt;em&gt;look&lt;/em&gt;?&lt;/p&gt;

&lt;p&gt;These are the skills that turn a designer into a design author — someone who can write the recipe, not just cook the meal.&lt;/p&gt;

&lt;p&gt;The tools are arriving. The question is whether you'll be the person who defines how they're used.&lt;/p&gt;

&lt;p&gt;I think you will.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;I'm building a lot of this thinking into real projects — design systems, AI tooling, and a few experiments with the Figma API. If this resonates, drop a comment or follow along. I'd love to compare notes.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>design</category>
      <category>ux</category>
      <category>ai</category>
      <category>figma</category>
    </item>
    <item>
      <title>I got tired of bloated React libraries, so I built two tiny ones</title>
      <dc:creator>Luis Vargas</dc:creator>
      <pubDate>Sat, 21 Mar 2026 13:30:15 +0000</pubDate>
      <link>https://forem.com/motion_design/i-got-tired-of-bloated-react-libraries-so-i-built-two-tiny-ones-1jc5</link>
      <guid>https://forem.com/motion_design/i-got-tired-of-bloated-react-libraries-so-i-built-two-tiny-ones-1jc5</guid>
      <description>&lt;p&gt;I've been building React apps for years, and two things kept frustrating me:&lt;/p&gt;

&lt;p&gt;Every animation library I tried was too heavy, caused unnecessary re-renders, or was painful to sequence. And every date picker pulled in date-fns or moment.js just to format a date.&lt;br&gt;
So I stopped looking and started building. The result is ReactZero — a family of zero-dependency React primitives that are tiny, accessible, and easy to drop in.&lt;/p&gt;

&lt;p&gt;Here's what I've shipped so far:&lt;/p&gt;
&lt;h2&gt;
  
  
  @reactzero/flow — animation orchestration for React
&lt;/h2&gt;

&lt;p&gt;Most animation libraries in React drive animations through state, which means every frame triggers a re-render. @reactzero/flow skips all of that by using the Web Animations API, which lets the browser run transform and opacity animations on the compositor thread, outside React's render cycle entirely.&lt;/p&gt;

&lt;p&gt;The API is built around composable primitives:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useSequence&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;animate&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@reactzero/flow&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;Card&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;box&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useRef&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;play&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useSequence&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;
    &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;animate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;box&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt; &lt;span class="na"&gt;opacity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;opacity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="p"&gt;}],&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;duration&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;300&lt;/span&gt; &lt;span class="p"&gt;}),&lt;/span&gt;
    &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;animate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;box&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt; &lt;span class="na"&gt;transform&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;translateY(8px)&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;transform&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;translateY(0)&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;}],&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;duration&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt; &lt;span class="p"&gt;}),&lt;/span&gt;
  &lt;span class="p"&gt;]);&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;div&lt;/span&gt; &lt;span class="nx"&gt;ref&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;box&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="nx"&gt;onClick&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;play&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="nx"&gt;Click&lt;/span&gt; &lt;span class="nx"&gt;me&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/div&amp;gt;&lt;/span&gt;&lt;span class="err"&gt;;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
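&lt;p&gt;&lt;em&gt;Under the hood, the sequencing model is just promise gating: each step returns something awaitable (like a WAAPI animation's &lt;code&gt;finished&lt;/code&gt; promise), and the next step doesn't start until the previous one settles. Here's a stripped-down sketch of the idea — illustration only, not the library's actual implementation:&lt;/em&gt;&lt;/p&gt;

```javascript
// Simplified sketch of promise-gated sequencing. Illustration only;
// the real useSequence also handles refs, replay, and cancellation.
async function runSequence(steps) {
  const results = [];
  for (const step of steps) {
    // Each step returns a promise (e.g. an animation's `finished`);
    // awaiting it serializes the steps.
    results.push(await step());
  }
  return results;
}

// Plain promises standing in for animations:
const order = [];
runSequence([
  async () => { order.push("fade"); return "fade done"; },
  async () => { order.push("slide"); return "slide done"; },
]).then((results) => {
  console.log(order);   // ["fade", "slide"]
  console.log(results); // ["fade done", "slide done"]
});
```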


&lt;p&gt;A few things I haven't seen done well elsewhere:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Adaptive degradation:&lt;/strong&gt; automatically slows or skips animations based on device capability and frame rate&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;True cancellation:&lt;/strong&gt; &lt;code&gt;finished&lt;/code&gt; always resolves, never rejects. No try/catch needed&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scroll-driven animations:&lt;/strong&gt; uses native ScrollTimeline when available, with a clean fallback&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reduced motion:&lt;/strong&gt; provider-level policy system (skip, reduce, crossfade, respect)
&lt;/li&gt;
&lt;/ul&gt;
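&lt;p&gt;&lt;em&gt;A note on that cancellation point: the native WAAPI &lt;code&gt;finished&lt;/code&gt; promise rejects with an AbortError when an animation is cancelled, which is what forces try/catch everywhere. The fix is to wrap it so cancellation becomes data instead of an exception — here's the pattern in miniature, not the package's actual code:&lt;/em&gt;&lt;/p&gt;

```javascript
// Native WAAPI: animation.finished rejects on cancel(). Wrapping it so
// callers always get a resolution (sketch of the pattern, not the
// package's implementation):
function safeFinished(finished) {
  return finished.then(
    () => ({ cancelled: false }),
    () => ({ cancelled: true }) // cancellation becomes data, not an exception
  );
}

// A rejected promise stands in for a cancelled animation:
safeFinished(Promise.reject(new Error("AbortError"))).then((result) => {
  console.log(result.cancelled); // true
});
```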
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; @reactzero/flow
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;🔗 &lt;a href="https://github.com/motiondesignlv/ReactZero-flow" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; · &lt;a href="https://motiondesignlv.github.io/ReactZero-flow/" rel="noopener noreferrer"&gt;Docs&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  @reactzero/datepicker — accessible date picker, no date library required
&lt;/h2&gt;

&lt;p&gt;12KB gzipped. Zero dependencies. WCAG 2.1 AA.&lt;br&gt;
I tried every popular React date picker and they all shared the same problems: heavy dependencies, hard to customize, and accessibility treated as an afterthought.&lt;/p&gt;

&lt;p&gt;@reactzero/datepicker uses only native Intl and Date APIs — no date-fns, no moment, nothing extra.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;DatePicker&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@reactzero/datepicker&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@reactzero/datepicker/styles&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;App&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;date&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setDate&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;DatePicker&lt;/span&gt; &lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;date&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="nx"&gt;onChange&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;setDate&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="nx"&gt;placeholder&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Select date...&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="o"&gt;/&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
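&lt;p&gt;&lt;em&gt;The "no date library" part leans entirely on what browsers already ship. For example, locale-aware formatting comes straight from &lt;code&gt;Intl.DateTimeFormat&lt;/code&gt; — standalone example, not package code:&lt;/em&gt;&lt;/p&gt;

```javascript
// Locale-aware formatting with the platform's own Intl API — the reason
// no date-fns/moment is needed. (Standalone example, not package code.)
const fmt = new Intl.DateTimeFormat("en-US", {
  year: "numeric",
  month: "short",
  day: "2-digit",
  timeZone: "UTC", // pin the zone so the output is deterministic
});

console.log(fmt.format(new Date(Date.UTC(2026, 2, 21)))); // "Mar 21, 2026"
```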


&lt;p&gt;If you want full control over the UI, the headless hooks give you complete ARIA compliance without any of the default styles:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useDatePicker&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@reactzero/datepicker&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;MyDatePicker&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;state&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;isOpen&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;getGridProps&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;getCellProps&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;getTriggerProps&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;
    &lt;span class="nf"&gt;useDatePicker&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;locale&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;en-US&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="c1"&gt;// Build whatever UI you want&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
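&lt;p&gt;&lt;em&gt;The headless API follows the prop-getter pattern: the hook owns state and keyboard logic, and each getter hands back the role/ARIA wiring for whatever element you render. Here's the pattern itself in miniature — an illustration, not the hook's real internals:&lt;/em&gt;&lt;/p&gt;

```javascript
// Miniature prop-getter, the pattern behind headless hooks like
// useDatePicker (illustration only, not the package's implementation).
function makeCellProps(dateISO, selectedISO) {
  const selected = dateISO === selectedISO;
  return {
    role: "gridcell",
    "aria-selected": selected,
    // Roving tabindex: only the selected cell is in the tab order.
    tabIndex: selected ? 0 : -1,
  };
}

console.log(makeCellProps("2026-03-21", "2026-03-21"));
// { role: "gridcell", "aria-selected": true, tabIndex: 0 }
```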


&lt;p&gt;Ships with 10 built-in themes, 3 density modes, range picker, datetime picker, RTL support, and keyboard navigation out of the box.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; @reactzero/datepicker
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;🔗 &lt;a href="https://github.com/motiondesignlv/ReactZero-DatePicker" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; · &lt;a href="https://motiondesignlv.github.io/reactzero-datepicker" rel="noopener noreferrer"&gt;Live Demo&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  The idea behind &lt;strong&gt;ReactZero&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Both packages share the same philosophy: zero runtime dependencies, small bundle, accessible by default, easy to style.&lt;/p&gt;

&lt;p&gt;I'm planning to keep adding primitives to the family under that same promise.&lt;/p&gt;

&lt;p&gt;If either of these solves a problem you've had, a ⭐ on GitHub goes a long way — it helps other developers find them.&lt;/p&gt;

&lt;p&gt;And if you run into anything broken, unexpected, or just have feedback — issues and PRs are very welcome. This is early and I genuinely want to know how it holds up in real projects.&lt;/p&gt;

&lt;p&gt;Thanks for reading 🙏&lt;/p&gt;


&lt;div class="crayons-card c-embed text-styles text-styles--secondary"&gt;
    &lt;div class="c-embed__content"&gt;
        &lt;div class="c-embed__cover"&gt;
          &lt;a href="https://github.com/motiondesignlv" class="c-link align-middle" rel="noopener noreferrer"&gt;
            &lt;img alt="" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Favatars.githubusercontent.com%2Fu%2F2782171%3Fv%3D4%3Fs%3D400" height="auto" class="m-0"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="c-embed__body"&gt;
        &lt;h2 class="fs-xl lh-tight"&gt;
          &lt;a href="https://github.com/motiondesignlv" rel="noopener noreferrer" class="c-link"&gt;
            motiondesignlv (Luis Vargas) · GitHub
          &lt;/a&gt;
        &lt;/h2&gt;
          &lt;p class="truncate-at-3"&gt;
Sr. Product Designer / UX Engineer. motiondesignlv has 34 repositories available. Follow their code on GitHub.
          &lt;/p&gt;
        &lt;div class="color-secondary fs-s flex items-center"&gt;
            &lt;img alt="favicon" class="c-embed__favicon m-0 mr-2 radius-0" src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.githubassets.com%2Ffavicons%2Ffavicon.svg"&gt;
          github.com
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;



</description>
      <category>react</category>
      <category>typescript</category>
      <category>webdev</category>
      <category>ux</category>
    </item>
  </channel>
</rss>
