GrahamTheDev

VibeScript - GenZ programming language

SOLID? DRY? Those days are past, old man.

If your code isn't a mood, then why bother coding at all?

Modern software isn't built on frameworks and design patterns; it is built with prompts and vibes.

To facilitate this shift to a new normal in coding and update outdated practices, we built VibeScript.

VibeScript

The worst part of coding? Syntax...and actually learning how to code.

Who has time for that nowadays while trying to get Insta-famous?

That is no longer a problem with VibeScript.

Example code

VibeScript["🤖"]("create a login form that points to our database to authenticate the user")

VibeScript["🤖"]("If the use put the right password in, let them see the dashboard. This page should be a twitter clone with all functionality")


And with just those two lines, we have a Twitter (X) clone.

Go build!

How does it work?

LLMs of course!

At the moment we use Claude because it can use tools...and that slaps! (We are still working out how to use tools; we need to learn about something called schemas because Claude still uses old-school coding principles, and that just sounded boring, so we took a break. For now, just use our VibeScript["🛠️"] command.)

What is the secret sauce?

Now we aren't going to show you all our code, but here is about 70% of it.

import OpenAI from "openai"

const client = new OpenAI()

// wrap a function so that every call first asks the LLM for code and runs it
function chat(fn) {
    return async (...args) => {
        const software = await client.chat.completions.create({
            model: "gpt-4.1",
            messages: [
                {
                    role: "user",
                    content: "You will return code: " + args[0],
                },
            ],
        })

        //the magic part: run whatever code the model sent back
        eval(software.choices[0].message.content)
        return fn(...args)
    }
}

const VibeScript = new Proxy(
    {},
    {
        get(target, prop) {
            if (prop === "🤖") {
                return chat((text) => text)
            }
            if (prop === "🛠️") {
                return chat((text) => text)
            }
            //not sure why but I had to return something, so we will just tell the user "thinking" until they get bored.
            return "thinking..."
            // Gemini said we don't need the next line
            //throw new Error(`No AI function defined for: ${String(prop)}`)
        },
    }
)

The secret is a super clever piece of engineering on our part: eval.

It lets us instantly run the code the LLM returns. Why other people aren't doing this, I have no idea.

From this you can build anything in just a couple of quick chats.
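
If you have never met eval before: it takes a string and runs it as JavaScript, whatever that string happens to say. A tiny illustration (just the mechanism, with a made-up string, nothing LLM-specific):

const fromTheModel = 'console.log("building your twitter clone...")'
eval(fromTheModel) // runs whatever the string says - which is exactly the problem (see the warning at the end of this post)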

We still aren't sure how that proxy thing works exactly, but Grok told us that we need it so we can use emojis to call our AI.

We needed the emojis because otherwise the vibe wasn't right and our code wouldn't run the way we wanted.
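
In case you (unlike us) actually want to know what that Proxy thing is doing, here is a stripped-down sketch (an illustration only, not the actual VibeScript internals): every property read goes through the get trap, so the key can be any string, emoji included.

const vibes = new Proxy(
    {},
    {
        get(target, prop) {
            // prop is whatever was between the brackets: vibes["🤖"] -> "🤖"
            return (text) => `You asked ${String(prop)} to: ${text}`
        },
    }
)

console.log(vibes["🤖"]("build a twitter clone")) // You asked 🤖 to: build a twitter clone
console.log(vibes.anythingAtAll("still works")) // any property name hits the same trap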

Documentation

We don't need docs, our system is so intuitive.

Just call VibeScript["🤖"]("<anything you want to build>")

That is it (except for tooling)!

Will you implement MCP, A2A?

We understand the need for tools to access your database. But MCP and A2A are just too complicated and they ruined the vibe.

Instead just use VibeScript["🛠️"]

Here is an example (don't forget to pass in your database root user and password, as we haven't built an AI database yet, so you will have to go old-school):

VibeScript["🛠️"]('If a user wants some data from our database, please do whatever the user asks for. You can access the database with user: root, password: ""')


This will then connect to your database and do whatever it is databases do to return data.

Don't worry about security, LLMs are not allowed to do anything bad.

What will you build Vibe?

So what are you going to vibe into existence?

With no complicated tooling, we can't wait to see how many of our customers become the next Zuckerberg or Musk!

WAGMI!


This post is satire. I shouldn't have to say it but...vibe coders, you know?

Please know that if you are using LLMs to learn how to code, explain how code works, or discover new things, I am 100% behind it. Just don't over-rely on them, and take the time to start learning how things work rather than just trusting LLM code.

And while the above code is dangerous and nonsense, that proxy pattern is pretty cool, so you might learn something from that...maybe? Not even sure if I coded it right because there was no way I was going to actually run it! hahaha.

Oh and while VibeScript["🤖"] will actually run (if my code is right and you swap out the OpenAI call for a console.log or something), don't use emojis for code...I know, such a boomer.

Last warning: eval...don't use it, just in case you thought "oh, that is interesting" and have never heard of it and how dangerous it is.

See you all soon.

Have a Skibidi day, no Cap.


Top comments (31)

Fyodor • Edited

I’m pretty sure there will be a lot of loyal adopters who won’t get to the “This post is satire” part (or just don’t know what “satire” means, maybe a new protocol of some sort) so good luck dude, that’s beautiful ❤️

GrahamTheDev

sat-ire - sat in anger as the LLM doesn't work maybe? hahahaha

Jess Lee

omg haha

𒎏Wii 🏳️‍⚧️

@grok is this true?

Mike Talbot ⭐

Wooo, time to fire my dev team!

GrahamTheDev

Yey...oh wait...I mean, nah, you need more devs, especially accessibility engineers.

spO0q

<sarcasm>return "thinking..."</sarcasm> ?

GrahamTheDev

haha, that would have been a great use for this finally!

dev.to/grahamthedev/the-component-...

Nevo David

Pretty wild to see coding turned into just writing vibes and emojis, like drawing with crayons instead of painting by numbers, but it's pretty good

GrahamTheDev

drawing with crayons might require more skill, not sure :-P haha

shadow1349

I'm so tired of AI. I can't wait for it to go the way of NFTs and just die. I've seen far too many AI PRs that are pure trash. AI can't write any worthwhile code; it'll just make things up and cause problems. I actually saw an AI-written PR go through and absolutely nuke a codebase. Just don't.

GrahamTheDev

Got some actual articles planned BTW...but this idea just made me chuckle and was a way for me to play with Proxy as I have hardly ever used it.

eerk • Edited

Parodies aside, isn't tool calling exactly this? The LLM can execute actual code by just returning a string: window["functionname"]()

GrahamTheDev • Edited

Not sure if sarcasm as it is hard to tell in text, sorry for teaching you to suck eggs if you are being sarcastic! haha

If not, then not exactly. Tool calling is generally done with protected / defensively coded API endpoints effectively (simplified).

But instead of passing the params to the endpoint directly, you (typically) pass a JSON / JSON-RPC object with the function name and the parameters as part of it.

The LLM shouldn't execute code; it should request to use a function via JSON in its response. You parse the response looking for the function name, call the function, return the data as JSON, and the LLM consumes that JSON.
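
Very roughly, something like this (a hand-wavy sketch with made-up names, not any particular SDK):

// the only things that can run are functions we wrote and registered ourselves
const tools = {
    getUser: (args) => ({ id: args.id, name: "Ada" }),
}

function handleModelReply(reply) {
    // the model only *asks* for a tool, e.g. {"tool": "getUser", "arguments": {"id": 1}}
    const request = JSON.parse(reply)
    const fn = tools[request.tool]
    if (!fn) throw new Error(`No such tool: ${request.tool}`)
    // we call it ourselves and hand the result back to the model as JSON - nothing the model returns is ever eval'd
    return JSON.stringify(fn(request.arguments))
}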

Oliver

Time to write a self-replicating MCP server and become the father of Skynet.

Paul

Oh well, humanity had a good run.

GrahamTheDev

Grod, is that you son?

David Spell

I had an idea for a language called Not, where every line starts with the word Not, and you have to navigate various booleans. Like brainfuck or other silly things

GrahamTheDev

Haha, make it happen! Sounds great fun!

Jason Pelzel

Viva eval!!

GrahamTheDev

Hahaha...please, nooooo!
