Why Vibe Coders Eventually Hit a Database Wall (and Don’t Know Why)

A few weeks ago, I started building an AI-powered database optimisation agent.

Not because I wanted to build another AI tool, but because I kept seeing the same pattern play out.

A friend of mine is an accountant by profession. On the side, he had been “vibe coding” a fairly complex software product. And it was working. Real users. Real data. Real usage.

Then the app started slowing down.

Nothing had changed in the code. Nothing obvious was broken. But the experience degraded week by week.

I took a look and within minutes the problem was clear: the database.

Missing indexes. Queries that were perfectly fine at small scale but pathological at volume. Schema decisions that worked early but didn’t age well.

Classic database problems — obvious only if you already know what to look for.

He had learned just enough development to build something valuable in his domain. He did not sign up to learn how to read an EXPLAIN plan or reason about index selectivity.
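The "fine at small scale, pathological at volume" failure is easy to see in a query plan. As a miniature illustration (using SQLite's built-in EXPLAIN QUERY PLAN rather than Postgres's EXPLAIN, and an invented orders table), the same query flips from a full table scan to an index search the moment the index exists:

```python
import sqlite3

# In-memory database with a table that gets filtered by a non-indexed column.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows end with a text description of the access strategy.
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT total FROM orders WHERE customer_id = 42"

before = plan(query)  # full table scan: every row is examined
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # index search: only matching rows are touched

print(before)
print(after)
```

Postgres's EXPLAIN ANALYZE gives the same signal with real row counts and timings, but the shape of the problem is identical: without the index, every query walks the whole table, and the cost grows with the data.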

And that’s when it clicked.

He’s not an edge case anymore. He’s becoming the norm.

The new developer profile

Today, people ship real software using:

  • Supabase
  • Railway
  • ORMs
  • LLMs
  • Rapid prototyping tools

They can go from idea to production faster than ever before.

But when the database starts struggling, they’re alone.

They know it’s probably the database.
They don’t know where to look.
Hiring a DBA feels excessive.
Reading query plans is a rabbit hole.

This gap is growing, and no tool really addresses it for this audience.

What I built

I built a CLI tool and web UI that connects to your PostgreSQL or MySQL database in read-only mode.

It analyses:

  • Schemas
  • Index usage
  • Query plans
  • Server configuration

And returns a ranked list of findings with:

  • Copy-paste SQL fixes
  • A rollback statement for every change
  • Explanations in plain English
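To make that concrete, a single finding pairs the fix with its undo. This is an illustrative sketch only; the field names and the SQL are assumptions, not the tool's actual output format:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    # Hypothetical shape of one ranked finding; names are illustrative.
    title: str
    explanation: str
    fix_sql: str
    rollback_sql: str

finding = Finding(
    title="Missing index on orders.customer_id",
    explanation="Frequent filters on customer_id are triggering sequential scans.",
    fix_sql="CREATE INDEX CONCURRENTLY idx_orders_customer ON orders (customer_id);",
    rollback_sql="DROP INDEX CONCURRENTLY idx_orders_customer;",
)
```

Shipping the rollback alongside every fix is what makes the output safe to copy-paste: nothing is applied that can't be undone with one statement.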

The interesting part is the architecture.

It uses a two-step LLM flow:

  1. First pass: diagnose and explain what’s wrong
  2. Second pass: generate remediation SQL only for the findings you choose to act on

This avoids hallucinated changes and keeps the tool focused on safe, actionable output.
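A minimal sketch of that two-pass flow, with the LLM call stubbed out and every function name hypothetical (the real tool calls Anthropic or Gemini here):

```python
def llm(prompt):
    # Stand-in for a real model call; canned responses for the sketch.
    if prompt.startswith("diagnose"):
        return ["Sequential scans on orders.customer_id (no index)"]
    return "CREATE INDEX idx_orders_customer ON orders (customer_id);"

def diagnose(db_stats):
    # Pass 1: the model only explains what is wrong. It emits no SQL,
    # so there is nothing to hallucinate into a destructive change.
    return llm(f"diagnose: {db_stats}")

def remediate(finding):
    # Pass 2: SQL is generated only for the findings the user selected.
    return llm(f"remediate: {finding}")

findings = diagnose({"seq_scans": 1200, "index_scans": 3})
selected = findings[:1]                   # the user chooses what to act on
fixes = [remediate(f) for f in selected]
```

Separating "explain" from "generate SQL" also means the second prompt is narrow: one finding in, one statement out, which is much easier to validate than a free-form dump of fixes.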

Current state (honest version)

  • Hosted web UI is live
  • CLI installable via pip
  • Docker support
  • 60+ tests with fully mocked DB interactions

What I don’t yet have is something more important:

People running this on real databases and telling me if the findings are actually useful.

Why this matters now

We’re entering an era where thousands of new builders are creating real software without ever learning traditional backend engineering disciplines.

Databases don’t fail loudly. They degrade slowly.

And by the time the problem is obvious, the root cause is buried in months of schema and query decisions.

This is not a tooling problem for DBAs.

It’s a tooling problem for everyone who is not a DBA.

What I’m trying to validate

I have two weeks before I start a full-time job.

I want to find out one thing:

Does this actually help people who are experiencing “my app is slow and I don’t know why”?

If you’ve been through that moment, I’d love to hear:

  • How you diagnosed it
  • Whether you used a tool or hired someone
  • What you wish existed at that time

And if you’re running a side project on PostgreSQL or MySQL and suspect your database may be aging poorly, I’m happy to run a free audit and share the findings.

Not looking for compliments. Looking for brutal feedback on whether this is genuinely useful.

Try it in 5 minutes

If your database is publicly accessible, use the hosted web UI; no install needed:

https://db-optimiser-amtjxwpuasemsf4k7s2pcm.streamlit.app/

If your database is on a private or local network, run it locally:

pip install db-optimiser-agent

Create a small .env file like the sample below:

# AI Provider — pick one
MODEL_PROVIDER=anthropic          # or: google
ANTHROPIC_API_KEY=sk-ant-...      # get from console.anthropic.com
GEMINI_API_KEY=AIza...            # get from aistudio.google.com

# Database
DB_TYPE=postgresql                # or: mysql
DB_HOST=localhost
DB_PORT=5432
DB_NAME=your_database
DB_USER=your_user
DB_PASSWORD=your_password

# Optional
ANALYSIS_DEPTH=auto

Then run:

db-optimiser analyse
db-optimiser analyse --remediate

Keep only the fields relevant to your provider. You don’t need both ANTHROPIC_API_KEY and GEMINI_API_KEY, just whichever one you’re using.

Happy to answer any questions about the build.
