Hacker News Viewer

An AI Vibe Coding Horror Story

by teichmann on 4/14/2026, 8:35:45 AM

https://www.tobru.ch/an-ai-vibe-coding-horror-story/

Comments

by: spaniard89277

I did something similar with a local company here in Spain. Not medical, but a small insurance company. Believe it or not, yes, they vibe-coded their CRM.

I sent them an email and they threatened to sue me. I was a bit in shock at such a dumb response, but I guess some people only learn the hard way, so for starters I filed a report with the AEPD (the data protection agency in Spain, known to be brutal).

I've also sent them a burofax demanding the removal of my data from their systems, just last Friday.

4/14/2026, 9:02:59 AM


by: delis-thumbs-7e

Meanwhile on LinkedIn… every sales bozo with zero technical understanding is screaming at the top of their virtual lungs that everything must be done with AI, and that it is the solution to every layoff, every economic problem, everything.

It is just a matter of time before something really, really bad happens.

4/14/2026, 9:02:19 AM


by: freakynit

I think vibe-coding is cool, but it runs into limits pretty fast (at least right now).

It kinda falls apart once you get past a few thousand lines of code... and real systems aren't just big, they're actually messy: loads of components, services, edge cases, things breaking in weird ways. Getting all of that to work together reliably is a different game altogether.

And you still need solid software engineering fundamentals. Without understanding architecture, debugging, tradeoffs, and failure modes, it's hard to guide or even evaluate what's being generated.

Vibe-coding feels great for prototypes, hobby projects, just messing around, or even some internal tools in a handful of cases. But for actual production systems, you still need real engineering behind it.

As of now, I'm 100% hesitant to pay for, or put my data on, systems that are vibe-coded without knowledge of what's been built and how it's been built.

4/14/2026, 9:38:34 AM


by: EdNutting

Software engineering is looking more and more like it needs a professional body in each country, plus accreditation and standards. I.e., it needs to grow up and become like every other strand of engineering.

Gone should be the days of "I taught myself, so now I can [design software in a professional setting / design a bridge in a professional setting]." I'm not advocating gatekeeping: if you want to build a small bridge at the end of your garden for personal use, go for it. If you want to build a bridge in your local town over a river, you're going to need professional accreditation. The same should now be true for software engineering.

4/14/2026, 9:19:52 AM


by: seethishat

I saw something very similar a few months ago. It was a web app vibe-coded by a surgeon. It worked, but they did not have an index.html file in the root web directory, and they would routinely zip up all of the source code (which contained all the database connection strings, API credentials, AWS credentials, etc.) and place the backup in the root web directory. They would also dump the database to that folder (for backup). So web browsers that went to https://example.com/ could see and download all the backups.

The quick fix was a simple, empty index.html file (or setting the -Indexes option in the Apache config). The surgeon had no idea what this meant or why it was important. And the AI bots didn't either.

The odd part to me was that the AI had made good choices (strong password hashes, a reasonable DB schema, etc.) and the app itself worked well. Honestly, it was impressive. But at the same time, they made some very basic deployment/security mistakes that were trivial to fix. They just needed a bit of guidance from an experienced devops security person to make it Internet-worthy, but no one bothered to do that.

Edit: I do not recommend backing up web apps on the web server itself. That's another basic mistake. But they (or the AI) decided to do that, and no one with experience was consulted.
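For reference, a minimal sketch of the Apache-side fix the commenter mentions (the path is an assumption; adjust to the actual vhost's document root):

```apache
# Disable automatic directory listings so stray files in the web root
# (source backups, database dumps) are not browsable even when no
# index.html exists.
<Directory "/var/www/html">
    Options -Indexes
</Directory>
```

The empty index.html only hides the listing; removing the backups from the web root entirely is the real fix.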

4/14/2026, 10:20:20 AM


by: aledevv

> All "access control" logic lived in the JavaScript on the client side, meaning the data was literally one command away from anyone who looked

This one takes the cake!

This is a typical example of someone using coding agents without being a developer: AI that isn't used knowingly can be a huge risk if you don't know what you're doing.

AI used for professional purposes (not experiments) should NOT be used haphazardly.

And this also opens up a serious liability issue: the developer has the perception of being exempt from responsibility, and that leads to enormous risks for the business.
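To make the quoted failure concrete, here is a hedged Python sketch (record names and data are invented for illustration) of the difference between client-side "access control" and actual server-side enforcement:

```python
# Toy data store standing in for the patient database.
RECORDS = {"patient-1": {"owner": "alice", "diagnosis": "..."}}


def get_record_insecure(record_id):
    # Mirrors the vibe-coded app: the server blindly trusts the caller.
    # The only "check" lived in browser JavaScript, which an attacker
    # using curl or DevTools simply never runs.
    return RECORDS[record_id]


def get_record_secure(record_id, session_user):
    # Server-side enforcement: verify that the authenticated session
    # user actually owns the record before returning it.
    record = RECORDS[record_id]
    if record["owner"] != session_user:
        raise PermissionError("not your record")
    return record
```

Anyone can call the insecure endpoint for any record ID; the secure variant fails closed for everyone but the owner.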

4/14/2026, 9:24:37 AM


by: shivaniShimpi_

Every other field that's figured out high-stakes failure modes eventually landed on the same solution: make sure two people who understand the details are looking at it. Pilots have copilots, surgeons have checklists, and nuclear plants have independent verification. Software was always the exception, because when it broke it mostly just broke for you. Vibe coding doesn't so much change that equation as remove the one check that existed before: that the people who wrote the code understood what was going on. Now that's gone too.

4/14/2026, 10:45:53 AM


by: BrissyCoder

This reads like internet fiction to me. Very vague and short.

4/14/2026, 9:11:26 AM


by: rubzah

I know, through personal acquaintance, of at least one boutique accounting firm that is currently vibe-building their own CRM with Lovable. They have no technical staff. I can't begin to comprehend the disasters that are in store.

4/14/2026, 9:29:25 AM


by: consumer451

What would a responsible onboarding flow for all of these tools look like?

> Welcome to VibeToolX.

> By pressing Confirm you accept all responsibility for user data stewardship as regulated in every country where your users reside.

Would that be scary enough to nudge some risk analysis on the user's part? I am sure it would drop adoption by a lot, so I don't see it happening voluntarily.

4/14/2026, 9:09:52 AM


by: keysersoze33

The takeaway is to vet the new companies you deal with, even just by calling them up and asking if they've AI-generated any system which handles customer/patient data.

This is going to get more common (state-sponsored hackers are going to have a field day).

4/14/2026, 11:52:00 AM


by: jillesvangurp

I think the issue here is less about AI misbehaving and more about people doing things they should not be doing, without thinking too hard about the consequences.

There are going to be a lot of accidents like this, because it's just really easy to do. And some people are inevitably going to do silly things.

But it's not that different from people doing stupid things with Visual Basic back in the day. Or responding to friendly worded emails with the subject "I love you". Or putting CDs/USB drives with viruses, worms, etc. into work PCs.

That's what people do when you give them useful tools with sharp edges.

4/14/2026, 9:23:19 AM


by: aitchnyu

Is there anybody making a framework where you declare the security intentions as code (for each CRUD action), which agents can then correctly implement and unit-test? I have seen a Lovable competitor's system prompt with 24 lines of "please consider security when generating select statements, please consider security when generating update statements...", since it expects to dump queries here and there.
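One possible shape of such a framework, sketched in Python (table names, roles, and the default-deny choice are all assumptions here, not any specific product's API): a declarative policy table per CRUD action, enforced by a single checker that can be unit-tested instead of re-prompting "please consider security" for every query.

```python
# Declarative security intentions: one entry per (table, action) pair.
# Anything not declared is denied by default.
POLICIES = {
    ("patients", "read"):   {"roles": {"doctor", "nurse"}},
    ("patients", "update"): {"roles": {"doctor"}},
    ("patients", "delete"): {"roles": set()},  # nobody, ever
}


def authorize(table, action, user_roles):
    """Return True only if the declared policy grants one of the
    caller's roles access; undeclared pairs are denied outright."""
    policy = POLICIES.get((table, action))
    if policy is None:
        return False  # default-deny anything undeclared
    return bool(policy["roles"] & set(user_roles))
```

The point is that the generated CRUD handlers would all funnel through `authorize`, so the security rules live in one auditable, testable place rather than scattered across generated queries.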

4/14/2026, 9:53:56 AM


by: CrzyLngPwd

I think it is wonderful.

It's reminiscent of the 90s, when every middle manager had dragged and dropped some boxes onto some forms and could get a salesman to sell it, without a care in the world for what was going on behind the scenes.

Until something crashed and recovery was needed, of course.

The piper always needs to be paid.

4/14/2026, 10:01:00 AM


by: andai

Archived versions:

https://archive.ph/GsLvt

https://web.archive.org/web/20260331184500/https://www.tobru.ch/an-ai-vibe-coding-horror-story/

4/14/2026, 9:02:21 AM


by: debarshri

I believe there are various dimensions to vibe coding. If you work with an existing codebase, it is a tool to increase productivity. If you have domain-specific knowledge (in this case, of patient management systems), you can build better systems.

Otherwise, you end up simulating a product. A lot of the non-technical folks building products with AI vibe coding are basically building product simulations. It looks like a product and functions like a product, but behind the scenes you can poke holes in it.

4/14/2026, 10:07:19 AM


by: mnls

Damn!!! And here I keep hardening my RSS app, which was only partly vibe-coded and isn't even exposed to the WAN, while "professionals" give data away.

4/14/2026, 9:48:37 AM


by: coopykins

I interviewed some years ago at an AI-related startup. Looking at the live product, the first thing I saw was their prod DB credentials and OpenAI API key publicly sent in some requests... Bad actors will be having a lot of fun these days.

4/14/2026, 10:33:38 AM


by: sjamaan

So much is missing from this story. Did they report it to the relevant data authority? Did the fix they said they applied actually fix anything? Etc.

4/14/2026, 9:26:14 AM


by: GistNoesis

Who should get jailed?

Does the company that willingly sells the polymorphic virus editor bear any responsibility, or does it fall on the unaware vibe coder?

4/14/2026, 9:20:25 AM


by: TeMPOraL

I have my doubts about the story. I consulted on a medtech project in a similar space in the recent past, and at various points different individuals vibe-coded[0] not one but *three* distinct, independent prototypes of a system like the article describes, and none of them was anywhere near that bad. On the frontend, you'd have to work pretty hard to force SOTA LLMs to give you what is being reported here. Backend-side, there are plenty of proper turn-key systems to get you started, including OSS servers you can just run locally, and even a year ago SOTA LLMs knew about them and could find them (and would suggest some of them).

I might be biased by my experience, because we actually cared about GDPR, the AI Act, and proper medical data processing, and I've spent my fair share of time investigating the options that exist. Still, I'm struggling to imagine how one could possibly screw it up anywhere near as badly as the article described. Like, I can't think of a way to do it, to the point I might need to ask an LLM to explain it to me.

--

[0] - Not as a means of developing an actual product, but solely to *see if we could*, plus it was easier to discuss product ideas while having some prototypes to click around.

4/14/2026, 11:06:39 AM


by: erelong

To me it just sounds like eventually someone will figure out how to make vibe coding more reasonably secure (with prompts to have apps reviewed for security practices?),

unless cybersecurity is such a dynamic practice that we can't create automated processes that stay secure.

Essentially, the question is what can be done to make vibe coding "secure enough".

4/14/2026, 11:06:17 AM


by: agos

I really hope OP also contacted the relevant national privacy authority; this is a giant violation.

4/14/2026, 9:17:17 AM


by: zkmon

Technology for greed vs technology for need. Greed has its cost.

4/14/2026, 11:01:47 AM


by: hamasho

The worst blunder I made was when I explored cloud resources to improve the product's performance.

I created a GCP project (my-app-dev) for exploring how to scale up the cloud service. I added several resources to mock production, like compute instances, Cloud SQL, etc., then populated the data and ran several benchmarks.

I changed the specs, the number of instances and replicas, and the configs through the gcloud command.

    $ gcloud compute instances stop instance-1 --project=my-app-dev
    $ gcloud compute instances set-machine-type instance-1 --machine-type=c3-highcpu-176 --project=my-app-dev
    $ gcloud sql instances patch db-1 --tier=db-custom-32-131072 --project=my-app-dev

But for some reason, at one point codex asked to list all projects; I couldn't understand why, but it seemed harmless so I approved the command.

    $ gcloud projects list
    PROJECT_ID   NAME    PROJECT_NUMBER
    my-app-test  my app  123456789012
    my-app-dev   my app  234567890123  <- the dev project I was working on
    my-app       my app  345678901234  <- the production (I know it's a bad name)

And after this, for whatever reason, it changed the target project from dev (my-app-dev) to production (my-app) without asking and without me realizing.

Of course I checked every command. I couldn't YOLO while working on cloud resources, even in a dev environment. But I focused on the subcommands and their content and didn't even notice it had changed the project ID along the way.

It continued to suggest more and more aggressive commands for testing, and I approved them brain-deadly...

    $ gcloud sql instances patch db-1 --database-flags=max_connections=500 --project=my-app
    $ gcloud compute instances delete instance-1 --project=my-app
    $ echo 'DELETE FROM users WHERE username="test";' \
        | gcloud sql connect my-db --user=user --database=my-db --project=my-app
    $ wrk -t4 -c200 -d30s \
        "http://$(gcloud compute instances describe instance-1 \
          --project=my-app \
          --format='get(networkInterfaces[0].accessConfigs[0].natIP)')"

It took a shamefully long time to realize codex was actually operating on production, so I had DDoSed and SQL-injected my own production...

Fortunately, it didn't do anything irreversible. But it was one of the most terrifying moments in my career.
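A guardrail that would have caught this drift is easy to sketch. Assuming the agent's commands are reviewed as strings before approval, a hypothetical pre-flight check (the project names below are taken from the comment; the helper itself is not part of any real tool) could refuse anything whose --project flag strays from the intended environment:

```python
# The only project this session is allowed to touch (assumed name).
ALLOWED_PROJECT = "my-app-dev"


def is_safe(command):
    """Return True only if every --project flag in the command string
    targets ALLOWED_PROJECT; commands without the flag pass through."""
    tokens = command.split()
    for i, tok in enumerate(tokens):
        if tok == "--project" and i + 1 < len(tokens):
            # Space-separated form: --project my-app
            if tokens[i + 1] != ALLOWED_PROJECT:
                return False
        elif tok.startswith("--project="):
            # Equals form: --project=my-app
            if tok.split("=", 1)[1] != ALLOWED_PROJECT:
                return False
    return True
```

Wiring a check like this in front of auto-approval would have flagged the first `--project=my-app` command instead of relying on a tired human to spot one changed flag.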

4/14/2026, 11:47:47 AM


by: high_byte

this is exactly the kind of vibe coding horror story I asked for just a few days ago :)

https://news.ycombinator.com/item?id=47707681

4/14/2026, 9:32:09 AM


by: ionwake

Anyone else read the title on HN and shudder not wanting to actually click it?

4/14/2026, 9:20:04 AM


by: cmiles8

There's another version of the mythos narrative that reads like:

AI companies realized that all this vibe coding has released a shitstorm of security vulnerabilities into the wild, and so unless they release a much better model to fix that mess, they'll be found out and nobody will touch AI coding with a 100ft pole for the next 15 years. This article points more towards that narrative.

4/14/2026, 10:27:28 AM


by: repeekad

A perfect example of why a product like Medplum exists, as opposed to completely reinventing the wheel from scratch

4/14/2026, 9:11:20 AM


by: krater23

The only thing that helps is deleting the database. Every day. Until the thing goes down, because the 'developer' thinks he has a bug that he can't find.

4/14/2026, 9:19:32 AM


by: zoobab

Avoid JavaScript like the plague; it can be overwritten on the client side.

4/14/2026, 9:17:55 AM


by: fakedang

Report them - that right there is 5+ different violations. Only then will they realize their stupidity.

4/14/2026, 10:19:03 AM


by: faangguyindia

It's nothing new; Dunning-Kruger existed long before AI entered the coding realm.

Several years ago I ran into an American company which consulted with me. They had 4000 paying customers, and they had rolled out their own billing solution which accepted crypto, PayPal, and Stripe.

They had problems with payments going missing. I migrated them to WHMCS with hardening, and they never had any issues after that.

Now, people may laugh at WHMCS, but use the right tool for the job.

If you need a battle-tested billing solution, then WHMCS does count: it supports VAT, taxes, reporting/accounting, and pretty much everything you'll get wrong trying to do it all yourself.

Too bad there isn't a battle-tested open-source solution for this.

4/14/2026, 9:20:43 AM


by: peyton

Kinda crazy, but hopefully the future holds a Clippy-esque thing for people who don't know to set up CI, checkpoints, reviews, environments, etc. that just takes care of all that.

It sorta should do this anyway, given that the user's intent probably wasn't to dump everyone's data into Firebase or whatever.

I personally would like this as well, since it gets tiring specifying all the guardrails and double-checking myself. Using this stuff feels too much like developing a skill I shouldn't need while not focusing on real user problems.

4/14/2026, 9:04:50 AM


by: mikojan

Hard to believe... This activity should certainly land you in a German prison?!

4/14/2026, 9:02:39 AM


by: crvst

Cool story bro. Of course it's true if it made it to HN. Who needs proof.

4/14/2026, 11:01:07 AM


by: avazhi

You guys realise this is AI slop on AI slop, right?

4/14/2026, 10:57:36 AM


by: jseabra

[dead]

4/14/2026, 10:42:23 AM


by: vedant_awasthi

[flagged]

4/14/2026, 10:10:24 AM


by: direwolf20

Some people only care about actual consequences. Download all the data and send it, by post on a flash drive, to the GDPR regulator's office, and another copy to the medical licensing board, because why not.

4/14/2026, 9:00:04 AM


by: sajithdilshan

Don't blame the AI for what is clearly gross human negligence. It's like renovating your entire house and then acting surprised when the pipes burst because you used duct tape as a permanent fix.

4/14/2026, 9:33:18 AM


by: websap

Do you think that if the agency had hired a consultant to build this, the consultant couldn't have made the same mistakes?

A lack of security theater is a good thing for most businesses.

4/14/2026, 9:01:14 AM