Hacker News Viewer

The Value of Things

by vinhnx on 1/25/2026, 10:31:14 AM

https://journal.stuffwithstuff.com/2026/01/24/the-value-of-things/

Comments

by: arjie

I don't really think it's the effort, to be honest. A short while ago, I made a Custom GPT for my wife[0] that she thoroughly enjoyed and uses all the time. It didn't take me that long. The value in the thing is that I could see what she wanted and make it like she wanted. And on the receiving side, a friend of ours knitted our daughter, Astra, a quilt with her name on it and with this lovely star motif. It must have taken her and her mother, a seamstress, ages. But if she had done it instantaneously, I think I still would have loved it.

As for the other side of things, there is one thing that gives me a mild twinge of envy: I grew up on the command-line and when I write on it, I can knock out a full bash pipeline literally at the command-line super-fast. Many of my friends are far better engineers, but this one thing makes me great at any sort of debugging and all that. Now everyone has that! I'm USELESS.

Well, not really, but it's funny that this unique skill is meaningless. Overall, I've found that AI stuff has let me do more and more things. Instead of thinking, at the end of the week, "Oh man, I wish I'd made progress on my side project" I think "Damn, that wasn't such a good idea after all" which is honestly far more satisfying!

0: https://wiki.roshangeorge.dev/w/Blog/2025-10-17/Custom_GPTs

1/29/2026, 11:17:41 PM


by: xqb64

I'm literally considering a career switch from software engineering to electrical engineering and electronics, and naturally going back to school, because AI and the way it's used in writing software have sucked all the meaning out of it for me.

1/29/2026, 9:12:34 PM


by: VTimofeenko

The screenplay part made me think. While it's true that an LLM could potentially generate a better^ script, while writing the script the author would probably have had many ideas that did not make it into the final draft. Yet those ideas would definitely influence the final product, the movie. There's probably only so much you can put in the script really.

Meaning for the brother is one thing, but as a potential watcher, I would almost always prefer a movie that someone really cared about^^.

^: depending on the definition of "better"

^^: as a fallible human being I am not perfect at detecting that care, but there have definitely been cases in my life when someone was talking about a thing that I would not really care about otherwise, but their passion made the talk extremely interesting and memorable

1/29/2026, 10:49:48 PM


by: dzink

The author attributes meaning for the giver and hopefully receiver to time spent by the giver. They argue less time spent for the same utility lowers the meaning.

I see something very different:

1. The government post shared as an example of efficiency-to-utility increase has glaring errors: "journey-level developers". You will never achieve any improvement on government code bases if the people leading the effort can't pay attention to the most basic and broadcasted elements of the job. AI used by junior developers will only compound the massive complexity of government systems to the point where they are not fixable by seniors and they are not usable by humans.

2. The time spent doing something, meaningful or not, with care trains that person in attention to detail, which is absolutely critical to getting things right. People who lazily lean on generating more without attention to detail don't see the real point of the work - it's not to add more stuff to less space (physical or mental) faster. It's to make the existing space better and bigger by needing less in it. The rest is just mental and physical complexity overload. We are about to drown in that overload like hoarders next to a dumpster.

3. If you have ever lived in a small home you may have noticed that the joy of getting things (usually derived from dopamine-seeking behaviors like shopping or making, or shopping for ingredients so you can make things, or from getting toys for your kids or other people getting toys for your kids) will quickly overload any space in your living quarters. Your house becomes unbearably small and it becomes impossible to find things in piles or drawers filled with other things if nobody ever organizes them. We have all become dopamine addicts, or the world has turned us into such, and there are few if any humans willing and capable of organizing that world.
So most people today will be paralyzed with a million choices that were never organized or pruned down by the owner or their predecessors. The overwhelming urge would be to escape the dread of organization into more dopamine-generating behaviors. We need more of the "moms" who clean up our rooms after we've had fun with all the legos we can now generate. Or we will all be living in a dumpster before too long.

1/29/2026, 9:53:52 PM


by: camgunz

I love Nystrom's writing, and he's so good at it because he's written so much. A huge part of the value of things is how we grow in the making of them, and I worry that in a world where we accept generative slop, we'll never have the opportunity to woodshed enough to become excellent at a craft.

I'm a good engineer because I've written tons of code, I've taken no shortcuts, and I've focused on improving over my many iterations. This has enabled me to be an effective steward of generative coding (etc) models, but will younger engineers ever get the reps necessary to get where I am? Are there other ways to get this knowledge and taste? Does anyone know or care?

We're in the anthropocene now, and while probably everyone who knows what that is understands we have the largest effect on the Earth, it also means we now also have the largest effect on ourselves. We're so, so bad at taking this seriously. We can unleash technology that idiocracies western civilization inside of a generation, I know this because we keep lunging towards it with ever increasing success. But we can't just shamble around and let Darwin awards sort things out. We have nukes and virology labs, not to mention a climate change crisis to deal with. If the US political system falls apart because Americans under 65 spend between 2-3 hours on social media a day, that's a failed state with a lot of firepower to shoot around haphazardly.

And why do we keep building things that enfeeble us? Did we need easier access to delivery food and car rides, or did we need easier access to nutritious food and more walkable neighborhoods? Did we need social media with effectively no protections against propaganda/misinformation? We know that cognitive ability and executive function decline with LLM use.
Can it really be that we think we're actually too smart and we need to turn it down a notch?

There are actual problems to solve, and important software to write. Neither algorithmic feeds nor advertising platforms fall under those categories. LLMs are supposed to solve the problem of "not enough software"--Nystrom points at this explicitly with the Washington Department of Ecology ad. But we never had a "not enough software" problem, rather we had a "not enough beneficial software" problem, i.e. we'd be in a way better place if our best minds weren't working on getting more eyeballs on more ads or getting the AI to stop undressing kids.

Generative AI isn't empowering us. We don't have people building their own OSes, (real, working) browsers, word processors and spreadsheet programs, their own DAWs or guitar amp modelers, their own Illustrators or Figmas. Instead you have companies squeezing their workers and contractors, while their products enshittify. You can't even run these things without some megacorp's say so, and how are you gonna buy time on the H100 farm when AI took your job?

I'm too tired to write a conclusion. I'm pretty sure we're fucked. But hey look, the cars drive themselves.

1/29/2026, 10:54:56 PM


by: agcat

This is a really good piece, especially the part on what makes sense to build with AI or not, and why.

1/29/2026, 9:51:52 PM