
usonian

(20,116 posts)
Fri Sep 5, 2025, 07:52 PM Friday

Where's the Shovelware? Why AI Coding Claims Don't Add Up --- Mike Judge Sep 03, 2025

https://substack.com/inbox/post/172538377

With receipts.

"But lo! men have become the tools of their tools." --- Henry David Thoreau, "Walden"


I was an early adopter of AI coding and a fan until maybe two months ago, when I read the METR study (1) and suddenly got serious doubts. In that study, the authors discovered that developers were unreliable narrators of their own productivity. They thought AI was making them 20% faster, but it was actually making them 19% slower. This shocked me because I had just told someone the week before that I thought AI was only making me about 25% faster, and I was bummed it wasn't a higher number. I was only off by 5% from the developers' own incorrect estimates.

This was unsettling. It was impossible not to question whether I, too, was an unreliable narrator of my own experience. Had I been hoodwinked by the screens of code flying by, with no way of quantifying whether all that reading and reviewing of code actually took more time than just doing the thing myself?

So I started testing my own productivity using a modified methodology from that study. I'd take a task and estimate how long it would take to code by hand, and then I'd flip a coin: heads I'd use AI, tails I'd just do it myself. Then I'd record when I started and when I ended. That gave me the delta, and I could use the deltas to build AI vs. no-AI charts and look for trends. I ran that for six weeks, recording all that data along the lines of the sketch below.
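
Roughly this kind of logging, sketched in Python (the file name and field names here are illustrative, not from the study):

import csv
import random
import time

LOG_FILE = "ai_coding_log.csv"  # illustrative file name

def start_task(description, estimate_minutes):
    # Coin flip assigns the task to the AI or no-AI condition.
    condition = "AI" if random.random() < 0.5 else "no-AI"
    return {"description": description,
            "estimate_minutes": estimate_minutes,
            "condition": condition,
            "started": time.time()}

def finish_task(task):
    # Log the hand-coding estimate, the condition, and the actual elapsed time.
    elapsed_minutes = (time.time() - task["started"]) / 60
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([task["description"], task["condition"],
                                task["estimate_minutes"],
                                round(elapsed_minutes, 1)])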

And do you know what I discovered? I discovered that the data isn't statistically significant at any meaningful level. That I would need to record new datapoints for another four months just to prove whether AI was speeding me up or slowing me down at all. It's too neck and neck.
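
One reasonable way to check that on your own log is a Welch's t-test with SciPy (a sketch; the times below are made-up placeholders, not real measurements):

from scipy import stats

# Illustrative placeholder task times in minutes, not real data.
ai_minutes = [95, 120, 60, 150, 80, 110]
manual_minutes = [100, 105, 70, 140, 90, 95]

# Welch's t-test: can the two conditions be told apart?
t_stat, p_value = stats.ttest_ind(ai_minutes, manual_minutes, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")
# With samples this small and this close, p lands far above 0.05:
# no statistically significant difference in either direction.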


(1) https://metr.org/blog/2025-07-10-early-2025-ai-experienced-os-dev-study/


hunter

(39,875 posts)
1. Worse, your coding or writing skills will rot away the longer you rely on AI.
Fri Sep 5, 2025, 08:29 PM
Friday

In a few years you'll be coding or writing just like an AI -- throwing shit against the wall to see what sticks.

Disaffected

(5,861 posts)
2. This article misses the point, however, that AI will write code not just for experienced programmers but
Fri Sep 5, 2025, 08:48 PM
Friday

also for inexperienced programmers and everyone in between, including those who have no familiarity at all with a particular coding language.

I am an example of the latter: I know very little about web browser app coding but have just gotten AI to write two apps for me. All one has to do is describe in plain English what you want the app to do, and the AI engine will do it for you with, IMO, amazing capability and alacrity. It is remarkably like conversing with an experienced human programmer. The same AI can also guide you through any other processes necessary to make the app functional, e.g., making a standalone app or connecting it to a database server so many users can access it simultaneously.

And these are early days. I'm not sure that, if I were starting out now, I would want to enroll in a software engineering or computer science program. Certainly those job types will not disappear entirely, but job openings are bound to decrease in the future and, IIRC, are already doing so.

usonian

(20,116 posts)
3. In the "old days" we could spin up a BASIC program or HyperCard stack with very little experience.
Fri Sep 5, 2025, 09:11 PM
Friday

And many did. There were entire business suites written in BASIC, and who can forget "Your Faithful Camel"?

Even then, the threshold was low, and it was said: it's easy to get started, but anything big and serious wasn't (and isn't) easy.

Anything "serious" involving data structures, concurrency, locks, and a bazillion other things takes skill and experience to get right. As of now, LLMs constantly reinvent the wheel, whereas libraries are thoroughly tested and can be relied on. Libraries like BLAS and LAPACK get things right down to fine points of precision and roundoff.
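
For instance, a quick Python sketch (assuming NumPy, which hands the actual work to battle-tested BLAS/LAPACK routines):

import numpy as np

# Solve the linear system A x = b.
A = np.array([[3.0, 2.0],
              [1.0, 4.0]])
b = np.array([7.0, 9.0])

# np.linalg.solve delegates to LAPACK, which has decades of testing
# behind its handling of precision and roundoff; no wheel reinvented.
x = np.linalg.solve(A, b)
print(x)  # [1. 2.]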

And if the LLMs are programmed to incorporate them, then the LLM is a bit more than "glue". Scripters like me (I did mostly sysadmin work) are even lazier than programmers, and call on libraries.

You can do that with a quality programming language like Python, Perl, Java, Ruby and so on, and get (mostly) dependable results.

And languages have interpreted versions, if they're not interpreted already; Java has Groovy, and so on. Development becomes more of the rapid-prototyping variety than waiting on a compile.

Just take a look at Jupyter Notebooks: "batteries included", tons of batteries (libraries), all tested and reliable, with a nice GUI. I got a free version (Carnets) for the iPad.

I would seriously rather invest in a language with rich libraries to leverage than ask an LLM to reinvent everything on the fly. You can't even really test LLMs, since they are mostly black boxes.

Xolodno

(7,143 posts)
4. Forgot where I read it, but Computer Science is being avoided like the plague right now in universities.
Fri Sep 5, 2025, 09:13 PM
Friday

An old manager of mine called me up a little over a year ago: they had gotten rid of a lot of the Comp Sci people and didn't replace them.

Then he called me up last week. He's an analytics manager, and they not only gave him the boot, but his director and VP as well, and again they're not replacing anyone. There used to be two teams doing the work; now it's a skeleton crew.

I played around with AI a little to write some SQL code. It didn't take me long to figure out I could just write basic code myself, drop it into Access or hook Excel up to the results, and get what I wanted a whole hell of a lot faster.

I think AI is being oversold right now. The vendors are making presentations on how much more productive you'll be, how much can be saved if you eliminate employees, etc. And of course the C-suite people eat it up. And the AI companies are pushing this because they just spent a lot of money building it and need to recoup it.

Disaffected

(5,861 posts)
5. I dunno, my experience with AI in coding is very limited.
Fri Sep 5, 2025, 09:21 PM
Friday

The particular platform (IDE) I used to build my two apps is Windsurf/Cascade. I don't really know how "tight" and efficient the HTML code it generated is, but both apps are relatively simple (one or two screens of content) and both are over 1,500 lines of code. It would take me longer just to type that amount of code from a hard copy than to get the IDE to produce the whole functional thing, especially after I gained a little experience with it.

Xolodno

(7,143 posts)
6. I got sent to enough tech conventions to be a bit on the "let's wait and see" side.
Fri Sep 5, 2025, 09:38 PM
Friday

One in particular: they sent me to San Diego, put me up in the Grand Hyatt over Mission Bay, and then proceeded to lock us into classes about tech, industry trends, Big Data, etc., most of which never materialized. Then they locked us up for dinner and product/start-up company launches afterwards. Some made it; most didn't. Because "Big Data! It's coming! And your company better be ready for it!" It wasn't as much doom and gloom as they said it was going to be.

My wife had a great vacation, however.
