A new (3/31) ZDNET article is both optimistic about AI for coding and a warning about more serious problems ahead
https://www.zdnet.com/article/maybe-open-source-needs-ai/
The article's title is "How AI has suddenly become much more useful to open-source developers," and the first part is about AI becoming better at coding. That's good news for software engineers and developers whose jobs will be helped by it, and not-so-good news for those whose jobs it threatens.
But then there's another part of the article, which is ignored by the headline, though at least touched on by the second, much shorter sentence in the subheadline: "However, legal and quality issues loom."
Which makes the good news part of the article a bit of an April Fool's joke.
Before breaking out the champagne, let's consider several major problems. First, if we can improve open-source code with AI, what's to stop someone from copying and rewriting existing code and then putting it under a proprietary license? The lawyers are going to have a field day with this. Oh, wait, they soon will: Dan Blanchard, maintainer of an important Python library called chardet, just released the latest "clean room" version of the program under the MIT license, replacing its GNU Lesser General Public License (LGPL). By "clean room," he means he used Anthropic's Claude to rewrite the library entirely. Claude is now listed as a project contributor.
A person claiming to be the project's original developer, Mark Pilgrim, is not happy. Pilgrim says, "[The maintainers'] claim that it is a 'complete rewrite' is irrelevant, since they had ample exposure to the originally licensed code. Adding a fancy code generator into the mix does not somehow grant them any additional rights."
Blanchard, however, claims that "chardet 7 is not derivative of earlier versions." Did I mention that using AI to modify or clone open-source code will end up in court?
There's another problem: Although AI appears to be much more useful than it used to be for fixing code issues, there's still a lot of AI slop out there, and open-source project maintainers are drowning in it. Just ask Daniel Stenberg, creator of the popular open-source data transfer program cURL.
-snip-
I posted a thread about the continuing AI-slop coding problem this past Sunday:
https://www.democraticunderground.com/100221133829
I was glad to see this ZDNET article at least devote a few paragraphs to the legal problems.