The fact that generative AI software is open source makes no difference. There can be no pride in creating content that anyone can create. In this author's opinion, AI-generated or AI-enhanced content is, essentially, content stolen from other publishers.
The famous Sports Illustrated magazine was recently caught publishing AI-generated articles, complete with AI-generated photos for the profiles of its non-existent authors (https://www.theguardian.com/media/2023/nov/28/sports-illustrated-ai-writers). It was quite a fall from grace. If you are a writer of any kind, or if you engage in any kind of publishing (even software documentation), this incident should make you pause and think. Should you believe the hype and get help from AI?
What is AI? Is it not just software?
Many years ago, I decided to make one last update to a program that I had created much earlier. By then, there were hundreds of software download sites where my software could be hosted or listed. Rather than submit it to each one of them manually, I decided to use the trial version of a mass-submission tool. For each software download website, the tool would programmatically fetch the submission page, create an account in my name if required, and upload my program details and/or the download file. The website would then create a download page based on the submitted metadata. Some websites had a Google captcha filter to foil such automated submissions. The tool solved most of the captcha images on its own, transparently showing the downloaded captcha image and its attempts to solve it. It succeeded most of the time; for the rest, it asked me to intervene. When I manually solved a captcha, it resumed the submission process. I was amazed at how well the software worked. This was a decade ago.
How did the software solve the captcha images? Artificial intelligence? Not at all. Artificial intelligence is just software. Image-recognition software that can read text from photos has been bundled with printers and scanners for at least two decades now. After open source alternatives to such proprietary software became easily available (in places such as GitHub), programmers began to modularise them and use them in myriad applications. Now, some of these applications tend to amaze, and they are treated as ‘AI’.
Early AI applications were based on a limited data set from which they could form their logic. With the World Wide Web, the data set for self-learning has become practically unlimited. All kinds of text, audio and video are being uploaded to the web. Microsoft and Google have scanned millions of library books. Their search engines, which parse this data legally and illegally, can power AI programs that seem astonishingly powerful. But it is still old technology. Microsoft Word has been able to detect bad grammar for more than two decades. What is different today is the size of the data set that AI programs can leverage.
AI-generated content may seem beautiful but it is still a Frankenstein monster
Electronic music synthesisers are an early example of an AI device based on a limited data set. Using a sound bank of recordings, they could imitate many man-made musical instruments. They could also create electronically-generated waveforms that no traditional musical instrument could possibly create. Electronic keyboards may have spurred a music revolution but they did not make musical instruments or musicians obsolete. Instead, they changed the way musical compositions were rendered or recorded. Gone are the days when an entire orchestra had to be in the recording studio or near a stage along with the singer.
The music from a synthesiser may have been artificially generated. But as the data set was limited, even a skilled musician needed to leverage considerable artistry to create good music. With AI-generated content, the scales have tilted far too much to the other side. Skill is not in great demand because the data set is so unbelievably big. Now, anyone can generate good content at the click of a button.
Why would someone hire a butler if the butler is going to issue voice commands and make a robot do all the work? It would be better to buy the robot and eliminate the butler! Similarly, why should someone pay a writer if an AI chatbot is doing all the heavy lifting? Here’s the big difference.
When a writer uses AI to ‘enhance’ his writing, he is copying from several other writers. That collective Frankenstein writer is the real author who should get the credit. Search engines have scraped content from thousands of writers to create this monster. The generated content is not the writer’s to claim, and trying to make money from it is unlawful. It is as if you had used an electronic synthesiser loaded with an illegal sound bank of human voices. The fact that the software is or was open source is immaterial. The ultimate use case is nefarious.
Where AI-generated content is (not) a problem
Recently, at a fashion show in New York, a prankster strode down the ramp wearing a transparent garbage bag as a top, and the clueless fashionistas assembled there did not notice anything amiss. It was the rent-a-cop bouncers who tackled him off the stage. Nevertheless, people do value original work. When someone buys a painting, he expects it to have been painted by an artist, that is, a real human. The buyer would feel as if his purse had been burgled if he were to learn that the painting was AI-generated.
When you buy a book, the covers can be AI-generated. That is not a problem. What matters is the text between the covers. That should be written by a human through and through.
I frequently read reports about the performance of stock market scrips. Many of these articles are based on stock movements and publicly disclosed financial data. Because this data is highly structured, the articles seem to follow a pattern. They look AI-written, but there may be no AI, or very little, involved. It could be just mail merge, the killer technology that made WordStar famous in the DOS command-line era. As a casual reader, I do not care if such content was machine-generated. The content is free and I have no complaints.
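To illustrate why such formulaic reports need no AI at all, here is a minimal mail-merge sketch in Python. The template wording, company names and figures are invented for illustration; a real publisher would substitute its own feed of structured market data:

```python
# A fixed template plus structured records: classic mail merge, no AI.
# All names and numbers below are invented for illustration.

TEMPLATE = (
    "{name} closed at {price:.2f}, {direction} {change:.1f}% on the day, "
    "with reported quarterly revenue of {revenue}."
)

def render_report(record: dict) -> str:
    """Fill the fixed template with one company's structured data."""
    direction = "up" if record["change"] >= 0 else "down"
    return TEMPLATE.format(
        name=record["name"],
        price=record["price"],
        direction=direction,
        change=abs(record["change"]),
        revenue=record["revenue"],
    )

records = [
    {"name": "Acme Corp", "price": 123.45, "change": 1.8, "revenue": "$2.1bn"},
    {"name": "Globex Ltd", "price": 67.80, "change": -0.6, "revenue": "$840m"},
]

for r in records:
    print(render_report(r))
```

Every run produces grammatical, pattern-following prose from nothing more than string substitution, which is why such articles read alike whether or not any AI was involved.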
YouTube is littered with misguided writing coaches advising budding writers to use AI to enhance their writing. They are careful to say that if you do not have talent, AI cannot help you. However, if you do have talent (please … do not ask what talent), AI can supposedly make you a great writer. This cannot be true. AI will not make anyone a great writer. A dumb writer remains a dumb writer. He will just be plagiarising from several great writers. The AI software conveniently obfuscates the plagiarism.
If you are a writer or a publisher and you wish to be paid for your content, ask yourself:
What is your value proposition?
- Would your readers like it if they were fooled by your AI-generated content?
- Would a reader want to pay good money for content that they could generate themselves?
How honest is your writing?
- Is it not your obligation to provide your own writing?
- Is it not misrepresentation to pass off AI-generated or AI-enhanced content as your own?
- Do you want to contaminate your writing with content or style stolen from other writers without their consent?
Why even write?
- Would it not be an abject admission that you are sub-par to a machine?
- Is this just a hustle or do you have some self-worth as a writer?
Despite the popularity of open source software, there is universal reticence among developers to work with code written by others. Even minor differences in code formatting style or variable naming conventions can put off other developers. I have used open source code numerous times in the software I have developed, but in all those cases I did not want to edit the foreign code by hand. I simply packaged that code in an external library of routines, separate from my own code. Most developers are like that. It is because we attach great pride and value to our own code that we create such boundaries. We cannot create such boundaries when we work in a team but, left to ourselves, we do not want anyone else interfering in our work, because we want all the credit. This is true of all professionals, not just software developers. No grownup wants a nanny.
The bottom line: do not contaminate your brand with AI.