Angry at AI for Stealing Your Job?

Robot Writer, created by Marcel Gagné, using Stable Diffusion

If the answer to the above question is "Yes", then consider that you may be angry for the wrong reason, and at the wrong people. Or at AI.

Generative AIs, like ChatGPT, Midjourney, and DALL-E, are the talk of the town. Not just your town or my town, but all of them. Seriously. And now, Microsoft's Bing is in on the action and OpenAI has released GPT-4.

Every time you turn around, somebody is talking about, writing about, or getting interviewed about AI. Hey, I’m writing about it here, right now! I especially love (and I’m being snarky here) the articles titled, “We asked AI about blah de blah and this is what it said.” Sigh…

If you are enjoying my work, please share it with others and encourage them to subscribe. If you'd like to chat with others about what you see here, or anything else for that matter, join the FreeThinkerAtLarge Discord server. It's free and it's fun!

With all the hype and the talk, you’ll find a fair number of skeptics, naysayers, true believers, and everything in between. One of the most common concerns is that these AI models are "stealing" the work of others. Some people, including friends of mine, argue that the text and images generated by these models are taken from original sources, and that AI is essentially plagiarizing the work of human creators. They suggest that, somehow, complete texts and/or images are sitting there, in their entirety, within the model file that the AI art or text generator uses. The AI just scoops it up and spits it back out.

Fun fact: the model file for Stable Diffusion, circa version 1.6, is just over 2.1 gigabytes, which would require downright arcane, magic-level compression if every piece of art were somehow in there. But I digress…
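To put a rough number on that claim, here’s a quick back-of-envelope calculation. The training-set size below is my assumption, based on the commonly cited figure of roughly 2.3 billion images in the LAION-2B-en dataset used to train Stable Diffusion; the exact figure doesn’t change the conclusion.

```python
# Rough sanity check: if a ~2.1 GB model somehow contained every training
# image, how much storage would each image get?

model_bytes = 2.1 * 1024**3      # ~2.1 gigabyte model file
num_images = 2_300_000_000       # assumed ~2.3 billion training images

bytes_per_image = model_bytes / num_images
print(f"{bytes_per_image:.2f} bytes per image")
```

Less than one byte per image. A single pixel takes more than that, which is why "the art is stored inside the model" simply can’t be true.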

These concerns over “stolen art” are misguided and rest on a misunderstanding of how generative AI works. AI models like ChatGPT, Midjourney, and DALL-E do not actively seek out and copy the entire work of others. Instead, they are trained on vast amounts of data from the internet, which can include text, images, music, and videos, from a wide variety of sources. When the AI model generates new text or images, it is not actively plagiarizing, but rather using the patterns and structures it has learned from the data it was trained on to create something new.
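Here’s a toy illustration of "learning patterns, not storing copies." This is a simple word-level Markov chain, vastly simpler than the transformer models behind ChatGPT, but the principle is the same: the training text is boiled down to statistics, discarded, and new text is generated from those statistics alone.

```python
import random

# A toy "language model": learn which word tends to follow which,
# then generate new text from those statistics alone.

corpus = ("the cat sat on the mat "
          "the dog sat on the rug "
          "the cat chased the dog").split()

# "Training": count word-to-next-word transitions.
model = {}
for current, nxt in zip(corpus, corpus[1:]):
    model.setdefault(current, []).append(nxt)

def generate(start, length=6, seed=0):
    """Generate new text by sampling the learned transitions."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        options = model.get(words[-1])
        if not options:
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(generate("the"))
```

The generator can produce sentences that never appeared in the corpus, like "the dog sat on the mat", because what it stored was the pattern, not the prose. Scale that idea up by many orders of magnitude and you have the gist of what "trained on" actually means.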

The “GPT” in ChatGPT stands for "Generative Pre-trained Transformer." The "Generative" part is pretty simple; it means that the model can generate content, such as text. The "Pre-trained" part means that the model was first trained on a large dataset, giving it a general understanding of language, before any fine-tuning for specific tasks. The last part, "Transformer", refers to the specific neural-network architecture used in the model, one that uses everything it has learned to go over, predict, modify, and recreate information as needed.

Think of the way you learn. You watch movies, listen to music, and read books and articles. Over the course of years, you watch a hell of a lot of movies, listen to a hell of a lot of music, and (hopefully) read a hell of a lot of books and articles. All that information is taken into your brain where it gets sautéed with your own particular seasonings, then stored in your mental refrigerator for when you feel peckish. Those mental leftovers aren’t the original dish, but they’re entirely based on what you had available.

Your brain is a GPT engine in that you generate information based on pretrained data so that you can then transform, or adapt, that information into new work, stories, music, or art. All of us stand on the shoulders of giants. Ditto for the giants, and the giants whose shoulders they stand on. Honestly, it’s giants all the way down.

Back to the AIs.

We need to remember, and this is important, that AI models like these are not sentient beings with malicious intent. They are just clever collections of code, possibly written by human GPT engines (aka ‘normal humans’), that follow the rules given to them, applying the patterns they have learned to generate content based on those rules. AI can’t steal, it can’t plagiarize, and, more importantly, it cannot intend to do so.

Not yet. Sorry, I had to say that.

So, if AI-generated text and images are not "stealing" from others, and it’s not out to ‘steal’ our jobs, why are people so concerned about it? The real reason is that AI is better than us at being a GPT. It’s better, faster, and stronger. Mostly, it’s faster, which is why it's natural to feel, at the very least, a twinge of anxiety. That’s why we’re upset. That’s why we’re angry. After all, these AI models can generate high-quality content at lightning speed. Since most of us aren’t the Flash, that’s a bit of a slap in the face.

Picture a high-speed train going by, the Doppler shift assaulting your ears. That’s AI leaving us in the dust. Bruised ego, anyone?

Generative AI is a writing (art or whatever) machine that never needs a coffee break, pep talks, deadlines, or verbal threats to work. But, before you start to feel like your own creative abilities are worth less, not worthless, remember that AI is a tool, not a replacement. AI does not mean the end of human creativity. Instead, it presents an opportunity for human creators to collaborate with AI and use it as a tool to enhance their own work. Additionally, the field of AI-generated art and text opens new opportunities for people to create, share, and use new kinds of art and text that have never been seen before. AI will help us create new drugs, and new treatments for what ails us. It will help us uncover new sources of power and currently unimagined ways of doing things. It will give us new insight into what makes us human.

If you think about it, it’s all very exciting!

The trouble is that society isn’t ready for it. Governments, bureaucracies, and institutions move at a glacial pace compared to AI development. As Ray Kurzweil quipped, “the singularity is near”, and it’s getting nearer by the day. AI isn’t going away, but to take full advantage of it, we’re going to have to do things differently. That means different (and new) ways of taking care of society. Jobs will be lost, not because AI is ‘taking the jobs’, but because businesses trying to cut costs and increase revenue are going to give those jobs to robots and AI. It’s not the AI or the robots who are responsible.

Again, we’re angry for the wrong reasons, and at the wrong things.

Jobs are important, first and foremost, because people need to eat. People also need to keep warm and safe if we’re going to thrive and survive. We need social and economic support. In fact, we need a rethinking of the entire socio-economic framework, and while we try to figure this out (remember those governmental and bureaucratic glaciers), we’re going to need ways to make sure people survive through the change. While it may not be the definitive answer, some form of Universal Basic Income (UBI) is an absolute must to ensure that everyone has a roof over their heads, food on their plates, and clothes on their backs.

COVID wasn’t the “great reset”. Artificial Intelligence, however, is and will be for a long time to come. There’s no going back. We need to put our collective GPTs together and transform our world.