The AI opinion piece

Published on 18 Mar 2026


The devs and other people who love DLSS 5 and AI have such an inability to read the room. They can't think about the entire context that fueled the AI hate (for good and bad reasons), and any time they speak they're essentially telling us we suck ass, with no thought for the years of context before it.

“LOOK AT ALL THE SHIT WE CAN DO NOW!”
Yeah sure, but with AI I'm also thinking about how it destroyed the internet in many ways. It prevents me from doing research, it prevents me from trusting anything, it destroyed whole swathes of stuff where the human touch is necessary (you know… art? press? knowledge?).

And sure, what I'm saying may not apply to DLSS 5, but their complete inability to factor in all of that context, combined with the results themselves and how the thing works, allows me to be distrustful of it. The result is uncanny, like a lot of genAI shit, where I've had to train myself to spot the uncanny just to protect myself from bad actors.

I’m sure those people would look at what I said and be like “but you’re focusing on only the bad parts of it. Look, it’s helping against cancer!”
Cool. AI is a tool, but capitalism still exists, and the bad actors also tend to be massive corporations who just want to drain us of our money and our will. We watch tech get legitimately worse to use and unnecessarily complicated for nothing, just to get us to cough up the cash for an experience that's still worse than what we had 10 years ago.

At a time when we're supposed to, you know, care about the Earth, we keep hearing about tons of AI data centers getting built. And let's be honest here: they're built for genAI specifically, not just AI, and they're gonna demand even more energy than before, right when we're trying to optimize that for our survival. GenAI is the kind that requires the most power to process, because it tries to do everything, and it's the part that's easiest to show off to regular people. AI sure is a tool, and there are legitimate uses of it that I do believe in (accessibility, analysis, specialized uses…), but my god, it also multiplies the bad uses tenfold, and society doesn't have time to adapt.

It's incredibly ironic how we spent years telling people not to use Wikipedia as a source, not to fully trust the Internet, but then ChatGPT and the likes get accepted immediately as something trustworthy to rely on. It's ironic how the same people who told us to be careful also happen to be the ones who fall straight into the very traps they warned us about. And it's incredibly sad how many people just don't want to think for themselves and happily hand ChatGPT the right to think instead.

Also, seriously, I don't understand the idea of an "AI artist". A prompt is the equivalent of commissioning an artist to do stuff for you; training data is the equivalent of giving inspiration pieces to your artist so they can process your intentions. You're an AI commissioner at best, and saying you made it is 100% stealing credit from your artist… in this case, the trained AI model.

It also makes everything lazy. I actually think machine learning, while impressive, is technology purely made to brute-force problem solving. That's great for stuff that would be practically impossible to process in a decent amount of time, but DLSS also exists so that devs don't have to optimize anymore. Devs don't like to optimize, and I get why, but also… can we talk about how we have insanely powerful computers, yet we went from optimized things to stuff that runs slow as heck on far more powerful systems? Is the goal of AI purely to reach the moon before we've even built the station?

I'm probably going too far with this, and the logic is possibly not fully sound, but I hope you still understand the sentiment. It's also a sentiment that won't be reasoned away by just telling me about the potential, or how much I miss the point, or how much I suck. I just want us to make the best use of what we have, and we clearly do not. AI also optimized away the human touch by making it technologically impressive… at the expense of our ability to appreciate when a human did the same thing.

We love to take shortcuts; a lot of innovations are defined by that, and I do plenty of things where the ends matter more than the journey. But I'll be honest: when I tried generating art myself (it's good to know your enemy), I saw that a lot of people love being surprised by the results of a prompt. I'm a guy who loves putting intention into things, though, and in every single case, none of those generated images ever got close to what I wanted, despite all my attempts to adapt my prompts. The only ones that did were the ones I asked a human to make for me. And sure, you could train the AI to understand what you want, but by that point you've already made the thing to some extent, and it's just going to look generic… and unsatisfying after the 50th generation, still hoping for one good, unique result.

It's lazy, and it wastes too much energy and time.