March 1, 2026
It took me a really long time before I was ready to try out ChatGPT. I asked something about travel plans and why my cake didn’t turn out as expected. I thought, ok cool, somebody is “googling” for me now.
Fast-forward.
A couple of months later (some weeks ago), I struggled with a CSS layout and asked ChatGPT for help. I didn’t get the correct answer, but I got a hint and found the solution myself. Same with a problem in my Astro configuration. I asked a specific question and the answer was just wrong. I said, hey, I have the documentation right in front of me that says what I need to do, why are you telling me something different? “Yes, Lara, you’re absolutely right…”
I needed some help with creating test data for a demo in a Blazor project where I’m not really familiar with the C# syntax and data binding. I got useful results I could work with until the logic got more complex and involved Syncfusion components, and eventually I gave up and asked a fellow developer for help, which was the more efficient way.
In a React project, I let WebStorm’s AI assistant look at two similar components and asked where the differences were and which component props could be removed. The answer looked useful at first glance, but after reading the code, I realized that not everything was correct, so I had to untangle the mess myself because I didn’t trust the output.
It’s a weird time we’re living in right now. I know other (good) developers who think that AI and vibe coding and agentic coding are the future, and there are others who just ignore everything about it. At the moment I’m not afraid of losing my job anytime soon, but still… recent developments make me think. Am I too late to the AI party? Did I already miss jumping on that train? Do I even want to use AI in my daily workflows? At the moment, I’m not sure… 🫠
What LLMs can do when it comes to writing code can sometimes be impressive and helpful, as long as you as a developer know what you’re doing. Generating random images and videos and music and stuff (a.k.a. AI slop) just for the sake of it? Oh, come on, please don’t.
I think many people misunderstand what “AI” is. Ten years ago, I wrote my master’s thesis about knowledge-based systems and artificial general intelligence, so I’m interested in all of this, but lately I’m just overwhelmed by the fast pace and the hype that is going on…
For me, it currently feels like we are building tools that take thinking and reasoning away from us. This might be nice for some things, but on a large scale, I think it can become dangerous. The people who are happy with algorithms using their private data to present never-ending feeds of content in their social media apps are probably the same ones trusting ChatGPT and Co. blindly and using it for everything…
(I highly recommend watching the video Algorithms are breaking how we think from the Technology Connections YouTube channel. It’s not about AI, but about people giving up control and letting algorithms decide what to watch… 🥲)
There’s so much on my mind, and I’m glad that others have already found good words for what I’m thinking at the moment:
“Generated code is rather a lot like fast fashion: it looks all right at first glance but it doesn’t hold up over time, and when you look closer it’s full of holes.”
“Instead of wanting to learn and improve as humans, and build better software, we’ve outsourced our mistakes to an unthinking algorithm.”
by Sophie Koonin in Stop generating, start thinking
“[…] if you create a piece of work yourself or together in a team, your brain learns and remembers important aspects of the work and your decisions. If you use an LLM to generate it, all that learning doesn’t happen and so it takes individuals and teams much longer to actually understand what’s going on…”
by Matthias Ott in Continuously Merged Value
“If we become this super mega efficient, then maybe we should start adopting 4-day work weeks more and adjusting expectations so people aren’t always working as much.”
by Cassidy Williams in her video An attempt at a balanced perspective on AI
Besides this whole “people are not thinking anymore” discussion, there are other aspects we should not forget.
I hate that nobody cares about the environment and that, in the middle of a global climate crisis, we just build more and more AI infrastructure consuming a crazy amount of energy…
Every product is throwing AI features at me that I don’t need or don’t want to use. But still, I have to pay more, and subscriptions are getting more expensive.
Content and knowledge are everywhere, and it’s becoming increasingly difficult to determine what’s right and what’s wrong.
I don’t want to be this frustrated woman who says that all of this is nonsense. To be honest, in my current role as a mother of a toddler and a baby, it’s really hard for me to keep up. Maybe I am a bit scared of what is going to happen with my job in the coming months and years.
But I just don’t think that the world can build good software without teams of great people who inspire each other and learn from each other. We can’t just constantly pursue the goals of becoming faster, more efficient, richer… without any consequences.
We need to slow down again.
Maybe I am too late to the party, maybe I don’t want to be a part of it at all. And if the party doesn’t stop, I’m sure I’ll find something else to do. 🙂