date: 2026-03-10 21:47:07
tags: karriere it Java softwareentwicklung rückblick
category: it
creator: Stephan Bösebeck
30 Years in IT – From the C64 to Cloud Monoliths
Someone recently asked me how long I've been "doing stuff with computers." The honest answer: over 30 years. The terrifying answer: it doesn't feel that long. The time between my first POKE 53280,0 on the C64 and today's kubectl get pods is somehow both an eternity and the blink of an eye.
This isn't a "look how great I am" article. It's more of a look back at how this industry has changed, how I've changed with it – and what got left behind along the way. And maybe a bit of a warning: if you go into IT, you should know that change isn't the exception. It's the default state. Because in 30 years, I've learned one thing: every new technology goes through the same cycle – first the hype, then the disillusionment, and eventually it crystallizes what you can actually use it for. This pattern has never changed. The technology is different every time. The pattern isn't.
The Beginnings: Assembler, Sprites, and the Hardware Revolution
It all started with the C64. Like so many others of my generation. First BASIC, then fairly quickly assembler – because BASIC was simply too slow for what I wanted to do. Moving sprites, raster effects, SID sounds. Anyone who programmed assembler on the C64 learned to respect hardware. Every byte counted, every clock cycle mattered.
Then came the Amiga. And with it, an entirely new world: multitasking, 4096 colors, Paula and Agnus as custom chips. Here too, assembler was my language of choice. The 68000 was a dream compared to the 6510 – proper 32-bit registers, a sensible instruction set. I programmed things back then that I can no longer comprehend today. Not because they were genius, but because I have no idea how I managed without a debugger, without Stack Overflow, without Google.
Even back then, one thing was clear: hardware generations were leapfrogging each other. The C64, the Amiga, the Atari ST – each platform was its own little universe. And any of those universes could vanish overnight.
Escom, the PC, and the Pace of Insanity
In 1994, I was working at Escom – the computer discounter that nobody remembers anymore. There I experienced the first truly crazy technology revolution firsthand: PC hardware. I felt like I could upgrade my PC weekly – and it was already outdated again. The generations were tripping over each other: 286, 386, 486. 33 MHz, 66 MHz, DX/2, DX/4, Pentium, Pentium II. And the processor wars: AMD versus Intel – ding ding, knockout AMD. (That they're back now and making Intel wobble, nobody would have predicted.)
In parallel, the first great battle of operating systems began. DOS, DR-DOS, Linux, later Windows. And then the big showdown: Windows 95 versus IBM OS/2. OS/2 was, in my opinion, the clearly superior operating system. It lost anyway – because of IBM's foolishness, if you ask me. Sometimes the better product doesn't win – a pattern I would encounter many more times.
At Escom, I also got a job offer that I turned down: drop out of university and join a company that wanted to create online games. They talked about "hundreds or even thousands" of players playing simultaneously online. I simply couldn't imagine how that would work. Aside from the fact that I absolutely wanted to finish my degree so I'd have something to show for it. Whether that company later became a major MMO provider? No idea. But the concept was simply absurd to me at the time.
University, the MFC Shock, and the Online Revolution
The switch to the PC was inevitable. The Amiga was slowly dying, and the PC world was exploding. So, C and C++. That was fine – low-level programming, pointer arithmetic, manual memory management. Nothing you didn't already know from assembler, just at a higher abstraction level.
And then came the moment that changed my career. In the mid-90s, during university, I wanted to learn the Microsoft Foundation Classes (MFC). THAT was the way to build Windows applications at the time. So I looked at the source code.
I was horrified.
Not because it was complex – I knew complexity. But because it felt wrong. Ugly. Contorted. A framework that worked against the language rather than with it. That was the moment I decided: No. Not me. Not Windows. Not MFC.
Instead: Linux and Java. And in retrospect, that was one of the best decisions of my career.
During university, I also experienced the birth of the internet firsthand. My first steps online were dial-up into BBS systems at 300 baud. Later 2400 baud – that was the future. You can't imagine that today. Then Datex-P, Telnet, FTP, Finger, and Gopher as a precursor to the web. And I remember looking at the first HTML pages and thinking: "This will never catch on. Way too graphics-heavy, takes too long to load."
My god, was I wrong.
Today everyone takes WiFi for granted. Man – what we would have given for that in the 80s.
The Wild Years: Consulting, Hypes, and 300 Days on the Road
While still at university, I started working professionally. "SBC – Stephan Bösebeck Consulting" sounded more serious than "student who knows Java." From Escom, I had slid into the consulting track.
The freelancing years were intense. Up to 300 days a year on the road. I collected certifications like other people collect stamps: Certified Java Trainer, IBM Certified Trainer, and what felt like a hundred more. Not out of collector's obsession, but because as a solo freelancer, you needed certificates just to get into projects.
The projects themselves were often small or tiny. But the names were big: HP, IBM, Sun. Plus training sessions for education companies like Integrata. I was simultaneously developer, trainer, and actual consultant – one who didn't just push PowerPoints but provided genuine technical advice.
This period also coincided with the peak of tech hypes. When XML appeared, suddenly everyone wanted to put angle brackets everywhere. It was the buzzword that hardly anyone truly understood, but that produced the most absurd outcomes: XML databases, for example. I remember a salesman trying to convince the CEO of a company I was consulting for with the words: "Then you'll have ISDN and XML." That had absolutely nothing to do with the product. And what's it even supposed to mean when an online platform "can do XML"? Complete nonsense. Thankfully, the deal fell through.
Or PDAs. The Palm Pilot – I thought it was brilliant at the time. Everyone had one, everyone wanted one. I even wrote an open-source Java driver to synchronize my Linux calendar with the Palm Pilot. My beautiful driver became obsolete when the functionality migrated entirely into phones – first the Nokia Navigator, then that Windows-based thing from Telekom, and finally the iPhone. But more on that later.
At the same time, during university, we founded a loose partnership with friends: designer, IT admin, software developer, sales. Together, we built the tech for an internet cafΓ© β including a billing system for pool tables and internet usage. I had to write that in Perl because the machines were too slow for Java and couldn't calculate the times properly. I even found bugs in Perl and reported them to the community. You don't forget projects like that.
And then the browser wars: Netscape versus Microsoft. Microsoft tried to crush everything with money. I didn't like that approach then, but that's how their business model worked. Netscape went down – for real. And after that, Microsoft versus Sun with Java versus C#. Another uneven fight. Sun ultimately lost, because now C# is everywhere. Java stagnated for ages, development ground to a halt. Then Oracle came along – and things started moving again. You don't have to like Oracle, but they revived Java.
Those years were full of interesting anecdotes, some of them rather bizarre. Like the guy who wanted to hire me to set up an online video platform (wanted to buy hardware in the five-figure range from me), but then bailed because he wasn't allowed to run porn sites in Germany. He moved to the Netherlands and I was left empty-handed.
And the bizarre story of when I wrote software for some army (Yugoslavia? Serbia? I can't remember anymore) to store body measurements for uniform tailoring... If I'd mixed up the numbers, that probably wouldn't have ended well for me.
And the story of the customer who found something to "improve" with every release, and unerringly picked something that would cost enormous amounts of time. Until we started deliberately planting small, somewhat obvious bugs. The customer would spot them, and we'd say: "Ohhh... changing the color of that button is going to be tricky..." – "I don't care, just do it." And of course we did. The customer was happy because he could "improve" something, and so were we, because we weren't being kept busy with unnecessary work.
I believe that's also where I built a "fake progress bar." The customer complained that the SQL queries took too long. Back then everything was a bit slower, and yes, they took time – but after five seconds he'd be on the phone complaining to me. So I added a fake progress bar for every SQL query. It would slowly creep toward 100% but never actually reach it on its own; only when the result arrived did it jump to 100. Then one day I got a call saying a query had been running for "40 minutes and still isn't at 100%" – I had a NullPointerException. Quickly fixed, and once again a happy customer...
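The trick can be captured in a few lines. This is a minimal Java sketch of the idea, not the original code – the class name, the 5% step, and the timer-driven `tick()` are my illustrative assumptions: each tick advances the bar by a fraction of the remaining distance, so it keeps creeping but only reaches 100 when the real result arrives.

```java
// Minimal sketch of the "fake progress bar" idea described above.
// All names and numbers here are illustrative, not the original code.
class FakeProgressBar {
    private double progress = 0.0; // 0.0 .. 100.0
    private boolean done = false;

    // Called periodically (e.g. by a UI timer) while the query runs.
    // Each tick covers 5% of the remaining distance to 100, so the bar
    // creeps forward forever but never arrives on its own.
    void tick() {
        if (!done) {
            progress += (100.0 - progress) * 0.05;
        }
    }

    // Called once the SQL result actually arrives: jump straight to 100.
    void finish() {
        done = true;
        progress = 100.0;
    }

    double getProgress() {
        return progress;
    }
}
```

Harmless on the surface – until, as in the story, an exception keeps `finish()` from ever being called and the bar creeps along for 40 minutes.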
And then there were the infuriating stories, like the doctor's office that in the late 90s was kitted out with an AS/400 host and serial terminals, and had to pay tens of thousands of Deutschmarks for it. It went to court, and I got to testify. They were completely taken for a ride.
There were so many bad actors in my "guild," it was creepy. I once jokingly said to a friend: "I could have done every tender that came in for half the price and still gotten rich." But undercutting like that would have been unprofessional – and you don't land contracts that way.
Around 2000, I successfully completed my degree. My time in Passau also ended in the early 2000s, and not exactly pleasantly – the partnership we'd founded with friends fell apart in very ugly fashion, eventually with the help of the courts. I had to testify as a witness and got to expose some of my friends as liars. Not a good time.
I then moved to Munich. At some point it had simply stopped making sense to travel to Munich for every project – the companies eventually stopped covering hotel costs. I probably should have seen the writing on the wall...
The Crash: The Dotcom Bubble and the First Major Disruption
Then the dotcom bubble burst. For me as a solo freelancer, that meant: 80 percent revenue loss. Overnight. The projects were gone, budgets frozen, and suddenly "freelancer" was no longer a cool lifestyle label but a risk factor.
This was the first truly destructive disruption that hit me personally. Not a technology that changed – but an entire business model that imploded overnight. The tech was still there, the skills were still there. But the contracts weren't.
I had to reorient fast. Unfortunately (or fortunately), the tax office had absolutely no qualms about making it clear that even without revenue, I still had to pay my taxes. It nearly ruined me. There were times when I could either eat or spend the money on gas to drive to a client. That was an educational time.
During this period, I also had a car problem that you almost can't imagine. I had a Toyota Corolla – and I'd gotten a lemon. In the two years I had it, it was in the shop at least 15 times. Problem: the garage was in Passau and they had to handle the warranty work there. So I had few contracts and a broken car that I absolutely didn't want to keep past the warranty – I never could have afforded that. On top of the already non-existent revenue, I now had the losses from selling this lemon. That was a low point.
Softlab, VMware, the iPhone, and the Vista Shock
I had just moved to Munich and since I had no choice anyway (dotcom bubble burst), I got myself a permanent position. At Softlab – a BMW subsidiary at the time. The culture shock was interesting: the concept of "vacation" was new to me. Not working and still getting paid? As a freelancer, unthinkable. I had to learn that first.
At Softlab, I was first a software developer, then Technical Project Lead. Two years, various projects for BMW, MAN, and others. Solid work, good team, learned a lot about working in larger structures.
At Softlab, I first came into real contact with the types who saw themselves as something better because they were "architects." The conversations were accordingly lofty. They were seriously convinced that the next disruptive technology would be Enterprise JavaBeans, because with EJB you could "just plug software together from beans." Which didn't work at all. This "programming by skeleton," and later the whole UML hype, looked great on paper but simply couldn't keep up in reality. Way too cumbersome. And EJB was "disruptive" for a different reason: many jumped on the EJB bandwagon, but the performance was never there. All those beans and all that technology were hard to use and surely drove some companies to ruin – and gave the good old IBM hosts like the AS/400 a far too long lifespan.
This period also saw the beginning of a quiet revolution: VMware brought the first truly usable virtualization software to market. We didn't quite know what to actually do with it at first. Multiple operating systems on one machine? Sounded like a gimmick. Today virtualization is the foundation of everything – cloud, containers, all of modern infrastructure. But back then? Back then it was a nice gadget.
And then something happened that turned the industry on its head: Apple released the iPhone. The first cell phone I got was still a Nokia – writing texts on a number pad, and you were reachable. Everywhere! Revolution! But the iPhone? I passed on it. No keyboard – how are you supposed to work with that? What an idiot I was. Less than four years later, there wasn't a single relevant phone with a keyboard left on the market. That is the definition of disruption: when the technology you consider indispensable simply vanishes – and nobody misses it.
Around the same time, Microsoft had pulled off a special feat: Windows Vista. And Apple had finally built a usable operating system. I'd always found Apple interesting as a company, but before Mac OS X, it was barely usable for a techie. Preemptive multitasking? Non-existent. Classic Mac OS was frighteningly behind the times. But with OS X, they suddenly had a Unix with a stable, beautiful interface. No more endless tinkering to get the desktop working or keep it running – anyone who ran Linux on the desktop knows what I'm talking about.
Vista gave me the decisive push: I switched. MacBook, and then an iPhone after all. And it was fantastic. Windows had suddenly become unnecessary. The Mac was a Unix with a finally stable interface. The iPhone, a computer in your pocket with a real browser. Both together changed everything – how we build software, how we think about interfaces, how we communicate. Perhaps the most important disruption of my entire career.
While I was waiting for my Mac to arrive, I overheard a conversation between two colleagues in the break room. Both had coincidentally bought the same new gaming PC – expensive, high-end. The conversation went roughly like this:
"It's super fast, I love it." – "Yeah, exactly. Everything runs great." – "Yeah... does it happen to you too that you can't play audio CDs?" – "Yeah, but that's no problem, I've got another PC." – "Right. Do you get blue screens when accessing DVDs?" – (laughs) "Yeah, me too. But the PC is great." – "And I couldn't boot Linux." – "Yeah, you don't need that anyway. But the PC is great."
Then came more variations: A says the PC crashes during operation X. B responds: "Yeah, me too – but it's great." At some point I stepped in and asked whether I was only noticing this because I'd just ordered a Mac, or whether they'd both lost their minds. A PC where nothing actually works properly – but it's great. For me, that pretty much summed up the state of Windows.
The funny part: I also recommended the Mac to some of my PC admin clients. Without exception, all of them stopped calling. Most because they actually switched to Mac and simply didn't need my help anymore. And a few because they were apparently so attached to their Windows blue screens that after my Mac recommendation, they no longer considered me capable – or worthy. Either way: losing clients through advice that was too good – that takes some doing.
The Manager Detour: Acronis and the Wide World
Then I was more or less headhunted. Thanks to my training experience, I landed as "Manager Training & Certification EMEA" at Acronis. A completely different world: I was responsible for the entire EMEA region, traveled extensively, made it as far as Singapore. Exciting? Absolutely.
But – and this is a big but – I'm not a pure manager. I missed the tech. Meetings, budgets, and strategy papers are important, but they don't replace the feeling of solving a problem with code. After about two years, I knew: I have to get back to tech.
This was also when I had my motorcycle accident, which among other things led me to switch to split keyboards. But that's a different story.
Back in Code: HolidayInsider and the Client-Server Pendulum
So: from manager back to Senior Software Developer. At HolidayInsider AG. That was a deliberate decision and one of the best of my career.
HolidayInsider was pure startup feeling. We started with five people in development and ended up with almost 20, including working students. Five years in which I learned an incredible amount. Not just technically, but also about how to build a team, how to introduce processes without losing agility, and how to deal with extreme growth.
Technically, I observed a pendulum during this time that I already knew well: the eternal back-and-forth between client and server. Fat clients, thin clients, poor clients, rich clients. Local compute, central compute. Over 30 years, this pendulum has swung back and forth at least ten times: "We need to provide everything on the server, the client just displays it. Server resources are cheaper!" – Two years later: "The clients have 100 TFLOPS of unused resources, we need to leverage that! Local is great!" – And back again. Always the same thing. Just with different buzzwords.
Then came the acquisition by HRS. And with it, the culture clash that everyone knows who has ever experienced a startup being swallowed by a corporation. Different speeds, different priorities, different ideas about how software should be developed. It didn't go well.
The Difficult Phase: SimpleSystems
After HolidayInsider, I moved to SimpleSystems. Also a startup, also with the ambition to build something. Unfortunately, it didn't work out with the boss. We simply didn't understand each other β different expectations, different communication styles. It was stressful and unpleasant.
What weighed on me most: some developers who had followed me from HolidayInsider also had a tough time because of it and eventually quit as well. You carry that with you. As someone who pulled others along, you feel responsible β even though everyone ultimately made their own decision.
Not every stop in a career is a highlight. Some are lessons. SimpleSystems was one.
Finding My Place: GBI Genios, the Monolith, and the Container World
Then came GBI Genios. And I've been here for over eight years now β the longest stop in my entire career.
GBI Genios Deutsche Wirtschaftsdatenbank is a joint venture of FAZ and Handelsblatt. We have over a billion documents in our index – press publications, trade journals, business information – all searchable in fractions of a second. On top of that, we have data from the Bundesanzeiger (Federal Gazette) and other business sources. What makes it special: press and trade articles are tagged with company data, so you can link coverage with business information – hence "Wirtschaftsdatenbank" (business database). Customers can buy individual articles instead of subscribing to entire magazines. Around this, we've built various business cases, for example the FAZ archive pages.
I serve here as "Head of Engineering & IT" – essentially the CTO, in plain terms. And for the first time in my career, I feel like both worlds come together: the management experience from the Acronis days and the technical depth from all those years as a developer.
The technical reality is, let's say... challenging. We're working with a monolith still running on JDK 1.8 and Grails 3. The modernization – breaking it into microservices, updating to current Java versions, making everything more maintainable – is a marathon, not a sprint. But I actually find tasks like these more exciting now than greenfield development. Rebuilding a running engine in flight – that has its own appeal.
And while we're modernizing the monolith, the next revolution is raging outside: Docker, Kubernetes, containers in general. Docker takes virtualization further, with containers and their "description" in Dockerfiles. For me personally: completely unnecessary. I can operate Linux, I can set up an LXC container automatically – Docker is an additional layer for me, a black box, if you will. But I understand the reason for the hype: convenience. That's fine, even if overrated in my opinion. Kubernetes takes orchestration to the extreme. It all has its place – even if I can't quite warm up to it. A personal shortcoming, I suppose.
But that's exactly the point: when you've observed this long enough, you develop a certain equanimity toward the latest hot trend. Not cynicism – I'm still curious, still capable of enthusiasm. But you learn to distinguish between genuine progress and the next hype cycle. And you have to be honest enough to say: this isn't for me – without claiming it's bad.
On the side, I also maintain my own open-source projects: Morphium, a MongoDB driver and ORM for Java that we also use in production at Genios. And jblog2, the blog software this article is running on. Projects like these keep me technically sharp – but more on that in a separate article.
Right in the Middle of the Next Revolution: AI
I honestly thought I wouldn't experience another truly destructive shift. No more upheavals like the transition from offline to online, or from PCs to smartphones. That was naive.
Because here I am, right in the middle of it again. The AI revolution is the real deal: destructive, it will change many business fields and render some obsolete. If you're not paying attention, you'll get left behind. I've been through this a few times now. It doesn't leave me cold, but I'm also not as worried as some others might be. Over the years, you develop a sense for where real change is happening and where it's just the next hype cycle running its course. With AI, I'm fairly certain: this isn't a hype that will disappear. It's fundamentally changing how we develop software, how we process information, how we work.
But – and I say this as someone who survived 300-baud modems, the dotcom crash, the death of Sun Microsystems, and Windows Vista – there will be a hype peak here too, followed by disillusionment, followed by the moment when it crystallizes what AI can really do and what it can't. This pattern hasn't changed in 30 years.
What Remains
When I look back at 30 years, I see: hardware generations leapfrogging each other. Operating systems battling it out. Browser wars. The birth of the internet. The mobile age. Cloud computing. And now AI. Each one of them turned the industry upside down, destroyed companies and created new ones. Assembler, C, C++, Java, JavaScript, Go, Rust, Kotlin. Client-server, web, SOA, microservices, serverless. Waterfall, Agile, DevOps, Platform Engineering. The pattern is always the same: a technology arrives, gets hyped, matures, becomes the standard, becomes boring, gets replaced.
What endures? The fundamentals. Algorithmic thinking. The ability to decompose a problem. Understanding what's happening under the hood β even if you no longer need to count every clock cycle. And above all: the willingness to keep learning new things. If you stand still in this industry, you fall behind. Not in months, but in years.
I have the feeling that my generation – Gen X – was shaped differently by all of this. We've been through so many revolutions that very little fazes us anymore. Including various apocalypses – the current one with climate change is roughly my fifth. That's not to trivialize it. But it might explain why we sometimes react more calmly than younger generations expect.
I've seen technologies come and go. I've seen companies come and go – Escom, Sun Microsystems, even parts of the Amiga ecosystem. What survives aren't the technologies. It's the people who can adapt.
30 years in IT, and I'm still here. Not because everything was always great – SimpleSystems wasn't, the dotcom bubble wasn't, and the monolith at Genios isn't always either. But because this industry has something I don't find anywhere else: the feeling of being able to build something every day. Something that wasn't there before. Something that works.
Or sometimes doesn't work. But that's what logs are for.
In the next article, I'll write about how to stay technically sharp as an engineering lead – and why that doesn't happen by itself.