I’m an old fart, technologically speaking. The first computer I ever got to use was a desk-sized monstrosity which was programmed in hex codes using a numeric keypad and displayed the results on paper tape. It had a massive 4K of RAM. Despite this incredible limitation, we were able to make it do useful work.

That was way back in 1976. The PC didn’t exist yet. Even the original Apple-1 was just appearing. The few arcades were filled with purely mechanical pinball machines. Even "Space Wars," the vector graphics arcade game, hadn’t made its way into arcades yet. Magazines like Popular Science published articles about cool kit computers you could assemble yourself. It was really the dawn of computers as we know them today.

Having used computers for this long gives one a lot of perspective on the current state of computer technology. I was thinking about all of this the other day and realized we did most of the same things in the early days of computers that we do now. In fact, I used a graphics-based word processor with multiple fonts on a hacked-up TRS-80 Color Computer running OS-9 back in late 1982. I ran spreadsheets. I played games. I logged onto CompuServe as a gateway into the text-based Internet before most modern ISPs or the World Wide Web existed. I did all of that in only 64KB of RAM. Yes, it was slower. But it worked.

So, why am I sitting here typing on a machine with 12GB of RAM now and noticing the memory is already half full? I have 187,500 times as much memory on this machine as I had then and somehow I’m still in the position of sitting here wondering if I have enough. Granted, I have two browsers running with lots of tabs open, Windows Mail, Windows Live Writer, Vista Sidebar with half a dozen gadgets, and a couple of smaller programs all running at once, but holy crap. How did those things eat up half of my RAM when I have 187,500 times as much RAM? In those days, I wrote a lot. I programmed. I emailed. I worked on spreadsheets. I chatted online. I played games. The basic tasks I do haven’t changed all that much over the years.
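For what it’s worth, the 187,500 figure checks out if you count in decimal units (1KB = 1,000 bytes), which is how I’m counting here. A quick sanity check:

```python
# Ratio of RAM then vs. now, in decimal units (1KB = 1,000 bytes,
# 1GB = 1,000,000,000 bytes), matching the way the text counts.
old_ram = 64 * 1000          # 64KB on the OS-9 machine
new_ram = 12 * 1000**3       # 12GB on the modern machine
print(new_ram // old_ram)    # 187500
```

(In binary units, 12GiB divided by 64KiB is actually 196,608, but the round decimal number makes the point just as well.)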

Yes, the applications have become fancier and more capable, but are they 187,500 times as fancy and capable? What has changed about those programs that requires so much RAM? True, rendering imagery is RAM intensive, but even at 2560×1680, that’s only about 16MB of RAM to bitmap my entire screen display at 32 bits per pixel. Each of my graphics cards has 1GB of RAM to hold that display, so there should be plenty of overkill there and it shouldn’t be eating up my RAM. So, if it’s not the display eating up my RAM, what could it be?
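To put a number on that framebuffer claim, using the resolution quoted above:

```python
# Size of one full-screen bitmap at the resolution mentioned above.
width, height = 2560, 1680
bytes_per_pixel = 4                      # 32 bits per pixel
framebuffer = width * height * bytes_per_pixel
print(framebuffer)                       # 17203200 -- about 17MB
```

That’s a hair over 16MB, but the same ballpark either way, and a tiny fraction of 12GB.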

I guess the real question I’m asking is, what additional functions are being performed on this machine that weren’t being performed in my 64KB machine booting OS-9 back then? And why are those additional functions using 187,500 times as much RAM?

It’s not just the multi-tasking, because OS-9 was a multi-tasking OS, too. I often ran a few programs at once. It’s not just the fact that I’m running a lot more applications and services at once now, either. Even if I ran 1,000 applications at once on that old system, I still would have been far under the 187,500 times as much RAM I’m using now. So, what’s going on here?

Well, for one thing, each computer operation and data chunk is bigger. The computer I was using back then had a 16 bit processor. A few years later, processors advanced to 32 bit (double the size). Now, I’m running a 64 bit CPU and OS. So, that quadruples the RAM size of running the same sequence of instructions. I’ll oversimplify it a little to explain what I mean. If it took 200 operations to complete a process back then, let’s say the sequence took up 200 x 2 bytes (16 bits), which was 400 bytes. Under 32 bit, the same 200 operations took up 200 x 4 bytes (32 bits), or 800 bytes. Under 64 bit, it would expand to 200 x 8 bytes, or 1,600 bytes. So, the same 200 operations went from 400 to 1,600 bytes in size. Multiply this out for programs with hundreds of thousands of operations and the difference becomes big.
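The oversimplified scaling above is easy to tabulate. Bear in mind this is just the toy model from the paragraph; real instruction encodings (variable-length x86-64 instructions, for instance) don’t actually grow linearly with word size.

```python
# Toy model from the paragraph above: the same 200 operations
# encoded at each word size. Real instruction sets don't scale
# this cleanly; this only illustrates the oversimplification.
ops = 200
for bits in (16, 32, 64):
    print(bits, "bit:", ops * (bits // 8), "bytes")
```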

But quadrupling the program code size still doesn’t come close to explaining a difference of 187,500 times as much. Given the same 64KB of original space, that should only mean you would need 256KB to move to 64 bit computing with that same OS, right? And remember, we’re still talking about KB here. We haven’t even made it up to MB yet in our discussion, much less the many GB available in modern PCs. The main people really aware of this massive code-bloat trend are folks like me who have used computers at home for decades, people who also use a few of the leaner operating systems now, and computer programmers. Considering that until a few years ago I fell into all of those categories (I only do non-profit web programming now), it is trebly apparent to me that something is going drastically wrong somewhere.

Perhaps it’s the data? Well, yeah, some data is truly huge. Video files. Image files. Music. Databases. Our data files have definitely grown much larger over the years, and that is a legitimate reason to need more RAM. BUT – even if you load a program like Word into memory without opening ANY data files, it still takes up massive amounts of RAM. In fact, just starting the computer up in the operating system uses up a huge chunk of RAM. That’s without running any programs at all. The OS is one of the biggest code-bloat offenders.

Back in the old days (LOL), OS-9 ran the entire operating system in perhaps 20-30KB of RAM, leaving up to 2/3 of my 64KB for my programs and data. By the time Windows Vista 64 Ultimate finally gets me to the desktop, my RAM display shows that 20% of my 12GB of RAM is already used. That’s 2.4GB being taken away by the OS. We’re talking gigabytes here. That’s 120,000 times as much RAM as OS-9 used! Granted, some of that is being used for cache. And yes, a graphical user interface uses more RAM, but holy crap, Batman! Even if half of that amount is cache and GUI functions, that’s still 60,000 times as much RAM for the OS. Does it do 60,000 times as much? I don’t think so. It does maybe 20 or even 100 times as much, but not 60,000. That’s just nuts.
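The OS footprint comparison works out as claimed, again counting in decimal units and taking the low end of the OS-9 range:

```python
# OS RAM footprint then vs. now, using the figures from the text.
os9_ram = 20 * 1000                    # ~20KB for OS-9 (low end of range)
vista_ram = 12 * 1000**3 * 20 // 100   # 20% of 12GB at the desktop = 2.4GB
print(vista_ram // os9_ram)            # 120000
```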

So what do I think is going on?

To be honest, I think programmers don’t care about craftsmanship in coding anymore. They know RAM space is relatively cheap, so they don’t pay any attention to how much they use. They don’t spend time designing the code to be compact and efficient. They don’t work to break the code up into smaller reusable pieces in order to avoid duplication of functionality. They keep devising new ways to do things, but they also retain the full code from all of the old ways within the OS, just in case somebody doesn’t want to do it the new way. They layer on feature after feature without reusing or adjusting any of the stuff previously written. The code turns into what I call "a rotten onion." Something with lots of layers that nobody wants to touch.

Before you think I’m just pounding on programmers, let me tell you why they don’t care. Managers force it on them. The buck really does stop there. It all trickles down from upper management who have an accountant on one shoulder and a marketing head on the other. These two devils are constantly whispering enticing projections of the profits they could make by pushing up release dates, maintaining backward compatibility, and reducing costs. Dreams of large bonuses drive management to ride the programmers until they have to cut corners in every way possible in order to meet unrealistic deadlines. This means programmers NEVER get the opportunity to revisit or eliminate old code. They don’t get proper time to design and craft their new code. They barely even have time to write the code.

In other words, programmers don’t care because they don’t have time to care. They’re already working far more hours than they want. There is always a fast approaching deadline they have to meet, so they add more layers of code and pray the existing code doesn’t break. When the existing code does break, it takes forever to find the problem because there is layer after layer of crap piled on other crap. It’s a nightmare to sift through, particularly when you didn’t write any of it. Eventually, after numerous development cycles and patches, you end up with a vast mess of code which is held together with string and duct tape and is at least 500 times larger than it needs to be in order to perform the functions it does. You end up with code-bloat. Function creep, backward compatibility, and poor management are directly to blame for the bloated masses of buggy code we are now running in our machines.

Does it ever stop?  Yes, but it’s never pretty or pleasant when it does.

Like any continuously growing object, there comes a time when an OS or program collapses under its own weight. Eventually, the massive bloated mess becomes so buggy that people start avoiding it in droves. Sales plummet. Programmer scapegoats are found. Heads roll. Then, they start over. Smart companies start development on a replacement before the bloated mess gets completely out of control. For example, by the time the original Mac OS was starting to fall apart under its own weight, Apple had a nice, lean, well-engineered replacement waiting on a shelf in the form of OS X. Sure, it had some bugs initially (like any software), but it was a solid foundation to rebuild on. It was also a painful transition for users, because it wasn’t compatible with older stuff, but it was the right thing to do, and kudos to Apple for doing it. If only Microsoft had the balls to do the same.

When your OS or software finally does collapse, I guess the trick is to learn from the previous bloat cycle. When you start up your replacement, design new processes, as well. Create ways to make smaller portions completely replaceable. Change practices so that each of the parts is evaluated and rewritten in an ongoing fashion. Put standards in place to make debugging easier down the road.

But most importantly, give the programmers the opportunity to bring craftsmanship back to software development. Give them the time to produce something of which they can be proud. Allow them to engineer a set of code which is leaner, more modular, and less resource intensive. Reward them when they do. Set the planned features in stone prior to development. Don’t ask them to hit a moving target. There are lots of things that can be done to reduce software bloat, but it all has to come from the management at software development companies. The programmers are completely at their mercy.

Wouldn’t it be great to hear about a new word processor that does everything Word does and takes up 95% less RAM? Or what about an OS that only uses 10MB of your RAM, leaving the rest usable by your software and data? The reality is, good programmers can create immense amounts of functionality in only 10MB of RAM, if they are given a clean slate and the time to do it right. If you don’t believe me, just look at what they had running in under 20KB in the early days of personal computers. The word processor I used then could easily do 85% of what OpenOffice does now. Thinking about it now, I’m just amazed at what we did in those days with such limited hardware. Even the Gemini Guidance Computer had less RAM and CPU power than a modern digital wristwatch.

The bottom line? If we’re going to be forced to buy more RAM, it should be for the right reasons. It should be because our data is larger, we want to run more tasks, or we need more speed while running more tasks. It shouldn’t be because the OS and software we use on our systems have expanded far beyond the point where they should have been killed off and replaced. Let’s start to recognize efficient code when we see it and reward those who write it. Just say "No!" to code bloat. Let’s throw out the rotten onions of the software world.

