Exploding Complexity

I’ve been thinking a lot lately about embedded Linux, Yocto, and what constitutes an “embedded” system. My first PC was an 8086 clone with two 5¼-inch floppy drives and no hard drive. I had a small collection of disks I’d swap out for my projects.

I remember running Linux on an old 386 with 4 MB of RAM. I browsed the web on a 486 sporting 8 MB. In more recent memory, I ran a few machines with 512 MB to 1 GB of RAM that managed to browse the web and accomplish most of what I do on a day-to-day basis: editing code and writing.

What exactly are programs doing today that they sit eating gobs of RAM? The smallest program running on my desktop right now is the new Windows 10 terminal application, and THAT still takes up 10+ MB of RAM. The argument is “they do so much more,” but I don’t buy it. Software has jumped the shark.

I stumbled upon the people at suckless.org recently. While I don’t share their love of quirky tiled user interfaces, I do find myself looking over their lists of software and thinking: why can’t Linux run halfway decently on Windows XP-era hardware? It’s been a long, long time since I heard about restoring older PCs to service by switching operating systems.

Professionally, one of the larger code bases I deal with is still in C, and I find myself yearning for C++ as I stare at piles and piles of copy-pasted, poorly maintained crap. That said, as I look over the software landscape, I’m not sure it’s language choice that matters. Old C code generally has corners of repeated if(error) checks, switch-statement or function-pointer-based polymorphism, and the occasional macro-based non-template (or worse, magic with untyped pointers). We can use C++ to good effect to reduce, eliminate, or clarify these situations. But instead of enabling cleaner code, we create brand new realms of further complexity, with undebuggable template metaprograms and spiraling exception-handling paths. At least in C it’s obvious if you didn’t check a malloc return; C++ happily throws std::bad_alloc and you’re off to the complexity farm. Bring in the ever newer, happier cohorts of Perl and Java, and now Python, JavaScript, and extended Java in the form of C#. Adding extra gas to the fire are package managers like NuGet, pip, and npm.
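To make that last point concrete, here’s a minimal sketch in C++ (hypothetical code, not from any codebase I’ve mentioned) contrasting the two allocation-failure styles:

    #include <cstdio>
    #include <cstdlib>
    #include <new>

    // C style: the failure path is visible right at the call site.
    void c_style(std::size_t n) {
        char *buf = static_cast<char *>(std::malloc(n));
        if (!buf) {  // skip this check and at least the bug stays local
            std::fputs("out of memory\n", stderr);
            return;
        }
        /* ... use buf ... */
        std::free(buf);
    }

    // C++ style: operator new throws std::bad_alloc, so the failure
    // surfaces at whichever handler happens to be nearest, possibly
    // several abstraction layers away from the allocation itself.
    void cpp_style(std::size_t n) {
        try {
            char *buf = new char[n];
            /* ... use buf ... */
            delete[] buf;
        } catch (const std::bad_alloc &) {
            std::fputs("out of memory\n", stderr);
        }
    }

Neither version is wrong, but in the C version the error handling is lexically attached to the allocation; in the C++ version you have to reason about every call path that can reach the handler.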

All of this is to “manage” the complexity of software. And yet, at the end of the day, computers today are still driven by keyboard and mouse, or something even simpler, a touchscreen. The actual complexity (high-bandwidth modems, expanded security concerns, increased screen resolutions) could easily be dealt with using our old C/Pascal-land primitives, or already is.

As I sit staring at a Yocto build screen that takes hours upon hours to finish compiling, I’m starting to form a new and strong opinion on software complexity:

  1. Design failures in any given abstraction geometrically increase the complexity of every successive layer that fails to address them.
  2. The ability to complicate software scales linearly with the productivity gain of an abstraction.

All hope is not lost, though: developers can choose to manage complexity and ease its impact with the same tools that allow that complexity to exist in the first place. Unfortunately, I don’t think this is happening. Indeed, the combination of opinions 1 and 2, along with the current state of code quality (which seems to be an industry-wide pandemic), points to a final concluding observation:

Applying improved-productivity technology to legacy software conceals the underlying design flaws of the system.

I feel Linux in general is reaching an odd ‘critical mass,’ where the ancient discussions of system flaws and design needs are catching up with us. Meanwhile, I’m starting to see the light of long-term colleagues who hate C++. Going from C to C++ now seems like going from digging with a shovel to digging with a bobcat. Sadly, I don’t think C++ development generally brings the right level of discipline to handle the mess.
