C++ Needs a New GUI Framework

The landscape of C++ GUI development is painful – native Windows gets third-tier support from Microsoft, and Android actively discourages native APIs. Linux is better off with Qt and GTK, but GTK on Windows is rough. My go-to choice for years has been Qt.

Lately though, it seems Trolltech Nokia Digia The Qt Company has an active dislike of their users. I’ve brought up the idea of Qt at my day job, but word is they won’t cut a deal amenable to our requirements. So we mush on. There’s lots of homebrew garbage out there – especially if you start looking at widget sets built on top of Unity. Hey, why not yet another CSS browser?

In the end – maybe it’s just that the demand for native code isn’t there. Web front-ends are all the rage, and electron apps can do wonders. Why not take a gig of ram for a text editor and chat client – RAM is cheap these days. Still, there’s something hugely missing in development work when you start looking at the interface between C++ and whatever Javascript engine du jour you’ll be running on.

Qt is almost there for so much – unless you want to make money, or distribute an app under a GPLv3-incompatible license or environment. Given Microsoft’s amazing collection of freeware tools, you might expect a license for commercial development to be reasonable. You’d be wrong. The keepers of Qt licensing want $5k+ per developer. The community shouted. They offered a ‘small business’ package for anyone with less than $100k in revenue. The community shouted again. Now they’ve upped that to $250k. Just don’t look at the fine print if you want to distribute embedded works.

What would make a nice GUI library?

  • Some sort of DOM / Canvas model that is intuitive and easily interacts with C++
  • Scripting support with C++ tie-in
  • Stable API that plays well with “standard” C++

Hit those buttons, don’t charge me an arm and a leg, and preferably be open source – GPL plus commercial would be OK by me – and we’ll talk. Maybe it’s time for a Motif comeback. I miss you, X11 days.

Dev Rule #1: Delivery Isn’t a Cure-All

How many programmers view themselves as craftspeople? Of those I know, the vast majority.

When dates are on the line and customers are shouting – many programmers with that mindset will hunker down, work extended hours, and chant the mantra – “Once we ship, everything will be better.” This is a lie.

Best Case Scenario: You hit whatever date the boss or customer wants and deliver exactly what they asked for. In years of development, I’ve yet to see a single deliverable that qualified as exactly what someone wanted – even with extensive documentation and discussion beforehand. Delivery is compromise. The verified features and functionality of the software may meet the standard that was set, but people always want more; it’s why no software is ever “done”. Even if the end product is exactly what they wanted, the bitter taste of angry meetings and phone calls will flavor your delivery like an open container of ice cream left in a freezer of rotten meat and onions. It may still be ice cream, but it tastes like garbage.

Worst Case Scenario: Continual miscommunications force further and further schedule slips and angry calls until the product is cancelled and new “leadership” or (more likely) a new development team is brought in to “Fix Things”. This is, almost universally, the end result of using a poor outsourcing team or contractor. At some point, a manager is gonna realize that doubling down on the steaming pile isn’t gonna work. Don’t be in this situation; it won’t end well.

Delivery won’t necessarily hurt a bad situation, but it’s also not guaranteed to help. Continual communication and clear expectations keep bad situations at bay. Some leadership may simply be toxic – but the vast majority of people in the world simply want to get things done and not look stupid in the process. Delivery only fixes the first part of that, and it can make the second part substantially worse.

An amazing craftsperson doesn’t just make an amazing product – they sell the world that it is amazing. Quality whispers – it doesn’t shout.

Dev Rule #0: All Rules Have Exceptions.

Why is this Rule #0 and not #1?

First, it’s not so much a rule as a disclaimer. You’ll always find an exception to a rule – even this one. To quote a pure cinema classic, they’re “not so much rules as guidelines”. I know enough about angry nerd typing from having done it so many times myself. I’m not going to argue with paragraphs of text justifying the violation of one of my personal rules. Chances are, I’ve violated that rule more often and with more gusto than you anyhow.

Second, we’re programmers – indices start at 0. Unless you’re one of those Matlab folk. But really, Matlab folk should be isolated on an island away from civilized folk anyway. Bonus Unlisted Rule: Don’t take programming advice from any programmer whose primary language is Matlab.

Old Man Js 1: Too Many Tools

As I dive more into web programming in an effort to become stronger at the front-end, I figure I’ll drop some notes for any other enterprising embedded / server programmers wanting to join in.

Plodding along on the internet, I’m rapidly discovering that the choice of libraries seems to expose one to an endless array of different methods of building / compiling your web app. PHP seems much more straightforward in comparison. The first and most confusing element to me was ‘nodejs’ itself.

My backend is all Python, so what’s with requiring this NodeJS Javascript web server? It’s not a web server, it’s a scripting environment. Well, that makes a bit more sense.

Ok, but why do I need a Javascript environment to use these toolkits? Well, the utilities to compile JS are written for that environment.

Wait, I thought JS was interpreted by the browser? True, but you want something to maintain all the dependencies and automate things like minification and creating map files.

Five minutes into reading a basic tutorial on several different frameworks, I’ve already had to pick up new terminology for nodejs / npm. And at this point, I haven’t even started down the alphabet soup of different environments:

  • Yarn vs. npm vs. Bower – OK, we’ve got multiple competing package managers here to get going… and each has its own quirks. Maybe the best answer is to stick with npm, since it came with the environment? Crud, looks like these tutorials use Yarn.
  • Gulp vs. Grunt – OK, so now we start to discover that inside this JS environment are apparently whole new environments just for running tasks… OK, not too much of a problem.
  • Webpack vs. Browserify – Well, these are what I installed this node thing for anyway, aren’t they? What am I getting here?

Annoyingly, each JS developer has their own ‘special sauce’ combination of components that yields something for the back-end developer. The larger the application (and the more third-party utilities one brings in), the more likely it seems one will need to go ‘off script’ from the recommended configurations provided. That doesn’t even begin to raise the sheer number of potential library combinations that may (or may not) be tested.

I’m trying to like this Javascript thing, but it’s really reminding me of the DLL-hell days in Windows.

Dev Rules: Personal Philosophy of a Rogue Software Engineer

Not long ago, a fellow software engineer popped his head into my office to reveal some new daily horror worthy of posting to TheDailyWtf. As usually happens in such situations, my brain ejected a small stream of profanity before I gave into an uncontrollable urge to shake my fist and point out the voluminous reasons this particular example indicated the responsible party should be tossed off the roof of our building. As my face returned to the normal shade of programmer day-glow white, my fellow laughed and said that I should write down my personal development philosophy.

So here goes. Friend – if you are out there – I suspect you will find this good bathroom reading. And if you print it, perhaps useful as well. Just use soft paper.

For everyone else, ignore these posts. They will not make you happier or more productive. I am no Mel, and definitely don’t qualify as a Real Programmer. For #@$* sake, this is a WordPress site complete with crappy PHP stolen from a WP index. Chances are half the server traffic here is Russian command-and-control botnet chatter forcing the latest DoS attack against some GOP website. Worse yet, I wrote this with a WYSIWYG editor, not random SQL queries.

What I’m saying is, don’t take me seriously.

More likely than not, these posts are all written under the power of various prescription drugs in a vain attempt to achieve some sort of sleep while dealing with Chronic Illness. If nothing else, such curses give you more free time. Bad spelling, grammar, and made-up Texanish words be ahead. You’ve been warned.

Social Networks are Hard

I’ve decided to start writing a bit about various theories I have on social networks, Facebook, Twitter, and the blogosphere. I fear that attempting to start an “Open Source” social network, or to join one, is a cause doomed to failure – but I’m not sure why. Back when I started blogging, I was amazed to find a network of real-world people brought together over blogging. The years haven’t been kind to blogging; Facebook and Twitter have slowly pulled users into their clutches.

My early days online were during the time of AOL disks and TV news hours advising against meeting people you talked to online. Meeting an open source contributor or two was as far as I dared advance. Certainly no online dating. I enjoyed reading BBS articles on graphics programming and tinkering with MS-DOS games and utilities. My access was limited and supervised as I was in middle school, and Linux wasn’t happening due to my PC being an old 286.

It took a few years, but I finally managed to scavenge a 486 from trash parts and with the help of NetZero (and a little sneaking around my parents), scored a net connection. Geocities gave me my first web home, and I started my first blog. I wrote posts in a text file and published by running ‘make install’. Staticgen before it was cool.

Key things I remember liking about ‘social networking’ in the days of Geocities and later MySpace:

  • Webrings formed small networks of people with similar interests and cool information.
  • Newsgroups provided amazing access to experts and connections with similar interests.
  • E-Mail was used for more than verifying accounts.

Things I remember sucking:

  • Connecting with real-world friends was generally e-mail only
  • Newsgroups were full of self-styled experts and many weren’t kind to n00bs.
  • Technical barriers were significantly higher than before.
  • Slowwwwwwwww.

I recently made the effort to join Mastodon. I don’t see it replacing Facebook or Twitter for anyone I know. I had hoped to maybe find a viable alternative to Voxer – not so much there either. Indeed, I don’t see much use in it for me as an English speaker, except maybe meeting some interesting folk. Mostly I’m seeing a bunch of young left-wingers, and I don’t have much patience for the college crew today. Indeed, I’d venture a significant number of the people I’ve interacted with weren’t even born when I first got to drive on the internet. Not much more to say, because someone else already wrote up my experience with the problems clearly highlighted.

I don’t care for where Facebook is going these days – and I’d love to see the “next” big thing be something that fulfills some of the early promises of the internet. To me, that means choice of service provider and the ability to contact people from other providers.

I find the underlying technology interesting, and there’s a lot of awesome research potential and algorithmic work possible in this space. I thought I’d try to cover my exploration here on this blog. It’s as good a place as any for these brain dumps, and maybe some Zuckerberg character somewhere can use it to help build something. If you’re that person, cut me in after you make the money, please.

Portable C++: Unpacking integers from binary buffers

As C++ code is so close to the metal, we often make dodgy assumptions that hurt portability. One of the ‘simplest’ problems that I’ve seen repeatedly is packing and unpacking binary data.

The C++ standard works hard to eliminate definitions that would tie us to a particular hardware architecture, and this area invites a desire to throw caution to the wind and make assumptions about exactly what’s going on.

The new college grad (and the old hat who views this as all theoretical anyway) might write:

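A representative sketch of the naive approach (my reconstruction – the original snippet isn’t shown here, so the function name and exact shape are assumptions):

```cpp
#include <cstdint>

// Naive unpack: reinterpret the raw buffer as an int and dereference it.
// Short, "obvious", and riding on several unstated assumptions.
int unpack_naive(const char* buf)
{
    // Assumes the size of int, the buffer's alignment, the host byte
    // order - and violates the strict aliasing rule to boot.
    return *reinterpret_cast<const int*>(buf);
}
```
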
There’s a handful of problems here:

  1. We’re assuming the bit size of ‘int’ – it may be anywhere from 16 to 64 bits on common platforms.
  2. We’re assuming that we’re safe to read a char aligned buffer to an integer.
  3. We’re assuming the buffer is packed with appropriate byte order for our processor.
  4. We’re breaking the strict aliasing rule.

Can we write a new version of the function to take care of these challenges? Well, with a little care:

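A sketch of the careful byte-by-byte version (again a reconstruction, assuming little-endian data on the wire, with the std::uint8_t casts discussed below):

```cpp
#include <cstdint>

// Portable unpack of a little-endian 32-bit value. Reading byte-by-byte
// removes the alignment, aliasing, and int-size assumptions, and the
// shifts pin down the byte order explicitly.
std::uint32_t unpack_le32(const unsigned char* buf)
{
    return  static_cast<std::uint32_t>(static_cast<std::uint8_t>(buf[0]))
          | static_cast<std::uint32_t>(static_cast<std::uint8_t>(buf[1])) << 8
          | static_cast<std::uint32_t>(static_cast<std::uint8_t>(buf[2])) << 16
          | static_cast<std::uint32_t>(static_cast<std::uint8_t>(buf[3])) << 24;
}
```
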
This version was tuned to work with GCC 5 and higher, and it is highly portable – it should operate on any architecture providing 8-bit chars and a 32-bit std::uint32_t. Indeed, the C++ standard’s definitions for conversion to and from std::uint32_t even handle two’s complement arithmetic vs. other representations. Using bit shifts and ORs defines the exact expected behavior of the construction of the 32-bit integer.

And there was much rejoicing… sort of. There’s many a blog post out there that supports this method of formatting.

Now, let’s say that this particular call is fairly performance critical (perhaps we’re doing some pixel or image manipulation – use your imagination). In my application, I was processing large data files. Modifying from the first style to the second fixed issues with ARM portability, but slowed down performance.

Most compilers see the above pattern and recognize – “hey, I can just load a 32-bit word and return, no harm / no foul.” Sadly, Visual C++ does not. No combination of optimization flags and type manipulation gets the optimizer to recognize the pattern. Even GCC is fairly sensitive in situations where it can (hence the std::uint8_t casts throughout). To facilitate portability and performance on all my desired targets, the end result was using std::memcpy to a temporary integer. The ARM compiler happily recognizes we may be accessing unaligned memory, and all the other toolchains optimize away the memcpy to a simple load. Of course, now we’re back to handling byte order again. Ugh!
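
Sketched out, the memcpy approach looks roughly like this (an assumption of the final shape, not my exact production code):

```cpp
#include <cstdint>
#include <cstring>

// memcpy-based unpack: toolchains turn the copy into a single load where
// the target allows it, and ARM compilers cope with unaligned buffers.
// Note the result is in host byte order - foreign-endian data still
// needs an explicit swap afterwards.
std::uint32_t unpack_native32(const unsigned char* buf)
{
    std::uint32_t v;
    std::memcpy(&v, buf, sizeof v);
    return v;
}
```
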

At the end of the day, maybe the grouch has it right – just worry about the processor you’re running on (hopefully just 1). It’s all fun and games until you find yourself porting to that random platform you’d never worry about.

Blinkt! by Pimoroni

Searching for a way to carefully control illumination of a project, I discovered “Blinkt!” by Pimoroni. I was hoping for something that would easily connect to a Raspberry Pi 3 with a minimum of fuss.

The Blinkt! connects straight to the GPIO pins on the Raspberry Pi 3, and can also work on an extension board or off a ribbon cable. A warning: many cases have ‘ribbing’ along the side that will interfere with mounting the Blinkt! directly on the Pi – NONE of my cases were compatible without the ribbon cable.

Pimoroni provides an easy-to-use Python library, which worked directly as advertised on Raspbian. The examples were easy to modify to get started. For my use case, I wanted to create a flashlight with hue control. While these LEDs are very bright on their own, they don’t compare to a higher-power utility light or flashlight – understandable, as they are powered off the Pi. The array was surprisingly bright, but only just barely able to perform the functions I was hoping for.

My goal was to provide carefully controlled lighting for the creation of photographic inputs for a source project. The Blinkt! did its job fairly well.

One can find cheaper LEDs and ‘bigger setups’, but the simplicity and price are hard to beat for someone searching for an item that’s ready to get up and go. For this project, the $8 spent on this board was well worth it.

Changing world of open source

Back when I started college (2001), I remember a world where tech was, well, ‘fun’. I was a die-hard free software guy willing to put up with far too much in the interest of ‘tweaking’ my computer. Running Linux was like driving a custom-made hot rod. I knew every piece of the system and was happy to tweak it all day long. I followed slashdot, freshmeat, linux games, and laughed out loud while reading userfriendly.

Perhaps today I’m simply waxing nostalgic, but damn if I don’t miss those days. I’m not sure where the old ‘hacker’ ethos has gone to, but now that I’ve had some time to settle into a professional career and gain some semblance of free time… well, damned if I didn’t look around and it just feels missing.

I noticed over the years as some sites closed their doors (linuxgames.com) or renamed and then became read-only (freshmeat.net). Out of habit, I’ve still kept tabs on slashdot. Over the years, though, it’s gone from featuring front-page articles about postgresql and IPv6 on FreeBSD to a front page dominated by highly political click-bait trash.

This site used to have a fairly amazing google page rank for multiple subjects – at least before I let it get snatched by a domain squatter and go fallow for years. I hope there are others out there who enjoy hacking in some community somewhere – I’m looking.