
Moore’s Law Deemed Hazardous

The advent of digital computers has brought a seemingly unending supply of innovations. Broadband communication to our homes and portable computers ranging from laptops to pocket-sized computers (aka, smartphones) make working anywhere 24/7 viable. This innovation also makes petabytes worth of cat videos available to most of the planet, but that is a topic for another post.

This abundance of computing power is creating slop. Slop in the process and slop in the code. Why? Because a compile-and-test cycle costs us nothing more than a few keystrokes. Most of us couldn’t finish a sip of Mt. Dew in the time it takes to build the code and run the test cases.

Our desktop computing capacity far exceeds that of the mainframes of yesteryear. A machine with 8–16GB of RAM along with tons of storage is standard fare and easily fits into a briefcase or backpack. I mean, why the hell would you even bother checking the code? Yes, I mean a walk-through, either paired-programming style or actually printing it and getting a second set of eyes on it. This rush to deliver business functionality seems to be fueled by Moore’s law. Faster computers lead to greater demand for features, which in turn drives the creation of even faster computers. It is a vicious cycle, unless you are inside the cycle making money along the way.

Back in the days of white shirts, black ties, and pocket protectors, a guy named Frederick Brooks was a project manager for the System/360 project at IBM. For those of you wearing skinny jeans, OS/360 is the foundation for what still runs on zSeries mainframes today. In his book The Mythical Man-Month, which I strongly recommend you read, Brooks describes the cultural environment and requirements for developing software in the 1960s. There was no quick keystroke to compile your code. You walked down the hall and presented your card deck like an ancient sacrificial offering to the computer room for processing. But only on your assigned compile day. Maybe, if you were lucky, the code came back clean in time to run the tests in the next day or two. Yes, the cycles were measured in days. Back then they had to spend time checking the code they had written by hand before submitting the compile job. Even in ~1985 I was required to write out the code and produce a flowchart – with one of those IBM stencils – before heading into the lab in college.

But this rant isn’t just about the code. It’s about the process. I’ve seen the evolution from step-wise waterfall delivery to more iterative RUP-inspired approaches. I’ve also seen what is termed agile. But I increasingly think “agile” is a moniker applied to those who are just coding-and-going, with no sense of discipline until the code breaks in production. Where did we lose our way? Maybe I haven’t met the folks who are really leveraging agile to deliver massive amounts of quality code to drive successful businesses. I’m not advocating for waterfall, but is there a balance to be struck between a process straitjacket and process anarchy?

Back then there was a sense of precision and care made necessary by the scarcity of computing power. I haven’t worked in industries developing safety-critical, software-intensive systems like a Boeing, Airbus, or Lockheed Martin. I imagine (pray?) this level of discipline still exists in those industries regardless of computing power. But could the rest of us benefit from a good dose of Dr. Brooks’ world when approaching the design and development of software-based solutions?

And yes, you can get off my lawn now.
