What Happened to Programming

Last week, Mike Taylor posed the question “whatever happened to programming?” (also see his followup). I think part of the answer is simply “it’s grown up.”

Programming, whether as a hobby or a career, has been around for a little less than 70 years. (The first programmable electronic computer, Colossus, was built and used by the British during World War II to read encrypted German messages.) That makes it a very young field compared to agriculture, construction, the various engineering disciplines, and so on.

I won’t disagree that building the fundamental parts of a new application from scratch can be very enjoyable. Perhaps not printf(3), but certainly other parts. The problem is bug rates: any seasoned software developer knows that writing less code means fewer bugs. If high-quality libraries already provide 80% of the functionality your new application needs, you will likely end up with a better product, in less time, than if you chose to write the entire thing yourself.
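
To make that concrete, here is a minimal, contrived C sketch (the log message, function names, and format are my own invention, not anything from the original post): the same line of output assembled by hand, and then produced with a single snprintf(3) call.

    /* Contrived illustration, standard library only: building one log line
       by hand versus handing the whole job to snprintf(3). */
    #include <stdio.h>
    #include <string.h>

    /* The hand-rolled path: every step is a chance for an overflow, an
       off-by-one, or a forgotten NUL terminator. */
    static void log_line_by_hand(char *out, size_t outlen,
                                 const char *user, int attempts)
    {
        const char *prefix = "login failed for ";
        size_t pos, n;

        if (outlen == 0)
            return;

        n = strlen(prefix);
        if (n >= outlen) {
            out[0] = '\0';
            return;
        }
        memcpy(out, prefix, n);
        pos = n;

        n = strlen(user);
        if (pos + n >= outlen) {
            out[pos] = '\0';
            return;
        }
        memcpy(out + pos, user, n);
        pos += n;

        /* ...and the integer still hasn't been formatted yet. */
        snprintf(out + pos, outlen - pos, " after %d attempts", attempts);
    }

    /* The library path: one call, bounds-checked, no manual bookkeeping. */
    static void log_line_with_library(char *out, size_t outlen,
                                      const char *user, int attempts)
    {
        snprintf(out, outlen, "login failed for %s after %d attempts",
                 user, attempts);
    }

    int main(void)
    {
        char buf[128];

        log_line_by_hand(buf, sizeof(buf), "mike", 3);
        puts(buf);
        log_line_with_library(buf, sizeof(buf), "mike", 3);
        puts(buf);
        return 0;
    }

Both versions print the same line, but the second one leans on code that has been debugged for decades, and it is the one I would rather maintain.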

Consider the sorry state we’d be in if:

All farmers experimented with their own methods of weed and pest control (organic or not), instead of using techniques proven to work in the past. Our agriculture industry probably couldn’t meet the population’s demand for food.

Architects and builders “winged it” when building new homes and office buildings. Bugs in that industry would likely mean structural failure. Who would want to live in those houses?

How about automakers? It’s one thing to build a kit car as a hobby; it’s quite another to tinker with your designs when you’re producing cars on a massive scale for general use (just ask Toyota).

It may not be as glamorous, but if our collective bug count goes down and the software industry ships better products that don’t require constant patching, I can’t help but consider this a very good thing.