Steve Jobs died on Oct. 5, and the tributes resounded through the universe. Far less noted was the passing of Dennis Ritchie just one week later. The two men were the yin and yang of modern computing, completely different, yet linked over time—like entangled photons acting upon each other at a distance.
You know all about Jobs the iconoclast, dropping out of college after just one term, the maverick, always upsetting things. Contrast him with Ritchie the Harvard Ph.D., working in the ultimate establishment at Bell Labs for 30 years. The news of Jobs’ death flashed instantly through every channel, but I didn’t notice Ritchie’s passing until a week after he died.
Jobs’ accomplishments are familiar to all. To my mind, his greatest was the iPod, Apple’s first foray into mass-market consumer products and the company’s first gadget, if you will, back in 2001. How long ago that seems, how different the world was then. The Microsoft .NET Framework hadn’t shipped yet, the now-ubiquitous smartphone didn’t exist, and my older daughter (now an eye-rolling pre-teen) was taking her first baby steps chasing Simba, the family cat. If you wanted music, you had to carry not only your Discman (creation of Sony’s Akio Morita, another iconoclast), but also your stack of CDs. You had to manually swap them in and out, dropping them and scratching them and wiping fingerprints off them, playing tracks in the order in which they were burned.
Jobs changed all that, moving music from backpack to pocket. The lessons he learned in the process enabled Apple’s success in the smartphone and tablet markets. He realized that not only did the gadget itself need to be great, but it also required an ongoing business infrastructure; hence the iTunes store. Pretty much single-handedly, Jobs brought down the music industry establishment, much as Craigslist brought down the newspaper industry.
Civilians have never heard of Dennis Ritchie, but everyone in the geek business today works directly with something of his, or with something built on his innovations. Ritchie and Ken Thompson developed the Unix OS, for which they shared the prestigious Turing Award. He developed the C language (so named because it followed one called B) in the early 1970s.
C was my first industrial language. I used it for controlling a chaff decoy launcher on warships, then an ion implantation machine in a semiconductor fab, then a financial trader’s dealing system. At my last company, we said that to get a job there, a programmer had to have a beard (I qualified) and know how to spell C (I struggled, but eventually mastered it).
Many successful languages are built on C. Bjarne Stroustrup designed C++, the first object-oriented language to achieve wide industrial use. C# and Java hark back to C as well, from semicolons and curly braces to the “main” entry point. You can think of the sharp sign as two ++ operators stacked on top of each other, leaning forward as if in motion. Objective-C, a Smalltalk-influenced derivative, has become the primary language of the Apple world.
Jobs was a pioneer of marketing and design, of figuring out what customers really wanted before they knew it themselves. But without Ritchie’s work, Jobs wouldn’t have been able to develop the software that made his devices hum. Now Jobs and Ritchie forever share the sad confluence of their passing, and the sense that each got short-changed by at least a decade.
So long then, and thank you, to a pair of guys who did great things, one of them known to the whole world, and one known only to us geeks. Lots of people mourn Jobs. Let’s you and I hoist one for Ritchie, because we’re the only ones who will. You get to buy.
MSDN Magazine lost a great columnist on Aug. 29, with the passing of Simba (“The Cat Butt Factor,” msdn.microsoft.com/magazine/gg983482). She crossed over peacefully, sleeping in a sunbeam in her favorite garden, aged 20 years and 3 months (roughly 120 human years). May you and I be as lucky, dear reader. I thought I’d be getting a lot more work done now without her jumping on my keyboard, but somehow it hasn’t worked out that way.
David S. Platt teaches Programming .NET at Harvard University Extension School and at companies all over the world. He’s the author of 11 programming books, including “Why Software Sucks” (Addison-Wesley Professional, 2006) and “Introducing Microsoft .NET” (Microsoft Press, 2002). Microsoft named him a Software Legend in 2002. He wonders whether he should tape down two of his daughter’s fingers so she learns how to count in octal. You can contact him at rollthunder.com.