New Jersey, 1962? Vague memories of an entire barn filled with racks of wires and relays like a giant pipe organ. It’s a surplus electronic telephone switch, a predecessor of the Bell System’s 1ESS. A small team of enthusiasts has high hopes for its restoration. Impressed and baffled, it’s the first computer I’ve seen. I’ll not see another for eight more years.
I am Ward Bell, V.P. of Technology at IdeaBlade, makers of DevForce. DevForce is a product for building multi-tier, data-driven business applications in .NET. It’s kind of like Microsoft’s WCF RIA Services … only we’ve been in production since 2002 with all that implies about focus, experience, maturity, support, features and architecture. We used to have our own ORM, one of the first in .NET. We abandoned that component when Entity Framework was announced; why resist EF’s inevitable market victory when there is so much need to ease the pain of building distributed applications?
Distributed CRUD apps to be sure. But then most business applications are essentially CRUD + Behavior, aren’t they? A tiny fraction of developers both understand and need message-oriented architectures. The RIA Services perspective – I think of “RIA Services” as a category rather than a Microsoft product – that perspective will dominate Silverlight application development for the next decade at least. Thus spake Zarathustra.
What am I doing on CodeBetter? I’m certainly not here to talk about my product; I’ve got my own blog for that.
I am here because CodeBetter speaks to the software craftsman, the person who is curious about the practice of software development. We want to know “why” as much as “how”. We want to know when to do something and when not to do it. We see programming as a social phenomenon engaging a variety of actors, not as a stunt performed by heroic individuals in isolation. Our craft has a history. What we applaud or disparage today is provisional and contextual. There are no fixed marks – no permanent truths.
Sure, we have convictions and strong opinions rooted in practical experience. But we’re not wizards who’ve mastered a book of spells; we’re seekers, feeling our way forward, exploring better ways to develop software, mindful that the measure of “better” is a function in part of rapidly changing human and technical conditions.
When I read a CodeBetter post, I’m always impressed by the author’s enthusiasm and expertise – yes – but also the passion to communicate and educate. It’s a noble community and I’m honored to have been invited to join.
I come late to my love of the craft. So many of my itinerant years were spent programming unconsciously, showing up for the job, all wit and no wisdom. I’m kind of nostalgic for my old “Mort” self in much the way we can be nostalgic for our teenage selves.
Journeyman programmers are not “those people”; they’re “my people”. I’m freshly reminded of them every time I visit customers. Not all of them are unreachable; most merely slumber. Perhaps we can awaken a few. For the others, we can provide tools, rules, and infrastructure so they can program beyond their competence.
I figure as I write in this space, you’re bound to wonder “who is this guy?” Herewith, a little autobiography.
Lesson #1: Generosity
Poker was my first program, written in APL on an IBM Selectric terminal wired to a System/360 in Poughkeepsie, NY. In 1966, Tom Watson named Steve Dunwell an IBM Fellow. Dunwell had been the inspiration and driving force behind the “Stretch” supercomputer, perhaps the world’s first supercomputer, delivered three years before Cray’s CDC 6600. His rise, fall, and resurrection is a great story.
As an IBM Fellow, Dunwell had some money and latitude to pursue projects that interested him. Teaching kids about computing was one of those projects. He put typewriter terminals in several high schools, one of them mine. It took passion and selfless dedication to put million dollar machines in the hands of clueless kids. I repaid his generosity with 5 card stud. I think the best it could do was deal cards. That was enough computing for me. Sorry, Steve.
I couldn’t afford college right away. I needed a job. I had no skills, no prospects, and no interests beyond the pursuit of girls and good times.
The fortuitous conjunction of my poker program and my fascination with the Albigensian Crusade led to a bolt from the blue. The wife of my medieval history teacher worked for Dunwell. She called a Dr. Bill Hagamen, a neuro-anatomist at Cornell Medical School in New York City. He too was a beneficiary of Dunwell’s educational computing vision and he owed her a favor. “Call this kid and see if you can use him.”
I stumbled through the interview … a complete disaster. I couldn’t remember a thing about my poker program or APL. He almost hung up but I begged to see him and try again. I have no idea why I begged him or why he accepted. But a few days later, after cramming “A Programming Language” by Ken Iverson, I was sitting at his kitchen table in Long Island trying to convince him I knew enough APL to be useful.
I didn’t fool him at all but there must have been something he liked about a nineteen year old, long-haired, hippie kid who would try that stunt. More likely he needed to pay off the favor and he didn’t have any money for programmers anyway. Over the next few years he’d hire a taxi cab driver, a Ph.D. in Slavic languages, a street musician, … just about anyone who’d work for dirt and flashed a little intelligence.
Dr. Hagamen was cheap by necessity, not by nature. When I met him he was still funding his work with grants. Grant money is uncertain; even when awarded, you can’t be sure when it will arrive. “Write the grant proposal for work you’ve already done; the money funds the research results that are the basis for the next grant”.
Live this way and you learn to spend carefully and scrounge for everything. He lived alone, subsisting on cigarettes and McDonalds as far as I can tell.
His neurological research depended on cats. Unlike the human brain, the cat brain hardly varies from subject to subject; it’s a stable specimen for anatomical research which, at the time, consisted of surgically poking at interesting structures to see what happened. The top floor was a vast mob of cats in cages, blinded and lamed in the course of his work. Cruel as this sounds he was not a cruel man; they purred and rubbed themselves against him with no sign of pain. He had gathered each one himself, walking the alleys of New York by night with his cloth cat collection bags: “press down on the neck just above the shoulders and they’ll push back with their forelegs, immobilized” he advised; “Then whip them in one stroke into the bag.” That’s scrounging.
We were just cats of a different color, scrounged from the streets of New York. We too purred every time we saw him. He loved each of us as individuals, listened to our stories, shared his own, laughed freely between long drags on endless cigarettes. If the toe of my shoe pointed slightly out he’d note that I’d sprained my ankle slightly … as indeed I had, playing basketball the night before. He became a second father to me. We had long conversations about everything.
He taught gross anatomy. The door to our computer lab opened into an adjacent dissection room where Dr. Hagamen showed students the body’s neural pathways. We met there often to discuss some forthcoming feature while he prepared for tomorrow’s lecture. I can still see him, bare hands deep in a cadaver’s chest, absent-mindedly tying little white flags around nerve bundles while we talked. I never got used to that, pretending not to stare at the hollow shell of a human being, tanned leathery brown from its bath in formaldehyde. We stood amidst rows of them in various states of disrepair, limbs wrenched into impossible positions, parts meticulously tagged for display. Didn’t bother him at all.
Usually we talked about the work. Dr. Hagamen (he was always “Dr. Hagamen”, never “Bill”) had tired of neuroanatomy. Natural language processing was his passion now and for the rest of his life. We were writing a system for Computer Aided Instruction (CAI) to supplement the lectures in a medical student’s training. The program typed out questions, say, about the muscles of the eye. The student answered in full English sentences. The program determined what he’d said and chose what to ask next based on its “interpretation” of the student’s answer: if deemed correct, on to the next question; if it was a common error, on to a revealing query series; if flat wrong or unintelligible, ask for clarification. No multiple choice, no cheerful “right” or excoriating “wrong” answers. If you knew the muscles, you were through in minutes; if you didn’t, the program could lead you for more than an hour through a maze of enlightening statements and follow up questions.
Our teaching model was Rogerian psychotherapy. Dr. Hagamen had translated Joe Weizenbaum’s ELIZA (1966) into APL. That was the inspiration for our achievements in simulating “normal conversation”; we applied some minimal grammatical analysis, extracted key words matched to a constrained subject-matter context, and leaped to conclusions about what you had probably said. It worked surprisingly well and I think is still the foundation of most CAI to this day.
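The approach can be sketched in a few lines. This is my own modern-day illustration in Python, not our original APL; the answer key and the common-error set are invented for the sake of the example:

```python
# A toy version of the keyword-matching interpreter described above:
# pull key words from a free-text answer, match them against a small,
# constrained subject-matter context, and branch accordingly.
# EXPECTED and COMMON_ERROR are hypothetical, not from the real system.
EXPECTED = {"superior", "rectus"}       # key words of a correct answer
COMMON_ERROR = {"inferior", "oblique"}  # key words of a common mistake

def interpret(answer: str) -> str:
    words = set(answer.lower().replace(".", " ").split())
    if EXPECTED <= words:
        return "next_question"          # deemed correct: move on
    if words & COMMON_ERROR:
        return "probe_common_error"     # lead into a revealing query series
    return "ask_for_clarification"      # flat wrong or unintelligible
```

Note that it never parses anything; it extracts key words and leaps to a conclusion, exactly as described.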
We knew this wasn’t how language actually works in humans … no more than airplanes simulate how birds fly. The facade of human conversation collapsed quickly as the vocabulary grew, the context expanded, or the sentences became paragraphs. We wanted to apply more intelligence and fewer statistics. We read Chomsky, hoping to find the deep structures of language and enshrine their algorithms in APL.
We didn’t get far. The search, however, was intoxicating, especially for a kid fresh out of high school. It seemed to me I was doing something important; we were pushing against the frontiers of knowledge and I was part of it.
He made us part of it. He put our names on the papers he published. Each of us had a moment or two in the sun. There I am, listed as a co-author of “ATS in Exposition”, Computers in Biology & Medicine, volume 3, 1973; got my first academic citation before I’d even entered college. I hadn’t written a word of it. I guess I had contributed in some small way just by being on the team. He wanted to share the credit with all of us.
Looking back I understand now a little better about what “team” means. We were more than a group. We knew about each other’s lives – family, friends, pleasures, peeves. We played together as much as we worked together. We may not have been great at what we did, but we were full of common purpose and proud of what we did. We shared our code as we would share a beer, hiding nothing.
And were we ever close to our customer. So close we couldn’t have thought of it that way. Dr. Hagamen was happy to be the worst coder among us (smart of him to let us think so) but he was our leader, our friend, and we tried never to let him down.
I am convinced that commitment to purpose and fellowship trumps anything to be gained from smart people, “best practices”, or great architecture. Get to know your team intimately and authentically. Disappointments are inevitable but your colleagues will fight like cornered rats to avoid them. Share the credit, absorb the blame and your prospects for success improve beyond measure.
The cigarettes and quarter-pounders finally caught up with him. He died from cancer in 2007 at 82. I’d long since gone west and fallen out of touch. The others hadn’t.
The news found me across the decades and broke my heart. I grieve for him now. In thirty years, who on any of my teams since will grieve for me? It’s a question I ask myself now, and it goes straight to the heart of what makes a team productive. Generosity of spirit, inspired loyalty, a learning organization … whatever my failings as a developer, this I can build.
Lesson #2: Productivity
Dr. Hagamen didn’t have to worry about hiring a complete loser. The APL language itself weeds them out. Your program either worked or it didn’t and there was no waiting around for days full of excuses. A few lines told everything about you and your potential.
You simply can’t fake it in APL. Every expression is a parade of equal parts alphabetic and hieroglyphic characters evaluated from right to left. Conway’s Game of Life can be written in a single, impenetrable line. You want to do away with ceremony? You want zero noise? You want the maximum bang for your keystroke buck? APL is your language. Ruby? Boo? They’re chatterboxes. In APL you can pack a ton of functionality into 32k of main memory, the extent of our universe in those days.
Unfortunately, APL deserved its reputation as a “write-only” language; the annual contest for one-liners of maximum obscurity was a highlight of every ACM APL conference.
But it was (and remains) a serious language, a forerunner of the functional programming languages becoming popular now. Skim Ken Iverson’s own Turing Award Lecture, “Notation as a Tool of Thought” and discover your unworthiness. This is genius almost unknown today.
APL is interpreted which, combined with its unequaled power and economy of expression, enabled productivity such as had never been seen … and would not be seen until this century. While everyone else was presenting their boxes of 80-column cards to the guardians of the raised floor, hoping to squeeze in one or two runs per day, we were merrily tapping away at our terminals, cranking out several new applications a week.
I learned the power of iterative development writing APL. No interminable design documents for us. You had an idea; you wrote it and shipped it. It was cowboy programming but who knew? Who cared? Everyone’s programs were rife with bugs in those days. We could find ours, fix, and release again in minutes while the Fortran guy was still waiting for the band printer to burn through a box of paper with his core dump.
APL swept through the IT shops of New York financial houses. Morgan Stanley, Merrill Lynch, American Express … I would one day write APL for all of them. They still use APL! APL’s mathematical heritage had nothing to do with it. I suck at math. Fortunately I never needed more than high school algebra. I wrote applications to find things, watch things, and count things; I was solving the same kinds of business problems we work on today. APL dominated because it was incredibly productive.
On the street they hated to wait. Time-to-market was everything. The half-life of an application was a year or two. Maintainability was an after-thought; overpay the developers to stick around and everything would be fine. You only needed one or two … compared to the army of drones it took to write it in any other language … assuming they could get it done before you lost interest. Had there been such a thing as well-tested code, the cost to produce it would never have overtaken the benefit of moving pell-mell. Quality was a distant second to Functionality … and rightly so.
In time I learned to value the stability and maintainability of well-architected, readable code written in compiled, statically-typed languages. Today’s IDEs and fast compilers have narrowed the productivity gap. It’s no longer a straight-up trade between productivity and maintainability. I wouldn’t write in APL now even if I remembered how.
I remain receptive to the “git ’er done” message. It’s pointless delivering a properly architected and tested product later than it’s needed. The budget will be gone. The opportunity will be gone. Technology debt is a great thing when short-term borrowing buys a big return. This I learned during my first twenty years of programming.
Our present challenge is to marry speed and sound practices. That was impossible when I started; I believe it is possible now. We have to keep our eyes on both … and I intend to do that in my posts here on CodeBetter.
Lesson #3: Suspect all arguments about Performance
The quest for maximum performance in minimal space ruled our days. We were programming in a phone booth: 32k memory, 30 characters-per-second networks, tape drives. One of us, David Linden, discovered that the integer ‘1’ occupied 4 bytes in memory while the variable name “ONE” took only three; he could shave hundreds of bytes from the program simply by substituting ONE for ‘1’ and “TWO” for ‘2’. We promptly converted all of our programs to spare those precious bytes. We might have patented the process had there been lawyers around to guard our “intellectual property”.
We thought about algorithms, not methodology. Dijkstra was unknown to us; we didn’t know that GOTOs were bad style. We knew that performance stunk for any kind of branching at all so we avoided it where possible, favoring the APL equivalents of map and reduce. We had no notion of “style”. We had “idioms”, the best known being “((V ι V) = ιρV)/V”, which removes duplicates from the vector V. “V” itself was a popular enough name – second to “X” perhaps – and we’d use it everywhere. Any function or variable name over four characters attracted the gravest suspicion; it took too much space and cost more to fetch.
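For readers who don’t speak APL, that dedupe idiom translates directly: an element survives only where the index of its first occurrence equals its own position. A rough Python equivalent (mine, not period code):

```python
def dedupe(v):
    # ((V iota V) = iota rho V)/V in APL: V iota V gives each element's
    # first-occurrence index; compare it to the element's own position
    # (iota rho V) and keep only the elements where the two agree.
    return [x for i, x in enumerate(v) if v.index(x) == i]
```

So `dedupe([3, 1, 3, 2, 1])` yields `[3, 1, 2]` — first occurrences, in order, just as the one-liner does.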
Thank goodness we don’t have to write this way any more. At PDC I heard a speaker say “our hardware is 50,000 times faster but our programs are only 5,000 times faster.” As if that were a bad thing. It’s not bad at all. I used to argue with people who insisted you should program only in assembler because higher level languages – Fortran, PL/I, APL – could never perform as well as properly written assembly code. No one makes that particular case anymore. It’s obvious now that we’d never get much done that way even if it were true. It also happened to be false much of the time for even a modest body of code; then as now, compilers regularly beat good assembly coders.
We prioritized performance because we had to; the program died or ran out of memory without those tricks. You’ll note that David measured the optimization before we employed it.
It galls me to hear someone argue for “better performance” before anyone knows if it will matter. Almost as bad is the phony speed derby down a short track when we know we’ll be running a marathon. Consider the smarty pants who proves that a Data Reader executes a simple query many times faster than an ORM. No kidding. Of course the application won’t run that query more than a few times all day and when it runs, no one notices the difference between 300ms and 30ms. Meanwhile, we discover that network latency is the dominant factor. Reducing trips to the server, each payload packed with a rich object graph, yields a vastly more responsive experience than the Data Reader Guy can dream of. Not to mention that Mr. Data Reader will waste months chasing bugs at the expense of building business value.
You want to be sensible. Synchronously loading 100,000 records before you show the first screen may work on the LAN but it will never fly over the internet. Reason and experience can spare you much heartache. A little scratching on the back of the envelope can clarify issues and identify threats. Otherwise, we’re far better off writing clear, clean code … benchmarking … and tuning the hot spots as we find them.
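Here is the kind of envelope-scratching I mean, with illustrative numbers of my own invention (the record size and link speeds are assumptions, not measurements):

```python
# Back-of-envelope: why synchronously loading 100,000 records may be
# tolerable on a LAN yet hopeless over the internet.
records = 100_000
bytes_per_record = 500                          # assumed average row size
payload_mb = records * bytes_per_record / 1e6   # 50 MB on the wire

lan_mb_per_s = 1000 / 8       # ~1 Gbps LAN, in MB/s
internet_mb_per_s = 10 / 8    # ~10 Mbps internet link, in MB/s

print(f"LAN:      {payload_mb / lan_mb_per_s:.1f} s")
print(f"Internet: {payload_mb / internet_mb_per_s:.1f} s")
```

Under these assumptions the LAN transfer takes well under a second while the internet transfer takes the better part of a minute — the envelope settles the argument before anyone writes a benchmark.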
“Duh” you say? I’ll keep talking about this until my customers stop arguing about performance prematurely or ignoring it altogether.
Lesson #4: Read Code … Lots of it
We were all self-taught back then. In our little lab on East 68th street, we learned by reading each other’s code. Maybe we worked in a vacuum but no one I knew thought about software methodology. The first “Software Engineering” conference was held in 1968. David Parnas coined the terms “modularity” and “information hiding” in 1972. We didn’t know about any of this. We had few books and publications to consult: a couple APL reference books, the Transactions of the ACM, the IBM Systems Journal, and Knuth.
An eye-opener for me was Fred Brooks’ “The Mythical Man-Month” (1975), a landmark then and forever relevant. Brooks described it as “my belated answer to Tom Watson’s probing question as to why programming is hard to manage.” Its many lessons are blithely ignored to this day. Try explaining to your twenty-something manager that “adding manpower to a late software project makes it later.”
Brooks woke me to the idea that programming is a social phenomenon to be studied and reasoned about. Then I went back to sleep for another seven years.
We read code. Not books about code. Not magazine articles with code samples. Production code. Code written by colleagues and later by customers. “Good” code and “bad” code.
A lot of what we thought was “good” code we’d now regard as “bad” code. The converse is not true: what was bad then is worse today. And there was truly awful code out there. I ought to know; I wrote plenty of it.
I wish we had had the books, articles, blogs, manifestos, and conferences readily available to any developer today. I would have learned faster and avoided mistakes that seem obvious to me now. I won’t fetishize a prolonged experience in ignorance.
On the other hand, I don’t believe you can read your way to competence. You have to write a lot of code before you get decent at it. You have to read even more code before your writing gets good. In this respect it is like any profession. We learn by doing and we improve by watching others do it.
Don’t get stuck reading just one person’s code. Don’t get stuck in one school of design either. Look around. Try another language. See how different applications, different developers, and different customers create conditions and constraints that shake your assumptions. As Han Shan says “There is no path that goes all the way”.
No one path for me.
… Next Installment?
So much for a brief intro. Suddenly I’ve got a memoir going.
If my introductory post generates favorable interest, I’ll keep going. If it generates outrage and alarm, I’ll stop!
A “next installment” could cover:
- The Reluctant Programmer – how I discovered I was a programmer while striving to be something else.
- The Itinerant Programmer – life as a job-hopping consultant and what I learned about making a quick impression and grasping the customer’s business, “Show me the data and I’ll tell you what your application does”. Hubris and infrastructure.
- The Corporate Programmer – “10 years before the mast”, or “settling in at GE”. How long does an application last? Cobol follies.
- Never hire a friend – Oops
- Hiring a programmer – Ravishing beauty versus pain-in-the-butt
- Life on the other side: becoming a customer of IT
- “If it makes no business sense …” – my Dot Bomb moment
- Starting a product company
- Re-starting a product company
Until we meet again …