Incompetent American Programmers

In the Summer and Fall of 1999, at the peak of the dot-com boom, there was incredible competition for software developers. Starting pay for developers with no experience had already climbed to $60K and then, in the course of maybe 3 months, it went from $60K to $75K to $90K. And that was actual money, not soon-to-be-toilet-paper stock options.

A guy as American as apple pie came in, fresh out of college, applying as a Java developer. He didn’t know the difference between a class and an instance and didn’t know what inheritance was. So, just as incompetent as anything I’m seeing today. I started to explain to him that software development was a wonderful profession and that if he wanted to learn it, the traditional way would be to apply as a junior tester, and … He cut me off, told me he had 5 other interviews lined up that week and made it clear that he expected to be hired at one of them. And I don’t doubt that he did.

But I doubt that he’s still both incompetent and employed as a software developer, much less as a freelance coder. I think he either:

  • was weeded out of programming (perhaps by going into management), or
  • got a clue

I complained the other day about an incompetent applicant from South America, so I’ll use as an example another guy on the team who lives in South America, “gets it” as far as software development goes and charges $24 an hour. He lives in a beautiful house on, like, 10 acres or something, owns several horses, and I get the impression that he’s considered quite the young go-getter.

So when people think the moral of my story is “cheap employer…you get what you pay for” I think they’re entirely off-base. If you’re willing to create a distributed team (the wisdom of which is a whole question in and of itself), you might find yourself in the enviable position of being able to give a smart person a high standard of living and contribute to a developing economy tra-la-la-la-la all while paying less than an American median wage. It’s hard for me to see the argument that that is immoral.

It’s not the nationality of incompetence that’s depressing me, nor is it necessarily the scope of the incompetence embodied in a single person, it’s how common it is that I encounter people who have no respect for this activity that I love. I feel that I’m seeing it more often than I used to, and while I may be imagining that (“When we were kids we hiked 7 miles through the snow to the data center…”), I think it’s a real phenomenon.

Some people suggested that the language involved might have something to do with it, and others suggested that it might have to do with the increasing amount of hand-holding in modern development environments. I still tend towards my feeling that global commodification has something to do with it; more and more people applying for jobs in the field of software development did not enter that field due to a love of computers or software, they entered it because there’s demand.

There have always been developers for whom programming is “just a job.” Back when I was Editor of Computer Language, it was commonplace to refer to the statistic that “the average programmer has less than 1 book on software development.” But it was easy for me to ignore that, because those weren’t the people who read magazines and attended conferences and swapped stories on CompuServe’s CLMFORUM. So maybe I am just being an old codger. Except instead of CLMFORUM, when I look for reaction to my thoughts I find griefers on reddit making ad hominem attacks. Progress.

I’m Looking to Hire Freelance Ruby Programmers

I’m trying to hire a couple of developers. One guy sent a resume that looked great — degree in CS, C++ experience, a year with Ruby on Rails. So I sent him a simple programming exercise. I sent him the testcases.

He shoots back an answer. I open a command line, type ruby TestCases.rb, and see this:

9 tests, 0 assertions, 0 failures, 9 errors

He didn’t even get to the freakin’ assertions! His “solution” didn’t treat the argument as an array.
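A run like that is easy to reconstruct: “0 assertions, 9 errors” means every test raised an exception before it ever reached its first assertion. A minimal sketch of the failure mode (the actual exercise wasn’t published, so sum_positive and the test values here are hypothetical):

```ruby
# Hypothetical exercise: sum the positive numbers in an array.

# A correct solution treats the argument as an array.
def sum_positive(values)
  values.select { |v| v > 0 }.sum
end

# The kind of "solution" that yields "0 assertions, 9 errors":
# it treats the argument as a single number, so every test raises
# NoMethodError before any assertion runs. Test frameworks count a
# raised exception as an *error*; a wrong return value caught by an
# assertion would have been a *failure*.
def sum_positive_broken(value)
  value > 0 ? value : 0   # NoMethodError when handed an Array
end

p sum_positive([1, -2, 2, 3])   # => 6

begin
  sum_positive_broken([1, -2, 2, 3])
rescue NoMethodError => e
  p e.class                     # => NoMethodError
end
```

Nine tests, nine exceptions, zero assertions ever executed — which is exactly what the summary line reported.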

Once upon a time, I programmed by sliding cards in a box under a window and returning 15 minutes later for the results. Once upon a time, I programmed by writing object files that could take anywhere from a few minutes to overnight to link together. If we still lived in those times, I could understand submitting some text and saying “I think this is a solution.” We don’t live in those times anymore.

This guy was from South America. A lot of the guys I’m dealing with lately have been from outside the United States — we’re a distributed team and, all things being equal, a guy with a CS degree, C++ experience, and a year with Ruby on Rails who’s asking $20 an hour is going to be more appealing than a guy with the same background asking $60 an hour.

I don’t know if it’s a cultural thing or a “younger programmer” thing, but I have to say that I’m getting really freaking tired of experiencing this level of incompetence, even for the thirty seconds it took me to see that and respond “Nope. Not even close” to HR. It’s actively depressing to me to experience this crap.

To be clear, this has nothing to do with this guy’s innate talent or intelligence. What it has to do with is this … mindset … that seems to be entirely at odds with my conception of the activity of software development. I’m not talking about an ignorance of, much less disagreement with, my particular biases and judgments about the niceties of methodology and process. I’m talking about people who don’t seem to “get” that programming is, at the very least, about making programs that run.

And, accuse me of jingoism if you will, but I have to say that it’s depressing that it’s virtually impossible for an American to make a median wage as a freelance coder, because their resumes probably look worse than those of people with CS degrees who don’t freaking bother to see if their programs run.

Of course intelligence is distributed evenly throughout the world, but this level of incompetence has largely been weeded out of the American freelance programming community. If you’re making a living and you have to charge twice what a person in South America or Asia charges, you pretty much have to “get it.” And it is sad to consider a bunch of people who “get it” slowly being weeded out of the workforce because we are unable to clearly and concisely demonstrate value to potential employers.

Update: I’ve removed the name of the fellow’s country, which is one I’ve always wanted to visit and which I’m sure has many fine developers. It’s not relevant, other than to make the point that it’s not just one country in Asia where there are freelance developers looking for work and charging significantly less than their American or European counterparts.

30K application lines + 110K testing lines: Evidence of…?

I recently wrote an encomium to ResolverOne, the IronPython-based spreadsheet:

[T]heir use of pair programming and test-driven development has delivered high productivity; of the 140,000 lines of code, 110,000 are tests….ResolverOne has been in development for roughly two years, is written in a language without explicit type declarations, and is on an implementation that itself is in active development. It’s been brought to beta in a credible (if not downright impressive) amount of time despite being developed by pairs of programmers writing far more lines of test than application. Yet no one can credibly dismiss the complexity of 30,000 lines of application logic or spreadsheet functionality, much less the truly innovative spreadsheet-program features.

ResolverOne is easily the most compelling data point I’ve heard for the practices of Extreme Programming.

[Extreme Programming, SD Times]

Allen Holub sees the glass as half-empty, writing:

I want to take exception to the notion that Python is adequate for a real programming project. The fact that 30K lines of code took 110K lines of tests is a real indictment of the language. My guess is that a significant portion of those tests are addressing potential errors that the compiler would have found in C# or Java. Moreover, all of those unnecessary tests take a lot of time to write, time that could have been spent working on the application.

I was taken aback by this, perhaps because it’s been a good while since I’ve heard someone characterize tests as evidence of trouble as opposed to evidence of quality.

There are (at least) two ways of looking at tests:

  1. Tools for discovering errors, or
  2. Quality gates (they’re one-way — are they quality diodes?)

There’s no doubt that the software development tradition has favored the former view (once you’ve typed a line, everything you do next is “debugging”). However, the past decade has seen a … wait for it … paradigm shift.

The Agile Paradigm views change over time as a central issue; if it were still the 90s, I would undoubtedly refer to it as Change-Oriented Programming (COP). Tests are the measure of change — not lines of code, not cyclomatic complexity, not object hierarchies, not even deployments.

(Perhaps “User stories” or scenarios are the “yard-stick” of change, tests are the “inch-stick” of change, and deployments are the “milestone” of change.)

So from within the Agile Paradigm / COP, a new test is written that fails, some new code is written, the test passes — a one-way gate has been passed through, progress has been made, and credit accrues. From outside the paradigm, a test is seen as indicative of a problem that ought not to exist in the first place. The passing of the test is not seen as the salient point, the “need” for (i.e., existence of) the test is seen as evidence of low quality.

In true test-driven development, every test fails at least once, because the tests are written before the code. What is perhaps not appreciated by those outside the Agile Paradigm, however, is that tests are written that one expects to pass from the moment the relevant code is created. For instance, if one had fields for sub-total, taxes, and total, one would certainly write a test that confirmed that total = sub-total + taxes. One would also certainly expect that test to pass as soon as the code had been written.
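The sub-total example can be sketched concretely (the Invoice class and its fields are hypothetical, and I’m using Minitest, Ruby’s bundled framework, rather than the Test::Unit of the era):

```ruby
require 'minitest/autorun'

# Hypothetical domain object with the three fields mentioned above.
class Invoice
  attr_reader :subtotal, :taxes

  def initialize(subtotal, taxes)
    @subtotal = subtotal
    @taxes = taxes
  end

  def total
    subtotal + taxes
  end
end

class TestInvoice < Minitest::Test
  # Written before (or alongside) Invoice#total, and expected to
  # pass the moment the method exists. Its value is as a one-way
  # quality gate against future change, not as a bug hunt.
  def test_total_is_subtotal_plus_taxes
    assert_equal 108, Invoice.new(100, 8).total
  end
end
```

From within the paradigm, that test accrues credit as a record of change; from outside, it can look like 5 lines spent “checking” an addition that was never in doubt — which is exactly the gap in worldviews at issue here.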

As is often the case with paradigms, just realizing that there are different mental models / worldviews in play is crucial to communication.

Update: This relates to Martin Fowler’s recent post on Schools of Software Development.

Name That Arcade Game

Sometime after Space War and Asteroids, but before color was widespread in arcade games, there was a 2-person vector-graphics game in which you and your friend drove “tractors” around and grabbed little diamonds (or whatever) from a pile in the center of the screen and dragged them back to your base. You could shoot the other guy a la Space War, but I think there were also bad guys flying around to shoot a la Asteroids.

Name that Arcade Game!

Update: KSharkey rocks! He correctly identified Rip-Off — with graphics like this, it’s no wonder I was enthralled:


With the advent of color, there was this game involving a grid of city blocks. A dozen or so triangles started moving from one side (or all sides?) through the grid. I don’t recall if you controlled the triangles or you controlled a car trying to get away from them, but over time the triangles would end up crashing into each other and being destroyed. And you either were trying to destroy them all before time ran out or you were trying to keep them alive until you achieved some goal.

Name that Arcade Game!

Update: WillC2 Rocks! Targ it is!


Well, guess I’m going to have to renew my XNA Creator’s Club membership…

2 Things That Made Me Scoff at "Breach"

“I wrote an encryption algorithm with 612 bits of security.” (I really like to imagine the ‘notes’ from the studio on this — “I like how this establishes that Hansen is a very talented programmer, but let’s bump it up 100 to show he was really good.”)

“We need Linux servers…” (In the year 2000, he expected the FBI to run its infrastructure on OSS? No wonder this guy got nowhere!)

Other than that, the movie was okay, even if it had the clichéd cop-out “Why did he do it? Well, in the end it doesn’t matter. He did it. And that’s what counts.” That may be what counts in the real world, but in a story the ‘why’ is central to the job.

Does Collective Code Ownership Overcome Poor Programming?

Hmm… this post advocating Collective Code Ownership as “the most important principle of XP” has a link to my comment on “bad programmers are not good programmers who are slow.”

The implication is that either:

  1. Counter-productive programmers are a myth and a scapegoat at all times, or
  2. CCO is a cure for counter-productive programmers

If (1), I’ll restate that what little evidence we have about programmer productivity points to a productivity distribution that’s skewed with a long tail of incompetence.

So, (2) CCO is a cure for low- or counter-productive programmers. I don’t see that at all. For one thing, I don’t see any mechanism by which CCO improves the talents of the worst programmers. It exposes them to higher-quality code than they write, true, but bad programmers don’t learn by example (I’m tempted to say their lack of self-educational initiative is their defining characteristic).

From a managerial perspective, CCO can actually hide poor programming, in that a poor programmer does a “works on my machine” or “works for the default scenario” piece of crud, and a good programmer comes through and refactors the work before the poor programming is exposed. The good programmer is disgusted and frustrated and slowed, but management sees Bob The Poor and Gwen the Great as both finishing one task.

With version control logs, Bob’s role in the poor work is not hidden from Gwen either, so it’s not the case that the communal nature of CCO tempers her resentment.

I’m 100% for CCO, but I don’t see it as having anything to do with incompetent programmers. (Clarification: I’m 100% for CCO, but that doesn’t mean it can’t ‘hide poor programming’ as discussed above. Nothing’s perfect.)