Let’s assume you’re a pretty good programmer (and good looking to boot!). It’s Monday morning and you’re looking at a task that has some solid complexity to it — it’s going to take you 40 hours of effort to get through. Or you have the option of delegating components to 2, 3, or even 4 competent-but-not-exceptional developers. For these guys to deliver, you’re going to have to:
- Decompose the task significantly
- Write a precise spec, at least for the initial subtasks (because if you get off on the wrong expectation of data structures, there’s no way you’re going to catch the problem, refactor it, and re-distribute the tasks)
- Engage in course correction (when facing ambiguity, “problem solving” for these guys is going to be asking you for clarification)
- Do some refactoring (let’s face it, there’s going to be a couple things that will be easier to fix than re-specify)
- Spend (at least) one afternoon on interactive testing and integration
(For “decompose” and “write a spec” feel free to interpret however you wish, so long as it’s a concrete deliverable that acknowledges that if you’re delegating and “fanning out” tasks, you have to pay an upfront cost thinking through scenarios that are still somewhat hazy and contingent.)
So here’s my question: how many hours do you think you can shave off the 40 hours it would take you to “just do it” yourself? 8, 16, 32?
There are (at least) two points that strike me as relevant: one is that even in such a fine-grained, seemingly controlled context, you face significant “mythical man-month” issues of communication overhead, risk, etc., and so you discover, if you actually time it, that you don’t save nearly as much as you’d hoped.
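You can put rough numbers on that intuition. Here’s a back-of-envelope model in Python — every constant (spec fraction, per-channel cost, integration time) is an illustrative assumption, not a measurement — showing how Brooks-style communication channels eat into the hours you hoped to shave:

```python
# Back-of-envelope model of delegating a 40-hour task. All parameter
# values are illustrative assumptions, not measured data.

def coordinator_hours(n_devs, solo_hours=40.0,
                      spec_fraction=0.25,   # upfront decomposition + spec writing
                      channel_cost=1.5,     # hours lost per communication channel
                      integration=4.0):     # the afternoon of testing/integration
    """Hours *you* spend when delegating to n_devs competent developers."""
    # Brooks: communication channels grow as n(n-1)/2, counting you as a node.
    nodes = n_devs + 1
    channels = nodes * (nodes - 1) // 2
    return solo_hours * spec_fraction + channels * channel_cost + integration

for n in (2, 3, 4):
    mine = coordinator_hours(n)
    print(f"{n} devs: you spend {mine:.1f}h, saving {40 - mine:.1f}h of 40h")
```

Under these (made-up) assumptions, adding a fourth developer costs you more coordination than a second one saved — the savings curve flattens fast, which is the whole point of the thought experiment.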
The second point that I think might be even more interesting is that your perception is likely to be even worse than what the clock says. If you are in this role, you will rarely or never get into a high-productivity “flow” state, a familiarity with which, I would submit, is why you’re a pretty good programmer.
Governments at various levels have decided that they have to bail out people who spent more than the houses turned out to be worth and the financial companies who weren’t wise enough to notice that the U.S. is in fact not short of forests that can be cut down for more sprawl. Where will the money come from? You, me, and everyone else who did not participate in the bubble.
So: we missed buying real estate with a lot of leverage back in 2000 and missed the big ride up through 2004 or whenever. Now we get to buy that same real estate at a much higher price and without any upside at all, since we won’t actually own any of it.
Yeah, what he said.
Of course I can appreciate the misery of someone who’s underwater on a $400K mortgage, but my sympathy goes away awfully quick when I hear them say “We just never imagined this!”
Didn’t you notice that that whole “closing” business involved you signing, like, 100 pages of documents that were all variations on “YOU OWE LOTS OF MONEY”?
Giving and receiving multi-hundred-thousand dollar loans is adult stuff. I have friends who are not homeowners because they looked at the risks and decided not to take one of these nonsense loans. Now apparently my tax dollars are going to go to help out the imprudent people who caused my friends to be priced out of the market and, in so doing, my tax dollars will help prop up the prices and keep my friends locked out of the market. This is good for society how?
I’m going to be interviewed by CFRA (580 News Talk Radio) in a bit about the plans of Ottawa Mayor Larry O’Brien to bypass mainstream media by blogging.
Early feedback is that Larry “Amulet of Protection” O’Brien’s rants on light rail are vastly less entertaining than Larry “Chillin’ at the Beach” O’Brien’s rants on implicit vs. explicit type declarations in industrial-size codebases.
Just because this guy looks like a low-polygon-count videogame boss, I’m not intimidated! I’ll drink his milkshake! I’ll drink it up!
LINQ seems to be an overwhelming success (I’ve been having a hard time finding anyone with bad things to say about it), but what most people are talking about are the nice Object-Relational mapping tools. The ultimate goal of LINQ, though, includes uniting not only objects and relational data but XML. The current LINQ to XML does not support schemas, which is a significant limitation if you’re trying to really unify a model; for instance, I have a client who needs to integrate two relational models, a middle-tier object model, and an XML data store. Today, we spend significant time batting XPaths back and forth and tracking them in and out of the relational model. As for reporting, the less said the better.
LINQ to XSD is the necessary next step: a set of tools for LINQ that understand the type information expressed in W3C Schemas.
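The pain that schema-aware tooling removes looks the same in any language. Here’s a sketch in Python (ElementTree standing in for LINQ to XML; the document and all its element names are invented for illustration) of the stringly-typed navigation you’re stuck with when the schema’s type information never reaches the compiler, and of the thin typed wrapper that schema-generated classes would give you for free:

```python
import xml.etree.ElementTree as ET

# Hypothetical document -- names invented purely for illustration.
doc = ET.fromstring("""
<order id="42">
  <customer>Ada</customer>
  <total>19.95</total>
</order>
""")

# Untyped access: paths are strings, values are strings, and a typo in
# either is caught only at runtime. This is the "batting XPaths back and
# forth" problem that LINQ to XSD addresses by generating typed classes
# from a W3C Schema.
total = float(doc.findtext("total"))   # no compile-time check of "total"
customer = doc.findtext("customer")

# A hand-written wrapper approximates what schema-generated types buy you:
# one place where names and conversions live, typed everywhere else.
class Order:
    def __init__(self, elem):
        self.id = int(elem.get("id"))
        self.customer = elem.findtext("customer")
        self.total = float(elem.findtext("total"))

order = Order(doc)
print(order.customer, order.total)
```

The difference is that LINQ to XSD would generate the `Order` class mechanically from the schema, so the stringly-typed layer never has to be written or maintained by hand.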
There were next to zero technical books available for the Kindle shortly after its launch. I was delighted to discover that 4 of the 12 books that were finalists for Jolt Awards this year are now available in Kindle form:
I need to reset my PayPal password. I went to their Web site and requested a reset. “Check your Inbox,” it said. 24 hours later, I called PayPal and they manually sent me emails: one to that address and another to a second email address! Neither appeared. Neither is in the respective “Spam” folders.
What could be going on?
Such was one of the many pieces of advice of Fred Brooks in The Mythical Man-Month, and while others of Brooks’ aphorisms have stood the test of time, completely scrapping a codebase is today seen more as an aberration than as a painful but necessary part of the process.
Andrew Binstock, who’s been developing a modern typesetting language (a TeX for the new millennium), has decided to do just that with his 20KLoC, multiple man-year codebase. His recognition that “the more I code, the more I see that I am adding top floors to a leaning tower. Eventually I’ll topple it” may seem startling coming from a vocal advocate of unit testing, especially to younger developers who have probably had pretty good success developing Web-based applications.
Andrew pinpoints the critical issue:
It’s extremely difficult to figure out where your architecture is deficient if you have never done the kind of project you’re currently undertaking.
Nowadays, most of us do our professional programming in pretty well-worn niches — Web-based database-driven this, smart-client semi-connected that, etc. Because of that, we (or our team) tend to make pretty good architectural choices. So good, in fact, that you don’t hear nearly as much concern about application architecture as used to be the case. So good, in fact, that lots of people think you can refactor your way out of the wrong architecture; Andrew’s decision is surely painful, but I think it’s vastly less painful than architectural refactoring.
There are other good details in Andrew’s post and he promises to share other reflections as they occur to him — I’m sure his blog will be especially interesting in the coming weeks.
If the November elections are anything like the Democratic caucus I just attended, it will really be something. Turnout was 4-5x the 100 or so expected and the workers ran out of Democratic party registration forms. There was lots of visible support for Obama (native son, true) and none for Hillary.
If Obama gets the nomination and can somehow bring out the disenfranchised … well, wouldn’t that be something.
(follow image link to Zazzle store…)