Archive for June 2007
A few weeks ago, Scottevest Cargo Shorts were on sale. I’ve never owned a Scottevest product before, but they’re well-reviewed, and if there’s one piece of clothing a Hawaiian geek requires, it’s capacious cargo shorts.
They’re quite good-looking and can handle a full load of iPod, wallet, digital camera, and phone. However, they have a critical flaw, which surprises me given the company’s clear understanding of its audience: there is no pocket that accommodates a Moleskine Reporter Notebook (or the slightly smaller and more casual Sherbert Notes 7″x5″). The “big” pockets on the Cargo Shorts are cut with an angled entry that writing-sized notebooks can’t negotiate (see photo).
Of course the shorts can handle notecards or memo pads, which are sufficient for to-do lists and Hipster PDAs, but have you ever tried to record a non-trivial thought on a memo pad? Doesn’t work.
Perhaps the next release will solve this critical bug.
Alan Turing was born 95 years ago today. Less than 100 years ago. I know that at the physical level, information processing is nowhere near as dramatic as flight or the rise of the car, but it’s still astonishing to reflect upon the advances. I’ve been drafting an article about the connections being discovered between computation and physics (both thermodynamics and Riemannian geometry), fields where there is a palpable sense of impending breakthroughs. I’ve never understood why there’s so little discussion of the science of computation and information, which is still a field that, like biology in the 18th and 19th centuries, is broadly accessible and one where I am convinced amateurs and dilettantes can make major contributions.
NVidia’s Tesla C870 Graphics Processing Unit (GPU) will be the basis for a “deskside supercomputer” add-on that will provide highly parallel, high-performance computing (HPC) capabilities, presumably programmed with NVidia’s CUDA toolkit.
Dedicated hardware for HPC has always been a treacherous market: one year’s darling is next year’s has-been. (People used to buy Cray supercomputers at auction and resell them for the gold in the connectors. True story.) Dedicated processing boards for desktop computers have been especially troubled, since the system bus is such a bottleneck and Moore’s Law used to provide such wonderful free lunches. (The free lunches are over, although the bus bottleneck is potentially more dramatic than ever.)
There is effectively unlimited demand for HPC from three well-funded sectors: finance (trading), bioinformatics, and chemistry (bio- and otherwise). These sectors will absorb any amount of information processing capacity available. Whether that can be translated into commercial success for NVidia, or whether it unlocks additional markets, is far less certain.
I wonder if Google will buy a couple boards.
Takeaway for programmers: Feverish hardware activity relating to concurrency continues. Software lags, with only relatively low-level toolkits available for exploiting the system. Keep your C skills sharp.
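To make “low-level” concrete, here’s a sketch (in Java rather than CUDA C, purely for illustration; all names are mine, not from any toolkit) of the kind of hand-rolled bookkeeping — partitioning, launching, joining — that today’s concurrency tooling still leaves entirely to the programmer:

```java
import java.util.concurrent.atomic.AtomicLong;

public class HandRolledSum {
    // Sum an array by manually slicing it across worker threads.
    static long parallelSum(final long[] data, int nThreads) throws InterruptedException {
        final AtomicLong total = new AtomicLong(0);
        Thread[] workers = new Thread[nThreads];
        int chunk = (data.length + nThreads - 1) / nThreads;
        for (int t = 0; t < nThreads; t++) {
            final int lo = t * chunk;
            final int hi = Math.min(data.length, lo + chunk);
            workers[t] = new Thread(new Runnable() {
                public void run() {
                    long local = 0;                // accumulate locally...
                    for (int i = lo; i < hi; i++) local += data[i];
                    total.addAndGet(local);        // ...one shared update per thread
                }
            });
            workers[t].start();
        }
        for (Thread w : workers) w.join();         // wait for every slice
        return total.get();
    }

    public static void main(String[] args) throws InterruptedException {
        long[] data = new long[1000];
        for (int i = 0; i < data.length; i++) data[i] = i + 1;
        System.out.println(parallelSum(data, 4));  // 1 + 2 + ... + 1000 = 500500
    }
}
```

Nothing here is hard, but notice how much of it is pure plumbing; that plumbing is what the current generation of toolkits makes you write by hand.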
Slow Death Of Dev^h^h^h Magazines, Part 38: Once Mighty Ziff Davis Sells Off Enterprise Group In Attempt To Service Debt
Ziff Davis is selling its Enterprise Group, whose assets include Baseline, CIO Insight, eWeek, and microsoft-watch.com, for $150M. Unfortunately, that still leaves ZD with roughly $240M in debt, which it must pay off using its Consumer/Small Business Group (which publishes PC Mag) and its Game Group.
The purchaser was Insight Venture Partners. I can’t imagine that they’ve got a plan to flip it — I don’t think any publishing company is hankering to make such an investment. So that leaves slicing-and-dicing.
Miguel de Icaza has unveiled “Moonlight,” an implementation of Silverlight on Linux by way of Mono. The project was done as a 21-day sprint, and while it’s just a prototype, it makes Microsoft’s new in-browser managed platform available on all three major desktop contenders.
I remain of the opinion that Silverlight is going to be a major platform for Microsoft, siphoning off a lot of developers who otherwise would be looking at .NET / desktop CLR. And while Mono has not seen the uptake that I think it deserves, the availability of Silverlight on Linux is important for Silverlight’s acceptance.
I was just scanning my latest copy of one of the very last independent software development magazines (independent as in “copy not subject to approval by vendors”) and saw an article on REST. It seems intuitive to me that if you’re a programming magazine today, you compete on clarity and authority. The article, in fact, was written by one of the magazine’s contributing editors and I thought “Ah! A 1500 word overview of REST — how valuable!”
Take a look at the core code of Listing 1:
Employee emp = lookupUser(userSSID);
String medPlan = emp.getMedicalPlan();
String dntPlan = emp.getDentalPlan();
String retPlan = emp.getRetirementPlan();
String response = "User " + emp.getFullName()
    + " has medical plan: " + medPlan
    + ", and dental plan: " + dntPlan
    + ", and retirement plan: " + retPlan;
Believe me, I understand that I have a lot of glass in the walls of my house, but it’s a really big mistake to have a tutorial on REST that uses the HTTP POST verb to retrieve existing data. The use of HTTP verbs for distinct purposes is central to REST principles. The author seems to be unaware of this.
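For the record, a read like this maps onto GET, not POST: fetching an employee’s benefits is safe and idempotent, so the identifying data belongs in a resource URI rather than a POSTed body. A minimal sketch (the host and URI scheme here are invented for illustration):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class RestishLookup {
    public static void main(String[] args) throws Exception {
        // The employee ID lives in the URI; the request carries no body.
        URL resource = new URL("http://example.com/employees/12345/benefits");
        HttpURLConnection conn = (HttpURLConnection) resource.openConnection();
        conn.setRequestMethod("GET");  // openConnection() hasn't touched the network yet
        System.out.println(conn.getRequestMethod());  // prints "GET"
    }
}
```

The payoff is everything REST promises: the resource is cacheable, bookmarkable, and retrievable without side effects — none of which is true of a POST.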
One reason I’ve not posted the name of the magazine is that, although this magazine is likely to have been in the hands of its many tens of thousands of subscribers for weeks, I’ve not heard the slightest ripple of outrage in the blogosphere and I’d like to see if that continues.
Evidently, this prominent magazine’s feature article on an architectural topic of great interest and passion has gone unremarked. What does that mean? There’s been a great deal of buzz lately about whether “alpha geeks have given up on Windows,” but I find it even more disturbing to think that the alpha geeks have given up on ink on dead trees. Surely one of the roles of experts is to police the “mainstream media,” not simply to piss on each other about esoteric corner cases.
Or is it the case that this particular magazine has lost its credibility and that such a mistake is considered no more worth pointing out than the White House making an overly-optimistic prediction about Iraq?
A big earthquake swarm on the SE side of the island is “consistent with a shallow intrusion of magma” at Kilauea / Pu’u O’o. They don’t predict eruptions, but I have a feeling that Pele might be restless. Luckily, that’s 60 miles away and on the other side of a 13,000 foot mountain.
I hesitate to call Mark McKeown’s Brief History of Consensus, 2PC, and Transaction Commit (via just about everyone, but let’s say Bill de hÓra) a “blog post.” It reads much more like a darn good professional article.
If you’re interested in having an informed opinion about concurrency (as opposed to waiting half a decade and accepting whatever the market has decided is “good enough”), the article is a must-read. As we enter the manycore era, the amount of asynchronicity within a single machine will be significant. We’re already seeing hints of this with Non-Uniform Memory Access (NUMA), in which a core’s latency to main memory depends on which bank of memory it happens to be reaching.
So in order to think knowledgeably about what concurrent programming ought to be like, the best crude model is distributed programming. With a huge caveat: you can’t treat just the network messages as “the system”; you also have to think about keeping the local processor busy. So don’t think about Google Maps, think about Forza 2 on Xbox Live. (Mmmm… Forza 2 on Xbox Live…)
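In Java terms, that means structuring the remote interaction so local computation proceeds while the wire is in flight. A toy sketch (the “remote call” here is simulated with a sleep; every name is invented for illustration):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class OverlapSketch {
    // Stand-in for a slow remote call -- in Forza terms, waiting on the
    // other players' positions to arrive over the wire.
    static int remoteCall() throws InterruptedException {
        Thread.sleep(50);
        return 42;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<Integer> pending = pool.submit(new Callable<Integer>() {
            public Integer call() throws InterruptedException {
                return remoteCall();
            }
        });

        // Keep the local processor busy (render the frame, run the physics)
        // instead of blocking on the network...
        long localWork = 0;
        for (int i = 0; i < 1000000; i++) localWork += i;

        // ...and only synchronize when the remote answer is truly needed.
        int remote = pending.get();
        System.out.println(localWork + " " + remote);
        pool.shutdown();
    }
}
```

The same shape — fire off the slow thing, do useful work, rendezvous late — is the crude-but-useful distributed-programming model the post is arguing for.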