Archive for October 2011

OOPSLA Day 2: Explicit Use-Case Representation in Programming Languages

One of the emerging themes at this conference is the need to move “examples” (and their older siblings, scenarios and use-cases) “into the code,” so that examples/stories/scenarios/use-cases, which are tremendously meaningful to the subject-matter experts, are actually traceable directly into the code, which is tremendously meaningful to, you know, the machine.

I very much enjoyed a talk on “Use-case Representation in Programming Languages,” which described a system called UseCasePy that added a @usecase annotation to Python methods. So you would have:

@usecase(DrawARectangle, DrawALine)
def drawLine(ptA, ptB):
    ...  # etc.

Now, even if you go no further, you’re doing better than something in a documentation comment, since you can easily write a tool that iterates over all source code, queries the metadata, and builds a database of which classes and methods participate in every use-case: very useful.
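
In fact, such a decorator is easy to sketch. What follows is my own minimal approximation, not UseCasePy’s actual API (the usecase and USE_CASES names are invented for illustration, and I pass use-case names as strings):

from collections import defaultdict

# Maps each use-case name to the qualified names of participating functions.
USE_CASES = defaultdict(set)

def usecase(*cases):
    """Tag a function with the use-cases it participates in."""
    def tag(fn):
        fn.use_cases = cases  # metadata a tool can query later
        for case in cases:
            USE_CASES[case].add(f"{fn.__module__}.{fn.__qualname__}")
        return fn
    return tag

@usecase("DrawARectangle", "DrawALine")
def draw_line(pt_a, pt_b):
    ...

print(USE_CASES["DrawALine"])  # {'__main__.draw_line'}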

Even better, if you have a runtime with a decent interception hook, you can run the program in a particular use-case (perhaps from your BDD test suite, perhaps from an interactive tool), acquire the set of methods involved, and determine, by exercising a large suite of use-cases, metrics that relate the code’s “popularity” to user-meaningful use-cases, which could be very helpful in, for instance, prioritizing bug-fixes.
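
In Python the obvious interception hook is sys.settrace; here’s a rough sketch of the idea (the function names are mine, not the talk’s):

import sys

def methods_exercised_by(run_use_case):
    """Run one use-case and collect every function called along the way."""
    seen = set()

    def tracer(frame, event, arg):
        if event == "call":
            code = frame.f_code
            seen.add(f"{code.co_filename}:{code.co_name}")
        return None  # no need to trace line-by-line within each call

    sys.settrace(tracer)
    try:
        run_use_case()  # e.g. one scenario from the BDD suite
    finally:
        sys.settrace(None)
    return seen

Counting how often each method turns up across the whole suite of use-cases gives you the “popularity” metric directly.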

Oh, by the way, apparently we no longer call them “users” or even “domain experts,” they are now “Subject Matter Experts” or even SMEs (“Smees”).

RIP John McCarthy. Truly one of the greats in our field.

OOPSLA: Seriously, No Runtime Semantics

This is Dart:

main() {
  try {
    var x = 'foo';
    int s = x;
    print('Shirley, you are joking');
  } catch (var e) {
    print('Surely this will be executed.');
  }
}

Note that I’ve declared s to be of type int and, just to make sure the point is clear, have assigned this int the string value 'foo' which, if there were runtime type semantics, would throw an exception (and, if there were mandatory typing, wouldn’t even compile). In Dart, the output of this is:

Shirley, you are joking

OOPSLA Day 2: David Ungar — Everything You Know (About Parallel Programming) Is Wrong

I should hope so.

This was the afternoon’s first major talk. David Ungar from IBM Research first demonstrated that the tragedy of Romeo & Juliet comes from a race condition (if only he had waited for news from the Friar).

That was excellent, but the real premise of his talk was that there is a fundamental tension between correctness and synchronization in manycore systems, and that the scalable solution (he asserts) is to eliminate synchronization. He proposed a few names for this type of programming model: anti-lock or “race and repair.”

This reminds me (and at least one questioner in the audience) of the application- or web-level concept of “eventually consistent.”

The bulk of the talk was a discussion of his experiments with a programming problem (a slightly-more-complicated version of hash table insert) with various techniques that trade off correctness with performance. What he showed (at least in this one experiment) was that he could get better performance from a “race and repair” technique than he could get from compare-and-swap.
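
He didn’t hand out benchmark code, but the flavor of “race and repair,” as I understood it, is roughly the following toy sketch (mine, not his):

SIZE = 1024
table = [None] * SIZE

def racy_insert(key):
    # No lock and no compare-and-swap: two threads probing the same slot
    # can overwrite each other, and we knowingly tolerate that.
    h = hash(key) % SIZE
    for probe in range(SIZE):
        slot = (h + probe) % SIZE
        if table[slot] is None or table[slot] == key:
            table[slot] = key  # a racing thread's insert may be lost here
            return

def repair(expected_keys):
    # The "repair" half: cheaply re-insert whatever the races dropped.
    present = {k for k in table if k is not None}
    for key in expected_keys - present:
        racy_insert(key)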

He tentatively proposes that probabilistic data structures and algorithms, in which correctness is a spectrum that can be traded against performance, constitute a new field of study.

OOPSLA Day 2: Greatest Finding Ever

Perl users in our study performed notably poorly… no better than a language designed largely by chance.

They mean this literally, having used in their study a language called “Randomo”:

With the exception of braces, the lexical rule for variable names, and a few operators (e.g., addition, subtraction, multiplication, division), many of the keywords and symbols were chosen randomly from the ASCII table.

http://www.cs.siue.edu/~astefik/papers/StefikPlateau2011.pdf

OOPSLA Day 2: More on Dart

I think when people saw that Dart was from Gilad Bracha and Lars Bak there was an expectation that Dart was going to be a grand synthesis: a blazingly-fast Newspeak-with-curly-brackets. It’s very much not such a language. It doesn’t seem, academically, vastly innovative because it doesn’t add much. But, in truth, optional types are a radical design decision in that they take away runtime aspects that a lot of mainstream programmers expect. (Of course, this raises the question of how to define the “mainstream”…)

Pros and Cons of Mandatory Typing In Descending Order of Importance (per Gilad Bracha):

Pros:

  • machine-checkable documentation
  • types provide conceptual framework
  • early error detection
  • performance advantages

Cons:

  • expressiveness curtailed
  • imposes workflow
  • brittleness

Having said that, I attended a lecture in which someone, perhaps from Adobe, measured the performance impact of optional typing. Their conclusion, although admittedly drawn from the troublingly small and artificial SunSpider benchmarks, was that the performance penalty of implicit types amounts to 40% (with a very large standard deviation). That “feels” about right to me — definitely significant but not the overwhelming performance benefit you might get from either parallelization or an algorithmic change.

OOPSLA Day 2: Gilad Bracha on Dart

Gilad Bracha started the day’s Dynamic Languages Symposium with an invited talk on Dart, a new Web programming language (read: JavaScript replacement) in which “Sophisticated Web Applications need not be a tour de force.”

OOPSLA is attended by academics, who are typically less interested in the surface appearance of a program (they’ve seen just about every variation) and more interested in semantic questions whose impact in the real world might not be felt for many years. So Bracha began his talk by disavowing the “interesting-ness” of Dart: it’s a language whose constraints are entirely mundane:

  • Instantly familiar to mainstream programmers
  • Efficient compilation to JavaScript

(Personally, I take it as a damnation of the audience that “Of interest to 90% of the programming world” is not of importance, but the gracious interpretation is that these are the trail-blazers who are already deep in hostile territory.)

The gist of Bracha’s talk was on Dart’s “optional types” semantics. The great takeaway from this, I think, is that:
“Dart’s optional types are best thought of as a type assertion mechanism, not a static type system”
which allows for code that can make your blood run cold; what certainly looks like a statement of programmer intention (“this variable is of type Foo”) can be blithely trod over at runtime (“in fact, this variable is of type Bar”) without so much as a by-your-leave.

The type expression is only evaluated at compilation time and, if the developer puts the compiler in “development” mode, you get warnings and errors. But once out of development mode, there are no runtime semantics of the type expressions. They have no behavior, but on the other hand, they have no cost. And, argues Bracha, this seemingly extreme position is important to support a language that remains truly dynamic and does not “put you in a box” wherein the type system becomes a restriction on expressiveness.
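
There is mainstream precedent for annotations without runtime behavior: Python’s function annotations are likewise recorded but never enforced by the runtime itself, as this small (non-Dart) illustration of mine shows:

def scale(x: int) -> int:  # the annotations are stored as metadata...
    return x * 2

print(scale("ha"))            # ...but never enforced: prints 'haha'
print(scale.__annotations__)  # {'x': <class 'int'>, 'return': <class 'int'>}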

One of the seemingly obscure corners of language design is the semantics of generics (the building blocks of collection classes). Generics in Dart are reified and covariant, which to an academic means “the type system is unsound.” Bracha acknowledges this and says that he’s “given up” on fighting this battle.

Another interesting design element of Dart is its recognition that the “classic” object-oriented constructor is a failed abstraction that only allows for “I want to allocate a new instance…” instead of common scenarios such as “I want to get an object from cache,” “I want an object from a pool of a specific size (often 1),” etc. So you can declare something that looks an awful lot like a classical constructor, but in fact is “allowed” to return whatever the heck it wants. (I put “allowed” in quotes because, remember, all this type stuff is just epiphenomenal <– mandatory big word every paragraph!)
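
For comparison (my analogy, not Bracha’s), Python can express the same idea with __new__, where what looks like ordinary construction is likewise free to return a cached instance:

class Connection:
    _cache = {}

    # Looks like plain construction at the call site, but may hand back
    # a pooled instance instead of allocating a new one.
    def __new__(cls, key):
        if key not in cls._cache:
            inst = super().__new__(cls)
            inst.key = key
            cls._cache[key] = inst
        return cls._cache[key]

a = Connection("db")
b = Connection("db")
assert a is b  # the same cached object both times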

The lack of mandatory types precludes the creation of type classes or C#-style extension methods. Those omissions grate, but of real concern to me is that the lack also precludes type-based initialization. This leads to the disturbing design that variables have the value null until they are assigned to; a “disturbing design” that is standard in the mainstream but hated by all.

…off to lunch, more later…

OOPSLA Day 0

I am in Portland for OOPSLA / SPLASH, a conference that is my sentimental favorite. I think my first OOPSLA was in New Orleans circa 1990 and OOPSLA Vancouver 92 is filled with memories (mostly because Tina came and we dove Orcas Island in wetsuits).

OOPSLA is traditionally the big academic conference for programming language theory and implementation. When I was a magazine editor and track chair for the Software Development Conferences, OOPSLA was where I trolled for new fish — concepts and writers that were ready for trade exposure. That’s no longer my business, and I wonder if I’ll get the same thrill from attending that I used to.

The program looks promising and I’ve just spent a few hours going over the papers in the proceedings DVD (no more phonebook-sized proceedings to bow the bookshelves, but I’m sure I can still steal some article ideas…).

I’m happy about the late addition of talks by Gilad Bracha and Lars Bak on Dart, the new programming language from Google. I’m unabashedly a fan of Bracha’s Newspeak, and the one time I heard Bak talk, I said he was “dynamite…. Concrete, informed, impressive….” so I’m favorably disposed to like their language, even if it does have null (and not just have it, but embrace it):

In Dart, all uninitialized variables have the value null, regardless of type. Numeric variables in particular are therefore best explicitly initialized; such variables will not be initialized to 0 by default.

Which strikes me as flat-out crazy, reiterating Tony Hoare’s “Billion-Dollar Mistake.”

Early reaction to Dart has been pretty harsh; it will be interesting to discuss it in person (where the tone will be 1000x more reasonable and respectful than on the Internet).

What Killed the C Compiler Vendors?

I read with interest, but disagree with, this take on why the software tools industry dwindled in the early 90s. Like most historical accounts, it tries to achieve a linear account of a historical rise and fall: there were a lot of compiler vendors because writing a commercial compiler was relatively easy and then, as hardware advanced, remaining competitive became harder. 1-2-3.

I hold to a much more contingent view of history, at least in this industry. There were a lot of C compilers because the industry was expanding and people who hadn’t been competitors in 1985 were competitors in 1990. And the reason that the commercial software tools industry shrank so dramatically in the early 90s had, it seems to me, little to do with the actions of the individual language vendors and much more to do with:

  • the release of Windows 3.1;
  • the wrong-but-widely-held association of GUI programming with object-orientation;
  • C++ as the winner of the “hybrid C” horse-race;
  • Microsoft’s decision to become more aggressive in the field, providing “good enough” IDEs, compiler tool-chains, linkers, debugging tools, etc. at a low cost.

Zortech and Borland beat Microsoft to market with C++ and Watcom’s 32-bit C compiler was flat-out better than anyone else’s. You could argue about who had the best DOS-based IDEs, but Borland had a Windows-based C and C++ IDE on the market for a full year before Microsoft!

But Microsoft launched Visual Basic and then Visual C++ and MSDN (on CD-ROM!) and everyone wanted to build Windows apps — it was a much bigger industry change than today’s shift towards mobile development, because enterprises wanted their internal systems rewritten for Windows, not just new development and not just customer-facing development.

Borland stumbled terribly trying to become a full-fledged competitor to Microsoft by creating office-suite applications. Whether that directly drained talent from the languages division or not, I can’t say, but it certainly drained Borland’s coffers and as Microsoft was having the San Jose Orchestra play to the Software Development Conference, the poor Borland crew was making their way home over rain-soaked Highway 17 because the company wouldn’t put them up in Silicon Valley hotels.

In the UNIX market, we kept thinking that Sun was going to come along and show the PC guys what software development tools could look like, but they never marketed their tools well until Java, whose rise was also not due to the inherent merits of that language or its tool-chain. But that’s a story for another day…
