beat me up and take my lunch money
Oct. 15th, 2005 01:44 pm

Well, it looks like my programming language bigotry has bitten me in the ass for serious this time.
Some context: I am taking a Computer Graphics course this semester as a "breadth" requirement. Those of you who know me well can already see where this is going. It goes almost without saying that the supported language for the programming assignments is C++. There is a significant amount of starter code provided in C++, and the unstated assumption is that C++ is the "right" language in which to do these things.
Which it may very well be. But I hate C++ on stated principle (though I like platform-independent assembly -- that being C -- just fine). The funny thing here, actually, and the reason that my downfall (described in a moment) is perhaps deserved in some cosmokarmic way, is that my distaste for C++ stems from my disagreement with its design philosophy and semantic choices, rather than from empirical negative experiences using it. Even this point is suspect: I admit, as a classic example of human weakness, that perhaps I don't know it well enough to credibly critique it.
Anyway, back to my downfall. So, in my own characteristically bullheaded way, I decided, "Hell no, I ain't gonna use C++. I'm gonna Stick It To The Man. I'll use ML instead." Objective Caml, specifically.
It's been kind of a flop. My performance on the first two projects alone (an 80 on the impressionist paint program, then an estimated 75 on the late raytracer, which will probably become a 50) is tangibly threatening my grade in the class, which in turn threatens my overall GPA. For those of you not in the know, there are three course grades in graduate school: A, B, and OH FUCK. A sub-B grade is at best worthless, and at worst threatens one's graduate standing.
But why did things go so badly? OCaml is generally not a mismatch for this sort of work. It's even possible to isolate and minimize the use of imperative features (in an almost monadic way), to the point that you're not doing 'graphics' so much as doing a lot of mathematical operations and structure traversal. By and large the code is fine: it runs well, it looks nice, it feels good.
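To give a flavor of what I mean, here is a minimal sketch of graphics-as-math in OCaml -- a pure ray-sphere intersection with no mutation anywhere. This is an illustrative fragment rather than my actual project code; the vec type and its helpers are invented for the example.

type vec = { x : float; y : float; z : float }

let sub a b = { x = a.x -. b.x; y = a.y -. b.y; z = a.z -. b.z }
let dot a b = (a.x *. b.x) +. (a.y *. b.y) +. (a.z *. b.z)

(* Some t for the nearest positive hit along origin + t * dir,
   None for a miss.  Pure: just quadratic-formula arithmetic. *)
let hit_sphere ~center ~radius ~origin ~dir =
  let oc = sub origin center in
  let a = dot dir dir in
  let b = 2.0 *. dot oc dir in
  let c = dot oc oc -. (radius *. radius) in
  let disc = (b *. b) -. (4.0 *. a *. c) in
  if disc < 0.0 then None
  else
    let t = ((-. b) -. sqrt disc) /. (2.0 *. a) in
    if t > 0.0 then Some t else None

Written this way, the only imperative part left in the raytracer's main loop is actually writing the resulting pixel.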
The problem is that damned starter code. The burden of legacy code. The projects in this class are structured such that the starter code handles a lot of glue and basic issues, and leaves only the "important" parts to be filled in. In a heretofore unmatched display of arrogance, I thought that I could just find the extra time and dedication to rewrite all of that provided code, and work on my own terms. And I did exactly that. It just took too long. Imagine this same class, where you get the starter code in your preferred language, except that you have 36 hours to do each of the assignments, instead of two weeks. And on top of that, the "starter code" that you're working with is occasionally wrong and needs to be debugged itself. Oh, and it's normally a partnered project, but you have to work solo.
You'd make mistakes. You'd fall short of requirements. You'd turn in assignments late. And that's exactly what happened to me.
I have a sense that some of you have been itching to see me get my comeuppance on this issue for a long time. Well here you go. "I told you so" posts are not out of order. I won't like them, but I can't object.
I have learned an interesting lesson here, one I could have deduced on my own, but which is so much easier to understand from personal experience. The "good" languages, the ones that are a pleasure to code in, the ones you wish everyone would use, have a fundamental flaw: they lose to legacy code. They aren't mainstream because they can't flow with the mainstream. The big, hairy, barbaric bullies that currently occupy the mainstream beat them up and take their lunch money on a regular basis. And much as I like all of these languages, I have to stop telling people that they ought to use them. Because it's a lie. Because if you use a nice language on anything but its own terms, the result will be ugly and nasty and unworthy of the reasons that you went to a nice language in the first place.
no subject
Date: 2005-10-16 12:12 am (UTC)

"there are three course grades in graduate school: A, B, and OH FUCK"

Well put, sir.
no subject
Date: 2005-10-16 05:50 am (UTC)

:P
FFI?
Date: 2005-10-16 03:40 pm (UTC)

Re: FFI?
Date: 2005-10-16 07:17 pm (UTC)

OCaml does have the means to be 'embedded' in a C++ program, as it were, but the data structure packing involved would have been horrific, probably just as bad as reimplementation when you add the time needed to track down memory corruption caused by mispackaging.
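Just to illustrate the flavor of that packing (a hypothetical sketch -- set_pixel, caml_set_pixel, and starter_set_pixel are invented names, not anything from the real starter code): every value crossing the boundary has to be untagged or unboxed by hand in a C stub, and a forgotten CAMLparam is exactly the sort of mispackaging that corrupts the heap.

external set_pixel : int -> int -> float -> unit = "caml_set_pixel"

(* The matching C stub, where the packing pain lives:

   #include <caml/mlvalues.h>
   #include <caml/memory.h>

   CAMLprim value caml_set_pixel(value x, value y, value lum)
   {
     CAMLparam3(x, y, lum);
     /* ints arrive tagged (hence 31 bits); floats arrive boxed */
     starter_set_pixel(Int_val(x), Int_val(y), Double_val(lum));
     CAMLreturn(Val_unit);
   }
*)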
no subject
Date: 2005-10-17 11:41 pm (UTC)

Anyhow, there's truth to your statement about legacy code, but I'd like to highlight a distinct point, which is that nicety loses to ease. I feel love for the ML family of languages -- though I appear to be the only living functional programmer strongly bothered by 31-bit integers -- but I am always perplexed as to why these languages possess such baffling capabilities for, say, writing a compiler, while I am continually forced to, say, reimplement monomorphic array-backed buffers, or topologically sort my source files by dependency. It's as if the writers of these languages are more interested in writing compilers than, say, in writing large, modular programs with efficient data structures.
But I did enjoy writing a toy compiler on the plane to and from California this weekend.
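To make the buffer complaint concrete, here is a toy sketch of the kind of monomorphic, array-backed growable buffer that ends up being written by hand; the float_buf type and its operations are invented for the example, not taken from any real library.

(* Growable buffer of unboxed floats: amortized O(1) push. *)
type float_buf = {
  mutable data : float array;  (* flat storage, possibly oversized *)
  mutable len  : int;          (* number of elements actually in use *)
}

let create () = { data = Array.make 16 0.0; len = 0 }

let push buf v =
  if buf.len = Array.length buf.data then begin
    (* out of room: double the capacity and copy the live prefix *)
    let bigger = Array.make (2 * Array.length buf.data) 0.0 in
    Array.blit buf.data 0 bigger 0 buf.len;
    buf.data <- bigger
  end;
  buf.data.(buf.len) <- v;
  buf.len <- buf.len + 1

let get buf i =
  if i < 0 || i >= buf.len then invalid_arg "float_buf get";
  buf.data.(i)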