New Home!
I am officially web-present. It's a great day, everyone.
making up for that last post since 1983
Upon further investigation (of his other writings), it looks like Jimmy's posts were indeed some form of meta-sarcasm to which, unfortunately, I hadn't yet been introduced. Oops.
At least, that's what I hope.
Examples:

UPDATE: My hope was quenched. Quoth the Thought Leader:
Many folks haven't figured out that I too am a fan of Ruby. The problem I think the community has is when folks separate what they like from what they will "recommend" and push for the enterprise. In talking with other architects in corporate America, they too have came to the same conclusion. It doesn't matter if you feel any perspective I state is valid or not, what matters is that others may be thinking the same thing and it is in the best interest of the community to have canned answers to them. Oh by the way, don't get it twisted and think that every single opinion is my own because that would be highly inaccurate...
I don't really buy the premise that I need to be spending my free time answering questions that are meaningless to me, but at least we know his motivation's clean... ish.
Well, it seems DHH beat me to the punch, so I'll have to one-up him by kicking the vitriol up a notch.
The title, of course, is a lie. I don't know James McGovern. I imagine he's a good guy. Judging by his presence on mailing lists such as the XP list, he certainly knows a lot about availability. But this post and his subsequent comments are total crap.
It was my initial intent to shoot his arguments down on an itemized basis, but a first iteration of that proved laborious and low in value. So, instead, I'll just group the bullshit from his initial comments into five categories, per James's own Five Rules of Propaganda. Feel free to similarly categorize his later comments on your own blogs.
Hmmm. I would ask the same thing of the dynamic community. Right now, you folks are living on hype instead of stating facts.

This one's pretty obvious. The entire "dynamic community," if there is such a thing, is living on hype, the obvious antithesis to a Thought Leader such as James McGovern.
I predict that many folks in the agile community are busy bidding on enterprise application development as we speak using approaches such as Ruby on Rails with the flag waving fervor is saying that development is cheaper. I guess the average enterprise doesn't already have enough languages to deal with and throwing a few more on the pile won't hurt.... Thanks agilists for making the enterprise more of a mess...

Here, the "agile community" (bad guys) is doing unfettered detriment to the "average enterprise" (good guys) by "throwing a few more [languages] on the pile."
You may have noticed that pretty much everyone in the Ruby camp are insultants with many of them being book authors attempting to capitalize on hype.

"The Ruby camp" -- "insultants" -- "book authors" (oh, goodness forbid) -- "capitalize on hype." I'm especially amused by his definition of "open minded" in the following sentence, but I digress...
The funniest thing is occuring in the blogosphere. Lots of folks who write for industry magazines have jumped on Ruby, yet you will never find a single large enterprise that is even considering it. Ever wonder why?

Ruby's growing popularity is being deemed "the funniest thing." He again associates the inherent evil in authors (this time, magazine columnists) to Ruby. And "yet you will never find a single large enterprise that is even considering it." Big, blatant, unproven smears.
Name one single enterprise application in the ERP, CRM, etc space that either is written in a dynamic language and/or is being considered ported? Name one system Fortune 200 enterprise that has a mission-critical system written in a dynamic language. Of course, you can't.

Of course! Smear.
I wonder if these folks have ever studied software engineering economics? What is even sadder is that many of these folks believe in agile software development yet refuse to consider costs over the lifetime.

The folks talking about Ruby have never "studied software engineering economics," and that's not the saddest part, says McGovern.
You may have noticed that pretty much everyone in the Ruby camp are insultants with many of them being book authors attempting to capitalize on hype... So, when will we start seeing conferences on Ruby? Bet they will be filled with these same insultants as speakers but will never manage to even get anyone from a Fortune enterprise to talk about it...

Wow. Speaks for itself.
I want to jump out a window. Folks still keep ignoring the point. I have never said that dynamic languages wouldn't be used in the enterprise or don't have a place. I can find Perl at work for an example, but ask yourself want is it used for. Sooner or later, Ruby will show up on our doorstep but I can tell you that it won't be used for anything worth hyping in magazines and certainly won't be used for any mission-critical enterprise applications.

See, bloggers? It's your fault. You're driving poor Jim to consider autodefenestration. Besides, you should be happy that Ruby might get used in the slightest, most throwaway applications. That's what you want, isn't it?
Also: I think we should add a sixth rule -- putting really freaky-ass pictures in the middle of your posts to confuse your readers and distract them from the sleight-of-word tricks you're so fond of playing.
In addition to being ridiculously miffed at his posts, I'm a little confused about why he uses "dynamic languages." Is there something about a particular language's dynamicity that makes it unscalable, unreliable, un-Enterprise in your eyes, or is it just a (possibly only perceived) common characteristic of the particular dynamic languages you've looked at?
I swear, is this some cultural experiment on your part, James?
But my intention wasn't just to prove him nuts. It was to ask a greater question — why does this phenomenon exist on so great a level? Sure, there are lots of people in the world, and so a few of them are bound to vocalize stupid ideas loudly. But here's a guy who's known for being so smart in many areas (one comment claims, "Knowing you, this is only to stimulate debate for the right reasons !!"), and so open to learning and change, and yet he backs up his arguments with fallacies, circumstantial claims, ad hominem attacks, and, oh yeah, plain old lies. What's up with that?
This isn't meant as an attack on James. He's not alone. I get the impression that either I (and a great many others) am misinterpreting all of their comments, or that, even in the cluetrain age, our society continues to brew loud, inhuman, self-serving mouthpieces. Or that these loud, inhuman, self-serving mouthpieces will, every once in a while, get mad drunk one night, lower their guard, post some incomprehensible rant, and get stuck justifying it in the morning.
Mind you, my ideas have not solidified, so take my rhetoric with a grain of salt.
(Follow-up posted.)
Actually, they're already on it, so this is just advice for me.
The common theme here is Don't Inhibit Creativity. Yeah, duh, but apparently I needed a reminder.
In the “early days” of this blog, I made myself adhere to strict rules about its content and tone. As a result, the content and tone suffered. Recently, I began removing the rules. Obviously, that meant I'd write more. What was (but shouldn't have been) surprising to me was that I was also writing more posts that met my old strict rules. See, the “off-topic” posts, which wouldn't otherwise have been written, kept my mind thinking about different subjects and about writing, and encouraged me to write yet more.
Again, this is probably not news, but it's New To Me.
I could also say something along the lines of, “The compendium of all the things that make you you makes your blog your blog,” but I'm not sure I believe that. Maybe I will as soon as I switch to a blog that supports categories, and thus allows people to filter out my exciting turkey sandwich escapades. :)
(Wha??? Devin, nerd of all nerds, is starting to buy into the old "Be Yourself" adage? What is the world coming to?)
(Sorry, I meant to say, "To what is the world coming?" Sorry, everybody.)
(Also, whippersnapper has an interesting definition.)
Have you ever noticed that the phrase “in general” really means “in specific”? In general, when people say it, they actually mean “in most cases” or “on average,” which is just a way of talking about a specific subset of reality.
Without loss of generality, let's pick a concrete example. If you say, “In general, people fear what they do not know,” you lie. In general, some people fear it, and some people do not. In most cases, people do. But then, saying “Most people fear what they do not know,” just doesn't sound as cool.
Actually, I lie. A general desire for validation is not the reason for the above confusion. The definition of general itself is our actual culprit. According to The American Heritage, the first two definitions of general are:
Well, which is it, America? A concept applicable to all, or a concept applicable to most?
Uh... Devin? Why the hell should I care?
It's a linguistic example of how generalities become stereotypes. All it takes is a switch from one definition of a word to the next for a statement about the majority to become a statement about the entirety. If language influences thought (and it's certainly the basis of communication), and if it's this easy, in our language, to jump from one definition to the next, it's no wonder that we allow absolutes to linger in the field of popular belief.
That, or it's an excuse for me to do fun word play. I'm not sure which. (I have a guess, though, and it has a lot to do with the fact that it's just as easy for a statement about the entirety to become a statement about the majority. (Now would be an appropriate time for somebody to pipe in about the effect of the vocal minority, were it not for the fact that I'm being overloaded with vague abstractions, and I really need to take a rest.))
Hrm. Maybe I should do a future exploratory rant on the word actual. :P
Folks talk about the impedance mismatch between object-oriented views of data, and relational (as in "relational database") views of data. Specifically, they talk about the havoc that this wreaks on people trying to implement O/R mappers.
Ted Neward (of 2003) writes:

Basically, I want the object-relational impedance mismatch to go away, just like everybody else does. But instead of continuing to try to force objects on top of the relational model, how about we give up going in that direction, and instead try lacing relational semantics into our favorite languages of choice?
He's right that we try to force objects onto a relational model, and he's got a point that lacing relational semantics into our languages would lessen the mismatch, but he's solving the wrong problem. Getting rid of the O/R mapper stops us from forcing classes into a relational model, but we've still got our data to contend with, and as long as we're human, that means we're still forcing objects onto a relational model.
Compare the following hypothetical SQL statement:

select s.name
from classes c, classes_students cs, students s
where c.name = "Japanese Tea Ceremony and Zen Aesthetics"
and c.id = cs.class_id
and s.id = cs.student_id
to the following hypothetical hypothetical language statement:
select c.student.name
from classes c
where c.name = "Japanese Tea Ceremony and Zen Aesthetics"
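For what it's worth, the reference-navigation style of that hypothetical query is already expressible in a plain OO language. Here's a rough Ruby sketch; the class names and data are mine, invented for illustration:

```ruby
# Two toy "tables" as plain objects; a Course holds direct
# references to its Students, so no join table is needed.
Student = Struct.new(:name)
Course  = Struct.new(:name, :students)

zen_students = [Student.new("Aiko"), Student.new("Ben")]
courses = [
  Course.new("Japanese Tea Ceremony and Zen Aesthetics", zen_students),
  Course.new("Compilers", [Student.new("Cleo")])
]

# The three-table join from the SQL version collapses into a
# filter plus a walk along object references:
names = courses
  .select { |c| c.name == "Japanese Tea Ceremony and Zen Aesthetics" }
  .flat_map { |c| c.students.map(&:name) }

# names is ["Aiko", "Ben"]
```

The joins haven't disappeared, of course; they've just been baked into the object graph ahead of time, which is exactly the trade the hypothetical language statement is making.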
Maybe while we're all lacing relational semantics into our favorite OO languages, we should think about lacing OO semantics into our relational languages...
I've been getting lazier about my blog, but Jerry's been wanting me to post more often, so I'm just going to paste this post of mine directly from one of my (often more entertaining than my posts) emails:
Oh, oops, if you wanted personal commentary, I suppose it would be this:
I've never much modified the way I looked. (In middle school, I started wearing jeans as a means to conform.) I've had the same haircut since, well, the dawn of my hair. For a while, I've had the desire to change the way I look, if only so that I can say it's "mine" and not my parents'. Once in a blue moon, I'll even go so far as to attempt it -- a change of hairdo or clothes. But every time I do, I find:
- I don't much like the result.
- I don't much dislike the result.
and I've been reading a lot of stuff lately (selectively, I imagine) that's been telling me not to bother pursuing things for which I don't have passion.* Thus (a mathematician at heart, I'm prone to say "thus" from time to time), I've stopped caring about the fact that I dress boringly, and have satisfied myself with the fact that I'm awful weird on the inside.
The End,
Devin

* Some go so far as to say "avoid the things that'll make you rich and famous," because chances are the only reason you're interested is the prospect of wealth and fame, and that'll lead to crappy performance at whatever it is, and /that/'ll lead to a palpable /lack/ of wealth, fame, and, oh yeah, happiness.
PS — So to answer your question: I dunno. Whatever it ends up being, it sounds like it's not for me, so I'm not the one to ask. So why'd you ask me? Weirdo.
PPS — Trumpet tie!!!
Mostly unedited.
I've seen more than my fair share of TV shows. In them, many scenes take place in homes. Not once, however, have I heard a furnace kick on or off during one of these scenes. In real life, of course, we hear the furnace all the time. I, personally, am eternally annoyed by it as I'm trying to fall asleep.
So, you heard it here first: I will give 5 cents to the creator of a TV show in which I hear a furnace. Five cents — out of my pocket, and into the hands of innovation.
This post written without glasses, so you'll forgive me if there was a 'g' in the moiddle of "furnace" or something.
PS — a close friend told me that he doesn't value my technical opinion, and therefore I should be posting more non-technical posts. What a friend, eh? Well, Mr. Loserface, I think I'll just get more technical, to spite you. For the next four weeks, I'll be doing daily coverage on the source code to irb.rb. We'll learn how the code works inside and out, and maybe learn a little about ourselves along the way.
(Actually, no, I won't. If you got here via technorati or something, and were hoping for coverage of irb, then I'm sorry to put your dreams on such a whirlwind tour like that. But I have good reason — I wanted to spite a "friend.")
I just listened to Elton John's version of Nick Drake's Saturday Sun, from back when he was a studio artist for Island Records, for the first time.
Gag me with a steak knife.
but I didn't, 'cause I'm a wimp. (Mostly, 'cause I couldn't decide who should be the victim.) It's based on some particulars due to the fact that we use *cough* Lotus *hack* Notes *die*.
See, here's what I don't get.
"Reply to All with History" is the last of four menu items that appear when you click the "Reply" button, whereas "Reply" is the first and is right below the button. Now, Fitts's law tells me that, all else equal, the time and effort it takes to click a target grows with that target's distance from your mouse cursor's current position.
So, why would you go through all the extra effort to click "Reply to All with History," when "Reply" is right there, much closer, and much less annoying to the other 25 recipients of your email?
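To put rough numbers on the point, here's Fitts's law in its common Shannon formulation; the constants and pixel distances below are entirely made up for illustration:

```ruby
# Fitts's law (Shannon formulation): movement time grows with the
# log of distance over target width. The constants a and b are
# device- and user-dependent; these values are invented.
def movement_time(distance, width, a: 0.1, b: 0.15)
  a + b * Math.log2(distance / width + 1.0)
end

# Hypothetical geometry: both targets 80px wide, but "Reply" sits
# 20px from the cursor while the last menu item sits 120px away.
reply_time        = movement_time(20.0, 80.0)
reply_to_all_time = movement_time(120.0, 80.0)

# The farther target always costs more time to acquire.
```

So even by the coldest pointing-efficiency math, "Reply" wins.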
This has been a Public Service Announcement from yours truly. Thank you.