What Web 2.0 really needs is a global “turn off comments” switch

I love a lot of things about “Web 2.0.” Websites just look better, for one thing, and I firmly believe that “form” is a key part of “function.” The increased interactivity, both between user and site and among users themselves, has made the whole thing a lot more engaging.

But some people seriously need to shut the hell up.

I love the fact that many sites allow readers to comment on their articles. And I often wish more people would post comments on my own site. (I have to assume/hope more people are reading it than just those who very… very… rarely post comments.) But sometimes, especially when the topic inspires a passionate response (often involving Apple, love ’em or hate ’em), the worst thing I can possibly do is allow myself to get sucked down into the vortex of asshat ramblings in the comments section. And I have a perfect case in point here today from Technology Review.

I happen to be an Apple fanatic, I can admit that. But even if I didn’t love Apple, the iPhone would have won me over. In fact, going into the Macworld Expo keynote where Jobs first announced the iPhone, I met the rumors of an Apple phone with cringes and revulsion. Why would Apple make a phone? I wondered. What a stupid idea, I was convinced. But by the end of the keynote, I wanted one.

I still don’t have one, although I’m presently contemplating it; once I actually got my hands on one, I was even more convinced that it was the greatest invention of the computer age. Opening it up to business apps and third-party developers is going to release the deluge. So I found the TR article interesting, but I seriously wanted to crush my skull in a vise after reading the first comment. And it just got worse from there, even with the commenters I agreed with. And yet, as with Katherine Kersten, I just can’t… stop… reading… them! HHFFRRRGGH!!! (Suddenly, I think I understand what that means.)

I am so anonymous

Me.

I’ve always been a bit wary of revealing too much information about myself online. Given some past incidents with (admittedly indeterminate) potential for danger to my person, it’s been clear that there is no anonymity online. That said, I’ve taken some solace in the fact that my name is common. So common, in fact, that I don’t even mind stating online that I live in Minneapolis, because last time I checked, there were a column and a half of Scott Andersons in the Minneapolis phone book, and I’m not even one of them!

But I’ve always kind of had this idea that I’m “the” Scott Anderson of the Internet. I figured that I’m so immersed in this stuff and have been for so long (having created my first website in 1994), and I’ve encountered so few other Scott Andersons online that I must, therefore, have the greatest online presence of any of the world’s (approximately) hundred megabazillion Scott Andersons.

A quick Google search taught me how wrong I was.

It’s not just that there are a lot of other Scott Andersons online; they all seem like they’re more me than me: musicians, writers, photographers, all manner of creative types. And I (the real Scott Anderson) didn’t even show up until page 5 of the results!

So now it’s clear to me that I need to become a little less anonymous! Either I need to plaster my name all over everything I do, or I need a less common name. Given my inherent laziness, I’m inclined to go with the latter. I wonder if Griddlecake K. Catafalque is taken.

Please tell me no one takes Sherri Shepherd seriously

I’ve maybe seen 1/10 of an episode of The View in my life. And if this is the intellectual level of the discussion that goes on, I’ll be sure, for the sake of my sanity, never to see any more of it.

Apparently, not only does Sherri Shepherd consider the shape of the planet to be up for debate, but she also fails to grasp the basic sequence of well-documented events in human history.

To the others’ credit, Whoopi did her best to seriously, and non-condescendingly, challenge Sherri to think a bit more carefully about the matter, and Joy was obviously stunned into silence. But the fact that someone could get on a national TV show and shamelessly (proudly, in fact) flaunt such ignorance is still reprehensible.

Giving Microsoft a ribbin’ over the ribbon

OK, that was an incredibly lame title; I guess I’ve just read too many headline puns in Entertainment Weekly over the years.

Anyway, I’d like to take a moment out of my ongoing obsession with translucent menu bars and open source operating systems (OSOSes?) and turn to the “dark side,” if you will. (That’d be Microsoft.)

A few weeks ago I took a training course for work. The course was not actually on Office 2007, but the computers in the training room were equipped with it, and it did come into play a few times. This was my first exposure to this version of Office, and needless to say I was stunned (and not necessarily in a good way) by the radically altered user interface.

I wouldn’t say I have any kind of unhealthy attachment to the lowly menu bar, but it is, after all, one of the cornerstones of a graphical user interface. I suppose being a Mac user has an effect on my sense of its importance, since it is ever-present at the top of the screen. I do think the Windows approach, where the menus are integrated into the application window, makes more sense and is — perhaps (gasp!) — more intuitive for novice users. But regardless of where it lives, in most applications it just needs to be there; without it, I’m as lost as I’d be staring at the inscrutable LCD of a photocopier or a fax machine. (Have I ever mentioned how much I hate photocopiers and fax machines?)

If you’ve not yet seen Office 2007, you may not understand where I’m going with this, but, yes… it’s true… the menu bar is gone — GONE!!! — in all Office 2007 applications. Instead, you have… this:

Microsoft Word 2007 ribbon

Credit where credit is due (so Microsoft will not sue, since this image is surely copyrighted), I swiped this screenshot from here.

Maybe it’s just the effect of Steve Ballmer’s voice ringing incessantly in the ears of their developers, but Microsoft actually has the audacity to suggest that this “ribbon” reduces clutter. Never mind the fact that you likely will have no idea where your formerly familiar menu options have gotten off to in this sea of buttons. And do not for a moment ask yourself why, if the tabbed ribbons have replaced the menus, they couldn’t at least have been given familiar names and organization (“File, Edit, View,” etc.).

Maybe I’m too “old school.” Maybe I’m a “dinosaur” or a “curmudgeon.” Some have made the valid argument that this interface may in fact be more intuitive to a new user who’s not familiar with older versions of Word, Excel and the rest (yes, PowerPoint and Outlook are the Professor and Mary Ann of Office). But I have to ask: how many people who are going to be using this have really never used Word (or, for that matter, a computer with a GUI) before? And even if they haven’t, is an interface that assaults the new user with no fewer than sixty-one (by my count in the above screenshot) buttons, tabs, and other clickable thingamabobbers really going to instill in them a sense of ease, comfort and self-confidence at the keyboard?

But the ironic beauty (for us Apple fanboys) of this new interface is more than skin deep. For me, the most, erm (I’ll use the word again), stunning thing about the interface is the magical, shiny, multi-colored and oh-so-enticing (yet strangely off-putting) Office button in the upper left corner, which not only beckons to you like a mercury-flavored Spree in this screenshot, but in fact pulses (yeah, that effect was cool in 2001) to the point of practically begging you to click it.

Go ahead. Click it.

But only click it once. For if you click it once, it spreads before you the most wondrous, the most essential (and for that matter, just about the only) menu in the entire application, containing options for opening, saving, printing and whatnot.

Click it twice, though, and guess what. No really, come on. Take a wild guess. That’s right, it closes the program. Brilliant! That’s really taking the novice into consideration. If there’s one thing I know about novice computer users, it’s that they don’t understand the difference between a single click and a double click. In fact, it seems the human brain must be hardwired to intuitively grasp that any quick poking motion with a finger should be done twice in rapid succession, and it’s only through years of experience with a computer that the tech savvy among us have trained ourselves out of this habit. Why else would so many websites (the first realm in computing that so boldly ventured into the netherworld of the single mouse click) have to plaster their pages with warnings not to click “Submit” buttons twice, lest Amazon.com send you a duplicate copy of The Birds in My Life? (For the record, I found that particular item by going to Amazon and typing “stuff old clueless people like” into the search box.)
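(A tangent for the web-dev-inclined: the usual front-end defense against the reflexive double click is embarrassingly simple. Here’s a minimal sketch in TypeScript; the “checkout-form” id and the button text are made up for illustration.)

```typescript
// Minimal sketch: disable a form's submit button on first submission,
// so a second, reflexive click can't fire off a duplicate order.
// The "#checkout-form" selector is hypothetical; adapt it to your own markup.
const form = document.querySelector<HTMLFormElement>("#checkout-form");

if (form) {
  form.addEventListener("submit", () => {
    const button = form.querySelector<HTMLButtonElement>('button[type="submit"]');
    if (button) {
      button.disabled = true; // further clicks now do nothing
      button.textContent = "Submitting…"; // reassure the user the first click took
    }
  });
}
```

(Server-side safeguards are the grown-up version of this, of course, but the point stands: the double click is an instinct you design around, not one you punish by closing the window.)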

Now where was I? Oh yeah… my desktop. Because that’s what I’m looking at now that I accidentally double-clicked the mercury Spree. I assume that button is intended to be the Office counterpart to the new Start menu icon in Windows Vista. I have yet to use Windows Vista, or even to encounter a computer that has it installed. Nor have I yet talked to anyone who’s actually purchased it or a computer that came with it, but I’d guess that’s mainly because I don’t know anyone like this guy:

A typical Windows Vista user

Yes, that guy was in a picture on this page. I went to Microsoft’s website, looking for information about Windows Vista, and the first human face I encountered was that of Andy Samberg’s stoner (or would it be “stoner-er”?) little brother.

OK, well… I don’t really know how to wrap this up. It’s almost 2 AM and I’m spent. I might go weeks (OK, minutes) before I can find anything more to criticize about Microsoft. But don’t worry, when I do, you’ll be the first to know.

I have a “theory” that most people don’t understand what a theory really is…

Wired has published an excellent article on how creationists are exploiting general misunderstanding of the scientific term “theory”. There is copious evidence that the principles of evolution are sound: aside from the fact that dog breeding (not a “natural” process, but evolutionary nonetheless) is something most people, creationist or not, take for granted, we can observe evolution — as an incontrovertible fact — in organisms like bacteria, whose rapid reproductive cycles let change unfold on humanly observable timescales.

The problem, as the article suggests, is not so much one of science as it is of language: the word “theory” means something much different (and much more specific) to a scientist than it does to the average person, and creationist activists are expertly employing this fact to their advantage.

For me the question still remains: to what advantage? There’s nothing about evolutionary theory that denies the existence of a creator. The only thing at risk is wholesale fundamentalist belief in the inerrant truth of the Bible, and if you can live with “inerrant truth” being rife with self-contradictions, you’re going to have a lot of trouble with science anyway. But your trouble with science doesn’t make it any less true. But, meh, who needs science? What has science ever done for the average person, anyway? (Don’t ask that question with your eyes open, unless you happen to be somewhere in the middle of untouched wilderness… completely naked and devoid of tools of any kind… and, uh, without a computer on which to read these words.)